Dynamic Fault Tolerance Aware Scheduling for Healthcare System on Fog Computing

The Internet of Things (IoT) contributes to improving the quality of life, as it supports many applications, especially healthcare systems. Data generated by IoT devices is sent to Cloud Computing (CC) for processing and storage, despite the latency caused by the distance. With the rapid growth of IoT devices, the volume of data sent to the CC has kept increasing, adding congestion on the cloud network to the latency problem. Fog Computing (FC) is used to address these problems because of its proximity to IoT devices, filtering the data before it is sent to the CC. FC is a middle layer located between the IoT devices and the CC layer. To handle the massive data that IoT devices generate at the FC layer, the Dynamic Weighted Round Robin (DWRR) algorithm, a load balancing (LB) algorithm, is used to schedule and distribute data among fog servers by reading the CPU and memory values of these servers, in order to improve system performance. The results proved that the DWRR algorithm provides high throughput, reaching 3290 req/sec at 919 users. Much research is concerned with distributing the workload using LB techniques without paying much attention to Fault Tolerance (FT), which means that the system continues to operate even when a fault occurs. Therefore, we proposed a replication FT technique, primary-backup replication based on a dynamic checkpoint interval, for FC. Checkpointing is used to replicate new data from a primary server to a backup server dynamically by monitoring the CPU values of the primary fog server, so that a checkpoint occurs only when the CPU value is larger than 0.2, in order to reduce overhead. The results showed that the execution time of the data filtering process on the FC with a dynamic checkpoint is less than that of a static checkpoint, which is independent of the CPU status.
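To make the dynamic-checkpoint idea concrete, the sketch below replicates buffered data from a primary fog server to a backup only when a monitored CPU reading exceeds the 0.2 threshold mentioned above. The CPU probe, buffers, and request loop are illustrative stand-ins, not the paper's implementation.

```python
# Minimal sketch of a dynamic checkpoint, assuming a hypothetical CPU probe:
# data on a primary fog server is replicated to a backup only when CPU load
# exceeds a threshold (0.2 in the abstract above).
import random
import time

CPU_THRESHOLD = 0.2  # checkpoint only above this utilisation

def read_cpu_load() -> float:
    """Stand-in for a real CPU probe; returns utilisation in [0, 1]."""
    return random.random()

def checkpoint(primary_buffer: list, backup: list) -> None:
    """Copy any data not yet replicated from the primary to the backup."""
    backup.extend(primary_buffer)
    primary_buffer.clear()

primary_buffer: list = []   # data accepted since the last checkpoint
backup_replica: list = []   # backup server's copy

for request_id in range(10):
    primary_buffer.append(f"request-{request_id}")   # primary handles a request
    if read_cpu_load() > CPU_THRESHOLD:              # dynamic interval: only under load
        checkpoint(primary_buffer, backup_replica)
    time.sleep(0.01)

# Flush any leftover data once at the end so the backup is complete.
checkpoint(primary_buffer, backup_replica)
print(len(backup_replica), "items replicated to backup")
```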

Publication Date: Sun Feb 25 2024
Journal Name: Baghdad Science Journal
Qin Seal Script Character Recognition with Fuzzy and Incomplete Information

The dependable and efficient identification of Qin seal script characters is pivotal to the discovery, preservation, and inheritance of the distinctive cultural values embodied by these artifacts. This paper uses histogram of oriented gradients (HOG) image features and an SVM model to build a character recognition model for identifying partial and blurred Qin seal script characters. The model achieves accurate recognition on a small, imbalanced dataset. Firstly, a dataset of Qin seal script image samples is established, and Gaussian filtering is employed to remove image noise. Subsequently, the gamma transformation algorithm adjusts the image brightness and enhances the contrast between font structures and image backgrounds. After a s…
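As a rough illustration of the pipeline described above (Gaussian denoising, gamma correction, HOG features, SVM classification), the sketch below runs the same steps with scikit-image and scikit-learn on randomly generated placeholder images; the Qin seal script dataset is not reproduced here, and all parameter values are assumptions.

```python
# Sketch of the recognition pipeline, assuming placeholder images and labels.
import numpy as np
from skimage.filters import gaussian
from skimage.exposure import adjust_gamma
from skimage.feature import hog
from sklearn.svm import SVC

rng = np.random.default_rng(0)
images = rng.random((40, 64, 64))          # placeholder character images
labels = rng.integers(0, 4, size=40)       # placeholder class labels

def extract_features(img: np.ndarray) -> np.ndarray:
    img = gaussian(img, sigma=1.0)                      # remove image noise
    img = adjust_gamma(img, gamma=0.8)                  # adjust brightness/contrast
    return hog(img, orientations=9,
               pixels_per_cell=(8, 8), cells_per_block=(2, 2))

X = np.stack([extract_features(im) for im in images])
clf = SVC(kernel="rbf").fit(X, labels)                  # SVM on HOG descriptors
print("training accuracy:", clf.score(X, labels))
```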

Publication Date: Thu Jul 01 2021
Journal Name: Iraqi Journal of Science
Applying Similarity Measures to Improve Query Expansion

The huge evolution of information technologies, especially in the last few decades, has produced an increase in the volume of data on the World Wide Web, which is still growing significantly. Retrieving the relevant information from the Internet or any data source with a query made of only a few words has become a big challenge. To overcome this, query expansion (QE) plays an important role in improving information retrieval (IR), where the user's original query is rebuilt into a new query by appending new related terms with the same importance. One of the problems of query expansion is choosing suitable terms. This problem leads to another challenge: how to retrieve the important documents with high precision and high recall…
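A rough sketch of the similarity-based expansion idea: terms whose TF-IDF profiles across documents are most similar (by cosine similarity) to a query term are appended to the query. The toy corpus, the choice of TF-IDF term vectors, and the cut-off of three terms are illustrative assumptions, not the paper's exact measures.

```python
# Similarity-based query expansion sketch over a toy corpus (assumption).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "fog computing reduces latency for iot healthcare systems",
    "cloud computing offers storage and processing for iot data",
    "load balancing improves throughput of fog servers",
    "healthcare monitoring generates iot sensor data",
]

vec = TfidfVectorizer()
doc_term = vec.fit_transform(corpus)            # documents x terms
term_vectors = doc_term.T                       # terms x documents
terms = vec.get_feature_names_out()
sim = cosine_similarity(term_vectors)           # term-term similarity matrix

def expand(query_term: str, k: int = 3) -> list:
    idx = list(terms).index(query_term)
    ranked = sim[idx].argsort()[::-1]           # most similar terms first
    return [terms[j] for j in ranked if j != idx][:k]

print("iot ->", expand("iot"))                  # candidate expansion terms
```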

Publication Date: Fri Dec 25 2009
Journal Name: Wireless Personal Communications
An N-Radon Based OFDM Transceivers Design and Performance Simulation Over Different Channel Models

In this paper, a new method is proposed to perform N-Radon orthogonal frequency division multiplexing (OFDM), which is equivalent in spectral efficiency to 4-quadrature amplitude modulation (QAM), 16-QAM, 64-QAM, 256-QAM, etc. This non-conventional method is proposed in order to reduce the constellation energy and increase spectral efficiency. The proposed method gives a significant improvement in Bit Error Rate performance, and keeps the bandwidth efficiency and spectrum shape as good as conventional Fast Fourier Transform based OFDM. The new structure was tested and compared with conventional OFDM over Additive White Gaussian Noise, flat, and multi-path selective fading channels. Simulation tests were generated for different channels…
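For context, the sketch below simulates the conventional FFT-based OFDM baseline that the proposed N-Radon mapping is compared against: 4-QAM mapping, IFFT modulation, an AWGN channel, FFT demodulation, and a bit-error-rate count. The subcarrier count, SNR, and symbol count are illustrative, not the paper's simulation settings, and the Radon-based mapping itself is not reproduced.

```python
# Conventional 4-QAM OFDM over AWGN: a baseline sketch with assumed parameters.
import numpy as np

rng = np.random.default_rng(1)
n_sub, n_sym, snr_db = 64, 200, 10

bits = rng.integers(0, 2, size=(n_sym, n_sub, 2))            # 2 bits per 4-QAM symbol
symbols = (2 * bits[..., 0] - 1 + 1j * (2 * bits[..., 1] - 1)) / np.sqrt(2)

tx = np.fft.ifft(symbols, axis=1)                            # OFDM modulation per block
noise_var = 10 ** (-snr_db / 10) / n_sub                     # account for IFFT scaling
noise = np.sqrt(noise_var / 2) * (rng.standard_normal(tx.shape)
                                  + 1j * rng.standard_normal(tx.shape))
rx = np.fft.fft(tx + noise, axis=1)                          # FFT demodulation

bits_hat = np.stack([(rx.real > 0).astype(int), (rx.imag > 0).astype(int)], axis=-1)
ber = np.mean(bits_hat != bits)
print(f"BER over AWGN at {snr_db} dB: {ber:.4f}")
```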

Publication Date: Thu Nov 01 2012
Journal Name: IJCSI International Journal of Computer Science
Implementing a novel approach to convert audio compression to text coding via hybrid technique

Compression is the reduction in size of data in order to save space or transmission time. For data transmission, compression can be performed on just the data content or on the entire transmission unit (including header data), depending on a number of factors. In this study, we considered an audio compression method based on text coding, in which the audio file is converted to a text file in order to reduce the time needed to transfer data over a communication channel. Approach: two coding methods are proposed and applied to optimize the solution by using CFG. Results: we tested our application using a 4-bit coding algorithm; the results of this method were not satisfactory, so we proposed a new approach to compress audio fil…
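The sketch below shows only the basic 4-bit coding step mentioned above: audio samples are quantised to sixteen levels and written out as hexadecimal text, which a text coder could then compress further. The test tone and the hex alphabet are assumptions; the authors' CFG-based hybrid is not reproduced here.

```python
# 4-bit audio-to-text coding sketch (assumed hex alphabet, not the paper's hybrid).
import numpy as np

def audio_to_hex_text(samples: np.ndarray) -> str:
    """Map samples in [-1, 1] to 4-bit levels (0..15) and emit hex digits."""
    levels = np.clip(((samples + 1.0) / 2.0 * 15).round().astype(int), 0, 15)
    return "".join(format(v, "x") for v in levels)

def hex_text_to_audio(text: str) -> np.ndarray:
    """Inverse mapping back to approximate samples in [-1, 1]."""
    levels = np.array([int(c, 16) for c in text])
    return levels / 15.0 * 2.0 - 1.0

tone = np.sin(2 * np.pi * 440 * np.arange(0, 0.01, 1 / 8000))   # 10 ms test tone
text = audio_to_hex_text(tone)
print(text[:32], "...")
print("max reconstruction error:",
      float(np.max(np.abs(hex_text_to_audio(text) - tone))))
```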

Publication Date: Mon Jan 01 2018
Journal Name: International Journal of Data Mining, Modelling and Management
Association rules mining using cuckoo search algorithm

Association rules mining (ARM) is a fundamental and widely used data mining technique for obtaining useful information about data. Traditional ARM algorithms degrade computational efficiency by mining too many association rules that are not appropriate for a given user. Recent research in ARM investigates the use of metaheuristic algorithms that look for only a subset of high-quality rules. In this paper, a modified discrete cuckoo search algorithm for association rules mining (DCS-ARM) is proposed for this purpose. The effectiveness of our algorithm is tested against a set of well-known transactional databases. Results indicate that the proposed algorithm outperforms the existing metaheuristic methods.
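The sketch below shows the kind of rule-quality (fitness) function that a metaheuristic such as discrete cuckoo search would optimise: a weighted combination of a candidate rule's support and confidence over a transactional database. The toy transactions and the equal weights are assumptions; the DCS-ARM search operators themselves are not shown.

```python
# Rule fitness sketch: support and confidence of a candidate rule (toy data).
transactions = [
    {"milk", "bread", "butter"},
    {"bread", "butter"},
    {"milk", "bread"},
    {"milk", "butter"},
    {"bread"},
]

def support(itemset: set) -> float:
    return sum(itemset <= t for t in transactions) / len(transactions)

def fitness(antecedent: set, consequent: set,
            w_sup: float = 0.5, w_conf: float = 0.5) -> float:
    sup_rule = support(antecedent | consequent)
    sup_ante = support(antecedent)
    conf = sup_rule / sup_ante if sup_ante else 0.0
    return w_sup * sup_rule + w_conf * conf      # weighted rule quality

print(fitness({"bread"}, {"butter"}))            # score for the rule bread -> butter
```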

Publication Date: Wed Mar 30 2022
Journal Name: Journal of Economics and Administrative Sciences
Euro Dinar Trading Analysis Using WARIMA Hybrid Model

The rise in the general level of prices in Iraq makes local commodities less able to compete with other commodities, which leads to an increase in imports and a decrease in exports, since it raises demand for foreign currencies while decreasing demand for the local currency; this in turn lowers the exchange rate of the local currency against foreign currencies. This is one of the most important factors affecting the determination of the exchange rate and its fluctuations. This research deals with the European euro and its behaviour against the Iraqi dinar. To make an accurate prediction for any process, modern methods can be used through which…
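As a rough sketch of what a wavelet-ARIMA (WARIMA) hybrid of the kind named in the title might look like, the code below denoises a placeholder exchange-rate series with a discrete wavelet transform and fits an ARIMA model to the smoothed series for forecasting. The synthetic series, the db4 wavelet, the thresholding rule, and the ARIMA(1,1,1) order are all assumptions; the paper's actual model settings are not given in the excerpt.

```python
# Wavelet-denoise a series, then fit ARIMA: a WARIMA-style sketch with assumed settings.
import numpy as np
import pywt
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
rate = 1300 + np.cumsum(rng.normal(0, 2, size=256))    # placeholder EUR/IQD series

coeffs = pywt.wavedec(rate, "db4", level=2)             # wavelet decomposition
threshold = np.std(coeffs[-1]) * np.sqrt(2 * np.log(len(rate)))
coeffs[1:] = [pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]]
smoothed = pywt.waverec(coeffs, "db4")[: len(rate)]     # denoised series

model = ARIMA(smoothed, order=(1, 1, 1)).fit()          # ARIMA on the smoothed series
print(model.forecast(steps=5))                          # 5-step-ahead forecast
```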

Publication Date: Tue Jan 18 2022
Journal Name: Iraqi Journal of Science
Fast Text Analysis Using Symbol Enumeration and Hashing Methodology

This paper focuses on reducing the time of text processing operations by taking advantage of enumerating each string using a multi-hashing methodology. Text analysis is an important subject for any system that deals with strings (sequences of characters from an alphabet) and text processing (e.g., word processors, text editors and other text manipulation systems). Many problems arise when dealing with string operations on an unfixed number of characters (e.g., long execution time); this is due to the overhead of embedded operations (such as symbol matching and conversion operations). The execution time largely depends on the string characteristics, especially its length (i.e., the number of characters consisting…
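As a small illustration of the enumeration idea, the sketch below assigns each distinct string an integer identifier through a hash-based table, so later comparisons and counts operate on integers rather than character-by-character matching. The single dictionary-backed table is an assumption standing in for the paper's multi-hashing scheme.

```python
# Symbol enumeration sketch: strings are interned once into integer IDs.
class SymbolTable:
    def __init__(self) -> None:
        self._ids = {}        # string -> integer ID (hash-based lookup)
        self._symbols = []    # ID -> string, for reverse lookup

    def enumerate(self, s: str) -> int:
        """Return the ID for s, assigning a new one on first sight."""
        if s not in self._ids:
            self._ids[s] = len(self._symbols)
            self._symbols.append(s)
        return self._ids[s]

table = SymbolTable()
tokens = "the cat sat on the mat the cat".split()
ids = [table.enumerate(t) for t in tokens]
print(ids)                    # [0, 1, 2, 3, 0, 4, 0, 1]
# Counting or comparing tokens now costs integer operations, not string matching.
```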

Publication Date: Tue Dec 01 2020
Journal Name: Journal of Engineering
A Case Study of Bus Line Passenger Volumes of Bakrajo Bus Lines in Sulaimani City

Transit agencies constantly need information about system operations and passengers to support their regular scheduling and operation planning processes. The lack of these processes, together with weak cultural motivation to use public transportation, contributes enormously to the reliance on private cars rather than public transport, resulting in traffic congestion. Traffic congestion occurs mainly during peak hours and as a result of road accidents and construction works. This study investigates the effects of weekday and weekend travel variability on peak hours of the passenger flow distribution on bus lines, which can effectively reflect the degree of traffic congestion. A study of passen…
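A minimal sketch of the kind of analysis described above: boarding records are grouped by day type and hour so that the share of daily passengers carried in each hour, i.e. the passenger flow distribution and its peaks, can be compared between weekdays and weekends. The toy records are placeholders for the Bakrajo bus-line survey data.

```python
# Peak-hour passenger flow distribution sketch with placeholder boarding records.
import pandas as pd

records = pd.DataFrame({
    "day_type":  ["weekday"] * 4 + ["weekend"] * 4,
    "hour":      [7, 8, 13, 17, 9, 11, 14, 18],
    "boardings": [120, 180, 60, 150, 40, 55, 70, 65],
})

hourly = records.groupby(["day_type", "hour"])["boardings"].sum()
share = hourly / hourly.groupby(level="day_type").transform("sum")   # flow distribution
print(share.round(2))    # fraction of daily passengers carried in each hour
```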

Publication Date: Sat Dec 01 2018
Journal Name: Journal of Economics and Administrative Sciences
Comparison of the partial least squares method and the singular value decomposition algorithm to estimate the parameters of the logistic regression model in the case of the multicollinearity problem, using simulation

The logistic regression model is an important statistical model that shows the relationship between a binary variable and the explanatory variables. The large number of explanatory variables usually used to explain the response leads to the problem of multicollinearity between them, which makes the estimation of the model parameters inaccurate.
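To illustrate the problem being addressed, the sketch below fits a logistic regression directly on strongly collinear simulated predictors and again on a reduced set of singular components obtained by truncated SVD, which is one way to stabilise the estimates. The simulated data, the two retained components, and the scikit-learn estimators are illustrative assumptions, not the paper's PLS or SVD procedures.

```python
# Logistic regression under multicollinearity: raw features vs. SVD-reduced components.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
z = rng.normal(size=(300, 2))
X = np.column_stack([z[:, 0],
                     z[:, 0] + 0.01 * rng.normal(size=300),   # nearly collinear column
                     z[:, 1]])
y = (z[:, 0] + z[:, 1] + rng.normal(scale=0.5, size=300) > 0).astype(int)  # binary response

plain = LogisticRegression().fit(X, y)
svd_logit = make_pipeline(TruncatedSVD(n_components=2), LogisticRegression()).fit(X, y)
print("plain accuracy:", plain.score(X, y),
      " SVD-reduced accuracy:", svd_logit.score(X, y))
```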

Publication Date: Fri Dec 30 2016
Journal Name: Al-Kindy College Medical Journal
Deep Vein Thrombosis Predisposing Factors Analysis Using Association Rules Mining

Background: DVT is a very common problem with very serious complications, like pulmonary embolism (PE), which carries a high mortality, and many other chronic and annoying complications (like chronic DVT, post-phlebitic syndrome, and chronic venous insufficiency), and it has many risk factors that affect its course, severity, and response to treatment. Objectives: Most of those risk factors are modifiable, and a better understanding of the relationships between them can be beneficial for better assessment of liable patients, prevention of disease, and the effectiveness of our treatment modalities. The male to female ratio was nearly equal, so we did not discuss gender among the other risk factors. Type of the study: A cross-secti…
