The Internet of Things (IoT) contributes to improving the quality of life by supporting many applications, especially healthcare systems. Data generated by IoT devices is sent to Cloud Computing (CC) for processing and storage, despite the latency caused by the distance. Because of the revolution in IoT devices, the volume of data sent to CC has been increasing; as a result, growing congestion on the cloud network was added to the latency problem. Fog Computing (FC) was used to solve these problems because of its proximity to IoT devices, while filtering the data sent on to the CC. FC is a middle layer located between the IoT devices and the CC layer. Because of the massive data generated by IoT devices on FC, the Dynamic Weighted Round Robin (DWRR) algorithm, a load balancing (LB) algorithm, was applied to schedule and distribute data among fog servers by reading the CPU and memory values of these servers in order to improve system performance. The results proved that the DWRR algorithm provides high throughput, reaching 3290 req/sec at 919 users. Much research is concerned with distributing the workload using LB techniques without paying much attention to Fault Tolerance (FT), which implies that the system continues to operate even when a fault occurs. Therefore, we proposed a replication FT technique, called primary-backup replication, based on a dynamic checkpoint interval on FC. The checkpoint was used to replicate new data from a primary server to a backup server dynamically by monitoring the CPU values of the primary fog server, so that a checkpoint occurs only when the CPU value is larger than 0.2, in order to reduce overhead. The results showed that the execution time of the data filtering process on the FC with a dynamic checkpoint is less than the time spent in the case of a static checkpoint, which is independent of the CPU status.
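The dynamic-checkpoint rule described above can be sketched as follows (a minimal illustration, not the paper's implementation; the function and state names are hypothetical, and only the 0.2 CPU threshold comes from the text):

```python
# Sketch: checkpoint from primary to backup only when CPU load exceeds 0.2.
# Names (should_checkpoint, replicate, the dict-based state) are hypothetical.

CPU_THRESHOLD = 0.2  # threshold stated in the abstract

def should_checkpoint(cpu_load: float) -> bool:
    """Return True when the primary fog server's CPU load warrants a checkpoint."""
    return cpu_load > CPU_THRESHOLD

def replicate(primary_state: dict, backup_state: dict, cpu_load: float) -> bool:
    """Copy new entries from primary to backup only when the rule fires."""
    if not should_checkpoint(cpu_load):
        return False  # skip: avoids checkpoint overhead at low load
    backup_state.update(primary_state)
    return True
```

At low load the replication call returns without copying anything, which is the source of the overhead reduction the abstract reports.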
Improving performance is an important issue in Wireless Sensor Networks (WSN), which have many limitations, including network performance. The research question is how to reduce the amount of data transmitted in order to improve network performance.
This work uses one of the dictionary compression methods, Lempel-Ziv-Welch (LZW). One problem with the dictionary method is that the token size is fixed. The LZW dictionary method is not very useful with little data, because it loses many bytes …
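A minimal sketch of the classic LZW dictionary method referenced above (the standard textbook algorithm, not the paper's exact variant); it also illustrates why short inputs gain little, since the dictionary has no time to grow:

```python
def lzw_compress(data: bytes) -> list[int]:
    """Classic LZW: emit integer tokens indexing a dictionary that grows
    as repeated byte sequences are observed."""
    dictionary = {bytes([i]): i for i in range(256)}  # seed with single bytes
    next_code = 256
    w = b""
    out = []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc                      # extend the current match
        else:
            out.append(dictionary[w])   # emit token for the longest match
            dictionary[wc] = next_code  # learn the new sequence
            next_code += 1
            w = bytes([byte])
    if w:
        out.append(dictionary[w])
    return out
```

For example, `lzw_compress(b"ABABABA")` yields 4 tokens for 7 input bytes, while the 2-byte input `b"AB"` still needs 2 tokens, showing the poor payoff on little data.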
Frequent data in weather records is essential for forecasting, numerical model development, and research, but data recording interruptions may occur for various reasons. This study therefore aims to find a way to treat these missing data and to assess their accuracy by comparing them with the original data values. The mean method was used to treat daily and monthly missing temperature data. The results show that, when treating the monthly temperature data for the stations (Baghdad, Hilla, Basra, Nasiriya, and Samawa) in Iraq over the whole period (1980-2020), the percentage of matching between the original and the treated values did not exceed 80%. The period was therefore divided into four sub-periods, and it was noted that most of the congruence values increased …
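The mean method referred to above can be sketched as follows (a simplified illustration with made-up temperature values; the study's actual data handling may differ):

```python
def fill_missing_with_mean(series: list) -> list:
    """Mean method: replace missing readings (None) with the mean of the
    recorded values in the same series."""
    observed = [v for v in series if v is not None]
    mean = sum(observed) / len(observed)  # mean of available data only
    return [mean if v is None else v for v in series]
```

Accuracy can then be checked, as in the study, by comparing the filled values against withheld original values.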
At the analytical level, this paper aims to identify the security topics used in data journalism and the expressive methods used in the statements of the Security Media Cell, as well as the means of clarification used in data journalism about the Security Media Cell, the methods the public prefers for the presentation of press releases, and, in particular, the strength of the respondents' attitude towards the data issued by the Security Media Cell. The analytical study was applied to the Security Media Cell, while the field study included the distribution of a questionnaire to the public of Baghdad Governorate. The study reached several results, the most important of which is the interest of the Security Media Cell in presenting its data in different …
Investigating geotechnical vulnerability (liquefaction) and zonation of the southern region of the Caspian Sea is the main aim of this study, given the area's potential for destructive earthquake hazards. Past geologic events on the southern coast of the Caspian Sea indicate that destructive earthquakes have led to the deaths of a number of people in this area. The remaining evidence of seismic events indicates extensive landslides, liquefaction, and soil subsidence in residential and even natural areas. Therefore, this study determines the intensity of geotechnical vulnerability (liquefaction) on the southern coast of the Caspian Sea against natural forces resulting from earthquakes and coastal construction via a geographic information system …
Diabetes is one of the increasingly common chronic diseases, affecting millions of people around the world. Diagnosis, prediction, proper treatment, and management of diabetes are essential. Machine learning-based prediction techniques for diabetes data analysis can help in the early detection and prediction of the disease and its consequences, such as hypo/hyperglycemia. In this paper, we explored a diabetes dataset collected from the medical records of one thousand Iraqi patients. We applied three classifiers: the multilayer perceptron, k-nearest neighbors (KNN), and Random Forest. We conducted two experiments: the first experiment used all 12 features of the dataset, where Random Forest outperformed the others with 98.8% accuracy. The second experiment used only five attributes …
... Show MoreSubstantial research has been performed on Building Information Modeling (BIM) in various topics, for instance, the use and benefit of BIM in design, construction, sustainable environment building, and Facility assets over the past several years. Although there are various studies on these topics, Building Information Modeling (BIM) awareness through facilities management is still relatively poor. The researcher's interest is increased in BIM study is based heavily upon the perception that it can facilitate the exchange and reuse of information during various project phases. This property and others can be used in the Iraqi Construction industry to motivate the government to eliminate the change resistance to use innovat
... Show MoreThis paper demonstrates the construction of a modern generalized Exponential Rayleigh distribution by merging two distributions with a single parameter. The "New generalized Exponential-Rayleigh distribution" specifies joining the Reliability function of exponential pdf with the Reliability function of Rayleigh pdf, and then adding a shape parameter for this distribution. Finally, the mathematical and statistical characteristics of such a distribution are accomplished
This paper presents an IoT smart building platform with fog and cloud computing capable of performing near real-time predictive analytics in fog nodes. The researchers explained thoroughly the internet of things in smart buildings, the big data analytics, and the fog and cloud computing technologies. They then presented the smart platform, its requirements, and its components. The datasets on which the analytics will be run will be displayed. The linear regression and the support vector regression data mining techniques are presented. Those two machine learning models are implemented with the appropriate techniques, starting by cleaning and preparing the data visualization and uncovering hidden information about the behavior of
... Show MoreThe m-consecutive-k-out-of-n: F linear and circular system consists of n sequentially connected components; the components are ordered on a line or a circle; it fails if there are at least m non-overlapping runs of consecutive-k failed components. This paper proposes the reliability and failure probability functions for both linearly and circularly m-consecutive-k-out-of-n: F systems. More precisely, the failure states of the system components are separated into two collections (the working and the failure collections); where each one is defined as a collection of finite mutual disjoint classes of the system states. Illustrative example is provided.
In this paper, we study a single stress-strength reliability system , where Ƹ and ƴ are independently Exponentiated q-Exponential distribution. There are a few traditional estimating approaches that are derived, namely maximum likelihood estimation (MLE) and the Bayes (BE) estimators of R. A wide mainframe simulation is used to compare the performance of the proposed estimators using MATLAB program. A simulation study show that the Bayesian estimator is the best estimator than other estimation method under consideration using two criteria such as the “mean squares error (MSE)” and “mean absolutely error (MAPE)”.