The Internet of Things (IoT) contributes to improving the quality of life, as it supports many applications, especially healthcare systems. Data generated by IoT devices is sent to Cloud Computing (CC) for processing and storage, despite the latency caused by the distance. Because of the revolution in IoT devices, the volume of data sent to the CC has been increasing; as a result, growing congestion on the cloud network became a problem in addition to the latency. Fog Computing (FC) was used to solve these problems because of its proximity to IoT devices: FC is a middle layer located between the IoT devices and the CC layer that filters the data sent on to the CC. To handle the massive data generated by IoT devices on the FC, the Dynamic Weighted Round Robin (DWRR) algorithm was used: a load balancing (LB) algorithm applied to schedule and distribute data among fog servers by reading the CPU and memory values of these servers in order to improve system performance. The results proved that the DWRR algorithm provides high throughput, reaching 3290 req/sec at 919 users. Much research is concerned with distributing workload using LB techniques without paying much attention to Fault Tolerance (FT), which means that the system continues to operate even when a fault occurs. Therefore, we proposed a replication FT technique, primary-backup replication based on a dynamic checkpoint interval, on FC. Checkpointing was used to replicate new data from a primary server to a backup server dynamically by monitoring the CPU value of the primary fog server, so that a checkpoint occurs only when the CPU value is larger than 0.2, reducing overhead. The results showed that the execution time of the data filtering process on the FC with a dynamic checkpoint is less than the time spent with a static checkpoint that is independent of the CPU status.
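The two mechanisms described in this abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the weighting formula, server names, and data structures are assumptions; only the 0.2 CPU threshold comes from the abstract.

```python
def dwrr_weights(servers):
    """Weight each fog server by its free CPU and memory capacity.
    (A plausible weighting; the exact formula is not given in the abstract.)"""
    return {name: (1.0 - s["cpu"]) + (1.0 - s["mem"]) for name, s in servers.items()}

def dwrr_round(servers, quantum=10):
    """Build one dispatch round: each server receives request slots in
    proportion to its weight, so lightly loaded servers take more requests."""
    weights = dwrr_weights(servers)
    total = sum(weights.values())
    order = []
    for name in sorted(weights):
        order += [name] * max(1, round(quantum * weights[name] / total))
    return order

CPU_THRESHOLD = 0.2  # from the abstract: checkpoint only above this CPU value

def maybe_checkpoint(primary, backup, cpu_load, threshold=CPU_THRESHOLD):
    """Primary-backup replication with a dynamic checkpoint interval:
    copy new data to the backup only when the primary's CPU value exceeds
    the threshold, reducing replication overhead."""
    if cpu_load > threshold:
        backup.update(primary)  # replicate new/changed data
        return True             # checkpoint taken
    return False                # skipped
```

With two servers at 90% and 10% utilization, one dispatch round sends most requests to the lightly loaded server, and checkpoints are skipped while the primary's CPU value stays at or below 0.2.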
Machine learning-based techniques are widely used for classifying images into various categories. The advancement of the Convolutional Neural Network (CNN) has affected the field of computer vision on a large scale, and CNNs have been applied to classify and localize objects in images. Among their fields of application, CNNs have been applied to make sense of the huge volumes of unstructured astronomical data being collected every second. Galaxies have diverse and complex shapes, and their morphology carries fundamental information about the whole universe. Studying these galaxies has been a tremendous task for researchers around the world. Researchers have already applied some basic CNN models to predict the morphological classes
COVID-19 is a disease caused by a coronavirus that spread globally, including in Iraq; infections have appeared across all Iraqi lands in varying proportions, and Iraq is among the more heavily infected countries of the world. Forty-six infections were simulated on 23 March 2020, located on the eastern side of Baghdad city, on the right side of the Tigris River. The river, which divides the city into two parts, forms a natural barrier for quarantine and makes it easy to control the movement of people between the two sides.
In this study, a model was developed as a scientific and practical method, following the steps of identifying infected people and using the best scientific approach to the spatial process in order to prevent the virus from spreading. Remote sensing techniques were
Accurate predictive tools for VLE calculation are always needed. A new method is introduced for VLE calculation that is very simple to apply and gives very good results compared with previously used methods. It does not need any physical property; each binary system needs only two constants. The method can be applied to calculate VLE data for any binary system of any polarity or from any group family, provided the binary system does not form an azeotrope. The new method is also extended to cover a range of temperatures; this extension requires nothing except applying the proposed form with the system's two constants. The method and its development are applied to 56 binary mixtures with 1120 equili
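The abstract does not give the method's functional form. As an illustration of how a two-constant binary VLE model is typically used, the sketch below implements the classic two-parameter Margules activity-coefficient model with modified Raoult's law; this is an assumption for illustration, not the paper's method.

```python
from math import exp

def margules_gamma(x1, A12, A21):
    """Two-constant Margules activity coefficients for a binary mixture."""
    x2 = 1.0 - x1
    g1 = exp(x2 ** 2 * (A12 + 2.0 * (A21 - A12) * x1))
    g2 = exp(x1 ** 2 * (A21 + 2.0 * (A12 - A21) * x2))
    return g1, g2

def bubble_point(x1, A12, A21, p1_sat, p2_sat):
    """Modified Raoult's law: total pressure and vapour-phase composition y1
    at a given liquid composition x1 and pure-component vapour pressures."""
    g1, g2 = margules_gamma(x1, A12, A21)
    p = x1 * g1 * p1_sat + (1.0 - x1) * g2 * p2_sat
    y1 = x1 * g1 * p1_sat / p
    return p, y1
```

When both constants are zero, the model reduces to an ideal mixture (Raoult's law); at infinite dilution the activity coefficients reduce to exp(A12) and exp(A21).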
... Show MoreIn this paper, we used the maximum likelihood estimation method to find the estimation values ​​for survival and hazard rate functions of the Exponential Rayleigh distribution based on a sample of the real data for lung cancer and stomach cancer obtained from the Iraqi Ministry of Health and Environment, Department of Medical City, Tumor Teaching Hospital, depending on patients' diagnosis records and number of days the patient remains in the hospital until his death.
In this paper, the reliability of the stress-strength model is derived for the probability P(Y<X) that a component, having strength X, withstands a single independent stress Y, when X and Y follow the Gompertz Fréchet distribution with unknown shape parameters and the remaining parameters known. Different methods were used to estimate the reliability R and the Gompertz Fréchet distribution parameters: maximum likelihood, least squares, weighted least squares, regression, and ranked set sampling. These estimators were also compared in a simulation study based on the mean square error (MSE) criterion. The comparison confirms that the performance of the maximum likelihood estimator is better than that of the other estimators.
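The quantity R = P(Y<X) can be estimated generically by Monte Carlo simulation given any pair of samplers. The sketch below uses exponential placeholders (where R has the closed form b/(a+b) for strength rate a and stress rate b) rather than the Gompertz Fréchet distribution of the paper; all names are illustrative.

```python
import random

def stress_strength_reliability(sample_strength, sample_stress, n=200_000, seed=42):
    """Monte Carlo estimate of R = P(Y < X): the component survives a trial
    when its strength X exceeds the applied stress Y."""
    rng = random.Random(seed)
    survive = sum(sample_stress(rng) < sample_strength(rng) for _ in range(n))
    return survive / n
```

For X ~ Exp(rate 1) and Y ~ Exp(rate 2), the exact reliability is 2/(1+2) = 2/3, which the simulation reproduces to within sampling error.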
This research is a theoretical study that presents the literature of statistical analysis from a gender perspective, or what is called Engendering Statistics. The researcher relied on a number of UN reports as well as some foreign sources to conduct the study. Gender statistics are defined as statistics that reflect the differences and inequality in the status of women and men across all domains of life; their importance stems from the fact that they are an important tool for promoting equality as a necessity for the process of sustainable development and for the formulation of effective national development policies and programs. The empowerment of women and the achievement of equality between men and wome
This paper discusses estimating the two scale parameters of the Exponential-Rayleigh distribution for singly Type-I censored data, one of the most important kinds of right-censored data, using the maximum likelihood estimation method (MLEM), one of the most popular and widely used classical methods. An iterative procedure such as Newton-Raphson is used to find estimates of these two scale parameters, using real COVID-19 data taken from the Iraqi Ministry of Health and Environment, AL-Karkh General Hospital. The duration of the study was the interval 4/5/2020 until 31/8/2020, equivalent to 120 days, during which the number of patients who entered the (study) hospital gave a sample size of n = 785. The number o
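The abstract does not give the Exponential-Rayleigh likelihood, so as a minimal illustration of the Newton-Raphson step for a censored-data MLE, the sketch below solves the score equation for the one-parameter exponential rate under Type-I censoring, where the closed-form answer (deaths divided by total observation time) is known. All names are illustrative.

```python
def newton_raphson_mle(deaths, total_time, lam0=0.1, tol=1e-10, max_iter=100):
    """Newton-Raphson for the exponential rate lam under Type-I censoring:
    score(lam) = deaths/lam - total_time, where total_time is the sum of
    all observation times (failure or censoring).  The starting value must
    satisfy lam0 < 2 * deaths / total_time for this iteration to converge
    (it is the classic reciprocal iteration in disguise)."""
    lam = lam0
    for _ in range(max_iter):
        score = deaths / lam - total_time    # first derivative of log-likelihood
        hessian = -deaths / lam ** 2         # second derivative
        step = score / hessian
        lam -= step                          # Newton update
        if abs(step) < tol:
            break
    return lam
```

With 50 deaths over a total of 200 patient-days, the iteration converges to the closed-form MLE 50/200 = 0.25.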
This paper presents a method for organizing memory chips when they are used to build memory systems with a word size wider than 8 bits. Most memory chips have an 8-bit word size. When a memory system has to be built from several memory chips of various sizes, this method gives all possible organizations of these chips in the memory system. The paper also suggests a precise definition of the term "memory bank", which is commonly used in memory systems. Finally, an illustrative design problem is worked through to demonstrate the presented method practically.
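The standard textbook organization of identical chips into a wider and deeper system can be sketched as below; this illustrates the idea only, and is not the paper's enumeration of all possible organizations from chips of various sizes.

```python
def memory_organization(system_words, system_width, chip_words, chip_width):
    """How identical memory chips combine into a larger system:
    chips placed side by side widen the data word, and banks of such
    chips stacked in the address space deepen it."""
    assert system_width % chip_width == 0, "word width must be a multiple of chip width"
    assert system_words % chip_words == 0, "depth must be a multiple of chip depth"
    chips_per_bank = system_width // chip_width   # widens the data bus
    banks = system_words // chip_words            # extends the address range
    return chips_per_bank, banks, chips_per_bank * banks
```

For example, a 64K x 16 system built from 16K x 8 chips needs 2 chips per bank (to reach the 16-bit word) and 4 banks (to reach 64K words), 8 chips in total.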
Tectonically, the Al-Ma'aniyah depression area lies far from active boundary zones; its tectonic features therefore reflect the original depositional environments, with some horizontal movement due to rearrangement of the basement blocks during different active orogenic movements. Accordingly, aeromagnetic data were analyzed to estimate the thickness and structural pattern of the sedimentary cover sequences of this area. The aeromagnetic data for the Al-Ma′aniyah region, obtained from the Iraqi GEOSURV, were analyzed and processed for qualitative and quantitative interpretation. The processing includes reducing the aeromagnetic data to the pole (RTP) and separating the aeromagnetic data into regional an
In real situations, all observations and measurements are not exact numbers but are more or less non-exact, also called fuzzy. So, in this paper, we use approximate non-Bayesian computational methods to estimate the inverse Weibull parameters and reliability function with fuzzy data. The maximum likelihood and moment estimates are obtained as non-Bayesian estimates. The maximum likelihood estimators have been derived numerically based on two iterative techniques, namely the "Newton-Raphson" and the "Expectation-Maximization" techniques. In addition, we provide a numerical comparison through a Monte-Carlo simulation study of the obtained estimates of the parameters and reliability function i
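The inverse Weibull reliability function and a sampler for simulation studies can be sketched as follows, assuming the common parameterization F(t) = exp(-(alpha/t)^beta); the paper's parameterization and its fuzzy-data treatment are not given in the abstract, so this is an illustration only.

```python
import math
import random

def inv_weibull_reliability(t, alpha, beta):
    """Inverse Weibull (Fréchet) reliability: R(t) = 1 - F(t) = 1 - exp(-(alpha/t)**beta)."""
    return 1.0 - math.exp(-((alpha / t) ** beta))

def sample_inv_weibull(rng, alpha, beta):
    """Inverse-CDF sampling: solving exp(-(alpha/x)**beta) = u for x
    gives x = alpha * (-ln u)**(-1/beta)."""
    u = rng.random()
    return alpha * (-math.log(u)) ** (-1.0 / beta)
```

A quick Monte-Carlo check: the fraction of simulated lifetimes exceeding t should match R(t), e.g. R(1) = 1 - e^{-1} for alpha = 1, beta = 2.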