In this article, a numerical method integrated with a statistical simulation technique is introduced to solve a nonlinear system of ordinary differential equations with multiple random variable coefficients. Monte Carlo simulation combined with the central divided difference formula of the finite difference (FD) method is repeated n times to simulate the variable coefficients as random samples, instead of restricting them to fixed real values over time. The mean of the n final solutions of this integrated technique, abbreviated as the mean Monte Carlo finite difference (MMCFD) method, is taken as the final solution of the system. The method is proposed for the first time to calculate the numerical solution of each subpopulation as a vector distribution. The numerical outputs are tabulated, graphed, and compared with previous statistical estimations for 2013, 2015, and 2030. The FD and MMCFD solutions are found to be in good agreement, with small standard deviations of the means and a small measure of difference. The new MMCFD method is useful for predicting intervals of random distributions for the numerical solutions of this epidemiology model, with better approximation and agreement between existing statistical estimations and FD numerical solutions.
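The MMCFD idea above can be sketched in a few lines: sample the random coefficients, integrate the system with a central divided difference (leapfrog) scheme, and average the n final solutions. The SI-type model, the coefficient distributions, and the step sizes below are illustrative assumptions, not the paper's actual epidemiology model:

```python
import numpy as np

rng = np.random.default_rng(0)

def solve_fd(beta, gamma, h=0.01, steps=500):
    """Integrate a toy SI-type system with the central divided difference
    (leapfrog) scheme: y[n+1] = y[n-1] + 2h*f(y[n]).
    dS/dt = -beta*S*I,  dI/dt = beta*S*I - gamma*I  (illustrative model)."""
    def f(y):
        S, I = y
        return np.array([-beta * S * I, beta * S * I - gamma * I])
    y = np.empty((steps + 1, 2))
    y[0] = [0.99, 0.01]
    y[1] = y[0] + h * f(y[0])          # one Euler step to start the scheme
    for n in range(1, steps):
        y[n + 1] = y[n - 1] + 2.0 * h * f(y[n])
    return y

def mmcfd(n_runs=100):
    """Repeat the FD solve with freshly sampled random coefficients; the
    mean of the n final solutions is the MMCFD solution, and the standard
    deviation of those finals measures the spread of the distribution."""
    finals = np.array([solve_fd(beta=rng.normal(0.5, 0.05),
                                gamma=rng.normal(0.1, 0.01))[-1]
                       for _ in range(n_runs)])
    return finals.mean(axis=0), finals.std(axis=0)

mean_sol, std_sol = mmcfd()
```

Running the same FD scheme once with the coefficients fixed at their means recovers the deterministic comparison solution the abstract mentions.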
Background: War represents a major human crisis; it destroys communities and has ingrained consequences for public health and well-being.
Objective: We designed this study to shed light on the public health status in Iraq after the successive wars, sanctions, sectarian conflicts, and terrorism, in light of certain health indicators.
Design: The primary sources of data for this analysis are the Iraqi Ministry of Health and the World Health Organization's disease surveillance.
Results: Most of the morbidity indicators are high; even those that have declined recently are still higher than those reported …
Dust is a frequent contributor to health risks and to changes in the climate, and it is one of the most dangerous issues facing people today. Desertification, drought, agricultural practices, and sand and dust storms from neighboring regions bring on this issue. A deep learning (DL) regression based on long short-term memory (LSTM) was proposed to increase the accuracy of dust forecasting and monitoring. The proposed system has two parts for detecting and monitoring dust: in the first step, LSTM and dense layers are used to build a dust-detection system, while in the second step, the proposed Wireless Sensor Network (WSN) and Internet of Things (IoT) model is used for forecasting and monitoring. The experimental DL system …
Binary pseudonoise (PN) sequences with specific properties, namely long period, high complexity, randomness, and minimal cross- and auto-correlation, are essential for some communication systems. In this research, a nonlinear PN generator is introduced. It consists of a combination of basic components such as a Linear Feedback Shift Register (LFSR) and a ?-element, which is a type of RxR crossbar switch. The period and complexity of the sequences generated by the proposed generator are computed, and the randomness properties of these sequences are measured by well-known randomness tests.
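The linear building block of such a generator, the LFSR, together with a brute-force period check, can be sketched as follows; the nonlinear crossbar combination is not reproduced, and the 4-stage register with primitive feedback polynomial x^4 + x + 1 is an assumption chosen for illustration:

```python
def lfsr_sequence(taps, state, length):
    """Fibonacci LFSR over GF(2): `taps` are 0-based stage positions XORed
    to form the feedback bit; `state` is the initial register contents."""
    out = []
    reg = list(state)
    for _ in range(length):
        out.append(reg[-1])              # output the last stage
        fb = 0
        for t in taps:
            fb ^= reg[t]
        reg = [fb] + reg[:-1]            # shift right, insert feedback bit
    return out

def period(seq):
    """Smallest p such that the observed sequence repeats with period p."""
    n = len(seq)
    for p in range(1, n):
        if all(seq[i] == seq[i + p] for i in range(n - p)):
            return p
    return n

# Feedback from stages 2 and 3 realizes x^4 + x + 1 (primitive), so a
# nonzero seed yields a maximal m-sequence of period 2^4 - 1 = 15.
seq = lfsr_sequence(taps=[2, 3], state=[1, 0, 0, 0], length=45)
# period(seq) returns 15; one period contains 2^(4-1) = 8 ones
```

The m-sequence balance property (8 ones in a period of 15) is one of the well-known randomness tests the abstract alludes to.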
The main challenge is to protect the environment from future deterioration due to pollution and the depletion of natural resources. One of the most important things to pay attention to, therefore, is solid waste and the elimination of its negative impact. Solid waste is a double-edged sword depending on how it is handled: neglecting it causes serious environmental risk through water, air, and soil pollution, while dealing with it in the right way makes it an important resource for preserving the environment. Accordingly, the proper management of solid waste and its reuse or recycling is the most important factor. Attention has therefore been drawn to using solid waste in different ways, and the most common way is to use it as an alternative …
Abstract:
This research aims to compare the Bayesian method and full maximum likelihood for estimating a hierarchical Poisson regression model.
The comparison was carried out by simulation using different sample sizes (n = 30, 60, 120) and different numbers of replications (r = 1000, 5000), with the mean squared error adopted to compare the estimation methods and choose the best way to estimate the model. It was concluded that the hierarchical Poisson regression model estimated by full maximum likelihood with sample size n = 30 best represents the maternal mortality data …
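The comparison protocol described above, simulate r samples of size n, estimate, and rank the methods by mean squared error, can be sketched with a deliberately simplified stand-in: a plain Poisson mean in place of the hierarchical regression model, with the sample mean playing the role of the maximum-likelihood estimator:

```python
import numpy as np

rng = np.random.default_rng(1)

def mse_of(estimator, lam=4.0, n=30, r=1000):
    """Monte Carlo MSE of an estimator of the Poisson mean `lam`:
    the average squared error over r simulated samples of size n."""
    est = np.array([estimator(rng.poisson(lam, size=n)) for _ in range(r)])
    return float(np.mean((est - lam) ** 2))

mse_mean = mse_of(np.mean)        # maximum-likelihood estimator of lam
mse_median = mse_of(np.median)    # a cruder competitor for comparison
# The method with the smaller MSE is preferred, as in the paper's design.
```

With the hierarchical model substituted for `rng.poisson` and the two fitting procedures substituted for the estimators, the same loop reproduces the paper's comparison.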
Cloud computing is a pay-as-you-go model that provides users with on-demand access to services or computing resources. Maximizing the service provider's profit while meeting the users' Quality of Service (QoS) requirements is a challenging issue. Therefore, this paper proposes an admission control heuristic (ACH) approach that selects or rejects requests based on the budget, deadline, and penalty cost given by the user. A service level agreement (SLA) is then created for each selected request. The proposed work uses Particle Swarm Optimization (PSO) and the Salp Swarm Algorithm (SSA) to schedule the selected requests under budget and deadline constraints. The performances of PSO and SSA with and without …
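A minimal version of the PSO kernel used for such scheduling might look like the following; the sphere objective is only a placeholder for the real budget- and deadline-aware schedule cost, and the hyperparameters (w, c1, c2, bounds) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

def pso(cost, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5,
        lo=-5.0, hi=5.0):
    """Minimal particle swarm optimizer: each particle tracks its personal
    best, the swarm tracks a global best, and velocities blend inertia
    with random pulls toward both bests."""
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest = x.copy()
    pbest_cost = np.apply_along_axis(cost, 1, x)
    g = pbest[np.argmin(pbest_cost)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        c = np.apply_along_axis(cost, 1, x)
        improved = c < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], c[improved]
        g = pbest[np.argmin(pbest_cost)].copy()
    return g, float(pbest_cost.min())

# Placeholder objective standing in for the schedule cost function.
best_x, best_cost = pso(lambda z: float(np.sum(z ** 2)), dim=4)
```

In the paper's setting, each particle position would encode a candidate request-to-VM assignment and the cost would penalize budget and deadline violations; SSA differs only in the position-update rule.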
The research aims to improve operational performance through the application of the Holonic Manufacturing System (HMS) in the rubber products factory in Najaf. The problem was diagnosed as the weakness of the factory's manufacturing system in meeting customers' demands on time within the available resources of machines and workers, which led to delays in processing and delivery, increased costs, and reduced flexibility in the factory. A case study methodology was used to identify the reality of the manufacturing system and the actual operational performance in the factory. Simulation was used to represent the proposed HMS, using Excel 2010 based on the actual data, and to calculate the operational performance measures …
This paper deals with estimation of the reliability function and one shape parameter (?) of the two-parameter Burr-XII distribution when the other shape parameter is known (? = 0.5, 1, 1.5), with an initial value of ? = 1, while different sample sizes (n = 10, 20, 30, 50) are used. The results rest on an empirical study in which simulation experiments are applied to compare four methods of estimation, as well as to compute the reliability function. The mean squared error results indicate that the jackknife estimator is better than the other three estimators for all sample sizes and parameter values.
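For reference, the Burr-XII reliability function and the closed-form maximum-likelihood estimator of the unknown shape parameter (when the other is known) can be sketched as below; the parameter values and sample size are illustrative, and the jackknife and other competing estimators from the paper are not reproduced:

```python
import math
import random

random.seed(3)

def burr12_reliability(t, c, k):
    """Burr-XII reliability (survival) function R(t) = (1 + t^c)^(-k)."""
    return (1.0 + t ** c) ** (-k)

def sample_burr12(n, c, k):
    """Inverse-transform sampling: if U ~ Uniform(0,1), then
    t = ((1-U)^(-1/k) - 1)^(1/c) follows Burr-XII(c, k)."""
    return [((1.0 - random.random()) ** (-1.0 / k) - 1.0) ** (1.0 / c)
            for _ in range(n)]

def mle_k(data, c):
    """With c known, the MLE of k has the closed form
    k_hat = n / sum(log(1 + t_i^c))."""
    return len(data) / sum(math.log(1.0 + t ** c) for t in data)

c_true, k_true = 1.5, 1.0
data = sample_burr12(n=50, c=c_true, k=k_true)
k_hat = mle_k(data, c_true)
r_hat = burr12_reliability(1.0, c_true, k_hat)   # estimated R at t = 1
```

Repeating the sample-estimate cycle many times and averaging the squared errors of each estimator gives exactly the MSE comparison the abstract describes.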
Today, large amounts of geospatial data are available on the web, such as Google Maps (GM), OpenStreetMap (OSM), the Flickr service, Wikimapia, and others. All of these services are called open-source geospatial data. Geospatial data from different sources often have variable accuracy due to different data collection methods; therefore, data accuracy may not meet user requirements in every organization. This paper aims to develop a tool to assess the quality of GM data by comparing it with formal data, such as spatial data from the Mayoralty of Baghdad (MB). The tool was developed in the Visual Basic language and validated on two different study areas in Baghdad, Iraq (Al-Karada and Al-Kadhumiyah). The positional accuracy was assessed …
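A positional-accuracy check of this kind reduces to an RMSE over matched point pairs. A minimal sketch with hypothetical coordinates follows; the real tool's point matching and coordinate-system handling are not reproduced:

```python
import math

def positional_rmse(test_pts, ref_pts):
    """Horizontal RMSE between matched point pairs: the square root of the
    mean squared Euclidean distance from each test point to its reference."""
    sq = [(tx - rx) ** 2 + (ty - ry) ** 2
          for (tx, ty), (rx, ry) in zip(test_pts, ref_pts)]
    return math.sqrt(sum(sq) / len(sq))

# Hypothetical coordinates (metres) for three matched road intersections,
# one list from GM and one from the MB reference data.
gm = [(100.0, 200.0), (150.0, 260.0), (300.0, 410.0)]
mb = [(101.0, 199.0), (149.0, 262.0), (302.0, 409.0)]
rmse = positional_rmse(gm, mb)   # -> 2.0 metres for these sample pairs
```

Comparing this RMSE against the accuracy threshold a given organization requires is what decides whether the open-source data is fit for use.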
This paper presents a robust algorithm for assessing risk priority for medical equipment, based on the calculation of static and dynamic risk factors and Kohonen Self-Organizing Maps (SOM). Four risk parameters were calculated for 345 medical devices in two general hospitals in Baghdad: static risk components (equipment function and physical risk) and dynamic risk components (maintenance requirements and risk points). These risk components are used as input to the unsupervised Kohonen self-organizing map. The accuracy of the network was found to be 98% for the proposed system. We conclude that the proposed model gives a fast and accurate assessment of risk priority and works as …
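A minimal Kohonen self-organizing map over such four-component risk vectors can be sketched as follows; the grid size, decay schedules, and random risk data are illustrative assumptions, not the paper's trained network:

```python
import numpy as np

rng = np.random.default_rng(4)

def train_som(data, grid=(4, 4), iters=500, lr0=0.5, sigma0=1.5):
    """Minimal Kohonen SOM: for a random sample, find the best-matching
    unit (BMU), then pull the BMU and its grid neighbours toward that
    sample, with learning rate and radius decaying over time."""
    h, w = grid
    dim = data.shape[1]
    weights = rng.random((h, w, dim))
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w),
                                  indexing="ij"), axis=-1)
    for t in range(iters):
        frac = t / iters
        lr = lr0 * (1.0 - frac)
        sigma = sigma0 * (1.0 - frac) + 1e-3
        x = data[rng.integers(len(data))]
        d = np.linalg.norm(weights - x, axis=-1)
        bmu = np.unravel_index(np.argmin(d), d.shape)
        g = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=-1)
                   / (2.0 * sigma ** 2))
        weights += lr * g[..., None] * (x - weights)
    return weights

def bmu_of(weights, x):
    d = np.linalg.norm(weights - x, axis=-1)
    return np.unravel_index(np.argmin(d), d.shape)

# Hypothetical 4-component risk vectors (equipment function, physical
# risk, maintenance requirements, risk points), scaled to [0, 1].
devices = rng.random((60, 4))
som = train_som(devices)
cluster = bmu_of(som, devices[0])    # grid cell = risk-priority group
```

Devices that map to the same grid cell share a risk-priority group, which is how the SOM turns the four risk components into a priority ranking.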