Studying extreme precipitation is very important in Iraq. In particular, the last decade has witnessed an increasing trend in extreme precipitation as the climate changes, some of which has had disastrous consequences for the social and economic environment in many parts of the country. In this paper a statistical analysis of rainfall data is performed. Annual maximum rainfall data obtained from monthly records for a period of 127 years (1887-2013 inclusive) at the Baghdad meteorological station have been analyzed. The three distributions chosen to fit the data were the Gumbel, the Fréchet, and the generalized extreme value (GEV) distribution. Using the maximum likelihood method, results showed that the GEV distribution provided the best fit, followed by the Fréchet distribution.
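A minimal sketch of this kind of fit, using scipy's maximum-likelihood estimators and an AIC comparison. The sample below is a synthetic placeholder, not the Baghdad series; the three candidate distributions map onto scipy's genextreme, gumbel_r, and invweibull (Fréchet).

```python
# Sketch: fitting Gumbel, Fréchet, and GEV distributions to annual maxima
# by maximum likelihood and comparing them via AIC.
# The sample below is synthetic; the actual Baghdad series is not reproduced here.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
annual_max = rng.gumbel(loc=30.0, scale=12.0, size=127)  # placeholder data (mm)

# GEV: scipy's genextreme covers Gumbel (c = 0) and Fréchet (c < 0) as special cases.
gev_params = stats.genextreme.fit(annual_max)
gum_params = stats.gumbel_r.fit(annual_max)
fre_params = stats.invweibull.fit(annual_max, floc=0)  # Fréchet = inverse Weibull

def aic(dist, params, x):
    """Akaike information criterion: 2k - 2 * log-likelihood (lower is better)."""
    return 2 * len(params) - 2 * np.sum(dist.logpdf(x, *params))

for name, dist, params in [("GEV", stats.genextreme, gev_params),
                           ("Gumbel", stats.gumbel_r, gum_params),
                           ("Frechet", stats.invweibull, fre_params)]:
    print(f"{name:8s} params={np.round(params, 3)}  AIC={aic(dist, params, annual_max):.1f}")
```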
The collapse of building structures during recent earthquakes, especially in Northern and Eastern Kurdistan, including the 2003 earthquake in Cewlig, the 2011 earthquake in Van, and the 2017 earthquake near Halabja province, has raised several concerns about the safety of pre-seismic-code buildings and emergency facilities in Erbil city. The seismic vulnerability assessment of hospital buildings, as emergency facilities, is a necessity because of the critical role they play in the recovery period following earthquakes. This research aims to study in detail, and to extend the present knowledge of, the seismic vulnerability of the Rizgary public hospital building in Erbil city, which was constructed before the release of the seismic code…
The research aims to investigate the reality of the planning and scheduling management process for the implementation and maintenance of irrigation and drainage projects in the Republic of Iraq, indicating the most important obstacles that impede the planning and scheduling of these projects and the ways of addressing them and minimizing their effects. For the purpose of achieving the goal of the research, a sci…
Big data of different types, such as texts and images, is rapidly generated by the internet and other applications. Dealing with such data using traditional methods is not practical, since it arrives in various sizes and types and with different processing-speed requirements. Data analytics has therefore become an important tool for big data applications, because it analyzes and extracts only the meaningful information. This paper presents several innovative methods that use data analytics techniques to improve the analysis process and data management. Furthermore, this paper discusses how the revolution of data analytics based on artificial intelligence algorithms might provide…
Greywater is a possible water source that can be improved to meet the quality required for irrigation. Treatment of greywater can range from uncomplicated coarse filtration to advanced biological treatment. This article presents a simple design for a small-scale greywater treatment plant consisting of a series of physical and natural processes, including screening, aeration, sedimentation, and filtration through a granular activated carbon filter, and compares its performance with that of a sand filter. The performance of these units with the dual filter media (activated carbon with sand) was assessed in treating greywater from an Iraqi house in Baghdad city during 2019, collected from several points including washbasins, kitchen sinks…
Temperature prediction is the task of forecasting the temperature at an upcoming date for a given area. Temperature predictions are made by gathering quantitative data about the current state of the atmosphere. In this study, a hybrid method that combines standard backpropagation with simulated annealing (SA) is proposed to predict the daily maximum and minimum air temperature of Baghdad city. The simulated annealing algorithm is used to optimize the weights of a recurrent multi-layer neural network. Experimental tests were implemented using the maximum and minimum air temperature data for the month of July for Baghdad city, obtained from the local records of the Iraqi Meteorological O…
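A toy sketch of the SA half of such a hybrid: annealing the weight vector of a small feed-forward network (standing in for the paper's recurrent network) against a mean-squared error. The data, architecture, and cooling schedule below are all illustrative assumptions.

```python
# Sketch: simulated annealing (SA) over neural-network weights, in the spirit
# of the hybrid scheme described above. A toy feed-forward net stands in for
# the paper's recurrent network; the inputs and targets are synthetic.
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(100, 3))            # placeholder scaled features
y = (X @ np.array([0.5, 0.3, 0.2]))[:, None]    # placeholder target temperatures

shapes = [(3, 5), (1, 5), (5, 1), (1, 1)]       # W1, b1, W2, b2
sizes = [int(np.prod(s)) for s in shapes]

def unpack(theta):
    parts, i = [], 0
    for s, n in zip(shapes, sizes):
        parts.append(theta[i:i + n].reshape(s)); i += n
    return parts

def mse(theta):
    W1, b1, W2, b2 = unpack(theta)
    h = np.tanh(X @ W1 + b1)                    # hidden layer
    return float(np.mean((h @ W2 + b2 - y) ** 2))

theta = rng.normal(0, 0.5, size=sum(sizes))
best, best_loss = theta.copy(), mse(theta)
T = 1.0
for step in range(5000):
    cand = theta + rng.normal(0, 0.05, size=theta.size)   # random perturbation
    d = mse(cand) - mse(theta)
    if d < 0 or rng.random() < np.exp(-d / T):            # Metropolis acceptance
        theta = cand
        if mse(theta) < best_loss:
            best, best_loss = theta.copy(), mse(theta)
    T *= 0.999                                            # geometric cooling
print(f"best MSE after annealing: {best_loss:.5f}")
```

In a hybrid scheme such as the one described, the annealed weights would typically serve as a starting point that backpropagation then fine-tunes.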
A database is an organized collection of data, arranged and distributed in a way that allows the client to access the stored data simply and conveniently. However, in the era of big data, traditional data analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. This approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
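For illustration, a Hadoop Streaming-style mapper/reducer pair in Python that averages EEG amplitude per channel. The record layout is an assumed "channel<TAB>value" format, not the paper's actual schema.

```python
# Sketch: a Hadoop Streaming-style map-reduce pair that averages EEG signal
# amplitude per channel. The record layout (channel_id<TAB>amplitude) is an
# assumption for illustration, not the format used in the paper.
import sys

def mapper(lines):
    """Emit (channel, amplitude) pairs from raw 'channel<TAB>value' records."""
    for line in lines:
        channel, value = line.strip().split("\t")
        print(f"{channel}\t{value}")

def reducer(lines):
    """Lines arrive sorted by key; accumulate a running mean per channel."""
    current, total, count = None, 0.0, 0
    for line in lines:
        channel, value = line.strip().split("\t")
        if channel != current:
            if current is not None:
                print(f"{current}\t{total / count:.4f}")
            current, total, count = channel, 0.0, 0
        total += float(value); count += 1
    if current is not None:
        print(f"{current}\t{total / count:.4f}")

if __name__ == "__main__":
    # Select role via argv, e.g. hadoop streaming flags:
    #   -mapper "python mr_eeg.py map" -reducer "python mr_eeg.py reduce"
    (mapper if sys.argv[1] == "map" else reducer)(sys.stdin)
```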
Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively. It therefore makes sense to pursue research on developing algorithms that can use the available network most effectively. It is also important to consider the security aspect, as the data being transmitted is vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption of the same set of data, performing these two operations simultaneously. This is achieved by embedding encryption into compression algorithms, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p…
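As a rough illustration of coupling the two operations, a compress-then-encrypt sketch: zlib provides the entropy-coding step and a toy SHA-256 counter keystream stands in for the cipher. This is not the paper's embedded scheme, and the keystream is for illustration only, not a vetted cipher.

```python
# Sketch: compressing and then enciphering the same payload in one module.
# The paper embeds encryption inside the entropy coder itself; here a plain
# compress-then-encrypt pipeline with a toy SHA-256 counter keystream stands
# in for illustration only -- it is NOT a vetted cipher.
import hashlib
import zlib

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes by hashing key || counter (toy CTR mode)."""
    out, counter = bytearray(), 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n])

def secure_compress(plaintext: bytes, key: bytes) -> bytes:
    compressed = zlib.compress(plaintext, level=9)   # entropy-coding step
    ks = keystream(key, len(compressed))
    return bytes(a ^ b for a, b in zip(compressed, ks))

def secure_decompress(blob: bytes, key: bytes) -> bytes:
    ks = keystream(key, len(blob))
    return zlib.decompress(bytes(a ^ b for a, b in zip(blob, ks)))

msg = b"compression and encryption performed on the same set of data" * 4
blob = secure_compress(msg, b"shared-secret")
assert secure_decompress(blob, b"shared-secret") == msg
print(f"{len(msg)} bytes -> {len(blob)} bytes (compressed + enciphered)")
```

Note that the ordering matters: encrypting first would produce near-random ciphertext that the entropy coder could no longer compress.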
In this study, we briefly review the ARIMA(p, d, q), EWMA, and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation structure of the data. We consider recursive estimation and prediction algorithms based on Bayes and Kalman filtering (KF) techniques for correlated observations. We investigate the effect of these procedures on the MSE and compare them using generated data.
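A minimal sketch of the scalar Kalman-filter recursion that such recursive estimation builds on, under an assumed AR(1) state-space model; all variances and the MSE check below are illustrative, not the study's settings.

```python
# Sketch: the scalar Kalman-filter recursion underlying the recursive
# estimation/prediction procedures discussed above. The AR(1) state model and
# all variances are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
phi, q, r = 0.8, 0.5, 1.0        # state transition, state noise var, obs noise var

# Simulate correlated observations from the state-space model.
n, x = 200, 0.0
states, obs = [], []
for _ in range(n):
    x = phi * x + rng.normal(0, np.sqrt(q))
    states.append(x)
    obs.append(x + rng.normal(0, np.sqrt(r)))

m, p = 0.0, 1.0                  # prior mean and variance of the state
est = []
for y in obs:
    m_pred, p_pred = phi * m, phi**2 * p + q        # predict step
    k = p_pred / (p_pred + r)                       # Kalman gain
    m = m_pred + k * (y - m_pred)                   # update with innovation
    p = (1 - k) * p_pred
    est.append(m)

mse = np.mean((np.array(est) - np.array(states)) ** 2)
print(f"filtered MSE vs. true state: {mse:.3f} (obs noise var = {r})")
```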
This paper deals with how to estimate values at unmeasured points when the spatial sample contains only a few terms, which is unfavourable for the estimation process; it is known that the larger the data set, the better the estimates at unmeasured points and the smaller the estimation variance. The idea of this paper is therefore to take advantage of secondary (auxiliary) data that have a strong correlation with the primary (basic) data in order to estimate single unmeasured points, as well as to measure the estimation variance. The co-kriging technique is used in this field to build spatial predictions, and this idea is then applied to real data in the…
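A simplified sketch of the underlying kriging system: ordinary kriging at one unmeasured point with an exponential covariance model. Co-kriging extends the same linear system with cross-covariances to the auxiliary variable; all coordinates and parameters below are invented.

```python
# Sketch: ordinary kriging at an unmeasured point with an exponential
# covariance model. Co-kriging augments this system with cross-covariances to
# a correlated secondary variable; this univariate version is a simplified
# illustration with made-up values.
import numpy as np

def exp_cov(h, sill=1.0, rng_param=10.0):
    """Exponential covariance as a function of separation distance h."""
    return sill * np.exp(-np.asarray(h) / rng_param)

# Measured sample: coordinates and values of the primary variable.
pts = np.array([[0.0, 0.0], [5.0, 2.0], [2.0, 7.0], [8.0, 8.0]])
vals = np.array([1.2, 2.1, 0.7, 1.8])
target = np.array([4.0, 4.0])                      # unmeasured location

d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
n = len(pts)
A = np.ones((n + 1, n + 1))                        # kriging system bordered by
A[:n, :n] = exp_cov(d)                             # a Lagrange-multiplier row/col
A[n, n] = 0.0
b = np.append(exp_cov(np.linalg.norm(pts - target, axis=1)), 1.0)

w = np.linalg.solve(A, b)                          # weights + multiplier
estimate = w[:n] @ vals
variance = exp_cov(0.0) - w @ b                    # kriging (estimation) variance
print(f"estimate at {target}: {estimate:.3f}, kriging variance: {variance:.3f}")
```

The kriging variance printed here is the quantity the abstract refers to: it shrinks as more (or better-correlated) data enter the system, which is exactly what the auxiliary variable contributes in co-kriging.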
So much information keeps being digitized and stored in several forms (web pages, scientific articles, books, etc.) that the task of discovering information has become more and more challenging. The requirement for new IT tools to retrieve and organize these vast amounts of information is growing step by step. Furthermore, e-learning platforms are developing to meet the intended needs of students.
The aim of this article is to utilize machine learning to determine the appropriate actions that support the learning procedure, and Latent Dirichlet Allocation (LDA) to find the topics contained in the connections proposed in a learning session. Our purpose is also to introduce a course which moves toward the student's attempts a…
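A small sketch of the LDA step with scikit-learn, where four invented text snippets stand in for the resources linked in a learning session.

```python
# Sketch: extracting topics from the resources suggested in a learning session
# with scikit-learn's LDA. The tiny "documents" below are invented placeholders
# standing in for the linked course materials.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "neural networks backpropagation gradient training",
    "sql queries database tables joins indexing",
    "gradient descent optimization loss training",
    "database transactions normalization sql schema",
]

vec = CountVectorizer()
X = vec.fit_transform(docs)                        # bag-of-words counts
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

terms = vec.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[-4:][::-1]]
    print(f"topic {k}: {', '.join(top)}")
```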