The expansion of water-project implementation in Turkey and Syria is of great concern to those working in water resources management in Iraq. Such expansion, in the absence of an agreement among the three riparian countries of the Tigris and Euphrates Rivers (Turkey, Syria and Iraq), is expected to lead to a substantial reduction of water inflow into Iraqi territory. Accordingly, this study consists of two parts. The first part studies the changes in water inflow into Iraqi territory, at the Turkish and Syrian borders, from 1953 to 2009; the results indicate that the annual mean inflow of the Tigris River decreased from 677 m3/sec to 526 m3/sec after the operation of the Turkish reservoirs, while the annual mean inflow of the Euphrates River decreased from 1006 m3/sec to 627 m3/sec after the operation of the Syrian and Turkish reservoirs. The second part forecasts the monthly inflow and the water demand under the reduced-inflow conditions. The results show that the future inflow of the Tigris River is expected to decrease to 57% of its current value, reaching 301 m3/sec, and the Mosul reservoir will be able to supply only 64% of the downstream water requirements. The share of Iraq from the inflow of the Euphrates River is expected to be 58%, so the future inflow will reach 290 m3/sec, and the Haditha reservoir will be able to supply only 46% of the downstream water requirements, owing to the reduced inflow at the Iraqi border in the future.
One wide-ranging category of open-source data is that of geospatial information web sites. Despite the advantages of such open-source data, including ease of access and freedom from cost, its quality is a potential issue. This article tests the horizontal positional accuracy, and the possible integration, of four web-derived geospatial datasets: OpenStreetMap (OSM), Google Maps, Google Earth and Wikimapia. The evaluation was carried out by comparing the tested data against reference field-survey data for fifty road intersections in Baghdad, Iraq. The results indicate that free geospatial data can be used to enhance authoritative maps, especially small-scale maps.
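A common measure of horizontal positional accuracy in evaluations like this is the root-mean-square error over matched point pairs. The sketch below is illustrative only, not the study's procedure; the function name and the sample coordinates are invented for the example, and real use would require matched points in the same projected coordinate system.

```python
import math

def horizontal_rmse(reference, tested):
    """Root-mean-square error of horizontal positions over matched points.

    reference, tested: lists of (x, y) tuples (same projected CRS),
    e.g. surveyed road intersections vs. web-map coordinates.
    """
    sq = [(rx - tx) ** 2 + (ry - ty) ** 2
          for (rx, ry), (tx, ty) in zip(reference, tested)]
    return math.sqrt(sum(sq) / len(sq))

# Hypothetical matched points in metres (not data from the study)
ref = [(0.0, 0.0), (100.0, 50.0)]
web = [(1.0, 0.0), (100.0, 52.0)]
print(round(horizontal_rmse(ref, web), 3))  # -> 1.581
```

A smaller RMSE against the field-survey reference indicates a dataset better suited for integration with authoritative maps.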
This research studies fuzzy sets, one of the most modern concepts applied in various practical and theoretical areas and in various fields of life. It addresses the fuzzy random variable, whose values are not real numbers but fuzzy numbers, because it expresses vague or uncertain phenomena whose measurements are not exact. Fuzzy data were presented for a two-way test and for the analysis-of-variance method for fuzzy random variables; this method depends on a number of assumptions, which is a problem that prevents its use when those assumptions are not satisfied.
Among metaheuristic algorithms, population-based algorithms are explorative search algorithms superior to local search algorithms in exploring the search space to find globally optimal solutions. However, the primary downside of such algorithms is their low exploitative capability, which prevents the search-space neighborhood from being expanded toward more optimal solutions. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems. However, FA is limited by premature convergence when no neighborhood search strategies are employed to improve the quality of clustering solutions in the neighborhood region and to explore the global regions of the search space. On the
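The canonical FA attraction update mentioned above can be sketched as follows. This is a minimal illustrative step for minimisation, not the paper's implementation; the function name and parameter defaults (`beta0`, `gamma`, `alpha`) are assumptions following the standard formulation, where attractiveness decays with squared distance.

```python
import math
import random

def firefly_step(pop, fitness, beta0=1.0, gamma=1.0, alpha=0.2, rng=None):
    """One iteration of the standard firefly attraction update (minimisation):
    each firefly moves toward every brighter (lower-fitness) firefly, with
    attractiveness beta0 * exp(-gamma * r^2), plus a small random perturbation."""
    rng = rng or random.Random(0)
    new = [list(x) for x in pop]
    for i, xi in enumerate(pop):
        for xj in pop:
            if fitness(xj) < fitness(xi):          # xj is brighter than xi
                r2 = sum((a - b) ** 2 for a, b in zip(xi, xj))
                beta = beta0 * math.exp(-gamma * r2)
                for d in range(len(xi)):
                    new[i][d] += beta * (xj[d] - new[i][d]) + alpha * (rng.random() - 0.5)
    return new
```

The premature convergence discussed in the abstract arises because, once the population clusters around one bright firefly, the attraction term pulls everything together and only the small `alpha` noise explores further.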
Great scientific progress has led to widespread information. As information accumulates in large databases, it becomes important to revise and compile this vast amount of data in order to extract hidden information, or to classify the data according to their relations with each other, so that they can be exploited for technical purposes.
Data mining (DM) is appropriate in this area; hence the importance of studying the K-Means algorithm for clustering data, applied in practice so that the effect on the variables can be observed by changing the sample size (n) and the number of clusters (K).
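The K-Means procedure referred to above alternates two steps: assign each point to its nearest centroid, then move each centroid to the mean of its cluster. A minimal sketch (illustrative only; the function name and fixed iteration count are choices made for this example, not the study's setup):

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain K-Means on tuples of numbers: nearest-centroid assignment
    followed by centroid recomputation, for a fixed number of iterations."""
    rng = random.Random(seed)
    cents = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda j: sum((a - b) ** 2 for a, b in zip(p, cents[j])))
            clusters[nearest].append(p)
        # an empty cluster keeps its previous centroid
        cents = [tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else cents[i]
                 for i, cl in enumerate(clusters)]
    return cents
```

Rerunning this with different n (number of points) and K (number of clusters) is exactly the kind of experiment the abstract describes: the resulting centroids and cluster sizes change with both.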
With the revolutionized expansion of the Internet, the worldwide growth of information increases the application of communication technology, and the rapid growth of data volume boosts the requirement for secure, robust and confident techniques built on effective algorithms. Many algorithms and techniques are available for data security. This paper presents a cryptosystem that combines several substitution-cipher algorithms with the circular-queue data structure. The two substitution techniques, the homophonic substitution cipher and the polyalphabetic substitution cipher, are merged in a single circular queue with four different keys for each of them, which produces eight different outputs for
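The circular-queue key-cycling idea can be illustrated with a polyalphabetic substitution step. This is a generic sketch of key rotation, not the paper's exact scheme; the function name, the four-key queue, and the Caesar-style alphabet shifts are assumptions made for the example.

```python
from collections import deque

def poly_encrypt(text, keys):
    """Polyalphabetic substitution with a circular queue of shift keys:
    each letter is shifted by the key at the front of the queue,
    then the queue rotates so the next letter uses the next key."""
    q = deque(keys)
    out = []
    for ch in text.upper():
        if ch.isalpha():
            out.append(chr((ord(ch) - 65 + q[0]) % 26 + 65))
            q.rotate(-1)          # advance the circular queue of keys
        else:
            out.append(ch)        # non-letters pass through unchanged
    return "".join(out)

print(poly_encrypt("HELLO", [3, 1, 4, 1]))  # -> KFPMR
```

Because the queue is circular, a plaintext longer than the key list simply wraps around to the first key again, which is what lets a small set of keys cover a message of any length.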
This research aims to answer the many questions raised by the research problem concerning how the organizations under study view the concept of smart leadership and its most important dimensions, as well as how they view crisis management, its concept and its most important methods, through research objectives that define and clarify smart leadership with its dimensions and the methods of crisis management.
For the purpose of reaching the research results and testing the assumptions about the relationship between smart leadership and the methods of crisis management, the researcher adopted a questionnaire, designed especially as a criterion for the research, as the main tool for data coll
In real situations, all observations and measurements are not exact numbers but are more or less non-exact, also called fuzzy. So, in this paper, we use approximate non-Bayesian computational methods to estimate the inverse Weibull parameters and reliability function with fuzzy data. The maximum likelihood and moment estimates are obtained as non-Bayesian estimates. The maximum likelihood estimators have been derived numerically based on two iterative techniques, namely the "Newton-Raphson" and "Expectation-Maximization" techniques. In addition, the obtained estimates of the parameters and reliability function are compared numerically through a Monte-Carlo simulation study.
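For crisp (non-fuzzy) data, the Newton-Raphson route to the inverse Weibull MLE can be sketched by profiling the scale parameter out and iterating on the shape parameter. This is an illustrative sketch under the parameterization f(x) = beta * alpha^beta * x^-(beta+1) * exp(-(alpha/x)^beta), not the paper's fuzzy-data procedure; the function names and the finite-difference derivatives are choices made for the example.

```python
import math

def profile_loglik(beta, xs):
    """Inverse-Weibull log-likelihood with alpha profiled out:
    the score equation for alpha gives alpha^beta = n / sum(x^-beta),
    which also makes sum((alpha/x)^beta) equal to n."""
    n = len(xs)
    s = sum(x ** -beta for x in xs)
    return (n * math.log(beta) + n * math.log(n / s)
            - (beta + 1) * sum(math.log(x) for x in xs) - n)

def newton_mle_beta(xs, beta0=1.0, tol=1e-8, h=1e-5):
    """Newton-Raphson on the profile score d(loglik)/d(beta),
    using central finite differences for both derivatives."""
    b = beta0
    for _ in range(100):
        lp = profile_loglik(b + h, xs)
        lm = profile_loglik(b - h, xs)
        g = (lp - lm) / (2 * h)                               # first derivative
        gp = (lp - 2 * profile_loglik(b, xs) + lm) / h ** 2   # second derivative
        step = g / gp
        b -= step
        if abs(step) < tol:
            break
    return b
```

Once the shape estimate converges, the scale estimate follows from alpha = (n / sum(x^-beta))^(1/beta), and both plug into the reliability function.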
Abstract
The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed contaminated data. Simulation experiments were conducted for different sample sizes, different initial parameter values, and different levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function and the reliability function of the compound distribution for both natural and contaminated data.
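The Downhill Simplex (Nelder-Mead) algorithm compared above is derivative-free: it moves a simplex of candidate points through reflection, expansion, contraction and shrink steps. The sketch below is a minimal illustrative version, not the paper's implementation; the function name, step size and fixed iteration count are choices made for the example, and it is shown minimising a simple quadratic rather than the compound distribution's negative log-likelihood.

```python
def nelder_mead(f, x0, step=0.5, iters=500):
    """Minimal Downhill Simplex minimiser: reflection, expansion,
    inside contraction, and shrink toward the best vertex."""
    n = len(x0)
    simplex = [list(x0)] + [[x0[j] + (step if j == i else 0.0) for j in range(n)]
                            for i in range(n)]
    for _ in range(iters):
        simplex.sort(key=f)
        best, worst, second = simplex[0], simplex[-1], simplex[-2]
        centroid = [sum(p[j] for p in simplex[:-1]) / n for j in range(n)]
        refl = [centroid[j] + (centroid[j] - worst[j]) for j in range(n)]
        if f(refl) < f(best):
            exp = [centroid[j] + 2 * (centroid[j] - worst[j]) for j in range(n)]
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(second):
            simplex[-1] = refl
        else:
            contr = [centroid[j] + 0.5 * (worst[j] - centroid[j]) for j in range(n)]
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:  # shrink every vertex halfway toward the best one
                simplex = [best] + [[(p[j] + best[j]) / 2 for j in range(n)]
                                    for p in simplex[1:]]
    simplex.sort(key=f)
    return simplex[0]
```

Because it needs only function values, this method is attractive when the likelihood surface is awkward to differentiate, which is one plausible reason it coped well with the contaminated-data cases in the study.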