The expansion of water-project implementation in Turkey and Syria is of great concern to those working in water resources management in Iraq. In the absence of an agreement among the three riparian countries of the Tigris and Euphrates Rivers (Turkey, Syria, and Iraq), such expansion is expected to lead to a substantial reduction of water inflow into Iraqi territory. Accordingly, this study consists of two parts. The first part examines the changes in water inflow to Iraqi territory at the Turkish and Syrian borders from 1953 to 2009. The results indicate that the annual mean inflow of the Tigris River decreased from 677 m³/s to 526 m³/s after the Turkish reservoirs began operating, while the annual mean inflow of the Euphrates River decreased from 1006 m³/s to 627 m³/s after the Syrian and Turkish reservoirs began operating. The second part forecasts the monthly inflow and the water demand under the reduced-inflow conditions. The results show that the future inflow of the Tigris River is expected to fall to 57% of its present value, reaching 301 m³/s, and the Mosul reservoir will then be able to supply only 64% of the downstream water requirements. Iraq's share of the Euphrates River inflow is expected to be 58%, so the future inflow will reach 290 m³/s, and the Haditha reservoir will be able to supply only 46% of the downstream water requirements because of the reduced inflow at the Iraqi border.
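As a quick arithmetic check of the historical figures quoted above, a minimal Python sketch (using only the mean inflows reported in this abstract, nothing else) reproduces the observed reductions and the projected Tigris inflow:

```python
# Quick check of the figures quoted in the abstract; all numbers come from the
# text above, and no hydrological modelling is implied.
pre_dam  = {"Tigris": 677.0, "Euphrates": 1006.0}   # annual mean inflow before upstream reservoirs, m^3/s
post_dam = {"Tigris": 526.0, "Euphrates": 627.0}    # annual mean inflow after upstream reservoirs, m^3/s

for river in pre_dam:
    reduction = 100.0 * (1.0 - post_dam[river] / pre_dam[river])
    print(f"{river}: observed reduction of {reduction:.0f}%")

# Projected Tigris inflow: 57% of the post-reservoir mean (~300 m^3/s; the abstract reports 301).
print(f"Tigris projection: {0.57 * post_dam['Tigris']:.0f} m^3/s")
```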
In this work, a joint quadrature for the numerical solution of double integrals is presented. The method is based on combining two rules of the same precision level to form a rule of higher precision. Numerical results of the present method, which uses a lower precision level, are presented and compared with those obtained by the existing high-precision Gauss-Legendre five-point rule in two variables, which requires the same number of function evaluations. The efficiency of the proposed method is demonstrated with numerical examples. From an application point of view, the determination of the center of gravity receives special consideration in the present scheme. A convergence analysis is provided to validate the method.
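For context, the sketch below shows one common reading of a Gauss-Legendre rule "in two variables": a tensor-product rule built from the one-dimensional nodes and weights. It is only the comparison baseline, under the assumption of a product construction; the joint quadrature proposed in the paper is not reproduced here, and the integrand is an arbitrary illustrative choice.

```python
# Illustrative sketch: product Gauss-Legendre quadrature for a double integral
# over [a, b] x [c, d]. This stands in for the comparison baseline named in the
# abstract, not for the proposed joint quadrature.
import numpy as np

def gauss_legendre_2d(f, a, b, c, d, n=5):
    """Approximate the double integral of f(x, y) with an n-point rule per axis."""
    nodes, weights = np.polynomial.legendre.leggauss(n)   # nodes/weights on [-1, 1]
    # Map the reference nodes and weights to [a, b] and [c, d].
    x = 0.5 * (b - a) * nodes + 0.5 * (b + a)
    y = 0.5 * (d - c) * nodes + 0.5 * (d + c)
    wx = 0.5 * (b - a) * weights
    wy = 0.5 * (d - c) * weights
    return sum(wx[i] * wy[j] * f(x[i], y[j]) for i in range(n) for j in range(n))

# Example: integral of exp(x*y) over the unit square (exact value ~1.3179).
approx = gauss_legendre_2d(lambda x, y: np.exp(x * y), 0.0, 1.0, 0.0, 1.0)
print(approx)
```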
Cloud storage provides scalable, low-cost resources featuring economies of scale based on a cross-user architecture. As the amount of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. Data storage is the most important cloud service. To protect the privacy of data owners, data are stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which becomes crucial for data storage. Traditional deduplication schemes cannot work on encrypted data, and existing solutions for encrypted-data deduplication suffer from security weaknesses. This paper proposes a scheme that combines compressive sensing and video deduplication to maximize …
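As background for why deduplicating encrypted data is hard, the toy sketch below illustrates the classic convergent (message-locked) encryption idea that most encrypted-data deduplication schemes build on: the key is derived from the content itself, so identical plaintexts produce identical ciphertexts that the server can deduplicate. It is not the compressive-sensing scheme proposed in the paper, and the counter-mode keystream is purely illustrative, not a secure construction.

```python
# Toy sketch of convergent ("message-locked") encryption: the key is derived
# from the content, so identical plaintexts map to identical ciphertexts and can
# be deduplicated without the server seeing the plaintext. NOT the paper's
# scheme, and the keystream below is illustrative rather than secure.
import hashlib

def convergent_encrypt(data: bytes):
    key = hashlib.sha256(data).digest()                 # content-derived key
    keystream = b""
    counter = 0
    while len(keystream) < len(data):                   # SHA-256 in counter mode (toy)
        keystream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    ciphertext = bytes(d ^ k for d, k in zip(data, keystream))
    tag = hashlib.sha256(ciphertext).hexdigest()        # dedup index for the server
    return ciphertext, tag

# Two users uploading the same chunk produce the same tag -> server stores one copy.
c1, t1 = convergent_encrypt(b"same video chunk")
c2, t2 = convergent_encrypt(b"same video chunk")
assert t1 == t2 and c1 == c2
```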
In this research, a 4×4 factorial experiment applied in a randomized complete block design was studied, with a given number of observations. The design of experiments is used to study the effect of treatments on experimental units and thereby obtain data representing the observations of the experiment. Applying these treatments under different environmental and experimental conditions introduces noise that affects the observation values and thus increases the mean square error of the experiment. To reduce this noise, multiple wavelet shrinkage was used as a filter for the observations, by suggesting an improved threshold that takes the different transformation levels into account based on the logarithm of the …
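The improved, level-dependent threshold proposed here is not reproduced below; instead, a generic sketch of level-dependent soft wavelet shrinkage is shown, assuming the PyWavelets package, with a standard universal threshold scaled per decomposition level standing in for the paper's rule.

```python
# Generic sketch of wavelet shrinkage with a level-dependent threshold, assuming
# PyWavelets (pip install PyWavelets). The paper's improved threshold is not
# reproduced; a universal threshold scaled per level stands in for it.
import numpy as np
import pywt

def denoise(observations, wavelet="db4", level=3):
    coeffs = pywt.wavedec(observations, wavelet, level=level)
    # Estimate the noise scale from the finest detail coefficients (MAD rule).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    n = len(observations)
    out = [coeffs[0]]                                        # keep approximation coefficients
    for j, detail in enumerate(coeffs[1:], start=1):
        thr = sigma * np.sqrt(2.0 * np.log(n)) / np.sqrt(j)  # level-dependent (illustrative)
        out.append(pywt.threshold(detail, thr, mode="soft"))
    return pywt.waverec(out, wavelet)[:n]

# Example: a noisy signal standing in for the experiment's observations.
rng = np.random.default_rng(0)
y = np.sin(np.linspace(0, 4 * np.pi, 128)) + 0.3 * rng.standard_normal(128)
smoothed = denoise(y)
```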
In this paper, an image compression technique based on the zonal transform method is presented. The DCT, Walsh, and Hadamard transform techniques are also implemented. These transforms are applied to SAR images using different block sizes, and the effects of implementing them are investigated. The main shortcoming associated with this radar imaging system is the presence of speckle noise, which affects the compression results.
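To make the block-transform idea concrete, here is a minimal sketch of zonal coding with the 2-D DCT (one of the transforms compared above), assuming SciPy; the block size and the retained low-frequency zone are arbitrary choices rather than the paper's settings, and the Walsh/Hadamard variants would follow the same pattern with a different transform.

```python
# Illustrative block-wise zonal coding with the 2-D DCT. A zonal mask keeps only
# the low-frequency corner of each block; everything else is discarded.
import numpy as np
from scipy.fft import dctn, idctn

def zonal_compress(image, block=8, keep=4):
    h, w = image.shape
    out = np.zeros_like(image, dtype=float)
    mask = np.add.outer(np.arange(block), np.arange(block)) < keep   # triangular low-frequency zone
    for i in range(0, h - h % block, block):
        for j in range(0, w - w % block, block):
            tile = image[i:i + block, j:j + block].astype(float)
            coeffs = dctn(tile, norm="ortho") * mask                 # zero out-of-zone coefficients
            out[i:i + block, j:j + block] = idctn(coeffs, norm="ortho")
    return out

# Example on a synthetic 64x64 array (a real SAR tile would be used instead).
img = np.random.default_rng(1).random((64, 64))
reconstructed = zonal_compress(img)
```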
Corpus linguistics is a methodology for studying language through corpus-based research. It differs from the traditional (prescriptive) approach to studying a language in its insistence on the systematic study of authentic examples of language in use (a descriptive approach). A "corpus" is a large body of machine-readable, naturally occurring linguistic data collected in a structured way, either written texts or transcriptions of recorded speech, which can be used as a starting point for linguistic description or as a means of verifying hypotheses about a language. In the past decade, interest has grown tremendously in the use of language corpora for language education. The ways in which corpora have been employed in language pedagogy …
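Purely as an illustration of the descriptive, corpus-based approach (counting how language is actually used rather than prescribing how it should be used), a stdlib-only Python toy is sketched below; a real study would of course use a large corpus and dedicated concordancing tools.

```python
# Toy, stdlib-only illustration of corpus-based description: frequency counts
# over a (tiny) body of naturally occurring text.
from collections import Counter
import re

corpus = """The data were collected from recorded speech and written texts.
The corpus can be used as a starting point of linguistic description."""

tokens = re.findall(r"[a-z]+", corpus.lower())
freq = Counter(tokens)
print(freq.most_common(5))        # most frequent word forms in the corpus
```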
Impressed current cathodic protection controlled by computer provides an ideal solution to changes in environmental factors and long-term coating degradation. The protection potential distribution achieved and the current demand on the anode can be regulated to the protection criteria, achieving effective protection for the system.
In this paper, the cathodic protection of an above-ground steel storage tank was investigated using an impressed current system with controlled potential to manage the variation in soil resistivity. A corrosion controller was implemented for the above-ground tank in LabVIEW, in which the tank's bottom-to-soil potential was manipulated to the desired set point …
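The sketch below gives a conceptual, plain-Python rendering of such a potential-controlled loop: a simple PI controller drives the impressed current so that the tank-to-soil potential settles at a protection set point. The plant model, the gains, and the -0.85 V set point are illustrative assumptions, not the paper's LabVIEW implementation or measured system.

```python
# Conceptual sketch of a potential-controlled impressed-current loop. The plant
# model, gains, and set point are assumed for illustration only.
def simulate_controller(set_point=-0.85, steps=50, dt=1.0):
    potential = -0.55          # initial tank-bottom-to-soil potential, V (assumed)
    current = 0.0              # impressed current from the rectifier, A
    kp, ki = 5.0, 0.8          # proportional / integral gains (assumed)
    integral = 0.0
    for _ in range(steps):
        error = set_point - potential
        integral += error * dt
        current = max(0.0, -(kp * error + ki * integral))   # rectifier output, A (non-negative)
        # Toy plant: potential shifts negative with current, drifts back through the soil.
        potential += dt * (-0.02 * current - 0.1 * (potential + 0.55))
    return potential, current

final_potential, final_current = simulate_controller()
print(f"settled at {final_potential:.3f} V with {final_current:.2f} A impressed current")
```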
In many scientific fields, Bayesian models are commonly used in recent research. This research presents a new Bayesian model for estimating parameters and forecasting using the Gibbs sampling algorithm. Posterior distributions are generated using the inverse gamma distribution and the multivariate normal distribution as prior distributions. The new method was used to investigate and summarize the posterior distribution of the Bayesian statistics. The theory and derivation of the posterior distribution are explained in detail in this paper. The proposed approach is applied to three simulated datasets with sample sizes of 100, 300, and 500. The procedure was also extended to a real dataset, the rock intensity dataset. The actual dataset is collected …
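The paper's exact model and the rock-intensity data are not reproduced here; as an illustration of the sampler structure implied by the named priors, the sketch below runs a standard Gibbs sampler for Bayesian linear regression with a multivariate normal prior on the coefficients and an inverse gamma prior on the error variance, on simulated data of size 100 (the smallest simulation scenario).

```python
# Standard Gibbs sampler for Bayesian linear regression with a multivariate
# normal prior on beta and an inverse gamma prior on sigma^2. The model,
# hyperparameters, and data are illustrative stand-ins, not the paper's setup.
import numpy as np

rng = np.random.default_rng(42)

# Simulated data (n = 100).
n, p = 100, 3
X = np.column_stack([np.ones(n), rng.standard_normal((n, p - 1))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.standard_normal(n)

# Priors: beta ~ N(m0, V0), sigma^2 ~ Inverse-Gamma(a0, b0).
m0, V0_inv = np.zeros(p), np.eye(p) * 0.01
a0, b0 = 2.0, 1.0

beta, sigma2 = np.zeros(p), 1.0
draws = []
for it in range(2000):
    # 1) beta | sigma2, y  ~  multivariate normal
    Vn = np.linalg.inv(V0_inv + X.T @ X / sigma2)
    mn = Vn @ (V0_inv @ m0 + X.T @ y / sigma2)
    beta = rng.multivariate_normal(mn, Vn)
    # 2) sigma2 | beta, y  ~  inverse gamma
    resid = y - X @ beta
    sigma2 = 1.0 / rng.gamma(a0 + n / 2.0, 1.0 / (b0 + 0.5 * resid @ resid))
    if it >= 500:                       # discard burn-in
        draws.append(np.concatenate([beta, [sigma2]]))

print(np.mean(draws, axis=0))           # posterior means of (beta, sigma^2)
```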
The Tor (The Onion Routing) network was designed to enable users to browse the Internet anonymously. It is known for its anonymity and privacy protection against the many agents who wish to observe users' locations or trace users' browsing habits. This anonymity stems from the encryption and decryption of Tor traffic: the client's traffic must be encrypted and decrypted before the sending and receiving processes, which leads to delay and even interruption in the data flow. The exchange of cryptographic keys between network devices plays a pivotal and critical role in facilitating secure communication and ensuring the integrity of cryptographic procedures. This essential process is time-consuming, which causes delay …
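To illustrate why the per-hop keys matter, the sketch below shows the layered ("onion") wrapping and unwrapping of a payload with one symmetric key per relay, using the third-party cryptography package's Fernet purely for readability. It is a conceptual toy, not Tor's actual circuit construction, cell format, or cipher suite.

```python
# Conceptual illustration of layered ("onion") encryption: each relay peels one
# layer with its own key. Uses the `cryptography` package's Fernet for clarity.
from cryptography.fernet import Fernet

# One symmetric key per relay on the circuit (entry, middle, exit). In Tor these
# would be negotiated via the time-consuming key-exchange step discussed above.
relay_keys = [Fernet.generate_key() for _ in range(3)]

def build_onion(payload: bytes) -> bytes:
    # Wrap the payload once per relay, innermost layer for the exit relay.
    for key in relay_keys:
        payload = Fernet(key).encrypt(payload)
    return payload

def peel_onion(onion: bytes) -> bytes:
    # Each relay removes exactly one layer, in reverse order of wrapping.
    for key in reversed(relay_keys):
        onion = Fernet(key).decrypt(onion)
    return onion

cell = build_onion(b"GET / HTTP/1.1")
assert peel_onion(cell) == b"GET / HTTP/1.1"
```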
Objective: This research investigates real breast cancer data for Iraqi women, acquired manually from several Iraqi hospitals that provide early detection of breast cancer. Data mining techniques are used to discover hidden knowledge, unexpected patterns, and new rules in the dataset, which comprises a large number of attributes. Methods: Data mining techniques handle redundant or simply irrelevant attributes in order to discover interesting patterns. The dataset is processed via the Weka (Waikato Environment for Knowledge Analysis) platform. The OneR technique is used as a machine learning classifier to evaluate attribute worth according to the class value. Results: The evaluation is performed using …
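For readers unfamiliar with OneR, the stdlib-only sketch below shows the idea behind using it to rank attribute worth: each attribute gets a one-rule mapping every value to its majority class, and attributes whose rules make fewer errors rank higher. The toy records are invented purely to make the sketch executable; they are not the Iraqi hospital data, and this is not the Weka implementation itself.

```python
# Minimal OneR-style attribute ranking: for each attribute, build a one-rule
# (value -> majority class) and count its errors; fewer errors = worthier attribute.
from collections import Counter, defaultdict

def one_r(records, attributes, label):
    scores = {}
    for attr in attributes:
        by_value = defaultdict(Counter)
        for row in records:
            by_value[row[attr]][row[label]] += 1
        errors = sum(sum(c.values()) - c.most_common(1)[0][1] for c in by_value.values())
        scores[attr] = errors
    return sorted(scores.items(), key=lambda kv: kv[1])     # best attribute first

toy = [
    {"age_group": "40-49", "family_history": "yes", "class": "malignant"},
    {"age_group": "40-49", "family_history": "no",  "class": "benign"},
    {"age_group": "30-39", "family_history": "no",  "class": "benign"},
    {"age_group": "50-59", "family_history": "yes", "class": "malignant"},
]
print(one_r(toy, ["age_group", "family_history"], "class"))
```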