Seepage through earth dams is one of the most common causes of earth-dam failure, owing to internal particle migration and seepage flow. In earthen dams, the core plays a vital role in reducing seepage through the dam body and lowering the phreatic line. In this research, an alternative to the clay soil conventionally used in the dam core was proposed by conducting multiple permeability experiments on silty and sandy soils mixed with different additive materials. The selected sandy-soil model was then used to represent the dam experimentally, employing a permeability device to measure the amount of water seeping through the dam body and to trace the seepage line. A numerical model was built in the SEEP/W module of Geo-Studio software to simulate the experimental model, the tested soils with their different additive percentages were examined, and the numerical and experimental results were compared to validate the proposed soil. It was found that the sandy type (C) soil model attains a permeability very close to that of clay soil when 10% cement kiln dust (CKD) and 5% cement are used as additives. Furthermore, soil type (C) was calibrated against the core soil of the HIMREEN earth dam, which is clay, as well as the core soil of the HADITHA earth dam, which is composed of dolomite. The comparison between the hypothetical simulated cases and the real cases revealed close agreement, with near-identical phreatic (seepage) lines and computed amounts of seepage water.
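The quantity this abstract is concerned with, seepage discharge through a low-permeability core, is governed at its simplest by Darcy's law, Q = k · i · A. The sketch below illustrates the idea only; all numeric values (heads, areas, conductivities) are hypothetical and are not the values measured in the study.

```python
# Hedged sketch: steady-state seepage through a dam core via Darcy's law.
# All parameter values below are illustrative assumptions, not study data.

def darcy_seepage(k, head_loss, path_length, area):
    """Seepage discharge Q (m^3/s) through a soil element.

    k           -- hydraulic conductivity (m/s)
    head_loss   -- head difference across the core (m)
    path_length -- seepage path length through the core (m)
    area        -- cross-sectional flow area (m^2)
    """
    hydraulic_gradient = head_loss / path_length  # i = dh / L
    return k * hydraulic_gradient * area

# A clay-like core (k ~ 1e-8 m/s) passes three orders of magnitude less
# water than an untreated sandy core (k ~ 1e-5 m/s) under the same head,
# which is why matching clay's permeability with treated sand matters.
q_clay = darcy_seepage(k=1e-8, head_loss=4.0, path_length=8.0, area=10.0)
q_sand = darcy_seepage(k=1e-5, head_loss=4.0, path_length=8.0, area=10.0)
```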
In this paper we study the Bayesian method using the modified exponential growth model, a model widely used to represent growth phenomena. We focus on three prior functions (informative, natural conjugate, and a prior based on previous experiments) for use in the Bayesian method. Most observations of growth phenomena depend on one another, which leads to correlation between the observations; this problem, called autocorrelation, must be treated, and the Bayesian method is used to do so.
The goal of this study is to assess the effect of autocorrelation on estimation when using the Bayesian method.
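Before treating autocorrelation, one first has to detect it. A common diagnostic for first-order autocorrelation in the residuals of a fitted growth model is the Durbin–Watson statistic; the sketch below uses made-up residuals, not the study's model or observations.

```python
# Hedged sketch: detecting first-order autocorrelation in fit residuals
# with the Durbin-Watson statistic. The residual values are illustrative.

def durbin_watson(residuals):
    """DW statistic: values near 2 suggest no first-order autocorrelation,
    near 0 strong positive, near 4 strong negative autocorrelation."""
    num = sum((residuals[t] - residuals[t - 1]) ** 2
              for t in range(1, len(residuals)))
    den = sum(e ** 2 for e in residuals)
    return num / den

# Residuals from a hypothetical exponential-growth fit, drifting slowly
# from positive to negative -- the signature of positive autocorrelation:
residuals = [0.5, 0.6, 0.55, 0.4, -0.3, -0.5, -0.45, -0.6]
dw = durbin_watson(residuals)
# dw well below 2 flags the dependence between observations that the
# Bayesian treatment described above is meant to handle.
```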
The subject of the Internet of Things is very important, especially at present, and has therefore attracted the attention of researchers and scientists owing to its importance in human life: through it, a person can perform many tasks easily, accurately, and in an organized manner. The research addresses important topics, chief among them the concept of the Internet of Things, the history of its emergence and development, the reasons for the interest in it and its importance, and its most prominent advantages and characteristics. The research sheds light on the structure of the Internet of Things and its main structural components, and it examines the most important search engines on the Internet.
Mercury, arsenic, cadmium, and lead were measured in sediment samples from the riverine and marine environments of Basra governorate in southern Iraq. Sixteen sediment sites were selected, distributed along the Shatt Al-Arab River and the Iraqi marine environment: one station on the Euphrates River before its confluence with the Tigris River to form the Shatt Al-Arab, seven stations along the Shatt Al-Arab River, and eight stations in the Iraqi marine region. All samples were collected from surface sediment at low tide. The ICP technique was used to determine mercury and arsenic in all samples, while cadmium and lead were measured in the same samples by Atomic Absorption Spectroscopy.
Abstract:
The current business environment is changing rapidly, and this is reflected in the performance of enterprises that wish to survive. A reactive style is no longer sufficient for enterprises to deal with their environment, and it quickly loses its appeal with the emergence of the mission-and-vision view of the contemporary business environment as a set of parts interacting with one another, together with a behavioural concept that encompasses all dimensions of performance. It is therefore imperative that enterprises adopt a system of influencing variables and positive interaction, through the development of strategic plans and the use of implementation and follow-up strategies, to ensure the effectiveness of the measurement method.
Compressing speech reduces data-storage requirements, which in turn reduces the time needed to transmit digitized speech over long-haul links such as the Internet. To obtain the best performance in speech compression, wavelet transforms require filters that combine a number of desirable properties, such as orthogonality and symmetry. The MCT basis functions are derived from the GHM basis functions using 2D linear convolution. The fast-computation algorithms introduced here add desirable features to the current transform. We further assess the performance of the MCT in a speech-compression application. This paper discusses the effect of using the DWT and the MCT (in one and two dimensions) on speech compression; DWT and MCT performance is assessed in terms of compression.
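The general mechanism behind wavelet speech compression can be sketched with the simplest wavelet of all: a one-level Haar DWT concentrates signal energy in the approximation coefficients, so small detail coefficients can be zeroed with little reconstruction error. Haar is used here purely for brevity; the GHM/MCT multiwavelet bases discussed in the abstract are considerably more elaborate, and the signal below is a toy frame, not real speech.

```python
import math

# Hedged sketch: thresholding-based compression with a one-level Haar DWT.
SQRT2 = math.sqrt(2.0)

def haar_dwt(signal):
    """One-level Haar transform -> (approximation, detail) coefficients."""
    n = len(signal) // 2
    approx = [(signal[2*i] + signal[2*i + 1]) / SQRT2 for i in range(n)]
    detail = [(signal[2*i] - signal[2*i + 1]) / SQRT2 for i in range(n)]
    return approx, detail

def haar_idwt(approx, detail):
    """Exact inverse of haar_dwt."""
    out = []
    for a, d in zip(approx, detail):
        out.append((a + d) / SQRT2)
        out.append((a - d) / SQRT2)
    return out

def compress(signal, threshold):
    """Zero detail coefficients below `threshold`, then reconstruct."""
    approx, detail = haar_dwt(signal)
    detail = [d if abs(d) >= threshold else 0.0 for d in detail]
    return haar_idwt(approx, detail)

# A slowly varying toy "speech" frame: its detail coefficients are small,
# so dropping them all changes each sample by at most ~0.05 here.
x = [1.0, 1.1, 1.0, 0.9, 4.0, 4.1, 4.0, 3.9]
y = compress(x, threshold=0.5)
```

With `threshold=0.0` nothing is discarded and the reconstruction is exact, which is the orthogonality property the abstract mentions at work.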
... Show MoreThe penalized least square method is a popular method to deal with high dimensional data ,where the number of explanatory variables is large than the sample size . The properties of penalized least square method are given high prediction accuracy and making estimation and variables selection
At once. The penalized least square method gives a sparse model ,that meaning a model with small variables so that can be interpreted easily .The penalized least square is not robust ,that means very sensitive to the presence of outlying observation , to deal with this problem, we can used a robust loss function to get the robust penalized least square method ,and get robust penalized estimator and
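The idea of swapping the squared-error loss for a robust one can be sketched in a few lines: below, a ridge-penalized Huber regression is fitted by gradient descent on a single slope coefficient. This is an illustration of the general technique, not the estimator or the data of the paper; the data, penalty weight, and step size are all made-up assumptions.

```python
# Hedged sketch: ridge-penalized Huber regression (one coefficient) fitted
# by gradient descent, showing how a robust loss limits outlier influence.
# Toy data and tuning constants; not the paper's estimator or data set.

def huber_grad(r, delta=1.0):
    """Derivative of the Huber loss with respect to the residual r:
    linear (quadratic loss) for |r| <= delta, clipped beyond it."""
    return r if abs(r) <= delta else delta * (1.0 if r > 0 else -1.0)

def robust_penalized_fit(xs, ys, lam=0.1, lr=0.01, steps=2000):
    """Minimize  sum_i Huber(y_i - b*x_i) + lam * b^2  over the slope b."""
    b = 0.0
    for _ in range(steps):
        grad = sum(-x * huber_grad(y - b * x) for x, y in zip(xs, ys))
        grad += 2.0 * lam * b          # gradient of the ridge penalty
        b -= lr * grad
    return b

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.0, 4.1, 5.9, 8.0, 50.0]        # last observation is an outlier
b_robust = robust_penalized_fit(xs, ys)
# Ordinary least squares is dragged toward the outlier (slope ~ 5.6),
# while the Huber fit stays near the true slope of the inliers (~ 2).
b_ols = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
```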
Diabetes is an increasingly common chronic disease, affecting millions of people around the world. Diagnosis of diabetes, its prediction, and its proper treatment and management are essential. Machine-learning-based prediction techniques for diabetes data analysis can help in the early detection and prediction of the disease and of its consequences, such as hypo/hyperglycemia. In this paper, we explored a diabetes dataset collected from the medical records of one thousand Iraqi patients. We applied three classifiers: the multilayer perceptron, KNN, and Random Forest. We ran two experiments: the first used all 12 features of the dataset, in which Random Forest outperformed the others with 98.8% accuracy; the second used only five attributes.
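Of the three classifiers named, KNN is the easiest to sketch from scratch. The toy example below illustrates the technique only; the feature vectors, labels, and choice of two features (glucose, BMI) are invented for illustration and are not taken from the Iraqi patient dataset.

```python
import math
from collections import Counter

# Hedged sketch: a minimal k-nearest-neighbours classifier of the kind
# applied to the diabetes records. Toy data; features and labels invented.

def knn_predict(train_X, train_y, query, k=3):
    """Majority label among the k training points nearest to `query`
    in Euclidean distance."""
    dists = sorted((math.dist(x, query), label)
                   for x, label in zip(train_X, train_y))
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical feature vectors (glucose, BMI); label 1 = diabetic, 0 = not.
X = [(90, 22), (95, 24), (100, 23), (160, 31), (170, 33), (155, 30)]
y = [0, 0, 0, 1, 1, 1]
pred = knn_predict(X, y, query=(158, 32), k=3)   # nearest neighbours are diabetic
```

In practice the abstract's accuracy figures come from library implementations (e.g. scikit-learn) run on the full 12- and 5-feature versions of the dataset; the logic, however, is exactly this nearest-neighbour vote.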
