Raw water quality measurements from the Tigris River were statistically analyzed to relate salinity to selected raw water quality parameters. The data were collected from five water treatment plants (WTPs) located along the Tigris River in Baghdad: Al-Karkh, Al-Karama, Al-Qadisiya, Al-Dora, and Al-Wihda, for the period from 2015 to 2021. The selected parameters are total dissolved solids (TDS), electrical conductivity (EC), pH and temperature. The main objective of this research is to derive a mathematical model, using SPSS software, for calculating the salinity value along the river; in addition, the effect of electrical conductivity on the salinity value was estimated. Multiple linear regression (MLR) and artificial neural network (ANN) models were used to build the mathematical models for calculating water salinity in the Tigris River and to identify the parameter with the greatest effect on water salinity. In general, the results showed an increase in the water salinity level downstream of the Tigris River towards the south of Baghdad, with EC having the most significant effect on water salinity. The MLR and ANN analyses yielded good mathematical models, with high coefficients of determination (R²) of 0.999 and 0.998, respectively. In addition, the regression equations showed good performance in predicting the salinity value, with an error percentage of less than 10% for all WTPs.
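As a rough illustration of the regression step, the sketch below fits a multiple linear regression of salinity on TDS, EC, pH and temperature with scikit-learn; the file name and column names are hypothetical, and the original analysis was carried out in SPSS.

```python
# Minimal sketch of the MLR step, assuming a hypothetical CSV of WTP records
# with columns "TDS", "EC", "pH", "Temp" and "Salinity"; illustrative only.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

df = pd.read_csv("tigris_wtp_records.csv")   # hypothetical file name
X = df[["TDS", "EC", "pH", "Temp"]]          # predictors used in the study
y = df["Salinity"]                           # target variable

mlr = LinearRegression().fit(X, y)
print("coefficients:", dict(zip(X.columns, mlr.coef_)))
print("R^2:", r2_score(y, mlr.predict(X)))
```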
Abstract:
Time series models often suffer from the presence of outliers, which accompany the data collection process for many reasons; their existence may have a significant impact on the estimation of the parameters of the studied model. Obtaining highly efficient estimators is one of the most important stages of statistical analysis, and it is therefore important to choose appropriate methods to obtain good estimators. The aim of this research is to compare the ordinary estimators and the robust estimators of the parameters of ...
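To make the ordinary-versus-robust contrast concrete, the sketch below compares an OLS estimate and a Huber M-estimate (via statsmodels) of an AR(1) coefficient on a simulated series contaminated with additive outliers; it is only an illustration under assumed settings, not the specific estimators compared in the paper.

```python
# Contrast an ordinary (OLS) and a robust (Huber M) estimate of an AR(1)
# coefficient when additive outliers contaminate the series.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, phi = 300, 0.7
x = np.zeros(n)
for t in range(1, n):                        # simulate AR(1): x_t = phi*x_{t-1} + e_t
    x[t] = phi * x[t - 1] + rng.normal()
x[rng.choice(n, 10, replace=False)] += 15.0  # inject additive outliers

y, lag = x[1:], sm.add_constant(x[:-1])
print("OLS phi:  ", sm.OLS(y, lag).fit().params[1])
print("Huber phi:", sm.RLM(y, lag, M=sm.robust.norms.HuberT()).fit().params[1])
```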
This research aims to review the importance of estimating the nonparametric regression function using the so-called canonical kernel, which depends on re-scaling the smoothing parameter; this plays a large and important role in kernel estimation and gives a sound amount of smoothing.
We demonstrate the importance of this method by applying these concepts to real data on the international exchange rate of the U.S. dollar against the Japanese yen for the period from January 2007 to March 2010. The results showed that the nonparametric estimator with the Gaussian kernel is preferable to the other nonparametric and parametric regression estimators.
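For context, the sketch below implements a basic Nadaraya-Watson estimator with a Gaussian kernel, rescaling an assumed bandwidth by the Gaussian kernel's canonical bandwidth (≈ 0.7764) in the spirit of the canonical-kernel idea; the data are simulated, not the exchange-rate series used in the paper.

```python
# Nadaraya-Watson kernel regression with a Gaussian kernel; the bandwidth is
# rescaled by the kernel's canonical bandwidth so smoothing is comparable
# across kernels.
import numpy as np

def nw_gaussian(x_grid, x, y, h):
    """Nadaraya-Watson estimate of E[y|x] on x_grid with Gaussian bandwidth h."""
    u = (x_grid[:, None] - x[None, :]) / h
    w = np.exp(-0.5 * u**2)                  # Gaussian kernel weights
    return (w @ y) / w.sum(axis=1)

delta_gauss = 0.7764                         # canonical bandwidth of the Gaussian kernel
h0 = 0.3                                     # bandwidth on the canonical scale (assumed)
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 3, 200))
y = np.sin(2 * x) + rng.normal(0, 0.2, 200)
fit = nw_gaussian(x, x, y, h=h0 * delta_gauss)
```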
Bootstrap is an important re-sampling technique that has received the attention of researchers recently. The presence of outliers in the original data set may cause serious problems for the classical bootstrap when the percentage of outliers in the resamples is higher than in the original data. Many methods have been proposed to overcome this problem, such as Dynamic Robust Bootstrap for LTS (DRBLTS) and Weighted Bootstrap with Probability (WBP). This paper tries to show the accuracy of parameter estimation by comparing the results of both methods. The bias, MSE and RMSE are considered. The criterion of accuracy is based on the RMSE value, since the method that provides a smaller RMSE value than the other is considered the more accurate one.
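As a plain illustration of the resampling idea (not DRBLTS or WBP), the sketch below runs a pairs bootstrap for a regression slope on simulated data and summarises accuracy with an RMSE-type criterion; all settings are assumed.

```python
# Plain percentile-style pairs bootstrap for a regression slope, with an
# RMSE-type accuracy summary against the known true slope of the simulation.
import numpy as np

rng = np.random.default_rng(2)
n = 100
x = rng.uniform(0, 10, n)
y = 2.0 + 1.5 * x + rng.normal(0, 1, n)

def slope(x, y):
    return np.polyfit(x, y, 1)[0]

boot = np.empty(2000)
for b in range(boot.size):
    idx = rng.integers(0, n, n)              # resample pairs with replacement
    boot[b] = slope(x[idx], y[idx])

true_slope = 1.5
rmse = np.sqrt(np.mean((boot - true_slope) ** 2))
print("bootstrap mean:", boot.mean(), "RMSE:", rmse)
```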
This research includes automated lineament extraction using the PCI Geomatica program, based on satellite imagery, and lineament analysis using a GIS program. The analysis included density analysis, length density analysis and intersection density analysis. When the slope map for the study area was calculated, a relationship was found between slope and lineament density.
Lineament density increases in regions with high slope values, showing that lineaments play an important role in the classification process, as they isolate one class from the others; this was observed clearly in Iranian territory. The results also show that one of the lineaments crosses the shoulders of the Galal Badra dam and the areas surrounding the dam, so this should be taken into consideration.
This article proposes a new technique for determining the rate of contamination. First, a generative adversarial network (GAN) parallel processing technique is constructed and trained using real and secret images. Then, after the model is stabilized, the real image is passed to the generator. Finally, the generator creates an image that is visually similar to the secret image, thus achieving the same effect as transmitting the secret image. Experimental results show that this technique performs well for the security of secret information transmission and increases the information-hiding capacity. The signal-to-noise ratio and the structural similarity index measure were used to determine the success of the colour image-hiding technique.
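To show how such image-quality metrics are typically computed, the sketch below evaluates PSNR and SSIM between a stand-in secret image and a slightly perturbed generated image using scikit-image; the images and the GAN itself are not from the paper.

```python
# Compute PSNR and SSIM between a stand-in "secret" image and a perturbed
# "generated" image; scikit-image provides both metrics.
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

rng = np.random.default_rng(3)
secret = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)          # stand-in image
noise = rng.integers(-5, 6, secret.shape)
generated = np.clip(secret.astype(int) + noise, 0, 255).astype(np.uint8)

print("PSNR:", peak_signal_noise_ratio(secret, generated))
print("SSIM:", structural_similarity(secret, generated, channel_axis=-1))
```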
Abstract:
One of the important features of a fuzzy model is the specification of the membership functions. In fuzzy reliability applications, the failure functions of interest deal with positive variables. There are many types of membership functions studied by researchers, including the triangular, trapezoidal and bell-shaped membership functions; in this research we used the beta function. Based on this, the paper studies the classical method for obtaining an estimate of the fuzzy reliability function for both series and parallel systems.
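For reference, the sketch below evaluates the standard series and parallel system reliability formulas with crisp component reliabilities; the fuzzy (beta-membership) treatment of the paper is not reproduced, and the numbers are assumed.

```python
# Classical series/parallel system reliability with crisp component values.
import numpy as np

def reliability_series(r):
    """Series system works only if every component works: R = prod(R_i)."""
    return float(np.prod(r))

def reliability_parallel(r):
    """Parallel system fails only if every component fails: R = 1 - prod(1 - R_i)."""
    return float(1.0 - np.prod(1.0 - np.asarray(r)))

r = [0.95, 0.90, 0.85]                       # assumed component reliabilities
print("series:  ", reliability_series(r))
print("parallel:", reliability_parallel(r))
```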
In this paper, we estimate the survival function for lung cancer patients using different nonparametric estimation methods, based on a sample of complete real data describing the survival duration of patients suffering from lung cancer, from the diagnosis of the disease or the admission of the patients to a hospital, over a period of two years (from 2012 to the end of 2013). Comparisons between the mentioned estimation methods were performed using the mean squared error as a statistical indicator, concluding that the survival function for lung cancer obtained by the shrinkage method is the best.
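As one concrete example of a nonparametric survival estimator, the sketch below implements the Kaplan-Meier estimator on made-up durations; the shrinkage estimator favoured in the paper is not reproduced here.

```python
# Kaplan-Meier estimator of the survival function S(t) from durations and
# event indicators (1 = event observed, 0 = censored).
import numpy as np

def kaplan_meier(times, events):
    """Return distinct event times and the Kaplan-Meier estimate S(t)."""
    times, events = np.asarray(times), np.asarray(events)
    s, surv, uniq = 1.0, [], np.unique(times[events == 1])
    for t in uniq:
        n_at_risk = np.sum(times >= t)       # subjects still under observation at t
        d = np.sum((times == t) & (events == 1))
        s *= 1.0 - d / n_at_risk
        surv.append(s)
    return uniq, np.array(surv)

t, surv = kaplan_meier([5, 8, 12, 12, 20, 23], [1, 1, 1, 0, 1, 0])
print(dict(zip(t, surv)))
```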
Abstract
The grey system model GM(1,1) is a time-series prediction model and the basis of grey theory. This research presents methods for estimating the parameters of the grey model GM(1,1): the accumulative method (ACC), the exponential method (EXP), the modified exponential method (Mod EXP) and the Particle Swarm Optimization method (PSO). These methods were compared on the basis of the mean squared error (MSE) and the mean absolute percentage error (MAPE), and simulation was used to select the best of the four methods. The best method was then applied to real data representing the consumption rate of two types of oils ...
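The sketch below shows the basic GM(1,1) construction fitted by ordinary least squares on an assumed series; the ACC, EXP, Mod EXP and PSO estimation variants compared in the paper are not reproduced.

```python
# Basic GM(1,1): accumulate the series, estimate (a, b) by least squares,
# predict on the accumulated scale, then difference back to the original scale.
import numpy as np

def gm11(x0, horizon=3):
    x1 = np.cumsum(x0)                               # accumulated generating operation
    z1 = 0.5 * (x1[:-1] + x1[1:])                    # mean generating sequence
    B = np.column_stack([-z1, np.ones_like(z1)])
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]      # least-squares estimate of a, b
    k = np.arange(len(x0) + horizon)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    return np.concatenate([[x0[0]], np.diff(x1_hat)])  # fitted + forecast values

x0 = np.array([112.0, 118.0, 124.5, 131.2, 138.0])   # assumed consumption series
print(gm11(x0))
```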
Abstract:
This research aims to compare the Bayesian method and full maximum likelihood for estimating the hierarchical Poisson regression model.
The comparison was carried out by simulation, using different sample sizes (n = 30, 60, 120) and different numbers of replications (r = 1000, 5000) for the experiments. The mean squared error was adopted to compare the estimation methods and to choose the best way of estimating the model. It was concluded that the hierarchical Poisson regression model estimated by full maximum likelihood with sample size (n = 30) is the best for representing the maternal mortality data, based on the estimated parameter values.
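To illustrate the simulation logic only, the sketch below repeatedly generates Poisson-regression data, fits it by maximum likelihood with statsmodels, and summarises accuracy by MSE; the hierarchical structure and the Bayesian fit used in the paper are not reproduced, and all settings are assumed.

```python
# Simulation loop: generate Poisson-regression data, fit by maximum likelihood,
# and summarise per-coefficient accuracy by mean squared error.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
beta_true, n, reps = np.array([0.5, 0.3]), 30, 1000
est = np.empty((reps, 2))
for r in range(reps):
    x = rng.normal(size=n)
    X = sm.add_constant(x)
    y = rng.poisson(np.exp(X @ beta_true))
    est[r] = sm.GLM(y, X, family=sm.families.Poisson()).fit().params

mse = np.mean((est - beta_true) ** 2, axis=0)        # per-coefficient MSE
print("MSE of (intercept, slope):", mse)
```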