Sewer sediment deposition is an important problem because it underlies several operational and environmental failures. It concerns municipalities because it degrades the sewer system and contributes to sewer failure, which can be catastrophic when it occurs in trunks or interceptors. Sewer rehabilitation is costly and complex in terms of choosing both the rehabilitation method and the individual sewers to be rehabilitated. For such a complex process, inspection techniques assist decision-making, though they add to the total expenditure of the project because they require special tools and trained personnel. In developing countries, inspection costs can be prohibitive enough to stall rehabilitation. In this study, the researchers propose an alternative method for calculating sewer sediment accumulation using predictive models: a multiple linear regression model (MLRM) and an artificial neural network (ANN). The AL-Thawra trunk sewer in Baghdad city is selected as the case study area, and data from a survey of this trunk are used in the modelling. Results show that the MLRM is acceptable, with an adjusted coefficient of determination (adjusted R²) of 89.55%. The ANN model was found practical, with an R² of 82.3%, and fit the data better across its range. Sensitivity analysis showed that flow is the most influential parameter on the depth of sediment deposition.
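The paper's models are not published as code, so the following is a minimal sketch of the general approach in scikit-learn. The predictor names (flow, slope, diameter), their ranges, and the data are hypothetical stand-ins for the AL-Thawra survey measurements.

```python
# Sketch: multiple linear regression vs. a small neural network for
# sediment-depth prediction. Predictors and data are illustrative, not
# the paper's survey data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    rng.uniform(0.05, 1.5, n),    # flow (m^3/s), hypothetical range
    rng.uniform(0.001, 0.01, n),  # pipe slope, hypothetical range
    rng.uniform(0.3, 2.0, n),     # pipe diameter (m), hypothetical range
])
# Synthetic sediment depth: decreasing in flow and slope, plus noise.
y = 0.4 - 0.15 * X[:, 0] - 20.0 * X[:, 1] + 0.05 * X[:, 2] + rng.normal(0, 0.02, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

mlrm = LinearRegression().fit(X_tr, y_tr)
ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                   random_state=0).fit(X_tr, y_tr)

for name, model in [("MLRM", mlrm), ("ANN", ann)]:
    print(name, "test R^2 =", round(r2_score(y_te, model.predict(X_te)), 3))
```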
The research applied the spatial autoregressive model (SAR) and the spatial error model (SEM) in an attempt to provide practical evidence of the importance of spatial analysis, with a particular focus on regression models that incorporate spatial dependence, whose presence can be tested with Moran's test. Ignoring this dependence may discard important information about the phenomenon under study, which is ultimately reflected in the strength of the statistical estimation, as these models form the link between ordinary regression models and time-series models. Spatial analysis had …
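Before fitting a SAR or SEM model, Moran's test is the usual check for spatial dependence. Below is a plain-NumPy sketch of Moran's I with a permutation test; the 5×5 rook-contiguity grid and values are toy inputs, whereas a real study would build the weight matrix W from the actual spatial units.

```python
# Sketch: Moran's I with a permutation test, implemented in plain NumPy.
import numpy as np

def morans_i(x, W):
    """Moran's I = (n / S0) * (z'Wz) / (z'z), with z = x - mean(x)."""
    z = x - x.mean()
    return len(x) / W.sum() * (z @ W @ z) / (z @ z)

rng = np.random.default_rng(1)
# Toy example: 5x5 grid, rook contiguity, values with a smooth trend.
n_side = 5
coords = [(i, j) for i in range(n_side) for j in range(n_side)]
n = len(coords)
W = np.zeros((n, n))
for a, (i, j) in enumerate(coords):
    for b, (k, l) in enumerate(coords):
        if abs(i - k) + abs(j - l) == 1:   # rook neighbours share an edge
            W[a, b] = 1.0

x = np.array([i + j for i, j in coords], dtype=float) + rng.normal(0, 0.5, n)

I_obs = morans_i(x, W)
# Permutation test: reshuffling x approximates the null of no dependence.
perms = np.array([morans_i(rng.permutation(x), W) for _ in range(999)])
p = (np.sum(perms >= I_obs) + 1) / (len(perms) + 1)
print(f"Moran's I = {I_obs:.3f}, permutation p-value = {p:.3f}")
```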
In this paper, a new variable selection method is presented for selecting essential variables from large datasets. The new model is a modified version of the Elastic Net, and the modified Elastic Net variable selection procedure is summarized in an algorithm. It is applied to the Leukemia dataset, which has 3051 variables (genes) and 72 samples; in practice, a dataset of this size is difficult to work with directly. The modified model is compared with several standard variable selection methods and achieves perfect classification, giving the best performance. All the calculations that have been done for this paper are in …
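The modified Elastic Net itself is not reproduced here; the sketch below shows the standard Elastic Net-penalized logistic regression it is compared against, on a synthetic matrix that only mimics the Leukemia dataset's 72 × 3051 shape.

```python
# Sketch: standard Elastic Net-penalized logistic regression as a
# baseline gene selector. Data are synthetic, not the Leukemia dataset.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
n_samples, n_genes = 72, 3051
X = rng.normal(size=(n_samples, n_genes))
beta = np.zeros(n_genes)
beta[:10] = 2.0                      # only 10 genes truly informative
y = (X @ beta + rng.normal(size=n_samples) > 0).astype(int)

X = StandardScaler().fit_transform(X)
# l1_ratio blends the lasso (sparsity) and ridge (grouping) penalties.
clf = LogisticRegression(penalty="elasticnet", solver="saga",
                         l1_ratio=0.5, C=0.1, max_iter=5000).fit(X, y)

selected = np.flatnonzero(clf.coef_[0])
print(f"genes selected: {len(selected)} of {n_genes}")
print(f"training accuracy: {clf.score(X, y):.3f}")
```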
Conditional logistic regression is often used to study the relationship between event outcomes and specific prognostic factors, and here it is applied to carry logistic regression and its predictive capabilities into environmental studies. This research demonstrates a novel approach to implementing conditional logistic regression in environmental research through inference methods based on longitudinal data. Statistical analysis of longitudinal data requires methods that properly account for the within-subject correlation of the response measurements; if this correlation is ignored, inferences such as statistical tests and confidence intervals can be largely invalid.
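One way to see the idea is a minimal sketch with statsmodels' ConditionalLogit, which conditions out each subject's intercept so within-subject correlation does not bias the coefficient inference. The data, subject structure, and variable names below are synthetic and illustrative, not the study's data.

```python
# Sketch: conditional logistic regression for longitudinal binary
# outcomes, grouping repeated measurements by subject.
import numpy as np
from statsmodels.discrete.conditional_models import ConditionalLogit

rng = np.random.default_rng(3)
n_subjects, n_visits = 100, 4
subject = np.repeat(np.arange(n_subjects), n_visits)
exposure = rng.normal(size=n_subjects * n_visits)
alpha = rng.normal(0, 1.5, n_subjects)        # subject-specific intercepts
eta = alpha[subject] + 0.8 * exposure          # true exposure effect = 0.8
y = rng.binomial(1, 1 / (1 + np.exp(-eta)))

# Conditioning on each subject's event count removes the nuisance
# intercepts alpha_i from the likelihood.
res = ConditionalLogit(y, exposure.reshape(-1, 1), groups=subject).fit()
print("estimated exposure effect:", res.params)
```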
The majority of the environmental outputs of gas refineries are oily wastewater. This research presents a novel combination of response surface methodology and an artificial neural network to model and optimize the oil content concentration of oily wastewater. Response surface methodology based on a central composite design yields a highly significant linear model with a P value < 0.0001, a coefficient of determination R² of 0.747, an adjusted R² of 0.706, and a predicted R² of 0.643. In addition, analysis of variance identified flow as the most influential parameter, and verification of the optimization results revealed a minimum oil content of 8.5 ± 0.7 ppm for an initial oil content of 991 ppm, temperature …
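The regression half of response surface methodology amounts to fitting a second-order surface to the design runs and reporting R² statistics like those above. The sketch below shows this step with hypothetical coded factors and synthetic responses, not the study's central composite design data.

```python
# Sketch: fitting a second-order (quadratic) response surface.
# Factor names and data are illustrative stand-ins for the design runs.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)
n_runs = 20
X = np.column_stack([
    rng.uniform(-1, 1, n_runs),   # coded flow rate
    rng.uniform(-1, 1, n_runs),   # coded initial oil content
])
# Synthetic response: oil content of treated effluent (ppm).
y = 30 + 8 * X[:, 0] - 5 * X[:, 1] + 4 * X[:, 0] ** 2 + rng.normal(0, 1, n_runs)

# degree=2 adds squares and interactions: the standard RSM model terms.
quad = PolynomialFeatures(degree=2, include_bias=False)
Xq = quad.fit_transform(X)
model = LinearRegression().fit(Xq, y)

r2 = model.score(Xq, y)
n, p = Xq.shape
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)
print(f"R^2 = {r2:.3f}, adjusted R^2 = {adj_r2:.3f}")
```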
In this paper, we investigate the connection between hierarchical models and the power prior distribution in quantile regression (QReg). For a specific quantile, we develop an expression for the power parameter a0 that calibrates the power prior distribution for quantile regression to a corresponding hierarchical model. In addition, we estimate the relation between a0 and the quantile level via the hierarchical model. The proposed methodology is illustrated with a real data example.
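The paper's calibration expression is not reproduced here, but the mechanics of a power prior in quantile regression can be sketched: under the usual check-loss working likelihood, the historical data enter the objective downweighted by a0 (a0 = 0 ignores them, a0 = 1 pools them fully). The data and the value of a0 below are illustrative.

```python
# Sketch: power-prior-weighted quantile regression via the check loss.
import numpy as np
from scipy.optimize import minimize

def check_loss(u, tau):
    """Quantile check function rho_tau(u) = u * (tau - 1{u < 0})."""
    return np.sum(u * (tau - (u < 0)))

rng = np.random.default_rng(5)
tau, a0 = 0.5, 0.4
X0 = np.column_stack([np.ones(80), rng.normal(size=80)])    # historical data
y0 = X0 @ np.array([1.0, 2.0]) + rng.normal(size=80)
X1 = np.column_stack([np.ones(40), rng.normal(size=40)])    # current data
y1 = X1 @ np.array([1.0, 2.0]) + rng.normal(size=40)

def objective(beta):
    # Current-data check loss plus a0-weighted historical check loss:
    # the mode of the power-prior posterior under a flat initial prior.
    return check_loss(y1 - X1 @ beta, tau) + a0 * check_loss(y0 - X0 @ beta, tau)

beta_hat = minimize(objective, x0=np.zeros(2), method="Nelder-Mead").x
print("tau =", tau, "a0 =", a0, "beta_hat =", np.round(beta_hat, 3))
```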
Fluorescence microscopy is considered a powerful imaging tool in biology and medicine. In addition to the useful signal it provides, its images have defects such as random variation in brightness, noise caused by photon detection, and background pixels that wrongly appear fluorescent because of auto-fluorescence. All of these practical limitations harm correct viewing and analysis by fluorescence microscope users. Our research lies in the field of automating image processing and analysis using image processing techniques, applying them to one of the very important experiments in biological science. This research …
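A minimal cleanup pipeline for the defects listed above might look like the sketch below: median filtering for shot-like noise, large-scale Gaussian background estimation for autofluorescence, then a simple percentile threshold. The input image is synthetic and the filter sizes are assumptions; a real pipeline would tune them to the optics.

```python
# Sketch: denoise -> background-correct -> threshold a fluorescence image.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(6)
img = np.zeros((128, 128))
img[40:60, 40:60] = 1.0                        # a bright "cell"
img += 0.3 * np.linspace(0, 1, 128)            # uneven background
img += rng.poisson(5, img.shape) * 0.02        # photon (shot-like) noise

denoised = ndimage.median_filter(img, size=3)             # suppress speckle
background = ndimage.gaussian_filter(denoised, sigma=25)  # coarse background
corrected = np.clip(denoised - background, 0, None)

mask = corrected > np.percentile(corrected, 95)   # crude segmentation
print("foreground pixels:", int(mask.sum()))
```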
This paper deals with constructing a fuzzy linear programming model with an application to the fuel products of the Dura refinery, which comprise seven products that directly affect daily consumption. After building the model, which consists of an objective function representing the selling prices of the products, fuzzy production constraints, fuzzy demand constraints, and production-requirement constraints, we used the WIN QSB program to find the optimal solution.
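The paper solved its model in WIN QSB; as a programmatic sketch, Zimmermann's max-λ reformulation is a standard way to turn such a fuzzy LP into a crisp one. The two-product numbers below are toy values, not the refinery's seven-product data.

```python
# Sketch: Zimmermann's max-lambda reformulation of a fuzzy LP,
# solved with scipy.optimize.linprog. All numbers are toy values.
import numpy as np
from scipy.optimize import linprog

profit = np.array([3.0, 5.0])   # selling prices (toy)
A = np.array([[2.0, 1.0],       # fuzzy resource constraints Ax <= b (toy)
              [1.0, 3.0]])
b = np.array([10.0, 15.0])
p = np.array([2.0, 3.0])        # constraint tolerances (fuzziness)
Z_lo, Z_hi = 20.0, 30.0         # fuzzy aspiration interval for profit

# Decision vector v = [x1, x2, lam]; maximize lam <=> minimize -lam.
c = np.array([0.0, 0.0, -1.0])
A_ub = np.vstack([
    np.hstack([-profit, [Z_hi - Z_lo]]),   # profit >= Z_lo + lam*(Z_hi - Z_lo)
    np.hstack([A, p.reshape(-1, 1)]),      # Ax <= b + (1 - lam) * p
])
b_ub = np.hstack([-Z_lo, b + p])
bounds = [(0, None), (0, None), (0, 1)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
x1, x2, lam = res.x
print(f"x = ({x1:.2f}, {x2:.2f}), satisfaction lambda = {lam:.2f}")
```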
Image compression is a serious issue in computer storage and transmission; it simply makes efficient use of the redundancy embedded within an image itself and may also exploit the limitations of human vision or perception to discard imperceivable information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed: the first stage utilizes a lossy predictor model along with a multiresolution base and thresholding techniques, and the second incorporates near-lossless compression …
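The modelling half of polynomial coding can be sketched as fitting a low-order polynomial to each image block by least squares and keeping the residual, which carries far less energy than the raw block. The block size and image below are illustrative; the paper's multiresolution and thresholding stages are not reproduced.

```python
# Sketch: per-block first-order polynomial (plane) model plus residual.
import numpy as np

def encode_block(block):
    """Fit z = a + b*x + c*y to a block; return coefficients and residual."""
    h, w = block.shape
    ys, xs = np.mgrid[0:h, 0:w]
    A = np.column_stack([np.ones(h * w), xs.ravel(), ys.ravel()])
    coef, *_ = np.linalg.lstsq(A, block.ravel(), rcond=None)
    residual = block - (A @ coef).reshape(h, w)
    return coef, residual

rng = np.random.default_rng(7)
image = np.add.outer(np.arange(32), np.arange(32)).astype(float)
image += rng.normal(0, 1, image.shape)

bs = 8
for i in range(0, 32, bs):
    for j in range(0, 32, bs):
        coef, resid = encode_block(image[i:i + bs, j:j + bs])
        # Low residual energy is what makes coarse quantization cheap.
        print(f"block ({i},{j}): residual std = {resid.std():.2f}")
```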
This research discusses the comparison between the partial least squares regression model and tree regression. These models cover two types of statistical methods: the first, parametric statistics, is represented by partial least squares, which can be adopted both when the number of variables exceeds the number of observations and when the number of observations exceeds the number of variables; the second, nonparametric statistics, is represented by tree regression, which partitions the data hierarchically. The regression models for both approaches were estimated and then compared, with the comparison based on the Mean Square Error …
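The comparison can be sketched directly in scikit-learn: fit both models and compare test mean square error, the criterion named in the abstract. The high-dimensional synthetic data (more variables than observations) stand in for the study's data.

```python
# Sketch: PLS regression vs. a regression tree, compared by test MSE.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(8)
n, p = 60, 100                       # fewer observations than variables
X = rng.normal(size=(n, p))
y = X[:, :5] @ np.array([3, -2, 1.5, 1, -1]) + rng.normal(size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
tree = DecisionTreeRegressor(max_depth=4, random_state=0).fit(X_tr, y_tr)

for name, pred in [("PLS", pls.predict(X_te).ravel()),
                   ("tree", tree.predict(X_te))]:
    print(f"{name}: test MSE = {mean_squared_error(y_te, pred):.2f}")
```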