Sewer sediment deposition is an important issue, as it relates to several operational and environmental problems. It concerns municipalities because it affects the sewer system and contributes to sewer failure, which can be catastrophic when it occurs in trunks or interceptors. Sewer rehabilitation is a costly process and is complex in terms of choosing both the rehabilitation method and the individual sewers to be rehabilitated. For such a complex process, inspection techniques assist in decision making; however, they may add to the total expenditure of a project, since they require special tools and trained personnel. In developing countries, the cost of inspection can prevent rehabilitation from proceeding. In this study, the researchers propose an alternative method for calculating sewer sediment accumulation using predictive models, namely a multiple linear regression model (MLRM) and an artificial neural network (ANN). The AL-Thawra trunk sewer in Baghdad city is selected as the case study area, and data from a survey of this trunk are used in the modelling process. Results showed that the MLRM is acceptable, with an adjusted coefficient of determination (adj. R²) of about 89.55%. The ANN model was found to be practical, with an R² of 82.3%, and fit the data better throughout its range. Sensitivity analysis showed that flow is the most influential parameter on the depth of sediment deposition.
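The multiple linear regression part of the approach, including the adjusted R² statistic quoted above, can be sketched as follows. This is a minimal illustration on synthetic data, not the paper's survey data; the predictor names in the comment are hypothetical stand-ins for the study's inputs.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 60, 3
# Hypothetical predictors (e.g. flow, slope, pipe age), synthetic response
X = rng.uniform(size=(n, p))
beta_true = np.array([2.0, -1.0, 0.5])
y = X @ beta_true + 0.1 * rng.standard_normal(n)

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ coef

# Coefficient of determination and its adjusted form
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
adj_r2 = 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)
```

The adjusted R² penalizes the plain R² for the number of predictors, which is why it is the statistic usually reported for multiple regression models like the MLRM here.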
This paper deals with constructing a fuzzy linear programming model applied to the fuel products of the Dura refinery, which comprise seven products that have a direct effect on daily consumption. After building the model, which consists of an objective function representing the selling prices of the products, together with fuzzy production constraints, fuzzy demand constraints, and production requirement constraints, the WIN QSB program was used to find the optimal solution.
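One standard way to solve such a fuzzy linear program is Zimmermann's max-λ formulation, where λ measures the degree to which the fuzzy goal and fuzzy constraints are satisfied. The sketch below uses a hypothetical two-product toy problem (the profit coefficients, tolerances, and bounds are invented, not the refinery's data) and an ordinary LP solver rather than WIN QSB:

```python
from scipy.optimize import linprog

# Hypothetical toy: profits 5 and 4 per unit of two products.
# Fuzzy profit goal: at least z0 = 20, fully satisfied at z1 = 30.
# Fuzzy capacity: x1 + x2 <= 5, with tolerance t = 2 (up to 7).
z0, z1, b, t = 20.0, 30.0, 5.0, 2.0

# Decision variables: [x1, x2, lam]; maximize lam -> minimize -lam.
c = [0.0, 0.0, -1.0]
A_ub = [
    [-5.0, -4.0, z1 - z0],  # 5*x1 + 4*x2 >= z0 + lam*(z1 - z0)
    [1.0, 1.0, t],          # x1 + x2 <= b + t*(1 - lam)
]
b_ub = [-z0, b + t]
res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None), (0.0, 1.0)])
lam = res.x[2]  # overall degree of satisfaction in [0, 1]
```

The crisp LP trades the fuzzy goal against the fuzzy capacity: for these toy numbers the optimum is λ = 0.75, i.e. both the profit aspiration and the relaxed capacity are satisfied to degree 0.75.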
Image compression is a serious issue in computer storage and transmission: it simply makes efficient use of the redundancy embedded within an image itself and, in addition, may exploit the limitations of human vision or perception to reduce imperceivable information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed: the first stage utilizes the lossy predictor model along with a multiresolution base and thresholding techniques, and the second incorporates the near-lossless com
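The model-plus-residual idea behind predictive (polynomial) coding can be illustrated with the simplest possible predictor. The example below is a generic left-neighbour predictor on a tiny invented image, not the paper's polynomial model: the encoder stores only the residual, whose values are typically much smaller (hence cheaper to entropy-code) than the raw pixels, and the decoder reconstructs the image exactly by accumulating the residuals.

```python
import numpy as np

# Tiny hypothetical image with smooth horizontal gradients
img = np.array([[10, 12, 13],
                [11, 14, 16],
                [12, 15, 18]], dtype=np.int64)

# Model: predict each pixel from its left neighbour (first column -> 0)
pred = np.zeros_like(img)
pred[:, 1:] = img[:, :-1]

# Residual = actual - predicted; this is what would be encoded
residual = img - pred

# Lossless decoding: running sum of residuals along each row
recon = np.cumsum(residual, axis=1)
```

Because neighbouring pixels are correlated, the residual's magnitudes are small compared with the pixel values themselves; a near-lossless variant would additionally quantize the residual within a guaranteed error bound.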
The matter of handwritten text recognition is still a major challenge to mainstream researchers. A number of approaches to this challenge have been attempted in recent years, for the most part concentrating on the space of English pre-printed or handwritten characters. Consequently, there is a need for research concerning the recognition of handwritten Arabic texts. Arabic handwriting presents unique technical difficulties because it is cursive and written right to left, and the letters change their shapes and structures depending on whether they appear at the initial, middle, isolated, or final position of a word. In this study, an Arabic text recognition system is developed and designed to recognize images of Arabic text/characters. The proposed model gets a single l
As Internet of Things (IoT) applications and devices increase, their access capability is frequently stressed. This can lead to a significant bottleneck problem for network performance in different layers of an end-point-to-end-point (P2P) communication route. An appropriate characterization (i.e., classification) of time-varying traffic prediction has been used to address this issue; nevertheless, it remains a great open challenge, because most of the existing solutions depend on machine learning (ML) methods that incur a high computational cost and do not take into account the fine-grained, accurate flow classification that IoT devices require. Therefore, this paper presents a new model bas
The Bayesian approach promises to enhance the classification-tree model of regression analysis by taking advantage of prior information on the one hand and, on the other, by ensembling trees over all explanatory variables together at every stage, in addition to obtaining posterior information at each node during the construction of these classification trees. Although Bayesian estimates are generally accurate, the logistic model still seems to be a good competitor for binary responses thanks to its flexibility and mathematical representation. Three methods are therefore used for data processing in this research, namely: the logistic model, the classification and regression tree model, and the Bayesian regression tree model.
The duration of sunshine is one of the important indicators and variables for measuring the amount of solar radiation received in a particular area. The duration of solar brightness has been used to study atmospheric energy balance, sustainable development, ecosystem evolution, and climate change. Predicting the daily average values of sunshine duration (SD) for Duhok city, Iraq, using the artificial neural network (ANN) approach is the focus of this paper. Many different ANN models with different input variables were used in the prediction process. The daily average of the month, average temperature, maximum temperature, minimum temperature, relative humidity, wind direction, cloud level, and atmosp
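The core of such an ANN prediction model is a small feed-forward network trained by back-propagation on meteorological inputs. The sketch below trains a one-hidden-layer network on synthetic data; the input meanings in the comment and the linear target are invented stand-ins, not the Duhok measurements:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical inputs: [mean temperature, relative humidity] (scaled to [0,1])
X = rng.uniform(size=(200, 2))
# Synthetic "sunshine hours" target for demonstration only
y = (4.0 * X[:, 0] - 3.0 * X[:, 1] + 6.0).reshape(-1, 1)

h, lr = 8, 0.05                       # hidden units, learning rate
W1 = rng.standard_normal((2, h)) * 0.5
b1 = np.zeros(h)
W2 = rng.standard_normal((h, 1)) * 0.5
b2 = np.zeros(1)

losses = []
for _ in range(500):
    H = np.tanh(X @ W1 + b1)          # forward pass: hidden layer
    out = H @ W2 + b2                 # linear output layer
    err = out - y
    losses.append(float(np.mean(err ** 2)))
    # backward pass: gradients of mean squared error
    gout = 2.0 * err / len(X)
    gW2, gb2 = H.T @ gout, gout.sum(0)
    gH = (gout @ W2.T) * (1.0 - H ** 2)
    gW1, gb1 = X.T @ gH, gH.sum(0)
    # gradient-descent updates
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
```

A study like this one would vary the set of input variables (temperature, humidity, cloud level, and so on) and compare the resulting models on held-out data.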
Fluorescence microscopy is considered a powerful imaging tool in biology and medicine. Alongside the useful signal obtained from fluorescence microscopy, its images suffer from defects such as random variation in brightness, noise caused by photon detection, and background pixels in the acquired fluorescence microscopic images that wrongly appear to have the auto-fluorescence property. All these practical limitations have a negative impact on the correct vision and analysis of fluorescence microscope users. Our research enters the field of automating image processing and image analysis using image processing techniques, applying this processing and analysis to one of the very important experiments in biological science. This research
Shallow foundations are usually used for structures with light to moderate loads where the soil underneath can carry them. In some cases, soil strength and/or other properties are not adequate and require improvement using one of the ground improvement techniques. The stone column is one of the common improvement techniques, in which a column of stone is installed vertically in clayey soils. Stone columns are usually used to increase soil strength and to accelerate soil consolidation by acting as vertical drains. Much research has been done to estimate the behaviour of the improved soil; however, none of it has considered the effect of stone column geometry on the behaviour of a circular footing. In this research, finite ele
This study uses robust nonparametric methods to estimate location and scatter, based on the minimum covariance determinant (MCD) for a multivariate regression model. Because of the presence of outlier values, the increase in sample size, and the multiple responses of the multivariate regression model, it becomes difficult to find the median location.
The genetic-algorithm-based Fast-MCD Nested Extension was used and compared with a multilayer back-propagation neural network in terms of accuracy of results and speed in finding the median location, while the best sample is determined by relying on the smallest distance (the Mahalanobis distance) has the stu
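The Mahalanobis-distance screening step that underlies MCD-type methods can be sketched as follows. This toy example (invented data, one planted gross outlier) uses the classical mean and covariance rather than the Fast-MCD estimates the study compares; the full MCD estimator would instead search for the half-sample whose covariance matrix has minimum determinant and compute the distances from that robust fit.

```python
import numpy as np

rng = np.random.default_rng(2)
# 100 well-behaved bivariate points plus one planted gross outlier
X = rng.multivariate_normal([0.0, 0.0],
                            [[1.0, 0.5], [0.5, 1.0]], size=100)
X = np.vstack([X, [[8.0, -8.0]]])

# Classical location and scatter estimates
mu = X.mean(axis=0)
cov = np.cov(X, rowvar=False)
inv = np.linalg.inv(cov)

# Squared Mahalanobis distance of every point from the centre
diff = X - mu
d2 = np.einsum('ij,jk,ik->i', diff, inv, diff)
outlier_idx = int(np.argmax(d2))  # the planted outlier stands out
```

Points with the smallest distances form the "best" (cleanest) subsample, which is the criterion referred to above; scikit-learn's `MinCovDet` provides a full Fast-MCD implementation of this idea.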