Sewer sediment deposition is an important issue because it is linked to several operational and environmental problems. It concerns municipalities because it degrades the sewer system and contributes to sewer failure, which can be catastrophic when it occurs in trunks or interceptors. Sewer rehabilitation is costly and complex, both in choosing the rehabilitation method and in selecting the individual sewers to be rehabilitated. For such a complex process, inspection techniques assist decision-making, though they add to the total cost of the project because they require special tools and trained personnel. In developing countries, the cost of inspection can prevent rehabilitation from proceeding. In this study, the researchers propose an alternative method for estimating sewer sediment accumulation using predictive models based on a multiple linear regression model (MLRM) and an artificial neural network (ANN). The Al-Thawra trunk sewer in Baghdad was selected as the case study area, and data from a survey of this trunk were used in the modeling process. Results showed that the MLRM is acceptable, with an adjusted coefficient of determination (adj. R2) on the order of 89.55%. The ANN model was found to be practical, with an R2 of 82.3%, and fit the data better throughout its range. Sensitivity analysis showed that flow is the parameter with the greatest influence on sediment deposition depth.
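As an illustration of the modeling approach described in this abstract, the following is a minimal Python sketch that fits a multiple linear regression model and a small neural-network regressor to predict sediment depth. The predictor names (flow, velocity, slope, diameter), the synthetic data, and the model settings are assumptions for demonstration only; they are not the survey data or the exact model structures used in the study.

```python
# Minimal sketch: MLR and ANN regressors for sediment-depth prediction.
# Predictor names, coefficients, and synthetic data are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 200
# Hypothetical predictors: flow (m3/s), velocity (m/s), slope (-), diameter (m)
X = np.column_stack([
    rng.uniform(0.1, 2.0, n),     # flow
    rng.uniform(0.2, 1.5, n),     # velocity
    rng.uniform(0.001, 0.01, n),  # slope
    rng.uniform(0.5, 2.5, n),     # diameter
])
# Hypothetical sediment depth, driven mainly by flow, plus noise
y = 0.15 * X[:, 0] - 0.05 * X[:, 1] + 5.0 * X[:, 2] + 0.02 * X[:, 3] + rng.normal(0, 0.01, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

mlr = LinearRegression().fit(X_tr, y_tr)
ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0).fit(X_tr, y_tr)

print("MLR R2:", r2_score(y_te, mlr.predict(X_te)))
print("ANN R2:", r2_score(y_te, ann.predict(X_te)))
```

In a sketch like this, the fitted MLR coefficients (or a simple one-at-a-time perturbation of each input) give a rough sensitivity ranking of the predictors, analogous in spirit to the sensitivity analysis mentioned in the abstract.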
Digital tampering identification, which detects picture modification, is a significant area of image analysis research. Over the last five years, the field has advanced considerably, achieving exceptional precision through machine learning and deep learning based strategies. Synthesis and reinforcement-based learning techniques must now evolve to keep pace with the research. However, before doing any experimentation, a scientist must first understand the current state of the art in the domain. Diverse research directions, their associated outcomes, and their analysis lay the groundwork for successful experimentation and superior results. Before starting experiments, universal image forensics approaches must be thoroughly researched. As a result, this review of variou
M. M. Abdulwahhab, Kufa Journal for Nursing Sciences, 2017.
In this paper, a new analytical method is introduced to find the general solution of linear partial differential equations. In this method, the Laplace transform (LT) and the Sumudu transform (ST) are each used independently, together with canonical coordinates. The strength of this method is that it is easy to implement and does not require initial conditions.
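As a hedged illustration of how an integral transform reduces a linear PDE to an ordinary differential equation, the block below gives the standard definitions of the two transforms and applies the Laplace transform to a simple advection equation. The specific equation is chosen for demonstration and is not taken from the paper, which also works in canonical coordinates.

```latex
% Standard transform definitions:
\mathcal{L}\{f\}(s) = \int_0^{\infty} f(t)\, e^{-st}\, dt ,
\qquad
\mathcal{S}\{f\}(u) = \int_0^{\infty} f(ut)\, e^{-t}\, dt .

% Illustrative linear PDE (not from the paper): the advection equation
%   u_t + c\, u_x = 0 .
% Taking the Laplace transform in t turns it into an ODE in x:
s\, U(x,s) - u(x,0) + c\, \frac{d U(x,s)}{d x} = 0 ,
% which can be solved for U(x,s) and then inverted to recover u(x,t).
```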
The therapeutic value of phenolic components and of pure thymol is well known. This study comprised the extraction of crude phenol from two thymol-containing plants (Thymus vulgaris and Artemisia annua) and the evaluation of the hematological and histological effects of these extracts and of pure thymol, tested on laboratory mice at three different concentrations each. All the mice were allowed free access to water and feed for 21 days under laboratory conditions. The control mice (group I) were given pure water orally, while groups II, III, and IV were given orally the T. vulgaris crude phenol extract, the A. annua crude phenol extract, a 50:50 combination of the two crude phenol extracts, and pure thymol, respectively. The levels of CHO, TRI, and HDL were
The open-hole well log data (resistivity, sonic, and gamma ray) of well X in the Euphrates subzone of the Mesopotamian basin were used to estimate the total organic carbon (TOC) of the Zubair Formation in the southern part of Iraq. Mathematical interpretation of the log parameters helped in estimating TOC and source-rock productivity, and quantitative interpretation of the log data allowed the organic content and source-rock intervals to be identified. The response of the logs to increasing TOC can be recognized in the log parameters; in this way, TOC can be predicted from an increase in the gamma-ray, sonic, neutron, and resistivity readings together with a decrease in the density log
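The log-response behavior described above (TOC rising with resistivity and sonic readings and falling with density) is also the basis of the widely used Passey delta-log-R method. The sketch below implements that standard formulation purely as an illustration; the abstract does not state which exact relation the authors applied, and the baseline values, maturity level (LOM), and log readings here are assumptions.

```python
# Illustrative sketch of the standard Passey (1990) delta-log-R estimate of TOC.
# Baselines, LOM, and log readings are assumptions, not values from the paper.
import numpy as np

def delta_log_r(rt, dt, rt_baseline=2.0, dt_baseline=100.0):
    """Delta log R from deep resistivity (ohm.m) and sonic transit time (us/ft)."""
    return np.log10(rt / rt_baseline) + 0.02 * (dt - dt_baseline)

def toc_passey(rt, dt, lom=10.0, **baselines):
    """TOC (wt%) from delta log R and the level of organic metamorphism (LOM)."""
    dlr = delta_log_r(rt, dt, **baselines)
    return dlr * 10 ** (2.297 - 0.1688 * lom)

# Hypothetical log readings over a short interval
rt = np.array([2.0, 6.0, 15.0])       # deep resistivity, ohm.m
dt = np.array([100.0, 110.0, 120.0])  # sonic transit time, us/ft
print(toc_passey(rt, dt, lom=10.0))
```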
Recently, new concepts such as free data and Volunteered Geographic Information (VGI) have emerged with Web 2.0 technologies. OpenStreetMap (OSM) is one of the most representative projects of this trend. Geospatial data from different sources often have variable accuracy levels due to different data collection methods; therefore, the most concerning problem with OSM is its unknown quality. This study aims to develop a specific tool, written in the Matlab programming language, that can analyze and assess the possible matching of OSM road features with a reference dataset. The tool was applied to two different study areas in Iraq (Baghdad and Karbala) in order to verify whether the OSM data have the same quality in both study areas. The program, in general, consists
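The paper's tool was written in Matlab and its exact matching algorithm is not described in the excerpt above. As an illustration of one common way such OSM-versus-reference comparisons are done, the Python sketch below computes, with shapely, the fraction of each OSM road's length that falls inside a buffer around the reference road network; all geometries and the buffer width are hypothetical.

```python
# Illustrative buffer-overlap check between OSM roads and a reference network.
# Geometries and the buffer width are hypothetical; this is not the paper's tool.
from shapely.geometry import LineString
from shapely.ops import unary_union

# Hypothetical reference roads and OSM roads (projected coordinates, metres)
reference_roads = [LineString([(0, 0), (100, 0)]), LineString([(100, 0), (100, 100)])]
osm_roads = [LineString([(0, 2), (100, 3)]), LineString([(0, 50), (100, 50)])]

buffer_m = 5.0  # assumed positional tolerance
reference_buffer = unary_union([r.buffer(buffer_m) for r in reference_roads])

for i, road in enumerate(osm_roads):
    inside = road.intersection(reference_buffer).length
    ratio = inside / road.length if road.length else 0.0
    print(f"OSM road {i}: {100 * ratio:.1f}% of length within {buffer_m} m of reference")
```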
In this paper, some commonly used clustering techniques are compared. A comparison was made between the agglomerative hierarchical clustering technique and the k-means family, which includes the standard k-means technique, a variant of k-means, and bisecting k-means. Although hierarchical clustering is considered one of the best clustering methods, its use is limited by its time complexity. The results, calculated from an analysis of the characteristics of the clustering algorithms and the nature of the data, showed that the bisecting k-means technique performed best compared with the other methods used.
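As an illustration of the kind of comparison described above, the following minimal Python sketch clusters the same synthetic data with agglomerative hierarchical clustering, standard k-means, and bisecting k-means, scoring each with a silhouette coefficient. The data, the metric, and the scikit-learn implementations are stand-ins, not the paper's experiments (BisectingKMeans requires scikit-learn 1.1 or newer).

```python
# Illustrative comparison of clustering techniques on synthetic data.
# Not the paper's data or evaluation; silhouette score is a stand-in metric.
from sklearn.datasets import make_blobs
from sklearn.cluster import AgglomerativeClustering, KMeans, BisectingKMeans
from sklearn.metrics import silhouette_score

X, _ = make_blobs(n_samples=500, centers=4, random_state=0)

models = {
    "agglomerative": AgglomerativeClustering(n_clusters=4),
    "k-means": KMeans(n_clusters=4, n_init=10, random_state=0),
    "bisecting k-means": BisectingKMeans(n_clusters=4, random_state=0),
}

for name, model in models.items():
    labels = model.fit_predict(X)
    print(f"{name:>18}: silhouette = {silhouette_score(X, labels):.3f}")
```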
Image recognition is one of the most important applications of information processing. In this paper, a comparison of three-level transform-based image recognition techniques is carried out using the discrete wavelet transform (DWT) and the stationary wavelet transform (SWT) in all combinations across the three levels: stationary-stationary-stationary (sss), stationary-stationary-wavelet (ssw), stationary-wavelet-stationary (sws), stationary-wavelet-wavelet (sww), wavelet-stationary-stationary (wss), wavelet-stationary-wavelet (wsw), wavelet-wavelet-stationary (wws), and wavelet-wavelet-wavelet (www). The techniques are compared according to the peak signal-to-noise ratio (PSNR), root mean square error (RMSE), compression ratio (CR), and the coding noise e(n) of each third
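To make the decomposition choices concrete, below is a minimal Python sketch (using PyWavelets and NumPy) of a single-level DWT versus SWT round trip with simple coefficient thresholding, followed by PSNR/RMSE measurement. The three-level mixed cascades (sss through www) compared in the paper would chain such steps; the test image, wavelet, and threshold here are assumptions.

```python
# Illustrative single-level DWT vs. SWT round trip with hard thresholding of the
# detail coefficients (a crude stand-in for compression) and PSNR/RMSE measurement.
import numpy as np
import pywt

rng = np.random.default_rng(0)
image = rng.uniform(0, 255, size=(64, 64))  # stand-in for a real test image

def psnr_rmse(original, reconstructed, peak=255.0):
    rmse = np.sqrt(np.mean((original - reconstructed) ** 2))
    psnr = np.inf if rmse == 0 else 20 * np.log10(peak / rmse)
    return psnr, rmse

def hard_threshold(level_coeffs, value):
    # level_coeffs is (cA, (cH, cV, cD)); keep cA, threshold the detail bands
    cA, details = level_coeffs
    return cA, tuple(pywt.threshold(d, value, mode="hard") for d in details)

# Decimated DWT: decompose, threshold, reconstruct
dwt_rec = pywt.idwt2(hard_threshold(pywt.dwt2(image, "haar"), 10.0), "haar")

# Undecimated SWT: decompose, threshold, reconstruct
swt_level = pywt.swt2(image, "haar", level=1)[0]
swt_rec = pywt.iswt2([hard_threshold(swt_level, 10.0)], "haar")

print("DWT PSNR, RMSE:", psnr_rmse(image, dwt_rec))
print("SWT PSNR, RMSE:", psnr_rmse(image, swt_rec))
```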
Significant advances in automated glaucoma detection have been made through the use of machine learning (ML) and deep learning (DL) methods, an overview of which is provided in this paper. What sets the current literature review apart is its exclusive focus on the aforementioned techniques for glaucoma detection, using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines for filtering the selected papers. To achieve this, an advanced search was conducted in the Scopus database, specifically looking for research papers published in 2023 with the keywords "glaucoma detection", "machine learning", and "deep learning". Among the many papers found, the ones focusing