The goal of this research is to develop a numerical model that simulates the sedimentation process under two scenarios: first, with the flocculation unit on duty, and second, with the flocculation unit out of commission. The general equations of flow and sediment transport were solved using the finite difference method and then coded in Matlab. The difference in removal efficiency between the coded model and the operational model for each particle-size dataset was small, at +3.01%, indicating that the model can be used to predict the removal efficiency of a rectangular sedimentation basin. The study also revealed that the critical particle size was 0.01 mm: most particles with diameters larger than 0.01 mm settled under physical (gravitational) force, while most particles with diameters smaller than 0.01 mm settled through the flocculation process. At 10 m from the inlet zone, the removal efficiency exceeded 60% of the total removal rate, indicating that increasing basin length is not a cost-effective way to improve removal efficiency. The influence of the flocculation process appears at particle sizes smaller than 0.01 mm, which account for a small percentage (10%) of the sieve-analysis results. When that percentage reaches 20%, the difference in cumulative removal efficiency rises from +3.57% to 11.1% at the AL-Muthana sedimentation unit.
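As a rough illustration of why 0.01 mm behaves as a critical size, here is a minimal sketch of ideal-basin removal under Stokes' law. This is not the authors' finite-difference model; the particle density, water properties, and overflow rate are assumed values.

```python
# Sketch: ideal-basin removal fraction via Stokes' law (Camp's model).
# Assumptions: sand-like particle density 2650 kg/m^3, water at ~20 C,
# and an illustrative overflow rate of 1e-4 m/s.

def stokes_settling_velocity(d_m, rho_p=2650.0, rho_w=998.0, mu=1.0e-3, g=9.81):
    """Terminal settling velocity (m/s) of a sphere of diameter d_m (m)."""
    return g * (rho_p - rho_w) * d_m ** 2 / (18.0 * mu)

def ideal_removal_fraction(d_m, overflow_rate=1.0e-4):
    """Ideal-basin removal: fraction = v_s / v_o, capped at 1."""
    return min(stokes_settling_velocity(d_m) / overflow_rate, 1.0)

for d_mm in (0.005, 0.01, 0.02, 0.05):
    print(d_mm, round(ideal_removal_fraction(d_mm / 1000.0), 3))
```

Particles well above 0.01 mm settle almost completely by gravity alone, while smaller ones need flocculation to grow before they can settle, consistent with the critical size reported above.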
Children's growth curves are among the most commonly used tools for assessing the general welfare of a society, the child being one of the pillars of its development; through these tools we can track a child's growth physiology. Centile lines are among the important tools for building these curves: they give an accurate interpretation of the information and respond to the explanatory variable, age. To build standard growth curves we use BMI as the index. The LMSP method is used to find the centile lines; it depends on four curves representing the median, coefficient of variation, skewness, and kurtosis. These can be obtained by modeling the four parameters as nonparametric smoothing functions of the explanatory variable. …
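To show the idea behind centile lines, here is a minimal sketch using simple empirical percentile binning on simulated age/BMI data. This is not the LMSP penalized-likelihood fit the abstract describes; the data and bin widths are assumptions.

```python
# Sketch: empirical BMI centile lines by age, via percentile binning.
# The simulated BMI trend (15 + 0.3*age plus noise) is purely illustrative.
import numpy as np

rng = np.random.default_rng(3)
age = rng.uniform(2, 18, 2000)
bmi = 15.0 + 0.3 * age + rng.normal(scale=1.5, size=age.size)

bins = np.arange(2, 19, 2)  # 2-year age bins
for lo, hi in zip(bins[:-1], bins[1:]):
    grp = bmi[(age >= lo) & (age < hi)]
    p3, p50, p97 = np.percentile(grp, [3, 50, 97])
    print(f"age {lo}-{hi}: P3={p3:.1f} P50={p50:.1f} P97={p97:.1f}")
```

The LMSP approach replaces these raw bin percentiles with four smooth parameter curves, which yields continuous, less noisy centile lines.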
The seemingly unrelated regression (SUR) model is a special case of multivariate models in which the error terms of the equations are contemporaneously correlated. The generalized least squares (GLS) estimator is efficient because it takes the covariance structure of the errors into account, but it is also very sensitive to outliers. Robust SUR estimators can deal with outliers. We propose two robust methods for calculating the estimator, S-estimation and FastSUR, and find that they significantly improve the quality of the SUR model estimates. In addition, the results show that FastSUR is superior to the S method in dealing with outliers in the data set, as it has lower MSE and RMSE and higher R-squared and adjusted R-squared.
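For orientation, here is a sketch of classical (non-robust) feasible GLS for a two-equation SUR system on simulated data. The paper's S-estimation and FastSUR procedures replace the OLS residual step below with robust counterparts; the data and coefficients here are assumptions.

```python
# Sketch: textbook feasible GLS for a two-equation SUR system.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = np.column_stack([np.ones(n), rng.normal(size=n)])
x2 = np.column_stack([np.ones(n), rng.normal(size=n)])
# Contemporaneously correlated errors are what make GLS beat OLS here.
errs = rng.multivariate_normal([0, 0], [[1.0, 0.8], [0.8, 1.0]], size=n)
y1 = x1 @ np.array([1.0, 2.0]) + errs[:, 0]
y2 = x2 @ np.array([-1.0, 0.5]) + errs[:, 1]

# Step 1: equation-by-equation OLS residuals estimate the error covariance.
b1 = np.linalg.lstsq(x1, y1, rcond=None)[0]
b2 = np.linalg.lstsq(x2, y2, rcond=None)[0]
R = np.column_stack([y1 - x1 @ b1, y2 - x2 @ b2])
sigma = R.T @ R / n

# Step 2: stack the system and apply GLS with Omega = Sigma kron I_n.
X = np.block([[x1, np.zeros_like(x2)], [np.zeros_like(x1), x2]])
y = np.concatenate([y1, y2])
omega_inv = np.kron(np.linalg.inv(sigma), np.eye(n))
beta = np.linalg.solve(X.T @ omega_inv @ X, X.T @ omega_inv @ y)
print(beta.round(2))  # close to the true [1, 2, -1, 0.5]
```

A single outlier injected into `errs` would distort both the residual covariance and the final estimate, which is the sensitivity the robust S and FastSUR methods address.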
This paper examines a new nonlinear system of multiple integro-differential equations containing symmetric matrices with impulsive actions. The numerical-analytic method for ordinary differential equations and the Banach fixed point theorem are used to study the existence, uniqueness, and stability of periodic solutions of impulsive integro-differential equations with piecewise continuous functions. The study is based on a Hölder condition whose exponents are real numbers between 0 and 1.
Segmentation of real-world images is considered one of the most challenging tasks in computer vision because of several issues associated with such images: high interference between object foreground and background, complicated objects, and cases where the pixel intensities of the object and background are almost identical. This research introduces a modified adaptive segmentation process with image contrast stretching, namely gamma stretching, to improve segmentation. The iterative segmentation process based on the proposed criteria gives the segmentation process the flexibility to find a suitable region of interest. In addition, the use of gamma stretching helps in separating the …
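A minimal sketch of gamma contrast stretching on a grayscale array follows; this is one common form of the power-law transform, and the paper's exact variant may differ. The gamma value and test image are assumptions.

```python
# Sketch: gamma contrast stretching of a grayscale image.
# gamma < 1 brightens and expands dark regions, helping to separate
# an object whose intensities sit close to the background's.
import numpy as np

def gamma_stretch(img, gamma=0.5):
    """Normalize to [0, 1], apply the power-law transform, rescale to 0-255."""
    norm = (img - img.min()) / max(img.max() - img.min(), 1e-12)
    return (255.0 * norm ** gamma).astype(np.uint8)

img = np.array([[10, 50], [120, 240]], dtype=float)
print(gamma_stretch(img, gamma=0.5))
```

Note how the dark pixels (10, 50) are pushed apart much more than the bright ones, which is why the transform aids thresholding of low-contrast foregrounds.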
The theoretical analysis depends on the Classical Laminated Plate Theory (CLPT), which is based on the von Kármán theory and the Kirchhoff hypothesis for the deflection analysis within the elastic limit, together with Hooke's law for calculating the stresses. A new boundary-condition function, drawn from several sources in advanced engineering mathematics, is used to solve the fourth-order differential equations. The behavior of symmetric and anti-symmetric cross-ply composite laminated plates under out-of-plane loads (uniformly distributed loads) with two different boundary conditions is investigated to obtain the central deflection of the mid-plane using the Ritz method. The computer program is built using Ma…
The majority of systems dealing with natural language processing (NLP) and artificial intelligence (AI) can assist in making automated and automatically supported decisions. However, these systems may face challenges, or find it difficult, to identify the information required to elicit a decision by extracting or summarizing relevant information from large text documents or massive content. When such documents are obtained online, for instance from social networking or social media sites, their textual content grows remarkably. The main objective of the present study is to survey the latest developments in the implementation of text-mining techniques …
Abstract
Due to the lack of previous statistical studies of the behavior of payments, specifically health insurance payments, which represent the largest proportion of payments in the general insurance companies in Iraq, this study was applied at the Iraqi insurance company.
To find a suitable model for the health insurance payments, we initially identified two candidate probability models using the Easy Fit software:
first, a single Lognormal for the whole sample, and second, a Compound Weibull for the two sub-samples (small payments and large payments); we focused on the compound …
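A hedged sketch of the two-part ("compound") Weibull idea follows: split the payments at a threshold and fit each subsample by maximum likelihood. The threshold value and simulated payment data are assumptions, and the paper's threshold-selection procedure is not reproduced here.

```python
# Sketch: fitting a two-part Weibull to payments split at a threshold.
# Simulated small/large payments stand in for the real insurance data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
small = stats.weibull_min.rvs(1.5, scale=100.0, size=500, random_state=rng)
large = stats.weibull_min.rvs(0.8, scale=5000.0, size=100, random_state=rng) + 1000.0
payments = np.concatenate([small, large])

threshold = 1000.0  # assumed split between small and large payments
lo, hi = payments[payments <= threshold], payments[payments > threshold]
# Fit each component by MLE, fixing the location at the component's origin.
c_lo, _, scale_lo = stats.weibull_min.fit(lo, floc=0.0)
c_hi, _, scale_hi = stats.weibull_min.fit(hi, floc=threshold)
w = len(lo) / len(payments)  # mixing weight of the small-payment component
print(round(w, 2), round(c_lo, 2), round(c_hi, 2))
```

The compound density is then the weighted sum of the two fitted Weibull densities, which captures the heavy right tail of large claims better than a single distribution.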
In this article, a new efficient approach is presented to solve a class of partial differential equations, namely nonlinear, nonhomogeneous (2+1)-dimensional differential equations. The procedure of the new approach is proposed for solving important types of differential equations and obtaining accurate analytic solutions, i.e., exact solutions. The effectiveness of the suggested approach is assessed against other approaches used for this type of differential equation, such as the Adomian decomposition method, the homotopy perturbation method, the homotopy analysis method, and the variational iteration method. The advantage of the present method is illustrated by several examples.
In this research, some robust nonparametric methods were used to estimate the semiparametric regression model, and the methods were then compared using the MSE criterion across different sample sizes, variance levels, contamination rates, and three different models. The methods are (S-LLS) S-estimation with local linear smoothing, (M-LLS) M-estimation with local linear smoothing, (S-NW) S-estimation with Nadaraya-Watson smoothing, and (M-NW) M-estimation with Nadaraya-Watson smoothing.
The results for the first model showed that the (S-LLS) method was best for large sample sizes, while for small sample sizes the …
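A minimal sketch of the Nadaraya-Watson smoother underlying the S-NW and M-NW methods follows (Gaussian kernel). The bandwidth and simulated data are assumptions, and no robust S- or M-weighting is applied here.

```python
# Sketch: Nadaraya-Watson kernel regression with a Gaussian kernel.
# m_hat(x) = sum_i K((x - x_i)/h) y_i / sum_i K((x - x_i)/h)
import numpy as np

def nadaraya_watson(x_grid, x, y, h=0.3):
    """Evaluate the NW estimate at each point of x_grid."""
    w = np.exp(-0.5 * ((x_grid[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

rng = np.random.default_rng(2)
x = rng.uniform(0, 2 * np.pi, 300)
y = np.sin(x) + rng.normal(scale=0.2, size=x.size)
grid = np.linspace(0.5, 5.5, 5)
print(nadaraya_watson(grid, x, y).round(2))  # roughly sin(grid)
```

The robust variants in the paper replace this plain kernel average with S- or M-type weighted fits, so single outliers in `y` pull the curve far less.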
In this paper, one of the machine scheduling problems is studied: scheduling a number of products (n jobs) on a single machine with a multi-criteria objective function combining completion time, tardiness, earliness, and late work. The branch and bound (BAB) method is used as the main solution method; four upper bounds and one lower bound are proposed, and a number of dominance rules are applied to reduce the number of branches in the search tree. A genetic algorithm (GA) and particle swarm optimization (PSO) are used to obtain two of the upper bounds. The computational results are calculated by coding (programming) …
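A minimal sketch of evaluating the four criteria for one candidate job sequence on a single machine follows. The job data are illustrative, and the paper's exact way of combining the criteria into one objective is not reproduced here.

```python
# Sketch: the four scheduling criteria for a single-machine sequence.
# C = total completion time, T = total tardiness, E = total earliness,
# V = total late work (the portion of each job processed after its due date).
def evaluate(sequence, p, d):
    """Return (C, T, E, V) for the given job sequence."""
    t = C = T = E = V = 0
    for j in sequence:
        t += p[j]                         # completion time of job j
        C += t
        T += max(t - d[j], 0)             # tardiness of job j
        E += max(d[j] - t, 0)             # earliness of job j
        V += min(max(t - d[j], 0), p[j])  # late work of job j
    return C, T, E, V

p = {0: 3, 1: 2, 2: 4}   # processing times (assumed)
d = {0: 4, 1: 3, 2: 10}  # due dates (assumed)
print(evaluate([1, 0, 2], p, d))  # -> (16, 1, 2, 1)
```

In a BAB search, this evaluation scores each leaf, while the GA and PSO heuristics use it as the fitness function when generating upper bounds.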