The goal of this research is to develop a numerical model that simulates the sedimentation process under two scenarios: first, with the flocculation unit in operation, and second, with the flocculation unit out of service. The general equations of flow and sediment transport were solved using the finite difference method and coded in Matlab. The difference in removal efficiency between the coded model and the operational model for each particle-size dataset was small, at +3.01%, indicating that the model can be used to predict the removal efficiency of a rectangular sedimentation basin. The study also revealed that the critical particle size was 0.01 mm: most particles with diameters larger than 0.01 mm settled under physical forces, while most particles with diameters smaller than 0.01 mm settled through the flocculation process. At 10 m from the inlet zone, the removal efficiency exceeded 60% of the total removal rate, indicating that increasing basin length is not a cost-effective way to improve removal efficiency. The influence of the flocculation process appears at particle sizes smaller than 0.01 mm, which make up a small percentage (10%) of the sieve analysis. When this percentage reaches 20%, the difference in cumulative removal efficiency rises from +3.57% to 11.1% at the AL-Muthana sedimentation unit.
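As background to how such removal efficiencies are typically computed, the sketch below combines Stokes' law with the ideal-basin (overflow-rate) removal criterion. It is a minimal illustration with hypothetical basin parameters (Q, A) and water properties, not the authors' finite difference model.

```python
import numpy as np

# Stokes settling velocity for a spherical particle (laminar regime).
# All parameter values are illustrative, not taken from the study.
def stokes_velocity(d, rho_p=2650.0, rho_w=1000.0, mu=1.0e-3, g=9.81):
    """Settling velocity (m/s) of a particle of diameter d (m)."""
    return g * (rho_p - rho_w) * d**2 / (18.0 * mu)

# Ideal-basin (Hazen) removal fraction: a particle is fully removed when
# its settling velocity exceeds the surface overflow rate Q/A.
def removal_fraction(d, Q=0.05, A=200.0):
    """Removal fraction for particles of diameter d (m), hypothetical Q, A."""
    vo = Q / A                          # surface overflow rate (m/s)
    return np.minimum(stokes_velocity(d) / vo, 1.0)

diameters = np.array([0.005, 0.01, 0.02, 0.05]) * 1e-3   # mm -> m
for d, r in zip(diameters, removal_fraction(diameters)):
    print(f"d = {d*1e3:.3f} mm : removal = {r:.2%}")
```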
Many fuzzy clustering methods are based on within-cluster scatter with a compactness measure. This paper presents a new fuzzy clustering method that depends on both within-cluster scatter with a compactness measure and between-cluster scatter with a separation measure, called fuzzy compactness and separation (FCS). Fuzzy linear discriminant analysis (FLDA) is based on the within-cluster and between-cluster scatter matrices, and the two fuzzy scatter matrices in the objective function ensure compactness between data elements and cluster centers. A cluster validation method for testing the optimal number of clusters is discussed, and an illustrative example is then applied.
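For context, here is a minimal sketch of standard fuzzy c-means, the within-cluster compactness baseline that FCS extends with a between-cluster separation term; the fuzzifier m and iteration settings are illustrative choices, not values from the paper.

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, n_iter=100, tol=1e-6, seed=0):
    """Standard fuzzy c-means: minimizes the within-cluster compactness
    objective sum_ik u_ik^m ||x_k - v_i||^2 (FCS adds a separation term)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((c, n))
    U /= U.sum(axis=0)                          # memberships sum to 1 per point
    for _ in range(n_iter):
        Um = U ** m
        V = (Um @ X) / Um.sum(axis=1, keepdims=True)       # cluster centers
        d2 = ((X[None, :, :] - V[:, None, :]) ** 2).sum(-1) + 1e-12
        U_new = 1.0 / (d2 ** (1.0 / (m - 1)))              # FCM membership update
        U_new /= U_new.sum(axis=0)
        if np.abs(U_new - U).max() < tol:
            return U_new, V
        U = U_new
    return U, V
```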
The use of remote sensing techniques for managing and monitoring environmental areas is increasing due to improvements in the sensors carried by Earth-observation satellites. The resolution merge process combines a high-resolution single-band image with a low-resolution multi-band image to produce one image with both high spatial and high spectral resolution. In this work, different merging methods were tested to evaluate their enhancement capabilities for extracting different environmental areas; principal component analysis (PCA), Brovey, modified Intensity-Hue-Saturation (IHS), and High Pass Filter methods were tested and subjected to visual and statistical comparison for evaluation. Both visu
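Of the methods compared, the Brovey transform is the simplest to state; the sketch below assumes a three-band multispectral image already co-registered and resampled to the panchromatic grid, and is an illustration rather than the exact procedure used in this work.

```python
import numpy as np

def brovey_merge(ms, pan, eps=1e-9):
    """Brovey transform pan-sharpening.

    ms  : float array (H, W, 3), multispectral bands resampled to the
          panchromatic grid (co-registration is assumed).
    pan : float array (H, W), high-resolution panchromatic band.
    Each output band is the input band rescaled so that the band sum
    matches the panchromatic intensity at every pixel.
    """
    intensity = ms.sum(axis=2) + eps
    return ms * (pan / intensity)[..., None]
```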
In this article, the inverse source problem is determined for the partition hyperbolic equation under the left-end flux tension of the string, where an extra measurement is considered. The approximate solution is obtained in the form of splitting and by applying the finite difference method (FDM). Moreover, this problem is ill-posed: the recovered force becomes unstable after noise is added to the additional condition. To stabilize the solution, a regularization matrix is considered, and error estimates between the regularized solution and the exact solution are proved. The numerical results show that the method is efficient and stable.
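As an illustration of the stabilization step, the sketch below applies standard Tikhonov regularization to a generic ill-conditioned linear system; the operator A, data b, and regularization matrix L are hypothetical stand-ins for the discretized problem, not the article's actual FDM scheme.

```python
import numpy as np

def tikhonov_solve(A, b, L, lam):
    """Solve min ||A f - b||^2 + lam^2 ||L f||^2 via the normal equations.
    A : discretized forward operator   b : noisy data
    L : regularization matrix (identity or a difference operator)"""
    return np.linalg.solve(A.T @ A + lam**2 * (L.T @ L), A.T @ b)

# Hypothetical ill-posed test: a Gaussian blur operator on [0, 1].
n = 100
x = np.linspace(0, 1, n)
A = np.exp(-(x[:, None] - x[None, :])**2 / (2 * 0.05**2))
A /= A.sum(axis=1, keepdims=True)
f_true = np.sin(np.pi * x)
b = A @ f_true + 1e-3 * np.random.default_rng(0).standard_normal(n)
f_reg = tikhonov_solve(A, b, np.eye(n), lam=1e-2)   # stabilized recovery
```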
A seemingly unrelated regression (SUR) model is a special case of multivariate models in which the error terms of the equations are contemporaneously correlated. The generalized least squares (GLS) estimator is efficient because it takes into account the covariance structure of the errors, but it is also very sensitive to outliers. Robust SUR estimators can deal with outliers. We propose two robust methods for calculating the estimator: S-estimation and FastSUR. We find that they significantly improve the quality of the SUR model estimates. In addition, the results give the FastSUR method superiority over the S method in dealing with outliers contained in the data set, as it has lower MSE and RMSE and higher R-squared and adjusted R-squared values.
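For reference, the classical (non-robust) feasible GLS estimator for a SUR system can be sketched as follows; the robust S-estimation and FastSUR variants differ mainly in replacing the OLS residual covariance below with a high-breakdown estimate.

```python
import numpy as np

def sur_fgls(Xs, ys):
    """Feasible GLS for a SUR system (classical, non-robust baseline).
    Xs : list of design matrices, one per equation, each with n rows.
    ys : list of response vectors of length n."""
    n, m = Xs[0].shape[0], len(Xs)
    # Stage 1: equation-by-equation OLS residuals.
    resid = np.column_stack([y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
                             for X, y in zip(Xs, ys)])
    Sigma = resid.T @ resid / n            # contemporaneous error covariance
    # Stage 2: GLS on the stacked system, weight matrix kron(Sigma^-1, I_n).
    X_big = np.zeros((m * n, sum(X.shape[1] for X in Xs)))
    col = 0
    for i, X in enumerate(Xs):
        X_big[i * n:(i + 1) * n, col:col + X.shape[1]] = X
        col += X.shape[1]
    y_big = np.concatenate(ys)
    W = np.kron(np.linalg.inv(Sigma), np.eye(n))
    beta = np.linalg.solve(X_big.T @ W @ X_big, X_big.T @ W @ y_big)
    return beta, Sigma
```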
Segmentation of real-world images is considered one of the most challenging tasks in the computer vision field due to several issues associated with this kind of image, such as high interference between object foreground and background, complicated objects, and pixel intensities of the object and background being almost similar in some cases. This research introduces a modified adaptive segmentation process with image contrast stretching, namely Gamma Stretching, to improve the segmentation problem. The iterative segmentation process based on the proposed criteria gives the segmentation process the flexibility to find a suitable region of interest. In addition, the use of Gamma stretching will help in separating the
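Gamma stretching itself is straightforward; a minimal sketch follows (the gamma value is illustrative, not one from the paper).

```python
import numpy as np

def gamma_stretch(img, gamma=0.5):
    """Gamma contrast stretching: normalize to [0, 1], apply the power
    law, and rescale to 8-bit range. gamma < 1 brightens dark regions,
    gamma > 1 darkens them, widening the object/background gap."""
    img = img.astype(np.float64)
    lo, hi = img.min(), img.max()
    norm = (img - lo) / (hi - lo + 1e-12)
    return (norm ** gamma * 255.0).astype(np.uint8)
```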
The theoretical analysis depends on the Classical Laminated Plate Theory (CLPT), which is based on the von Kármán theory and the Kirchhoff hypothesis for deflection analysis within the elastic limit, together with Hooke's law for calculating the stresses. A new boundary-condition function, drawn from various sources in advanced engineering mathematics, is used to solve the fourth-order differential equations. The behavior of symmetric and antisymmetric cross-ply composite laminated plates under out-of-plane loads (uniformly distributed loads) with two different boundary conditions is investigated to obtain the central deflection of the mid-plane using the Ritz method. The computer programs were built using Ma
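For the simply supported case, the Ritz expansion in double sine functions reproduces the classical Navier series, which gives the central deflection in closed form; the sketch below evaluates that series for a specially orthotropic plate under uniform load, with the bending stiffnesses D11, D12, D22, D66 supplied by the caller as hypothetical inputs.

```python
import numpy as np

def central_deflection(a, b, q0, D11, D12, D22, D66, terms=25):
    """Central deflection of a simply supported, specially orthotropic
    plate under uniform load q0, from the double-sine (Navier) series:
      w_mn = 16 q0 / (pi^6 m n [D11 (m/a)^4
             + 2 (D12 + 2 D66) (m/a)^2 (n/b)^2 + D22 (n/b)^4]),
    summed over odd m, n and evaluated at the plate center (a/2, b/2)."""
    w = 0.0
    for m in range(1, 2 * terms, 2):            # odd terms only
        for n in range(1, 2 * terms, 2):
            Dmn = (D11 * (m / a)**4
                   + 2.0 * (D12 + 2.0 * D66) * (m / a)**2 * (n / b)**2
                   + D22 * (n / b)**4)
            w += (16.0 * q0 / (np.pi**6 * m * n * Dmn)
                  * np.sin(m * np.pi / 2) * np.sin(n * np.pi / 2))
    return w
```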
The majority of systems dealing with natural language processing (NLP) and artificial intelligence (AI) can assist in making automated and automatically-supported decisions. However, these systems may face challenges and difficulties in identifying the required information (characterization) for eliciting a decision by extracting or summarizing relevant information from large text documents or colossal content. When these documents are obtained online, for instance from social networking or social media, such sites undergo a remarkable increase in textual content. The main objective of the present study is to conduct a survey and show the latest developments in the implementation of text-mining techniques.
This paper examines a new nonlinear system of multiple integro-differential equations containing symmetric matrices with impulsive actions. The numerical-analytic method for ordinary differential equations and the Banach fixed-point theorem are used to study the existence, uniqueness, and stability of periodic solutions of impulsive integro-differential equations with piecewise continuous functions. This study is based on a Hölder condition whose exponents are real numbers between 0 and 1.
In this article, a new efficient approach is presented to solve a class of partial differential equations, namely nonlinear, nonhomogeneous (2+1)-dimensional differential equations. The procedure of the new approach is suggested for solving important types of differential equations and obtaining accurate analytic solutions, i.e., exact solutions. The effectiveness of the suggested approach, based on its properties, is compared with other approaches that have been used to solve this type of differential equation, such as the Adomian decomposition method, the homotopy perturbation method, the homotopy analysis method, and the variational iteration method. The advantage of the present method is illustrated by some examples.
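Of the comparison methods named, the Adomian decomposition method is the easiest to sketch; below it is applied to the linear test problem u' = u, u(0) = 1, whose exact solution is e^t (a minimal illustration, not one of the paper's examples).

```python
import sympy as sp

# Minimal Adomian decomposition sketch: expand u = sum u_n with
# u_0 taken from the initial condition and u_{n+1} = integral of u_n,
# which rebuilds the Taylor series of the exact solution e^t.
t = sp.symbols('t')
u = [sp.Integer(1)]                     # u_0 from u(0) = 1
for n in range(6):
    u.append(sp.integrate(u[-1], (t, 0, t)))
approx = sum(u)
print(sp.expand(approx))                # 1 + t + t**2/2 + t**3/6 + ...
```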
Abstract
Due to the lack of previous statistical studies of the behavior of payments, specifically health insurance payments, which represent the largest proportion of payments in the general insurance companies in Iraq, this study was selected and applied at the Iraqi Insurance Company.
In order to find a suitable model representing the health insurance payments, we initially identified two probability models using the Easy Fit software:
first, a single Lognormal for the whole sample, and second, a Compound Weibull for the two subsamples (small payments and large payments); we focused on the compound Weibull model.
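A minimal sketch of the two candidate fits, using synthetic stand-in data and a hypothetical 90th-percentile split between small and large payments (in practice the split point would itself be estimated), might look like:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
payments = rng.lognormal(mean=6.0, sigma=1.2, size=2000)   # synthetic stand-in

# Candidate 1: a single lognormal for the whole sample.
shape, loc, scale = stats.lognorm.fit(payments, floc=0)

# Candidate 2: a compound (composite) Weibull -- separate Weibull fits
# for small and large payments split at a hypothetical threshold.
threshold = np.quantile(payments, 0.90)
small = payments[payments <= threshold]
large = payments[payments > threshold]
w_small = stats.weibull_min.fit(small, floc=0)
w_large = stats.weibull_min.fit(large - threshold, floc=0)

# In-sample fit of the single lognormal, for comparison across models.
ll_lognorm = stats.lognorm.logpdf(payments, shape, loc, scale).sum()
print(f"lognormal log-likelihood: {ll_lognorm:.1f}")
```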