In this research, two algorithms are applied: the Fuzzy C-Means (FCM) algorithm and the hard K-Means (HKM) algorithm, in order to determine which of them performs better. Both algorithms are applied to a set of data collected from the Ministry of Planning on the water turbidity of five areas in Baghdad, to identify which of these areas has the least turbid (clearest) water and which months of the year show the least turbidity in the specified area.
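The two clustering approaches above can be sketched as follows. This is a minimal illustration, assuming Euclidean distance and a fuzzifier m = 2 for FCM; the one-dimensional "turbidity" values are made up for demonstration and are not the Ministry of Planning data used in the study.

```python
import numpy as np

def fcm(X, c, m=2.0, iters=100, seed=0):
    """Fuzzy C-Means: returns cluster centers and membership matrix U."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)              # each row sums to 1
    p = 2.0 / (m - 1.0)
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None] - centers[None], axis=2) + 1e-12
        # u_ik proportional to d_ik^(-2/(m-1)), normalized over clusters
        U = (d ** -p) / (d ** -p).sum(axis=1, keepdims=True)
    return centers, U

def hkm(X, k, iters=100, seed=0):
    """Hard K-Means (Lloyd's algorithm): returns centers and hard labels."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(np.linalg.norm(X[:, None] - centers[None], axis=2), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return centers, labels

# Toy turbidity readings (hypothetical), two well-separated groups
X = np.array([[1.0], [1.2], [0.9], [5.0], [5.3], [4.8]])
c_f, U = fcm(X, 2)
c_h, labels = hkm(X, 2)
```

The key difference the study exploits is visible here: HKM assigns each observation to exactly one cluster, while FCM returns a degree of membership in every cluster.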
The Falluja residents resorted to groundwater as an alternative to the surface water of the Euphrates river passing near the city, digging wells inside the gardens of the city's mosques during spring 2005. The present study aims to assess the quality of these waters and demonstrate the extent of their suitability for drinking. For this purpose, 21 randomly distributed wells were chosen during August 2005. The water characteristics were measured; the average values over the 21 wells were as follows: water temperature (22.6 °C), EC (4.11 mS/cm), pH (7.15), and cation concentrations: Na (439 mg/L), K (275 mg/L), Li (0.28 mg/L), Ba (15.2 mg/L), and (133 mg/L). These characteristics were compared with the
This study examined the effects of water scarcity on the rural household economy in El Fashir Rural Council, North Darfur State, western Sudan. Both quantitative and qualitative methods were used to gain a deeper understanding of the impact of water scarcity on the rural household economy in the study area. 174 households out of 2017 were selected from 45 villages distributed across the eight village councils forming the study area. Statistical methods were used to process the data of the study. The obtained results revealed that water scarcity negatively affected the rural household economy in the study area in many respects. These include the following: much family effort and time were directed to fetching water; consequently
Formation of emulsions during oil production is a costly problem, and decreasing the water content of emulsions increases productivity and reduces the potential for corrosion of pipelines and equipment. Chemical demulsification of crude oil emulsions is one of the methods used for reducing water content. The presence of a demulsifier destabilizes the film layer between the water droplets and the crude oil emulsion, accelerating water coalescence. This research was performed to study the performance of the commercial chemical demulsifier Chimec2439, a blend of non-ionic oil-soluble surfactants. The crude oils used in these experiments were the Basrah and Kirkuk Iraqi crude oils. These
In this study, we fabricated nanofiltration membranes using the electrospinning technique, employing pure PAN and a mixed matrix of PAN/HPMC. PAN nanofibrous membranes at a concentration of 13 wt% were prepared and blended with different concentrations of HPMC in the solvent N,N-dimethylformamide (DMF). We conducted a comprehensive analysis of the membranes' surface morphology, chemical composition, wettability, and porosity, and compared the results. The findings indicated that the inclusion of HPMC in the PAN membranes led to a reduction in surface porosity and fiber size. The contact angle decreased, indicating increased surface hydrophilicity, which can enhance flux and reduce fouling tendencies. Subsequently, we evaluated the e
Electrocoagulation is an electrochemical method for treating different types of wastewater, whereby sacrificial anodes corrode to release an active coagulant (usually aluminium or iron cations) into solution, while the simultaneous evolution of hydrogen at the cathode allows pollutant removal by flotation or settling. The Taguchi method was applied as an experimental design to determine the best conditions for chromium (VI) removal from wastewater. Various parameters were investigated in a batch stirred tank with iron electrodes: pH, initial chromium concentration, current density, distance between electrodes, and KCl concentration; the results were analyzed using the signal-to-noise (S/N) ratio. It was found that the r
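The S/N analysis step in a Taguchi design can be sketched as follows. Since removal efficiency should be maximized, the "larger-the-better" S/N ratio applies; the pH levels and efficiency values below are hypothetical, not the measured Cr(VI) results of the study.

```python
import numpy as np

def sn_larger_is_better(y):
    """Larger-the-better Taguchi S/N ratio: -10 * log10(mean(1 / y^2))."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

# Hypothetical replicate removal efficiencies (%) at two factor levels
runs = {"pH 4": [92.0, 90.5], "pH 7": [75.0, 73.2]}
sn = {level: sn_larger_is_better(vals) for level, vals in runs.items()}
# The factor level with the higher S/N ratio is preferred.
```

Computing the mean S/N ratio per level of each factor is what lets the Taguchi method rank factors and pick the best operating condition from a fractional set of runs.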
Longitudinal data is becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.
In this research, the focus was on grouping and analyzing these data: cluster analysis plays an important role in identifying and grouping co-expressed profiles over time, which are then fitted with a nonparametric smoothing cubic B-spline model. This model provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; it is also flexible enough to capture more complex patterns and fluctuations in the data.
The longitudinal balanced data profile was compiled into subgroup
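A cubic smoothing-spline fit of the kind described above can be sketched with SciPy's `UnivariateSpline` (degree k = 3 gives continuous first and second derivatives); the longitudinal profile below is synthetic, standing in for one balanced subgroup profile.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(1)
t = np.linspace(0, 12, 25)                         # e.g. months of follow-up
y = np.sin(t / 2.0) + 0.1 * rng.standard_normal(t.size)

spline = UnivariateSpline(t, y, k=3, s=0.5)        # s bounds the residual sum of squares
fitted = spline(t)
slope = spline.derivative(1)(t)                    # continuous first derivative
curvature = spline.derivative(2)(t)                # continuous second derivative
```

Raising `s` yields a smoother curve with fewer knots; `s = 0` would interpolate the noisy observations exactly, which is why a positive smoothing factor is used for longitudinal profiles.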
Green nanotechnology is an exciting and emerging area of science and technology that embraces the principles of green chemistry, with potential benefits for sustainability, safety, and the overall protection of the human race. The green chemistry approach introduces a sound methodology for the production, processing, and application of less hazardous chemical substances to reduce threats to human health and the environment. The approach calls for in-depth knowledge of the raw materials, particularly in terms of their conversion into nanomaterials and the resulting bioactivities that pose very few harmful effects to people and the environment. In the twenty-first century, nanotec
Canonical correlation analysis is one of the common methods for analyzing data and determining the relationship between two sets of variables under study, as it depends on analyzing the variance matrix or the correlation matrix. Researchers resort to many methods to estimate the canonical correlation (CC); some are sensitive to outliers, while others are resistant to them. In addition, there are criteria that assess the efficiency of the estimation methods.
In our research, we dealt with robust estimation methods that depend on the correlation matrix in the analysis process to obtain a robust canonical correlation coefficient, which is the method of Biwe
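The correlation-matrix route to the canonical correlation can be sketched as follows. This is the classical (non-robust) computation with simulated data; a robust variant of the kind the study discusses would replace `np.corrcoef` with a robust correlation estimate (e.g. a biweight-based one), which is an assumption on my part, not the study's exact procedure.

```python
import numpy as np

def first_canonical_corr(X, Y):
    """Largest canonical correlation between column sets X and Y."""
    p = X.shape[1]
    R = np.corrcoef(np.hstack([X, Y]), rowvar=False)
    Rxx, Ryy = R[:p, :p], R[p:, p:]
    Rxy = R[:p, p:]
    # Whiten each block (R^{-1/2} via eigendecomposition), then the
    # canonical correlations are the singular values of the whitened Rxy.
    ex, Vx = np.linalg.eigh(Rxx)
    ey, Vy = np.linalg.eigh(Ryy)
    Rxx_ih = Vx @ np.diag(ex ** -0.5) @ Vx.T
    Ryy_ih = Vy @ np.diag(ey ** -0.5) @ Vy.T
    s = np.linalg.svd(Rxx_ih @ Rxy @ Ryy_ih, compute_uv=False)
    return s[0]

rng = np.random.default_rng(0)
z = rng.standard_normal(200)                      # shared latent signal
X = np.column_stack([z + 0.3 * rng.standard_normal(200), rng.standard_normal(200)])
Y = np.column_stack([z + 0.3 * rng.standard_normal(200), rng.standard_normal(200)])
cc = first_canonical_corr(X, Y)                   # high, since X and Y share z
```

Swapping the correlation estimator is the only change needed to make this pipeline robust, which is why methods that work through the correlation matrix are convenient for robust CCA.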
In this study, we investigate improving the estimation of the third-order autoregressive model using the Levinson-Durbin Recursion (LDR) and Weighted Least Squares Error (WLSE). We generate time series from the AR(3) model when the error term is normally and non-normally distributed, and when the error term follows an ARCH(q) model with order q = 1, 2. We used different sample sizes, and the results were obtained by simulation. In general, we concluded that for both estimation methods (LDR and WLSE), the estimation of the autoregressive model improves as the sample size increases, for all distributions considered for the error term except the lognormal distribution. We also see that the estimation improve
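The Levinson-Durbin recursion used above can be sketched as follows: it solves the Yule-Walker equations for the AR coefficients order by order from sample autocorrelations. The AR(3) coefficients and sample size below are illustrative, not the simulation settings of the study.

```python
import numpy as np

def levinson_durbin(r, order):
    """Return AR coefficients a[1..order] and the innovation variance,
    solving the Yule-Walker equations from autocovariances r[0..order]."""
    a = np.zeros(order + 1)
    e = r[0]
    for k in range(1, order + 1):
        acc = r[k] - np.dot(a[1:k], r[k - 1:0:-1])
        kappa = acc / e                            # reflection coefficient
        a_new = a.copy()
        a_new[k] = kappa
        a_new[1:k] = a[1:k] - kappa * a[k - 1:0:-1]
        a = a_new
        e *= (1.0 - kappa ** 2)                    # prediction-error variance
    return a[1:], e

# Simulate an AR(3) series x_t = 0.4 x_{t-1} - 0.2 x_{t-2} + 0.1 x_{t-3} + e_t
rng = np.random.default_rng(2)
true_a = [0.4, -0.2, 0.1]
n = 20000
x = np.zeros(n)
for t in range(3, n):
    x[t] = true_a[0] * x[t - 1] + true_a[1] * x[t - 2] + true_a[2] * x[t - 3] \
           + rng.standard_normal()
r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(4)])  # autocovariances
a_hat, noise_var = levinson_durbin(r, 3)
```

With a large sample the recovered coefficients approach the true values, which mirrors the study's finding that estimation improves with increasing sample size.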