Combining different treatment strategies, successively or simultaneously, is increasingly recommended to achieve high purification standards for treated discharge water. The current work focused on combining electrocoagulation, ion exchange, and ultrasonication for the simultaneous removal of copper, nickel, and zinc ions from water. The removal of the three studied ions was significantly enhanced by increasing the current density (4–10 mA/cm²) and the NaCl concentration (0.5–1.5 g/L) at natural solution pH. The simultaneous removal of these metal ions at 4 mA/cm² and 1 g NaCl/L was greatly improved by introducing 1 g/L of mordenite zeolite as an ion exchanger. A remarkable removal of heavy metals was recorded: the initial concentration of each metal decreased from approximately 50 ppm to 1.19 ppm for nickel, 3.06 ppm for zinc, and less than 1 ppm for copper. In contrast, ultrasonication did not improve the treatment process. The extended Langmuir isotherm model described the experimental data convincingly, while the Temkin and Dubinin-Radushkevich isotherm models indicated that the removal processes were physical and exothermic. Finally, the pseudo-second-order kinetics model appropriately described the kinetics of the process, with correlation coefficients of 0.9337 and 0.9016.
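The pseudo-second-order kinetics model named above is usually fitted through its linearized form, t/qₜ = 1/(k₂·qₑ²) + t/qₑ. A minimal sketch on hypothetical uptake data (the times, uptakes, and parameter values below are invented for illustration, not measurements from the study):

```python
import numpy as np

# Hypothetical contact times t (min) and metal uptake qt (mg/g);
# qe_true and k2_true are assumed values, not results from the study.
qe_true, k2_true = 12.0, 0.01
t = np.array([5.0, 10, 20, 40, 60, 90, 120])
qt = (k2_true * qe_true**2 * t) / (1 + k2_true * qe_true * t)

# Linearized pseudo-second-order form: t/qt = 1/(k2*qe^2) + t/qe,
# so a straight-line fit of t/qt against t yields qe and k2.
slope, intercept = np.polyfit(t, t / qt, 1)
qe_fit = 1 / slope                 # equilibrium uptake qe (mg/g)
k2_fit = slope**2 / intercept      # rate constant k2 (g/(mg·min))
```

Because the synthetic data here are noise-free, the fit recovers the assumed parameters exactly; with measured data, the correlation coefficient of this same straight-line fit is the figure of merit reported in the abstract.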
Accurate localization of the basic components of human faces (eyebrows, eyes, nose, mouth, etc.) in images is an important step in face-processing techniques such as face tracking, facial expression recognition, and face recognition. However, it is a challenging task due to variations in scale, orientation, pose, facial expression, partial occlusion, and lighting conditions. In the current paper, a scheme comprising three hierarchical stages for facial-component extraction is presented; it works regardless of illumination variance. Adaptive linear contrast-enhancement methods such as gamma correction and contrast stretching are used to simulate the variance in lighting conditions among images. As testing material
Water covers more than 75% of the earth's surface in the form of oceans. Investigating the ocean is difficult because the underwater environment exhibits distinctive physical phenomena. Expanding human activity in underwater environments includes environmental monitoring, offshore field exploration, tactical surveillance, scientific data collection, and port security, which has increased the demand for underwater communication systems. Researchers have therefore developed many methods for underwater visible light communication (VLC). The new blue-laser technology is a type of VLC with benefits for underwater communication applications. This research article investigated the benefits of underwater blu
The primary objective of this paper is to improve a biometric authentication and classification model using the ear as a distinctive part of the face, since it is unchanged over time and unaffected by facial expressions. The proposed model is a new scenario for enhancing ear-recognition accuracy by modifying the AdaBoost algorithm to optimize adaptive learning. To overcome the limitations of image illumination, occlusion, and image-registration problems, the scale-invariant feature transform (SIFT) technique was used to extract features. Several consecutive phases were used to improve classification accuracy: image acquisition, preprocessing, filtering, smoothing, and feature extraction. To assess the proposed
Texture synthesis using genetic algorithms is one approach, proposed in previous research, to synthesize texture quickly and easily. In genetic texture-synthesis algorithms, the chromosome consists of random blocks selected manually by the user. However, this method of selection depends heavily on the user's experience; a wrong selection of blocks will greatly affect the synthesized texture. In this paper, a new method is suggested for selecting the blocks automatically, without user participation. The results show that this method of selection eliminates some of the blending caused by the previous manual selection method.
The Hopfield network is one of the simplest neural network types; in its architecture, each neuron connects to every other neuron, so it is called a fully connected neural network. In addition, this type is considered an auto-associative memory, because the network returns a stored pattern immediately upon recognition. The network has many limitations, including memory capacity, discrepancy, orthogonality between patterns, weight symmetry, and local minima. This paper proposes a new strategy for designing a Hopfield network based on the XOR operation; a new algorithm in the Hopfield network design is suggested to solve these limitations. This strategy will increase the performance of the Hopfield network by modifying the architecture of t
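For context, the baseline the paper modifies can be sketched in a few lines: a standard Hopfield network with Hebbian outer-product storage and sign-threshold recall. This is a generic illustration of the classical design, not the XOR-based strategy the paper proposes.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian learning: sum of outer products, zero self-connections."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)          # no neuron connects to itself
    return W / patterns.shape[0]

def recall(W, state, max_steps=10):
    """Synchronous sign updates until the state stops changing."""
    for _ in range(max_steps):
        new = np.sign(W @ state)
        new[new == 0] = 1
        if np.array_equal(new, state):
            break
        state = new
    return state

# Store one bipolar pattern, corrupt one unit, and recover it.
pattern = np.array([1, -1, 1, -1, 1, -1])
W = train_hopfield(pattern[None, :])
noisy = pattern.copy()
noisy[0] *= -1
recovered = recall(W, noisy)
```

With a single stored pattern the corrupted unit is repaired in one update; the capacity and spurious-minima limitations listed above appear once many correlated patterns are stored in the same weight matrix.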
In this study, X-ray images of objects in the Flexible Image Transport System (FITS) format were analyzed. Energy was collected from the body by several sensors, each receiving energy within a specific range; when energy was collected from all sensors, an image was formed carrying information about that body. The images can be transferred and stored easily. The images were analyzed using the DS9 program to obtain a spectrum for each object, i.e., the energy corresponding to the photons collected per second. This study analyzed images of two types of objects (globular and open clusters). The results showed that the five open star clusters contain roughly t
Time series have gained great importance and have been widely applied in economic, financial, health, and social fields, where they are used to analyze changes and forecast the future of a phenomenon. One of the most important black-box models is the ARMAX model, a mixed model combining autoregression and moving averages with exogenous inputs. Modeling proceeds in several stages: determining the order of the model, estimating its parameters, and then forecasting, here to predict the amount of compensation to be granted to workers so that the Fund can meet its future obligations. Using the ordinary least squares method and the frequ
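The ARMAX structure described above, y_t = a·y_{t−1} + b·x_t + e_t + c·e_{t−1}, can be illustrated on simulated data. The coefficients below are invented for illustration, not estimates from the Fund's data, and the least-squares step deliberately ignores the moving-average term (full ARMAX estimation would model it too):

```python
import numpy as np

# Simulate an ARMAX(1,1) series with one exogenous input x.
# a, b, c are illustrative coefficients, not estimates from real data.
rng = np.random.default_rng(0)
a, b, c, n = 0.6, 1.5, 0.4, 200
x = rng.normal(size=n)                 # exogenous input
e = rng.normal(scale=0.1, size=n)      # white-noise shocks
y = np.zeros(n)
for k in range(1, n):
    y[k] = a * y[k - 1] + b * x[k] + e[k] + c * e[k - 1]

# Crude parameter recovery: regress y_t on (y_{t-1}, x_t) by least squares.
# Ignoring the MA term makes the AR estimate slightly biased.
Z = np.column_stack([y[:-1], x[1:]])
a_hat, b_hat = np.linalg.lstsq(Z, y[1:], rcond=None)[0]
```

With the small noise scale used here, the recovered a_hat and b_hat land close to the true 0.6 and 1.5; the residual bias from the omitted MA term is why dedicated ARMAX estimators are preferred in practice.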
In this paper, frequentist and Bayesian approaches to the linear regression model were used to predict future observations of unemployment rates in Iraq. Parameters are estimated using the ordinary least squares method for the frequentist approach and the Markov chain Monte Carlo (MCMC) method for the Bayesian approach. Calculations are done using the R program. The analysis showed that the linear regression model using the Bayesian approach is better and can be used as an alternative to the frequentist approach. Two criteria, the root mean square error (RMSE) and the median absolute deviation (MAD), were used to compare the performance of the estimates. The results obtained showed that the unemployment rates will continue to increase in the next two decade
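The paper's computations were done in R; as a language-neutral illustration, the frequentist side and the two comparison criteria can be sketched in Python on a synthetic trend series (the numbers below are invented, not the Iraqi unemployment data, and MAD is taken here as the median absolute prediction error):

```python
import numpy as np

# Synthetic linear-trend series standing in for an annual rate
# (illustrative values only, not the unemployment figures in the paper).
rng = np.random.default_rng(1)
t = np.arange(20, dtype=float)
y = 8.0 + 0.3 * t + rng.normal(scale=0.2, size=t.size)

# Frequentist fit: ordinary least squares on an intercept + trend design.
X = np.column_stack([np.ones_like(t), t])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ beta

# The two comparison criteria from the abstract.
resid = y - pred
rmse = np.sqrt(np.mean(resid**2))        # root mean square error
mad = np.median(np.abs(resid))           # median absolute deviation
```

A Bayesian treatment of the same model would replace the single beta with an MCMC posterior sample and compare RMSE/MAD of the posterior-mean predictions against these OLS figures.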