Combining different treatment strategies, successively or simultaneously, has become a recommended route to achieving high purification standards for treated discharge water. The current work focused on combining electrocoagulation, ion exchange, and ultrasonication for the simultaneous removal of copper, nickel, and zinc ions from water. The removal of the three studied ions was significantly enhanced by increasing the current density (4–10 mA/cm²) and the NaCl concentration (0.5–1.5 g/L) at the natural solution pH. The simultaneous removal of these metal ions at 4 mA/cm² and 1 g NaCl/L was further improved by introducing 1 g/L of mordenite zeolite as an ion exchanger. A remarkable removal of heavy metals was recorded: the initial concentration of each metal decreased from approximately 50 ppm to 1.19 ppm for nickel, 3.06 ppm for zinc, and less than 1 ppm for copper. In contrast, ultrasonication did not improve the treatment process. The extended Langmuir isotherm model described the experimental data convincingly, while the Temkin and Dubinin-Radushkevich isotherm models indicated that the removal processes were physical and exothermic. Finally, the pseudo-second-order kinetic model appropriately described the kinetics of the process, with correlation coefficients of 0.9337 and 0.9016.
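For reference, a minimal sketch of the models named above in their conventional textbook forms (the symbols are the standard ones, not values taken from this study):

    q_{e,i} = \frac{q_{m,i} K_{L,i} C_{e,i}}{1 + \sum_{j} K_{L,j} C_{e,j}}    % extended Langmuir (multicomponent)

    \frac{t}{q_t} = \frac{1}{k_2 q_e^2} + \frac{t}{q_e}                        % pseudo-second-order, linearized

Here q_{e,i} is the equilibrium uptake of species i, q_{m,i} and K_{L,i} are its Langmuir capacity and constant, C_{e,j} are the equilibrium concentrations of the competing ions, and k_2 is the rate constant; plotting t/q_t against t gives the straight line whose fit yields the reported correlation coefficients.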
… Segmented Optical Telescope (NGST) with hexagonal segments of a spherical primary mirror can provide a 3 arc-minute field of view. Extremely Large Telescopes (ELTs) in the 100 m class would have such unprecedented scientific effectiveness that their construction would constitute a milestone comparable to the invention of the telescope itself, and would provide a truly revolutionary insight into the universe. Our interest was the scientific case and the conceptual feasibility of giant filled-aperture telescopes, and what their requirements imply for possible technical options in the case of a 100 m telescope. For this telescope, of considerable interest is the correction of the optical aberrations in the incoming wavefront, …
Cutting forces are important factors in determining machine serviceability and product quality. Factors such as cutting speed, feed, depth of cut, and tool nose radius affect surface roughness and cutting forces in turning operations. An artificial neural network (ANN) model was used to predict cutting forces from inputs including cutting speed (m/min), feed rate (mm/rev), depth of cut (mm), and workpiece hardness. The outputs of the ANN model are the cutting force components: the cutting force FT (N), feed force FA (N), and radial force FR (N); the network's predictions for all three components were in close agreement with the experimental data. Twenty-five samples …
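As a rough sketch of the kind of model the abstract describes (not the authors' network: the architecture, training settings, and the synthetic data below are assumptions), a feed-forward regressor in Python mapping the four machining inputs to the three force components could be set up as follows:

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    # columns: cutting speed (m/min), feed rate (mm/rev), depth of cut (mm), hardness
    X = rng.uniform([60, 0.05, 0.5, 150], [180, 0.30, 2.0, 300], size=(25, 4))
    # synthetic targets FT, FA, FR (N), loosely rising with feed and depth of cut
    y = X[:, [1]] * X[:, [2]] * np.array([[900.0, 400.0, 250.0]]) \
        + rng.normal(0, 5, (25, 3))

    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000,
                                       random_state=0))
    model.fit(X, y)                 # train on the 25 samples
    print(model.predict(X[:3]))     # predicted [FT, FA, FR] for the first three

Scaling the inputs before the network matters here, since the four inputs span very different numeric ranges.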
Astronomical images are regarded as a main source of information for exploring outer space. To identify the basic content of a galaxy (the Milky Way) image, it was classified using the Variable Precision Rough Sets technique to determine the different regions within the galaxy according to the different colors in the image. From the classified image, the percentage of each class can be determined, along with what each percentage means. The technique produced a well-classified image and required less time to complete the classification process.
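A minimal sketch of the core Variable Precision Rough Sets step (the β threshold, the toy pixel data, and the grouping key below are assumptions, not the paper's implementation): an equivalence class is assigned to a target region only when the fraction of its members inside the target reaches the precision threshold β.

    from collections import defaultdict

    def vprs_lower_approximation(items, key, target, beta=0.8):
        """Items in equivalence classes (induced by `key`) whose overlap
        with `target` is at least beta: the VPRS beta-lower approximation."""
        classes = defaultdict(list)
        for it in items:
            classes[key(it)].append(it)
        approved = []
        for members in classes.values():
            inside = sum(1 for m in members if m in target)
            if inside / len(members) >= beta:
                approved.extend(members)
        return approved

    # toy example: pixels tagged with a quantized color; target = bright-region pixels
    pixels = [(10, "A"), (11, "A"), (12, "B"), (13, "B"), (14, "B")]
    target = {(10, "A"), (11, "A"), (12, "B")}
    print(vprs_lower_approximation(pixels, key=lambda p: p[1],
                                   target=target, beta=0.6))
    # -> only the "A" class passes the 0.6 precision threshold

Lowering β relaxes the classical rough-set lower approximation, which is what lets the method tolerate noisy pixel data.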
Background: The treatment of dental tissues prior to adhesive procedures is a crucial step in the bonding protocol and determines the clinical success of restorations. This in vitro study aimed to evaluate the nanoleakage at the interface between the adhesive system and dentine treated by five surface modalities, using scanning electron microscopy and energy-dispersive X-ray spectrometry. Materials and methods: Twenty-five extracted premolar teeth were selected for the study. Standardized class V cavities were prepared on the buccal and lingual surfaces, and the teeth were then divided into five main groups (5 teeth per group, n = 10 cavities) according to the type of dentine surface treatment used. Group (A): dentine was …
Phenomena often suffer from disturbances in their data as well as difficulty of formulation, especially with a lack of clarity in the response, or the large number of essential differences among the experimental units from which the data were taken. Thus emerged the need for an estimation method with an implicit rating of these experimental units, using discrimination or creating blocks for each item of these experimental units, in the hope of controlling their responses and making them more homogeneous. Owing to developments in the field of computing, and taking the principle of the integration of sciences, modern algorithms from computer science, such as the genetic algorithm or ant colony …
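The truncated passage above points toward evolutionary methods for grouping experimental units into homogeneous blocks. As a loose illustration of that idea only (the fitness definition, operators, and data here are invented for the sketch, not taken from the paper), a genetic algorithm that assigns units to blocks so as to minimize within-block variance might look like:

    import random
    import statistics

    def within_block_variance(assignment, responses, n_blocks):
        # sum of within-block variances; lower means more homogeneous blocks
        total = 0.0
        for b in range(n_blocks):
            vals = [r for r, a in zip(responses, assignment) if a == b]
            if len(vals) > 1:
                total += statistics.pvariance(vals)
        return total

    def ga_blocking(responses, n_blocks=3, pop_size=30, generations=300, seed=1):
        rng = random.Random(seed)
        n = len(responses)
        pop = [[rng.randrange(n_blocks) for _ in range(n)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=lambda a: within_block_variance(a, responses, n_blocks))
            survivors = pop[: pop_size // 2]          # keep the fitter half
            children = []
            while len(survivors) + len(children) < pop_size:
                p1, p2 = rng.sample(survivors, 2)
                cut = rng.randrange(1, n)
                child = p1[:cut] + p2[cut:]           # one-point crossover
                child[rng.randrange(n)] = rng.randrange(n_blocks)  # point mutation
                children.append(child)
            pop = survivors + children
        return min(pop, key=lambda a: within_block_variance(a, responses, n_blocks))

    responses = [5.1, 5.0, 9.8, 10.1, 5.2, 9.9, 15.0, 14.8, 15.3]
    print(ga_blocking(responses))   # block labels grouping similar responses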
The purpose of this paper is to discriminate between the poems of each poet depending on the characteristics and attributes of the Arabic letters. Four categories were used for the Arabic letters; the letter frequencies were arranged in a multidimensional contingency table, in which each dimension has two or more levels, and the contingency coefficient was then calculated.
The paper's sample consists of six poets from different historical ages, with five poems per poet. The method was programmed in MATLAB; the efficiency of the proposed method is 53% for the whole sample, and between 90% and 95% for each poet's poems.
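For reference, the contingency coefficient computed from such a table is the standard Pearson statistic C = sqrt(χ² / (χ² + n)). A small illustration (the table values are invented, and Python stands in here for the MATLAB implementation the paper used):

    import numpy as np
    from scipy.stats import chi2_contingency

    # hypothetical letter-frequency table: rows = letter categories, cols = poems
    table = np.array([[30, 22, 18],
                      [14, 25, 20],
                      [ 9, 11, 27]])

    chi2, p, dof, _ = chi2_contingency(table)
    n = table.sum()
    C = np.sqrt(chi2 / (chi2 + n))   # Pearson's contingency coefficient
    print(f"chi2={chi2:.2f}, C={C:.3f}")

A larger C indicates a stronger association between letter-category frequencies and poems, which is what supports attributing a poem to its poet.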