Electrocoagulation is an electrochemical method for treating various types of wastewater, in which sacrificial anodes corrode to release an active coagulant (usually aluminium or iron cations) into solution, while simultaneous evolution of hydrogen at the cathode allows pollutant removal by flotation or settling. The Taguchi method was applied as an experimental design to determine the best conditions for chromium (VI) removal from wastewater. Several parameters were investigated in a batch stirred tank with iron electrodes: pH, initial chromium concentration, current density, inter-electrode distance, and KCl concentration; the results were analyzed using the signal-to-noise (S/N) ratio. The removal efficiency of chromium increased with increasing current density and KCl concentration, decreased with increasing initial chromium concentration and inter-electrode distance, and peaked at an intermediate pH. Experimental work was performed on both synthetic solutions and real industrial effluent. The removal efficiency for the synthetic solution was higher than for the industrial wastewater: the maximum removal was 91.72 % for the prepared solution versus 73.54 % for the industrial wastewater under the same conditions.
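Since removal efficiency is a quantity to be maximized, the Taguchi analysis mentioned above would use the larger-is-better S/N ratio. A minimal sketch of that computation, with hypothetical repetition values for one trial (the actual trial data are not given in this excerpt):

```python
import math

def sn_larger_is_better(results):
    """Taguchi larger-is-better signal-to-noise ratio in dB:
    S/N = -10 * log10( mean(1 / y_i^2) )."""
    n = len(results)
    return -10.0 * math.log10(sum(1.0 / (y * y) for y in results) / n)

# Hypothetical removal efficiencies (%) from repeated runs of one trial
trial = [89.5, 91.72, 90.1]
sn = sn_larger_is_better(trial)
```

The factor level with the highest mean S/N across trials is taken as the optimum setting for each parameter.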
In the present work, theoretical relations are derived for evaluating the efficiency of third- and fourth-harmonic generation using a crystal-cascading configuration. These relations can be applied to a wide class of nonlinear optical materials. Calculations are made for a beta barium borate (BBO) crystal with a ruby laser at λ = 694.3 nm. The case study involves producing the third harmonic at λ = 231.4 nm of the fundamental beam. The efficiency formula involves many parameters that can be adjusted to enhance the efficiency. The results showed that the efficiency does not vary linearly with the crystal length, and that it increases as the input power increases. The walk-off length is calculated for
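The cascading scheme referred to above can be summarized as follows (a standard two-step picture; the paper's exact efficiency relations are not reproduced in this excerpt):

```latex
\omega + \omega \rightarrow 2\omega \quad (\text{SHG in crystal 1}), \qquad
\omega + 2\omega \rightarrow 3\omega \quad (\text{SFG in crystal 2}),
```

```latex
\eta_3 = \frac{P_{3\omega}}{P_{\omega}}, \qquad
\lambda_3 = \frac{\lambda_1}{3} = \frac{694.3\ \text{nm}}{3} \approx 231.4\ \text{nm},
```

which is consistent with the third-harmonic wavelength quoted for the ruby-laser fundamental.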
Semantic segmentation is a demanding task not only for computer vision but also in the earth sciences. Semantic segmentation decomposes compound architectures into single elements: the most common objects in civil outdoor or indoor scenes must be classified and then enriched with meaningful information about every object. It is a method for labeling and clustering point clouds automatically. Classifying three-dimensional natural scenes requires a point-cloud dataset as the input data format, and working with 3D data raises many challenges, such as the small number, resolution, and accuracy of available three-dimensional datasets. Deep learning now is the po
In the present work, pattern recognition is carried out using the contrast and relative variance of clouds. The K-means clustering process is then applied to classify the cloud type; texture analysis is also adopted to extract textural features and use them in the cloud-classification process. The test image used in the classification process is the Meteosat-7 image for the D3 region. The K-means method is adopted as an unsupervised classification. This method depends on the initially chosen cluster seeds; since the initial seeds are chosen randomly, the user supplies a set of means, or cluster centers, in the n-dimensional space. K-means clustering has been applied on two bands (the IR2 band and the water-vapour band). The textural analysis is used
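A minimal sketch of the K-means procedure described above, with user-supplied initial centers and two-band pixel vectors (the band values here are hypothetical, not the Meteosat-7 data):

```python
def kmeans(points, seeds, iters=20):
    """Plain K-means: assign each point to its nearest center,
    then recompute each center as the mean of its members."""
    centers = [list(s) for s in seeds]
    labels = [0] * len(points)
    for _ in range(iters):
        # assignment step: nearest center by squared Euclidean distance
        for i, p in enumerate(points):
            labels[i] = min(range(len(centers)),
                            key=lambda k: sum((a - b) ** 2
                                              for a, b in zip(p, centers[k])))
        # update step: move each center to the mean of its cluster
        for k in range(len(centers)):
            members = [p for p, l in zip(points, labels) if l == k]
            if members:
                centers[k] = [sum(c) / len(members) for c in zip(*members)]
    return labels, centers

# Hypothetical 2-band pixel vectors (e.g. IR2 and water-vapour brightness)
pixels = [(10, 12), (11, 11), (50, 52), (49, 51)]
labels, centers = kmeans(pixels, seeds=[(0, 0), (60, 60)])
```

Because the result depends on the initial seeds, different seed choices can yield different cluster boundaries, which is the sensitivity the abstract notes.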
In this paper, we present a proposed enhancement of image compression using the RLE algorithm. The proposal decreases the size of the compressed image, whereas the original method is used primarily for compressing binary images [1] and mostly increases the size of the original image when used for color images. The enhanced algorithm is tested on a sample of ten BMP 24-bit true-color images; an application was built in Visual Basic 6.0 to show the image size before and after compression and to compute the compression ratio for both RLE and the enhanced RLE algorithm.
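For reference, classic byte-oriented RLE stores (count, value) pairs; on 24-bit color data with few repeated bytes this can nearly double the size, which motivates the enhancement (the enhanced variant itself is not specified in this excerpt). A minimal sketch of the baseline scheme:

```python
def rle_encode(data: bytes) -> bytes:
    """Classic run-length encoding: (count, value) pairs, count capped at 255."""
    out = bytearray()
    i = 0
    while i < len(data):
        run = 1
        while i + run < len(data) and data[i + run] == data[i] and run < 255:
            run += 1
        out += bytes([run, data[i]])
        i += run
    return bytes(out)

def rle_decode(data: bytes) -> bytes:
    """Inverse transform: expand each (count, value) pair."""
    out = bytearray()
    for i in range(0, len(data), 2):
        out += bytes([data[i + 1]]) * data[i]
    return bytes(out)

sample = b"\x00" * 6 + b"\xff" * 3     # long runs compress well
packed = rle_encode(sample)            # 9 bytes -> 4 bytes
assert rle_decode(packed) == sample
```

On data with no runs (typical of true-color photographs), each input byte becomes a two-byte pair, illustrating the expansion problem the paper addresses.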
Many academics have concentrated on applying machine learning to retrieve information from databases to enable researchers to perform better. A difficult issue in prediction models is the selection of practical strategies that yield satisfactory forecast accuracy. Traditional software testing techniques have been extended to testing machine learning systems; however, they are insufficient for the latter because of the diversity of problems that machine learning systems create. Hence, the proposed methodologies were used to predict flight prices. A variety of artificial intelligence algorithms are used to attain the required accuracy, including Stochastic Gradient Descent (SGD), Adaptive Boosting (ADA), Decision Tre
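As a minimal sketch of one of the listed techniques, the following fits a one-feature linear price model with stochastic gradient descent on squared error (the feature, data, and learning rate here are hypothetical, not the paper's dataset or pipeline):

```python
import random

def sgd_fit(xs, ys, lr=0.01, epochs=200, seed=0):
    """Fit y ~ w*x + b by stochastic gradient descent:
    one (shuffled) sample per update, squared-error loss."""
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    idx = list(range(len(xs)))
    for _ in range(epochs):
        rng.shuffle(idx)
        for i in idx:
            err = (w * xs[i] + b) - ys[i]   # prediction error for sample i
            w -= lr * err * xs[i]
            b -= lr * err
    return w, b

# Hypothetical data: ticket price grows with distance flown (arbitrary units)
dist = [1.0, 2.0, 3.0, 4.0]
price = [110.0, 205.0, 310.0, 395.0]
w, b = sgd_fit(dist, price, lr=0.02, epochs=500)
```

In practice each boosting or tree-based model from the list would be trained and compared on the same held-out split to select the strategy with the best forecast accuracy.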
With the wide use of private-information exchange in various communication applications, securing it has become a top priority. In this research, a new approach to encrypting text messages based on genetic-algorithm operators is proposed. The proposed approach follows a new algorithm that generates an 8-bit chromosome to encrypt the plain text after randomly selecting a crossover point. The resulting child code is flipped by one bit using the mutation operation. Two simulations were conducted to evaluate the performance of the proposed approach, covering encryption/decryption execution time and throughput. Simulation results demonstrate the robustness of the proposed approach, producing better performance for all evaluation metrics with res
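A minimal illustration of the two genetic operators the abstract names, single-point crossover and one-bit mutation, on 8-bit chromosomes (the complete encryption scheme is not specified in this excerpt, so only the operators are sketched):

```python
def crossover(a: int, b: int, point: int) -> int:
    """Single-point crossover of two 8-bit chromosomes: the child keeps
    the high bits of `a` above `point` and the low `point` bits of `b`."""
    mask = (1 << point) - 1              # low `point` bits set
    return (a & ~mask & 0xFF) | (b & mask)

def mutate(c: int, bit: int) -> int:
    """One-bit mutation: flip the given bit of the chromosome."""
    return c ^ (1 << bit)

child = crossover(0b11110000, 0b00001111, 4)   # high nibble of a, low of b
mutant = mutate(child, 0)                      # flip the least-significant bit
```

Note that mutation is its own inverse (flipping the same bit twice restores the chromosome), which is what makes the mutation step reversible during decryption.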
Orthogonal polynomials and their moments play a significant role in image processing and computer vision. One such family is the discrete Hahn polynomials (DHaPs), which are used for compression and feature extraction. However, when the moment order becomes high, they suffer from numerical instability. This paper proposes a fast approach for computing high-order DHaPs. The work takes advantage of multithreading for the calculation of Hahn polynomial coefficients: to exploit the available processing capabilities, independent calculations are divided among threads, and a distribution method is provided to achieve a more balanced processing burden among the threads. The proposed methods are tested for va
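The thread-distribution idea can be sketched as follows: independent per-order computations are split into near-equal contiguous chunks, one per thread. The per-order function here is a placeholder, not the actual Hahn recurrence:

```python
from concurrent.futures import ThreadPoolExecutor

def balanced_chunks(n_items, n_workers):
    """Split n_items indices into contiguous chunks whose sizes differ
    by at most one, giving each thread a near-equal share of work."""
    base, extra = divmod(n_items, n_workers)
    chunks, start = [], 0
    for w in range(n_workers):
        size = base + (1 if w < extra else 0)
        chunks.append(range(start, start + size))
        start += size
    return chunks

def poly_column(order, xs):
    # Placeholder per-order computation (stand-in for a Hahn recurrence)
    return [x ** order for x in xs]

xs = [0.1 * i for i in range(8)]
orders = range(12)
with ThreadPoolExecutor(max_workers=4) as pool:
    parts = pool.map(lambda ch: [poly_column(o, xs) for o in ch],
                     balanced_chunks(len(orders), 4))
table = [col for part in parts for col in part]   # reassemble in order
sequential = [poly_column(o, xs) for o in orders]
```

Because each chunk is independent, the parallel result reassembles to exactly the sequential one; the balanced split is what prevents one thread from becoming a bottleneck.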
Mode filtering is one of the most desirable techniques in optical-fiber communication systems, especially for multiple-input multiple-output (MIMO) coherent optical communications, which suffer mode-dependent losses in their channels. In this work, a special type of optical-fiber sensing head was used, based on Thorlabs DCF13 fiber, which has two numerical apertures (NAs): one for the core and first cladding region, and a second relating the first cladding to the second cladding. An etching process using 40 % hydrofluoric (HF) acid was performed on the DCF13 for varying times in minutes. Investigation of the correlation between the degree of etching and the re
Abstract
In this study, a modified organic-solvent (organosolv) method was applied to remove the high lignin content of date palm fronds (Al-Zahdi type) taken from Iraqi gardens. In modified organosolv, the lignocellulosic material is fractionated into its constituents (lignin, cellulose, and hemicellulose). In this process, an organic solvent-water mixture is brought into contact with the lignocellulosic biomass at high temperature in a stainless-steel reactor (digester). Therefore, most of the hemicellulose is removed from the biomass, while the solid residue (mainly cellulose) can be used in various industrial fields. Three variables were studied in this process: temperature, ratio of ethano