Indirect electrochemical oxidation of phenol and its derivatives was investigated using a MnO2 rotating cylinder electrode. The Taguchi experimental design method was employed to find the conditions that maximize the removal efficiency of phenol and the derivatives generated during the process. Two main parameters were investigated: current density (C.D.) and electrolysis time. Removal efficiency was taken as the response for phenol and other organics removal. An L16 orthogonal array, the signal-to-noise (S/N) ratio, and analysis of variance were used to test the effect of the designated process factors and their levels on removal performance. The results showed that current density has the stronger influence on organics removal, while electrolysis time has the weaker impact. Multiple regression was used to obtain an equation describing the process; the fitted equation has a coefficient of determination (R²) of 98.77%. The best operating conditions were identified: removal efficiency above 95% can be obtained at a C.D. of 96-100 mA/cm² and an electrolysis time of 3.2 to 5 h. The behavior of the chemical oxygen demand (COD) mineralization indicates a zero-order reaction, with the reaction rate controlled by the active chlorine reaction rather than by mass transfer of phenol toward the anode.
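As a brief illustration of the two calculations this abstract relies on, the sketch below computes the larger-the-better Taguchi signal-to-noise ratio, S/N = -10·log10((1/n)·Σ 1/yᵢ²), and fits a zero-order COD decay line, COD(t) = COD₀ - k·t. The replicate efficiencies and COD readings are hypothetical placeholders, not the study's data.

```python
import numpy as np

# Larger-the-better S/N ratio used in Taguchi analysis:
#   S/N = -10 * log10( (1/n) * sum(1 / y_i**2) )
def sn_larger_is_better(y):
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

# Hypothetical removal-efficiency replicates (%) for one L16 trial
print(f"S/N = {sn_larger_is_better([94.2, 95.1, 93.8]):.2f} dB")

# Zero-order kinetics, COD(t) = COD0 - k*t: a straight-line fit of
# COD against time gives the rate constant k (data below hypothetical)
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])              # h
cod = np.array([500.0, 420.0, 345.0, 260.0, 180.0])  # mg/L
slope, cod0 = np.polyfit(t, cod, 1)
print(f"k = {-slope:.1f} mg/(L*h), COD0 = {cod0:.0f} mg/L")
```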
The Internet of Things is a highly important subject at present, and it has attracted the attention of researchers and scientists because of its role in human life: through it, a person can carry out many tasks easily, accurately, and in an organized manner. The research addresses several key topics, most importantly the concept of the Internet of Things, the history of its emergence and development, the reasons for the interest in it and its importance, and its most prominent advantages and characteristics. The research sheds light on the architecture of the Internet of Things and its main structural components. The research also dealt with the most important search engines on the Internet
The goal of this research is to develop a numerical model that can simulate the sedimentation process under two scenarios: first, with the flocculation unit in service, and second, with the flocculation unit out of service. The general equations of flow and sediment transport were solved using the finite difference method and then coded in Matlab. The results showed that the removal efficiencies given by the coded model and the operational model were very close for each particle-size dataset, with a difference of +3.01%, indicating that the model can be used to predict the removal efficiency of a rectangular sedimentation basin. The study also revealed
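As a loose sketch of how a finite-difference sedimentation model of this kind can be built (the study's actual governing equations are not reproduced here), the following solves 1-D particle settling with an explicit upwind scheme and reports the removal efficiency after one hydraulic residence time. The depth, residence time, and settling velocity are hypothetical.

```python
import numpy as np

# Explicit upwind finite difference for dC/dt + ws*dC/dz = 0
# (z positive downward) in an idealized rectangular basin.
H = 3.0                  # basin depth, m (hypothetical)
T = 2.0 * 3600.0         # hydraulic residence time, s (hypothetical)
ws = 0.3e-3              # particle settling velocity, m/s (hypothetical)
nz = 60
dz = H / nz
dt = 0.5 * dz / ws       # satisfies the CFL condition ws*dt/dz <= 1
C = np.ones(nz)          # dimensionless suspended concentration

t = 0.0
while t < T:
    C[1:] -= ws * dt / dz * (C[1:] - C[:-1])
    C[0] -= ws * dt / dz * C[0]   # no particles enter from above the surface
    t += dt

efficiency = 1.0 - C.mean()       # fraction of mass settled out
print(f"predicted removal efficiency ~ {100 * efficiency:.1f} %")
# For an ideal basin this approaches the Hazen result min(1, ws*T/H) = 72 %.
```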
Nanosilica was extracted from rice husk collected locally from an Iraqi mill in the Al-Mishikhab district, Najaf Governorate, Iraq. The precipitation method was used to prepare nanosilica powder from rice husk ash after treating it thermally at 700°C, followed by dissolving the silica in an alkaline solution to obtain a sodium silicate solution. Two samples of the final solution were collected to study the effect of filtration on sample purity by X-ray fluorescence (XRF) spectrometry. The results show that the filtered sample had a higher purity than the non-filtered sample. Structural analysis by X-ray diffraction (XRD) found that the nanosilica powder has an amorphous structure
Regression models are among the most important models used in modern studies, especially in research and health studies, because of the important results they achieve. Two regression models were used, the Poisson regression model and the Conway-Maxwell-Poisson regression model. This study aimed to compare the two models and choose the better of them using simulation, at different sample sizes (n = 25, 50, 100) with r = 1000 replications. Matlab was used to conduct the simulation experiments. The results showed the superiority of the Poisson model according to both the mean squared error (MSE) criterion and the Akaike information criterion (AIC) for the same distribution.
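A minimal sketch of this kind of simulation comparison, written in Python rather than Matlab and using our own simulated design (the true coefficients, covariate, and seed are assumptions): generate Poisson counts, fit a Poisson GLM, and average the MSE of the estimates and the AIC over the replications.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
beta_true = np.array([0.5, 0.8])   # assumed true coefficients
n, r = 50, 1000                    # one of the sample sizes, and the replications
mse_sum = aic_sum = 0.0

for _ in range(r):
    x = rng.normal(size=n)
    X = sm.add_constant(x)
    y = rng.poisson(np.exp(X @ beta_true))   # simulated Poisson response
    fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    mse_sum += np.mean((fit.params - beta_true) ** 2)
    aic_sum += fit.aic

print(f"mean MSE of estimates: {mse_sum / r:.5f}")
print(f"mean AIC: {aic_sum / r:.2f}")
```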
Due to the rapid advancement of technology and the Internet of Things, modern industries need high-precision equipment and surface finishing, so many finishing processes have been developed. One of these modern processes is Magnetic Abrasive Finishing (MAF), a high-precision process for internal and external finishing in which abrasive particles act under the influence of a magnetic field. Boron carbide (B4C) ceramic was tested by mixing it with iron (Fe) to produce abrasive particles that reduce the intensity of scraping on the surface, reduce the economic cost, and achieve a high finish while removing edges at the same time. The material selected for the samples was mild steel (ASTM E415), under (Quantity of Abrasives, Mac
This research deals with the use of a number of statistical methods, such as the kernel method, watershed, histogram, and cubic spline, to improve the contrast of digital images. The results obtained according to the RMSE and NCC criteria show that the spline method is the most accurate compared with the other statistical methods.
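As a small illustration of one of the methods named (histogram-based enhancement) together with the two quality criteria, the sketch below equalizes a synthetic 8-bit image and computes RMSE and NCC; the test image is random, not from the paper.

```python
import numpy as np

# Histogram equalization of an 8-bit grayscale image via the
# cumulative distribution function of its histogram.
def equalize(img):
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = np.cumsum(hist) / img.size
    lut = np.round(255 * cdf).astype(np.uint8)
    return lut[img]

# Root mean square error between two images
def rmse(a, b):
    return np.sqrt(np.mean((a.astype(float) - b.astype(float)) ** 2))

# Normalized cross-correlation between two images
def ncc(a, b):
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    return np.sum(a * b) / np.sqrt(np.sum(a**2) * np.sum(b**2))

img = np.random.default_rng(0).integers(60, 180, (64, 64)).astype(np.uint8)
out = equalize(img)
print(f"RMSE = {rmse(img, out):.2f}, NCC = {ncc(img, out):.3f}")
```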
Compressing speech reduces data storage requirements and the time needed to transmit digitized speech over long-haul links such as the internet. To obtain the best performance in speech compression, wavelet transforms require filters that combine a number of desirable properties, such as orthogonality and symmetry. The MCT basis functions are derived from the GHM basis functions using 2D linear convolution. The fast computation algorithms introduced here add desirable features to the transform. We further assess the performance of the MCT in a speech compression application. This paper discusses the effect of using the DWT and the MCT (one- and two-dimensional) on speech compression. DWT and MCT performances in terms of comp
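The MCT itself is not reproduced here, but the sketch below shows the standard DWT-based compression step such evaluations rest on: decompose a signal, keep only the largest coefficients, reconstruct, and measure the error. The test signal, wavelet (db4), decomposition level, and retention fraction are all assumptions.

```python
import numpy as np
import pywt

fs = 8000
t = np.arange(fs) / fs
speech = np.sin(2 * np.pi * 200 * t) * np.exp(-3 * t)  # stand-in for a speech frame

# Multilevel DWT, then keep only the top 10% of coefficients by magnitude
coeffs = pywt.wavedec(speech, "db4", level=5)
flat, slices = pywt.coeffs_to_array(coeffs)
keep = 0.10
thresh = np.quantile(np.abs(flat), 1 - keep)
flat_c = np.where(np.abs(flat) >= thresh, flat, 0.0)

# Reconstruct and measure the reconstruction quality
rec = pywt.waverec(pywt.array_to_coeffs(flat_c, slices, output_format="wavedec"), "db4")
snr = 10 * np.log10(np.sum(speech**2) / np.sum((speech - rec[: len(speech)]) ** 2))
print(f"retained {keep:.0%} of coefficients, reconstruction SNR = {snr:.1f} dB")
```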
The penalized least squares method is a popular way to deal with high-dimensional data, where the number of explanatory variables is larger than the sample size. Penalized least squares gives high prediction accuracy and performs estimation and variable selection at once. It produces a sparse model, that is, a model with few variables, which can be interpreted easily. Penalized least squares is not robust, however: it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain a robust penalized least squares method and a robust penalized estimator and
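A minimal sketch of the idea, under our own choices of loss and solver (Huber loss plus an L1 penalty, solved by proximal gradient; not necessarily the estimator the authors study): the soft-thresholding step yields sparsity, while the bounded Huber gradient limits the influence of outliers.

```python
import numpy as np

# Derivative of the Huber loss: linear inside [-delta, delta], clipped outside
def huber_grad(r, delta=1.345):
    return np.where(np.abs(r) <= delta, r, delta * np.sign(r))

# Proximal gradient (ISTA) for: min_b (1/n) sum huber(y - Xb) + lam*||b||_1
def robust_lasso(X, y, lam=0.1, iters=2000):
    n, p = X.shape
    step = 1.0 / np.linalg.norm(X, 2) ** 2        # conservative, stable step size
    b = np.zeros(p)
    for _ in range(iters):
        g = -X.T @ huber_grad(y - X @ b) / n      # gradient of the Huber loss
        z = b - step * g
        b = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
    return b

rng = np.random.default_rng(1)
n, p = 100, 20
X = rng.normal(size=(n, p))
b_true = np.zeros(p)
b_true[:3] = [2.0, -1.5, 1.0]                     # sparse ground truth (simulated)
y = X @ b_true + rng.normal(scale=0.3, size=n)
y[:5] += 15.0                                     # inject gross outliers
print(np.round(robust_lasso(X, y, lam=0.05), 2))
```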
Diabetes is an increasingly common chronic disease, affecting millions of people around the world. Diagnosis, prediction, proper treatment, and management of diabetes are essential. Machine learning based prediction techniques for diabetes data analysis can help in the early detection and prediction of the disease and of its consequences, such as hypo/hyperglycemia. In this paper, we explored a diabetes dataset collected from the medical records of one thousand Iraqi patients. We applied three classifiers: the multilayer perceptron, KNN, and random forest. We carried out two experiments: the first experiment used all 12 features of the dataset, where random forest outperformed the others with 98.8% accuracy. The second experiment used only five attributes
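A minimal sketch of this experimental setup with scikit-learn, using synthetic stand-in data since the Iraqi patient records are not available here:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for the 1000-patient, 12-feature dataset
X, y = make_classification(n_samples=1000, n_features=12, n_informative=6,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# The three classifiers named in the abstract
models = {
    "MLP": MLPClassifier(max_iter=1000, random_state=0),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
}
for name, model in models.items():
    acc = accuracy_score(y_te, model.fit(X_tr, y_tr).predict(X_te))
    print(f"{name}: accuracy = {acc:.3f}")
```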