The electro-Fenton oxidation process is one of the essential advanced electrochemical oxidation processes used to treat phenol and its derivatives in wastewater. The process was carried out at ambient temperature at different current densities (2, 4, 6, and 8 mA/cm2) for up to 6 h. Sodium sulfate at a concentration of 0.05 M was used as the supporting electrolyte, and 0.4 mM ferrous ion (Fe2+) was used as the catalyst. The electrolytic cell consisted of a graphite anode modified by electrodepositing a layer of PbO2 on its surface and a carbon-fiber cathode modified with graphene. The results indicated that phenol concentration decreases with increasing current density; the minimum phenol concentration obtained after 6 h of electrolysis at 8 mA/cm2 was 7.82 ppm, starting from an initial concentration of about 155 ppm. The kinetic study of phenol oxidation at the different current densities showed that the reaction followed pseudo first-order kinetics with respect to current density. Energetic parameters such as specific power consumption and current efficiency were also estimated at each current density. The results showed that increasing the current density increased the specific power consumption of the process and decreased the current efficiency.
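A pseudo first-order decay implies ln(C0/C) = k·t, so the rate constant k can be recovered as the slope of ln(C0/C) against time. The sketch below illustrates this fit with hypothetical concentration data chosen to resemble the reported endpoints (155 ppm initial, ~7.8 ppm after 6 h); these are not the paper's measurements.

```python
# Sketch: estimating a pseudo first-order rate constant from phenol
# concentration data. The time/concentration pairs are hypothetical
# illustrations, not the study's measurements.
import math

times = [0, 1, 2, 3, 4, 5, 6]                      # h (hypothetical)
conc = [155.0, 94.0, 57.0, 35.0, 21.0, 13.0, 7.8]  # ppm (hypothetical)

# pseudo first-order: ln(C0/C) = k * t, so k is the slope of ln(C0/C) vs t
x = times
y = [math.log(conc[0] / c) for c in conc]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
k = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum(
    (xi - mx) ** 2 for xi in x
)  # least-squares slope; ~0.5 per hour for this illustrative data
print(f"k = {k:.3f} 1/h")
```

A straight line of ln(C0/C) versus t (high R²) is the usual evidence that the pseudo first-order model holds at a given current density.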
Flexible pavements are an essential element of transportation infrastructure, so evaluating flexible pavement performance is necessary for proper management of that infrastructure. The pavement condition index (PCI) and the international roughness index (IRI) are common indices for evaluating pavement surface condition. However, the condition surveys required to calculate PCI are costly and time-consuming compared with IRI measurement. This article focuses on developing regression models that predict PCI from IRI. Eighty-three flexible pavement sections, each 250 m long, were selected in Al-Diwaniyah, Iraq, to develop PCI-IRI relationships. In terms of the quantity and severity of eac
Parasitological examination of the gills of three species of sparid fishes in the territorial waters of Iraq was performed, and two diplectanid monogenoids were isolated and described: Lamellodiscus indicus Tripathi, 1959 from both the Haffara seabream Rhabdosargus haffara (Forsskål, 1775) and the Goldline seabream R. sarba (Forsskål, 1775), and Protolamellodiscus senilobatus Kritsky, Jiménez-Ruiz and Sey, 2000 from the King soldierbream Argyrops spinifer (Forsskål, 1775). These parasites are new records for the parasite fauna of Iraq. L. indicus is redescribed for the first time from a new distribution area (the Arabian Gulf), and R. haffara is considered a new host record.
Compressing speech reduces data storage requirements and thus the time needed to transmit digitized speech over long-haul links such as the Internet. To obtain the best performance in speech compression, wavelet transforms require filters that combine a number of desirable properties, such as orthogonality and symmetry. The MCT basis functions are derived from the GHM basis functions using 2D linear convolution. The fast computation algorithms introduced here add desirable features to the current transform. We further assess the performance of the MCT in a speech compression application. This paper discusses the effect of using the DWT and the MCT (one- and two-dimensional) on speech compression. DWT and MCT performances in terms of comp
Eye detection is used in many applications, such as pattern recognition, biometrics, and surveillance systems. In this paper, a new method is presented to detect and extract the overall shape of one eye from an image based on two principles: Helmholtz and Gestalt. According to the Helmholtz principle of perception, any observed geometric shape is perceptually "meaningful" if its number of repetitions is very small in an image with random distribution. To achieve this goal, the Gestalt principle states that humans see things either by grouping similar elements or by recognizing patterns. In general, according to the Gestalt principle, humans see things through genera
Neural cryptography deals with the problem of key exchange between two neural networks using the mutual learning concept. The two networks exchange their outputs (in bits), and the key between the two communicating parties is eventually represented in the final learned weights, at which point the two networks are said to be synchronized. The security of neural synchronization is put at risk if an attacker is capable of synchronizing with either of the two parties during the training process.
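The mutual learning scheme described above can be illustrated with a minimal Tree Parity Machine (TPM) sketch, the architecture most commonly used in neural key exchange. The TPM structure and the parameters K, N, and L below are illustrative assumptions, not details taken from the abstract.

```python
# Sketch: mutual learning between two Tree Parity Machines (TPMs).
# K hidden units, N inputs per unit, weights bounded in [-L, L]
# (all hypothetical, chosen small for demonstration).
import random

K, N, L = 3, 4, 3

def tpm_output(weights, inputs):
    # hidden outputs sigma_i = sign(w_i . x_i); network output tau = product
    sigmas = []
    for wi, xi in zip(weights, inputs):
        s = sum(w * x for w, x in zip(wi, xi))
        sigmas.append(1 if s >= 0 else -1)
    tau = 1
    for s in sigmas:
        tau *= s
    return tau, sigmas

def hebbian_update(weights, inputs, sigmas, tau):
    # update only hidden units agreeing with the network output; clip to [-L, L]
    for wi, xi, s in zip(weights, inputs, sigmas):
        if s == tau:
            for j in range(N):
                wi[j] = max(-L, min(L, wi[j] + xi[j] * tau))

rng = random.Random(0)
A = [[rng.randint(-L, L) for _ in range(N)] for _ in range(K)]
B = [[rng.randint(-L, L) for _ in range(N)] for _ in range(K)]

steps = 0
while A != B and steps < 20000:
    # both parties see the same public random input
    x = [[rng.choice([-1, 1]) for _ in range(N)] for _ in range(K)]
    tA, sA = tpm_output(A, x)
    tB, sB = tpm_output(B, x)
    if tA == tB:  # learn only when the exchanged output bits agree
        hebbian_update(A, x, sA, tA)
        hebbian_update(B, x, sB, tB)
    steps += 1

print("synchronized:", A == B, "after", steps, "steps")
```

Once the weight matrices are identical, they serve as the shared secret; the attack mentioned in the abstract amounts to a third TPM trying to reach the same synchronized state by eavesdropping on the exchanged output bits.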
The penalized least squares method is a popular method for dealing with high-dimensional data, where the number of explanatory variables is larger than the sample size. The penalized least squares method offers high prediction accuracy and performs estimation and variable selection at once. It gives a sparse model, that is, a model with few variables, which can be interpreted easily. However, penalized least squares is not robust, meaning it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain a robust penalized least squares method and a robust penalized estimator and
Diabetes is an increasingly common chronic disease, affecting millions of people around the world. Diabetes diagnosis, prediction, proper treatment, and management are essential. Machine learning-based prediction techniques for diabetes data analysis can help in the early detection of the disease and the prediction of its consequences, such as hypo-/hyperglycemia. In this paper, we explored a diabetes dataset collected from the medical records of one thousand Iraqi patients. We applied three classifiers: the multilayer perceptron, k-nearest neighbors (KNN), and random forest. We conducted two experiments: the first used all 12 features of the dataset, where random forest outperformed the others with 98.8% accuracy. The second experiment used only five att
Reinforcing asphalt concrete with polyester fibers is considered an effective remedy for alleviating the harmful impact of fatigue deterioration. This study investigates the use of two fiber sizes, 6.35 mm by 3.00 mm and 12.70 mm by 3.00 mm, each at concentrations of 0.25 %, 0.50 %, and 0.75 % by weight of the mixture. The asphalt mixture consisted of (40-50) asphalt cement at its optimum content, a 12.50 mm nominal maximum aggregate size, and limestone dust as filler. Following the traditional asphalt cement and aggregate tests, three essential tests were carried out on the mixtures, namely: the Marshall test (105 cylindrical specimens), the indirect tensile strength test (21 cylindrical specimens)