Phenol removal by electro-Fenton process using a 3D electrode with iron foam as particles and carbon fibre modified with graphene

The 3D electro-Fenton technique is, owing to its high efficiency, one of the technologies suggested for eliminating organic pollutants from wastewater. The type of particle electrode used in the 3D electro-Fenton process is one of the most crucial variables because of its effect on the formation of reactive species and on the source of iron ions. The electrolytic cell in the current study consisted of graphite as an anode, carbon fiber (CF) modified with graphene as a cathode, and iron foam particles as a third electrode. A response surface methodology (RSM) approach was used to optimize the 3D electro-Fenton process. The RSM results revealed that the quadratic model has a high R² of 99.05%. At 4 g L⁻¹ of iron foam particles, a time of 5 h, and 1 g of graphene, a maximum phenol removal efficiency of 92.58% and a chemical oxygen demand (COD) removal of 89.33% were achieved, with a power consumption of 32.976 kWh kg⁻¹ phenol. Based on the analysis of variance (ANOVA) results, time has the highest impact on phenol removal efficiency, followed by the iron foam and graphene dosages. In the present study, the 3D electro-Fenton technique with iron foam particles and graphene-modified carbon fiber proved to be an excellent choice for removing phenol from aqueous solutions, owing to its high efficiency, its formation of highly reactive species, and the iron foam's role as an excellent source of iron ions.
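As a hedged illustration of the RSM step described above, the sketch below fits a quadratic response-surface model to synthetic data by least squares and reports its R²; the variable ranges, coefficients, and data are invented placeholders, not the study's measurements.

```python
import numpy as np

# Hypothetical quadratic response-surface fit (RSM-style) on synthetic data.
# Factors mimic the abstract's variables (iron foam dose, time, graphene mass),
# but all values here are illustrative, not the study's.
rng = np.random.default_rng(0)
X = rng.uniform([1, 1, 0.2], [4, 5, 1], size=(30, 3))  # dose, time, graphene
true_b = np.array([10.0, 8.0, 12.0, 5.0, -0.8, -1.0, -2.0])  # invented

def design(X):
    # intercept + linear + pure quadratic terms
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3, x1**2, x2**2, x3**2])

y = design(X) @ true_b + rng.normal(0, 0.5, len(X))    # noisy responses
b, *_ = np.linalg.lstsq(design(X), y, rcond=None)      # least-squares fit
pred = design(X) @ b
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"R2 = {r2:.4f}")
```

A high R², as reported in the abstract, indicates the quadratic surface captures nearly all response variation over the tested factor ranges.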

Scopus Clarivate Crossref
Publication Date
Sun Jul 31 2022
Journal Name
Iraqi Journal Of Science
Solving the Created Equations from Power Function Distribution

In this paper, a new class of ordinary differential equations is designed for functions of the power function distribution, such as the probability density function, cumulative distribution function, survival function, and hazard function; these functions are used in the class under study. The benefit of this work is that the equations generated from probability distributions can be used to model and solve real-life problems, with the solutions of these equations serving as solutions to those problems, since the solutions of the equations under study are the closest and most reliable representations of reality. The existence and uniqueness of solutions of the obtained equations in the current study are discussed.
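The four functions named above can be sketched for the power function distribution as follows; the parameter names `alpha` (shape) and `theta` (scale) are generic placeholders, not the paper's notation, and the support is taken as 0 < x < theta.

```python
# Sketch of the power function distribution's four functions:
# pdf f(x) = alpha * x^(alpha-1) / theta^alpha on (0, theta),
# cdf F(x) = (x / theta)^alpha, survival S = 1 - F, hazard h = f / S.
def pdf(x, alpha, theta):
    return alpha * x ** (alpha - 1) / theta ** alpha

def cdf(x, alpha, theta):
    return (x / theta) ** alpha

def survival(x, alpha, theta):
    return 1.0 - cdf(x, alpha, theta)

def hazard(x, alpha, theta):
    return pdf(x, alpha, theta) / survival(x, alpha, theta)

print(cdf(1.0, 2.0, 2.0))  # (1/2)^2 = 0.25
```

Differential equations for these functions (the paper's subject) can then be built from identities such as F'(x) = f(x) and h(x) = f(x)/S(x).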

Scopus Crossref
Publication Date
Tue Oct 20 2020
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Employ Shrinkage Estimation Technique for the Reliability System in Stress-Strength Models: special case of Exponentiated Family Distribution

A reliability system for the multi-component stress-strength model R(s,k) is considered in the present paper, where the stress and strength are independent and non-identically distributed, following the Exponentiated Family Distribution (FED) with unknown shape parameter α, known scale parameter λ equal to two, and parameter θ equal to three. Different estimation methods for R(s,k) were introduced, corresponding to the maximum likelihood and shrinkage estimators. Comparisons among the suggested estimators were carried out through simulation based on the mean squared error (MSE) criterion.
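The quantity R(s,k) = P(at least s of k strengths exceed the stress) can be approximated by Monte Carlo simulation, as sketched below with an exponentiated-exponential member of the family; the parameter values are illustrative, not the paper's, and this is a simulation check rather than the paper's estimators.

```python
import numpy as np

# Monte Carlo sketch of the multi-component stress-strength reliability
# R(s,k) = P(at least s of k strengths exceed the common stress).
# Sampling uses inverse transform for an exponentiated exponential:
# CDF (1 - e^{-lam x})^alpha  =>  x = -ln(1 - u^{1/alpha}) / lam.
rng = np.random.default_rng(1)

def rvs(alpha, lam, size):
    u = rng.uniform(size=size)
    return -np.log1p(-u ** (1.0 / alpha)) / lam

def r_sk(s, k, a_strength, a_stress, lam=1.0, n=200_000):
    stress = rvs(a_stress, lam, n)                    # one stress per trial
    strength = rvs(a_strength, lam, (n, k))           # k strengths per trial
    exceed = (strength > stress[:, None]).sum(axis=1)
    return (exceed >= s).mean()

est = r_sk(s=2, k=4, a_strength=3.0, a_stress=2.0)    # illustrative parameters
print(round(float(est), 3))
```

Such a simulated value serves as a ground truth against which MLE and shrinkage estimators can be compared by MSE, as the paper does.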

Crossref
Publication Date
Tue Oct 30 2018
Journal Name
Journal Of Engineering
Statistical Equations to Estimate the In-situ Concrete Compressive Strength from Non-destructive Tests

The aim of this study is to propose reliable equations to estimate the in-situ concrete compressive strength from non-destructive tests. Three equations were proposed: the first considers the rebound hammer number only, the second considers the ultrasonic pulse velocity only, and the third combines the rebound hammer number and the ultrasonic pulse velocity. The proposed equations were derived from non-linear regression analysis and were calibrated with the test results of 372 concrete specimens compiled from the literature. The performance of the proposed equations was tested by comparing their strength estimates with those of related existing equations from the literature. Comparisons …
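A common form for the combined (SonReb-type) equation is a power model fc = a·R^b·V^c, which becomes linear after taking logarithms. The sketch below fits such a model to synthetic data; the coefficients, data ranges, and noise level are invented placeholders, not the paper's calibrated equation.

```python
import numpy as np

# Hypothetical power-model fit fc = a * R^b * V^c by log-linear least
# squares on synthetic data (R: rebound number, V: pulse velocity, km/s).
# All numbers below are illustrative, not the study's 372-specimen data.
rng = np.random.default_rng(2)
R = rng.uniform(20, 50, 40)
V = rng.uniform(3.5, 5.0, 40)
fc = 0.05 * R**1.2 * V**1.3 * rng.lognormal(0, 0.03, 40)  # synthetic strengths

A = np.column_stack([np.ones(40), np.log(R), np.log(V)])  # log-linear design
coef, *_ = np.linalg.lstsq(A, np.log(fc), rcond=None)
a, b, c = np.exp(coef[0]), coef[1], coef[2]
print(f"fc = {a:.3f} * R^{b:.2f} * V^{c:.2f}")
```

Nonlinear regression on the original scale (as the abstract states) would weight errors differently, but the log-linear version conveys the calibration idea compactly.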

Crossref (4)
Publication Date
Sat Jun 04 2022
Journal Name
Al–bahith Al–a'alami
Speaking Truth to Power: Core Principles for Advancing International Journalism Education

A confluence of forces has brought journalism and journalism education to a precipice. The rise of fascism, the advance of digital technology, and the erosion of the economic foundation of news media are disrupting journalism and mass communication (JMC) around the world. Combined with the increasingly globalized nature of journalism and media, these forces are posing extraordinary challenges to and opportunities for journalism and media education. This essay outlines 10 core principles to guide and reinvigorate international JMC education. We offer a concluding principle for JMC education as a foundation for the general education of college students.

Crossref
Publication Date
Wed Jun 29 2022
Journal Name
Journal Of Al-rafidain University College For Sciences (Print ISSN: 1681-6870, Online ISSN: 2790-2293)
The Use Of Genetic Algorithm In Estimating The Parameter Of Finite Mixture Of Linear Regression

The estimation of linear regression parameters is usually based on the Least Squares method, which relies on several basic assumptions, such as homoscedastic variance and normally distributed errors; the accuracy of the parameter estimates therefore depends on the validity of these assumptions. These assumptions are not achievable when the problem under study involves complex data arising from more than one model, and the usual model then becomes unrealistic. The most successful technique in this situation is the robust MM-estimation method (minimizing a maximum-likelihood-type estimator), which has proved its efficiency for this purpose. …
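To illustrate why robust estimation helps when the error assumptions fail, the sketch below runs a simplified iteratively reweighted least squares (IRLS) with Huber weights on data containing gross outliers; this is a stand-in for the robustness idea, not the paper's full MM-estimator or its genetic-algorithm tuning.

```python
import numpy as np

# Simplified robust regression via IRLS with Huber weights on synthetic
# data with gross outliers; a stand-in illustration, not an MM-estimator.
rng = np.random.default_rng(3)
x = rng.uniform(0, 10, 50)
y = 2.0 + 1.5 * x + rng.normal(0, 0.3, 50)  # true line: 2 + 1.5 x
y[:5] += 20.0                               # five gross outliers

X = np.column_stack([np.ones_like(x), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]  # ordinary LS start
for _ in range(50):
    r = y - X @ beta
    s = np.median(np.abs(r - np.median(r))) / 0.6745  # robust MAD scale
    u = np.abs(r) / (1.345 * s)                       # standardized residuals
    w = np.where(u <= 1, 1.0, 1.0 / u)                # Huber weights
    sw = np.sqrt(w)
    beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
print(beta.round(2))
```

The robust fit recovers a slope near the true 1.5 despite the contaminated points, which ordinary least squares would not.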

Crossref
Publication Date
Sat Jun 01 2024
Journal Name
Journal Of Engineering
Intelligent Dust Monitoring System Based on IoT

Dust, one of the most dangerous issues facing people today, is a frequent contributor to health risks and climate change. Desertification, drought, agricultural practices, and sand and dust storms from neighboring regions bring on this issue. Deep learning (DL) regression based on long short-term memory (LSTM) was proposed as a solution to increase the accuracy of dust forecasting and monitoring. The proposed system has two parts for detecting and monitoring dust: in the first, LSTM and dense layers are used to build a dust-detection system, while in the second, the proposed Wireless Sensor Network (WSN) and Internet of Things (IoT) model is used for forecasting and monitoring. The experimental DL system …
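The gating mechanism at the core of an LSTM layer can be sketched as a single-cell forward step in NumPy; the weights here are random placeholders, not a trained dust-forecasting model, and the input size stands in for a few sensor readings per time step.

```python
import numpy as np

# Single LSTM cell forward step (the building block of LSTM regression).
# Weights are random placeholders; sizes are illustrative only.
rng = np.random.default_rng(4)
n_in, n_hid = 3, 8                 # e.g. 3 sensor readings, 8 hidden units

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W = rng.normal(0, 0.1, (4 * n_hid, n_in + n_hid))  # stacked gate weights
b = np.zeros(4 * n_hid)

def lstm_step(x, h, c):
    z = W @ np.concatenate([x, h]) + b
    i, f, o, g = np.split(z, 4)                    # input/forget/output/cand.
    i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
    c_new = f * c + i * g                          # cell state update
    h_new = o * np.tanh(c_new)                     # hidden state output
    return h_new, c_new

h = c = np.zeros(n_hid)
for t in range(5):                                 # short input sequence
    h, c = lstm_step(rng.normal(size=n_in), h, c)
print(h.shape)
```

In the full system, such hidden states would feed dense layers that regress the dust level, with WSN/IoT nodes supplying the sensor sequence.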

Publication Date
Wed Jan 01 2020
Journal Name
Iop Conference Series: Materials Science And Engineering
Determination of naturally occurring radionuclides in Disi aquifer water of Jordan

The Disi water samples were collected from different Disi aquifer wells in Jordan using clean 10-liter polyethylene containers. A hyper-pure germanium (HPGe) detector with high-resolution gamma-ray spectroscopy and a low-background counting system was used to identify gamma rays emitted by radionuclides in the environmental samples. The specific activity concentrations of 226Ra and 228Ra in the Disi aquifer water were found to range from 0.302 ± 0.085 to 0.723 ± 0.207 and from 0.047 ± 0.010 to 0.525 ± 0.138 Bq L−1, with average values of 0.516 ± 0.090 and 0.287 ± 0.091 Bq L−1, respectively. The average combined radium (226Ra + 228Ra) activity and radium activity ratio (228Ra/226Ra) in Disi …
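From the averages quoted above, the combined radium activity and the activity ratio follow by simple arithmetic; the sketch below also propagates the quoted uncertainties in quadrature, a standard assumption for independent errors (the paper may use a different error treatment).

```python
import math

# Combined radium activity (226Ra + 228Ra) and ratio (228Ra/226Ra) from
# the reported average concentrations, with quadrature error propagation.
ra226, u226 = 0.516, 0.090   # Bq/L, average 226Ra and its uncertainty
ra228, u228 = 0.287, 0.091   # Bq/L, average 228Ra and its uncertainty

combined = ra226 + ra228
u_combined = math.hypot(u226, u228)               # sqrt(u226^2 + u228^2)
ratio = ra228 / ra226
u_ratio = ratio * math.hypot(u226 / ra226, u228 / ra228)
print(f"{combined:.3f} ± {u_combined:.3f} Bq/L, ratio {ratio:.2f} ± {u_ratio:.2f}")
```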

Scopus (1)
Publication Date
Wed Aug 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
Choose the best model to measure the impact of human capital on labor productivity in the manufacturing sector in Iraq

In this paper, the all-possible-regressions procedure as well as the stepwise regression procedure were applied to select the best regression equation explaining the effect of human capital, represented by different levels of human cadres, on the productivity of the processing industries sector in Iraq, using time-series data covering a 21-year period. The statistical program SPSS was used to perform the required calculations.
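The stepwise idea can be sketched as forward selection by R² gain; the synthetic predictors below stand in for the human-capital variables and the stopping threshold is an invented placeholder, not the paper's SPSS settings.

```python
import numpy as np

# Forward stepwise selection by R^2 gain on synthetic data; predictors
# stand in for human-capital levels, not the study's actual series.
rng = np.random.default_rng(5)
n, p = 21, 4                      # 21 yearly observations, 4 candidates
X = rng.normal(size=(n, p))
y = 3.0 * X[:, 1] - 2.0 * X[:, 3] + rng.normal(0, 0.5, n)

def r2(cols):
    A = np.column_stack([np.ones(n)] + [X[:, j] for j in cols])
    pred = A @ np.linalg.lstsq(A, y, rcond=None)[0]
    return 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)

selected = []
for _ in range(p):
    remaining = [j for j in range(p) if j not in selected]
    best = max(remaining, key=lambda j: r2(selected + [j]))
    if r2(selected + [best]) - r2(selected) < 0.01:
        break                      # stop when the R^2 gain is negligible
    selected.append(best)
print(sorted(selected))
```

All-possible-regressions would instead score every subset of the p candidates, which is feasible here since 2^4 = 16 subsets.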

Crossref
Publication Date
Sun Jun 12 2011
Journal Name
Baghdad Science Journal
An algorithm for binary codebook design based on the average bitmap replacement error (ABPRE)

In this paper, an algorithm for binary codebook design is used in the vector quantization (VQ) technique to improve the acceptability of the absolute moment block truncation coding (AMBTC) method. VQ is used to compress the bitmap output of the first method (AMBTC). The binary codebook can be generated for many images by randomly choosing code vectors from a set of binary image vectors, and this codebook is then used to compress all bitmaps of these images. The bitmap of an image is chosen for compression with this codebook based on the criterion of the average bitmap replacement error (ABPRE). This approach is suitable for reducing bit rates …
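The core loop can be sketched as follows: draw a random binary codebook from training bitmap vectors, then replace each bitmap block by its nearest codeword under Hamming distance, and measure the average fraction of flipped bits. Block and codebook sizes are illustrative placeholders, not the paper's.

```python
import numpy as np

# Sketch of binary VQ for AMBTC bitmaps: random codebook selection, then
# nearest-codeword (minimum Hamming distance) replacement of each block.
rng = np.random.default_rng(6)
blocks = rng.integers(0, 2, size=(64, 16))                # 4x4 bitmap blocks
codebook = blocks[rng.choice(64, size=8, replace=False)]  # random codewords

def encode(block):
    d = (codebook != block).sum(axis=1)   # Hamming distance to each codeword
    return int(d.argmin())

indices = np.array([encode(b) for b in blocks])
# average bitmap replacement error: mean fraction of bits flipped by VQ
abpre = np.mean([(codebook[i] != b).mean() for i, b in zip(indices, blocks)])
print(indices.shape, round(float(abpre), 3))
```

Transmitting a 3-bit codeword index instead of a 16-bit bitmap is where the bit-rate reduction comes from, at the cost of the replacement error measured here.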

Crossref
Publication Date
Sun Jun 12 2011
Journal Name
Baghdad Science Journal
Development Binary Search Algorithm

There are many methods of searching a large amount of data to find one particular piece of information, such as finding a person's name in a mobile phone's contact list. Certain methods of organizing data make the search process more efficient; the objective of these methods is to find the element with the least cost (least time). The binary search algorithm is faster than sequential search and other commonly used search algorithms. This research develops the binary search algorithm by using a new structure called Triple, in which data are represented as triples, each consisting of three locations (1-Top, 2-Left, and 3-Right). Binary search divides the search interval in half; this process determines the maximum number of comparisons …
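For reference, the standard interval-halving binary search the research builds on looks like this (the paper's Triple-based variant is not reproduced here); the contact-list data is an invented example.

```python
# Standard iterative binary search over a sorted list: halve the interval
# each step, giving at most O(log n) comparisons.
def binary_search(a, target):
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2        # midpoint of the current interval
        if a[mid] == target:
            return mid
        if a[mid] < target:
            lo = mid + 1            # discard the lower half
        else:
            hi = mid - 1            # discard the upper half
    return -1                       # not found

names = ["Ali", "Huda", "Noor", "Omar", "Zainab"]  # sorted contact list
print(binary_search(names, "Noor"))  # → 2
```

Each comparison halves the remaining interval, so at most ⌈log₂(n+1)⌉ comparisons are needed; the paper's Triple structure aims to improve on this average-case cost.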

Crossref