Chromium (VI) Removal from Wastewater by Electrocoagulation Process Using Taguchi Method: Batch Experiments

Electrocoagulation is an electrochemical method for treating different types of wastewater, whereby sacrificial anodes corrode to release active coagulant (usually aluminium or iron cations) into solution, while the simultaneous evolution of hydrogen at the cathode allows pollutant removal by flotation or settling. The Taguchi method was applied as an experimental design to determine the best conditions for chromium (VI) removal from wastewater. Various parameters were investigated in a batch stirred tank with iron electrodes: pH, initial chromium concentration, current density, distance between electrodes, and KCl concentration, and the results were analyzed using the signal-to-noise (S/N) ratio. It was found that the removal efficiency of chromium increases with increasing current density and KCl concentration and decreases with increasing initial chromium concentration and distance between electrodes, while pH shows a peaked performance curve. Experimental work was performed on synthetic solutions and on real industrial effluent. The results showed that the removal efficiency for the synthetic solution is higher than for industrial wastewater: the maximum removal for the prepared solution was 91.72 %, while it was 73.54 % for industrial wastewater under the same conditions.
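
As a hedged illustration of the Taguchi analysis step (not the authors' code or data), the sketch below computes the larger-is-better S/N ratio for replicate removal-efficiency readings at each setting of a hypothetical orthogonal-array run; the factor levels and efficiency values are invented.

```python
import numpy as np

# Larger-is-better signal-to-noise ratio used in Taguchi analysis:
# S/N = -10 * log10(mean(1 / y^2)) over replicate responses y.
def sn_larger_is_better(y):
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

# Invented removal efficiencies (%) from replicate runs at each
# experimental setting of the orthogonal array.
runs = {
    "run1 (pH 4, 10 mA/cm2)": [88.1, 89.4],
    "run2 (pH 7, 20 mA/cm2)": [91.0, 91.7],
    "run3 (pH 10, 10 mA/cm2)": [73.2, 74.0],
}

for name, reps in runs.items():
    print(f"{name}: S/N = {sn_larger_is_better(reps):.2f} dB")
```

The setting with the highest S/N ratio per factor is the one a Taguchi analysis would select as optimal.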

Publication Date
Fri Apr 12 2019
Journal Name
Journal Of Economics And Administrative Sciences
Accounting Mining Data Using Neural Networks (Case study)

Business organizations have faced many challenges in recent times, the most important of which is information technology, because it is widely spread and easy to use. Its use has led to an increase in the amount of data that business organizations deal with in an unprecedented manner. The amount of data available through the internet is a problem that many parties seek to find solutions for: why is it available there in such huge quantities, and so randomly? Forecasts suggested that by 2017 the number of devices connected to the internet would be roughly three times the population of the Earth, and in 2015 more than one and a half billion gigabytes of data were transferred every minute globally. Thus, so-called data mining emerged as a …

Publication Date
Wed Feb 01 2023
Journal Name
Trends Technological And Science, Engineering
Automated Sorting for Tomatoes using Artificial Neural Network

The experiment was conducted to test the sorting and grading of agricultural crops using image analysis technology. A locally factory-made, cube-shaped studio with dimensions 50 × 75 × 75 and a camera with a charge-coupled device sensor were used. The studio was equipped with triple (red, green, blue) lighting, and photos were taken in the studio to study the external characteristics of the fruits (damage, quality, and maturity) using image processing technology and artificial neural networks. The artificial neural network was used to predict damage, for which the regression value was 0.92062; the regression value for quality was 0.97981 and for maturity was 0.98654, by means of a regression scheme using the network and the Marr algorithm.
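
As a hedged sketch of the kind of regression network described (not the authors' implementation), the example below trains a small scikit-learn MLPRegressor to map invented image-derived fruit features to a quality score; the feature set and data are assumptions for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Invented image-derived features per fruit: mean R, G, B and a texture score.
X = rng.uniform(0.0, 1.0, size=(200, 4))
# Invented quality target loosely tied to the color features, plus noise.
y = 0.6 * X[:, 0] + 0.3 * X[:, 1] - 0.2 * X[:, 3] + rng.normal(0, 0.02, 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A small feed-forward network playing the role of the grading regressor.
model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
model.fit(X_train, y_train)

print(f"R^2 on held-out fruits: {model.score(X_test, y_test):.3f}")
```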

Publication Date
Tue Dec 01 2009
Journal Name
Journal Of Economics And Administrative Sciences
Using Artificial Neural Network Models For Forecasting & Comparison

The artificial neural network methodology is an important and relatively new approach that builds models for analysis, data evaluation, forecasting, and control without depending on a prior model or a classical statistical method describing the behavior of the statistical phenomenon. The methodology works by simulating the data to reach a robust optimal model that represents the statistical phenomenon, and that model can then be used at any time and in any state. We used the Box-Jenkins (ARMAX) approach for comparison. This paper depends on the received power to build a robust model for forecasting, analyzing, and controlling the sod power; the received power comes from …
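
As a hedged sketch of neural-network time-series forecasting of the kind compared here against Box-Jenkins models (not the paper's model or its power data), the example below trains an MLP on lagged values of a synthetic series to predict the next value; the series, lag count, and split are invented.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Synthetic "received power" series: trend + daily seasonality + noise.
t = np.arange(300)
series = 50 + 0.05 * t + 5 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.5, 300)

# Build a supervised dataset: previous `lags` values -> next value.
lags = 24
X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
y = series[lags:]

# Train on the head of the series, evaluate one-step forecasts on the tail.
split = len(X) - 48
model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=3000, random_state=0)
model.fit(X[:split], y[:split])

preds = model.predict(X[split:])
rmse = np.sqrt(np.mean((preds - y[split:]) ** 2))
print(f"one-step-ahead RMSE on held-out tail: {rmse:.3f}")
```

An ARMAX fit to the same lagged series would provide the classical benchmark the paper compares against.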

Publication Date
Sun Jan 01 2023
Journal Name
Journal Of Engineering
Risk Assessment in BOT Contracts using AHP Technique

The process of risk assessment in build-operate-transfer (BOT) projects is very important for identifying and analyzing risks in order to make the appropriate decision in response to them. In this paper, the AHP technique was used to choose the appropriate response to the most prominent risks generated in BOT projects; this includes a comparison between the criteria for each risk as well as the available alternatives, using mathematical methods based on matrices to reach an appropriate response decision for each risk. Ten common risks in BOT contracts, grouped into six main risk headings, are adopted for analysis in this paper. The procedure followed in this paper is the questionnaire method …
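
As a hedged sketch of the AHP matrix arithmetic (not the paper's judgments or risk set), the example below derives priority weights from a pairwise comparison matrix via the principal eigenvector and checks Saaty's consistency ratio; the alternatives and judgment values are invented.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from a pairwise comparison matrix via the
    principal eigenvector, plus the consistency ratio (CR)."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)            # principal eigenvalue index
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                           # normalize to priority weights
    # Saaty's random index values for matrix sizes n = 1..9.
    RI = [0.0, 0.0, 0.58, 0.90, 1.12, 1.24, 1.32, 1.41, 1.45]
    CI = (eigvals[k].real - n) / (n - 1)
    CR = CI / RI[n - 1] if RI[n - 1] > 0 else 0.0
    return w, CR

# Invented pairwise judgments over three response alternatives
# (e.g., avoid / transfer / mitigate) for one risk.
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]

w, CR = ahp_weights(A)
print("weights:", np.round(w, 3), "consistency ratio:", round(CR, 3))
```

A consistency ratio below 0.1 is the usual threshold for accepting the judgments.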

Publication Date
Sat Mar 01 2008
Journal Name
Iraqi Journal Of Physics
Smoothing of Image using adaptive Lowpass Spatial Filtering

Lowpass spatial filters are adopted to match the noise statistics of the degradation, seeking good-quality smoothed images. This study employs different sizes and shapes of smoothing windows. The study shows that using a square-frame window shape gives good-quality smoothing while at the same time preserving a certain level of high-frequency components, in comparison with standard smoothing filters.
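
A minimal sketch of the idea (not the paper's filter): an averaging window whose weights lie only on the square frame of the neighborhood, applied with a standard convolution; the test image and window size are invented.

```python
import numpy as np
from scipy.ndimage import convolve

def square_frame_kernel(size):
    """Averaging kernel whose weights lie only on the window's border
    (a 'square frame'), leaving the interior at zero."""
    k = np.zeros((size, size))
    k[0, :] = k[-1, :] = k[:, 0] = k[:, -1] = 1.0
    return k / k.sum()

rng = np.random.default_rng(2)
image = rng.normal(128, 20, size=(64, 64))   # invented noisy test image

smoothed = convolve(image, square_frame_kernel(5), mode="reflect")
print("noise std before/after:", image.std().round(2), smoothed.std().round(2))
```

Because the center pixel gets zero weight, a frame-shaped window averages noise without over-weighting the local value, which is one way such a shape can retain more high-frequency detail than a solid box filter.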

Publication Date
Fri Sep 09 2022
Journal Name
Research Anthology On Improving Medical Imaging Techniques For Analysis And Intervention
Groupwise Non-Rigid Image Alignment Using Few Parameters

Groupwise non-rigid image alignment is a difficult non-linear optimization problem involving many parameters and often large datasets. Previous methods have explored various metrics and optimization strategies. Good results have been previously achieved with simple metrics, requiring complex optimization, often with many unintuitive parameters that require careful tuning for each dataset. In this chapter, the problem is restructured to use a simpler, iterative optimization algorithm, with very few free parameters. The warps are refined using an iterative Levenberg-Marquardt minimization to the mean, based on updating the locations of a small number of points and incorporating a stiffness constraint. This optimization approach is eff…
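
As a hedged toy sketch of the chapter's core ingredients (not its actual warps, metric, or data), the example below uses SciPy's Levenberg-Marquardt solver to move a few control points on each of several 1-D "shapes" toward the group mean, with a stiffness term penalizing rough point offsets; the shapes, residual design, and stiffness weight are all assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(3)

# Invented data: S jittered copies of a template curve, P control points each.
S, P = 5, 10
template = np.linspace(0.0, 1.0, P) ** 2
shapes = template + rng.normal(0, 0.05, size=(S, P))

def residuals(offsets, shapes, stiffness=0.1):
    """Each corrected shape should match the current group mean; the
    stiffness term penalizes rough (non-smooth) point offsets."""
    offsets = offsets.reshape(shapes.shape)
    corrected = shapes + offsets
    mean = corrected.mean(axis=0)
    data_term = (corrected - mean).ravel()
    smooth_term = stiffness * np.diff(offsets, axis=1).ravel()
    return np.concatenate([data_term, smooth_term])

# Refine the point offsets with Levenberg-Marquardt minimization to the mean.
x0 = np.zeros(S * P)
sol = least_squares(residuals, x0, args=(shapes,), method="lm")
print("residual norm before/after:",
      np.linalg.norm(residuals(x0, shapes)).round(3),
      np.linalg.norm(residuals(sol.x, shapes)).round(3))
```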

Publication Date
Sun Mar 30 2014
Journal Name
Iraqi Journal Of Chemical And Petroleum Engineering
Estimation Liquid Permeability Using Air Permeability Laboratory Data

Permeability data is of major importance and should be handled carefully in all reservoir simulation studies. Its importance increases in mature oil and gas fields due to its sensitivity to the requirements of some specific improved-recovery methods. However, the industry holds a huge store of air permeability measurements against a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation to convert air permeability data, conventionally measured during laboratory core analysis, into liquid permeability. This correlation provides a feasible estimate in cases of data loss and poorly consolidated formations, or in cas…
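
The paper's own correlation is not reproduced here; as a hedged illustration of the general air-to-liquid conversion idea, the sketch below fits the classical Klinkenberg relation k_air = k_L (1 + b / P_m), extrapolating invented air permeability readings to an equivalent liquid permeability at infinite mean pressure.

```python
import numpy as np

# Invented air permeability readings (mD) at several mean pressures (atm).
p_mean = np.array([1.0, 2.0, 4.0, 8.0])
k_air = np.array([14.2, 12.1, 11.0, 10.5])

# Klinkenberg: k_air = k_L + (k_L * b) * (1 / p_mean) is linear in 1/p_mean,
# so a straight-line fit gives the liquid permeability k_L as the intercept.
slope, k_liquid = np.polyfit(1.0 / p_mean, k_air, 1)
b = slope / k_liquid

print(f"estimated liquid permeability: {k_liquid:.2f} mD, "
      f"gas-slippage factor b: {b:.2f} atm")
```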

Publication Date
Sun Oct 01 2023
Journal Name
Indonesian Journal Of Electrical Engineering And Computer Science
Intelligence framework dust forecasting using regression algorithms models

Dust is a common cause of health risks and also a driver of climate change, one of the most threatening problems facing humans. In the recent decade, climate change in Iraq, typified by increased droughts and desertification, has generated numerous environmental issues. This study forecasts dust in five central Iraqi districts using a supervised machine learning framework of five regression algorithms. It was assessed using an Iraqi Meteorological Organization and Seismology (IMOS) dataset. Simulation results show that the gradient boosting regressor (GBR) has a mean square error of 8.345 and a total accuracy ratio of 91.65%. Moreover, the results show that the decision tree (DT), where the mean square error is 8.965, c…
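
As a hedged sketch of the modeling step (not the IMOS data or the study's tuned models), the example below trains a scikit-learn GradientBoostingRegressor on invented meteorological features and reports the mean square error the study uses as its metric.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)

# Invented meteorological features: wind speed, humidity, temperature, pressure.
X = rng.uniform(size=(500, 4))
# Invented dust-level target loosely driven by wind and dryness, plus noise.
y = 30 * X[:, 0] + 15 * (1 - X[:, 1]) + rng.normal(0, 2, 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

gbr = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
print(f"GBR test MSE: {mean_squared_error(y_te, gbr.predict(X_te)):.3f}")
```

Swapping in DecisionTreeRegressor or the other three regressors over the same split would reproduce the kind of head-to-head comparison the study reports.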

Publication Date
Mon Oct 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
Nurse Scheduling Problem Using Hybrid Simulated Annealing Algorithm

The nurse scheduling problem is a combinatorial optimization problem; it is NP-hard and difficult to solve to optimality. In this paper, we propose a hybrid simulated annealing algorithm to solve the nurse scheduling problem, developing the simulated annealing algorithm and the genetic algorithm together. The proposed hybrid simulated annealing algorithm (GS-h) is the best method among those used in this paper because it achieves the minimum average total cost and the maximum numbers of solved, best, and optimal problems. The ratios of optimal solutions are 77% for the proposed algorithm (GS-h) and 28.75% for Si…
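
As a hedged sketch of the simulated annealing half of the hybrid (not the paper's GS-h algorithm, cost model, or instances), the example below anneals a toy nurse roster toward a penalty-free schedule; the problem size, constraints, and cooling schedule are invented.

```python
import math
import random

random.seed(0)
NURSES, DAYS, SHIFTS = 4, 7, 3   # toy problem size

def cost(roster):
    """Penalty: every (day, shift) should be covered exactly once, and a
    nurse should not work more than 5 days in the week."""
    c = 0
    for d in range(DAYS):
        for s in range(SHIFTS):
            cover = sum(1 for n in range(NURSES) if roster[n][d] == s)
            c += abs(cover - 1)
    for n in range(NURSES):
        worked = sum(1 for d in range(DAYS) if roster[n][d] != -1)
        c += max(0, worked - 5)
    return c

def neighbor(roster):
    """Reassign one nurse on one day (shift 0..2, or -1 for a day off)."""
    new = [row[:] for row in roster]
    n, d = random.randrange(NURSES), random.randrange(DAYS)
    new[n][d] = random.choice([-1, 0, 1, 2])
    return new

# Random initial roster, then the classic annealing accept/reject loop.
current = [[random.choice([-1, 0, 1, 2]) for _ in range(DAYS)]
           for _ in range(NURSES)]
T = 5.0
while T > 0.01:
    cand = neighbor(current)
    delta = cost(cand) - cost(current)
    if delta <= 0 or random.random() < math.exp(-delta / T):
        current = cand
    T *= 0.995    # geometric cooling schedule

print("final penalty:", cost(current))
```

A hybrid in the paper's spirit would periodically recombine good rosters with genetic crossover instead of relying on single-move neighbors alone.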

Publication Date
Mon Jan 01 2024
Journal Name
Journal Of Engineering
Face-based Gender Classification Using Deep Learning Model

Gender classification is a critical task in computer vision and holds substantial importance in various domains, including surveillance, marketing, and human-computer interaction. The proposed face gender classification model consists of three main phases. The first phase applies the Viola-Jones algorithm to detect facial images, which includes four steps: 1) Haar-like features, 2) the integral image, 3) AdaBoost learning, and 4) a cascade classifier. In the second phase, four pre-processing operations are employed: cropping, resizing, converting the image from the RGB color space to the LAB color space, and enhancing the images using histogram equalization (HE) and CLAHE. The final phase utilizes transfer lea…
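
As a hedged sketch of the first two phases (not the authors' pipeline), the example below runs OpenCV's bundled pretrained Viola-Jones cascade, then crops, resizes, converts to LAB, and applies CLAHE to the lightness channel; the input path "face.jpg" and the parameter values are assumptions.

```python
import cv2

# Hypothetical input image; the Haar cascade file ships with OpenCV.
image = cv2.imread("face.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Phase 1: Viola-Jones face detection via OpenCV's pretrained cascade.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

# Phase 2: crop, resize, move to LAB, and enhance lightness with CLAHE.
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
for (x, y, w, h) in faces:
    crop = cv2.resize(image[y:y + h, x:x + w], (224, 224))
    lab = cv2.cvtColor(crop, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    enhanced = cv2.merge((clahe.apply(l), a, b))
    face_ready = cv2.cvtColor(enhanced, cv2.COLOR_LAB2BGR)
    # face_ready would then be fed to the transfer-learning classifier.
```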
