The application of ultrafiltration (UF) and nanofiltration (NF) processes to the treatment of raw produced water has been investigated in the present study. Experiments on both ultrafiltration and nanofiltration were performed in a laboratory unit operated in a cross-flow pattern. Several types of hollow fiber membranes were utilized in this study, namely a polyvinyl chloride (PVC) UF membrane, two different polyethersulfone (PES) NF membranes, and a polyphenylsulfone (PPSU) NF membrane. It was found that turbidity removal from the treated water exceeded 95 % with both the UF and NF membranes. The chemical oxygen demand, COD (160 mg/l), and oil content (26.8 mg/l) after treatment were within the allowable limits set by the World Health Organization (WHO) water quality standards. The final concentrations of SO4^2- (110 mg/l) and NO3^- (48.4 mg/l) in the produced water after treatment agreed with the permissible WHO limits, whereas Cl^- (8900 mg/l) was not within the allowable limits. Finally, with the PVC, PES, and PPSU hollow fiber membranes, this method proved insufficient to remove the salinity of the produced water.
Simulation experiments are a problem-solving tool in many fields: the process of designing a model of a real system in order to follow and identify its behavior through particular models and formulas, written in a repetitive software routine with a number of iterations. The aim of this study is to build a model that deals with behavior suffering from heteroskedasticity by studying the APGARCH & NAGARCH models using Gaussian and non-Gaussian distributions for different sample sizes (500, 1000, 1500, 2000) through the stages of time series analysis (identification, estimation, diagnostic checking, and prediction). The data were generated using the estimations of the parameters resulting f
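As a minimal sketch of the iterative simulation procedure this abstract describes, the following generates a plain GARCH(1,1) series with Gaussian innovations. The APGARCH and NAGARCH models studied in the paper add further parameters; the function name, parameter values, and seed here are hypothetical, for illustration only.

```python
import math
import random

def simulate_garch11(n, omega=0.1, alpha=0.1, beta=0.8, seed=7):
    """Simulate a GARCH(1,1) series:
    sigma2_t = omega + alpha * e_{t-1}**2 + beta * sigma2_{t-1}."""
    rng = random.Random(seed)
    sigma2 = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    series = []
    for _ in range(n):
        e = math.sqrt(sigma2) * rng.gauss(0.0, 1.0)  # Gaussian innovation
        series.append(e)
        sigma2 = omega + alpha * e * e + beta * sigma2
    return series

returns = simulate_garch11(2000)  # one of the sample sizes used in the study
```

Repeating such a run many times, with parameters re-estimated at each iteration, is what the "repetitive software routine with a number of iterations" refers to.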
Pavement crack and pothole identification are important tasks in transportation maintenance and road safety. This study offers a novel technique for automatic asphalt pavement crack and pothole detection based on image processing. Different types of cracks (transverse, longitudinal, alligator-type, and potholes) can be identified with such techniques. The goal of this research is to evaluate road surface damage by extracting cracks and potholes, categorizing them from images and videos, and comparing the manual and automated methods. The proposed method was tested on 50 images. The results obtained from image processing showed that the proposed method can detect cracks and potholes and identify their severity levels wit
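The paper's detection pipeline is not detailed in the abstract; a toy sketch of the underlying idea, that cracks and potholes appear darker than the surrounding pavement and can be thresholded and graded by area, might look like the following. The threshold, severity cut-offs, and synthetic image are all assumptions, not the authors' method.

```python
import numpy as np

def crack_severity(gray, dark_thresh=60):
    """Toy severity index: fraction of dark pixels after a fixed threshold
    (cracks and potholes usually appear darker than intact pavement)."""
    ratio = (gray < dark_thresh).mean()
    if ratio < 0.01:
        return "low"
    if ratio < 0.05:
        return "medium"
    return "high"

# Synthetic 100 x 100 "pavement" image with a 3-pixel-wide dark vertical crack
img = np.full((100, 100), 180, dtype=np.uint8)
img[:, 48:51] = 30
severity = crack_severity(img)
```

A real system would add noise filtering, morphological cleaning, and shape analysis to separate transverse, longitudinal, and alligator cracking from potholes.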
Different solvents (light naphtha, n-heptane, and n-hexane) were used to treat Iraqi atmospheric oil residue by the deasphalting process. Oil residue from the Al-Dura refinery with a specific gravity of 0.9705, API 14.9, and 0.5 wt. % sulfur content was used. Deasphalted oil (DAO) was examined on a laboratory scale using the solvents under different operating conditions (temperature, solvent concentration, solvent-to-oil ratio, and duration time). This study investigates the effects of these parameters on asphaltene yield. The results show that an increase in temperature for all solvents increases the extracted asphaltene yield. The highest reduction in asphaltene content is obtained with hexane solvent at operating conditions of (90 °C, 4/1
The adsorption isotherms and kinetic uptakes of carbon dioxide (CO2) on fabricated electrospun nonwoven activated carbon nanofiber sheets were investigated at two different temperatures, 308 K and 343 K, over a pressure range of 1 to 7 bar. The activated carbon nanofibers, based on a polymer (PAN) precursor, were fabricated via an electrospinning technique followed by thermal treatment to obtain the carbonaceous nanofibers. The obtained CO2 adsorption isotherm data were fitted to various models, including Langmuir, Freundlich, and Temkin. Based on correlation coefficients, the Langmuir isotherm model presented the best fit with the experimental CO2 adsorption isotherm data. Raising the equ
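The Langmuir fit mentioned above can be sketched with the standard linearised form P/q = P/q_m + 1/(K·q_m): plotting P/q against P gives q_m from the slope and K from the slope/intercept ratio. The pressure points below match the paper's 1 to 7 bar range, but the uptake data and parameter values are synthetic, for illustration only.

```python
import numpy as np

# Hypothetical equilibrium data: pressure P (bar) and uptake q (mmol/g)
P = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0])
q_m_true, K_true = 3.0, 0.5                     # assumed Langmuir parameters
q = q_m_true * K_true * P / (1.0 + K_true * P)  # ideal Langmuir uptake

# Linearised Langmuir: P/q = P/q_m + 1/(K*q_m) -> straight line in P
slope, intercept = np.polyfit(P, P / q, 1)
q_m_fit = 1.0 / slope          # monolayer capacity
K_fit = slope / intercept      # affinity constant
```

With real data, the correlation coefficient of this line is what the abstract compares against the Freundlich and Temkin fits.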
In this paper, we deal with the problem of games with fuzzy payoffs when there is uncertainty in the data. We use the trapezoidal membership function to transform the data into fuzzy numbers and utilize three different ranking function algorithms. We then compare these three ranking algorithms using trapezoidal fuzzy numbers so that the decision maker can obtain the best gains.
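The abstract does not name its three ranking functions; one common choice for trapezoidal fuzzy numbers, shown here as an assumed example only, is the centroid ranking index, where a larger centroid means a "larger" fuzzy payoff.

```python
def centroid_rank(a, b, c, d):
    """Centroid (x-coordinate) of a trapezoidal fuzzy number (a, b, c, d)
    with support [a, d] and core [b, c]; requires a non-degenerate trapezoid."""
    return (d * d + c * c + c * d - a * a - b * b - a * b) / (3.0 * (c + d - a - b))

# Hypothetical fuzzy payoffs for two strategies
A = (1, 2, 3, 4)
B = (2, 3, 4, 5)
best = "B" if centroid_rank(*B) > centroid_rank(*A) else "A"  # B ranks higher here
```

Replacing `centroid_rank` with other ranking functions and comparing the resulting strategy choices is the kind of comparison the paper carries out.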
The present paper describes and analyses three proposed cogeneration plants, comprising a back-pressure steam-turbine system, a gas-turbine system, and a diesel-engine system, together with the present Dura refinery plant. Selected actual operating data are employed for the analysis. The same amount of electrical and thermal product output is considered for all systems to facilitate comparisons. The theoretical analysis was carried out according to the first and second laws of thermodynamics. The results demonstrate that exergy analysis is a useful tool in the performance analysis of cogeneration systems and permits meaningful comparisons of different cogeneration systems based on their merits; the results also showed that the back-pressure steam-turbine is more efficient than the other pro
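The distinction between the first- and second-law figures of merit used in such comparisons can be sketched numerically. Every number below is hypothetical (not taken from the paper's operating data); the point is that the energy utilisation factor counts heat at face value, while the exergy efficiency discounts it by a Carnot factor.

```python
# Hypothetical cogeneration plant figures (MW and K), for illustration only
W_elec = 10.0       # electrical output
Q_process = 15.0    # process heat output
F_fuel = 35.0       # fuel energy input
T0, T_steam = 298.0, 450.0   # dead-state and process-steam temperatures

# First-law (energy) utilisation factor: heat and work weighted equally
eta_I = (W_elec + Q_process) / F_fuel

# Second-law (exergy) efficiency: heat discounted by its Carnot factor
carnot = 1.0 - T0 / T_steam
fuel_exergy_factor = 1.04     # assumed fuel exergy-to-energy ratio
eta_II = (W_elec + Q_process * carnot) / (F_fuel * fuel_exergy_factor)
```

Because eta_II weights the product streams by their thermodynamic quality, it is the basis on which the paper ranks the back-pressure steam-turbine against the alternatives.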
This paper is an attempt to foster the creative performance of students in essay writing through the use of tips. Prewriting tips are a series of strategies (outlining in essay writing). These tips enable the Iraqi students to be aware of the process of writing as a guideline to what is expected from them as good essay writers. The study aims at: 1. Finding out whether college students are aware of using these tips in fostering creative performance in their essay writing. 2. Determining to what extent the application of these tips can contribute to developing students' essay writing. To achieve these aims, a questionnaire and a test have been conducted and distributed to two parallel th
Business organizations have faced many challenges in recent times, the most important of which is information technology, because it is widely spread and easy to use. Its use has led to an increase in the amount of data that business organizations deal with in an unprecedented manner. The amount of data available through the internet is a problem that many parties seek to find solutions for: why is it available there in such a huge, random amount? Many forecasts revealed that in 2017 the number of devices connected to the internet would be an estimated three times the population of the Earth, and that in 2015 more than one and a half billion gigabytes of data were transferred every minute globally. Thus, the so-called data mining emerged as a
The experiment was conducted to test the sorting and grading of agricultural crops using image analysis technology. A locally factory-made, cube-shaped studio with dimensions 50 * 75 * 75 and a camera with a charge-coupled device sensor were used. The studio was equipped with triple lighting (red - green - blue), and photos were taken in the studio to study the external characteristics of the fruits (damage, quality, and maturity) using image processing technology and artificial neural networks. The artificial neural network was used to predict damage; the regression value was 0.92062, the regression for quality was 0.97981, and the regression for maturity was 0.98654, by means of a regression scheme using the network. Marr algorithm
The Artificial Neural Network methodology is a very important & new subject that builds models for analyzing, evaluating, forecasting & controlling data without depending on an old model or a classic statistical method describing the behavior of a statistical phenomenon. The methodology works by simulating the data to reach a robust optimum model that represents the statistical phenomenon, & we can use the model at any time & in any state. We used the Box-Jenkins (ARMAX) approach for comparison. This paper depends on the received power to build a robust model for forecasting, analyzing & controlling the said power; the received power comes from