Two types of adsorbents, activated carbon and zeolite, were used to treat oily wastewater, and their removal efficiencies were compared. The results showed that activated carbon exhibited better oil-removal properties. The experimental methods employed in this investigation included batch and column studies: the former was used to evaluate the rate and equilibrium of carbon and zeolite adsorption, while the latter was used to determine treatment efficiencies and performance characteristics. An expanded bed adsorber was constructed for the column studies. In this study, the adsorption behavior of vegetable oil (corn oil) onto activated carbon and zeolite was examined as a function of adsorbate concentration, contact time, adsorbent dosage, and the amount of coagulant salt (calcium sulphate) added. The adsorption data were modeled with the Freundlich and Langmuir adsorption isotherms, and it was found that the adsorption process on both activated carbon and zeolite fits the Freundlich isotherm model. The amount of oil adsorbed increased with contact time, but longer mixing did not further increase residual oil removal because the adsorbent surface became covered with oil molecules. It was found that as the adsorbent dosage increased, the percentage of residual oil removed also increased. The effects of adsorbent type and the amount of coagulant salt (calcium sulphate) added on the breakthrough curve were studied in detail in the column studies. Expanded bed behavior was modeled using the Richardson-Zaki correlation between the superficial velocity of the feed stream and the void fraction of the bed at moderate Reynolds numbers.
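The Freundlich fit mentioned above is commonly obtained by linearizing q = K_F · C^(1/n) as log q = log K_F + (1/n) log C and regressing on log-transformed data. A minimal sketch with hypothetical equilibrium values (not the study's actual measurements):

```python
import math

def fit_freundlich(C, q):
    """Fit the linearized Freundlich isotherm log q = log K_F + (1/n) log C
    by ordinary least squares on the log-transformed data."""
    x = [math.log10(c) for c in C]
    y = [math.log10(v) for v in q]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    K_F = 10 ** intercept   # Freundlich capacity constant
    inv_n = slope           # Freundlich intensity exponent 1/n
    return K_F, inv_n

# Hypothetical equilibrium data: C_e (mg/L) and q_e (mg/g)
C_e = [5.0, 10.0, 20.0, 40.0, 80.0]
q_e = [2.1, 3.0, 4.4, 6.3, 9.1]
K_F, inv_n = fit_freundlich(C_e, q_e)
```

A value of 1/n between 0 and 1 indicates favorable adsorption, which is the usual check after such a fit.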
Non-stationarity is a persistent problem in time-series analysis: the desirable properties of statistical regression analysis are lost when non-stationary series are used, and regression on such series can yield the slope of a spurious relationship. A non-stationary series can be made stationary by adding a time variable to the multivariate analysis to remove the general trend, and seasonal dummy variables to remove seasonal effects; by transforming the data to exponential or logarithmic form; or by applying differencing d times, in which case the series is said to be integrated of order d. The theoretical side of the research is organized in parts; the first part presents the research methodology
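The repeated differencing described above (applied d times until the trend is removed) can be sketched as follows; the sample series is illustrative, not from the research:

```python
def difference(series, d=1):
    """Apply d-th order differencing: each pass replaces the series by
    its successive differences, removing a polynomial trend of degree d."""
    out = list(series)
    for _ in range(d):
        out = [b - a for a, b in zip(out, out[1:])]
    return out

# A series with a linear trend becomes constant after one difference,
# so it is integrated of order 1.
trend = [3 + 2 * t for t in range(6)]   # 3, 5, 7, 9, 11, 13
detrended = difference(trend, d=1)      # constant differences of 2
```

If d differences are needed to reach stationarity, the series is integrated of order d, matching the definition in the abstract.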
Pavement crack and pothole identification are important tasks in transportation maintenance and road safety. This study offers a novel technique for automatic asphalt pavement crack and pothole detection based on image processing. Different types of distress (transverse cracks, longitudinal cracks, alligator cracking, and potholes) can be identified with such techniques. The goal of this research is to evaluate road-surface damage by extracting cracks and potholes, categorizing them from images and videos, and comparing the manual and automated methods. The proposed method was tested on 50 images. The results obtained from image processing showed that the proposed method can detect cracks and potholes and identify their severity levels
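As a toy illustration of intensity-based crack extraction (a hypothetical simplification, not the study's actual pipeline), cracks and potholes appear as dark regions against lighter pavement, so a grayscale threshold gives a crude candidate mask and severity ratio:

```python
def crack_mask(gray, threshold=60):
    """Flag pixels darker than the threshold as candidate crack/pothole
    pixels; gray is a 2-D list of 0-255 grayscale intensities."""
    return [[1 if px < threshold else 0 for px in row] for row in gray]

def crack_ratio(mask):
    """Fraction of flagged pixels: a crude indicator of damage severity."""
    total = sum(len(row) for row in mask)
    return sum(sum(row) for row in mask) / total

# Hypothetical 3x3 grayscale patch with a dark diagonal "crack"
patch = [[200, 200, 30],
         [200, 25, 200],
         [40, 200, 200]]
severity = crack_ratio(crack_mask(patch))
```

Real systems add noise filtering and shape analysis (elongated regions suggest cracks, compact ones potholes), but the threshold step above is the usual starting point.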
In this paper, we deal with games with fuzzy payoffs when there is uncertainty in the data. We use the trapezoidal membership function to transform the data into fuzzy numbers and utilize three different ranking-function algorithms. We then compare these three ranking algorithms on trapezoidal fuzzy numbers so that the decision maker obtains the best gains.
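One standard ranking function for a trapezoidal fuzzy number (a, b, c, d) is the centroid of its membership trapezoid; the following sketch shows that approach only, not necessarily one of the paper's three algorithms:

```python
def centroid_rank(a, b, c, d):
    """Centroid-based ranking value of the trapezoidal fuzzy number
    (a, b, c, d) with a <= b <= c <= d; a larger rank means a larger
    fuzzy payoff."""
    # x-coordinate of the centroid of the trapezoid with vertices
    # (a, 0), (b, 1), (c, 1), (d, 0)
    num = (c ** 2 + d ** 2 + c * d) - (a ** 2 + b ** 2 + a * b)
    den = 3.0 * ((c + d) - (a + b))
    if den == 0:          # degenerate: crisp number with a == b == c == d
        return float(a)
    return num / den

# Comparing two hypothetical fuzzy payoffs
better = centroid_rank(4, 5, 6, 8) > centroid_rank(1, 2, 3, 4)
```

Different ranking functions can order the same pair of fuzzy numbers differently, which is why the paper compares three of them.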
The present paper describes and analyses three proposed cogeneration plants, a back-pressure steam-turbine system, a gas-turbine system, and a diesel-engine system, together with the present Dura refinery plant. Selected actual operating data are employed for the analysis. The same electrical and thermal product outputs are considered for all systems to facilitate comparison. The theoretical analysis was carried out according to the first and second laws of thermodynamics. The results demonstrate that exergy analysis is a useful tool in the performance analysis of cogeneration systems and permits meaningful comparisons of different cogeneration systems based on their merits; the results also showed that the back-pressure steam-turbine system is more efficient than the other proposed systems
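The first-law and second-law comparison described above can be sketched as follows, valuing delivered heat by its Carnot factor; the numbers are illustrative, not Dura refinery data:

```python
def cogen_efficiencies(W_elec, Q_heat, T_heat, T0, fuel_energy):
    """First-law (energy) and second-law (exergy) efficiencies of a
    cogeneration plant. Heat exergy is Q * (1 - T0 / T_heat), the Carnot
    factor at the heat-delivery temperature T_heat (kelvin), with dead
    state T0. Fuel exergy is approximated by its energy content, a
    common simplification."""
    eta_I = (W_elec + Q_heat) / fuel_energy
    heat_exergy = Q_heat * (1.0 - T0 / T_heat)
    eta_II = (W_elec + heat_exergy) / fuel_energy
    return eta_I, eta_II

# Hypothetical plant: 30 MW electricity, 50 MW process heat at 500 K,
# ambient 300 K, 100 MW fuel input
eta_I, eta_II = cogen_efficiencies(30.0, 50.0, 500.0, 300.0, 100.0)
```

The gap between the two figures shows why exergy analysis discriminates between plants that a pure energy balance would rank as equal: heat counts fully toward the first-law efficiency but only partially toward the second-law one.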
This paper is an attempt to foster students' creative performance in essay writing through the use of tips. Prewriting tips are a series of strategies (outlining in essay writing). These tips enable Iraqi students to be aware of the process of writing as a guideline to what is expected of them as good essay writers. The study aims at: 1. Finding out whether college students are aware of using these tips to foster creative performance in their essay writing. 2. Determining to what extent the application of these tips can contribute to developing students' essay writing. To achieve these aims, a questionnaire and a test have been constructed and administered to two parallel
Business organizations have faced many challenges in recent times, the most important of which is information technology, because it is widely spread and easy to use. Its use has led to an unprecedented increase in the amount of data that business organizations deal with. The amount of data available through the internet is a problem for which many parties seek solutions: why is it available there in such huge, random quantities? Many forecasts indicated that by 2017 the number of devices connected to the internet would be an estimated three times the population of the Earth, and in 2015 more than one and a half billion gigabytes of data were transferred every minute globally. Thus, so-called data mining emerged as a
The experiment was conducted to test the sorting and grading of agricultural crops using image analysis technology. A locally factory-made cube-shaped studio with dimensions 50 * 75 * 75 and a camera with a coupling-device sensor were used. The studio was equipped with triple lighting (red - green - blue), and photos were taken in the studio to study the external characteristics of the fruits (damage, quality, and maturity); image processing technology and artificial neural networks were used. The artificial neural network was used to predict damage, and the regression value was 0.92062; the regression for quality was 0.97981 and the regression for maturity was 0.98654, by means of a regression scheme using the network. The Marr algorithm
The artificial neural network methodology is an important new approach that builds models for analysis, data evaluation, forecasting, and control without depending on a prior model or a classical statistical method to describe the behavior of the statistical phenomenon. The methodology works by simulating the data to reach a robust optimal model that represents the statistical phenomenon, and the model can then be used at any time and in any state. The Box-Jenkins (ARMAX) approach was used for comparison. This paper relies on received-power data to build a robust model for forecasting, analyzing, and controlling the said power; the received power comes from
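For the Box-Jenkins side of such a comparison, the simplest member of the family is an AR(1) model x_t = phi * x_{t-1} + e_t, whose coefficient has a closed-form least-squares estimate; a minimal sketch with a hypothetical series, not the paper's power data:

```python
def fit_ar1(x):
    """Least-squares estimate of phi in the AR(1) model
    x_t = phi * x_{t-1} + e_t (no intercept)."""
    num = sum(a * b for a, b in zip(x, x[1:]))   # sum of x_{t-1} * x_t
    den = sum(a * a for a in x[:-1])             # sum of x_{t-1}^2
    return num / den

# Hypothetical decaying series generated with phi = 0.5
series = [8.0, 4.0, 2.0, 1.0]
phi_hat = fit_ar1(series)
```

A fitted classical model like this provides the forecasting baseline against which a neural-network model's accuracy can be judged.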
The process of risk assessment in build-operate-transfer (BOT) projects is very important for identifying and analyzing risks in order to make the appropriate decisions to respond to them. In this paper, the AHP technique was used to make the appropriate response decision for the most prominent risks generated in BOT projects; this involves a comparison between the criteria for each risk as well as the available alternatives, using mathematical methods based on matrices to reach an appropriate response decision for each risk. Ten common risks in BOT contracts, grouped into six main risk headings, are adopted for analysis in this paper. The procedures followed in this paper are the questionnaire method
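The matrix step of AHP derives priority weights from a pairwise comparison matrix; one common method is the geometric mean of each row, sketched here on a hypothetical 2-criterion matrix, not the paper's survey data:

```python
import math

def ahp_weights(M):
    """Priority weights from an AHP pairwise comparison matrix via the
    geometric-mean (logarithmic least squares) method: take the geometric
    mean of each row, then normalize the means to sum to 1."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in M]
    s = sum(gm)
    return [g / s for g in gm]

# Hypothetical matrix: criterion 1 judged twice as important as criterion 2
M = [[1.0, 2.0],
     [0.5, 1.0]]
weights = ahp_weights(M)
```

The same routine ranks response alternatives under each criterion; combining the two levels of weights gives the final response decision for each risk.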
Groupwise non-rigid image alignment is a difficult non-linear optimization problem involving many parameters and often large datasets. Previous methods have explored various metrics and optimization strategies. Good results have been previously achieved with simple metrics, requiring complex optimization, often with many unintuitive parameters that require careful tuning for each dataset. In this chapter, the problem is restructured to use a simpler, iterative optimization algorithm, with very few free parameters. The warps are refined using an iterative Levenberg-Marquardt minimization to the mean, based on updating the locations of a small number of points and incorporating a stiffness constraint. This optimization approach is eff
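The Levenberg-Marquardt update mentioned above interpolates between gradient descent and Gauss-Newton via a damping factor. The following is a generic one-parameter sketch of the scheme, not the chapter's groupwise objective or stiffness-constrained warp update:

```python
def lm_fit(f, df, xs, ys, a0, iters=50):
    """Minimal one-parameter Levenberg-Marquardt least-squares fit of
    y = f(a, x). Large damping lam gives small, gradient-descent-like
    steps; small lam approaches a full Gauss-Newton step."""
    def sse(a):
        return sum((f(a, x) - y) ** 2 for x, y in zip(xs, ys))
    a, lam = a0, 1e-3
    for _ in range(iters):
        J = [df(a, x) for x in xs]                  # Jacobian entries
        r = [f(a, x) - y for x, y in zip(xs, ys)]   # residuals
        g = sum(j * e for j, e in zip(J, r))        # gradient J^T r
        H = sum(j * j for j in J)                   # Gauss-Newton J^T J
        step = -g / (H * (1.0 + lam))               # damped update
        if sse(a + step) < sse(a):
            a += step
            lam = max(lam / 10.0, 1e-12)            # accept: trust model more
        else:
            lam *= 10.0                             # reject: damp harder
    return a

# Fit the slope of a hypothetical linear model y = a * x
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0 * x for x in xs]
a_hat = lm_fit(lambda a, x: a * x, lambda a, x: x, xs, ys, a0=1.0)
```

In the groupwise setting the residuals are image differences to the mean and the parameters are point locations, but the accept/reject damping loop has the same shape.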