Ex-situ bioremediation of soil contaminated with the herbicide 2,4-D was studied using a slurry bioreactor operated under aerobic conditions. The performance of the slurry bioreactor was tested for three soil types (sand, sandy loam and clay) contaminated with different concentrations of 2,4-D: 200, 300 and 500 mg/kg soil. Sewage sludge, available in large quantities at wastewater treatment plants, was used as an inexpensive source of microorganisms. All biodegradation experiments showed a significant decrease in 2,4-D concentration in the tested soils. The degradation efficiency in the slurry bioreactor decreased as the initial concentration of 2,4-D in the soil increased: 100 % removal was achieved at an initial concentration of 200 mg 2,4-D/kg of sandy soil after 12 days, and 92 % at 500 mg 2,4-D/kg of sandy soil after 14 days. Clay soil showed the lowest removal efficiency among the three soils: 82 % at an initial concentration of 200 mg 2,4-D/kg clay soil after 12 days and 72 % for 500 mg 2,4-D/kg clay soil after 14 days. Abiotic experiments were performed to investigate the desorption of the contaminant from the soil to the liquid phase for the three soils. In the abiotic reactor, the desorption rates for sand and sandy loam soils were nearly the same, varying between 0.102 and 0.135 day⁻¹ over the different initial concentrations of 2,4-D, while for clay soil the desorption rate varied between 0.031 and 0.042 day⁻¹. The lower desorption rate in clay soil is attributed to its characteristics (fine texture, high organic matter content and high cation exchange capacity compared with the other soils), which may retain 2,4-D in the organic matter and clay minerals.
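The desorption rates quoted above are first-order constants in day⁻¹. As a minimal illustration (not the authors' code; the sampling times and concentrations below are made up), such a constant can be recovered from time-series data with a log-linear fit:

```python
import numpy as np

# Hypothetical abiotic-reactor data: 2,4-D remaining in the soil over time.
days = np.array([0.0, 2, 4, 6, 8, 10, 12])
conc = np.array([200.0, 163, 133, 108, 88, 72, 59])  # mg 2,4-D / kg soil (illustrative)

# First-order desorption: C(t) = C0 * exp(-k t), so ln(C/C0) = -k t
# and k is minus the slope of ln(C/C0) against time.
slope, _ = np.polyfit(days, np.log(conc / conc[0]), 1)
print(f"desorption rate constant k ≈ {-slope:.3f} per day")  # ≈ 0.102 day⁻¹ here
```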
Regression analysis is a cornerstone of statistics, and it most often relies on the ordinary least squares (OLS) method. As is well known, this method requires several conditions to hold for it to operate accurately; otherwise its results can be unreliable, and the failure of certain conditions can make it impossible to complete the analysis with this method. Among those conditions is the absence of multicollinearity, and we detect that problem among the independent variables using the Farrar–Glauber test. In addition, the data must satisfy the linearity requirement; since this last condition did not hold, we resorted to the …
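As a minimal sketch of how the multicollinearity stage of the Farrar–Glauber test can be carried out (assuming its standard chi-square form, χ² = -[n - 1 - (2p + 5)/6] ln|R|, where R is the correlation matrix of the p regressors; the data below are synthetic):

```python
import numpy as np
from scipy import stats

def farrar_glauber_chi2(X):
    """Farrar-Glauber chi-square test for the presence of multicollinearity.
    X is an (n, p) matrix of independent variables."""
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)                 # correlation matrix of regressors
    chi2 = -(n - 1 - (2 * p + 5) / 6.0) * np.log(np.linalg.det(R))
    df = p * (p - 1) // 2
    return chi2, df, stats.chi2.sf(chi2, df)         # statistic, dof, p-value

rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
X = np.column_stack([x1, x1 + 0.05 * rng.normal(size=100), rng.normal(size=100)])
chi2, df, p = farrar_glauber_chi2(X)
print(f"chi2 = {chi2:.1f}, df = {df}, p = {p:.3g}")  # a tiny p-value flags collinearity
```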
Porosity plays an essential role in petroleum engineering. It controls fluid storage in aquifers, while the connectivity of the pore structure controls fluid flow through reservoir formations. To quantify the relationships between porosity, storage, transport and rock properties, however, the pore structure must be measured and quantitatively described. Estimating porosity from digital images by image processing is essential for reservoir rock analysis, since it concisely describes the sample's 2D porosity. The standard procedure uses binarization, in which a pixel-value threshold converts color and grayscale images to binary images. The idea is to treat the blue regions entirely as pores and transform them to white in r…
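A minimal sketch of the binarization step (illustrative only; the file name is hypothetical, and Otsu's method stands in for whatever threshold the authors chose):

```python
import numpy as np
from skimage import io, color
from skimage.filters import threshold_otsu

# Hypothetical thin-section image in which pores appear as dark (blue-dyed) regions.
img = io.imread("thin_section.png")
gray = color.rgb2gray(img[..., :3])   # drop alpha if present; grayscale in [0, 1]

t = threshold_otsu(gray)              # automatic pixel-value threshold
pores = gray < t                      # binary image: True where pore

porosity = pores.mean()               # pore pixels / total pixels
print(f"2D image porosity ≈ {porosity:.3f}")
```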
In the present work, oxidative desulfurization was examined in a batch system for model fuels with 2250 ppm sulfur content, using air as the oxidant and a ZnO/AC composite prepared by a thermal co-precipitation method. Different factors were studied: composite loading (1, 1.5 and 2.5 g), temperature (25 °C, 30 °C and 40 °C) and reaction time (30, 45 and 60 minutes). The optimum conditions were obtained using a Taguchi experimental design for the oxidative desulfurization of model fuel; the highest sulfur removal was about 33 % at the optimum conditions. The kinetics and the effect of internal mass transfer were studied for the oxidative desulfurization of model fuel, and an empirical kinetic model was derived for the model fuels.
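For reference, assuming sulfur removal is defined in the usual way, the quoted figure of about 33 % corresponds to roughly 1508 ppm sulfur remaining from the initial 2250 ppm:

$$\text{Removal}(\%) = \frac{C_0 - C_t}{C_0}\times 100, \qquad C_t = 2250\,(1 - 0.33) \approx 1508~\text{ppm}$$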
This review investigates the use and influence of chatbots and ChatGPT as tools for scientific academic writing. An initial collection of 150 articles was gathered from academic databases and systematically refined to 30 studies focused on the use of ChatGPT and chatbot technology in academic writing contexts. The reviewed literature covers chatbots and ChatGPT in writing enhancement, support for student learning in higher education institutions, scientific and medical writing, and the evolution of research and academic publishing. The review finds these tools helpful, with their greatest advantages in areas such as structuring writing, gram…
In this work, we first construct Hermite wavelets on the interval [0,1) together with their product. The 2^k M × 2^k M operational matrix of integration is derived and used to solve nonlinear variational problems by reducing them to a system of algebraic equations with the aid of the direct method. Finally, some examples are given to illustrate the efficiency and performance of the presented method.
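In the standard notation for wavelet operational-matrix methods (a sketch of the general relation, not the paper's full derivation), the reduction works by expanding the unknown in the Hermite-wavelet basis vector $\Psi(t)$ of dimension $2^k M$ and replacing integration by a matrix product:

$$f(t) \approx C^{T}\Psi(t), \qquad \int_{0}^{t} \Psi(s)\,ds \approx P\,\Psi(t),$$

where $P$ is the $2^k M \times 2^k M$ operational matrix of integration. Substituting these expansions into the functional turns the variational problem into algebraic equations for the coefficient vector $C$.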
Samples of compact magnesia and alumina were evaporated using a CO2 laser. The processed powders were characterized by both scanning and transmission electron microscopy. The results indicated that the particle sizes of both powders were greatly reduced, to 0.003 nm for MgO and 0.07 nm for Al2O3, with an increase in shape sphericity.
Steganography is a technique for concealing secret data within other everyday files of the same or a different type. Hiding data has become essential to digital information security. This work aims to design a stego method that can effectively hide a message inside the frames of a video file. A video steganography model is proposed by training a model to hide a video (or images) within another video using convolutional neural networks (CNNs). Using a CNN in this approach achieves two main goals of any steganographic method: first, increased security (difficulty of being observed and broken by steganalysis programs), which was achieved in this work because the weights and architecture are randomized. Thus,
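A minimal sketch of the hiding half of such a model (illustrative only; the layer sizes are not the paper's architecture): the cover and secret frames are concatenated along the channel axis, and a small CNN emits a stego frame of the same shape as the cover.

```python
import torch
import torch.nn as nn

class HideNet(nn.Module):
    """Toy hiding network: maps (cover, secret) RGB frames to a stego frame."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 3, 3, padding=1), nn.Sigmoid(),  # stego pixels in [0, 1]
        )

    def forward(self, cover, secret):
        return self.net(torch.cat([cover, secret], dim=1))

hide = HideNet()
cover = torch.rand(1, 3, 128, 128)    # one cover video frame (dummy data)
secret = torch.rand(1, 3, 128, 128)   # one secret frame to embed
stego = hide(cover, secret)
print(stego.shape)                    # torch.Size([1, 3, 128, 128])
# Training would minimise ||stego - cover|| plus a paired reveal network's
# reconstruction error, keeping the stego frame visually close to the cover.
```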
An oil spill is a leakage from pipelines, vessels, oil rigs, or tankers that releases petroleum products into the marine environment or onto land, whether occurring naturally or through human action, and it results in severe damage and financial loss. Satellite imagery is one of the powerful tools currently used to capture vital information from the Earth's surface, but the complexity and vast amount of data make it challenging and time-consuming for humans to process. With the advancement of deep learning techniques, however, these processes are now automated to extract vital information from real-time satellite images. This paper applied three deep-learning algorithms to satellite image classification
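Since the excerpt is cut off before naming the three algorithms, the following is only a generic, hedged sketch of one deep-learning classifier for satellite patches (a pretrained ResNet-18 with a replaced head; the two-class setup, e.g. spill vs. no spill, is an assumption):

```python
import torch
import torch.nn as nn
from torchvision import models

# Pretrained backbone; swap the final layer for a 2-class task (assumed labels).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)

patch = torch.rand(1, 3, 224, 224)    # one normalised satellite patch (dummy data)
logits = model(patch)
print(logits.softmax(dim=1))          # class probabilities
```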
Denoising a natural image corrupted by Gaussian noise is a classic problem in signal and image processing. Much work has been done in the field of wavelet thresholding, but most of it has focused on statistical modeling of wavelet coefficients and the optimal choice of thresholds. This paper describes a new method for suppressing noise in images by fusing the stationary wavelet denoising technique with an adaptive Wiener filter. The Wiener filter is applied to the image reconstructed from the approximation coefficients only, while the thresholding technique is applied to the detail coefficients of the transform; the final denoised image is then obtained by combining the two results. The proposed method was applied by usin…
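A minimal sketch of the fusion idea (assumed details: db4 wavelet, one stationary-transform level, universal soft threshold; these are not the paper's stated parameters):

```python
import numpy as np
import pywt
from scipy.signal import wiener

def denoise(img, wavelet="db4"):
    # One-level 2D stationary wavelet transform.
    (cA, (cH, cV, cD)), = pywt.swt2(img, wavelet, level=1)

    # Adaptive Wiener filter on the approximation (low-frequency) part.
    cA_f = wiener(cA, mysize=5)

    # Soft-threshold the detail (high-frequency) coefficients.
    sigma = np.median(np.abs(cD)) / 0.6745           # noise estimate from HH band
    thr = sigma * np.sqrt(2 * np.log(img.size))      # universal threshold
    details = tuple(pywt.threshold(c, thr, mode="soft") for c in (cH, cV, cD))

    # Combine the two results via the inverse transform.
    return pywt.iswt2([(cA_f, details)], wavelet)

noisy = np.random.rand(256, 256)      # stand-in for a noisy grayscale image
clean = denoise(noisy)
```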