Ex-situ bioremediation of 2,4-D herbicide-contaminated soil was studied using a slurry bioreactor operated under aerobic conditions. The performance of the slurry bioreactor was tested for three types of soil (sand, sandy loam and clay) contaminated with different concentrations of 2,4-D (200, 300 and 500 mg/kg soil). Sewage sludge, which is available in large quantities in wastewater treatment plants, was used as an inexpensive source of microorganisms. All biodegradation experiments showed a significant decrease in 2,4-D concentration in the tested soils. The degradation efficiency in the slurry bioreactor decreased as the initial concentration of 2,4-D in the soils increased. Complete (100 %) removal was achieved at an initial concentration of 200 mg 2,4-D/kg of sandy soil after 12 days, and 92 % removal at 500 mg 2,4-D/kg of sandy soil after 14 days. Clay soil showed the lowest removal efficiency among the three soils: 82 % at an initial concentration of 200 mg 2,4-D/kg clay soil after 12 days and 72 % for 500 mg 2,4-D/kg clay soil after 14 days. Abiotic experiments were performed to investigate the desorption efficiency of the contaminant from soil to the liquid phase for the three soils. In the abiotic reactor, the desorption rates for sand and sandy loam soils were nearly the same, varying between 0.102 and 0.135 day-1 at the different initial concentrations of 2,4-D, while for clay soil the desorption rate varied between 0.042 and 0.031 day-1. The lower desorption rate in clay soil is attributed to its characteristics (fine texture, high organic matter content and high cation exchange capacity compared with the other soils), which may retain 2,4-D in the organic matter and the clay minerals.
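Because the desorption rates above are reported as rate constants in day-1, a convenient way to reproduce such a value from time-series data is a log-linear fit of the remaining sorbed fraction. The sketch below assumes a simple first-order desorption model and uses placeholder measurements; it is not the fitting procedure or the data from the study.

```python
import numpy as np

# Hypothetical sorbed-fraction measurements of 2,4-D in sandy soil
# (illustrative values only, not data from the study).
t = np.array([0, 2, 4, 6, 8, 10, 12], dtype=float)            # days
sorbed_fraction = np.array([1.00, 0.78, 0.61, 0.48, 0.37, 0.29, 0.23])

# First-order desorption: S(t) = S0 * exp(-k t)  =>  ln S = ln S0 - k t,
# so k is the negative slope of a straight-line fit of ln S against t.
slope, _ = np.polyfit(t, np.log(sorbed_fraction), 1)
k = -slope
print(f"Estimated first-order desorption rate: {k:.3f} day-1")
```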
This research applies a number of statistical methods, namely the kernel method, watershed, histogram, and cubic spline, to improve the contrast of digital images. The results, evaluated using the RMSE and NCC metrics, show that the cubic spline method gives the most accurate results compared with the other statistical methods.
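RMSE and NCC are standard full-reference quality measures; a minimal sketch of their usual definitions is given below. The images here are random placeholders, and the exact formulations used in the paper may differ.

```python
import numpy as np

def rmse(a: np.ndarray, b: np.ndarray) -> float:
    """Root mean square error between two images of equal shape."""
    a = a.astype(np.float64)
    b = b.astype(np.float64)
    return float(np.sqrt(np.mean((a - b) ** 2)))

def ncc(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized cross-correlation between two images of equal shape."""
    a = a.astype(np.float64).ravel()
    b = b.astype(np.float64).ravel()
    a -= a.mean()
    b -= b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy comparison of an "original" image and a lightly perturbed "enhanced" one.
rng = np.random.default_rng(0)
original = rng.integers(0, 256, size=(64, 64))
enhanced = np.clip(original + rng.normal(0, 5, size=(64, 64)), 0, 255)
print(f"RMSE = {rmse(original, enhanced):.2f}, NCC = {ncc(original, enhanced):.4f}")
```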
Due to the continuing demand for larger bandwidth, optical transport is becoming common in the access network. Using optical fiber technologies, the communications infrastructure becomes powerful, providing very high speeds for transferring large volumes of data. The Passive Optical Network, which applies Wavelength Division Multiplexing (WDM), is currently widely used in existing telecommunications infrastructures and is expected to play an important role in the future Internet, supporting a large diversity of services and next-generation networks. This paper presents the design of a WDM-PON network and the simulation and analysis of transmission parameters in the Optisystem 7.0 environment for bidirectional traffic. The simulation shows the behavior of optical
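Optisystem-style link analyses typically report signal quality as a Q-factor and a bit error rate. As a general illustration not tied to the paper's specific results, the standard Gaussian-noise relation between the two can be computed as follows.

```python
import math

def ber_from_q(q: float) -> float:
    """Bit error rate from Q-factor under the usual Gaussian-noise assumption."""
    return 0.5 * math.erfc(q / math.sqrt(2))

# Q = 6 corresponds roughly to BER ~ 1e-9, a common acceptance threshold.
for q in (4.0, 6.0, 7.0):
    print(f"Q = {q:.1f}  ->  BER = {ber_from_q(q):.2e}")
```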
In this paper, a new brain tumour detection method is introduced, whereby normal slices are separated from abnormal ones. Three main phases are deployed: extraction of the cerebral tissue, detection of abnormal blocks together with a fine-tuning mechanism, and finally detection of abnormal slices according to the detected abnormal blocks. Through experimental tests, the progress made by the suggested method is assessed and verified. In terms of qualitative assessment, the performance of the proposed method is found to be satisfactory and may contribute to the development of reliable MRI brain tumour diagnosis and treatment.
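The slice-level decision described above is driven by the detected abnormal blocks. The sketch below illustrates that final phase only; the intensity-variance rule and the thresholds are deliberately simple placeholders standing in for the paper's actual abnormal-block detector.

```python
import numpy as np

def slice_is_abnormal(slice_img: np.ndarray, block: int = 16,
                      block_test=None, min_abnormal_blocks: int = 1) -> bool:
    """Flag an MRI slice as abnormal if enough of its blocks are flagged.

    `block_test` is a placeholder predicate; the simple variance rule below
    is used purely for illustration, not the study's detector."""
    if block_test is None:
        block_test = lambda b: b.std() > 40  # hypothetical threshold
    h, w = slice_img.shape
    flags = 0
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            if block_test(slice_img[y:y + block, x:x + block]):
                flags += 1
    return flags >= min_abnormal_blocks

# Random placeholder slice, only to show the call pattern.
rng = np.random.default_rng(3)
print(slice_is_abnormal(rng.integers(0, 256, size=(128, 128)).astype(float)))
```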
Data mining plays an important role in healthcare for discovering hidden relationships in big datasets, especially in breast cancer diagnostics; breast cancer is among the leading causes of death in women worldwide. In this paper two algorithms, decision tree and K-Nearest Neighbour, are applied to diagnose breast cancer grade in order to reduce its risk to patients. In the decision tree with feature selection, the Gini index gives an accuracy of 87.83 %, while entropy with feature selection gives an accuracy of 86.77 %. In both cases, Age appeared as the most effective parameter, particularly when Age < 49.5, whereas Ki67 appeared as the second most effective parameter. Furthermore, K-Nearest Neighbour is based on the minimum
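A minimal sketch of the two classifiers named above is shown below using scikit-learn, with the Gini and entropy splitting criteria compared side by side. The data are synthetic placeholders that only borrow the feature names (Age, Ki67) and the Age < 49.5 cut mentioned in the abstract; this is not the study's dataset or pipeline.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

# Synthetic placeholder data: columns are Age and Ki67 index (%).
rng = np.random.default_rng(1)
X = np.column_stack([rng.uniform(25, 80, 300),    # Age
                     rng.uniform(0, 100, 300)])   # Ki67
y = ((X[:, 0] < 49.5) & (X[:, 1] > 20)).astype(int)  # toy grade label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Decision tree with Gini index vs. entropy as the splitting criterion.
for criterion in ("gini", "entropy"):
    tree = DecisionTreeClassifier(criterion=criterion, max_depth=3, random_state=0)
    tree.fit(X_tr, y_tr)
    print(criterion, tree.score(X_te, y_te))

# K-Nearest Neighbour: classifies by majority vote of the closest samples.
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_tr, y_tr)
print("knn", knn.score(X_te, y_te))
```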
In digital images, protecting sensitive visual information against unauthorized access is a critical issue, and robust encryption methods are the best way to preserve such information. This paper introduces a model designed to enhance the performance of the Tiny Encryption Algorithm (TEA) in encrypting images. Two approaches are suggested for the image cipher process as a preprocessing step before applying TEA; this step aims to de-correlate and weaken correlations between adjacent pixel values in preparation for the encryption process. The first approach applies an Affine transformation for image encryption in two layers, utilizing a different key set for each layer. Th
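For reference, the core TEA block cipher is a well-documented 32-round Feistel construction; a minimal sketch of one block encryption is given below, together with an illustrative affine mapping of a single pixel. The affine key values and the pairing of the two steps are assumptions for illustration, not the paper's specific scheme.

```python
def tea_encrypt_block(v0: int, v1: int, key: tuple[int, int, int, int]) -> tuple[int, int]:
    """Encrypt one 64-bit block (two 32-bit words) with the standard TEA cipher."""
    delta, total, mask = 0x9E3779B9, 0, 0xFFFFFFFF
    k0, k1, k2, k3 = key
    for _ in range(32):
        total = (total + delta) & mask
        v0 = (v0 + (((v1 << 4) + k0) ^ (v1 + total) ^ ((v1 >> 5) + k1))) & mask
        v1 = (v1 + (((v0 << 4) + k2) ^ (v0 + total) ^ ((v0 >> 5) + k3))) & mask
    return v0, v1

def affine_pixel(p: int, a: int = 5, b: int = 83) -> int:
    """Illustrative affine preprocessing of one 8-bit pixel: (a*p + b) mod 256.
    a must be odd (coprime with 256) so the mapping is invertible; these key
    values are hypothetical, not those used in the paper."""
    return (a * p + b) % 256

# Example: preprocess a pixel, then encrypt a block with a toy key.
print(affine_pixel(120), tea_encrypt_block(0x01234567, 0x89ABCDEF, (1, 2, 3, 4)))
```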
Text-based image clustering (TBIC) is an insufficient approach for clustering related web images; it is a challenging task to capture the visual features of images with the support of textual information alone in a database. In content-based image clustering (CBIC), image data are clustered on the basis of specific features such as texture, colors, boundaries, and shapes. In this paper, an effective CBIC technique is presented, which uses texture and statistical features of the images. The statistical features, or color moments (mean, skewness, standard deviation, kurtosis, and variance), are extracted from the images. These features are collected in a one-dimensional array, and then a genetic algorithm (GA) is applied for image clustering.
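A minimal sketch of the color-moment feature extraction named above is shown below, packing the five moments per channel into a single 1-D vector. The moment definitions follow common usage (scipy's skewness and kurtosis); the image is a random placeholder, and the GA clustering stage is not reproduced here.

```python
import numpy as np
from scipy.stats import skew, kurtosis

def color_moments(image: np.ndarray) -> np.ndarray:
    """Pack per-channel moments (mean, skewness, std, kurtosis, variance)
    of an RGB image into one 1-D feature vector."""
    features = []
    for c in range(image.shape[2]):
        channel = image[..., c].astype(np.float64).ravel()
        features.extend([channel.mean(),
                         skew(channel),
                         channel.std(),
                         kurtosis(channel),
                         channel.var()])
    return np.array(features)

# Toy example on a random image (placeholder, not from the paper's dataset).
rng = np.random.default_rng(2)
img = rng.integers(0, 256, size=(32, 32, 3))
print(color_moments(img).shape)  # (15,) = 5 moments x 3 channels
```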
Facial recognition has been an active field of imaging science. With recent progress in computer vision, it is extensively applied in various areas, especially in law enforcement and security. The human face is a viable biometric that can be used effectively in both identification and verification. Thus far, regardless of the facial model and metrics employed, its main shortcoming is that it requires a facial image against which comparison is made; therefore, closed-circuit televisions and a facial database are always needed in an operational system. For the last few decades, unfortunately, we have experienced an emergence of asymmetric warfare, where acts of terrorism are often committed in secluded areas with no
With the escalation of cybercriminal activities, the demand for forensic investigations into these crimes has grown significantly. However, the concept of systematic pre-preparation for potential forensic examinations during the software design phase, known as forensic readiness, has only recently gained attention. Against the backdrop of surging urban crime rates, this study aims to conduct a rigorous and precise analysis and forecast of crime rates in Los Angeles, employing advanced Artificial Intelligence (AI) technologies. This research amalgamates diverse datasets encompassing crime history, various socio-economic indicators, and geographical locations to attain a comprehensive understanding of how crimes manifest within the city. Lev
Many industries, especially oil companies in Iraq, consume large quantities of water and produce oil-contaminated water that can cause major pollution of agricultural lands and rivers. The aim of the present work is to enhance the efficiency of the dispersed air flotation technique by using a highly effective and cost-efficient coagulant to treat gas oil emulsion. The experimental work was carried out using a bubble column made of Perspex glass (5 cm I.D., 120 cm height) with a liquid depth of 60 cm. Different dosages of sawdust + bentonite at a 2:1 ratio (0.5+0.25, 1+0.5 and 2+1 g) and alum at concentrations of 10, 20 and 30 mg/l at different pH values (4 and 7) were used to determine the optimum coagulant dosage. Jar test experiments
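Treatment performance in such flotation and jar-test work is usually expressed as a percentage removal efficiency. The trivial sketch below shows that calculation with hypothetical oil concentrations, not values from the study.

```python
def removal_efficiency(c_initial: float, c_final: float) -> float:
    """Percentage removal of oil from the emulsion: (C0 - C) / C0 * 100."""
    return (c_initial - c_final) / c_initial * 100.0

# Hypothetical jar-test reading in mg/l oil (illustrative only).
print(f"{removal_efficiency(500.0, 65.0):.1f} % removal")
```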