Ex-situ bioremediation of 2,4-D herbicide-contaminated soil was studied using a slurry bioreactor operated under aerobic conditions. The performance of the slurry bioreactor was tested for three types of soil (sand, sandy loam and clay) contaminated with different concentrations of 2,4-D: 200, 300 and 500 mg/kg soil. Sewage sludge, which is available in large quantities in wastewater treatment plants, was used as an inexpensive source of microorganisms. All biodegradation experiments demonstrated a significant decrease in 2,4-D concentration in the tested soils. The degradation efficiency in the slurry bioreactor decreases as the initial concentration of 2,4-D in the soil increases. A 100 % removal was achieved at an initial concentration of 200 mg 2,4-D/kg of sandy soil after 12 days, and 92 % at 500 mg 2,4-D/kg of sandy soil after 14 days. Clay soil showed the lowest removal efficiency among the three soils: 82 % at an initial concentration of 200 mg 2,4-D/kg of clay soil after 12 days and 72 % for 500 mg 2,4-D/kg of clay soil after 14 days. Abiotic experiments were performed to investigate the desorption efficiency of the contaminant from the soil to the liquid phase for the three soils. In the abiotic reactor, the desorption rates for sand and sandy loam soils were nearly the same, varying between 0.102 and 0.135 day⁻¹ at the different initial concentrations of 2,4-D, while for clay soil the desorption rate varied between 0.031 and 0.042 day⁻¹. The lower desorption rate in clay soil is attributed to the characteristics of clay soil (fine texture, high organic matter content and high cation exchange capacity compared with the other soils), which may retain the 2,4-D in the organic matter and the clay minerals.
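The desorption rate constants quoted above are given in day⁻¹, which is consistent with a first-order kinetic treatment. As an illustration only, the following minimal sketch shows how such a constant can be estimated from time-series data, assuming a first-order model C(t) = C₀·e^(−kt); the study's actual fitting procedure and data are not reproduced here.

```python
# Minimal sketch (assumption: desorption follows first-order kinetics,
# C(t) = C0 * exp(-k * t); the paper's actual fitting method is not shown).
import numpy as np

def first_order_rate(days, conc_soil):
    """Estimate the first-order desorption rate constant k (day^-1)
    from residual soil concentrations measured over time."""
    days = np.asarray(days, dtype=float)
    conc_soil = np.asarray(conc_soil, dtype=float)
    # Linearize: ln(C/C0) = -k * t, then fit a straight line.
    y = np.log(conc_soil / conc_soil[0])
    k = -np.polyfit(days, y, 1)[0]
    return k

# Hypothetical sandy-soil data (illustrative only, not measurements from the study)
t = [0, 2, 4, 6, 8, 10, 12]
c = [200, 158, 124, 98, 77, 61, 48]   # mg 2,4-D per kg soil
print(f"k = {first_order_rate(t, c):.3f} day^-1")   # ~0.12 day^-1, within the reported sand range
```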
This paper presents an intensive study of neural synchronization in order to address the challenges that prevent its adoption as an alternative key exchange algorithm. The results obtained from implementing neural synchronization in the proposed system address two challenges: verifying that synchronization has been established between the two neural networks, and publicly initializing the input vector for each party. Solutions and a mathematical model are presented, and since the proposed system focuses on stream ciphers, a system of LFSRs (linear feedback shift registers) with a balanced memory has been used to generate the key. These LFSRs are initialized with the neural weights obtained after synchronization is achieved.
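For illustration only, the sketch below shows one Fibonacci LFSR seeded from a vector of synchronized weights. The register length, tap positions and seeding scheme are assumptions made for the example; the paper's full multi-LFSR construction with balanced memory is not reproduced.

```python
# Minimal sketch of one Fibonacci LFSR seeded from synchronized neural weights.
# Illustrative only: tap positions, register length and the seeding scheme are
# assumptions; the paper's "balanced memory" combiner of several LFSRs is not shown.

def weights_to_seed(weights, n_bits=16):
    """Pack a list of small integer weights into an n-bit seed for the register."""
    seed = 0
    for w in weights:
        seed = (seed << 3) | (w & 0b111)      # 3 bits per weight (assumption)
    return (seed & ((1 << n_bits) - 1)) or 1  # avoid the all-zero state

def lfsr_keystream(seed, taps=(16, 14, 13, 11), n_bits=16, length=32):
    """Generate `length` keystream bits from a Fibonacci LFSR."""
    state, out = seed, []
    for _ in range(length):
        bit = 0
        for t in taps:                         # XOR of the tapped positions
            bit ^= (state >> (t - 1)) & 1
        out.append(state & 1)                  # output the low bit
        state = (state >> 1) | (bit << (n_bits - 1))
    return out

weights = [3, -2, 1, 0, 2, -3]                 # hypothetical synchronized weights
print(lfsr_keystream(weights_to_seed(weights)))
```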
The present work includes the design, construction and operation of a prototype solar absorption refrigeration system, using methanol as a refrigerant to avoid refrigerants that cause global warming and the greenhouse effect. A flat plate collector was used because it is simple, inexpensive and efficient. Many test runs (more than 50) were carried out on the system from May to October 2013; the main results were taken in the period from July 15, 2013 to August 15, 2013 to find the maximum C.O.P., cooling capacity, temperature and pressure of the system. The system demonstrated a maximum generator temperature of 93.5 °C on July 18, 2013 at 2:30 pm, and the average generator temperature Tgavr was 74.7 °C for this period. The maximum generator pressure Pg ...
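For reference, the thermal coefficient of performance of an absorption chiller is the cooling effect divided by the heat supplied to the generator (plus any pump work). The short sketch below uses this standard definition with purely hypothetical numbers, not values reported in the study.

```python
# Minimal sketch: thermal COP of an absorption refrigeration system,
# COP = Q_evap / (Q_gen + W_pump). The numbers below are hypothetical placeholders;
# pump work is often small and is sometimes neglected.

def absorption_cop(q_evap_kw, q_gen_kw, w_pump_kw=0.0):
    """Cooling effect divided by generator heat input (plus pump work)."""
    return q_evap_kw / (q_gen_kw + w_pump_kw)

print(f"COP = {absorption_cop(q_evap_kw=0.35, q_gen_kw=0.90):.2f}")  # -> COP = 0.39
```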
Ferritin is a key mediator of immune dysregulation, particularly under extreme hyperferritinemia, through direct immune-suppressive and pro-inflammatory effects. We conclude that there is a significant association between ferritin levels and the severity of COVID-19. In this paper we introduce a semi-parametric method for prediction by combining neural network and regression models. Two methodologies are therefore adopted in designing the model, a Neural Network (NN) and a regression model. The data were collected from the Private Nursing Home Hospital (مستشفى دار التمريض الخاص) for the period 11/7/2021 to 23/7/2021 and cover 100 persons: 12 females and 38 males out of 50 with COVID, and 26 females and 24 males out of 50 without COVID. The input variables of the NN model ...
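One common way to combine a parametric regression with a neural network for semi-parametric prediction is to fit the linear part first and let a small NN model the residuals. The sketch below illustrates that scheme on synthetic data; it is an assumed combination for illustration, not the paper's exact model or dataset.

```python
# Minimal sketch of a semi-parametric NN + regression combination:
# linear (parametric) fit first, then a small NN on the residuals.
# Assumed scheme with synthetic data; not the paper's model or clinical data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                       # hypothetical predictors
y = 2.0 * X[:, 0] + np.sin(3 * X[:, 1]) + 0.1 * rng.normal(size=100)

linear = LinearRegression().fit(X, y)               # parametric component
residual_nn = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                           random_state=0).fit(X, y - linear.predict(X))

y_hat = linear.predict(X) + residual_nn.predict(X)  # combined semi-parametric prediction
print(f"mean squared error of combined fit: {np.mean((y - y_hat) ** 2):.3f}")
```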
Medical image segmentation has been one of the most actively studied fields over the past few decades. With the development of modern imaging modalities such as magnetic resonance imaging (MRI) and computed tomography (CT), physicians and technicians nowadays have to process an increasing number and size of medical images. Therefore, efficient and accurate computational segmentation algorithms are necessary to extract the desired information from these large data sets. Moreover, sophisticated segmentation algorithms can help physicians better delineate the anatomical structures present in the input images, enhance the accuracy of medical diagnosis and facilitate better treatment planning. Many of the proposed algorithms could perform well ...
In an unpredictable industrial environment, being able to adapt quickly and effectively to change is key to gaining a competitive advantage in the global market. Agile manufacturing evolves new ways of running factories so that they can react quickly and effectively to changing markets driven by customized requirements. Agility in manufacturing can be successfully achieved through the integration of information systems, people, technologies, and business processes. This article presents a conceptual model of agility in three dimensions: driving factors, enabling technologies and evaluation of agility in the manufacturing system. The conceptual model was developed based on a review of the literature. Then, the paper demonstrates the agility ...
In this paper, we deal with the problem of games with fuzzy payoffs when there is uncertainty in the data. We use the trapezoidal membership function to transform the data into fuzzy numbers and apply three different ranking function algorithms. We then compare these three ranking algorithms using trapezoidal fuzzy numbers so that the decision maker can obtain the best gains.
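As an illustration, the sketch below defines a trapezoidal fuzzy number (a, b, c, d) and one widely used ranking function, the centroid of the trapezoid, which reduces a fuzzy payoff to a crisp value. The three specific ranking algorithms compared in the paper are not reproduced here.

```python
# Minimal sketch: a trapezoidal fuzzy number (a, b, c, d) and a common centroid
# ranking function. Illustration only; not the paper's three ranking algorithms.
from dataclasses import dataclass

@dataclass
class TrapezoidalFuzzyNumber:
    a: float  # left foot (membership 0)
    b: float  # left shoulder (membership 1)
    c: float  # right shoulder (membership 1)
    d: float  # right foot (membership 0)

    def membership(self, x: float) -> float:
        if self.a < x < self.b:
            return (x - self.a) / (self.b - self.a)
        if self.b <= x <= self.c:
            return 1.0
        if self.c < x < self.d:
            return (self.d - x) / (self.d - self.c)
        return 0.0

    def centroid_rank(self) -> float:
        # x-coordinate of the trapezoid's centroid, a standard defuzzified rank
        a, b, c, d = self.a, self.b, self.c, self.d
        return (d**2 + c**2 + d*c - a**2 - b**2 - a*b) / (3 * (d + c - a - b))

# Two hypothetical fuzzy payoffs: the larger centroid is preferred by the decision maker
p1 = TrapezoidalFuzzyNumber(1, 2, 3, 5)
p2 = TrapezoidalFuzzyNumber(2, 3, 4, 4.5)
print(p1.centroid_rank(), p2.centroid_rank())   # 2.8 vs ~3.36 -> p2 ranks higher
```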
The present paper describes and analyses three proposed cogeneration plants include back pressure steam-turbine system, gas turbine system, diesel-engine system, and the present Dura refinery plant. Selected actual operating data are employed for analysis. The same amount of electrical and thermal product outputs is considered for all systems to facilitate comparisons. The theoretical analysis was done according to 1st and 2nd law of thermodynamic. The results demonstrate that exergy analysis is a useful tool in performance analysis of cogeneration systems and permits meaningful comparisons of different cogeneration systems based on their merits, also the result showed that the back pressure steam-turbine is more efficient than other pro
The process of risk assessment in build-operate-transfer (BOT) projects is very important for identifying and analyzing risks in order to make the appropriate decision on how to respond to them. In this paper, the AHP technique was used to make the appropriate decision regarding the response to the most prominent risks generated in BOT projects; this includes a comparison between the criteria for each risk as well as the available alternatives, using mathematical methods based on matrices to reach an appropriate response decision for each risk. Ten common risks in BOT contracts, grouped under six main risk headings, are adopted for analysis in this paper. The procedures followed in this paper are the questionnaire method ...
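The core AHP calculation derives priority weights from a pairwise comparison matrix and checks its consistency. The sketch below shows that calculation on a hypothetical 3x3 matrix; it is not the paper's questionnaire data or its specific criteria and alternatives.

```python
# Minimal sketch of the AHP priority calculation: principal-eigenvector weights
# from a pairwise comparison matrix plus the consistency ratio (CR).
# The matrix below is a hypothetical example, not data from the paper.
import numpy as np

def ahp_weights(pairwise):
    """Return normalized priority weights and the consistency ratio."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                     # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                                 # normalized priority weights
    ci = (eigvals[k].real - n) / (n - 1)            # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index (n <= 5 here)
    cr = ci / ri if ri else 0.0
    return w, cr

# Hypothetical comparison of three response alternatives for one risk
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w, cr = ahp_weights(A)
print("weights:", np.round(w, 3), "CR:", round(cr, 3))   # CR < 0.10 => acceptably consistent
```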
Sentiment analysis is one of the major fields in natural language processing; its main task is to extract sentiments, opinions, attitudes, and emotions from subjective text. Because of its importance in decision making and in people's trust in reviews on web sites, there is much academic research addressing sentiment analysis problems. Deep Learning (DL) is a powerful Machine Learning (ML) technique that has emerged with its ability to represent features and discriminate data, leading to state-of-the-art prediction results. In recent years, DL has been widely used in sentiment analysis; however, its implementation in the Arabic language field is scarce, and most of the previous research addresses other languages ...
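As a generic illustration of a DL sentiment pipeline (tokenize, embed, classify), the sketch below trains a small embedding + LSTM model on a tiny toy corpus. It is not the architecture or the Arabic dataset used in the paper; the texts and labels are hypothetical.

```python
# Minimal sketch of a DL sentiment classifier (embedding + LSTM) on toy data.
# Generic illustration in Keras; not the paper's model or dataset.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

texts  = ["الخدمة ممتازة", "المنتج سيئ جدا", "تجربة رائعة", "لا أنصح به"]  # tiny toy corpus
labels = np.array([1, 0, 1, 0])                                             # 1 = positive, 0 = negative

# Turn raw text into fixed-length integer sequences
vectorize = layers.TextVectorization(max_tokens=1000, output_sequence_length=10)
vectorize.adapt(texts)
x = vectorize(tf.constant(texts))

model = tf.keras.Sequential([
    layers.Embedding(input_dim=1000, output_dim=16),   # learn word vectors
    layers.LSTM(16),                                    # sequence representation
    layers.Dense(1, activation="sigmoid"),              # positive/negative score
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, labels, epochs=10, verbose=0)
print(model.predict(x, verbose=0).ravel())
```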