A phytoremediation experiment was carried out with kerosene as a model for total petroleum hydrocarbons. A constructed wetland of barley was exposed to kerosene at varying concentrations (1, 2, and 3% v/v) in a subsurface flow (SSF) system. After 42 days of exposure, the average kerosene removal ranged from 56.5% to 61.2%, with the highest removal obtained at a kerosene concentration of 1% v/v. Analysis at varying initial concentrations showed that the removal kinetics were better described by the Grau model than by zero-order, first-order, or second-order kinetic models. The experimental study showed that a barley constructed wetland operated as a subsurface flow phytoremediation system has great potential for the reclamation of kerosene-contaminated water.
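As a rough illustration of the kinetic fitting described above, the sketch below estimates a first-order rate constant from hypothetical concentration readings by linear regression on log-concentrations; the sampling days, concentrations, and resulting rate are invented for the example and are not the paper's measurements.

```python
import math

# Hypothetical concentration readings (% v/v) over a 42-day run;
# values are illustrative, not the paper's data.
days = [0, 7, 14, 21, 28, 35, 42]
conc = [1.00, 0.90, 0.81, 0.73, 0.66, 0.59, 0.53]

# First-order model: C(t) = C0 * exp(-k t)  =>  ln C = ln C0 - k t.
# Estimate k by ordinary least squares on (t, ln C).
n = len(days)
xbar = sum(days) / n
ybar = sum(math.log(c) for c in conc) / n
num = sum((t - xbar) * (math.log(c) - ybar) for t, c in zip(days, conc))
den = sum((t - xbar) ** 2 for t in days)
k = -num / den                      # fitted rate constant (per day)
removal = 1.0 - conc[-1] / conc[0]  # overall fraction removed
print(f"first-order k = {k:.4f} per day, overall removal = {removal:.1%}")
```

The same least-squares machinery extends to the Grau second-order model by regressing the appropriate linearised form instead of ln C.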
In this research, several estimators of the hazard function are introduced. They are built using one of the nonparametric methods, namely kernel functions, for censored data, with varying bandwidths and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four types of boundary kernel are used, namely the Rectangle, Epanechnikov, Biquadratic and Triquadratic kernels, and the proposed function was employed with all kernel functions. Two different simulation techniques are also used in two experiments to compare these estimators. In most of the cases, the results have proved that the local bandwidth is the best for all the
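To make the kernel-smoothing idea concrete, here is a minimal sketch of a kernel-smoothed hazard estimate for right-censored data, using the Epanechnikov kernel with a single global bandwidth. The data, bandwidth, and evaluation point are invented, and the paper's boundary corrections and local bandwidths are omitted.

```python
import math

def epanechnikov(u):
    """Epanechnikov kernel K(u) = 0.75 * (1 - u^2) on [-1, 1]."""
    return 0.75 * (1.0 - u * u) if abs(u) <= 1.0 else 0.0

def smoothed_hazard(t, times, events, h):
    """Kernel-smoothed Nelson-Aalen hazard estimate at time t:
    (1/h) * sum over event times t_i of K((t - t_i)/h) * 1/n_i,
    where n_i is the number of subjects still at risk at t_i."""
    total = 0.0
    for ti, di in zip(times, events):
        if di:  # only observed events contribute increments
            at_risk = sum(1 for tj in times if tj >= ti)
            total += epanechnikov((t - ti) / h) / at_risk
    return total / h

times  = [2.0, 3.5, 4.0, 5.5, 6.0, 7.5, 9.0]   # invented observation times
events = [1,   1,   0,   1,   1,   0,   1]      # 1 = event observed, 0 = censored
val = smoothed_hazard(5.0, times, events, h=2.0)
print(round(val, 4))
```

Swapping `epanechnikov` for a rectangle, biquadratic, or triquadratic kernel changes only the weight function, which is why the paper can compare all four under the same estimator.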
It is considered one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X), where the dependent variable is a binary response taking two values (one when a specific event occurs and zero when it does not), such as injured/uninjured or married/unmarried. A large number of explanatory variables can give rise to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were used to estimate a binary-response logistic regression model by adopting the Jackna
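As a loose illustration of ridge-penalised logistic estimation (one plausible reading of the abstract's combination of maximum likelihood and ridge regression; the data, penalty weight, and learning rate are all invented), the sketch below fits a one-variable binary logistic model by gradient descent on the penalised negative log-likelihood.

```python
import math, random

random.seed(1)
xs = [random.gauss(0, 2) for _ in range(200)]                   # invented X
data = [(x, 1 if x + random.gauss(0, 1) > 0 else 0) for x in xs]  # binary Y

# Ridge-penalised logistic regression via gradient descent:
# minimise  (-log-likelihood)/n + lam * b1^2, intercept b0 left unpenalised.
b0, b1 = 0.0, 0.0
lam, lr = 0.1, 0.05
for _ in range(2000):
    g0 = g1 = 0.0
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))  # fitted probability
        g0 += p - y
        g1 += (p - y) * x
    b0 -= lr * g0 / len(data)
    b1 -= lr * (g1 / len(data) + 2.0 * lam * b1)    # ridge shrinkage on slope
print(round(b0, 3), round(b1, 3))
```

The ridge term shrinks the slope toward zero, which is what stabilises the estimates when explanatory variables are collinear.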
Decision-making in Operations Research is the main point of various studies in real-life applications, and many different studies focus on this topic. One drawback is that some of these studies are restricted and have not addressed the nature of the values involved in terms of imprecise data (ID). This paper therefore makes two contributions: first, decreasing the total costs by classifying subsets of costs; second, improving the optimal solution via the Hungarian assignment approach. The newly proposed method is called the fuzzy sub-Triangular form (FS-TF) under ID. The results obtained are excellent compared with previous methods, including the robust ranking technique, arithmetic operations, the magnitude ranking method and the centroid ranking method. This
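As a toy illustration of a fuzzy assignment problem (not the paper's FS-TF method), the sketch below defuzzifies triangular fuzzy costs with the centroid (a + m + b)/3 and then solves the resulting crisp assignment exhaustively, which yields the same optimum the Hungarian algorithm would on an instance this small; all cost values are invented.

```python
from itertools import permutations

# Triangular fuzzy costs (a, m, b) for 3 workers x 3 jobs; illustrative numbers.
fuzzy = [
    [(2, 4, 6), (3, 5, 9), (1, 2, 3)],
    [(4, 6, 8), (2, 3, 4), (5, 7, 9)],
    [(1, 3, 5), (6, 8, 10), (2, 4, 6)],
]

# Centroid defuzzification: crisp cost = (a + m + b) / 3.
crisp = [[sum(t) / 3 for t in row] for row in fuzzy]

# Exhaustive search over assignments (equivalent to the Hungarian
# optimum for a matrix this small).
best = min(permutations(range(3)),
           key=lambda p: sum(crisp[i][p[i]] for i in range(3)))
cost = sum(crisp[i][best[i]] for i in range(3))
print(best, cost)  # worker i -> job best[i], total crisp cost
```

Ranking methods such as those the paper benchmarks against differ mainly in how this defuzzification step maps a fuzzy cost to a crisp one.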
Combating the COVID-19 epidemic has emerged as one of the most pressing healthcare challenges the world has ever seen. COVID-19 cases must be diagnosed accurately and quickly so that patients receive proper medical treatment and the pandemic is contained. Chest radiography imaging approaches have proven more successful at detecting coronavirus than the RT-PCR approach. Transfer learning is well suited to categorising patterns in medical images, since the number of available medical images is limited. This paper illustrates a hybrid convolutional neural network (CNN) and recurrent neural network (RNN) architecture for the diagnosis of COVID-19 from chest X-rays. The deep transfer models used were VGG19, DenseNet121
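To sketch the CNN-to-RNN handoff such a hybrid relies on (a toy stand-in, not the paper's VGG19/DenseNet121 pipeline; all feature values and weights are invented), the example below reads the rows of a small feature map as a sequence, runs an Elman-style RNN cell over them, and applies a sigmoid head for a binary decision.

```python
import math, random

random.seed(0)

# Pretend a CNN backbone produced a 4x4 feature map; treat its rows as a
# length-4 sequence for the recurrent stage.
feat = [[random.random() for _ in range(4)] for _ in range(4)]

hidden = [0.0, 0.0, 0.0]                                    # RNN state
W_in  = [[0.1 * (i + j) for j in range(4)] for i in range(3)]  # toy input weights
W_rec = [[0.05] * 3 for _ in range(3)]                         # toy recurrent weights

for row in feat:  # one Elman update per feature-map row
    hidden = [math.tanh(sum(W_in[i][j] * row[j] for j in range(4)) +
                        sum(W_rec[i][k] * hidden[k] for k in range(3)))
              for i in range(3)]

logit = sum(hidden) * 0.5                    # toy classification head
prob = 1.0 / (1.0 + math.exp(-logit))        # P(COVID) in this toy setup
print(round(prob, 3))
```

In the real architecture the feature map comes from a pretrained backbone and the recurrent cell and head are trained end to end; this sketch only shows how the tensors flow.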
XML is being incorporated into the foundation of e-business data applications. This paper addresses the problem of the freeform information stored in any organisation and shows how XML, using this new approach, can make search operations efficient and time-saving. It introduces a new solution and methodology developed to capture and manage such unstructured freeform information (multi-information), depending on the use of XML schema technologies, neural network ideas and an object-oriented relational database, in order to provide a practical solution for efficiently managing a multi-freeform information system.
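A minimal sketch of the underlying idea, capturing freeform records as XML so they become searchable by structure; the tag names and records are invented, not the paper's actual schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical freeform records captured into a structured XML store.
records = [
    {"id": "1", "topic": "invoice", "text": "Payment due within 30 days."},
    {"id": "2", "topic": "memo", "text": "Quarterly review moved to May."},
]

root = ET.Element("freeform")
for r in records:
    doc = ET.SubElement(root, "document", id=r["id"], topic=r["topic"])
    doc.text = r["text"]

# A structured search over what began as unstructured text:
hits = [d.get("id") for d in root.iter("document") if d.get("topic") == "memo"]
print(hits)
```

Once the freeform text carries attributes like these, an XML Schema can validate it and a relational store can index it, which is what makes the search efficient.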
Amplitude variation with offset (AVO) analysis is an efficient tool for hydrocarbon detection and the identification of elastic rock properties and fluid types. It has been applied in the present study using reprocessed pre-stack 2D seismic data (1992, Caulerpa) from the north-west of the Bonaparte Basin, Australia. The AVO response along the 2D pre-stack seismic data in the Laminaria High, NW shelf of Australia, was also investigated. Three hypotheses were suggested to investigate the AVO behaviour of the amplitude anomalies, in which three different factors (fluid substitution, porosity and thickness (wedge model)) were tested. The AVO models with the synthetic gathers were analysed using log information to find which of these is the
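AVO responses of this kind are commonly summarised by the two-term Shuey approximation R(θ) ≈ A + B·sin²θ, with intercept A and gradient B. The sketch below evaluates it for an invented intercept/gradient pair; the values are illustrative, not derived from the Laminaria data.

```python
import math

# Illustrative intercept A and gradient B: a small positive intercept with
# a negative gradient, a pattern often examined in gas-sand AVO studies.
A, B = 0.06, -0.15

def shuey(theta_deg):
    """Two-term Shuey reflection coefficient R(theta) = A + B * sin^2(theta)."""
    s = math.sin(math.radians(theta_deg))
    return A + B * s * s

for ang in (0, 15, 30):
    print(ang, round(shuey(ang), 4))
```

Fitting A and B to the amplitudes picked from pre-stack gathers is what lets the hypotheses about fluid, porosity, and thickness be compared against the modelled responses.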
The determination of permeability in reservoirs that are anisotropic and heterogeneous is a complicated problem because of the limited number of wells with core samples and well-test data. This paper presents hydraulic flow units and the flow zone indicator for predicting the permeability of the rock mass from core data for the Nahr Umr reservoir, Subba field. Permeability is best measured in the laboratory on cored rock taken from the formation. The Nahr Umr Formation, the main Lower Cretaceous sandstone reservoir in southern Iraq, is made up mainly of sandstone and was deposited on a gradually rising basin floor. The diagenesis of the Nahr Umr sediments is very important du
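The standard hydraulic-flow-unit workflow computes, for each core plug, the rock quality index RQI = 0.0314·√(k/φ), the normalised porosity φz = φ/(1 − φ), and the flow zone indicator FZI = RQI/φz. A minimal sketch follows; the plug values are invented, not Nahr Umr measurements.

```python
import math

def fzi(k_md, phi):
    """Flow zone indicator from core permeability k (mD) and porosity (fraction):
    RQI = 0.0314 * sqrt(k/phi), phi_z = phi/(1 - phi), FZI = RQI / phi_z."""
    rqi = 0.0314 * math.sqrt(k_md / phi)
    phi_z = phi / (1.0 - phi)
    return rqi / phi_z

# Illustrative core plugs (permeability in mD, porosity as a fraction):
for k, phi in [(120.0, 0.22), (15.0, 0.18), (450.0, 0.25)]:
    print(k, phi, round(fzi(k, phi), 3))
```

Plugs clustering around a common FZI value are grouped into one hydraulic flow unit, and permeability in uncored wells is then predicted from porosity within each unit.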