In this paper, wavelets were used to study multivariate fractional Brownian motion through the deviations of the random process, in order to obtain an efficient estimator of the Hurst exponent. Simulation experiments showed that the proposed estimator performs efficiently. The estimation exploits the stationarity of the detail coefficients of the wavelet transform, since the variance of these coefficients exhibits power-law behavior. Two wavelet filters (Haar and db5) were used to minimize the mean square error of the model.
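The power-law idea above can be sketched in a few lines. This is a minimal illustration, not the paper's code: for a self-similar process with Hurst exponent H, the variance of the wavelet detail coefficients at level j scales as 2^{j(2H+1)}, so a regression of log2-variance against level recovers H. A plain Haar pyramid is assumed here (the paper also uses db5), and ordinary Brownian motion (H = 0.5) serves as the test signal.

```python
import numpy as np

def haar_detail_variances(x, levels):
    """One Haar DWT pyramid: variance of the detail coefficients per level."""
    approx = np.asarray(x, dtype=float)
    variances = []
    for _ in range(levels):
        n = len(approx) // 2 * 2
        even, odd = approx[:n:2], approx[1:n:2]
        detail = (even - odd) / np.sqrt(2.0)   # Haar detail coefficients
        approx = (even + odd) / np.sqrt(2.0)   # Haar approximation (next level input)
        variances.append(np.var(detail))
    return np.array(variances)

def estimate_hurst(x, levels=8):
    """Slope of log2(Var d_j) vs level j equals 2H + 1 for fBm-like signals."""
    v = haar_detail_variances(x, levels)
    j = np.arange(1, levels + 1)
    slope = np.polyfit(j, np.log2(v), 1)[0]
    return (slope - 1.0) / 2.0

# Brownian motion as a sanity check: the estimate should be close to H = 0.5.
rng = np.random.default_rng(0)
brownian = np.cumsum(rng.standard_normal(2 ** 16))
H = estimate_hurst(brownian)
```

The finest levels of a discrete pyramid are slightly biased relative to the continuous scaling law, so estimates on sampled paths typically land a little below the true H; the paper's mean-square-error comparison between Haar and db5 addresses exactly this kind of filter-dependent bias.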
This paper focuses on the most important element of scientific research: the research problem. The research problem is the concern surrounding a researcher regarding an event, phenomenon, or issue that needs to be studied and addressed in order to find solutions. It influences most of the steps of scientific research, from posing questions and formulating hypotheses, to choosing suitable methods and tools, to selecting the research community and sample, to employing measurement and analysis instruments. Addressing this problem demands great intellectual and material effort from the researcher to develop solutions.
Bacterial contamination of AL-Habania reservoir was studied from February 2005 to January 2006. Samples were collected from four stations (AL-Warrar, AL-Theban regulator, the middle of the reservoir, and a fourth towards AL-Razzaza reservoir). Total coliforms, faecal coliforms, streptococci, and faecal streptococci were used as indicators of bacterial contamination in the waters, quantified by calculating the most probable number (MPN). The highest coliform count (1500 cells/100 ml) was recorded at AL-Razzaza during August, while the lowest counts, below 300 cells/100 ml, were recorded at the remaining collection stations in all months. Faecal coliforms ranged from fewer than 300 cells/100 ml in all stations for all months to 700 cells/100 ml …
A field-pilot scale slow sand filter (SSF) was constructed at Al-Rustamiya Sewage Treatment Plant (STP) in Baghdad city to investigate its removal efficiency in terms of biochemical oxygen demand (BOD5), chemical oxygen demand (COD), total suspended solids (TSS), and chloride concentration, with the aim of achieving better secondary effluent quality from this treatment plant. The SSF was designed for a 0.2 m/h filtration rate, with a filter area of 1 m2 and a total filter depth of 2.3 m. A sand filter medium of 0.35 mm grain size and 1 m depth was supported by a 0.2 m layer of 5 mm gravel. The secondary effluent from Al-Rustamiya STP was used as the influent to the slow sand filter. The results showed that the removals of BOD5, COD, TSS, and chloride were …
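Removal efficiency across a filter is conventionally the relative drop in concentration between influent and effluent. A one-line sketch of that calculation (the concentration values below are hypothetical placeholders, not the study's measurements):

```python
def removal_efficiency(c_in, c_out):
    """Percent removal across the filter: (C_in - C_out) / C_in * 100."""
    return (c_in - c_out) / c_in * 100.0

# Hypothetical influent/effluent pair in mg/L, purely for illustration:
bod5_removal = removal_efficiency(40.0, 10.0)   # 75.0 %
```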
... Show MoreThe interlaminar fracture toughness of polymer blends reinforced by glass fiber has
been investigated. Epoxy (EP), unsaturated polyester(UPE), polystyrene (PS),
polyurethane (PU) and their blends with different ratios (10%PS/90%EP),
(20%PS/80%EP), (20%PU/80%EP) and (20%PU/80%UPE) were chosen as a matrices A
sheet of composites were prepared using hand lay -up method, these sheet were cut as the
double cantilever beam (DCB) specimen to determine interlaminar fracture toughness of
these composites .Its found that, blending of EP,UPE with 20% of PU will improve the
interlaminar fracture toughness ,but the adding of 10% PS, 20%PS to EP will decrease
the interlaminar toughness of these composites.
In petroleum reservoir engineering, history matching refers to the calibration process in which a reservoir simulation model is validated by matching simulation outputs against observed data. Traditional history matching is performed manually by engineers: the most uncertain parameters are adjusted until a satisfactory match is obtained between the generated model and the historical information. This study focuses on step-by-step, trial-and-error history matching of the Mishrif reservoir to constrain an appropriate simulation model. Up to 1 January 2021, the Buzurgan Oilfield, which has eighty-five producers and sixteen injectors, had been under production for 45 years since it started …
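The trial-and-error loop described above can be sketched abstractly: sweep an uncertain parameter and keep the value minimizing the mismatch with observed history. Everything below is a toy stand-in, with a one-line linear surrogate in place of a reservoir simulator and a hypothetical observed rate, not the Buzurgan/Mishrif model:

```python
def simulate_rate(perm_multiplier):
    """Toy surrogate for a reservoir simulator run (hypothetical)."""
    return 100.0 * perm_multiplier

observed_rate = 150.0    # hypothetical observed production datum

def mismatch(perm_multiplier):
    """Squared error between simulated and observed rate."""
    return (simulate_rate(perm_multiplier) - observed_rate) ** 2

# Trial-and-error sweep over candidate permeability multipliers, 0.5 .. 2.5.
candidates = [0.5 + 0.1 * i for i in range(21)]
best = min(candidates, key=mismatch)
```

In practice each "trial" is a full simulation run and several parameters (permeability, aquifer strength, relative permeability curves, etc.) are perturbed, which is why manual history matching is so labour-intensive.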
Low-salinity (LS) water flooding is a promising EOR method that has been examined in many experimental studies and field pilots for a variety of reservoirs and oils. This paper investigates applying LS flooding to a heavy oil. Increasing the LS water temperature improves heavy oil recovery by achieving higher sweep efficiency and improving oil mobility through lower viscosity. Steam flooding projects have reported many problems, such as steam gravity override, but override can be lessened if the steam is alternated with hot LS water. In this study, a series of reservoir sandstone cores was obtained from the Bartlesville Sandstone (in eastern Kansas) and aged with heavy crude oil (from the same reservoir) at 95°C for 45 days. Five reservoir …
Ad hoc networks are a generation of truly wireless networks that can be constructed easily without any operator. Protocols exist for managing these networks, and one of the most important elements determining their effectiveness is Quality of Service (QoS). In this work, the QoS performance of MANETs is evaluated by comparing the results of the AODV, DSR, OLSR, and TORA routing protocols in the Op-Net Modeler, followed by an extensive set of performance experiments on these protocols under a wide variety of settings. The results show that the best protocol depends on the QoS measured with two types of applications (+ve and −ve QoS in the FIS evaluation); the QoS of a protocol varies from one protocol …
The limitations of wireless sensor nodes are power, computational capability, and memory. This paper suggests a method to reduce the power consumed by a sensor node. The work is based on an analogy between the routing problem and the distribution of an electric field in a physical medium with a given charge density. From this analogy a partial differential equation (Poisson's equation) is obtained, and a finite difference method is used to solve it numerically. A parallel implementation is then presented, based on domain decomposition: the original computational domain is decomposed into several blocks, each of which is assigned to a processing element. All nodes then execute computations in parallel …
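The serial kernel that such a scheme parallelises can be shown compactly. The sketch below (not the paper's implementation) solves Poisson's equation ∇²u = −ρ on a square grid with zero Dirichlet boundaries, using Jacobi iterations of the 5-point finite-difference stencil; a single point "charge" plays the role of the charge density. In the domain-decomposition version, each block would run this same update on its sub-grid and exchange boundary rows with its neighbours each iteration.

```python
import numpy as np

def solve_poisson(rho, h=1.0, iters=2000):
    """Jacobi iteration for laplacian(u) = -rho, u = 0 on the boundary.
    5-point stencil: u_ij = (sum of 4 neighbours + h^2 * rho_ij) / 4."""
    u = np.zeros_like(rho)
    for _ in range(iters):
        u[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                                u[1:-1, :-2] + u[1:-1, 2:] +
                                h * h * rho[1:-1, 1:-1])
    return u

n = 33
rho = np.zeros((n, n))
rho[n // 2, n // 2] = 1.0     # a single point charge at the centre
u = solve_poisson(rho)
```

The resulting potential peaks at the charge and decays toward the boundary; in the routing analogy, that field gradient is what steers traffic away from heavily loaded (high-"charge") regions.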