Multiple elimination (de-multiple) is one of the seismic processing steps used to remove multiple reflections and delineate the correct primary reflectors. Using normal moveout (NMO) to flatten the primaries makes it possible to eliminate multiples by transforming the data to the frequency-wavenumber (f-k) domain: the flattened primaries align with the zero-wavenumber axis of the f-k domain, while all other events (multiples and random noise) are distributed elsewhere. A dip filter that passes the aligned energy and rejects the rest then separates primaries from multiples once the data are transformed back from the frequency-wavenumber domain to the time-distance domain. For this reason, the technique is here called the normal moveout/frequency-wavenumber domain (NMO-FK) method for multiple elimination. The method is first tested on a synthetic reflection event to verify its validity, and then applied to a real 2D field seismic line (X-profile) from southern Iraq. The results confirm that internal multiples are present in the deep reflection data in Iraq and must be removed so that interpretation of the true reflectors is valid. The final stacked seismic section processed with the NMO-FK technique shows clearer and sharper reflectors than the conventional NMO stack. All processing steps of this study were carried out with the open-source Madagascar reproducible package, which proved efficient, accurate, and easy to use for implementing the NMO, f-k transform, and dip-filter programs. The aim of the current study is to separate internal multiples and noise from real 2D seismic data.
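The flatten-then-filter idea described above can be sketched in a few lines of NumPy. This is a minimal illustration of an f-k dip filter on a synthetic gather, not the Madagascar implementation used in the study; the array sizes, event positions, and the `k_keep` parameter are all invustrative assumptions.

```python
import numpy as np

def fk_dip_filter(gather, k_keep=2):
    """Separate flat (NMO-corrected) primaries from dipping events.

    gather : 2-D array (time samples x traces); primaries assumed flattened by NMO.
    k_keep : half-width (in wavenumber bins) of the pass zone around k = 0.
    """
    # Forward 2-D FFT: time -> frequency (axis 0), offset -> wavenumber (axis 1).
    FK = np.fft.fft2(gather)
    # Flat events map onto the k = 0 axis; dipping multiples spread elsewhere.
    mask = np.zeros_like(FK)
    mask[:, :k_keep + 1] = 1.0    # wavenumbers at and just above zero
    mask[:, -k_keep:] = 1.0       # negative wavenumbers near zero
    primaries = np.real(np.fft.ifft2(FK * mask))
    residual = gather - primaries  # estimate of multiples + noise
    return primaries, residual

# Synthetic test: one flat event (primary) plus one dipping event (multiple).
nt, nx = 128, 64
data = np.zeros((nt, nx))
data[40, :] = 1.0                  # flat primary at sample 40
for ix in range(nx):
    data[60 + ix // 2, ix] += 1.0  # dipping "multiple"
prim, res = fk_dip_filter(data, k_keep=2)
```

The flat event survives in `prim` almost unchanged, while most of the dipping event's energy ends up in `res`, mirroring the pass/reject behaviour of the dip filter described above.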
This study focused on treating wastewater to remove phosphorus by adsorption onto natural and local materials. Burned kaolin, porcelanite, bauxite, and limestone were selected to be tested as adsorption materials. The adsorption isotherms were evaluated by batch experiments, studying the effects of pH, temperature, and initial phosphorus concentration. The results showed that at pH 6, a temperature of 20°C, and a 300 mg/l initial phosphorus concentration, the sorption capacity at 10 h contact time was 0.61, 9, 10, and 13 mg/g for burned kaolin, porcelanite, limestone, and bauxite, respectively. As the pH increased from 2 to 10, the removal efficiency of the materials differed in behaviour: it increased from 40 to 90% for limestone, and dec
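As an illustration of how batch isotherm data of this kind are typically reduced, the sketch below computes the batch sorption capacity and fits a linearized Langmuir isotherm. The numbers are synthetic and invented for the demonstration; they are not the study's measurements, and the Langmuir form is only one common isotherm choice.

```python
import numpy as np

def sorption_capacity(c0, ce, volume_l, mass_g):
    """Batch sorption capacity q_e = (C0 - Ce) * V / m  [mg/g]."""
    return (c0 - ce) * volume_l / mass_g

def langmuir_fit(ce, qe):
    """Linearized Langmuir isotherm: Ce/qe = Ce/qmax + 1/(KL*qmax).

    Least-squares fit of Ce/qe against Ce; returns (qmax, KL).
    """
    ce, qe = np.asarray(ce, float), np.asarray(qe, float)
    slope, intercept = np.polyfit(ce, ce / qe, 1)
    qmax = 1.0 / slope
    KL = slope / intercept
    return qmax, KL

# Illustrative (synthetic) equilibrium data, generated from qmax = 13 mg/g,
# KL = 0.02 L/mg -- NOT experimental values from the study.
ce = np.array([10.0, 50.0, 100.0, 200.0, 300.0])
qe = 13.0 * 0.02 * ce / (1.0 + 0.02 * ce)
qmax, KL = langmuir_fit(ce, qe)
```

The same `sorption_capacity` helper reproduces, for example, a 13 mg/g capacity from a 300 mg/l feed equilibrating at 40 mg/l in 0.1 L with 2 g of sorbent.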
The presence of residual antibiotics in water results in the development of antibiotic-resistant genes. Available wastewater treatment systems are not capable of removing such antibiotics from sewage, so antibiotics need to be removed before the discharge of wastewater. Adsorption is among the promising techniques for wastewater treatment, aiding the removal of a wide range of organic and inorganic pollutants. The present work is a contribution to the search for an economical method for removing low concentrations of amoxicillin (AMX) from water by adsorption on water treatment residue (WTR) taken from a local drinking water facility. The chemical composition and the adsorptive characteristics of the material were first
The removal of boron from aqueous solution was carried out by electrocoagulation (EC) using magnesium electrodes as the anode and stainless steel electrodes as the cathode. The effects of several operating parameters on the boron removal efficiency were investigated, such as initial pH, current density, initial boron ion concentration, NaCl concentration, spacing between electrodes, electrode material, and the presence of carbonate. The optimum removal efficiency of 91.5% was achieved at a current density of 3 mA/cm² and pH = 7 using (Mg/St.St.) electrodes, within 45 min of operating time. The NaCl concentration was 0.1 g/l with a 0.5 cm spacing between the electrodes. First- and second-order rate equations were applied to study the adsorp
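The first- and second-order rate fits mentioned above are usually carried out on their linearized forms. The sketch below shows both fits on synthetic uptake data; the rate constants and capacities are invented for the demonstration and are not results from the study.

```python
import numpy as np

def pseudo_first_order(t, qt, qe):
    """Pseudo-first-order fit: ln(qe - qt) = ln(qe) - k1*t. Returns k1."""
    slope, _ = np.polyfit(t, np.log(qe - qt), 1)
    return -slope

def pseudo_second_order(t, qt):
    """Pseudo-second-order fit: t/qt = 1/(k2*qe^2) + t/qe. Returns (k2, qe)."""
    slope, intercept = np.polyfit(t, t / qt, 1)
    qe = 1.0 / slope
    k2 = slope ** 2 / intercept
    return k2, qe

# Synthetic kinetics consistent with a second-order model (illustrative only):
t = np.array([5.0, 10.0, 20.0, 30.0, 45.0])
qe_true, k2_true = 8.0, 0.01
qt = k2_true * qe_true ** 2 * t / (1.0 + k2_true * qe_true * t)
k2, qe = pseudo_second_order(t, qt)

# Synthetic first-order data (qe = 5 mg/g, k1 = 0.1 1/min):
qt1 = 5.0 * (1.0 - np.exp(-0.1 * t))
k1 = pseudo_first_order(t, qt1, 5.0)
```

Comparing the correlation coefficients of the two linearized fits is the usual way to decide which rate law better describes the uptake data.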
In the present work an advanced oxidation process, the photo-Fenton (UV/H2O2/Fe+2) system, was investigated for the treatment of wastewater contaminated with oil. The reaction was influenced by the input concentration of hydrogen peroxide (H2O2), the initial amount of the iron catalyst (Fe+2), pH, temperature, and the concentration of oil in the wastewater. The removal efficiency of the UV/H2O2/Fe+2 system at the optimal conditions and dosage (H2O2 = 400 mg/L, Fe+2 = 40 mg/L, pH = 3, temperature = 30°C) for a 1000 mg/L oil load was found to be 72%.
The removal of anti-inflammatory drugs, namely acetaminophen (ACTP), from wastewater by a bulk liquid membrane (BLM) process using Aliquat 336 (QCl) as a carrier was investigated. The effects of several parameters on the extraction efficiency were studied in this research, such as the initial feed phase concentration of ACTP (10–50 ppm), stripping phase (NaCl) concentration (0.3, 0.5, 0.7 M), temperature (30–50°C), volume ratio of feed phase to membrane phase (200–400 ml/80 ml), agitation speed of the feed phase (75–125 rpm), membrane stirring speed (0, 100, 150 rpm), carrier concentration (1, 5, 9 wt%), pH of the feed (2, 4, 6, 8, 10), and solvent type (CCl4 and n-heptane). The study shows that high ext
Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms such as
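Entropy discretization, as mentioned above, turns a continuous attribute into intervals by choosing cut points that keep class labels as pure as possible. The sketch below shows the core single-cut step (the full algorithm applies it recursively with a stopping criterion such as MDL); it is a generic textbook illustration, not the paper's multi-resolution implementation, and the data are invented.

```python
import numpy as np
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a sequence of class labels."""
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def best_entropy_cut(values, labels):
    """Pick the cut point minimizing the weighted class entropy of the
    two bins it creates; returns (cut_value, weighted_entropy)."""
    order = np.argsort(values)
    v, y = np.asarray(values)[order], np.asarray(labels)[order]
    n = len(v)
    best_cut, best_e = None, float("inf")
    for i in range(1, n):
        if v[i] == v[i - 1]:
            continue  # only cut between distinct attribute values
        e = (i * entropy(y[:i]) + (n - i) * entropy(y[i:])) / n
        if e < best_e:
            best_cut, best_e = (v[i - 1] + v[i]) / 2.0, e
    return best_cut, best_e

# Toy attribute whose low values belong to class "a" and high values to "b".
values = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]
labels = ["a", "a", "a", "b", "b", "b"]
cut, e = best_entropy_cut(values, labels)
```

For this toy data the best cut falls midway between 3.0 and 10.0 and yields zero residual entropy, since each bin is single-class.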
Permeability data are of major importance in all reservoir simulation studies. Their importance increases in mature oil and gas fields because of their sensitivity to the requirements of certain improved-recovery methods. However, the industry holds a huge store of air permeability measurements against only a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation to convert air permeability data, which are conventionally measured during laboratory core analysis, into liquid permeability. This correlation provides a feasible estimate in cases of data loss and for poorly consolidated formations, or in cas
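The best-known conversion of this kind is the classic Klinkenberg gas-slippage correction, sketched below purely as a generic example of an air-to-liquid permeability conversion; it is NOT the correlation proposed in the study above, whose form is not given here, and the gas-slippage factor `b` is an assumed input.

```python
import numpy as np

def klinkenberg_liquid_perm(k_air, mean_pressure, b):
    """Klinkenberg correction: k_air = k_liquid * (1 + b / Pm),
    so k_liquid = k_air / (1 + b / Pm).

    k_air         : measured air permeability (e.g. mD)
    mean_pressure : mean flowing pressure Pm of the measurement
    b             : gas-slippage factor (same pressure units as Pm),
                    an assumed, rock- and gas-dependent parameter
    """
    return np.asarray(k_air) / (1.0 + b / mean_pressure)
```

For example, an air reading of 110 mD at a mean pressure of 10 (with b = 1 in the same units) corrects to a liquid-equivalent permeability of 100 mD. In practice `b` is obtained by measuring k_air at several mean pressures and extrapolating to infinite pressure.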
A non-stationary series is a persistent problem in statistical analysis: as theoretical work has shown, the properties of regression analysis are lost when non-stationary series are used, yielding the slope of a spurious relationship between the variables under consideration. A non-stationary series can be made stationary by adding a time variable to the multivariate analysis to remove the general trend, together with seasonal dummy variables to remove the seasonal effect; by converting the data to exponential or logarithmic form; or by applying the difference operator repeatedly, d times, in which case the series is said to be integrated of order d. On the theoretical side, the research is organized in parts; the first part presents the research methodology
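The repeated-differencing idea above can be sketched directly: a series that becomes stationary after d differences is integrated of order d, written I(d). The example series below is invented for illustration.

```python
import numpy as np

def difference(series, d=1):
    """Apply d-th order differencing, x_t - x_{t-1}, repeated d times.
    A series that is stationary after d differences is I(d)."""
    x = np.asarray(series, dtype=float)
    for _ in range(d):
        x = np.diff(x)  # each pass shortens the series by one observation
    return x

# A series with a pure linear trend is I(1): one difference removes the trend
# and leaves a constant (hence stationary) sequence.
t = np.arange(100)
trend_series = 2.0 * t + 5.0
flat = difference(trend_series, d=1)
```

After one difference the trend series collapses to the constant slope 2.0, illustrating why detrending by differencing restores the conditions regression analysis requires.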
Business organizations have faced many challenges in recent times, the most important of which is information technology, because it is widespread and easy to use. Its use has led to an increase in the amount of data that business organizations deal with in an unprecedented manner. The amount of data available through the internet is a problem for which many parties seek solutions: why is it available there in such a huge, random quantity? Many forecasts revealed that by 2017 the number of devices connected to the internet would be an estimated three times the population of the Earth, and that in 2015 more than one and a half billion gigabytes of data were transferred every minute globally. Thus, the so-called data mining emerged as a
A number of compression schemes have been put forward to achieve high compression factors with high image quality at low computational time. In this paper, a combined transform coding scheme is proposed, based on the discrete wavelet transform (DWT) and the discrete cosine transform (DCT) with an added enhancement, the sliding run length encoding (SRLE) technique, to further improve compression. The advantages of the wavelet and discrete cosine transforms were utilized to encode the image. The first step involves transforming the color components of the image from RGB to YUV planes to take advantage of the existing spectral correlation and consequently gain more compression. DWT is then applied to the Y, U and V col
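The transform-then-encode idea above can be sketched with stand-ins: a one-level Haar wavelet for the DWT stage and plain run-length encoding in place of the paper's SRLE (a sliding variant not specified here); the DCT stage and the RGB-to-YUV step are omitted for brevity, and the test image is synthetic.

```python
import numpy as np

def haar_dwt2(img):
    """One level of a 2-D Haar wavelet transform (a generic stand-in for
    the DWT stage; the paper's exact wavelet is not specified here)."""
    a = (img[:, 0::2] + img[:, 1::2]) / 2.0   # horizontal averages
    d = (img[:, 0::2] - img[:, 1::2]) / 2.0   # horizontal details
    rows = np.hstack([a, d])
    a2 = (rows[0::2, :] + rows[1::2, :]) / 2.0  # vertical averages
    d2 = (rows[0::2, :] - rows[1::2, :]) / 2.0  # vertical details
    return np.vstack([a2, d2])

def run_length_encode(seq):
    """Plain run-length encoding of a 1-D sequence as [value, count] pairs
    (the paper's SRLE is a sliding refinement of this idea)."""
    out = []
    for v in seq:
        if out and out[-1][0] == v:
            out[-1][1] += 1
        else:
            out.append([v, 1])
    return out

# Smooth synthetic 8x8 image: transform, quantize, then run-length encode.
img = np.tile(np.arange(8, dtype=float), (8, 1))
coeffs = haar_dwt2(img)
quant = np.round(coeffs).astype(int).ravel()
rle = run_length_encode(quant.tolist())
```

Because a smooth image concentrates energy in a few low-frequency coefficients, the quantized transform is dominated by runs of zeros, which is exactly what the run-length stage exploits.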