Multiple elimination (de-multiple) is a seismic processing step that removes multiple reflections and delineates the true primary reflectors. One way to eliminate multiples is to flatten the primaries with normal moveout (NMO) correction and then transform the data to the frequency-wavenumber (f-k) domain. The flattened primaries align with the zero-wavenumber axis of the f-k domain, while other events (multiples and random noise) are distributed elsewhere. A dip filter that passes the aligned energy and rejects the rest separates primaries from multiples once the data are transformed back from the f-k domain to the time-distance domain. We therefore refer to this technique as the NMO-f-k domain method for multiple elimination. The method is first tested on a synthetic reflection event to verify its validity and then applied to real X-profile 2D field seismic data from southern Iraq. The results confirm that internal multiples exist in the deep reflection data in Iraq and must be removed so that interpretation of the true reflectors is valid. The final stacked section processed with the NMO-f-k technique shows clearer and sharper reflectors than the conventional NMO stack. All processing steps were carried out with the open-source Madagascar reproducible package, which proved efficient, accurate, and easy to use for the NMO, f-k transform, and dip-filter programs. The aim of the current study is to separate internal multiples and noise from real 2D seismic data.
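The flatten-then-filter idea above can be sketched numerically. The following is a minimal numpy illustration (not the authors' Madagascar workflow): a synthetic gather contains one flat event (an NMO-corrected primary) and one dipping event (a residual multiple); a 2D FFT moves the data to the f-k domain, where a zero-dip pass filter keeps only wavenumbers near zero before inverse-transforming. All sizes and sample positions are arbitrary choices for the demonstration.

```python
import numpy as np

# Minimal synthetic gather: after NMO correction a primary is flat in
# time-offset, while a residual multiple still dips across offset.
nt, nx = 256, 64              # time samples, traces (arbitrary)
data = np.zeros((nt, nx))
data[100, :] = 1.0            # flat (NMO-corrected) primary
for ix in range(nx):
    data[150 + ix, ix] = 1.0  # dipping multiple: 1 sample shift per trace

# 2-D FFT to the f-k domain: a flat event maps entirely onto k = 0,
# while dipping energy spreads across non-zero wavenumbers.
fk = np.fft.fft2(data)

# Zero-dip pass filter: keep only wavenumbers at/near zero, reject the rest.
k = np.fft.fftfreq(nx)
mask = np.abs(k) <= np.abs(k[1])    # pass k = 0 and its nearest bins
filtered = np.real(np.fft.ifft2(fk * mask[np.newaxis, :]))
```

After filtering, the flat primary survives essentially unchanged while the dipping event is strongly attenuated, which is the separation the NMO-f-k method relies on.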
This study aimed to select top stocks using technical-analysis tools, specifically the Williams %R indicator (the "ratio of William" index), and to test the ability of technical analysis to build a portfolio of shares that is efficient in comparison with the market portfolio. This indicator was used to build a portfolio by screening 21 companies under specific conditions, from which 10 companies were chosen, for the period from March 2015 to June 2017. The applied results of the research showed that the yield of the portfolio of companies selected according to the Williams %R indicator was (0.0406).
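The Williams %R oscillator referred to above has a standard definition: within a lookback window it measures where the close sits between the highest high and the lowest low, scaled from 0 (close at the period high) to -100 (close at the period low). A minimal sketch of that formula, with a toy price series rather than the study's data:

```python
import numpy as np

def williams_r(high, low, close, period=14):
    """Williams %R: 0 (close at the period high) down to -100 (at the low)."""
    high, low, close = (np.asarray(a, dtype=float) for a in (high, low, close))
    out = np.full(close.shape, np.nan)
    for i in range(period - 1, len(close)):
        hh = high[i - period + 1:i + 1].max()   # highest high in the window
        ll = low[i - period + 1:i + 1].min()    # lowest low in the window
        out[i] = (hh - close[i]) / (hh - ll) * -100.0
    return out

# Toy series: a steadily rising market and a steadily falling one.
high_up = np.arange(2.0, 22.0)            # 20 bars
low_up = high_up - 1.0
wr_top = williams_r(high_up, low_up, high_up)   # close pinned to the high

high_dn = np.arange(22.0, 2.0, -1.0)
low_dn = high_dn - 1.0
wr_bot = williams_r(high_dn, low_dn, low_dn)    # close pinned to the low
```

In the rising series the close equals the window's highest high, so %R reads 0; in the falling series the close equals the window's lowest low, so %R reads -100.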
A new distribution, the Epsilon Skew Gamma (ESΓ) distribution, first introduced by Abdulah [1], is applied to near-Gamma data. We first restate the ESΓ distribution, its properties, and its characteristics; we then estimate its parameters using maximum likelihood and moment estimators, and finally use these estimators to fit the data with the ESΓ distribution.
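The ESΓ distribution itself is not available in standard libraries, so as a hedged illustration of the moment-estimation step the abstract mentions, the sketch below applies method-of-moments estimators to the plain Gamma special case, where the mean is kθ and the variance is kθ², giving k̂ = mean²/var and θ̂ = var/mean. The shape/scale values and sample size are arbitrary.

```python
import numpy as np

# Simulate "near-Gamma" data and recover the plain Gamma(k, theta)
# parameters by the method of moments:
#   mean = k * theta,  var = k * theta**2
#   =>  k_hat = mean**2 / var,  theta_hat = var / mean
rng = np.random.default_rng(0)
data = rng.gamma(shape=2.0, scale=3.0, size=20000)

m, v = data.mean(), data.var()
k_hat = m * m / v          # shape estimate, close to 2.0
theta_hat = v / m          # scale estimate, close to 3.0
```

The same moment-matching logic extends to the ESΓ case, with extra equations for the skewness parameter.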
Due to easy access to satellite images, Google Earth (GE) images have become more popular than other online virtual globes. However, the popularity of GE is not an indication of its accuracy. A considerable amount of literature has been published on evaluating the positional accuracy of GE data; however, few studies have investigated improving it. In this paper, a practical method for enhancing the horizontal positional accuracy of GE is suggested by establishing ten reference points in the University of Baghdad main campus, using different Global Navigation Satellite System (GNSS) observation techniques: Rapid Static, Post-Processing Kinematic, and Network. Then, the GE image for the study
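The abstract does not specify the correction model, but one common way to use such GNSS reference points is a 2-D similarity (Helmert) transform estimated by least squares, which warps GE coordinates onto the GNSS control. The sketch below is a hypothetical illustration of that approach with ten simulated control points, not the paper's actual adjustment.

```python
import numpy as np

def fit_similarity(src, dst):
    """Least-squares 2-D similarity (Helmert) fit of src -> dst control points.
    Model: x' = a*x - b*y + tx,  y' = b*x + a*y + ty."""
    n = src.shape[0]
    A = np.zeros((2 * n, 4))
    A[0::2, 0] = src[:, 0];  A[0::2, 1] = -src[:, 1];  A[0::2, 2] = 1.0
    A[1::2, 0] = src[:, 1];  A[1::2, 1] =  src[:, 0];  A[1::2, 3] = 1.0
    params, *_ = np.linalg.lstsq(A, dst.reshape(-1), rcond=None)
    return params                     # a, b, tx, ty

# Ten hypothetical control points: "GE" coordinates vs. "GNSS" coordinates
# related by a tiny rotation/scale and a metre-level shift.
rng = np.random.default_rng(1)
src = rng.uniform(0.0, 1000.0, size=(10, 2))
a_t, b_t, tx_t, ty_t = 1.00002, 0.00005, 3.2, -1.7
dst = np.column_stack([a_t * src[:, 0] - b_t * src[:, 1] + tx_t,
                       b_t * src[:, 0] + a_t * src[:, 1] + ty_t])
a, b, tx, ty = fit_similarity(src, dst)
```

With the parameters recovered from the control points, every GE coordinate can be re-projected, shrinking the systematic horizontal offset.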
This article explores the process of VGI collection by assessing the relative usability and accuracy of a range of data-collection methods (smartphone GPS, tablet, and analogue maps) among different demographic and educational groups and in different geographical contexts. Positional accuracy, completeness, and data collectors' experiences are assessed with reference to the official cadastral data and the administration system in a case-study region of Iraq. Ownership data were validated by crowd agreement. The results show that successful VGI projects have access to varying data collection methods.
This paper deals with estimating values at unmeasured points when the spatial sample is small, a situation unfavourable for estimation: the larger the data set, the better the estimates at unmeasured points and the smaller the estimation variance. The idea of this paper is therefore to take advantage of secondary (auxiliary) data that are strongly correlated with the primary (basic) data in order to estimate the unmeasured points, as well as to measure the estimation variance. The co-kriging technique is used in this setting to build the spatial predictions, and the idea is then applied to real data.
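The co-kriging estimator described above can be sketched in a few lines. The version below is simple co-kriging (known means) on a 1-D transect under an assumed intrinsic coregionalization model: an exponential covariance with the same range for both variables and a cross-sill ρ·σ_Z·σ_Y. The sills, correlation, range, and sample locations are all illustrative assumptions; in a real study they would be fitted to the (cross-)variograms.

```python
import numpy as np

def exp_cov(h, sill, a=60.0):
    """Assumed exponential covariance model with range parameter a."""
    return sill * np.exp(-np.abs(h) / a)

def simple_cokriging(x0, xz, z, xy, y, sz2=1.0, sy2=1.0, rho=0.8,
                     mz=0.0, my=0.0):
    """Simple co-kriging of the primary Z at x0 from sparse Z samples plus
    a correlated auxiliary variable Y (intrinsic coregionalization model)."""
    xs = np.concatenate([xz, xy])
    nz, n = len(xz), len(xz) + len(xy)
    H = np.abs(xs[:, None] - xs[None, :])
    czy = rho * np.sqrt(sz2 * sy2)            # cross sill
    C = np.empty((n, n))
    C[:nz, :nz] = exp_cov(H[:nz, :nz], sz2)   # Z-Z block
    C[nz:, nz:] = exp_cov(H[nz:, nz:], sy2)   # Y-Y block
    C[:nz, nz:] = exp_cov(H[:nz, nz:], czy)   # Z-Y cross block
    C[nz:, :nz] = C[:nz, nz:].T
    c0 = np.concatenate([exp_cov(xz - x0, sz2), exp_cov(xy - x0, czy)])
    w = np.linalg.solve(C, c0)                # co-kriging weights
    return mz + w @ np.concatenate([z - mz, y - my])

# Sparse primary samples and denser auxiliary samples along a transect.
xz = np.array([0.0, 50.0, 120.0]); z = np.array([1.0, 2.0, 1.5])
xy = np.linspace(5.0, 145.0, 8);   y = np.sin(xy / 40.0)
est = simple_cokriging(50.0, xz, z, xy, y, mz=z.mean(), my=y.mean())
```

Because the covariance model has no nugget, the estimator reproduces the primary data exactly at sampled locations, and between them the auxiliary samples pull the estimate toward the correlated secondary field.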
In this study, out of a total of 856 mastitis cases in lactating ewes, only 34 Streptococcus agalactiae isolates showed various types of resistance to three antibiotics (penicillin, erythromycin, and tetracycline). The St. agalactiae isolates were identified according to standard methods, including a newly suggested technique using a specific chromogenic agar. Antibiotic resistance was clearly identified using the MIC microplate assay (dilution method). In addition, real-time PCR determined that three antibiotic-resistance genes were present (pbp2b, tetO, and mefA). The highest percentage of isolates carried a single gene, the tetracycline gene (20.59%), followed by penicillin