Prediction of formation pore and fracture pressures before constructing a well drilling program is crucial, since it helps prevent several drilling problems, including lost circulation, kicks, pipe sticking, and blowouts. IP (Interactive Petrophysics) software is used to calculate pore and fracture pressure. The Eaton, Matthews and Kelly, Modified Eaton, and Barker and Wood equations are used to calculate fracture pressure, whereas only the Eaton method is used to estimate pore pressure. These approaches are based on log data obtained from six wells: three from the north dome (BUCN-52, BUCN-51, BUCN-43) and three from the south dome (BUCS-49, BUCS-48, BUCS-47). Gamma ray, density, resistivity, and sonic log data are required, along with the overburden pressure gradient and clay volume, which were established first. A key consideration in well design is the forecasting of fracture pressure for wells drilled in the southern Iraqi oilfield of Buzurgan. Pressure abnormalities are found in the MA, MB21, MC1, and MC2 units based on the pore pressures calculated from the resistivity log. In these units, the depths and their equivalent normal and abnormal pressures are detected for all six selected wells: BUCS-47, BUCS-48, BUCS-49, BUCN-43, BUCN-51, and BUCN-52. For the MA, MB21, MC1, and MC2 units, the highest differences in pore pressure are 1698 psi @ 3750 m (BUCN-51), 3420 psi @ 3900 m (BUCN-51), 788 psi @ 3980 m (BUCS-49), and 5705 psi @ 4020 m (BUCN-52), respectively. On the other hand, the MB11 and MB12 units show a normal pressure trend in all studied wells. Finally, the results show that the highest pore and fracture pressure values occur in the north dome, compared with those obtained in the south dome of the Mishrif reservoir in the Buzurgan oilfield.
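The abstract does not reproduce the pressure relations themselves; the sketch below illustrates the commonly published forms of the Eaton pore pressure (resistivity) method and the Eaton and Matthews and Kelly fracture pressure equations on which such a workflow is typically based. The exponent of 1.2, the Poisson's ratio, the matrix stress coefficient, and all numerical inputs are illustrative assumptions, not values taken from the study.

```python
import numpy as np

def eaton_pore_pressure(obg, p_normal, r_obs, r_normal, exponent=1.2):
    """Eaton's resistivity method: pore pressure rises above the normal
    (hydrostatic) value where the observed shale resistivity falls below
    the normal compaction trend."""
    return obg - (obg - p_normal) * (r_obs / r_normal) ** exponent

def eaton_fracture_pressure(obg, pore_pressure, poisson_ratio):
    """Eaton fracture pressure from overburden, pore pressure and Poisson's ratio."""
    k = poisson_ratio / (1.0 - poisson_ratio)
    return k * (obg - pore_pressure) + pore_pressure

def matthews_kelly_fracture_pressure(obg, pore_pressure, ki):
    """Matthews and Kelly form: ki is the empirical matrix stress coefficient."""
    return ki * (obg - pore_pressure) + pore_pressure

# Illustrative (not field) values, all pressures in psi:
obg = np.array([9500.0, 9700.0])     # overburden pressure
p_n = np.array([5600.0, 5700.0])     # normal (hydrostatic) pore pressure
r_obs = np.array([0.9, 0.6])         # observed shale resistivity, ohm.m
r_n = np.array([1.2, 1.2])           # normal-trend resistivity, ohm.m

pp = eaton_pore_pressure(obg, p_n, r_obs, r_n)
fp = eaton_fracture_pressure(obg, pp, poisson_ratio=0.4)
print(pp, fp)
```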
The current study was designed to investigate the impact of the missense single nucleotide polymorphism (SNP) Asn291Ser (c.872A>G; rs12470652) of the LHR gene (luteinizing hormone receptor gene) in peripheral blood samples of Iraqi infertile women diagnosed with premature ovarian failure (POF) and normosmic idiopathic hypogonadotropic hypogonadism (niHH; patients with a normal sense of smell). Following the hormonal analysis, fifty women diagnosed with premature ovarian failure and fifty women diagnosed with normosmic idiopathic hypogonadotropic hypogonadism were included as patient groups, while fifty healthy fertile women were enrolled as a control group. The blood samples were obtained from the patient and control groups at Kamal Al-Samarra …
The study aims to demonstrate the scientific benefit offered by modern electronic programs for various scientific research methods, while determining the positive scientific role played by these programs in modernizing the methodologies and logic of scientific thinking, especially given the rapid development of the sciences and their curricula.
These programs link accurately with scientific results. The importance of the study lies in providing practical mechanisms that highlight the scientific contribution of electronic programs at the various steps of scientific research.
A case study was conducted on Tropes version 8.4, which analyzes written, audio, and visual semantic texts and presents a set of statistical results that facilitate the difficult …
Multiple elimination (de-multiple) is one of the seismic processing steps used to remove the effects of multiples and delineate the correct primary reflections. Applying normal moveout to flatten the primaries is the way to eliminate multiples after transforming the data to the frequency-wavenumber (f-k) domain. The flattened primaries align with the zero-wavenumber axis of the f-k domain, while all other reflection types (multiples and random noise) are distributed elsewhere. A dip filter applied to pass the aligned data and reject the rest separates primaries from multiples once the data are transformed back from the f-k domain to the time-distance domain. For that reason, a suggested name for this technique is the normal moveout-frequency-wavenumber domain …
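The abstract gives no implementation detail for the dip filter; a minimal sketch of the idea, assuming an already NMO-corrected gather and an illustrative pass band around zero wavenumber, might look like this.

```python
import numpy as np

def fk_demultiple(gather, dx, k_pass=0.05):
    """Sketch of f-k dip filtering: after NMO correction, primaries are flat
    and map onto k ~ 0 in the frequency-wavenumber domain, while residual
    moveout of multiples and random noise maps away from k = 0 and is muted.

    gather : 2D array (time samples x traces), already NMO-corrected
    dx     : trace spacing
    k_pass : half-width of the pass band as a fraction of the Nyquist wavenumber
    """
    fk = np.fft.fft2(gather)                     # transform to the f-k domain
    k = np.fft.fftfreq(gather.shape[1], d=dx)    # wavenumber axis
    k_nyq = 0.5 / dx
    mask = np.abs(k) <= k_pass * k_nyq           # keep energy near k = 0
    fk_filtered = fk * mask[np.newaxis, :]       # reject dipping (multiple) energy
    return np.real(np.fft.ifft2(fk_filtered))    # back to the time-distance domain

# Illustrative call on a synthetic NMO-corrected gather:
gather = np.random.randn(512, 96)
primaries = fk_demultiple(gather, dx=25.0)
```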
The Mukdadiya Formation represents one of the formations that cover a huge area of Iraq. It contains several clastic deposits, such as sandstone, siltstone, and a noticeable amount of gravel. The gravels are considered the hallmark for differentiating between the Injana and Mukdadiya formations. Therefore, the current study focused on these facies to determine the petrography, paleontology, and origin of the Mukdadiya deposits. The results of SEM-EDX and XRD analyses showed two types of gravels, namely siliceous and lime gravels. The highest percentage of gravels belonged to a sedimentary origin (limestone). Si, Ca, and Fe represented the common elements that formed the studied gravels. The pale…
This paper proposes an authentication system between business partners in e-commerce applications to prevent fraud operations, based on visual cryptography shares encapsulated by a Chen hyperchaotic key sequence. The proposed system consists of three phases: the first phase is based on color visual cryptography without complex computations; the second phase generates a sequence of DNA rule numbers; and the final encapsulation phase uses the unique initial value generated in the second phase as the initial condition of a Piecewise Linear Chaotic Map to generate sequences of DNA rule numbers. The experimental results demonstrate that the proposed system is able to overcome cheating a…
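The abstract does not give the map equations or parameters; a minimal sketch of a piecewise linear chaotic map driving DNA rule selection, with an illustrative control parameter p and initial value, is shown below (the mapping of the chaotic state to rule numbers 1-8 is an assumption).

```python
def pwlcm(x, p=0.3):
    """One iteration of the piecewise linear chaotic map on (0, 1)."""
    if x >= 0.5:
        x = 1.0 - x                       # the map is symmetric about x = 0.5
    if x < p:
        return x / p
    return (x - p) / (0.5 - p)

def dna_rule_sequence(x0, n, p=0.3):
    """Iterate the PWLCM from initial value x0 and map each state to one of
    the eight DNA encoding rules."""
    rules, x = [], x0
    for _ in range(n):
        x = pwlcm(x, p)
        rules.append(int(x * 8) % 8 + 1)  # rule numbers 1..8
    return rules

print(dna_rule_sequence(x0=0.735, n=10))
```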
Demand for Non-Photorealistic Rendering (NPR) has increased with the development of electronic devices. This paper presents a new model for a cartooning system, an essential category of NPR. It uses the concepts of vector quantization and Logarithmic Image Processing (LIP). An enhancement of the Kekre Median Codebook Generation (KMCG) algorithm has been proposed and used by the system. Several metrics were utilized to evaluate the time and quality of the system. The results showed that the proposed system reduced the time of cartoon production and enhanced the quality of several aspects such as smoothing, color reduction, and brightness.
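The KMCG enhancement itself is not described in the abstract; the sketch below shows a plain median-split vector quantization codebook builder in the same spirit (not the authors' enhanced algorithm), assuming RGB pixel vectors as training data.

```python
import numpy as np

def median_split_codebook(pixels, levels=4):
    """Simplified median-split VQ codebook: repeatedly sort each cluster on one
    colour component, split at the median, and use cluster means as codewords.

    pixels : (N, 3) array of RGB training vectors
    levels : number of split rounds; codebook size is 2**levels
    """
    clusters = [pixels]
    for level in range(levels):
        axis = level % pixels.shape[1]                # cycle through R, G, B
        split = []
        for c in clusters:
            c = c[np.argsort(c[:, axis])]             # sort on the chosen component
            mid = len(c) // 2
            split.extend([c[:mid], c[mid:]])          # split at the median
        clusters = split
    return np.array([c.mean(axis=0) for c in clusters])

# Quantize an image with the codebook (colour-reduction step of cartooning):
img = np.random.rand(64, 64, 3)
codebook = median_split_codebook(img.reshape(-1, 3), levels=4)
idx = np.argmin(((img.reshape(-1, 1, 3) - codebook) ** 2).sum(-1), axis=1)
cartoon = codebook[idx].reshape(img.shape)
```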
Hiding secret information in an image is a challenging and painstaking task in computer security and steganography systems. Certainly, the absolute intricacy of attacks on security systems makes it more attractive. In this research on a steganography system involving information hiding, Huffman coding is used to compress the secret code before embedding, which provides high capacity and some security. Fibonacci decomposition is used to represent the pixels of the cover image, which increases the robustness of the system. One byte is used for mapping all the pixel properties. This makes the PSNR of the system higher due to the random distribution of the embedded bits. Finally, three kinds of evaluation are applied, such as PSNR, the chi-square attack, a…
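The exact pixel representation is not spelled out in the abstract; a minimal sketch of Fibonacci (Zeckendorf) decomposition of an 8-bit pixel value, which yields the Fibonacci "bit planes" into which secret bits can be embedded, is shown below. The 12-term basis is chosen simply because it covers values up to 255.

```python
FIBS = [1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233]   # covers 0..255

def to_fibonacci(value):
    """Greedy Zeckendorf decomposition of an 8-bit value into 12 Fibonacci
    bits (most significant first); no two adjacent bits are both 1."""
    bits = []
    for f in reversed(FIBS):
        if f <= value:
            bits.append(1)
            value -= f
        else:
            bits.append(0)
    return bits

def from_fibonacci(bits):
    """Rebuild the pixel value from its Fibonacci bit representation."""
    return sum(f for f, b in zip(reversed(FIBS), bits) if b)

p = 200
bits = to_fibonacci(p)      # a secret bit would be embedded in a low-order plane
assert from_fibonacci(bits) == p
```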
The fact that the signature is widely used as a means of personal verification emphasizes the need for an automatic verification system. Verification can be performed either offline or online, based on the application. Offline systems work on the scanned image of a signature. In this paper, an offline verification system for handwritten signatures that uses a set of simple shape-based geometric features is presented. The features used are Mean, Occupancy Ratio, Normalized Area, Center of Gravity, Pixel Density, Standard Deviation, and Density Ratio. Before extracting the features, preprocessing of the scanned image is necessary to isolate the signature part and to remove any spurious noise present. Features are extracted for the whole signature …
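The abstract names the features but not their formulas; the sketch below computes plausible versions of a few of them from a preprocessed, binarized signature image. These definitions are assumptions for illustration, not the paper's own.

```python
import numpy as np

def signature_features(binary):
    """A few shape-based features for a binary signature image
    (1 = ink, 0 = background); definitions are illustrative assumptions."""
    h, w = binary.shape
    ys, xs = np.nonzero(binary)
    ink = len(xs)
    box_area = (ys.max() - ys.min() + 1) * (xs.max() - xs.min() + 1)
    return {
        "mean": binary.mean(),                           # mean pixel value
        "occupancy_ratio": ink / float(box_area),        # ink over bounding-box area
        "normalized_area": ink / float(h * w),           # ink over whole image area
        "center_of_gravity": (xs.mean() / w, ys.mean() / h),
        "std_deviation": (xs.std() / w, ys.std() / h),   # spread of ink pixels
        "density_ratio": binary[: h // 2].sum() / max(binary[h // 2:].sum(), 1),
    }

# Usage on a thresholded signature scan (random array stands in for a real scan):
sig = (np.random.rand(120, 300) > 0.95).astype(np.uint8)
print(signature_features(sig))
```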
Automatic recognition of individuals is very important in the modern era. Biometric techniques have emerged as an answer to the problem of automatic individual recognition. This paper presents a technique for pupil detection that combines simple morphological operations with the Hough Transform (HT). The circular area of the eye and pupil is segmented by the morphological filter and the Hough Transform, and the local iris area is converted into a rectangular block for the purpose of calculating inconsistencies in the image. This method is implemented and tested on the Chinese Academy of Sciences (CASIA V4) iris image database of 249 persons and the IIT Delhi (IITD) iris …
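A minimal sketch of the described pipeline (simple morphology followed by the circular Hough Transform), assuming an 8-bit grayscale eye image and illustrative parameter values, might look like this.

```python
import cv2
import numpy as np

def detect_pupil(gray):
    """Locate the pupil in a grayscale eye image: grayscale opening removes
    small bright specular reflections inside the dark pupil region, then the
    circular Hough Transform finds the pupil boundary."""
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
    cleaned = cv2.morphologyEx(gray, cv2.MORPH_OPEN, kernel)
    blurred = cv2.medianBlur(cleaned, 5)
    circles = cv2.HoughCircles(
        blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=gray.shape[0] // 2,
        param1=100, param2=30, minRadius=15, maxRadius=80)
    if circles is None:
        return None
    x, y, r = np.round(circles[0, 0]).astype(int)
    return x, y, r                       # pupil centre and radius

# Usage on an eye image (the file path is hypothetical):
# eye = cv2.imread("eye.jpg", cv2.IMREAD_GRAYSCALE)
# print(detect_pupil(eye))
```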
Image content verification confirms the validity of images, i.e., it tests whether an image has undergone any alteration since it was created. Digital watermarking has become a promising technique for image content verification in light of its exceptional performance and its capacity for tamper detection.
In this study, a new scheme for image verification based on two-dimensional chaotic maps and the Discrete Wavelet Transform (DWT) is introduced. The Arnold transform is first applied to the host image (H) for scrambling as a pretreatment stage; then the scrambled host image is partitioned into sub-blocks of size 2×2, on each of which a 2D DWT is utilized …
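The Arnold transform step can be illustrated with a short sketch; the iteration count is an assumption, and the 2×2 block partition and DWT that follow in the scheme are not shown.

```python
import numpy as np

def arnold_scramble(img, iterations=1):
    """Arnold (cat map) scrambling of a square N x N image: the pixel at
    (x, y) moves to ((x + y) mod N, (x + 2y) mod N). The map is periodic,
    so the original image can be recovered by iterating further."""
    n = img.shape[0]
    assert img.shape[0] == img.shape[1], "Arnold transform needs a square image"
    out = img.copy()
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = scrambled
    return out

# Scramble a host image before partitioning it into 2x2 blocks for the DWT:
host = (np.arange(64 * 64) % 256).astype(np.uint8).reshape(64, 64)
scrambled = arnold_scramble(host, iterations=5)
```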