The Nasryia oil field is located about 38 km north-west of Nasryia city. The field was discovered in 1975 after a seismic survey carried out by the Iraqi National Oil Company. The Mishrif formation is a carbonate rock (limestone and dolomite) whose thickness reaches 170 m. The main reservoir is the lower Mishrif (MB) layer, which has medium permeability (3.5-100 md) and good porosity (10-25%). Well-log interpretation confirmed the rock type of the Mishrif formation as carbonate. A ten-metre shale layer separates the MA layer from the MB layer. Environmental corrections were applied to the well logs so that the corrected curves could be used in the analysis. The neutron-density porosity combination was chosen for interpretation because it is closest to the core porosity. The Archie equation was used to calculate water saturation from the shale-corrected porosity, with the Archie parameters determined from a Pickett plot. Combining core analysis with log data led to equations for estimating permeability and porosity in non-cored wells. The water saturation from Archie's equation was used to determine the oil-water contact, which is essential to the oil-in-place calculation. PVT software was used to select the best-fit PVT correlation describing the reservoir PVT properties for use in reservoir and well modelling. Numerical software was used to build a reservoir model from all the geological and petrophysical properties. Production data were used for history matching and to characterise the aquifer effect as a weak water drive. The reservoir model calculates an initial oil in place of 6.9 MMMSTB, which is very close to the 7.1 MMMSTB measured by the Chevron study of the same reservoir [1]. A field production strategy was applied to predict the reservoir behaviour and production rate for 34 years. The development strategy used water injection to support reservoir pressure and improve oil recovery. The results show that the reservoir can produce oil at an apparently stable rate of 85 Kbbl/d, with a recovery factor of about 14%.
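For reference, a minimal Python sketch of the Archie water-saturation calculation mentioned above is shown below; the parameter values (a, m, n, Rw) are illustrative placeholders, not the values actually determined from the Pickett plot in this study.

```python
def archie_sw(phi, rt, a=1.0, m=2.0, n=2.0, rw=0.03):
    """Water saturation from Archie's equation:
    Sw = ((a * Rw) / (phi**m * Rt)) ** (1 / n)

    phi : effective (shale-corrected) porosity, fraction
    rt  : true formation resistivity, ohm.m
    a, m, n : tortuosity factor, cementation and saturation exponents
              (normally read from a Pickett plot and core data)
    rw  : formation-water resistivity, ohm.m
    All numeric defaults here are assumed, for illustration only.
    """
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

# Example: phi = 18 %, Rt = 10 ohm.m with the placeholder parameters above
print(f"Sw = {archie_sw(0.18, 10.0):.2f}")
```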
There are many and varied studies that have dealt with dramatic construction, especially books and studies that address drama, its construction, and the method of writing it; hardly any textbook or general cultural work fails to tackle the dramatic text, its construction, and how the dramatic action develops within it. Therefore, a question arises about the feasibility of dealing with dramatic construction at this time, when many contemporary studies of dramatology, its relations, and contemporary critical directions are accumulating. This question may have two realistic aspects, yet the novelty and originality of this research lie in addressing a linguistic text refined in its style and connotations, such
One of the most important features of the Amazon Web Services (AWS) cloud is that a program can be run and accessed from any location: the results of the program can be accessed and monitored from anywhere, many images can be stored, and computation is faster. This work proposes a face-detection classification model based on the AWS cloud that aims to classify faces into two classes, a non-permission class and a permission class, by training on a real data set collected from our cameras. The proposed Convolutional Neural Network (CNN) cloud-based system was used to share computational resources for Artificial Neural Networks (ANN) to reduce redundant computation. The test system uses Internet of Things (IoT) services through our cameras
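As a rough illustration of the kind of binary CNN classifier described above (permission vs. non-permission faces), a minimal Keras sketch is given below; the input size, layer sizes, and directory layout are assumptions for illustration and do not reflect the authors' actual AWS-hosted architecture or data set.

```python
# Minimal sketch of a binary face-image classifier (permission / non-permission).
# Assumes face crops are stored in faces/permission and faces/non_permission (hypothetical layout).
import tensorflow as tf
from tensorflow.keras import layers, models

train_ds = tf.keras.utils.image_dataset_from_directory(
    "faces", image_size=(96, 96), batch_size=32, label_mode="binary")

model = models.Sequential([
    layers.Rescaling(1.0 / 255, input_shape=(96, 96, 3)),
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # 1 = permission, 0 = non-permission
])

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, epochs=10)
```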
Logistic regression is one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X), under the assumption of homogeneity of variance, where the dependent variable is a binary response taking two values (one when a specific event occurs and zero when it does not), such as injured/uninjured or married/unmarried. A large number of explanatory variables leads to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were used to estimate the binary-response logistic regression model by adopting the Jackknife
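As a rough sketch of the combination described above (ridge-penalized logistic regression with Jackknife resampling), the Python fragment below fits an L2-penalized logistic model and forms leave-one-out (Jackknife) estimates of the coefficients; the data set, penalty strength, and bias-correction step are placeholders for illustration, not the study's actual data or procedure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                                          # placeholder explanatory variables
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=100) > 0).astype(int)   # placeholder binary response

def ridge_logit_coefs(X, y, C=1.0):
    """Coefficients of an L2-penalized (ridge-type) logistic regression."""
    model = LogisticRegression(penalty="l2", C=C, solver="lbfgs", max_iter=1000)
    model.fit(X, y)
    return model.coef_.ravel()

# Jackknife: refit leaving out one observation at a time, then combine.
n = len(y)
jack = np.array([ridge_logit_coefs(np.delete(X, i, axis=0), np.delete(y, i))
                 for i in range(n)])
full = ridge_logit_coefs(X, y)
bias = (n - 1) * (jack.mean(axis=0) - full)
print("Jackknife bias-corrected coefficients:", full - bias)
```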
The novel has recently received great attention from readers and writers because of the role it plays, and this points to an important rule: wherever there is art or creativity, there must be corresponding criticism, and this criticism is certainly no less important than the author. There are critics who hold a prestigious literary position in following the development of the story and trying to describe the transformation of its elements. One of these critics is Professor Fadhel Thamer, and whoever wants to explore one of the elements of the novel must pause at this critic's views on it; for that reason we take up the element of character, following the most important opinions of this critic about
In our research, we seek to shed light on one of the most important and sensitive issues, namely the Sufi influence in the Iraqi novel, through the lame maqam of the novelist Jumaa Al-Lami. Sufi discourse contains many semantic paradoxes between the apparent wording of the text and its interpretation, and between the format and the context that produced and incited these patterns, which yields conclusions different from the prevailing judgments and fixed ideas drawn from the narrative text. The Arabic novel, and the Iraqi novel in particular, has come to draw on the power of Sufi discourse by speaking of several Sufi figures, referring to them openly or drawing on them implicitly without declaring it, employing some of their ideas, or summoning
Measurement of construction performance is essential to forming a clear picture of the present situation. Monitoring by the management team is necessary to identify where performance is exceptionally good or poor and to identify the primary reasons, so that the lessons learned can be transferred to the firm and its progress strengthened. This research attempts to construct an integrated mathematical model utilizing one of the recent methodologies for dealing with the fuzzy representation of experts' knowledge and judgment under hesitancy, called the spherical fuzzy analytic hierarchy process (SFAHP) method, to assess the contractor's performance against the project performance parameters
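To illustrate the hierarchy-process weighting idea that SFAHP builds on, the sketch below computes criteria weights from a crisp pairwise-comparison matrix using the geometric-mean method; the paper itself uses spherical fuzzy judgments rather than crisp ones, and the matrix and criteria shown are invented for illustration, not data from the study.

```python
import numpy as np

# Hypothetical crisp pairwise-comparison matrix for three contractor criteria
# (e.g. cost, time, quality); A[i, j] = how much more important criterion i is than j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Geometric-mean (row-wise) prioritisation, then normalisation to weights.
geo_means = np.prod(A, axis=1) ** (1.0 / A.shape[0])
weights = geo_means / geo_means.sum()
print("criteria weights:", np.round(weights, 3))

# Simple consistency check via the principal eigenvalue.
lam_max = np.max(np.real(np.linalg.eigvals(A)))
ci = (lam_max - A.shape[0]) / (A.shape[0] - 1)
print("consistency index:", round(ci, 3))
```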
This paper introduces a non-conventional approach with multi-dimensional random sampling to solve a cocaine-abuse model with statistical probability. The mean Latin hypercube finite difference (MLHFD) method is proposed for the first time via hybrid integration of the classical numerical finite difference (FD) formula with the Latin hypercube sampling (LHS) technique, to create a random distribution for the model parameters, which are dependent on time t. The LHS technique gives the MLHFD method the advantage of producing fast variation of the parameter values via a number of multidimensional simulations (100, 1000 and 5000). The generated Latin hypercube sample, which is random or non-deterministic in nature, is further integrated with the FD method
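As a rough sketch of the sampling side of the MLHFD idea (Latin hypercube sampling of uncertain model parameters, each sample fed to a deterministic finite-difference solve, and the runs averaged), the fragment below uses SciPy's Latin hypercube generator with an invented two-parameter toy model; the cocaine-abuse model equations and parameter ranges are not those of the paper.

```python
import numpy as np
from scipy.stats import qmc

def fd_solve(beta, gamma, y0=0.9, t_end=10.0, steps=200):
    """Toy forward-difference (explicit Euler) solve of dy/dt = beta*y*(1-y) - gamma*y."""
    dt = t_end / steps
    y = y0
    for _ in range(steps):
        y = y + dt * (beta * y * (1.0 - y) - gamma * y)
    return y

# Latin hypercube sample of the two uncertain parameters over assumed ranges.
sampler = qmc.LatinHypercube(d=2, seed=1)
unit = sampler.random(n=1000)                                   # 1000 multidimensional simulations
params = qmc.scale(unit, l_bounds=[0.1, 0.01], u_bounds=[0.5, 0.1])

runs = np.array([fd_solve(beta, gamma) for beta, gamma in params])
print("mean FD solution over the Latin hypercube sample:", runs.mean())
```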