As reservoir conditions change continuously over the life of a field, well production rate and performance change with them, and the well must be re-modeled to match the current situation and keep the production rate as high as possible. Well productivity is affected by changes in reservoir pressure, water cut, tubing size, and wellhead pressure. For an electrical submersible pump (ESP) well, it is also affected by the number of stages and the operating frequency. In general, the production rate increases when reservoir pressure increases and/or water cut decreases. The flow rate also increases when tubing size increases and/or wellhead pressure decreases. For an ESP well, the production rate increases when the number of stages and/or the pump frequency is increased. In this study, nodal analysis software was used to design one well with natural flow and another with an ESP. Reservoir, fluid, and well information are taken from actual data of the Mishrif formation, Nasriya oil field, well NS-5. The well design steps and the data required in the model are presented, and optimization sensitivity runs are applied to the model to determine the effect of each parameter individually and when combined with another.
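As a minimal sketch of the kind of single-parameter sensitivity such a model supports, the snippet below evaluates a Vogel-type inflow performance relationship (IPR) at several reservoir pressures; the function name and all numeric values are illustrative assumptions, not NS-5 data.

```python
# Minimal sketch of an IPR sensitivity pass, assuming a Vogel-type
# inflow relation; q_max and the pressure values are hypothetical,
# not actual NS-5 well data.
def vogel_rate(q_max, p_res, p_wf):
    """Vogel IPR: rate as a fraction of q_max for a given flowing
    bottom-hole pressure p_wf and reservoir pressure p_res."""
    r = p_wf / p_res
    return q_max * (1.0 - 0.2 * r - 0.8 * r * r)

# Sensitivity of rate to declining reservoir pressure at fixed p_wf.
for p_res in (3500.0, 3000.0, 2500.0):          # psi, illustrative
    q = vogel_rate(q_max=2000.0, p_res=p_res, p_wf=1500.0)
    print(f"p_res = {p_res:6.0f} psi -> q = {q:7.1f} STB/d")
```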
In this study, gamma-ray spectrometry with an HPGe detector was used to measure the specific activity concentrations of ²²⁶Ra, ²³²Th, and ⁴⁰K in soil samples collected from IT1 oil reservoirs in Kirkuk city, northeast Iraq. The “spectral line Gp” gamma analysis software package was used to analyze the spectral data. The ²²⁶Ra specific activity varies from 9 ± 0.34 Bq·kg⁻¹ to 17 ± 0.47 Bq·kg⁻¹, the ²³²Th specific activity from 6.2 ± 0.08 Bq·kg⁻¹ to 18 ± 0.2 Bq·kg⁻¹, and the ⁴⁰K specific activity from 25 ± 0.19 Bq·kg⁻¹ to 118 ± 0.41 Bq·kg⁻¹. The radiological hazard due to the radiation emitted from natural radionuclides …
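Radiological hazard from these three radionuclides is conventionally summarized by the radium-equivalent activity index; below is a minimal sketch of that standard formula, with illustrative mid-range inputs rather than the paper's measured values.

```python
# Minimal sketch of the standard radium-equivalent hazard index,
# Ra_eq = C_Ra + 1.43*C_Th + 0.077*C_K (Bq/kg); the sample values
# below are illustrative, not the paper's data.
def radium_equivalent(c_ra, c_th, c_k):
    """Radium-equivalent activity in Bq/kg from the specific
    activities of 226Ra, 232Th and 40K."""
    return c_ra + 1.43 * c_th + 0.077 * c_k

ra_eq = radium_equivalent(c_ra=13.0, c_th=12.0, c_k=70.0)
print(f"Ra_eq = {ra_eq:.1f} Bq/kg (recommended limit: 370 Bq/kg)")
```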
K. E. Sharquie, A. A. Noaimi, G. A. Ibrahim, A. S. Al-Husseiny, Our Dermatology Online, 2016.
Land Use / Land Cover (LULC) classification is one of the basic tasks that decision makers and map makers rely on to evaluate infrastructure, using different types of satellite data, despite the large spectral differences or overlaps between spectra of the same land cover, in addition to the problems of aberration and image inclination that may negatively affect classification performance. The main objective of this study is to develop a working method for classifying land cover from high-resolution satellite images using an object-based method. Maximum likelihood pixel-based supervised classification as well as object-based approaches were examined on a QuickBird satellite image of Karbala, Iraq. This study illustrated that …
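The pixel-based maximum likelihood classifier amounts to a Gaussian model per class with its own covariance; a minimal sketch follows using scikit-learn's QuadraticDiscriminantAnalysis as that Gaussian ML classifier, on synthetic stand-in pixels rather than QuickBird bands.

```python
# Minimal sketch of a pixel-based maximum likelihood classifier via
# Gaussian discriminant analysis; features and labels are synthetic
# stand-ins for multispectral pixels, not QuickBird data.
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

rng = np.random.default_rng(0)
# Two hypothetical land-cover classes in a 4-band spectral space.
x_train = np.vstack([rng.normal(0.3, 0.05, (50, 4)),
                     rng.normal(0.6, 0.05, (50, 4))])
y_train = np.array([0] * 50 + [1] * 50)

clf = QuadraticDiscriminantAnalysis().fit(x_train, y_train)
print(clf.predict(rng.normal(0.3, 0.05, (3, 4))))  # -> class-0 pixels
```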
This study is an approach to dividing the land area of Kirkuk city (a city in the north of Iraq, 236 kilometers north of Baghdad and 83 kilometers south of Erbil [Climatic atlas of Iraq, 1941-1970]) into multiple zones of different traffic noise pollution levels by using satellite imagery and ArcMap 10.3. A land zoning process like the one achieved in this paper will help, and is of high interest for, the future of Kirkuk city, especially urban …
In regression testing, test case prioritization (TCP) is a technique to order all the available test cases. TCP techniques can improve fault detection performance, which is measured by the average percentage of faults detected (APFD). History-based TCP is one of the TCP techniques; it considers past execution data to prioritize test cases. The assignment of equal priority to several test cases is a common problem for most TCP techniques. However, this problem has not been explored in history-based TCP techniques. To resolve such ties, most researchers resort to random ordering of the tied test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement …
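For reference, APFD has a standard closed form, APFD = 1 − (TF₁ + … + TFₘ)/(n·m) + 1/(2n), where TFᵢ is the 1-based position of the first test revealing fault i; a minimal sketch follows, with a hypothetical ordering as input.

```python
# Minimal sketch of the APFD metric,
# APFD = 1 - (TF1 + ... + TFm) / (n * m) + 1 / (2 * n),
# where TFi is the position (1-based) in the prioritized suite of the
# first test that reveals fault i; the example ordering is hypothetical.
def apfd(first_failing_positions, n_tests):
    m = len(first_failing_positions)
    return (1.0 - sum(first_failing_positions) / (n_tests * m)
            + 1.0 / (2 * n_tests))

# 10 tests, 4 faults first detected by tests at positions 1, 2, 2, 5.
print(f"APFD = {apfd([1, 2, 2, 5], n_tests=10):.3f}")  # -> 0.800
```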
A database is an organized and distributed arrangement of data that allows the user to access the stored data in a simple and convenient way. However, in the era of big data, traditional data analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
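The MapReduce pattern itself is simple to state; below is a minimal in-process simulation of its map, shuffle, and reduce phases, with synthetic per-channel EEG "records" standing in for the real dataset (the keys and values are assumptions for illustration only).

```python
# Minimal in-process sketch of the MapReduce pattern the study runs
# on Hadoop; the (channel, amplitude) records are synthetic stand-ins.
from collections import defaultdict

records = [("ch1", 4.2), ("ch2", 1.1), ("ch1", 3.8), ("ch2", 2.5)]

# Map: emit (key, value) pairs; here (channel, amplitude).
mapped = [(channel, value) for channel, value in records]

# Shuffle: group values by key.
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce: aggregate each group, e.g. mean amplitude per channel.
for key, values in sorted(groups.items()):
    print(key, sum(values) / len(values))
```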
The purpose of this paper is to model and forecast white oil returns during the period 2012-2019 using GARCH-class volatility models. After showing that the squared returns of white oil exhibit significant long memory in volatility, fractional GARCH models for the return series are estimated, and the mean and volatility are forecast by quasi-maximum likelihood (QML) as the traditional method; the competing approach is machine learning using support vector regression (SVR). The most appropriate model among the candidates for forecasting volatility was selected based on the lowest values of the Akaike and Schwarz information criteria, with the requirement that the parameters be significant. In addition, the residuals …
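As a minimal sketch of the QML workflow (using a plain GARCH(1,1) rather than the paper's fractional variants, and a simulated return series rather than the white oil data), the `arch` package fits the model and produces a one-step-ahead variance forecast:

```python
# Minimal sketch: fit a GARCH(1,1) by quasi-maximum likelihood with
# the `arch` package and produce a one-step volatility forecast; the
# return series is simulated, not the white oil data.
import numpy as np
from arch import arch_model

rng = np.random.default_rng(1)
returns = rng.standard_t(df=5, size=1000) * 0.5   # synthetic % returns

model = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1)
result = model.fit(disp="off")                    # QML estimation
forecast = result.forecast(horizon=1)
print(forecast.variance.iloc[-1, 0])              # next-step variance
```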
The carbonate-clastic succession in this study is represented by the Shuaiba and Nahr Umr Formations, deposited during the Albian-Aptian sequence. The present study includes petrography, microfacies analysis, and reservoir characterization for five boreholes within the West Qurna oil field in the study area. According to the type of the studied succession (clastic-carbonate), there are two types of facies analysis. Carbonate facies analysis showed five major microfacies recognized in the succession of the Shuaiba Formation: bioclastic mudstone to wackestone, Orbitolina wackestone to packstone, Miliolid wackestone, peloidal wackestone to packstone, and mudstone to wackestone, identified as an open shelf toward the deep basin.
Regression testing, being expensive, calls for optimization. Typically, optimizing test cases means selecting a reduced subset of test cases or prioritizing them so that potential faults are detected at an earlier phase. Many earlier studies relied on heuristic-dependent mechanisms to attain optimality while reducing or prioritizing test cases. Nevertheless, those studies lacked systematic procedures to manage the issue of tied test cases. Moreover, evolutionary algorithms such as the genetic algorithm often help in reducing test cases, together with a concurrent decrease in computational runtime. However, when examining fault detection capability along with other parameters is required, the method falls sh…
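A common baseline for the subset-selection side of this problem is greedy coverage-based reduction; a minimal sketch follows, with a hypothetical test-to-fault coverage map (not data from the study) and a deterministic tie-break by test name.

```python
# Minimal sketch of greedy test-suite reduction; the fault-coverage
# map is a hypothetical example, not data from the study.
coverage = {
    "t1": {"f1", "f2"},
    "t2": {"f2", "f3", "f4"},
    "t3": {"f1"},
    "t4": {"f4", "f5"},
}

uncovered = set().union(*coverage.values())
selected = []
while uncovered:
    # Pick the test covering the most still-uncovered faults;
    # ties are broken by test name, making the result deterministic.
    best = max(coverage, key=lambda t: (len(coverage[t] & uncovered), t))
    selected.append(best)
    uncovered -= coverage[best]

print(selected)  # -> ['t2', 't4', 't3'] covers all five faults
```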