Permeability data are of major importance and must be handled carefully in all reservoir simulation studies. Their importance increases in mature oil and gas fields, because specific improved-recovery requirements are highly sensitive to permeability. However, the industry holds a huge stock of air permeability measurements against only a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation to convert the air permeability data that are conventionally measured during laboratory core analysis into liquid permeability. The correlation provides a feasible estimate in cases of data loss or poorly consolidated formations, or when old cores are no longer available for liquid permeability measurements. Moreover, the conversion formula makes better use of the large amount of legacy air permeability data obtained through routine core analysis for further reservoir and geological modeling studies.
The comparative analysis shows that the suggested conversion formula gives highly accurate and consistent results over a wide range of permeability values.
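The excerpt does not reproduce the paper's correlation, so the sketch below only illustrates the kind of conversion involved, assuming a Klinkenberg-type gas-slippage correction with a user-supplied slippage factor b:

```python
def air_to_liquid_permeability(k_air, b, p_mean):
    """Convert a measured air permeability to an equivalent liquid
    (Klinkenberg-corrected) permeability.

    k_air  : measured air permeability, mD
    b      : gas slippage factor (illustrative input; the paper derives
             its own correlation), same pressure units as p_mean
    p_mean : mean flowing pressure of the gas measurement
    """
    # Klinkenberg relation: k_air = k_liquid * (1 + b / p_mean)
    return k_air / (1.0 + b / p_mean)


# Hypothetical core-plug reading: 150 mD air permeability at 2.5 atm mean pressure
print(air_to_liquid_permeability(150.0, b=0.8, p_mean=2.5))
```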
The Internet provides vital communications between millions of individuals and is increasingly used as a commerce tool; security is therefore of high importance for protecting communications and vital information. Cryptographic algorithms are essential in this field. Brute-force attacks are the major attacks against the Data Encryption Standard (DES), which is the main reason an improved DES structure is needed. This paper proposes a new, improved structure for DES that makes it more secure and immune to attacks. The improved structure was built on standard DES with a new way of two-key generation …
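The improved two-key generation scheme itself is not given in this excerpt; the sketch below only shows the standard 16-round Feistel structure on which DES (and hence the proposed modification) is built, with the initial/final permutations and the round function left abstract:

```python
def feistel_encrypt(block64, round_keys, f):
    """Core of the DES-style Feistel structure (initial and final
    permutations omitted).

    block64    : 64-bit integer plaintext block
    round_keys : list of 16 subkeys; the paper's improvement concerns how
                 these are generated (from two keys), not the round itself
    f          : round function mapping (32-bit half, subkey) -> 32-bit value
    """
    left = (block64 >> 32) & 0xFFFFFFFF
    right = block64 & 0xFFFFFFFF
    for k in round_keys:
        # L_i = R_{i-1},  R_i = L_{i-1} XOR f(R_{i-1}, K_i)
        left, right = right, left ^ f(right, k)
    # output halves are swapped, as in DES, so decryption can reuse the
    # same structure with the subkeys in reverse order
    return (right << 32) | left
```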
This paper deals with the estimation of the shape parameter (α) of the Generalized Exponential (GE) distribution when the scale parameter (λ) is known, via a preliminary-test single-stage shrinkage estimator (SSSE), when prior knowledge (α0) about the shape parameter is available as an initial value from past experience, together with a suitable region (R) for testing this prior knowledge.
The expressions for the bias, mean squared error [MSE] and relative efficiency [R.Eff(·)] of the proposed estimator are derived, and numerical results describing its behaviour …
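The exact pretest region and shrinkage weight are not given in this excerpt; the sketch below assumes the usual preliminary-test form, shrinking the maximum-likelihood estimate towards the prior guess α0 only when the pretest accepts it, and uses the known-λ MLE of the GE shape parameter:

```python
import numpy as np

def ge_shape_mle(x, lam):
    """MLE of the GE shape parameter alpha when the scale lambda is known:
    alpha_hat = -n / sum(log(1 - exp(-lambda * x_i)))."""
    x = np.asarray(x, dtype=float)
    return -len(x) / np.sum(np.log1p(-np.exp(-lam * x)))

def shrinkage_estimate(x, lam, alpha0, region=(0.5, 2.0), k=0.4):
    """Illustrative single-stage shrinkage estimator; the weight k and the
    region R used here are placeholder choices, not the paper's values."""
    alpha_hat = ge_shape_mle(x, lam)
    if region[0] <= alpha_hat / alpha0 <= region[1]:   # pretest accepts the prior guess
        return k * alpha0 + (1.0 - k) * alpha_hat      # shrink towards alpha0
    return alpha_hat                                   # otherwise keep the MLE
```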
A stochastic process {Xk, k = 1, 2, ...} is a doubly geometric stochastic process if there exist a ratio a > 0 and a positive function h(k) > 0 such that {a^(k-1) h(k) Xk, k = 1, 2, ...} forms a renewal process; it is a generalization of the geometric stochastic process. This process is stochastically monotone and can be used to model a point process with multiple trends. In this paper, we use nonparametric methods to investigate statistical inference for doubly geometric stochastic processes. A graphical technique for determining whether a process is in agreement with a doubly geometric stochastic process is proposed. Further, we can estimate the parameters a, b, μ and σ² of the doubly geometric stochastic process by using the least squares estimate for Xk …
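The paper's estimation procedure is only partly visible in this excerpt; the sketch below shows one common log-linear least-squares fit, under the illustrative assumption that the trend function is a power law h(k) = k^b, so that ln Xk is roughly linear in (k - 1) and ln k:

```python
import numpy as np

def fit_doubly_geometric(x):
    """Least-squares fit of ln X_k = c - (k-1) ln a - b ln k + e_k,
    assuming a power-law trend h(k) = k**b (an illustrative choice; the
    paper's h(k) may differ).  Returns estimates of a, b and the intercept c."""
    x = np.asarray(x, dtype=float)
    k = np.arange(1, len(x) + 1)
    # design matrix: intercept, (k - 1), ln k
    A = np.column_stack([np.ones(len(x)), k - 1, np.log(k)])
    coef, *_ = np.linalg.lstsq(A, np.log(x), rcond=None)
    c, ln_a, b = coef[0], -coef[1], -coef[2]
    return np.exp(ln_a), b, c
```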
Multiple elimination (de-multiple) is one of the seismic processing steps used to remove the effects of multiples and delineate the correct primary reflections. Applying normal moveout to flatten the primaries and then transforming the data to the frequency-wavenumber (f-k) domain is one way to eliminate multiples: the flattened primaries align along the zero-wavenumber axis of the f-k domain, while all other events (multiples and random noise) are distributed elsewhere. A dip filter is then applied to pass the aligned data and reject the rest, so that primaries are separated from multiples once the data are transformed back from the frequency-wavenumber domain to the time-distance domain. For this reason, a suggested name for this technique is normal moveout-frequency-wavenumber domain …
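As a rough illustration of the workflow described above, the following numpy sketch applies a zero-wavenumber pass zone to an already NMO-corrected gather; the function name, pass-zone width and FFT layout are illustrative choices, not the paper's parameters:

```python
import numpy as np

def fk_demultiple(gather, keep_fraction=0.05):
    """Minimal sketch of NMO + f-k separation: after NMO correction the
    primaries are flat, so their energy maps onto wavenumbers near zero.
    `gather` is a 2-D array (time samples x traces); `keep_fraction` is an
    illustrative half-width of the pass zone around k = 0."""
    fk = np.fft.fft2(gather)                    # time-offset -> frequency-wavenumber
    k = np.fft.fftfreq(gather.shape[1])         # normalised trace wavenumbers
    mask = np.abs(k) <= keep_fraction           # dip filter: pass near-zero wavenumber
    fk_filtered = fk * mask[np.newaxis, :]      # reject dipping events (multiples, noise)
    return np.real(np.fft.ifft2(fk_filtered))   # back to the time-distance domain
```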
The current paper proposes a new estimator for the parameters of the linear regression model under Big Data conditions. The diversity of Big Data variables brings many challenges that interest researchers who seek new and novel methods to estimate the parameters of the linear regression model. The data were collected by the Central Statistical Organization of Iraq, and child labour in Iraq was chosen as the application. Child labour is a vital phenomenon from which both society and education suffer, and it affects the future of the next generation. Two methods have been selected to estimate the parameters …
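The two selected estimation methods are not named in this excerpt; the snippet below only gives the ordinary least squares baseline against which any new linear-regression estimator is typically compared:

```python
import numpy as np

def ols(X, y):
    """Baseline ordinary least squares, beta_hat = (X'X)^(-1) X'y, with an
    intercept column added; a reference point only, not the paper's new
    Big Data estimator."""
    X = np.column_stack([np.ones(len(X)), np.asarray(X, dtype=float)])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return beta
```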
The objective of the study is to determine which has the better predictive ability, the logistic regression model or the linear discriminant function, using the original data first and then the principal components to reduce the dimensionality of the variables. The data come from a socio-economic survey of families in the province of Baghdad in 2012 and comprise a sample of 615 observations with 13 variables, 12 of which are explanatory variables; the dependent variable is the number of workers and unemployed.
A comparison of the two methods above was conducted, and it became clear from the comparison that the logistic regression model performs better than the linear discriminant function …
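The survey data themselves are not available here; the sketch below shows how such a comparison is commonly run with scikit-learn, with X and y standing in for the 12 explanatory variables and the worker/unemployed label:

```python
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def compare_classifiers(X, y):
    """Cross-validated predictive accuracy of the two models compared in
    the study; X and y are placeholders for the survey variables."""
    scores = {}
    for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                        ("linear discriminant", LinearDiscriminantAnalysis())]:
        scores[name] = cross_val_score(model, X, y, cv=5).mean()
    return scores
```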
This study was accomplished by testing three different models to determine rock types, pore throat radii, and flow units for the Mishrif Formation in the West Qurna oilfield in southern Iraq, based on Mishrif full-diameter cores from 20 wells. The three models used were the Lucia rock type classification; the Winland plot, utilized to determine the pore throat radius (r35) from the mercury injection test; and the flow zone indicator (FZI) concept, used to identify flow units. Together these enabled us to recognize the differences between the Mishrif units in these three categories. The study of pore characteristics is very significant in reservoir evaluation, since it controls the storage mechanism and reservoir fluid properties.
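The Winland r35 and FZI quantities named above have standard published forms; the sketch below computes them from core permeability and porosity (the study's own cut-offs and class boundaries are not reproduced here):

```python
import numpy as np

def winland_r35(k_md, phi_percent):
    """Winland pore-throat radius r35 (microns):
    log r35 = 0.732 + 0.588 log k - 0.864 log phi  (k in mD, phi in %)."""
    return 10 ** (0.732 + 0.588 * np.log10(k_md) - 0.864 * np.log10(phi_percent))

def flow_zone_indicator(k_md, phi_fraction):
    """FZI (microns) from the reservoir quality index:
    RQI = 0.0314 * sqrt(k / phi),  phi_z = phi / (1 - phi),  FZI = RQI / phi_z."""
    rqi = 0.0314 * np.sqrt(k_md / phi_fraction)
    phi_z = phi_fraction / (1.0 - phi_fraction)
    return rqi / phi_z
```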