Permeability data are of major importance in all reservoir simulation studies. Their importance increases in mature oil and gas fields, where specific improved-recovery schemes are particularly sensitive to permeability. However, the industry holds a huge archive of air permeability measurements against only a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study proposes a correlation for converting the air permeability data conventionally measured during laboratory core analysis into liquid permeability. The correlation provides a feasible estimate in cases of data loss, in poorly consolidated formations, or when old cores are no longer available for liquid permeability measurements. Moreover, the conversion formula allows the large volume of legacy air permeability data obtained through routine core analysis to be reused in reservoir and geological modeling studies.
The comparative analysis shows high accuracy and consistent results for the suggested conversion formula over a wide range of permeability values.
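The abstract does not reproduce the paper's own correlation, so purely as an illustration, the sketch below applies the classical Klinkenberg relationship between gas-measured and equivalent-liquid permeability; the slip factor and pressure values are hypothetical placeholders, and in practice the slip factor is rock dependent.

```python
def liquid_from_air_permeability(k_air_md: float, mean_pressure_psi: float,
                                 slip_factor_psi: float) -> float:
    """Klinkenberg-type conversion: k_air = k_liq * (1 + b / p_mean).

    k_air_md          -- air permeability from routine core analysis (mD)
    mean_pressure_psi -- mean flowing pressure during the gas measurement
    slip_factor_psi   -- Klinkenberg gas-slippage factor b (rock dependent)
    """
    return k_air_md / (1.0 + slip_factor_psi / mean_pressure_psi)

# Example with hypothetical values: b = 5 psi, mean pressure 30 psi.
print(liquid_from_air_permeability(k_air_md=120.0, mean_pressure_psi=30.0,
                                   slip_factor_psi=5.0))  # ~102.9 mD
```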
Methods of estimating statistical distributions have attracted many researchers when it comes to fitting a specific distribution to data. However, when the data come from more than one component, a single standard distribution cannot be fitted to them. To tackle this issue, mixture models are fitted by choosing the correct number of components to represent the data. This is especially evident in lifetime processes, which arise in a wide range of engineering applications as well as biological systems. In this paper, we introduce an application of estimating a finite mixture of Inverse Rayleigh distributions within a Bayesian framework, using Markov chain Monte Carlo (MCMC) methods. We employed the Gibbs sampler and …
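A minimal sketch of such a Gibbs sampler (not the paper's code), assuming the common parameterization f(x; θ) = (2θ/x³)exp(−θ/x²), a conjugate Gamma prior on each scale θ_j, and a Dirichlet prior on the mixing weights:

```python
import numpy as np

rng = np.random.default_rng(0)

def inv_rayleigh_pdf(x, theta):
    """Inverse Rayleigh density f(x; theta) = (2*theta/x**3) * exp(-theta/x**2)."""
    return 2.0 * theta / x**3 * np.exp(-theta / x**2)

def gibbs_mixture_inv_rayleigh(x, k=2, iters=2000, a0=1.0, b0=1.0, alpha0=1.0):
    """Gibbs sampler for a k-component Inverse Rayleigh mixture.

    The likelihood in theta_j is theta_j**n_j * exp(-theta_j * sum(1/x_i**2)),
    so a Gamma(a0, b0) prior on each theta_j is conjugate.
    """
    theta = rng.gamma(a0, 1.0 / b0, size=k)   # initial scales
    w = np.full(k, 1.0 / k)                   # initial mixing weights
    draws = []
    for _ in range(iters):
        # 1) latent labels z_i ~ Categorical(posterior component weights)
        dens = w * inv_rayleigh_pdf(x[:, None], theta)     # shape (n, k)
        probs = dens / dens.sum(axis=1, keepdims=True)
        z = np.array([rng.choice(k, p=p) for p in probs])
        # 2) scales theta_j | z, x ~ Gamma(a0 + n_j, b0 + sum_{z_i=j} 1/x_i^2)
        for j in range(k):
            xj = x[z == j]
            theta[j] = rng.gamma(a0 + len(xj), 1.0 / (b0 + np.sum(1.0 / xj**2)))
        # 3) weights w | z ~ Dirichlet(alpha0 + component counts)
        counts = np.bincount(z, minlength=k)
        w = rng.dirichlet(alpha0 + counts)
        draws.append((theta.copy(), w.copy()))
    return draws

# Synthetic two-component data via inverse transform: if U ~ Uniform(0,1),
# then sqrt(theta / -log(U)) is Inverse Rayleigh(theta).
u = rng.uniform(size=500)
true_theta = np.where(rng.uniform(size=500) < 0.6, 0.5, 4.0)
data = np.sqrt(true_theta / -np.log(u))
samples = gibbs_mixture_inv_rayleigh(data, k=2)
```

Because every full conditional is a standard distribution under this conjugate setup, plain Gibbs sampling suffices and no Metropolis steps are needed.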
In this paper, estimates are derived for the parameters and the reliability function of the Transmuted Power Function (TPF) distribution using several estimation methods, including a newly proposed technique as well as the White, percentile, least squares, weighted least squares, and modified moments methods. A simulation was used to generate random data following the TPF distribution for three experiments (E1, E2, E3) with real parameter values, sample sizes n = 10, 25, 50, and 100, N = 1000 replications, and reliability times t > 0. Comparisons among the estimators were made using the mean squared error (MSE). The results showed the …
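The abstract does not spell out the TPF form or the estimators, so the sketch below assumes the usual transmuted-G construction, F(x) = (1+λ)(x/β)^α − λ(x/β)^{2α} on (0, β), and runs a Monte Carlo MSE comparison for a single illustrative least-squares estimator of α; all parameter values are placeholders.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

def tpf_cdf(x, alpha, beta, lam):
    """Transmuted power function CDF: (1+lam)*F - lam*F**2 with F = (x/beta)**alpha."""
    f = (x / beta) ** alpha
    return (1 + lam) * f - lam * f**2

def tpf_sample(n, alpha, beta, lam):
    """Inverse transform: solve lam*p^2 - (1+lam)*p + u = 0 for p = (x/beta)^alpha."""
    u = rng.uniform(size=n)
    if lam == 0:
        p = u
    else:
        p = ((1 + lam) - np.sqrt((1 + lam) ** 2 - 4 * lam * u)) / (2 * lam)
    return beta * p ** (1 / alpha)

def ls_estimate(x, beta, lam):
    """Least-squares estimate of alpha: fit the CDF to empirical plotting positions."""
    xs = np.sort(x)
    pos = np.arange(1, len(x) + 1) / (len(x) + 1)
    obj = lambda a: np.sum((tpf_cdf(xs, a[0], beta, lam) - pos) ** 2)
    return minimize(obj, x0=[1.0], bounds=[(1e-6, None)]).x[0]

# Monte Carlo MSE of the least-squares estimator of alpha (beta, lam known here).
alpha_true, beta_true, lam_true, N = 2.0, 1.0, 0.5, 1000
for n in (10, 25, 50, 100):
    est = np.array([ls_estimate(tpf_sample(n, alpha_true, beta_true, lam_true),
                                beta_true, lam_true) for _ in range(N)])
    print(n, np.mean((est - alpha_true) ** 2))  # MSE per sample size
```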
Statisticians often represent economic and social phenomena with regression models such as parametric, nonparametric, and semi-parametric models, which describe the relationships between the variables involved in these phenomena. One parametric technique is conic projection regression, which identifies the most important slopes in multidimensional data by using prior information about the regression parameters to obtain a more efficient estimator. Algorithms written in the R language simplify this otherwise complex method; they are based on quadratic programming, which improves the accuracy of the estimates.
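The paper's R routines are not shown in the abstract; as a language-neutral illustration of the underlying idea, the sketch below treats prior sign information on the slopes as a cone constraint and solves the resulting quadratic program with SciPy's nonnegative least squares:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(2)

# Hypothetical data: 200 observations, 4 predictors, true slopes all nonnegative.
X = rng.normal(size=(200, 4))
beta_true = np.array([1.5, 0.0, 2.0, 0.7])
y = X @ beta_true + rng.normal(scale=0.5, size=200)

# Unconstrained OLS can violate the prior sign information under noise...
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# ...whereas projecting onto the nonnegative cone (a quadratic program:
# minimize ||y - X b||^2 subject to b >= 0) enforces it by construction.
beta_cone, _ = nnls(X, y)

print("OLS :", beta_ols)
print("cone:", beta_cone)
```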
Diabetes mellitus type 2 (T2DM) is a chronic and progressive condition that affects people all around the world. If the disease is not managed properly, the risk of complications increases with age. Diabetic neuropathy is caused by excessive blood glucose and lipid levels, which result in nerve damage. Apelin is a peptide hormone found in several human organs, including the central nervous system and adipose tissue. The aim of this study is to estimate apelin levels in Iraqi patients with type 2 diabetes and diabetic peripheral neuropathy (DPN) and to show the extent of peripheral nerve damage. The current study included 120 participants: 40 patients with diabetes mellitus, 40 patients with diabetic peripheral neuropathy, and 40 healthy …
Recently, reconfigurable intelligent surfaces (RIS) have played an increasing role in enhancing the coverage and quality of mobile networks, especially when the received signal level is very weak because of obstacles and random fluctuations. This has motivated researchers to contribute further to the field of RIS in wireless communications. A substantial issue with reconfigurable intelligent surfaces is the huge overhead of channel state information estimation, which severely limits system performance. In this work, a new method is proposed that estimates the angle of arrival and path loss at the RIS side and then sends short information to the base station rather than the huge overhead …
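As an illustration only (the abstract does not detail the estimator), the sketch below assumes a uniform linear array at the RIS and recovers the angle of arrival with a conventional beam scan, plus a path-loss figure from the beamformer's peak power; the geometry, gain, and noise values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

M, d = 16, 0.5                 # RIS elements in a line, spacing in wavelengths
theta_true = np.deg2rad(25.0)  # hypothetical angle of arrival
amp_true = 0.05                # hypothetical channel amplitude (path gain)

def steering(theta):
    """ULA steering vector for arrival angle theta (broadside = 0)."""
    return np.exp(2j * np.pi * d * np.arange(M) * np.sin(theta))

# Received snapshots: scaled planar wavefront plus additive noise.
snaps = 200
s = np.exp(2j * np.pi * rng.uniform(size=snaps))          # unit-power symbols
noise = 0.01 * (rng.normal(size=(M, snaps)) + 1j * rng.normal(size=(M, snaps)))
X = amp_true * np.outer(steering(theta_true), s) + noise

# Conventional beam scan: project onto candidate steering vectors, pick the peak.
grid = np.deg2rad(np.linspace(-90, 90, 721))
power = np.array([np.mean(np.abs(steering(g).conj() @ X) ** 2) / M**2
                  for g in grid])
aoa_hat = grid[int(np.argmax(power))]
path_loss_db = -10 * np.log10(power.max())  # |path gain|^2 in dB

# Only these two scalars would be fed back to the base station.
print(f"AoA ~ {np.rad2deg(aoa_hat):.1f} deg, path loss ~ {path_loss_db:.1f} dB")
```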
When the depth of stressed soil is rather small, the Plate Load Test (PLT) becomes the most efficient test for estimating soil properties for design purposes. Among these properties, the modulus of subgrade reaction is the most important one, usually employed in road and concrete pavement design. Two methods are available for performing the PLT: static and cyclic (dynamic). The static PLT is usually adopted because it is simpler and faster to perform than the cyclic (dynamic) method. Both methods are described in the ASTM standards.
In this paper, the effect of the PLT method on the estimation of some mechanical soil properties was investigated through a series of tests of both types carried out at the same site. The comparison of …
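For reference, the modulus of subgrade reaction that the comparison hinges on is the ratio of bearing pressure to plate settlement, k_s = q/δ, conventionally read at a fixed reference settlement. A minimal sketch with hypothetical load-settlement data (the 1.25 mm reference is one common convention; the governing standard should be checked):

```python
import numpy as np

# Hypothetical PLT load-settlement curve: pressure (kPa) vs. settlement (mm).
pressure_kpa = np.array([0.0, 50.0, 100.0, 150.0, 200.0, 250.0])
settlement_mm = np.array([0.0, 0.4, 0.9, 1.5, 2.3, 3.4])

# Modulus of subgrade reaction k_s = q / delta, read at a reference settlement.
ref_mm = 1.25
q_ref = np.interp(ref_mm, settlement_mm, pressure_kpa)   # pressure at 1.25 mm
k_s = q_ref / (ref_mm / 1000.0)                          # kPa/m
print(f"k_s ~ {k_s:.0f} kPa/m ({k_s / 1000:.1f} MPa/m)")
```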
Akaike's Information Criterion (AIC) is a popular method for estimating the number of sources impinging on an array of sensors, a problem of great interest in several applications. The performance of AIC degrades at low signal-to-noise ratio (SNR). This paper is concerned with the development and application of quadrature mirror filters (QMF) for improving the performance of AIC. A new system is proposed that estimates the number of sources by applying AIC to the outputs of a filter bank of quadrature mirror filters. The proposed system can estimate the number of sources at low SNR.
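The core criterion is the Wax-Kailath form of AIC applied to the eigenvalues of the sample covariance matrix; the sketch below implements that baseline step (without the QMF filter bank the paper adds) on hypothetical array data:

```python
import numpy as np

rng = np.random.default_rng(4)

def aic_num_sources(X):
    """Wax-Kailath AIC estimate of the number of sources from array snapshots.

    X is a (p, N) matrix: p sensors, N snapshots. For each candidate k, the
    criterion compares geometric and arithmetic means of the p-k smallest
    eigenvalues of the sample covariance (they coincide for pure noise).
    """
    p, N = X.shape
    lam = np.sort(np.linalg.eigvalsh(X @ X.conj().T / N))[::-1]  # descending
    aic = []
    for k in range(p):
        tail = lam[k:]
        g = np.exp(np.mean(np.log(tail)))   # geometric mean
        a = np.mean(tail)                   # arithmetic mean
        aic.append(-2 * N * (p - k) * np.log(g / a) + 2 * k * (2 * p - k))
    return int(np.argmin(aic))

# Hypothetical test: 8 sensors, 2 narrowband sources, unit-variance noise.
p, N = 8, 500
A = np.exp(1j * np.pi * np.outer(np.arange(p), np.sin(np.deg2rad([10, 40]))))
S = (rng.normal(size=(2, N)) + 1j * rng.normal(size=(2, N))) * 2.0
noise = (rng.normal(size=(p, N)) + 1j * rng.normal(size=(p, N))) / np.sqrt(2)
print(aic_num_sources(A @ S + noise))  # expected: 2
```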
This study employs evolutionary optimization and artificial intelligence algorithms to determine an individual's age from a single facial image. We use the WIKI dataset, widely considered the most comprehensive collection of facial images to date, which includes age and gender labels. Although much research has been undertaken on establishing chronological age from facial photographs, estimating age from facial images remains a comparatively recent topic of study. Retrained artificial neural networks are used for classification after preprocessing and optimization techniques are applied. It is possible that the difficulty of determining age could be reduced …
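As a generic illustration of the retraining step (not the paper's architecture or settings), the sketch below fine-tunes a pretrained ResNet-18 head on age bins, assuming a recent torchvision; the bin count and dummy tensors are placeholders for preprocessed WIKI images:

```python
import torch
import torch.nn as nn
from torchvision import models

# Pretrained backbone; replace the final layer with age-bin classification.
NUM_AGE_BINS = 8                      # hypothetical: 0-9, 10-19, ..., 70+
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False           # freeze the pretrained features
model.fc = nn.Linear(model.fc.in_features, NUM_AGE_BINS)  # trainable head

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)

def train_step(images, age_bins):
    """One retraining step on a batch of (image tensor, age-bin label) pairs."""
    optimizer.zero_grad()
    loss = criterion(model(images), age_bins)
    loss.backward()
    optimizer.step()
    return loss.item()

# Smoke test with random tensors standing in for preprocessed WIKI images.
dummy_images = torch.randn(4, 3, 224, 224)
dummy_bins = torch.randint(0, NUM_AGE_BINS, (4,))
print(train_step(dummy_images, dummy_bins))
```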