Permeability data are of major importance in all reservoir simulation studies, and their importance increases in mature oil and gas fields because specific improved-recovery methods are sensitive to permeability. However, the industry holds a huge stock of air permeability measurements against only a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation for converting the air permeability data that are conventionally measured during laboratory core analysis into liquid permeability. The correlation provides a feasible estimate in cases of data loss, in poorly consolidated formations, or when old cores are no longer available for liquid permeability measurements. Moreover, the conversion formula allows better use of the large volume of legacy air permeability data from routine core analysis in subsequent reservoir and geological modeling studies.
A comparison analysis shows that the suggested conversion formula gives accurate and consistent results over a wide range of permeability values.
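The correlation itself is not reproduced in this abstract. As a minimal sketch of how such a conversion can be calibrated, assuming a power-law form k_liq = a·k_air^b fitted on paired core measurements (the data and coefficients below are hypothetical, not the study's), the fit reduces to a straight line in log-log space:

```python
import numpy as np

# Hypothetical paired core measurements in millidarcies; the dataset
# and the power-law form are assumptions for illustration only.
k_air = np.array([0.5, 2.1, 8.7, 35.0, 120.0, 480.0])  # routine air permeability
k_liq = np.array([0.3, 1.4, 6.2, 27.0, 95.0, 400.0])   # special-core liquid permeability

# Assume k_liq = a * k_air**b and fit it as a line in log-log space.
b, log_a = np.polyfit(np.log10(k_air), np.log10(k_liq), 1)
a = 10.0 ** log_a

def air_to_liquid(k_air_md: float) -> float:
    """Estimate liquid permeability (mD) from an air permeability reading."""
    return a * k_air_md ** b

print(f"k_liq ≈ {a:.3f} * k_air^{b:.3f}")
print(f"50 mD air → {air_to_liquid(50.0):.1f} mD liquid")
```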
In the field of civil engineering, the adoption of Falling Weight Deflectometers (FWDs) is seen as a response to an ever-changing, technology-driven world. Specifically, FWDs are devices that help evaluate the physical properties of a pavement. This paper assesses the concepts of data processing, storage, and analysis with FWDs. The device plays an important role in enabling operators and field practitioners to understand the vertical deflection response of a pavement subjected to an impulse load. The resulting data and its analysis then support backcalculation of layer stiffness, with initial analyses of the deflection bowl carried out in conjunction with measured or assumed layer thicknesses.
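As a hedged illustration of the first step of such an analysis, the sketch below computes the Boussinesq surface modulus of a pavement from the centre deflection of a single hypothetical FWD drop; a full multi-layer backcalculation would iterate on layer moduli beyond this point:

```python
def surface_modulus(stress_pa: float, plate_radius_m: float,
                    center_deflection_m: float, poisson: float = 0.35) -> float:
    """Boussinesq surface modulus from the FWD centre deflection:
    E0 = 2 * (1 - v**2) * sigma0 * a / d0 for a uniform circular load
    on an elastic half-space."""
    return (2.0 * (1.0 - poisson ** 2) * stress_pa * plate_radius_m
            / center_deflection_m)

# Hypothetical drop: 707 kPa on a 150 mm radius plate, 350 µm deflection.
e0 = surface_modulus(707e3, 0.15, 350e-6)
print(f"E0 ≈ {e0 / 1e6:.0f} MPa")
```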
In this research, a comparison is made between robust M-estimators for the cubic smoothing splines technique, which avoid the problems of non-normal data and contaminated errors, and the traditional estimation method for cubic smoothing splines, using two comparison criteria (MADE and WASE) across different sample sizes and contamination levels. The goal is to estimate the time-varying coefficient functions for balanced longitudinal data, which consist of observations obtained from n independent subjects, each measured repeatedly at a set of m specific time points, since the repeated measurements within a subject are typically correlated.
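The paper's specific M-estimator is not given in this excerpt; as a generic sketch of the idea, a cubic smoothing spline can be refitted under Huber weights in an iteratively reweighted loop so that outliers lose influence (all names and data below are illustrative):

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def huber_weights(resid: np.ndarray, c: float = 1.345) -> np.ndarray:
    """Huber M-estimator weights: 1 inside the cutoff, c/|r| outside."""
    scale = np.median(np.abs(resid - np.median(resid))) / 0.6745 + 1e-12
    u = np.abs(resid) / scale
    return c / np.maximum(u, c)          # equals 1 whenever u <= c

def robust_smoothing_spline(x, y, s=None, iters=10):
    """Cubic smoothing spline refitted under Huber weights (IRLS)."""
    w = np.ones_like(y)
    for _ in range(iters):
        spl = UnivariateSpline(x, y, w=w, k=3, s=s)
        w = huber_weights(y - spl(x))
    return spl

# Synthetic data with a contaminated error term (every 10th point shifted).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 80)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(80)
y[::10] += 2.0
spl = robust_smoothing_spline(x, y, s=2.0)   # smoothing level chosen by hand
```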
Background: Genetic disorders are a leading cause of spontaneous abortion, neonatal death, and increased morbidity and mortality in children and adults. They impose a significant health-care and psychosocial burden on the patient, the family, the health-care system, and the community as a whole. Chromosomal abnormalities occur much more frequently than is generally appreciated: approximately 1 in 200 newborn infants is estimated to have some form of chromosomal abnormality, and the figure is much higher in fetuses that do not survive to term; in an estimated 50% of first-trimester abortions, the fetus has a chromosomal abnormality. Aim of the study: This study aims to shed some light on the results of chromosomal studies.
With the development of cloud computing in recent years, data center networks have become a major topic in both industry and academia. Nevertheless, traditional methods based on manual configuration and hardware devices are burdensome, expensive, and cannot fully exploit the capability of the physical network infrastructure. Software-Defined Networking (SDN) has therefore been promoted as one of the most promising solutions for future Internet performance. SDN is notable for two features: the separation of the control plane from the data plane, and support for network development through programmable capabilities instead of hardware solutions. The current paper introduces an SDN-based optimized rescheduling approach.
Regression analysis is used to study and predict the surface response using design of experiments (DOE), as well as to calculate roughness by developing a mathematical model. In this study, response surface methodology and the particular solution technique are used. Design of experiments applies a structured statistical analytic approach to investigate the relationship between a set of parameters and their responses. Surface roughness is one of the important response parameters, and it is found that cutting speed has only a small effect on it. This work focuses on all the considerations needed to model the interaction between the parameters (position of influence).
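As a hedged sketch of the response-surface step, the snippet below fits a second-order model of surface roughness against cutting speed and feed by ordinary least squares; the runs, factor names, and values are hypothetical, not the study's DOE:

```python
import numpy as np

# Hypothetical DOE runs: cutting speed v (m/min), feed f (mm/rev),
# and measured surface roughness Ra (µm).
v = np.array([100, 100, 200, 200, 150, 150, 150, 100, 200], dtype=float)
f = np.array([0.1, 0.3, 0.1, 0.3, 0.2, 0.1, 0.3, 0.2, 0.2])
Ra = np.array([1.8, 3.9, 1.6, 3.4, 2.4, 1.7, 3.6, 2.8, 2.5])

# Second-order response surface: Ra = b0 + b1*v + b2*f + b3*v*f + b4*v^2 + b5*f^2
X = np.column_stack([np.ones_like(v), v, f, v * f, v ** 2, f ** 2])
coef, *_ = np.linalg.lstsq(X, Ra, rcond=None)

def predict(v: float, f: float) -> float:
    """Predicted roughness at an arbitrary (speed, feed) setting."""
    return coef @ np.array([1.0, v, f, v * f, v ** 2, f ** 2])

print(f"Ra(175, 0.15) ≈ {predict(175.0, 0.15):.2f} µm")
```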
The Atmospheric Infrared Sounder (AIRS) on the EOS/Aqua satellite provides diverse measurements of the methane (CH4) distribution at different pressure levels in the Earth's atmosphere. The focus of this research is to analyze the vertical variations of the CH4 volume mixing ratio (VMR) time series at four standard pressure levels (925, 850, 600, and 300 hPa) in the troposphere above six cities in Iraq from January 2003 to September 2016. Analysis of the monthly average CH4 VMR time series shows a significant increase between 2003 and 2016, especially from 2009 to 2016; the minimum values of CH4 occurred in 2003 and the maximum values in 2016. The vertical distribution of CH4 was also examined across the four pressure levels.
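As a minimal sketch of the time-series analysis described, with a synthetic monthly VMR series standing in for the AIRS retrievals, a least-squares line quantifies the 2003-2016 increase at one pressure level:

```python
import numpy as np

# Synthetic monthly CH4 VMR series (ppbv), Jan 2003 .. Sep 2016 = 165 months;
# the trend and seasonal amplitude below are placeholders, not AIRS values.
months = np.arange(165)
ch4 = 1760.0 + 0.35 * months + 5.0 * np.sin(2 * np.pi * months / 12)

slope, intercept = np.polyfit(months, ch4, 1)   # linear trend (ppbv/month)
print(f"trend ≈ {12 * slope:.1f} ppbv per year")
```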
The Bayesian estimation of reliability for the stress (Y) – strength (X) model, which describes the life of a component with strength X under stress Y (the component fails if and only if, at any time, the applied stress exceeds its strength), has been studied; the reliability R = P(Y < X) can then be considered a measure of component performance. In this paper, a Bayesian analysis is carried out for R when X and Y are independent Weibull random variables with common shape parameter α, in order to study the effect of each of the two different scale parameters β and λ, using three different loss functions [weighted, quadratic, and entropy] under two different prior functions [Gamma and the extension of Jeffreys' prior].
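Under one common parametrization (an assumption here, not necessarily the paper's), with survival function S(x) = exp(-θx^α), the reliability integrates in closed form: R = P(Y < X) = ∫ (1 − e^{−λx^α}) αβ x^{α−1} e^{−βx^α} dx = λ/(β + λ). The sketch below checks this by Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, beta, lam = 2.0, 1.5, 3.0   # common shape; scale-type parameters of X and Y

def rweibull(theta: float, n: int) -> np.ndarray:
    """Draw from the Weibull with survival exp(-theta * x**alpha)."""
    return (rng.exponential(1.0, n) / theta) ** (1.0 / alpha)

x = rweibull(beta, 200_000)        # strength X
y = rweibull(lam, 200_000)         # stress Y
print((y < x).mean())              # Monte Carlo estimate of R
print(lam / (beta + lam))          # closed form: lam / (beta + lam)
```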
OpenStreetMap (OSM) is the most prominent example of an online volunteered mapping platform. Most of its spatial data are collected by non-expert volunteers using a variety of collection methods, with the aim of providing a free digital map of the whole world. This heterogeneity in data collection makes the accuracy of OSM databases unreliable, so they must be treated with caution in any engineering application. This study assesses the horizontal positional accuracy of three spatial data sources for Baghdad city: the OSM road network database, a high-resolution satellite image (SI), and a high-resolution aerial photo (AP), each evaluated with respect to an analogue formal road network dataset.
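As a minimal sketch of the positional-accuracy assessment, assuming matched point pairs between a tested dataset (e.g. OSM) and the formal reference network (the coordinates below are hypothetical), the horizontal root-mean-square error is:

```python
import numpy as np

# Hypothetical matched points as (easting, northing) in metres.
tested = np.array([[445102.3, 3685201.8],
                   [445310.9, 3685533.1],
                   [444987.6, 3685044.0]])
reference = np.array([[445100.0, 3685200.0],
                      [445308.0, 3685535.0],
                      [444990.0, 3685042.5]])

d = np.hypot(*(tested - reference).T)   # per-point horizontal offsets
rmse = float(np.sqrt(np.mean(d ** 2)))  # horizontal RMSE summary statistic
print(f"RMSE ≈ {rmse:.2f} m")
```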
At the analytical level, this paper seeks to identify the security topics covered by data journalism, the expressive methods used in the statements of the Security Media Cell, and the means of clarification used in data journalism, as well as the presentation methods the public prefers for press releases and, in particular, the strength of respondents' attitudes towards the data issued by the Security Media Cell. The analytical study examined the statements of the Security Media Cell, while the field study involved distributing a questionnaire to the public of Baghdad Governorate. The study reached several results, the most important being the Security Media Cell's interest in presenting its data in different forms.
This study investigates the role of Big Data in forecasting corporate bankruptcy through a field analysis in the Saudi business environment. The study found that Big Data is a recently adopted variable in the business context with multiple accounting effects and benefits, among them forecasting and disclosing corporate financial failure and bankruptcy, which rests on three main elements of reporting and disclosure: the firm's internal control system, external auditing, and financial analysts' forecasts. Since the greatest risk of Big Data is the slow adaptation of accountants and auditors to these technologies, the study recommends addressing this adaptation gap.