Machine learning offers significant advantages for many problems in the oil and gas industry, particularly for resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict from conventional logging data. This study presents a clarified workflow methodology alongside comprehensive models. Its purpose is to provide a more robust technique for predicting permeability; previous studies of the Bazirgan field have attempted this, but their estimates were vague and the methods used are outdated and poorly suited to a rigorous permeability computation. To verify the reliability of the training data for zone-by-zone modeling, the workflow was split into two scenarios and applied to data from seven wells. All wellbore intervals were processed, including all five units of the Mishrif formation. The findings show that the more data available, the more accurate the forecasting model becomes. Multi-resolution graph-based clustering demonstrated stable forecasting performance in both scenarios when compared against five other machine learning models.
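As a rough, hedged illustration of the multi-model comparison described above (not the authors' workflow or data), the sketch below cross-validates several standard regressors on synthetic stand-ins for well-log features; the feature set, model list, and permeability proxy are placeholders only.

```python
# Hedged sketch: comparing several regressors for log-based permeability
# prediction. Features, target, and models are illustrative placeholders,
# not the study's data or its multi-resolution graph-based clustering.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR

rng = np.random.default_rng(0)
# Synthetic stand-ins for well-log inputs (e.g. porosity, gamma ray, density).
X = rng.normal(size=(500, 3))
log_perm = 2.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.3, size=500)

models = {
    "linear": LinearRegression(),
    "knn": KNeighborsRegressor(),
    "svr": SVR(),
    "random_forest": RandomForestRegressor(random_state=0),
    "gradient_boosting": GradientBoostingRegressor(random_state=0),
}

for name, model in models.items():
    scores = cross_val_score(model, X, log_perm, cv=5,
                             scoring="neg_root_mean_squared_error")
    print(f"{name:>18}: RMSE = {-scores.mean():.3f}")
```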
An analytical approach based on field data was used to determine the strength capacity of large-diameter bored piles. Deformations and settlements were also evaluated for both vertical and lateral loading. The analytical predictions are compared with field data obtained from a prototype test pile used at the Tharthar–Tigris Canal Bridge, and were found to be in acceptable agreement, with a deviation of about 12%.
Following ASTM standard D1143M-07e1 (2010), a test schedule of five loading cycles was proposed for vertical loads, together with a series of cyclic loads to simulate horizontal loading. The load test results and analytical data of 1.95
A two-time-step stochastic multi-variable, multi-site hydrological data forecasting model was developed and verified using a case study. The philosophy of this model is to use the cross-variable correlations, cross-site correlations, and two-step time-lag correlations simultaneously to estimate the parameters of the model, which are then modified using the mutation process of a genetic algorithm optimization model. The objective function to be minimized is the Akaike information criterion (AIC) value. The case study involves four variables and three sites. The variables are monthly air temperature, humidity, precipitation, and evaporation; the sites are Sulaimania, Chwarta, and Penjwin, located in northern Iraq. The model performance was verified for this case study.
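As a hedged baseline for the two-time-step idea (not the genetic-algorithm estimation used in the study), the sketch below fits a plain lag-2 vector autoregression over synthetic site/variable series and reports its AIC; the data, series layout, and forecast horizon are assumptions for illustration.

```python
# Hedged sketch: a lag-2 (two time step) vector autoregression over stacked
# site/variable series, scored with AIC. Baseline illustration only; it does
# not reproduce the genetic-algorithm parameter mutation from the abstract.
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(1)
n_months = 240
# Columns stand in for (site, variable) pairs, e.g. Sulaimania temperature,
# Chwarta precipitation, ... (placeholders only).
data = rng.normal(size=(n_months, 6)).cumsum(axis=0) * 0.1 \
       + rng.normal(size=(n_months, 6))

fitted = VAR(data).fit(2)       # two time-step lag
print("AIC:", fitted.aic)       # criterion to minimise when comparing fits

forecast = fitted.forecast(data[-2:], steps=12)  # 12 months ahead
print(forecast.shape)
```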
Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed and examined for analyzing this type of data.
In this research, the focus was on grouping and analyzing these data, as cluster analysis plays an important role in identifying and grouping co-expressed profiles over time and feeding them into the nonparametric cubic smoothing B-spline model. This model provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; it is also flexible and can capture more complex patterns and fluctuations in the data (a minimal sketch of this smoothing-and-clustering step is given below).
The balanced longitudinal data profiles were compiled into subgroups.
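A minimal sketch of the idea, assuming a common balanced time grid, synthetic profiles, and a simple k-means grouping rather than the study's exact estimator:

```python
# Hedged sketch: smooth each longitudinal profile with a cubic B-spline and
# then group the smoothed curves. Data, smoothing level, and cluster count
# are placeholders, not the study's settings.
import numpy as np
from scipy.interpolate import UnivariateSpline
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
time = np.linspace(0.0, 1.0, 20)        # common (balanced) time grid

# Two synthetic groups of noisy longitudinal profiles.
profiles = np.vstack([
    np.sin(2 * np.pi * time) + rng.normal(scale=0.2, size=(30, 20)),
    0.5 * time[None, :] + rng.normal(scale=0.2, size=(30, 20)),
])

# Fit a cubic (k=3) smoothing spline per subject; s controls smoothness.
smoothed = np.array([
    UnivariateSpline(time, y, k=3, s=0.5)(time) for y in profiles
])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(smoothed)
print(np.bincount(labels))
```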
This paper presents a fast and robust approach to English text encryption and decryption based on the Pascal matrix. The technique encrypts Arabic or English text (or both), and the results show how the method transforms intelligible plain text (the original message) into unintelligible cipher text in order to secure information from unauthorized access and theft. The encryption scheme uses a pseudo-random encryption key generated by an algorithm, and all of this is done using the Pascal matrix. Encryption and decryption are implemented in MATLAB as the programming language, with Notepad++ used to prepare the input text.
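A toy sketch of the general Pascal-matrix idea, assuming text is encoded as integer character codes and processed in fixed-size blocks; this is not the authors' MATLAB implementation and is not a secure cipher on its own.

```python
# Hedged sketch: encrypt blocks of character codes with a lower-triangular
# Pascal matrix and decrypt with its exact integer inverse. Block size and
# padding scheme are illustrative assumptions.
from math import comb

def pascal_lower(n):
    """Lower-triangular Pascal matrix with entries C(i, j)."""
    return [[comb(i, j) if j <= i else 0 for j in range(n)] for i in range(n)]

def pascal_lower_inv(n):
    """Exact integer inverse: entries (-1)**(i - j) * C(i, j)."""
    return [[(-1) ** (i - j) * comb(i, j) if j <= i else 0 for j in range(n)]
            for i in range(n)]

def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def encrypt(text, n=4):
    codes = [ord(c) for c in text]
    codes += [0] * (-len(codes) % n)        # pad to a multiple of the block size
    P = pascal_lower(n)
    return [matvec(P, codes[i:i + n]) for i in range(0, len(codes), n)]

def decrypt(blocks, n=4):
    Pinv = pascal_lower_inv(n)
    codes = [c for block in blocks for c in matvec(Pinv, block)]
    return "".join(chr(c) for c in codes if c != 0)   # drop padding zeros

cipher = encrypt("Pascal matrix demo")
print(decrypt(cipher))   # -> "Pascal matrix demo"
```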
A CNC machine is used to machine complex or simple shapes at high speed with maximum accuracy and minimum error. In this paper, a previously designed CNC control system is used to machine ellipses and polylines. The sample to be machined is drawn using drawing software such as AUTOCAD® or 3D MAX and saved in the well-known DXF file format; that file is then fed to the CNC machine controller by the CNC operator, and the part is machined by the CNC machine. The CNC controller uses developed algorithms that read the DXF file fed to the machine, extract the shapes from the file, and generate commands to move the CNC machine axes so that these shapes can be machined.
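A hedged sketch of the DXF-to-motion-command step, assuming the third-party ezdxf package and a straight-line approximation of curves; the entity handling, tolerance, and file path are illustrative assumptions, not the authors' controller algorithms.

```python
# Hedged sketch: read polylines and ellipses from a DXF file and emit simple
# G-code-like linear moves. "part.dxf" is a placeholder path.
import ezdxf

def dxf_to_moves(path, tolerance=0.01):
    doc = ezdxf.readfile(path)
    msp = doc.modelspace()
    moves = []

    # Polylines: walk their vertices directly.
    for pline in msp.query("LWPOLYLINE"):
        pts = pline.get_points("xy")
        moves.append(("rapid", pts[0]))                 # move to start point
        moves.extend(("feed", p) for p in pts[1:])      # cut along segments

    # Ellipses: approximate the curve by short straight segments.
    for ellipse in msp.query("ELLIPSE"):
        pts = [(v.x, v.y) for v in ellipse.flattening(tolerance)]
        moves.append(("rapid", pts[0]))
        moves.extend(("feed", p) for p in pts[1:])
    return moves

for kind, (x, y) in dxf_to_moves("part.dxf"):
    code = "G0" if kind == "rapid" else "G1"
    print(f"{code} X{x:.3f} Y{y:.3f}")
```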
Geomechanical modelling and simulation are introduced to accurately determine the combined effects of hydrocarbon production and changes in rock properties due to geomechanical effects. The reservoir geomechanical model is concerned with stress-related issues and rock failure in compression, shear, and tension induced by reservoir pore-pressure changes due to depletion. In this paper, a rock mechanical model is constructed in geomechanical mode, and reservoir geomechanics simulations are run for a carbonate gas reservoir. The study begins with an assessment of the data, followed by the construction of 1D rock mechanical models along the well trajectory, the generation of a 3D mechanical earth model, and the running of the reservoir geomechanics simulation.
Rock mechanical properties are critical parameters for many development techniques related to tight reservoirs, such as hydraulic fracturing design and detecting failure criteria in wellbore instability assessment. When direct measurements of mechanical properties are not available, it is helpful to have sufficient correlations to estimate these parameters. This study summarizes experimentally derived correlations for estimating the shear velocity, Young's modulus, Poisson's ratio, and compressive strength. A useful correlation is also introduced to convert dynamic elastic properties obtained from log data into static elastic properties. Most of the derived equations in this paper fit the measured data well, while some show scatter.
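For reference, the widely used dynamic-elasticity relations from sonic and density logs can be written as below; the dynamic-to-static scaling factor in the example is a placeholder, not one of the paper's derived correlations.

```python
# Hedged sketch: standard relations for dynamic elastic properties from
# compressional velocity (vp), shear velocity (vs) and bulk density (rho).
def dynamic_elastic_properties(vp, vs, rho):
    """vp, vs in m/s; rho in kg/m^3. Returns (E_dyn in GPa, Poisson's ratio)."""
    nu = (vp**2 - 2.0 * vs**2) / (2.0 * (vp**2 - vs**2))
    e_dyn = rho * vs**2 * (3.0 * vp**2 - 4.0 * vs**2) / (vp**2 - vs**2)  # Pa
    return e_dyn / 1e9, nu

e_dyn, nu = dynamic_elastic_properties(vp=4500.0, vs=2600.0, rho=2650.0)
e_static = 0.7 * e_dyn   # placeholder scaling, not the paper's correlation
print(f"E_dyn = {e_dyn:.1f} GPa, nu = {nu:.3f}, E_static ~ {e_static:.1f} GPa")
```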
Intelligent or smart completion wells differ from conventional wells: they have downhole flow control devices, such as inflow control devices (ICDs) and interval control valves (ICVs), to enhance reservoir management and control and to optimize hydrocarbon output and recovery. However, a high level of justification is necessary to support their adoption and increase their economic return. Smart horizontal wells also require optimizing the number of valves, the number of nozzles, and the compartment length. A three-dimensional geological model of the As reservoir in the AG oil field was used to examine the influence of these factors on cumulative oil production and NPV. The dynamic model for the As reservoir was created using the program Petrel (2017.4).
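A hedged sketch of the NPV ranking step, assuming a simple yearly discounted-cash-flow model; the prices, costs, discount rate, and production profiles are placeholders rather than values from the study.

```python
# Hedged sketch: discounted-cash-flow NPV used to compare completion
# scenarios by incremental production. All numbers are placeholders.
def npv(yearly_cash_flows, discount_rate=0.10):
    return sum(cf / (1.0 + discount_rate) ** (t + 1)
               for t, cf in enumerate(yearly_cash_flows))

def scenario_npv(yearly_oil_bbl, oil_price=60.0, opex_per_bbl=12.0, capex=5.0e6):
    cash_flows = [(oil_price - opex_per_bbl) * q for q in yearly_oil_bbl]
    return npv(cash_flows) - capex

conventional = scenario_npv([180_000, 150_000, 120_000, 100_000], capex=3.0e6)
smart_icv = scenario_npv([210_000, 185_000, 150_000, 125_000], capex=5.0e6)
print(f"conventional NPV: ${conventional:,.0f}")
print(f"smart (ICV) NPV:  ${smart_icv:,.0f}")
```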
The vast advantages of the 3D modelling industry have urged competitors to improve capturing techniques and processing pipelines, aiming to minimize labour requirements, save time, and reduce project risk. When it comes to digital 3D documentation and conservation projects, laser scanning and photogrammetry are commonly compared in order to choose between the two. Since both techniques have pros and cons, this paper examines the potential issues of each technique in terms of time, budget, accuracy, density, methodology, and ease of use. A terrestrial laser scanner and close-range photogrammetry are tested to document a unique, invaluable artefact (the Lady of Hatra) located in Iraq for future data fusion sc
A database is characterized as an arrangement of data that is organized and distributed in a way that allows the user to access the stored data simply and conveniently. However, in the era of big data, traditional methods of data analytics may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement for managing and processing the EEG big data, with an average reduction of 50% in response time. The obtained results provide EEG r
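A hedged, Hadoop-streaming-style sketch of the Map-Reduce handling described above; the per-line input format and the per-channel mean aggregation are assumptions for illustration, not the study's EEG schema.

```python
# Hedged sketch: mapper emits (channel, sample) pairs, reducer aggregates a
# per-channel mean. Locally this runs in one process; on Hadoop the same
# mapper/reducer logic would be distributed by the streaming framework.
import sys
from collections import defaultdict

def mapper(lines):
    """Each input line is assumed to be: <channel_id>\\t<sample_value>."""
    for line in lines:
        channel, value = line.rstrip("\n").split("\t")
        yield channel, float(value)

def reducer(pairs):
    """Aggregate the mapper output into a per-channel mean."""
    totals, counts = defaultdict(float), defaultdict(int)
    for channel, value in pairs:
        totals[channel] += value
        counts[channel] += 1
    return {ch: totals[ch] / counts[ch] for ch in totals}

if __name__ == "__main__":
    print(reducer(mapper(sys.stdin)))
```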