With growing global demand for hydrocarbons and declining conventional reserves, the gas industry is shifting its focus toward unconventional reservoirs. Tight gas reservoirs have typically been deemed uneconomical due to their low permeability, generally understood to be below 0.1 mD, which requires advanced drilling techniques and stimulation to enhance hydrocarbon recovery. However, the first step in determining the economic viability of a reservoir is to estimate how much gas is initially in place. Numerical simulation is regarded across the industry as the most accurate form of gas estimation, but it is extremely costly and time-consuming. The aim of this study is to provide a framework for a simple analytical method of gas estimation. During production, three variables are usually readily accessible: production rate, production time, and pressure-volume-temperature properties. This paper develops an analytical approach derived from the dynamic material balance, proposing a new iterative methodology to calculate pseudo-time. The model incorporates pseudo-functions that account for pressure-dependent fluid and rock properties. Because the dynamic material balance yields weak results in the linear flow regime, an additional methodology derived from the volumetric tank model is also considered, in which the equivalent drainage area is linked to the total reservoir area. Even with short production histories, this volumetric approach is shown to yield accurate results. The proposed methodology has been validated against previous literature, and additional cases are considered to determine its sensitivity to reservoir parameters. Finally, it is shown that the method works for both fractured and unfractured wells in tight gas reservoirs; however, it is sensitive to the amount of data within the pseudo-steady-state flow period.
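For reference, the pseudo-functions referred to above are conventionally defined as follows in the rate-transient-analysis literature (standard forms, not reproduced from this paper): the gas pseudo-pressure and the material-balance pseudo-time

$$
m(p) = 2\int_{p_{\mathrm{ref}}}^{p} \frac{p'}{\mu(p')\,z(p')}\,dp',
\qquad
t_a = \frac{(\mu c_t)_i}{q(t)} \int_0^t \frac{q(\tau)}{\mu(\bar{p})\,c_t(\bar{p})}\,d\tau,
$$

where $\mu$ is the gas viscosity, $z$ the compressibility factor, $c_t$ the total compressibility, and $\bar{p}$ the average reservoir pressure. Because $\bar{p}$ is itself obtained from the material balance, which depends on the gas initially in place, $t_a$ must be recomputed as the gas-in-place estimate is updated; this is what makes the pseudo-time calculation iterative.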
The oil and gas industry relies heavily on IT innovations to manage business processes, but the exponential growth of data has raised concerns about processing big data, generating valuable insights, and making timely decisions. Many companies have adopted Big Data Analytics (BDA) solutions to address these challenges. However, deciding on the adoption of BDA solutions requires a thorough understanding of the contextual factors influencing these decisions. This research explores these factors using a new Technology-Organisation-Environment (TOE) framework, presenting technological, organisational, and environmental factors. The study used a Delphi research method with seven heterogeneous panelists from an Omani oil and gas company.
With the development of computer architecture and its technologies in recent years, applications such as e-commerce, e-government, e-governance, and e-finance have become widely used and are active research areas. In addition, to increase the quality and quantity of ordinary everyday transactions, it is desirable to migrate from a paper-based environment to a computerized digital environment. Such migration increases efficiency, saves time, eliminates paperwork, improves safety, and reduces costs in an organization. Digital signatures play an essential role in many electronic and automated systems and facilitate this migration. Digital signatures are used to provide many services and s
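As a concrete illustration of the sign/verify workflow such systems build on, here is a minimal sketch using the Python `cryptography` package; the message content and variable names are illustrative and not taken from the paper:

```python
# Minimal RSA sign/verify sketch with the `cryptography` package.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.exceptions import InvalidSignature

# Generate a key pair (in practice the private key is provisioned once, not per message).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"approve purchase order #1234"  # hypothetical transaction payload
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)

signature = private_key.sign(message, pss, hashes.SHA256())

try:
    public_key.verify(signature, message, pss, hashes.SHA256())
    print("signature valid")
except InvalidSignature:
    print("signature invalid: message altered or wrong key")
```

Verification fails if even one byte of the message changes, which is what allows a paperless transaction to carry the same non-repudiation guarantees as a handwritten signature.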
This research sheds light on the role of the physical environment in creating place attachment by discussing one of the important factors in attachment creation: the concept of place dependence, which consists of two important dimensions, place quality and place expectation. These contain a number of physical-environment sub-indicators that support place attachment. Eight physical indicators were identified and found to have a close relationship to place attachment, including: the existence of open and green spaces, land-use diversity, diversity of housing types, dwelling/population density, accessibility, the degree of transport-network development, and transport multiple mo
Information security is a crucial factor when communicating sensitive information between two parties, and steganography is one of the techniques most widely used for this purpose. This paper aims to enhance the capacity and robustness of information hiding by compressing image data to a small size while maintaining high quality, so that the secret information remains invisible and only the sender and recipient can recognize the transmission. Three techniques are employed to conceal color and gray images: the Wavelet Color Process Technique (WCPT), the Wavelet Gray Process Technique (WGPT), and the Hybrid Gray Process Technique (HGPT). A comparison between the first and second techniques according to quality metrics, Root-Mean-Square Error (RMSE), Compression-
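For reference, the RMSE quality metric cited above is conventionally defined for an $M \times N$ image $I$ and its reconstruction $\hat{I}$ as (standard definition, not taken from the paper itself):

$$
\mathrm{RMSE} = \sqrt{\frac{1}{MN}\sum_{i=1}^{M}\sum_{j=1}^{N}\bigl(I(i,j)-\hat{I}(i,j)\bigr)^{2}},
$$

with lower values indicating that the compressed or stego image is closer to the original.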
The aim of this work is to design an algorithm that combines steganography and cryptography to hide text in an image in a way that prevents, as much as possible, any suspicion of the hidden text. The proposed system prepares the image data for the next step (DCT quantization) through a steganographic process and uses two levels of security, the RSA algorithm and a digital signature, then stores the image in JPEG format. In this case, the secret message is treated as plaintext with a digital signature, while the cover is a colored image. The results of the algorithm are then evaluated against several criteria that demonstrate its sufficiency and effectiveness. Thus, the proposed algorit
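The abstract does not spell out the embedding step, but a generic DCT-quantization hiding scheme of the kind it references can be sketched as follows. This is a minimal illustration in Python; the function names, chosen coefficient, and quantization table are assumptions, not the paper's exact method:

```python
# Sketch: hide one bit per 8x8 block in the LSB of a quantized mid-frequency DCT coefficient.
import numpy as np
from scipy.fft import dctn, idctn

# Standard JPEG luminance quantization table (quality ~50), used here for illustration.
Q50 = np.array([
    [16, 11, 10, 16,  24,  40,  51,  61],
    [12, 12, 14, 19,  26,  58,  60,  55],
    [14, 13, 16, 24,  40,  57,  69,  56],
    [14, 17, 22, 29,  51,  87,  80,  62],
    [18, 22, 37, 56,  68, 109, 103,  77],
    [24, 35, 55, 64,  81, 104, 113,  92],
    [49, 64, 78, 87, 103, 121, 120, 101],
    [72, 92, 95, 98, 112, 100, 103,  99],
])

def embed_bit(block, bit, coeff=(4, 3)):
    """Embed one message bit into an 8x8 pixel block via its quantized DCT."""
    c = dctn(block.astype(float) - 128.0, norm='ortho')  # level-shift + 2-D DCT
    q = np.round(c / Q50).astype(int)                    # JPEG-style quantization
    q[coeff] = (q[coeff] & ~1) | bit                     # write bit into the coefficient's LSB
    return idctn(q * Q50, norm='ortho') + 128.0          # dequantize + inverse DCT

def extract_bit(block, coeff=(4, 3)):
    c = dctn(block.astype(float) - 128.0, norm='ortho')
    return int(np.round(c[coeff] / Q50[coeff])) & 1

cover = np.random.default_rng(0).integers(0, 256, (8, 8))
assert extract_bit(embed_bit(cover, 1)) == 1  # note: real JPEG rounding/clipping can perturb this
```

Embedding in quantized coefficients rather than raw pixels is what lets the hidden bits survive the JPEG quantization stage the abstract describes.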
Free-Space Optical (FSO) communication can provide high-speed links when the effect of turbulence is not serious, and Space-Time Block Coding (STBC) is a good candidate for mitigating turbulence when it is. This paper proposes a hybrid of Optical Code Division Multiple Access (OCDMA) and STBC in FSO communication for last-mile solutions, where access to remote areas is complicated. The main weakness affecting an FSO link is atmospheric turbulence, and the purpose of employing STBC in OCDMA is to mitigate its effects. The current work evaluates the Bit-Error-Rate (BER) performance of OCDMA operating under the scintillation effect, which can be described by the gamma-gamma model. The most obvious finding to emerge from the analysis
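For context, the gamma-gamma model mentioned above describes the normalized received irradiance $I$ with the probability density (standard form from the FSO literature, not reproduced from the paper):

$$
f_I(I) = \frac{2\,(\alpha\beta)^{(\alpha+\beta)/2}}{\Gamma(\alpha)\,\Gamma(\beta)}\,
I^{\frac{\alpha+\beta}{2}-1}\,
K_{\alpha-\beta}\!\left(2\sqrt{\alpha\beta I}\right), \qquad I > 0,
$$

where $\alpha$ and $\beta$ are the effective numbers of large- and small-scale turbulence eddies and $K_\nu(\cdot)$ is the modified Bessel function of the second kind; the average BER is then obtained by averaging the conditional error probability over this density.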
Most industrial organizations in the world suffer from pollution by poisonous chemical substances, which has encouraged reliance on the principle of responsible production, as it plays a positive role in handling these chemicals and safeguarding public health. The main goal of this study is to determine the role of responsible production in achieving an environmental management system, through a field study at the North Gas Company in Kirkuk province. The topic has acquired considerable importance because there have been a limited number of studies and res
Water pollution resulting from dye-contaminated effluents is a severe issue for water reservoirs, which motivated this study of the biodegradation of Reactive Red 195 and Reactive Blue dyes by E. coli and Bacillus sp. The effects of contact time, solution pH, initial dye concentration, biomass loading, and temperature were investigated via batch experiments using a two-level, five-factor Design of Experiments (DOE) with response surface methodology (RSM). The operating conditions for these factors were optimized using quadratic models while reducing the number of experiments. The results revealed that the two types of bacteria were highly effective at degrading the dyes. The regression analysis reveale
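For reference, the second-order model fitted in such a five-factor RSM study takes the standard form (generic form, not reproduced from the paper):

$$
y = \beta_0 + \sum_{i=1}^{5}\beta_i x_i + \sum_{i=1}^{5}\beta_{ii} x_i^2 + \sum_{i<j}\beta_{ij} x_i x_j + \varepsilon,
$$

where $y$ is the predicted dye removal, the coded factors $x_i$ correspond to contact time, pH, initial dye concentration, biomass loading, and temperature, and the $\beta$ coefficients are estimated by regression from the DOE runs.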