With growing global demand for hydrocarbons and declining conventional reserves, the gas industry is shifting its focus toward unconventional reservoirs. Tight gas reservoirs have typically been deemed uneconomical because of their low permeability, generally understood to be below 0.1 mD, which requires advanced drilling techniques and stimulation to enhance hydrocarbon recovery. However, the first step in determining the economic viability of such a reservoir is to estimate how much gas is initially in place. Numerical simulation is regarded across the industry as the most accurate form of gas-in-place estimation; however, it is extremely costly and time-consuming. The aim of this study is to provide a framework for a simple analytical method to estimate gas in place. During production, three quantities are usually readily accessible: production rate, production time, and pressure-volume-temperature properties. This paper develops an analytical approach derived from the dynamic material balance, proposing a new iterative methodology for calculating pseudo-time. The model incorporates pseudo-functions that account for pressure-dependent fluid and rock properties. Because the dynamic material balance yields weak results in the linear flow regime, an additional methodology derived from the volumetric tank model is considered, in which the equivalent drainage area is linked to the total reservoir area. It is shown that this volumetric approach yields accurate results even with short production histories. The proposed methodology has been validated against previous literature, and additional cases are considered to determine the sensitivity of each method to reservoir parameters. Finally, it is shown that the method works for both fractured and unfractured wells in tight gas reservoirs; however, it is sensitive to the amount of data available within the pseudo-steady-state flow period.
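For orientation, the sketch below illustrates the static p/Z gas material balance that the dynamic material balance generalizes: plotting p/Z against cumulative production gives a straight line whose intercept on the production axis is the original gas in place. It is a minimal Python example on synthetic data and does not reproduce the paper's pseudo-time, pseudo-pressure, or iterative workflow; all names and numbers are illustrative.

```python
import numpy as np

def ogip_from_pz(gp, p_over_z):
    """Estimate original gas in place (OGIP) from the volumetric gas
    material balance  p/Z = (p_i/Z_i) * (1 - Gp/G):
    the straight line of p/Z versus cumulative production Gp
    intercepts the Gp axis at G, the OGIP."""
    slope, intercept = np.polyfit(gp, p_over_z, 1)
    return -intercept / slope   # Gp-axis intercept

# Hypothetical 50 Bscf reservoir used purely to exercise the function
G_true, pi_over_zi = 50.0, 5000.0
gp = np.linspace(0.0, 20.0, 10)                  # cumulative production, Bscf
p_over_z = pi_over_zi * (1.0 - gp / G_true)      # exact material-balance line
print(f"estimated OGIP ~ {ogip_from_pz(gp, p_over_z):.1f} Bscf")
```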
Multilocus haplotype analysis of candidate variants with genome-wide association study (GWAS) data may provide evidence of association with disease even when the individual loci themselves do not. Unfortunately, when a large number of candidate variants are investigated, identifying risk haplotypes can be very difficult. To meet this challenge, a number of approaches have been put forward in recent years. However, most of them are not directly linked to the disease penetrances of haplotypes and thus may not be efficient. To fill this gap, we propose a mixture model-based approach for detecting risk haplotypes. Under the mixture model, haplotypes are clustered directly according to their estimated disease penetrances.
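As a rough illustration of the clustering idea (not the authors' model), the sketch below fits a two-component binomial mixture by EM to per-haplotype case-carrier counts, so that haplotypes with similar estimated risk fall into the same cluster; the data, the binomial likelihood, and the two-cluster assumption are all illustrative.

```python
import numpy as np

def em_binomial_mixture(k, n, iters=200):
    """Two-component binomial mixture fitted by EM.
    k[i]: carriers of haplotype i observed in cases; n[i]: total carriers.
    Returns (mixing weight, component probabilities, posterior membership)."""
    w, p = 0.5, np.array([0.3, 0.7])                 # crude initial guesses
    for _ in range(iters):
        # E-step: posterior probability that each haplotype is in the "risk" component
        like0 = (1 - w) * p[0] ** k * (1 - p[0]) ** (n - k)
        like1 = w * p[1] ** k * (1 - p[1]) ** (n - k)
        r = like1 / (like0 + like1)
        # M-step: update the mixing weight and the component probabilities
        w = r.mean()
        p[0] = ((1 - r) * k).sum() / ((1 - r) * n).sum()
        p[1] = (r * k).sum() / (r * n).sum()
    return w, p, r

# Illustrative counts for six haplotypes
k = np.array([12, 10, 30, 28, 11, 33])   # carriers observed in cases
n = np.array([40, 38, 42, 40, 39, 45])   # total carriers (cases + controls)
w, p, r = em_binomial_mixture(k, n)
print("posterior risk-cluster membership:", np.round(r, 2))
```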
Load shedding schemes have been widely implemented as a fast remedy for supply-demand imbalance. It is therefore crucial to keep supply and demand balanced in order to protect the network from collapse and to sustain stability as far as possible; load shedding itself, however, is mostly undesirable. One way to minimize the amount of load shedding is the integration of renewable energy resources, such as wind power, into electric power generation, which can contribute significantly to reducing power cuts thanks to its ability to improve the stability of the electric grid. This paper proposes a method for shedding load based on priority demands while incorporating wind power.
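By way of illustration only (this is not the paper's method), a priority-based scheme can be sketched as: rank loads by a priority index and shed the lowest-priority ones until the deficit remaining after wind generation is covered. The load names, priorities, and numbers below are hypothetical.

```python
def shed_by_priority(loads, generation, wind):
    """Shed the lowest-priority loads until total demand no longer exceeds
    conventional generation plus available wind power.

    loads: list of (name, demand_MW, priority); a LOWER priority number
           means the load is shed FIRST.
    Returns the names of the loads selected for shedding."""
    deficit = sum(demand for _, demand, _ in loads) - (generation + wind)
    shed = []
    for name, demand, _ in sorted(loads, key=lambda load: load[2]):
        if deficit <= 0:
            break
        shed.append(name)
        deficit -= demand
    return shed

# Hypothetical feeder: 95 MW of demand, 70 MW conventional and 15 MW wind available
loads = [("hospital", 20, 3), ("industry", 35, 2), ("residential", 25, 1), ("irrigation", 15, 0)]
print(shed_by_priority(loads, generation=70, wind=15))   # -> ['irrigation']
```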
The continuous advancement in the use of the IoT has greatly transformed industries, but at the same time it has left IoT networks vulnerable to highly sophisticated cybercrime. Traditional security measures for IoT have several limitations; protecting distributed and adaptive IoT systems requires new approaches. This research presents a novel deep-learning-based threat-intelligence framework for IoT networks that maintains compliance with IEEE standards. The goal of the study is to interweave artificial intelligence with standardization frameworks and thereby improve the identification of, protection against, and mitigation of cyber threats impacting IoT environments. The study is systematic and begins by examining IoT-specific threats.
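For context, a minimal sketch of the kind of deep-learning detector such a framework could build on is shown below: a small PyTorch multilayer perceptron classifying synthetic per-flow traffic features as benign or malicious. The feature set, labels, and architecture are assumptions made for illustration; the paper's actual model and its IEEE-standards alignment are not represented.

```python
import torch
from torch import nn

# Synthetic stand-in for per-flow IoT traffic features (packet count, mean size, ...)
# with labels 0 = benign, 1 = malicious generated by a toy rule.
torch.manual_seed(0)
X = torch.randn(512, 8)
y = (X[:, 0] + 0.5 * X[:, 3] > 0).float().unsqueeze(1)

model = nn.Sequential(                 # small MLP threat classifier
    nn.Linear(8, 32), nn.ReLU(),
    nn.Linear(32, 16), nn.ReLU(),
    nn.Linear(16, 1), nn.Sigmoid(),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.BCELoss()

for _ in range(200):                   # simple full-batch training loop
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

accuracy = ((model(X) > 0.5).float() == y).float().mean().item()
print(f"training accuracy: {accuracy:.2f}")
```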
Starting from the term 'forbidden montage', coined by the French critic Andre Bazin as a way of describing films that rely on mise en scene, achieved through the work of the camera, its ability to photograph and to employ depth of field, and the possibility of free, uninterrupted movement within the filming environment, the aim was to avoid montage as much as possible (the montage that distorts focus, distracts attention, and moves away from realism, which is the most important theoretical pillar of Bazin's thinking on cinematography). The pursuit was of a cinema that depicts its subjects in one integrated shot with all their details, thereby approximating reality without any interference from montage. Our study sta
The current research aims to examine the effectiveness of a training program, based on the Picture Exchange Communication System, for children with autism and their mothers in confronting some basic disorders in a sample of children with autism. The study sample consisted of 16 children with autism and their mothers from centers in the cities of Taif and Tabuk. The researcher used a quasi-experimental design with two groups: an experimental group and a control group. The children's ages ranged from 6 to 9 years. The following tools were used: a checklist for estimating the basic disorders of a child with autism aged 6 to 9 years, and a training program for children with autism
A frequently used approach to denoising is shrinkage of the coefficients of the noisy signal's representation in a transform domain. This paper proposes an algorithm based on a hybrid transform (a stationary wavelet transform followed by a slantlet transform): the slantlet transform is applied to the approximation subband of the stationary wavelet transform. The BlockShrink thresholding technique is then applied to the hybrid transform coefficients; this technique selects the optimal block size and threshold for every wavelet subband by minimizing Stein's unbiased risk estimate (SURE). The proposed algorithm was implemented in MATLAB R2010a and tested on natural images contaminated by white Gaussian noise. Numerical results show that our algorithm co
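To make the shrinkage idea concrete, the sketch below performs stationary-wavelet-domain soft thresholding in Python with the PyWavelets package, using the simple universal threshold; it deliberately omits the slantlet stage and the BlockShrink/SURE rule of the proposed algorithm, which would replace the per-coefficient threshold used here.

```python
import numpy as np
import pywt  # PyWavelets

def swt_soft_denoise(signal, wavelet="db4", level=2):
    """Shrinkage denoising in the stationary wavelet domain: soft-threshold
    the detail coefficients with the universal threshold sigma*sqrt(2*log n),
    then reconstruct. Signal length must be a multiple of 2**level."""
    coeffs = pywt.swt(signal, wavelet, level=level)
    # Noise level estimated from the finest-scale details (MAD estimator)
    sigma = np.median(np.abs(coeffs[-1][1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(signal)))
    shrunk = [(cA, pywt.threshold(cD, thr, mode="soft")) for cA, cD in coeffs]
    return pywt.iswt(shrunk, wavelet)

# Noisy 1-D test signal (length 1024 = multiple of 2**level)
t = np.linspace(0, 1, 1024)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.3 * np.random.randn(t.size)
print("RMSE after denoising:", np.sqrt(np.mean((swt_soft_denoise(noisy) - clean) ** 2)))
```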
Image recognition is one of the most important applications of information processing. In this paper, a comparison of 3-level transform-based image recognition techniques is carried out using the discrete wavelet transform (DWT) and the stationary wavelet transform (SWT) in the combinations stationary-stationary-stationary (sss), stationary-stationary-wavelet (ssw), stationary-wavelet-stationary (sws), stationary-wavelet-wavelet (sww), wavelet-stationary-stationary (wss), wavelet-stationary-wavelet (wsw), wavelet-wavelet-stationary (wws), and wavelet-wavelet-wavelet (www). The techniques are compared according to the peak signal-to-noise ratio (PSNR), root mean square error (RMSE), compression ratio (CR), and the coding noise e(n) of each third
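For reference, the two most common of these metrics can be computed as in the minimal Python sketch below for 8-bit images; the compression ratio and coding-noise measures used in the paper are not reproduced here.

```python
import numpy as np

def rmse(original, reconstructed):
    """Root mean square error between two images."""
    diff = original.astype(float) - reconstructed.astype(float)
    return np.sqrt(np.mean(diff ** 2))

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB for images with the given peak value."""
    e = rmse(original, reconstructed)
    return float("inf") if e == 0 else 20.0 * np.log10(peak / e)

# Example: a synthetic 8-bit image and a noisy reconstruction of it
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64)).astype(np.uint8)
rec = np.clip(img + rng.normal(0, 5, img.shape), 0, 255).astype(np.uint8)
print(f"RMSE = {rmse(img, rec):.2f}, PSNR = {psnr(img, rec):.2f} dB")
```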