With growing global demand for hydrocarbons and declining conventional reserves, the gas industry is shifting its focus toward unconventional reservoirs. Tight gas reservoirs have typically been deemed uneconomical because of their low permeability, generally understood to be below 0.1 mD, which requires advanced drilling techniques and stimulation to enhance hydrocarbon recovery. However, the first step in determining the economic viability of a reservoir is to estimate how much gas is initially in place. Numerical simulation is regarded across the industry as the most accurate form of gas estimation; however, it is extremely costly and time consuming. The aim of this study is to provide a framework for a simple analytical method to estimate gas in place. During production, three variables are usually readily accessible: production rate, production time, and pressure-volume-temperature properties. This paper develops an analytical approach derived from the dynamic material balance, proposing a new iterative methodology to calculate pseudo-time. The model incorporates pseudo-functions that account for pressure-dependent fluid and rock properties. Because the dynamic material balance yields weak results in the linear flow regime, an additional methodology derived from the volumetric tank model is considered, whereby the equivalent drainage area is linked to the total reservoir area. It is shown that, even with short production histories, this volumetric approach yields accurate results. The proposed methodology has been validated against previous literature, and additional cases were considered to determine its sensitivity to reservoir parameters. Finally, it is shown that the method works for both fractured and unfractured wells in tight gas reservoirs; however, it is sensitive to the quantity of data that falls within the pseudo-steady-state flow period.
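To make the underlying material-balance relationship concrete, the sketch below shows the classical p/z straight-line estimate of original gas in place (OGIP) that the dynamic material balance extrapolates from flowing data. The function name and the synthetic pressures, z-factors, and cumulative volumes are illustrative assumptions, not values from the study.

```python
# Minimal sketch of the p/z material-balance line: the Gp-axis intercept of
# p/z versus cumulative gas produced gives the original gas in place (OGIP).
# All names and sample data below are hypothetical, for illustration only.
import numpy as np

def ogip_from_pz(p_avg, z, gp):
    """Fit p/z vs Gp to a straight line and return its Gp-axis intercept (OGIP)."""
    pz = np.asarray(p_avg) / np.asarray(z)
    slope, intercept = np.polyfit(gp, pz, 1)   # pz = intercept + slope * Gp
    return -intercept / slope                  # Gp where p/z -> 0

# Illustrative synthetic data: average pressures in psia, Gp in Bscf
p_avg = [4000.0, 3700.0, 3400.0, 3100.0]
z     = [0.92,   0.90,   0.89,   0.88]
gp    = [0.0,    15.0,   31.0,   48.0]
print(f"Estimated OGIP ~ {ogip_from_pz(p_avg, z, gp):.0f} Bscf")
```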
Objectives: Bromelain is a potent proteolytic enzyme whose unique functionality makes it valuable for various therapeutic purposes. This study aimed to develop three novel formulations based on bromelain to be used as chemomechanical caries removal agents. Methods: The novel agents were prepared using different concentrations of bromelain (10–40 wt. %), with and without 0.1–0.3 wt. % chloramine T or 0.5–1.5 wt. % chlorhexidine (CHX). Based on the enzymatic activity test, three formulations were selected: 30 % bromelain (F1), 30 % bromelain-0.1 % chloramine T (F2), and 30 % bromelain-1.5 % CHX (F3). The assessments included molecular docking, Fourier-transform infrared spectroscopy (FTIR), viscosity and pH measurements. The efficiency …
The objective of this study was to introduce a recursive least squares (RLS) parameter estimator enhanced by a neural network (NN) to reduce the bit error rate (BER) during channel estimation of a multiple-input multiple-output orthogonal frequency division multiplexing (MIMO-OFDM) system over a Rayleigh multipath fading channel. Recursive least squares is an efficient approach to neural network training: first, the neural network estimator learns to adapt to the channel variations, and then it estimates the channel frequency response. Simulation results show that the proposed method performs better than the conventional least squares (LS) method and the original RLS, and it is more robust …
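For context, the snippet below shows a plain recursive-least-squares update tracking a single complex channel gain from pilot symbols. The scalar per-subcarrier setup, forgetting factor, and synthetic channel are illustrative assumptions and do not reproduce the paper's NN-assisted MIMO-OFDM estimator.

```python
# Hypothetical sketch: scalar RLS tracking of one subcarrier's channel gain h
# from known pilot symbols x and received samples y = h*x + noise.
import numpy as np

def rls_channel_track(x_pilots, y_received, lam=0.98, delta=100.0):
    """Track a complex channel gain over time with a forgetting-factor RLS."""
    h = 0.0 + 0.0j          # current channel estimate
    p = delta               # inverse correlation (scalar case)
    history = []
    for x, y in zip(x_pilots, y_received):
        k = p * np.conj(x) / (lam + p * abs(x) ** 2)   # RLS gain
        e = y - h * x                                   # a-priori error
        h = h + k * e                                   # update estimate
        p = (p - k * x * p) / lam                       # update inverse correlation
        history.append(h)
    return np.array(history)

# Illustrative use: a slowly rotating channel observed through QPSK pilots
rng = np.random.default_rng(0)
n = 200
h_true = (1 + 0.5j) * np.exp(1j * 0.01 * np.arange(n))
pilots = rng.choice(np.array([1+1j, 1-1j, -1+1j, -1-1j]) / np.sqrt(2), n)
rx = h_true * pilots + 0.05 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
est = rls_channel_track(pilots, rx)
print("final tracking error:", abs(est[-1] - h_true[-1]))
```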
The problem of multicollinearity is one of the most common problems in regression analysis; it concerns the internal correlation between explanatory variables and appears especially often in economics and applied research. Multicollinearity has a negative effect on the regression model, producing inflated variances and unstable parameter estimates when the ordinary least squares (OLS) method is used. Therefore, other methods were used to estimate the parameters of the negative binomial model, including the ridge regression estimator and the Liu-type estimator. The negative binomial regression model is a nonlinear …
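For orientation, the snippet below shows the familiar closed forms of the ridge and Liu-type estimators in a linear-model setting with a deliberately collinear design; the shrinkage constants and data are illustrative, and the paper applies analogous estimators within the negative binomial regression framework rather than this linear form.

```python
# Illustrative ridge and Liu-type shrinkage estimators for a collinear design.
# The constants k and d and the synthetic data are arbitrary demonstration values.
import numpy as np

def ridge_estimator(X, y, k=1.0):
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

def liu_estimator(X, y, d=0.5):
    p = X.shape[1]
    beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
    return np.linalg.solve(X.T @ X + np.eye(p), X.T @ y + d * beta_ols)

# Collinear design: x2 is almost a copy of x1, so OLS variances blow up
rng = np.random.default_rng(1)
x1 = rng.standard_normal(100)
x2 = x1 + 0.01 * rng.standard_normal(100)
X = np.column_stack([np.ones(100), x1, x2])
y = 1 + 2 * x1 + 3 * x2 + rng.standard_normal(100)
print("ridge:", ridge_estimator(X, y, k=0.5))
print("liu  :", liu_estimator(X, y, d=0.7))
```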
Prostaglandins in inflamed tissues are produced by cyclooxygenase-2 (COX-2), which has long made it an important target for improving anti-inflammatory medications. Adverse effects have been associated with the traditional use of non-steroidal anti-inflammatory drugs (NSAIDs) for the treatment of inflammation, mainly centered around gastrointestinal (GI) complications. The current research involves the creation of a virtual library of innovative molecules with drug-like properties via structure-based drug design. A library that includes five novel derivatives of Diclofenac was designed. Subsequently, molecular docking through the Glide module and determination of the binding free energy implementing the P…
The research aims to identify the effect of a training program based on integrating futuristic thinking skills with classroom interaction patterns on mathematics teachers' ability to equip their students with creative solution skills. The research sample consisted of 31 teachers (15 in the experimental group and 16 in the control group). The researcher developed an academic self-efficacy measure consisting of 39 items; its validity, reliability, coefficient of difficulty, and discriminatory power were estimated. To analyze the findings, the researcher adopted the Mann-Whitney (U) test and the effect size. The findings were as follows: there is a statistically significant difference at the significance level …
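As a brief illustration of the statistics mentioned above, the snippet below runs a Mann-Whitney U test on two independent groups and reports a rank-biserial correlation as an effect size; the scores are synthetic placeholders, not the study's data.

```python
# Hypothetical Mann-Whitney U comparison of experimental vs control scores,
# plus a rank-biserial effect size. Data below are made up for illustration.
from scipy.stats import mannwhitneyu

experimental = [78, 85, 91, 73, 88, 95, 82, 90, 76, 84, 89, 93, 80, 87, 92]
control      = [70, 65, 72, 68, 75, 71, 66, 74, 69, 73, 67, 76, 64, 70, 72, 68]

u, p = mannwhitneyu(experimental, control, alternative="two-sided")
n1, n2 = len(experimental), len(control)
r = 1 - 2 * u / (n1 * n2)   # rank-biserial correlation as effect size
print(f"U = {u:.1f}, p = {p:.4f}, rank-biserial r = {r:.2f}")
```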
This paper aims to improve the voltage profile at all weak buses of the Kurdistan Region power system using the Static Synchronous Compensator (STATCOM). The system was studied with the Power System Simulator for Engineers (PSS\E) software, version 33.0, applying the Newton-Raphson (NR) method. All bus voltages were recorded and compared with the Kurdistan region grid index (0.95 ≤ V ≤ 1.05); the power system was then simulated to find the optimal size and suitable location of the STATCOM for voltage improvement at the weakest buses. The results show that the Soran and New Koya substations are the best placements for adding STATCOMs, with sizes of 20 MVAR and 40 MVAR. After adding the STATCOMs with the sizes of 20 MVAR and 40 MV…
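The bus-screening step described above can be sketched very simply: flag any bus whose per-unit voltage falls outside the 0.95-1.05 index. The bus names and voltages below are illustrative placeholders, not the study's load-flow results.

```python
# Hypothetical screening of load-flow voltages against the grid's voltage index.
V_MIN, V_MAX = 0.95, 1.05

bus_voltages = {          # per-unit magnitudes from a load-flow solution (made up)
    "Soran":    0.93,
    "New Koya": 0.94,
    "Bus A":    1.01,
    "Bus B":    0.98,
}

weak_buses = {name: v for name, v in bus_voltages.items() if not V_MIN <= v <= V_MAX}
print("Weak buses (candidates for STATCOM placement):", weak_buses)
```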
A robust client-side video-bitrate adaptation scheme plays a significant role in maintaining a good video streaming experience. The selected video quality affects how long playback is stalled because of an unfilled buffer. Therefore, to maintain continuous video streaming under bandwidth fluctuation, a video buffer structure based on adapting the video bitrate is considered in this work. Initially, the video buffer structure is formulated as an optimal control-theoretic problem that combines both video bitrate and video buffer feedback signals. While protecting the video buffer occupancy from exceeding its limited operating level can provide continuous video streaming, …
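To illustrate the buffer-feedback idea, the sketch below chooses the next segment's bitrate from a ladder according to how full the playback buffer is. The thresholds and ladder values are illustrative assumptions, not the paper's control-theoretic design.

```python
# Hypothetical buffer-based bitrate selector: low occupancy protects against
# stalls with the lowest rate, high occupancy allows the highest rate.
BITRATE_LADDER = [300, 750, 1500, 3000, 6000]   # kbps

def choose_bitrate(buffer_s, reservoir_s=5.0, cushion_s=25.0):
    """Map buffer occupancy (seconds) linearly onto the bitrate ladder."""
    if buffer_s <= reservoir_s:
        return BITRATE_LADDER[0]                 # protect against stalling
    if buffer_s >= cushion_s:
        return BITRATE_LADDER[-1]                # buffer is comfortably full
    frac = (buffer_s - reservoir_s) / (cushion_s - reservoir_s)
    idx = int(frac * (len(BITRATE_LADDER) - 1))
    return BITRATE_LADDER[idx]

for occupancy in (2, 8, 15, 22, 30):
    print(f"buffer {occupancy:>2} s -> {choose_bitrate(occupancy)} kbps")
```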
In this paper, an algorithm for binary codebook design is used within a vector quantization (VQ) technique to improve the acceptability of the absolute moment block truncation coding (AMBTC) method. VQ is used to compress the bitmap output of the first stage (AMBTC). The binary codebook is generated for many images by randomly choosing code vectors from a set of binary image vectors, and this codebook is then used to compress the bitmaps of all these images. The selection of codebook vectors to replace an image's bitmap blocks is based on the criterion of the average bitmap replacement error (ABPRE). The proposed approach is suitable for reducing bit rates …
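For reference, the sketch below shows standard AMBTC encoding of a single block: the block mean splits the pixels into a binary bitmap (which is what the VQ codebook would then compress) plus two reconstruction means. The block values are illustrative.

```python
# Minimal AMBTC encode/decode for one pixel block; the bitmap is the part that
# the paper's binary VQ codebook compresses. Sample block values are made up.
import numpy as np

def ambtc_encode(block):
    """Return (low_mean, high_mean, bitmap) for a 2-D pixel block."""
    mean = block.mean()
    bitmap = block >= mean                       # 1 bit per pixel
    high = block[bitmap].mean()                  # mean of pixels at/above the block mean
    low = block[~bitmap].mean() if (~bitmap).any() else high
    return low, high, bitmap

def ambtc_decode(low, high, bitmap):
    return np.where(bitmap, high, low)

block = np.array([[121, 114,  56,  47],
                  [ 37, 200, 247, 255],
                  [ 16,   0,  12, 169],
                  [ 43,   5,   7, 251]], dtype=float)
low, high, bitmap = ambtc_encode(block)
print("low/high means:", round(low, 1), round(high, 1))
print("bitmap:\n", bitmap.astype(int))
print("reconstructed:\n", ambtc_decode(low, high, bitmap))
```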
Protecting information sent through insecure internet channels is a significant challenge facing researchers. In this paper, we present a novel method for image data encryption that combines chaotic maps with linear feedback shift registers in two stages. In the first stage, the image is divided into two parts, and the pixel locations of each part are redistributed using a random-number key generated by linear feedback shift registers. The second stage segments the image into the three primary colors red, green, and blue (RGB); the data for each color is then encrypted with one of three keys generated using three-dimensional chaotic maps. Many statistical tests (entropy, peak signal-to-noise ratio, …
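The two-stage idea can be sketched as follows: an LFSR-derived key shuffles pixel positions, then a chaotic-map keystream masks pixel values. The taps, the map (a 1-D logistic map instead of the paper's 3-D chaotic maps), and the seeds below are illustrative assumptions, not the published cipher.

```python
# Simplified two-stage sketch: LFSR-based pixel permutation, then a chaotic
# keystream XOR. All parameters here are hypothetical demonstration values.
import numpy as np

def lfsr_bits(seed, taps, n):
    """Fibonacci LFSR over a 16-bit state; returns n output bits."""
    state = seed & 0xFFFF
    out = []
    for _ in range(n):
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1
        out.append(state & 1)
        state = (state >> 1) | (fb << 15)
    return out

def permute_pixels(flat, seed=0xACE1, taps=(15, 13, 12, 10)):
    """Shuffle pixel positions with an LFSR-seeded pseudo-random permutation."""
    key = int("".join(map(str, lfsr_bits(seed, taps, 32))), 2)
    order = np.random.default_rng(key).permutation(flat.size)
    return flat[order], order

def logistic_keystream(x0, n, r=3.99):
    """Byte keystream from iterating the logistic map x <- r*x*(1-x)."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(int(x * 256) % 256)
    return np.array(xs, dtype=np.uint8)

# Illustrative use on one small 8x8 "color channel"
channel = np.arange(64, dtype=np.uint8)
shuffled, order = permute_pixels(channel)
cipher = shuffled ^ logistic_keystream(0.613, shuffled.size)
print(cipher[:8])
```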