With growing global demand for hydrocarbons and declining conventional reserves, the gas industry is shifting its focus toward unconventional reservoirs. Tight gas reservoirs have typically been deemed uneconomical because of their low permeability, generally understood to be below 0.1 mD, which requires advanced drilling techniques and stimulation to enhance hydrocarbon recovery. However, the first step in determining the economic viability of a reservoir is to estimate how much gas is initially in place. Numerical simulation is regarded across the industry as the most accurate form of gas estimation; however, it is extremely costly and time-consuming. The aim of this study is to provide a framework for a simple analytical method of estimating gas in place. During production, three quantities are usually readily accessible: production rate, production time, and pressure-volume-temperature properties. This paper develops an analytical approach derived from the dynamic material balance, proposing a new iterative methodology for calculating pseudo-time. The model incorporates pseudo-functions that account for pressure-dependent fluid and rock properties. Because the dynamic material balance yields weak results in the linear flow regime, an additional methodology derived from the volumetric tank model is also considered, in which the equivalent drainage area is linked to the total reservoir area. Even with short production histories, this volumetric approach is shown to yield accurate results. The proposed methodology has been validated against previous literature, and additional cases are considered to determine its sensitivity to reservoir parameters. Finally, it is shown that the method works for both fractured and unfractured wells in tight gas reservoirs; however, it is sensitive to the amount of data lying within the pseudo-steady-state flow period.
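As a compact illustration of the tank-model reasoning that underpins such gas-in-place estimates, the sketch below fits the classic p/z-versus-cumulative-production straight line whose x-intercept gives the gas initially in place. The z-factor correlation and the synthetic pressure and production data are assumptions made purely for demonstration; they are not taken from the study, and the full dynamic-material-balance workflow with pseudo-pressure and pseudo-time is considerably more involved.

```python
import numpy as np

def z_factor(p_mpa):
    # Crude, monotone z-factor trend for a dry gas; illustrative only.
    return 0.80 + 0.004 * p_mpa

# Synthetic "measured" average pressures [MPa] and cumulative production [MMsm3].
p_avg = np.array([30.0, 27.5, 25.0, 22.0, 19.0, 16.5])
G_p   = np.array([0.0, 45.0, 90.0, 145.0, 200.0, 245.0])

# Material balance for a volumetric dry-gas tank:
#   p/z = (p_i/z_i) * (1 - Gp/G)  ->  a straight line in Gp with x-intercept G.
p_over_z = p_avg / z_factor(p_avg)
slope, intercept = np.polyfit(G_p, p_over_z, 1)
G_estimate = -intercept / slope          # x-intercept of the fitted line

print(f"Estimated gas initially in place: {G_estimate:.0f} MMsm3")
```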
A three-stage learning algorithm for a deep multilayer perceptron (DMLP) with effective weight initialisation based on a sparse auto-encoder is proposed in this paper, which aims to overcome the difficulties of training deep neural networks with limited training data in a high-dimensional feature space. At the first stage, unsupervised learning with a sparse auto-encoder is adopted to obtain the initial weights of the feature-extraction layers of the DMLP. At the second stage, error back-propagation is used to train the DMLP while the weights of its feature-extraction layers are kept fixed at the values obtained in the first stage. At the third stage, all the weights of the DMLP obtained at the second stage are refined by error back-propagation. Network structures an…
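A minimal sketch of how such a three-stage scheme can be wired up is given below, using PyTorch. The layer sizes, the L1 sparsity penalty, the optimiser settings, and the use of a single pre-trained feature layer (a real DMLP would stack several) are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.randn(256, 100)                  # toy high-dimensional inputs
y = torch.randint(0, 3, (256,))            # toy class labels

# Stage 1: sparse auto-encoder pre-training of the feature-extraction layer.
encoder = nn.Sequential(nn.Linear(100, 32), nn.Sigmoid())
decoder = nn.Linear(32, 100)
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
for _ in range(200):
    h = encoder(X)
    loss = nn.functional.mse_loss(decoder(h), X) + 1e-3 * h.abs().mean()   # L1 sparsity term
    opt.zero_grad(); loss.backward(); opt.step()

# Build the DMLP and copy the pre-trained encoder weights into its feature layer.
dmlp = nn.Sequential(nn.Linear(100, 32), nn.Sigmoid(), nn.Linear(32, 3))
dmlp[0].load_state_dict(encoder[0].state_dict())

# Stage 2: train only the output layer; the pre-trained feature weights stay fixed.
for p in dmlp[0].parameters():
    p.requires_grad = False
opt = torch.optim.Adam([p for p in dmlp.parameters() if p.requires_grad], lr=1e-3)
for _ in range(200):
    loss = nn.functional.cross_entropy(dmlp(X), y)
    opt.zero_grad(); loss.backward(); opt.step()

# Stage 3: unfreeze everything and fine-tune all weights together.
for p in dmlp.parameters():
    p.requires_grad = True
opt = torch.optim.Adam(dmlp.parameters(), lr=1e-4)
for _ in range(100):
    loss = nn.functional.cross_entropy(dmlp(X), y)
    opt.zero_grad(); loss.backward(); opt.step()
```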
In this paper, a high-order extended state observer (HOESO) based sliding mode control (SMC) scheme is proposed for a flexible joint robot (FJR) system in the presence of time-varying external disturbances. A composite controller integrates the merits of both the HOESO and the SMC to enhance the tracking performance of the FJR system under fast, time-varying lumped disturbances. First, the HOESO estimator is constructed from only one measured state to precisely estimate the unknown system states and the lumped disturbance, together with its high-order derivatives, in the FJR system. Second, the SMC scheme is designed based on these accurate estimates to govern the nominal FJR system while compensating for the estimation errors in the states and the lumped disturbance…
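To make the observer-plus-controller structure concrete, the sketch below runs a linear extended state observer and a sliding-mode law on a generic second-order plant (x1' = x2, x2' = u + d) rather than the full flexible-joint model. The plant, the observer bandwidth, the sliding-surface slope, the switching gain, and the smoothed switching term are illustrative assumptions and do not reproduce the paper's HOESO design.

```python
import numpy as np

dt, T = 1e-3, 5.0
w_o, c, k = 40.0, 5.0, 8.0          # observer bandwidth, sliding-surface slope, switching gain

x = np.zeros(2)                     # true plant states: [position, velocity]
xh = np.zeros(3)                    # observer states: [x1_hat, x2_hat, disturbance_hat]

for i in range(int(T / dt)):
    t = i * dt
    r, rd, rdd = np.sin(t), np.cos(t), -np.sin(t)    # reference trajectory and derivatives
    d = 0.5 * np.sin(3.0 * t)                         # time-varying lumped disturbance (unknown to controller)

    # Sliding mode control computed from the observer estimates.
    e1, e2 = xh[0] - r, xh[1] - rd
    s = c * e1 + e2
    u = rdd - xh[2] - c * e2 - k * np.tanh(s / 0.05)  # tanh smooths the switching action

    # Linear extended state observer driven only by the measured output x1.
    e = xh[0] - x[0]
    xh = xh + dt * np.array([xh[1] - 3.0 * w_o * e,
                             xh[2] + u - 3.0 * w_o**2 * e,
                             -w_o**3 * e])

    # True plant dynamics (forward-Euler step).
    x = x + dt * np.array([x[1], u + d])

print(f"final tracking error: {x[0] - np.sin(T):.4f}")
```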
In this study, an efficient compression system is introduced. It is based on a wavelet transform and two types of 3D surface representation: Cubic Bezier Interpolation (CBI) and 1st-order polynomial approximation. Each is applied at a different scale of the image: CBI is applied over wide areas of the image in order to prune the image components that show large-scale variation, while the 1st-order polynomial is applied over small areas of the residue component (i.e., after subtracting the cubic Bezier from the image) in order to prune the locally smooth components and achieve better compression gain. The produced cubic Bezier surface is then subtracted from the image signal to obtain the residue component. Then, …
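The sketch below illustrates the two-scale decomposition on a synthetic block: a bicubic Bezier surface approximates the large-scale trend, and a 1st-order (planar) polynomial is then fitted to the residue by least squares. The block size, the control-point placement, and the synthetic data are assumptions for illustration only, and the wavelet stage of the described system is omitted.

```python
import numpy as np

def bezier_basis(t):
    """Cubic Bernstein basis functions evaluated at parameter values t in [0, 1]."""
    return np.stack([(1 - t)**3, 3*t*(1 - t)**2, 3*t**2*(1 - t), t**3], axis=1)

N = 32
u = np.linspace(0.0, 1.0, N)
Bu = bezier_basis(u)                                    # (N, 4) basis matrix

# Synthetic smooth image block with a little texture on top.
yy, xx = np.meshgrid(u, u, indexing="ij")
img = 100.0 * np.exp(-3.0 * ((xx - 0.4)**2 + (yy - 0.6)**2)) + 2.0 * np.sin(20.0 * xx)

# 4x4 control points taken from a coarse sampling of the block (a simple choice).
idx = np.linspace(0, N - 1, 4).astype(int)
P = img[np.ix_(idx, idx)]

# Bicubic Bezier approximation of the large-scale component, and its residue.
bezier = Bu @ P @ Bu.T
residue = img - bezier

# 1st-order polynomial (a plane a + b*x + c*y) fitted to the residue by least squares.
A = np.column_stack([np.ones(N * N), xx.ravel(), yy.ravel()])
coef, *_ = np.linalg.lstsq(A, residue.ravel(), rcond=None)
plane = (A @ coef).reshape(N, N)

print(f"RMS of block:             {np.sqrt((img**2).mean()):.2f}")
print(f"RMS after Bezier:         {np.sqrt((residue**2).mean()):.2f}")
print(f"RMS after Bezier + plane: {np.sqrt(((residue - plane)**2).mean()):.2f}")
```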
This paper presents a cognition-based path planning and control algorithm design for a nonholonomic wheeled mobile robot based on the Particle Swarm Optimization (PSO) algorithm. The aim of this work is to propose the circular roadmap (CRM) method to plan and generate an optimal path with free navigation, as well as to propose a nonlinear MIMO-PID-MENN controller to track the wheeled mobile robot along the reference path. PSO is used to tune the control parameters of the proposed controller online so as to obtain the best torque actions for the wheeled mobile robot. The numerical simulation results, based on the Matlab package, show that the proposed structure has a precise and highly accurate distance of the generated reference…
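As a rough illustration of how swarm optimisation can tune tracking-controller gains, the sketch below runs a standard global-best PSO over two proportional gains of a simple steering law for a unicycle-type robot following a circular reference; the fitness of each particle is the accumulated tracking error from simulating the closed loop. The kinematic model, the two-gain controller, and all numeric settings are invented for this example and stand in for, rather than reproduce, the paper's CRM planner and MIMO-PID-MENN controller.

```python
import numpy as np

rng = np.random.default_rng(1)

def tracking_cost(gains, T=20.0, dt=0.05):
    """Simulate a unicycle chasing a point on a circle; return the summed distance error."""
    k_v, k_w = gains
    x, y, th, cost = 0.0, -0.5, 0.0, 0.0
    for i in range(int(T / dt)):
        t = i * dt
        xr, yr = np.cos(0.3 * t), np.sin(0.3 * t)          # reference point on a circle
        ex, ey = xr - x, yr - y
        dist = np.hypot(ex, ey)
        heading_ref = np.arctan2(ey, ex)
        e_th = np.arctan2(np.sin(heading_ref - th), np.cos(heading_ref - th))  # wrapped heading error
        v = np.clip(k_v * dist, 0.0, 1.0)                   # forward-speed command
        w = np.clip(k_w * e_th, -2.0, 2.0)                  # turn-rate command
        x += dt * v * np.cos(th)
        y += dt * v * np.sin(th)
        th += dt * w
        cost += dist * dt
    return cost

# Standard global-best PSO over the two controller gains.
n_particles, iters = 15, 30
pos = rng.uniform(0.1, 5.0, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_cost = np.array([tracking_cost(p) for p in pos])
gbest = pbest[np.argmin(pbest_cost)]

for _ in range(iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.1, 5.0)
    cost = np.array([tracking_cost(p) for p in pos])
    improved = cost < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], cost[improved]
    gbest = pbest[np.argmin(pbest_cost)]

print(f"best gains (k_v, k_w): {gbest.round(2)}, tracking cost: {pbest_cost.min():.3f}")
```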
Intrusion detection systems (IDS) are useful tools that help security administrators in the demanding task of securing the network and alerting on any potentially harmful event. An IDS can be classified as either misuse-based or anomaly-based, depending on the detection methodology. A misuse IDS can recognize known attacks based on their signatures; the main disadvantage of these systems is that they cannot detect new attacks. An anomaly IDS, by contrast, depends on a model of normal behaviour; the main advantage of such a system is its ability to discover new attacks, while its main drawback is a high false-alarm rate. Therefore, a hybrid IDS, which combines misuse and anomaly detection, acts as a solution to overcome the disadvantages…
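A toy illustration of this layering is sketched below: a signature (misuse) check runs first, and only traffic that matches no known signature is scored against a statistical model of normal behaviour. The features, signatures, and threshold are invented purely for demonstration and are not drawn from any particular IDS.

```python
import numpy as np

rng = np.random.default_rng(0)

# Known-attack signatures as (protocol, destination port) pairs -- the misuse stage.
SIGNATURES = {("tcp", 4444), ("tcp", 31337), ("udp", 1900)}

# "Normal behaviour" model for the anomaly stage: per-feature mean/std learned from
# benign records (features: packets per second, mean packet size).
normal = rng.normal(loc=[120.0, 560.0], scale=[30.0, 80.0], size=(500, 2))
mu, sigma = normal.mean(axis=0), normal.std(axis=0)
THRESHOLD = 3.5                       # z-score radius beyond which an alert is raised

def classify(record):
    """record = (protocol, dst_port, packets_per_second, mean_packet_size)."""
    proto, port, pps, size = record
    if (proto, port) in SIGNATURES:                 # misuse stage: known attack
        return "alert: known signature"
    z = np.abs((np.array([pps, size]) - mu) / sigma)
    if z.max() > THRESHOLD:                         # anomaly stage: unknown attack
        return "alert: anomalous behaviour"
    return "normal"

print(classify(("tcp", 4444, 100.0, 500.0)))        # caught by the signature stage
print(classify(("tcp", 80, 900.0, 1400.0)))         # caught by the anomaly stage
print(classify(("tcp", 443, 130.0, 600.0)))         # benign traffic passes
```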
The grey system model GM(1,1) is a time-series prediction model and the basis of grey theory. This research presents methods for estimating the parameters of the grey model GM(1,1): the accumulative method (ACC), the exponential method (EXP), the modified exponential method (Mod EXP), and the Particle Swarm Optimization method (PSO). These methods were compared using the mean square error (MSE) and the mean absolute percentage error (MAPE) as comparison criteria, and simulation was used to identify the best of the four methods, which was then applied to real data. This data represents the consumption rate of two types of oils…
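For reference, the sketch below builds the classical GM(1,1) model with ordinary least-squares parameter estimation and reports the MAPE and MSE of its fit; the alternative estimators compared in the paper (ACC, EXP, Mod EXP, PSO) would replace the least-squares step. The data series here is synthetic and used only to make the construction concrete.

```python
import numpy as np

x0 = np.array([12.0, 13.1, 14.5, 15.8, 17.6, 19.3])     # synthetic original series

# 1-AGO (accumulated generating operation) and background values.
x1 = np.cumsum(x0)
z1 = 0.5 * (x1[1:] + x1[:-1])

# Least-squares estimate of the development coefficient a and grey input b.
B = np.column_stack([-z1, np.ones_like(z1)])
a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]

# Time-response function and restored (inverse-AGO) predictions.
k = np.arange(len(x0))
x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
x0_hat = np.concatenate([[x0[0]], np.diff(x1_hat)])

mape = np.mean(np.abs((x0 - x0_hat) / x0)) * 100.0
mse = np.mean((x0 - x0_hat) ** 2)
print(f"a = {a:.4f}, b = {b:.4f}, MAPE = {mape:.2f}%, MSE = {mse:.4f}")
```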
The present study aims to investigate the various request constructions used in Classical Arabic and Modern Arabic by identifying the differences in their usage across these two genres. The study also attempts to trace cases of felicitous and infelicitous requests in the Arabic language. Methodologically, the study employs a web-based corpus tool (Sketch Engine) to analyze two corpora: the first is Classical Arabic, represented by the King Saud University Corpus of Classical Arabic, while the second is the Arabic Web Corpus “arTenTen”, representing Modern Arabic. To do so, the study relies on felicity conditions to qualitatively interpret the quantitative data, i.e., following a mixed-mode method…