Experimental activity coefficients at infinite dilution are particularly useful for calculating the parameters needed in an expression for the excess Gibbs energy. If reliable values of γ∞1 and γ∞2 are available, either from direct experiment or from a correlation, they can be used to evaluate the two adjustable constants in any desired expression for GE, and from these it is possible to predict the composition of the azeotrope and the vapor-liquid equilibrium over the entire composition range. In this study, the MOSCED and SPACE models were used as two different methods to calculate γ∞1 and γ∞2.
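As an illustration of how the two adjustable constants follow from the infinite-dilution values, the sketch below uses the two-parameter van Laar equation (chosen here only as an example; the abstract does not commit to a particular GE expression), setting A12 = ln γ∞1 and A21 = ln γ∞2 and then evaluating γ1 and γ2 over the whole composition range. The γ∞ inputs shown are hypothetical values of the kind MOSCED or SPACE would supply.

```python
import numpy as np

def van_laar_from_infinite_dilution(gamma1_inf, gamma2_inf):
    """The two van Laar constants follow directly from the infinite-dilution limits."""
    A12 = np.log(gamma1_inf)   # ln(gamma1) as x1 -> 0
    A21 = np.log(gamma2_inf)   # ln(gamma2) as x2 -> 0
    return A12, A21

def van_laar_gammas(x1, A12, A21):
    """Activity coefficients over the whole composition range."""
    x2 = 1.0 - x1
    ln_g1 = A12 * (A21 * x2 / (A12 * x1 + A21 * x2)) ** 2
    ln_g2 = A21 * (A12 * x1 / (A12 * x1 + A21 * x2)) ** 2
    return np.exp(ln_g1), np.exp(ln_g2)

# Hypothetical gamma-infinity values, e.g. as predicted by MOSCED or SPACE
A12, A21 = van_laar_from_infinite_dilution(gamma1_inf=6.5, gamma2_inf=4.2)
x1 = np.linspace(0.01, 0.99, 99)
g1, g2 = van_laar_gammas(x1, A12, A21)
```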
Recovering hydrocarbons from depleted reservoirs is crucial during field development, yet drilling operations in such reservoirs face several problems, such as partial and total lost circulation. Continuing production without an active water drive or water injection to support reservoir pressure decreases the pore and fracture pressures; this depletion also alters the stress distribution and changes the mud weight window. This study focused on the redistribution of the vertical stress and the maximum and minimum horizontal stresses in depleted reservoirs due to decreasing pore pressure and, consequently, on the effect on the mud weight window. 1D and 4D robust geomechanical models are
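The abstract does not spell out the stress equations, so the sketch below is only a minimal illustration of the kind of relation such a geomechanical model builds on: the commonly used poroelastic stress-path approximation Δσh ≈ α(1 − 2ν)/(1 − ν)·ΔPp, evaluated with hypothetical Biot coefficient, Poisson's ratio and pressures (none of these numbers come from the study) to show how depletion lowers the fracture-pressure bound and shifts the mud weight window downward.

```python
def depleted_min_horizontal_stress(sh_initial, pp_initial, pp_depleted,
                                   biot=1.0, poisson=0.25):
    """Poroelastic stress-path estimate of the minimum horizontal stress
    after depletion: d(sigma_h) = biot * (1 - 2*nu) / (1 - nu) * d(Pp)."""
    path = biot * (1.0 - 2.0 * poisson) / (1.0 - poisson)
    return sh_initial + path * (pp_depleted - pp_initial)

# Hypothetical values in psi (not from the study), just to show the trend
pp0, pp1 = 5000.0, 3500.0   # pore pressure before / after depletion
sh0 = 7000.0                # initial minimum horizontal stress
sh1 = depleted_min_horizontal_stress(sh0, pp0, pp1)

# The safe mud weight window is roughly bounded below by the pore pressure
# and above by the fracture pressure (approximated here by sigma_h).
print(f"window before depletion: {pp0:.0f} - {sh0:.0f} psi")
print(f"window after depletion:  {pp1:.0f} - {sh1:.0f} psi")
```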
Inferential methods for statistical distributions have attracted a high level of interest in recent years. In real life, however, data can follow more than one distribution, and mixture models must then be fitted to such data. One such model is the finite mixture of Rayleigh distributions, which is widely used for modelling lifetime data in many fields, such as medicine, agriculture and engineering. In this paper, we proposed a new Bayesian framework by assuming conjugate priors for the square of the component parameters. We used this prior distribution in the classical Bayesian, Metropolis-Hastings (MH) and Gibbs sampler methods. The performance of these techniques was assessed on data generated from two- and three-component mixtures.
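Only the Gibbs sampler branch is sketched below, and only as a generic illustration: a two-component Rayleigh mixture with inverse-gamma (conjugate) priors placed on the squared scale parameters, fitted to synthetic data. The hyperparameters, component values and iteration count are hypothetical, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def rayleigh_pdf(x, theta):
    """Rayleigh density parameterised by theta = sigma**2."""
    return (x / theta) * np.exp(-x**2 / (2.0 * theta))

def gibbs_rayleigh_mixture(x, k=2, iters=2000, a0=2.0, b0=1.0, alpha0=1.0):
    """Gibbs sampler for a k-component Rayleigh mixture with
    inverse-gamma (conjugate) priors on the squared scale parameters."""
    theta = np.full(k, np.mean(x**2))          # crude initial values
    w = np.full(k, 1.0 / k)
    theta_draws, w_draws = [], []
    for _ in range(iters):
        # 1) sample component labels given the current parameters
        probs = w * np.array([rayleigh_pdf(x, t) for t in theta]).T   # shape (n, k)
        probs /= probs.sum(axis=1, keepdims=True)
        z = np.array([rng.choice(k, p=p) for p in probs])
        # 2) sample mixing weights from the Dirichlet posterior
        counts = np.bincount(z, minlength=k)
        w = rng.dirichlet(alpha0 + counts)
        # 3) sample each theta_j from its inverse-gamma posterior
        for j in range(k):
            xj = x[z == j]
            a_post = a0 + len(xj)
            b_post = b0 + 0.5 * np.sum(xj**2)
            theta[j] = 1.0 / rng.gamma(shape=a_post, scale=1.0 / b_post)
        theta_draws.append(theta.copy())
        w_draws.append(w.copy())
    return np.array(theta_draws), np.array(w_draws)

# Synthetic two-component data (hypothetical true values, not from the paper)
x = np.concatenate([rng.rayleigh(scale=1.0, size=300),
                    rng.rayleigh(scale=3.0, size=200)])
theta_draws, w_draws = gibbs_rayleigh_mixture(x, k=2)
print(theta_draws[500:].mean(axis=0), w_draws[500:].mean(axis=0))
```

In practice the draws would also need burn-in, convergence checks and label-switching handling before the posteriors are summarised.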
Predicting permeability is a cornerstone of petroleum reservoir engineering, playing a vital role in optimizing hydrocarbon recovery strategies. This paper explores the application of neural networks to predict permeability in oil reservoirs, underscoring their growing importance in addressing traditional prediction challenges. Conventional techniques often struggle with the complexities of subsurface conditions, making innovative approaches essential. Neural networks, with their ability to uncover complicated patterns within large datasets, emerge as a powerful alternative. The Quanti-Elan model was used in this study to combine several well logs for the estimation of mineral volumes, porosity and water saturation. This model goes be
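The abstract does not describe the network architecture, so the sketch below is only a generic illustration of the workflow: a small feed-forward regressor (scikit-learn's MLPRegressor, with hypothetical layer sizes) trained to predict log-permeability from well-log-derived inputs such as gamma ray, porosity and water saturation. The synthetic data and the relation generating it are stand-ins, not the study's dataset.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Synthetic stand-ins for well-log features (the real study derives such
# inputs from the Quanti-Elan interpretation).
n = 500
gr  = rng.uniform(20, 150, n)       # gamma ray, API
phi = rng.uniform(0.05, 0.30, n)    # porosity, fraction
sw  = rng.uniform(0.1, 1.0, n)      # water saturation, fraction
# Hypothetical "true" relation so the example runs end to end
log_k = 3.0 + 12.0 * phi - 1.5 * sw - 0.005 * gr + rng.normal(0, 0.2, n)

X = np.column_stack([gr, phi, sw])
X_tr, X_te, y_tr, y_te = train_test_split(X, log_k, test_size=0.2, random_state=0)

scaler = StandardScaler().fit(X_tr)
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=3000, random_state=0)
model.fit(scaler.transform(X_tr), y_tr)
print("R2 on held-out samples:", model.score(scaler.transform(X_te), y_te))
```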
Urban land uses of all kinds are the constituent elements of the urban spatial structure. Because of the influence of economic and social factors, cities in general are characterized by the dynamic state of their elements over time, and urban functions are arranged in distinct spatial patterns. Hence, urban planners and the relevant urban management teams should understand the future spatial pattern of these changes by resorting to quantitative models in spatial planning. This ensures that future predictions are made with a high level of accuracy so that appropriate strategies can be used to address the problems arising from such changes. The Markov chain method is one of the quantitative models used in spatial planning to analyze such changes.
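As a minimal illustration of the Markov chain idea (the land-use classes and transition probabilities below are hypothetical, not taken from the study), the sketch projects the shares of three land-use classes forward by repeatedly applying a transition matrix estimated between two observation dates.

```python
import numpy as np

# Hypothetical land-use classes and transition probabilities estimated from
# two observation dates (rows: from-class, columns: to-class; rows sum to 1).
classes = ["residential", "commercial", "vacant"]
P = np.array([
    [0.90, 0.07, 0.03],
    [0.05, 0.92, 0.03],
    [0.20, 0.10, 0.70],
])

state = np.array([0.55, 0.20, 0.25])   # current shares of urban land

# Project the land-use distribution forward, one inter-census period per step.
for step in range(1, 4):
    state = state @ P
    print(f"period {step}: " + ", ".join(f"{c}={s:.2f}" for c, s in zip(classes, state)))
```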
Big data of different types, such as texts and images, are rapidly generated from the internet and other applications. Dealing with this data using traditional methods is not practical, since it arrives in various sizes and types and with different processing-speed requirements. Data analytics has therefore become an important tool for big data applications, because it analyzes and extracts only the meaningful information. This paper presents several innovative methods that use data analytics techniques to improve the analysis process and data management. Furthermore, this paper discusses how the revolution of data analytics based on artificial intelligence algorithms might provide
A database is an organized and distributed collection of data that allows users to access the stored data in a simple and convenient way. However, in the era of big data, traditional data-analytics methods may not be able to manage and process such large volumes of data. To develop an efficient way of handling big data, this work studies the use of the MapReduce technique on big data distributed in the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
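A minimal sketch of how such a MapReduce job could look with Hadoop Streaming is shown below; the record layout of the EEG data (one "channel<TAB>value" sample per line) and the script name are assumptions, since the paper does not describe them. The mapper emits channel/value pairs and the reducer averages the values per channel.

```python
#!/usr/bin/env python3
"""Hadoop Streaming sketch: average EEG sample value per channel.
Run the same file as mapper ("python3 eeg_avg.py map") and reducer
("python3 eeg_avg.py reduce"), e.g.:
  hadoop jar hadoop-streaming.jar \
    -input /eeg/raw -output /eeg/avg \
    -mapper "python3 eeg_avg.py map" -reducer "python3 eeg_avg.py reduce" \
    -file eeg_avg.py
"""
import sys

def mapper():
    # Emit "channel \t value" pairs; Hadoop sorts them by channel between stages.
    for line in sys.stdin:
        parts = line.strip().split("\t")
        if len(parts) == 2:
            channel, value = parts
            print(f"{channel}\t{value}")

def reducer():
    # Lines arrive grouped by channel; keep a running sum and count per channel.
    current, total, count = None, 0.0, 0
    for line in sys.stdin:
        channel, value = line.strip().split("\t")
        if channel != current:
            if current is not None:
                print(f"{current}\t{total / count:.4f}")
            current, total, count = channel, 0.0, 0
        total += float(value)
        count += 1
    if current is not None:
        print(f"{current}\t{total / count:.4f}")

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()
```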
This study sought to investigate the impacts of big data, artificial intelligence (AI), and business intelligence (BI) on firms' e-learning and business performance in the Jordanian telecommunications industry. After the samples were checked, a total of 269 usable responses were collected. All of the information gathered throughout the investigation was analyzed using PLS software. The results show that a network of interconnections can improve both e-learning and corporate effectiveness. This research concluded that the integration of big data, AI, and BI has a positive impact on e-learning infrastructure development and organizational efficiency. The findings indicate that big data has a positive and direct impact on business performance, including Big