Predicting permeability is a cornerstone of petroleum reservoir engineering and plays a vital role in optimizing hydrocarbon recovery strategies. This paper explores the application of neural networks to predict permeability in oil reservoirs, underscoring their growing importance in addressing the shortcomings of traditional prediction methods. Conventional techniques often struggle with the complexities of subsurface conditions, making innovative approaches essential. Neural networks, with their ability to uncover complex patterns within large datasets, emerge as a powerful alternative. The Quanti-Elan model was used in this study to combine several well logs and estimate mineral volumes, porosity, and water saturation. This model goes beyond simply predicting lithology to provide a detailed quantification of primary minerals (e.g., calcite and dolomite) as well as secondary ones (e.g., shale and anhydrite). The results show a marked lithological contrast, with high-porosity layers corresponding to possible reservoir zones. The richness of the Quanti-Elan interpretation goes beyond what log analysis alone can reveal. The methodology is described in depth, covering the approaches used to train the neural networks (e.g., data preprocessing and network architecture). A case study is presented in which neural network predictions of permeability for a particular oil well are compared with core measurements. The results indicate close agreement between predicted and actual values, further emphasizing the power of this approach. An extrapolated neural network model using lithology (dolomite and limestone) and porosity as inputs further emphasizes the close match between predicted and observed carbonate reservoir permeability. This case study demonstrates the ability of neural networks to accurately characterize and predict permeability in complex carbonate systems. The results therefore confirm that neural networks are a reliable and transformative tool for oil reservoir management, one that can make future predictive methodologies, and hence hydrocarbon recovery operations, more efficient.
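To make the modelling step concrete, the following is a minimal sketch (not the authors' exact model) of a feed-forward neural network that maps porosity and lithology volume fractions (dolomite and limestone) to log-permeability; the file name, column layout, network size, and the log10 target transform are illustrative assumptions.

```python
# Hedged sketch of a permeability-prediction network; inputs and hyperparameters are illustrative.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor

data = np.loadtxt("well_logs.csv", delimiter=",", skiprows=1)   # hypothetical file
X = data[:, :3]            # [porosity, V_dolomite, V_limestone] (fractions)
y = np.log10(data[:, 3])   # core permeability (mD); log10 because k spans decades

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 8), activation="relu",
                 max_iter=5000, random_state=0),
)
model.fit(X_train, y_train)

k_pred = 10 ** model.predict(X_test)     # back-transform predictions to mD
print("R^2 on log10(k):", model.score(X_test, y_test))
print("first predicted permeabilities (mD):", k_pred[:5])
```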
Modern civilization increasingly relies on sustainable and eco-friendly data centers as the core hubs of intelligent computing. However, these data centers, while vital, also face heightened vulnerability to hacking due to their role as the convergence points of numerous network connection nodes. Recognizing and addressing this vulnerability, particularly within the confines of green data centers, is a pressing concern. This paper proposes a novel approach to mitigate this threat by leveraging swarm intelligence techniques to detect prospective and hidden compromised devices within the data center environment. The core objective is to ensure sustainable intelligent computing through a colony strategy. The research primarily focuses on the …
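As a purely illustrative sketch (the paper's actual colony strategy is not reproduced here), the toy loop below shows how an ant-colony-style scheme can concentrate inspection effort on suspicious devices: pheromone accumulates on devices whose probes look anomalous and evaporates elsewhere. The probe() anomaly test is a hypothetical stand-in.

```python
# Toy ant-colony-style device scoring; not the paper's method.
import random

def probe(device):
    # Hypothetical stand-in for an anomaly test (e.g., from traffic statistics);
    # returns a suspicion score in [0, 1].  Random here for demonstration only.
    return random.random()

devices = [f"node-{i}" for i in range(20)]
pheromone = {d: 1.0 for d in devices}
rho, n_ants, n_iters = 0.1, 5, 50        # evaporation rate, colony size, iterations

for _ in range(n_iters):
    for _ in range(n_ants):
        # Each ant inspects a device chosen with probability proportional to pheromone
        d = random.choices(devices, weights=[pheromone[x] for x in devices])[0]
        pheromone[d] += probe(d)         # reinforce devices that look suspicious
    for d in devices:                    # evaporation keeps old evidence from saturating
        pheromone[d] *= (1 - rho)

suspects = sorted(devices, key=pheromone.get, reverse=True)[:3]
print("Most suspicious devices:", suspects)
```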
Due to advancements in computer science and technology, impersonation has become more common. Today, biometrics technology is widely used in various aspects of people's lives. Iris recognition, known for its high accuracy and speed, is a significant and challenging field of study. As a result, iris recognition technology and biometric systems are used for security in numerous applications, including human-computer interaction and surveillance systems. It is crucial to develop advanced models to combat impersonation crimes. This study proposes sophisticated artificial intelligence models with high accuracy and speed to eliminate these crimes. The models use linear discriminant analysis (LDA) for feature extraction and mutual information …
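As an illustration of such a pipeline, the sketch below combines mutual-information feature selection with an LDA projection and a simple matcher using scikit-learn; the feature files, the value of k, and the nearest-neighbour matcher are assumptions, not the paper's exact configuration.

```python
# Hedged sketch: mutual-information selection -> LDA projection -> 1-NN matcher.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

X = np.load("iris_features.npy")   # (n_samples, n_features) -- hypothetical file
y = np.load("iris_labels.npy")     # subject identities

pipe = make_pipeline(
    SelectKBest(mutual_info_classif, k=100),   # keep the most informative features
    LinearDiscriminantAnalysis(),              # project onto discriminant axes
    KNeighborsClassifier(n_neighbors=1),       # simple matcher in the LDA space
)
print("CV accuracy:", cross_val_score(pipe, X, y, cv=5).mean())
```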
Prediction of material characteristics is currently one of the topical application areas of machine learning methods. The aim of this work is to develop machine learning models for determining the rheological properties of polymers from experimental stress relaxation curves. The paper presents an overview of the main directions of metaheuristic approaches (local search, evolutionary algorithms) to solving combinatorial optimization problems. Metaheuristic algorithms for solving some important combinatorial optimization problems are described, with special emphasis on the construction of decision trees. A comparative analysis of algorithms for solving the regression problem with the CatBoost Regressor has been carried out. The object of …
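A minimal sketch of the regression setup is given below, assuming each sample is a discretised stress relaxation curve and the target is a single rheological parameter; the file names, feature layout, and CatBoost hyperparameters are illustrative.

```python
# Hedged sketch of CatBoost regression on stress relaxation curves.
import numpy as np
from catboost import CatBoostRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

X = np.load("relaxation_curves.npy")   # (n_samples, n_time_points) -- hypothetical file
y = np.load("relaxation_times.npy")    # target rheological parameter (assumed)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

model = CatBoostRegressor(iterations=500, depth=6, learning_rate=0.05,
                          loss_function="RMSE", verbose=False)
model.fit(X_tr, y_tr, eval_set=(X_te, y_te))
print("R^2:", r2_score(y_te, model.predict(X_te)))
```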
The Sonic Scanner is a multifunctional instrument designed to log wells, assess elastic characteristics, and support reservoir characterisation. Furthermore, it facilitates comprehension of rock mechanics, gas detection, and well positioning, while also furnishing data for geomechanical computations and sand management. The present work applied the Sonic Scanner to both basic and advanced processing in an oil well penetrating carbonate media. The study aimed to characterize the compressional, shear, and Stoneley slownesses, the rock mechanical properties, and the shear anisotropy of the formation. Except for intervals where significant washouts are encountered, the data quality of the Monopole, Dipole, and Stoneley modes is generally …
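For context, the standard dynamic-elastic-property relations used in such geomechanical computations derive the shear modulus, Poisson's ratio, and Young's modulus from compressional and shear slownesses plus bulk density; the sketch below uses illustrative input values, not data from the logged well.

```python
# Standard dynamic moduli from sonic slownesses and bulk density (illustrative values).
def dynamic_moduli(dtc_us_ft, dts_us_ft, rhob_g_cc):
    vp = 304800.0 / dtc_us_ft          # compressional velocity, m/s (1 ft = 0.3048 m)
    vs = 304800.0 / dts_us_ft          # shear velocity, m/s
    rho = rhob_g_cc * 1000.0           # bulk density, kg/m^3
    g = rho * vs**2                                     # shear modulus, Pa
    nu = (vp**2 - 2 * vs**2) / (2 * (vp**2 - vs**2))    # Poisson's ratio
    e = 2 * g * (1 + nu)                                # Young's modulus, Pa
    return e / 1e9, g / 1e9, nu        # moduli in GPa

E, G, nu = dynamic_moduli(dtc_us_ft=55.0, dts_us_ft=100.0, rhob_g_cc=2.65)
print(f"E = {E:.1f} GPa, G = {G:.1f} GPa, nu = {nu:.2f}")
```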
In this research, several estimators of the hazard function are introduced. These estimators are based on a nonparametric method, namely the kernel method for censored data, with varying bandwidths and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four types of boundary kernel are used, namely Rectangle, Epanechnikov, Biquadratic, and Triquadratic, and the proposed function was employed with all kernel functions. Two different simulation techniques are also used in two experiments to compare these estimators. In most cases, the results show that the local bandwidth is the best for all the …
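A minimal sketch of the basic construction (global bandwidth, no boundary correction) is a kernel-smoothed Nelson-Aalen hazard estimate with an Epanechnikov kernel; the simulated exponential lifetimes and the bandwidth value are illustrative assumptions, not the estimators proposed in the paper.

```python
# Kernel-smoothed hazard estimator for right-censored data (global bandwidth sketch).
import numpy as np

def epanechnikov(u):
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u**2), 0.0)

def hazard_estimate(t_grid, times, events, b):
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    at_risk = n - np.arange(n)              # number still at risk just before each time
    increments = events / at_risk           # Nelson-Aalen jump sizes dN_i / Y_i
    # h_hat(t) = (1/b) * sum_i K((t - t_i)/b) * dN_i / Y_i
    u = (t_grid[:, None] - times[None, :]) / b
    return (epanechnikov(u) * increments).sum(axis=1) / b

rng = np.random.default_rng(0)
lifetimes = rng.exponential(10.0, 200)          # true constant hazard = 0.1
censor = rng.exponential(15.0, 200)
times = np.minimum(lifetimes, censor)
events = (lifetimes <= censor).astype(float)    # 1 = observed failure, 0 = censored
grid = np.linspace(1.0, 20.0, 50)
print(hazard_estimate(grid, times, events, b=2.0)[:5])
```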
This study focused on spectral clustering (SC) and three-constraint affinity matrix spectral clustering (3CAM-SC) to simultaneously determine the number of clusters and the cluster membership of the COST 2100 channel model (C2CM) multipath dataset. Many multipath clustering approaches solve only for the number of clusters without taking cluster membership into consideration. The problem with giving only the number of clusters is that there is no assurance that the membership of the multipath clusters is accurate, even when the number of clusters is correct. SC and 3CAM-SC aim to solve this problem by also determining the membership of the clusters. The cluster and the cluster count were then computed through the cluster-wise J…
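As a small illustration of the SC step (not the paper's three-constraint affinity), the sketch below builds a Gaussian affinity over synthetic multipath parameters (delay, azimuth, elevation) and feeds it to scikit-learn's SpectralClustering with a precomputed affinity matrix.

```python
# Hedged sketch: spectral clustering of synthetic multipath components.
import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.metrics import pairwise_distances

rng = np.random.default_rng(1)
# Three synthetic multipath clusters in (delay, azimuth, elevation) space
paths = np.vstack([rng.normal(c, 0.1, size=(50, 3))
                   for c in ([0, 0, 0], [1, 2, 0], [2, 0, 1])])

# Gaussian (RBF) affinity built from pairwise distances between multipath components
d = pairwise_distances(paths)
affinity = np.exp(-(d**2) / (2 * d.std() ** 2))

labels = SpectralClustering(n_clusters=3, affinity="precomputed",
                            random_state=0).fit_predict(affinity)
print(np.bincount(labels))    # cluster membership counts
```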
One of the costliest problems facing hydrocarbon production in unconsolidated sandstone reservoirs is the production of sand once hydrocarbon production starts. A sanding-onset prediction model is very important for deciding on future sand control, including whether and when sand control should be used. This research developed an easy-to-use computer program to determine where sanding begins in the drainage area. The model is based on estimating the critical pressure drop at which sand production begins. The outcomes are drawn as a function of free sand production, together with the critical flow rates, for declining reservoir pressure. The results show that the pressure drawdown required to …
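For orientation only, the sketch below encodes one common simplified sanding-onset criterion (Kirsch tangential stress at the wellbore wall versus effective formation strength) to obtain a critical bottomhole pressure and hence a critical drawdown; it is not the program developed in this work, and the stress and strength values are hypothetical examples in MPa.

```python
# Simplified sanding-onset criterion (illustrative only, not the paper's program).
def critical_drawdown(sigma_h_max, sigma_h_min, p_res, ucs):
    # Kirsch tangential stress at the wellbore wall for bottomhole pressure p_wf:
    #   sigma_theta = 3*sigma_Hmax - sigma_hmin - p_wf
    # Simplified onset criterion: sigma_theta - p_res >= UCS.  Solving for p_wf
    # gives the critical bottomhole pressure below which sand is predicted.
    p_wf_crit = 3 * sigma_h_max - sigma_h_min - p_res - ucs
    return p_res - p_wf_crit     # allowable drawdown before sanding onset

cdp = critical_drawdown(sigma_h_max=40.0, sigma_h_min=35.0, p_res=25.0, ucs=50.0)
print(f"Critical drawdown ~ {cdp:.1f} MPa")   # -> 15.0 MPa with these example inputs
```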
Underwater Wireless Sensor Networks (UWSNs) have emerged as a promising technology for a wide range of ocean monitoring applications. UWSNs suffer from the unique challenges of the underwater environment, such as a dynamic and sparse network topology, which can easily lead to a partitioned network. This results in hotspot formation and the absence of a routing path from the source to the destination. Therefore, to optimize the network lifetime and limit the possibility of hotspot formation along the data transmission path, a traffic-aware protocol is needed. In this research, we propose a traffic-aware routing protocol called PG-RES, which is predicated on the concepts of Pressure Gradient and RESistance. The proposed …
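The exact PG-RES metric is not reproduced here, but the sketch below illustrates the general idea of a pressure-gradient-plus-resistance forwarding rule: favour neighbours that make depth progress toward the surface sink while penalising a resistance term built from queue load and residual energy; the field names and weights are assumptions.

```python
# Illustrative pressure-gradient / resistance next-hop selection (not the PG-RES spec).
from dataclasses import dataclass

@dataclass
class Neighbor:
    node_id: int
    depth: float        # metres below surface (smaller = closer to the sink)
    queue_load: float   # buffered traffic, normalised to [0, 1]
    energy: float       # residual energy, normalised to (0, 1]

def select_next_hop(my_depth, neighbors, alpha=1.0, beta=1.0):
    best, best_score = None, float("-inf")
    for nb in neighbors:
        gradient = my_depth - nb.depth           # > 0 means progress toward the sink
        if gradient <= 0:
            continue                             # ignore deeper neighbours
        resistance = nb.queue_load / nb.energy   # congested or depleted nodes resist more
        score = alpha * gradient - beta * resistance
        if score > best_score:
            best, best_score = nb, score
    return best                                  # None -> no forwarder (void region)

nbrs = [Neighbor(1, 420.0, 0.7, 0.5), Neighbor(2, 450.0, 0.1, 0.9), Neighbor(3, 510.0, 0.0, 1.0)]
print(select_next_hop(my_depth=500.0, neighbors=nbrs))
```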
In this paper we present the theoretical foundation of forward error analysis of numerical algorithms under (i) approximations in "built-in" functions, (ii) rounding errors in floating-point arithmetic operations, and (iii) perturbations of data. The error analysis is based on a linearization method. The fundamental tools of the forward error analysis are systems of linear absolute and relative a priori and a posteriori error equations and the associated condition numbers, which constitute optimal bounds on the possible cumulative round-off errors. The condition numbers enable simple, general, and quantitative definitions of numerical stability. The theoretical results have been applied to Gaussian elimination and have proved to be a very effective means of both a priori …
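As a small numerical illustration (not the paper's derivation), the snippet below evaluates the classical a posteriori relative error bound for a linear solve by Gaussian elimination, using the residual and the condition number.

```python
# A posteriori bound:  ||x_hat - x|| / ||x_hat||  <=  kappa(A) * ||r|| / (||A|| * ||x_hat||),
# where r = b - A @ x_hat and kappa(A) = ||A|| * ||A^-1||.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50))
x_true = rng.standard_normal(50)
b = A @ x_true

x_hat = np.linalg.solve(A, b)      # LU (Gaussian elimination) based solve
r = b - A @ x_hat                  # residual of the computed solution
kappa = np.linalg.cond(A)          # 2-norm condition number

bound = kappa * np.linalg.norm(r) / (np.linalg.norm(A, 2) * np.linalg.norm(x_hat))
actual = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_hat)
print(f"kappa(A) = {kappa:.2e}, bound = {bound:.2e}, actual error = {actual:.2e}")
```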