Predicting permeability is a cornerstone of petroleum reservoir engineering, playing a vital role in optimizing hydrocarbon recovery strategies. This paper explores the application of neural networks to predict permeability in oil reservoirs, underscoring their growing importance in addressing the shortcomings of traditional prediction methods. Conventional techniques often struggle with the complexities of subsurface conditions, making innovative approaches essential, and neural networks, with their ability to uncover complicated patterns within large datasets, emerge as a powerful alternative. The Quanti-Elan model was used in this study to combine several well logs for estimating mineral volumes, porosity, and water saturation. This model goes beyond simply predicting lithology to provide a detailed quantification of primary minerals (e.g., calcite and dolomite) as well as secondary ones (e.g., shale and anhydrite). The results show strong lithological contrast, with the high-porosity layers correlating to possible reservoir zones; the richness of Quanti-Elan's interpretations goes beyond what log analysis alone can reveal. The methodology is described in depth, covering the approaches used to train the neural networks (e.g., data processing and network architecture). In a case study, neural network predictions of permeability in a particular oil well are compared with core measurements, and the results show close agreement between predicted and actual values, further emphasizing the power of this approach. A neural network model using lithology (dolomite and limestone) and porosity as inputs likewise shows a close match between predicted and observed carbonate reservoir permeability. This case study demonstrates the ability of neural networks to accurately characterize and predict permeability in complex carbonate systems. The results therefore confirm that neural networks are a reliable and transformative tool for oil reservoir management, one that can make future predictive methodologies, and hence hydrocarbon recovery operations, more efficient.
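The abstract does not include the network itself; the following minimal sketch, using scikit-learn's MLPRegressor, illustrates the kind of model described, with porosity and lithology volume fractions (dolomite, limestone) as input features. All data, feature names, and hyperparameters here are illustrative assumptions, not the authors' setup.

```python
# Illustrative sketch only: a small feed-forward network mapping porosity and
# lithology fractions to permeability, in the spirit of the model described.
# Synthetic data; a real model would be trained against core measurements.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 500
porosity = rng.uniform(0.02, 0.30, n)       # fraction
v_dolomite = rng.uniform(0.0, 1.0, n)       # volume fraction
v_limestone = 1.0 - v_dolomite
# Synthetic stand-in for core log-permeability; predicting log(k) is more stable.
log_k = 8.0 * porosity + 0.5 * v_dolomite + rng.normal(0, 0.1, n)

X = np.column_stack([porosity, v_dolomite, v_limestone])
X_tr, X_te, y_tr, y_te = train_test_split(X, log_k, test_size=0.2, random_state=0)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=5000, random_state=0),
)
model.fit(X_tr, y_tr)
print("R^2 on held-out data:", model.score(X_te, y_te))
```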
Coagulation is the most important process in drinking water treatment. The alum coagulant increases aluminum residuals, which many studies have linked to Alzheimer's disease, so it is important to apply it at the optimal dose. In this paper, four sets of experiments were carried out to determine the relationship between raw water characteristics (turbidity, pH, alkalinity, and temperature) and the optimum dose of alum [Al2(SO4)3·14H2O], in order to form a mathematical equation that could replace the need for jar test experiments. The experiments were performed under different conditions and different seasonal circumstances. The optimal dose in every set was determined and used to build a gene expression programming (GEP) model. The models were co…
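The evolved GEP equation itself is not shown in the abstract; as a rough stand-in, the sketch below fits a simple polynomial regression (swapped in for GEP, which has no standard scikit-learn implementation) from the four raw-water characteristics to the optimal alum dose, just to illustrate the input/output structure of such a model. All data and coefficients are synthetic.

```python
# Stand-in sketch: polynomial regression in place of GEP, mapping raw-water
# characteristics (turbidity, pH, alkalinity, temperature) to optimal alum dose.
# Synthetic data; a real model would be fitted to jar-test results.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(1)
n = 120
turbidity = rng.uniform(5, 500, n)     # NTU
ph = rng.uniform(6.5, 8.5, n)
alkalinity = rng.uniform(50, 250, n)   # mg/L as CaCO3
temperature = rng.uniform(5, 35, n)    # deg C
# Hypothetical dose response used only to generate example data (mg/L alum).
dose = 5 + 0.08 * turbidity - 2.0 * (ph - 7.0) + 0.02 * alkalinity + rng.normal(0, 2, n)

X = np.column_stack([turbidity, ph, alkalinity, temperature])
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, dose)

# Predict the dose for one new raw-water sample instead of running a jar test.
print(model.predict([[150.0, 7.4, 120.0, 22.0]]))
```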
The precise classification of DNA sequences is pivotal in genomics, holding significant implications for personalized medicine. The stakes are particularly high when classifying key genetic markers such as BRCA, related to breast cancer susceptibility; BRAF, associated with various malignancies; and KRAS, a recognized oncogene. Conventional machine learning techniques often necessitate intricate feature engineering and may not capture the full spectrum of sequence dependencies. To ameliorate these limitations, this study employs an adapted U-Net architecture, originally designed for biomedical image segmentation, to classify DNA sequences. An attention mechanism was also tested along with the U-Net architecture to classify DNA sequences more precisely.
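A minimal PyTorch sketch of the idea follows: a U-Net-style 1D encoder-decoder with a skip connection and a simple attention-weighted pooling head, classifying one-hot encoded DNA sequences (A, C, G, T as 4 channels). The depth, channel widths, and three-way class head are assumptions for illustration, not the paper's exact architecture.

```python
# Sketch of a U-Net-style 1D classifier for one-hot encoded DNA sequences.
# Sizes and the attention pooling are illustrative, not the paper's model.
import torch
import torch.nn as nn

class UNet1DClassifier(nn.Module):
    def __init__(self, n_classes=3):          # e.g. BRCA / BRAF / KRAS
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv1d(4, 16, 3, padding=1), nn.ReLU())
        self.pool = nn.MaxPool1d(2)
        self.enc2 = nn.Sequential(nn.Conv1d(16, 32, 3, padding=1), nn.ReLU())
        self.up = nn.ConvTranspose1d(32, 16, 2, stride=2)
        # Decoder sees upsampled features concatenated with the skip connection.
        self.dec1 = nn.Sequential(nn.Conv1d(32, 16, 3, padding=1), nn.ReLU())
        # Simple attention: per-position weights used to pool over the sequence.
        self.attn = nn.Conv1d(16, 1, 1)
        self.head = nn.Linear(16, n_classes)

    def forward(self, x):                  # x: (batch, 4, seq_len), seq_len even
        e1 = self.enc1(x)                  # (B, 16, L)
        e2 = self.enc2(self.pool(e1))      # (B, 32, L/2)
        d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))  # skip connection
        w = torch.softmax(self.attn(d1), dim=-1)             # attention weights
        pooled = (d1 * w).sum(dim=-1)      # attention-weighted pooling -> (B, 16)
        return self.head(pooled)           # class logits

model = UNet1DClassifier()
logits = model(torch.randn(2, 4, 128))     # two dummy encoded sequences
print(logits.shape)                        # torch.Size([2, 3])
```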
This study aims to clarify the concept of web-based information systems, an important topic that is often neglected by our organizations. It also designs a web-based information system to manage customer data at Al-Rasheed Bank, as a unified information system specialized for customers' banking transactions with the bank, and provides a suggested model for applying a virtual private network as a tool to protect the data transmitted through the web-based information system.
This study is considered important because it deals with one of today's vital topics, namely how to make it possible to use a distributed informat…
This study presents a theoretical and experimental treatment of a high-loading stumbling condition for a hip prosthesis, using the Charnley model. The model was analyzed with the finite element method in ANSYS, examining the effect of changing the design parameters (head diameter, neck length, neck ratio, and stem length) of the Charnley design for the stumbling case, an impact load that reaches 8.7 times body weight over an impact duration of 0.005 s. An experimental rig was constructed to test the hip model, consisting of a wooden box with a smooth sliding shaft from which a load of 1 pound is dropped from three heights.
The strain produced by this impact is measured using a rosette strain gauge connected to a Wheatstone bridge…
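For context on how rosette readings translate into strains, the sketch below applies the standard data-reduction formulas for a 0°/45°/90° rectangular rosette and a quarter Wheatstone bridge. The gauge factor, excitation voltage, and bridge outputs are made-up example values, not the study's measurements.

```python
# Sketch: standard data reduction for a rectangular (0/45/90 degree) strain
# rosette read through a quarter Wheatstone bridge. Example numbers are hypothetical.
import math

def quarter_bridge_strain(v_out, v_ex, gauge_factor=2.0):
    """Convert bridge output voltage to strain: eps = 4*(Vout/Vex)/GF."""
    return 4.0 * (v_out / v_ex) / gauge_factor

def rosette_principal_strains(e_a, e_b, e_c):
    """Principal strains for a 0/45/90 degree rectangular rosette."""
    center = (e_a + e_c) / 2.0
    radius = math.sqrt((e_a - e_b) ** 2 + (e_b - e_c) ** 2) / math.sqrt(2.0)
    return center + radius, center - radius

# Hypothetical bridge outputs (volts) for the three rosette grids, 5 V excitation.
eps = [quarter_bridge_strain(v, 5.0) for v in (2.1e-3, 1.4e-3, 0.3e-3)]
e1, e2 = rosette_principal_strains(*eps)
print(f"principal strains: {e1:.3e}, {e2:.3e}")
```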
Portable devices such as smartphones, tablet PCs, and PDAs are a useful combination of hardware and software aimed at mobile workers. While they offer the ability to review documents, communicate via electronic mail, and manage appointments and meetings, they usually lack a variety of essential security features. To address the security concerns around sensitive data, many individuals and organizations, aware of the associated threats, mitigate them by improving user authentication, encrypting content, protecting against malware, and deploying firewalls and intrusion prevention. However, no standards have yet been developed to determine whether such mobile data management systems adequately provide the fu…
In this study, we compare the LASSO and SCAD methods, two penalty methods for dealing with partial quantile regression models. The Nadaraya-Watson kernel estimator was used to estimate the non-parametric part, and the rule-of-thumb method was used to estimate the smoothing bandwidth (h). The penalty methods proved efficient in estimating the regression coefficients, and after the missing data were estimated using mean imputation, the SCAD method was the best according to the mean squared error (MSE) criterion.
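The LASSO side of this comparison is straightforward to sketch with scikit-learn: mean imputation for missing values, an L1-penalized fit, and MSE evaluation, as below. SCAD has no scikit-learn implementation, so it is omitted; the data, penalty strength, and sparse true model are synthetic illustrations.

```python
# Sketch of the LASSO workflow described above: mean imputation for missing
# values, an L1-penalized fit, and evaluation by mean squared error (MSE).
# SCAD is not implemented in scikit-learn and is omitted here.
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.linear_model import Lasso
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 8))
y = X[:, 0] - 2.0 * X[:, 3] + rng.normal(0, 0.5, 200)   # sparse true model
X[rng.random(X.shape) < 0.1] = np.nan                   # knock out 10% of values

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(SimpleImputer(strategy="mean"), Lasso(alpha=0.1))
model.fit(X_tr, y_tr)
print("MSE:", mean_squared_error(y_te, model.predict(X_te)))
```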
Energy and memory limitations are considerable constraints of sensor nodes in wireless sensor networks (WSNs). The limited energy supplied to network nodes causes WSNs to face crucial functional limitations; the problem of limited energy on sensor nodes can therefore only be addressed by using that energy efficiently. In this research work, an energy-balancing routing scheme for in-network data aggregation is presented, referred to as the Energy-aware and load-Balancing Routing scheme for Data Aggregation (EBR-DA). EBR-DA aims to provide energy-efficient multi-hop routing to the destination on the basis of the quality of the links between the source and destination. In…
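The abstract does not give the EBR-DA cost function; the sketch below shows one plausible shape of an energy-aware, link-quality-based next-hop choice. The scoring formula (residual energy times link quality, biased toward the sink) is an assumption for illustration, not the EBR-DA metric itself.

```python
# Illustrative next-hop selection in the spirit of energy-aware, load-balancing
# routing: prefer neighbors with high residual energy and good link quality.
from dataclasses import dataclass

@dataclass
class Neighbor:
    node_id: int
    residual_energy: float   # joules remaining
    link_quality: float      # e.g. packet delivery ratio in [0, 1]
    hops_to_sink: int

def choose_next_hop(neighbors):
    """Pick the neighbor maximizing energy * link quality, biased toward the sink."""
    return max(
        neighbors,
        key=lambda n: (n.residual_energy * n.link_quality) / (1 + n.hops_to_sink),
    )

table = [
    Neighbor(1, residual_energy=0.8, link_quality=0.90, hops_to_sink=3),
    Neighbor(2, residual_energy=0.5, link_quality=0.95, hops_to_sink=2),
    Neighbor(3, residual_energy=0.9, link_quality=0.40, hops_to_sink=2),
]
print("forward via node", choose_next_hop(table).node_id)
```

Routing this way spreads traffic away from nearly depleted nodes, which is the load-balancing behavior the scheme's name describes.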
The dependable and efficient identification of Qin seal script characters is pivotal in the discovery, preservation, and inheritance of the distinctive cultural values embodied by these artifacts. This paper presents a character recognition model based on histogram of oriented gradients (HOG) image features and an SVM model for identifying partial and blurred Qin seal script characters; the model achieves accurate recognition on a small, imbalanced dataset. First, a dataset of Qin seal script image samples is established, and Gaussian filtering is employed to remove image noise. Subsequently, a gamma transformation adjusts the image brightness and enhances the contrast between font structures and image backgrounds. After a s…
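The described pipeline maps directly onto scikit-image and scikit-learn, as in the sketch below. The parameter values (Gaussian sigma, gamma, HOG cell sizes, SVM kernel) and the dummy images are illustrative assumptions, not the paper's tuned settings; class_weight="balanced" is one common way to handle the imbalanced dataset the abstract mentions.

```python
# Sketch of the preprocessing + HOG + SVM pipeline described above.
import numpy as np
from skimage.exposure import adjust_gamma
from skimage.feature import hog
from skimage.filters import gaussian
from sklearn.svm import SVC

def extract_features(img):
    """Denoise, adjust brightness, then compute HOG descriptors for one grayscale image."""
    img = gaussian(img, sigma=1.0)            # Gaussian filtering removes noise
    img = adjust_gamma(img, gamma=0.8)        # gamma transform boosts contrast
    return hog(img, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

# Dummy stand-ins for the Qin seal script samples (64x64 grayscale).
rng = np.random.default_rng(3)
images = rng.random((40, 64, 64))
labels = rng.integers(0, 4, 40)               # four example character classes

X = np.array([extract_features(im) for im in images])
clf = SVC(kernel="rbf", class_weight="balanced")   # compensate for skewed classes
clf.fit(X, labels)
print(clf.predict(X[:5]))
```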
Power plants built to generate electricity have become common worldwide; one such plant is the steam power plant. In such plants, the various moving parts of heavy machines generate a great deal of noise, and operators are exposed to high noise levels. Exposure to high noise levels leads to psychological as well as physiological problems and other ill effects, and it results in deteriorated work efficiency, although the exact nature of the effect on work performance is still unknown. Neuro-fuzzy tools are being used in research to predict this deterioration in work efficiency. It has been established that a neuro-fuzzy computing system helps in the identification and analysis of fuzzy models, and the last decade has seen substantial growth in the development of various neuro-fuzzy systems…
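As a toy illustration of the fuzzy half of such a neuro-fuzzy system, the sketch below runs a zero-order Sugeno-style inference from noise level to an efficiency-deterioration score. The membership functions and rule consequents are invented for the example; in a neuro-fuzzy system such as ANFIS these parameters would be learned from data.

```python
# Toy zero-order Sugeno-style fuzzy inference: noise level (dB) -> predicted
# work-efficiency deterioration (%). All parameters are invented for illustration.
import math

def gauss(x, mean, sigma):
    """Gaussian membership function."""
    return math.exp(-((x - mean) ** 2) / (2 * sigma ** 2))

# Rules: (membership parameters for the noise input, crisp consequent in %).
rules = [
    ((70.0, 8.0), 5.0),    # "moderate noise"  -> small deterioration
    ((85.0, 8.0), 20.0),   # "high noise"      -> moderate deterioration
    ((100.0, 8.0), 45.0),  # "very high noise" -> large deterioration
]

def predict_deterioration(noise_db):
    """Firing-strength-weighted average of rule consequents (defuzzification)."""
    weights = [gauss(noise_db, m, s) for (m, s), _ in rules]
    outputs = [out for _, out in rules]
    return sum(w * o for w, o in zip(weights, outputs)) / sum(weights)

for db in (72, 88, 98):
    print(db, "dB ->", round(predict_deterioration(db), 1), "% deterioration")
```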
Forest fires continue to increase during the dry season, and they are difficult to stop. High temperatures in the dry season raise the drought index, so the forest could potentially burn at any time, and the government must therefore conduct surveillance throughout the dry season. Continuous surveillance without focusing on particular times is ineffective and inefficient, because preventive measures are carried out without knowledge of the potential fire risk. In the Keetch-Byram Drought Index (KBDI), the drought factor formulation calculates only today's drought, based on current weather conditions and yesterday's drought index. However, to find out the drought factors for the following day, the data…
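For reference, the daily KBDI update the abstract alludes to has a standard published form (Keetch & Byram, 1968), sketched below with temperatures in °F and rainfall in inches. The constants are as commonly quoted in the literature, not values verified against this paper, and the example inputs are hypothetical.

```python
# Keetch-Byram Drought Index update as commonly quoted in the literature:
# today's index = yesterday's index - rainfall effect + drought factor.
# Units: temperature in deg F, rainfall in inches; KBDI ranges 0-800.
import math

def drought_factor(kbdi, t_max_f, annual_rain_in):
    """Daily drought factor from yesterday's KBDI and today's max temperature."""
    numerator = (800.0 - kbdi) * (0.968 * math.exp(0.0486 * t_max_f) - 8.30)
    denominator = 1.0 + 10.88 * math.exp(-0.0441 * annual_rain_in)
    return (numerator / denominator) * 1e-3

def kbdi_today(kbdi_yesterday, t_max_f, net_rain_in, annual_rain_in):
    """One-day update: rainfall lowers the index, the drought factor raises it."""
    kbdi = max(0.0, kbdi_yesterday - 100.0 * net_rain_in)
    return min(800.0, kbdi + drought_factor(kbdi, t_max_f, annual_rain_in))

# Example: a hot, rainless day after an index of 350 in a 60 in/yr rainfall region.
print(round(kbdi_today(350.0, 95.0, 0.0, 60.0), 1))
```

Because the formula needs only yesterday's index and a temperature, forecasting tomorrow's drought factor reduces to supplying a forecast temperature, which is the gap the study addresses.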