Predicting permeability is a cornerstone of petroleum reservoir engineering and plays a vital role in optimizing hydrocarbon recovery strategies. This paper explores the application of neural networks to predicting permeability in oil reservoirs, underscoring their growing importance in addressing the shortcomings of traditional prediction methods. Conventional techniques often struggle with the complexities of subsurface conditions, making innovative approaches essential. Neural networks, with their ability to uncover complicated patterns in large datasets, emerge as a powerful alternative. In this study, the Quanti-Elan model was used to combine several well logs for estimating mineral volumes, porosity, and water saturation. The model goes beyond simple lithology prediction to provide a detailed quantification of primary minerals (e.g., calcite and dolomite) as well as secondary ones (e.g., shale and anhydrite). The results show strong lithological contrast, with high-porosity layers correlating with possible reservoir zones. The richness of the Quanti-Elan interpretation goes beyond what log analysis alone can reveal. The methodology is described in depth, covering the approaches used to train the neural networks (e.g., data processing and network architecture). A case study is presented in which neural-network permeability predictions for a particular oil well are compared with core measurements; the predicted and measured values agree closely, further emphasizing the power of this approach. An extrapolated neural network model using lithology (dolomite and limestone) and porosity as inputs confirms the close match between predicted and observed carbonate reservoir permeability. This case study demonstrates the ability of neural networks to accurately characterize and predict permeability in complex carbonate systems.
The results therefore confirm that neural networks are a reliable and transformative tool for oil reservoir management, one that can make future predictive methodologies, and hence hydrocarbon recovery operations, more efficient.
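To make the idea concrete, the sketch below trains a minimal one-hidden-layer network in pure Python on synthetic porosity/lithology data. Everything here is invented for illustration: the synthetic log-permeability formula, the network size, and the training settings do not reflect the study's data or architecture.

```python
import math, random

random.seed(0)

# Invented synthetic data: log-permeability rises with porosity and with
# dolomite fraction. Coefficients are illustrative, not from the study.
def make_sample():
    porosity = random.uniform(0.05, 0.30)
    dolomite = random.uniform(0.0, 1.0)          # dolomite volume fraction
    limestone = 1.0 - dolomite                   # remainder is limestone
    logk = 10.0 * porosity + 2.0 * dolomite + random.gauss(0, 0.1)
    return [porosity, dolomite, limestone], logk

data = [make_sample() for _ in range(200)]

# One hidden layer of 4 tanh units, trained by stochastic gradient descent.
H, lr = 4, 0.01
w1 = [[random.gauss(0, 0.5) for _ in range(3)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.gauss(0, 0.5) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(w1, b1)]
    return h, sum(w * hi for w, hi in zip(w2, h)) + b2

for epoch in range(300):
    for x, t in data:
        h, y = forward(x)
        err = y - t                              # d(loss)/dy for 0.5*err^2
        for j in range(H):
            grad_h = err * w2[j] * (1.0 - h[j] ** 2)
            w2[j] -= lr * err * h[j]
            for i in range(3):
                w1[j][i] -= lr * grad_h * x[i]
            b1[j] -= lr * grad_h
        b2 -= lr * err

# A tight, dolomite-poor interval vs. a porous, dolomite-rich one.
_, low = forward([0.08, 0.2, 0.8])
_, high = forward([0.28, 0.8, 0.2])
print(f"predicted log-permeability: low={low:.2f}, high={high:.2f}")
```

After training, the network should predict noticeably higher log-permeability for the high-porosity, dolomite-rich input, mirroring the trend built into the synthetic data.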
Dust is a frequent contributor to health risks and changes in the climate, and is one of the most dangerous issues facing people today. Desertification, drought, agricultural practices, and sand and dust storms from neighboring regions all contribute to the problem. A deep learning (DL) regression model based on long short-term memory (LSTM) networks is proposed to increase the accuracy of dust forecasting and monitoring. The proposed system has two parts for detecting and monitoring dust: in the first step, LSTM and dense layers are used to build a dust-detection system, while in the second step, the proposed Wireless Sensor Network (WSN) and Internet of Things (IoT) model is used for forecasting and monitoring. The experimental DL system…
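To make the LSTM component concrete, the sketch below implements the forward pass of a single scalar LSTM cell in pure Python and runs a short sequence of invented, normalized dust readings through it. The weights and data are illustrative placeholders, not the paper's trained model.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h_prev, c_prev, W):
    """One LSTM step for scalar input/state.

    W maps each gate name to (input weight, recurrent weight, bias).
    """
    i = sigmoid(W['i'][0] * x + W['i'][1] * h_prev + W['i'][2])    # input gate
    f = sigmoid(W['f'][0] * x + W['f'][1] * h_prev + W['f'][2])    # forget gate
    o = sigmoid(W['o'][0] * x + W['o'][1] * h_prev + W['o'][2])    # output gate
    g = math.tanh(W['g'][0] * x + W['g'][1] * h_prev + W['g'][2])  # candidate
    c = f * c_prev + i * g        # new cell state: keep some memory, add some new
    h = o * math.tanh(c)          # new hidden state exposed to the next layer
    return h, c

# Placeholder weights and a short sequence of normalized dust readings.
W = {k: (0.5, 0.5, 0.0) for k in 'ifog'}
h, c = 0.0, 0.0
for x in [0.1, 0.4, 0.9, 0.7]:
    h, c = lstm_step(x, h, c, W)
print(f"final hidden state: {h:.4f}")
```

In the proposed system this cell would be stacked with dense layers and trained on historical dust measurements; here it only shows how the gates combine the previous state with each new reading.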
Skull image separation is one of the initial procedures used to detect brain abnormalities. In an MRI image of the brain, this process involves distinguishing brain tissue from non-brain tissue. Even for experienced radiologists, separating the brain from the skull is a difficult task, and the accuracy of the results can vary considerably from one individual to the next. Skull stripping of brain magnetic resonance volumes has therefore become increasingly popular, owing to the need for a dependable, accurate, and thorough method for processing brain datasets. Furthermore, skull stripping must be performed accurately for neuroimaging diagnostic systems, since neither…
... Show MoreImage retrieval is an active research area in image processing, pattern recognition, and
computer vision. In this proposed method, there are two techniques to extract the feature
vector, the first one is applying the transformed algorithm on the whole image and the second
is to divide the image into four blocks and then applying the transform algorithm on each part
of the image. In each technique there are three transform algorithm that have been applied
(DCT, Walsh Transform, and Kekre’s Wavelet Transform) then finding the similarity and
indexing the images, useing the correlation between feature vector of the query image and
images in database. The retrieved method depends on higher indexing number. <
This study aims to enhance the RC5 algorithm to improve encryption and decryption speeds in devices with limited power and memory resources. Such resource-constrained applications, which range from wearables and smart cards to microscopic sensors, frequently operate in settings where traditional cryptographic techniques are impracticable because of their high computational overhead and memory requirements. The Enhanced RC5 (ERC5) algorithm integrates the PKCS#7 padding method to adapt effectively to various data sizes. Empirical investigation reveals significant improvements in encryption speed with ERC5, ranging from 50.90% to 64.18% for audio files and 46.97% to 56.84% for image files, depending on file size. A substantial…
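PKCS#7 padding itself is standard and easy to sketch: the message is extended to a block boundary with bytes whose value equals the number of padding bytes added, so the receiver can always strip it unambiguously. The helpers below are a generic illustration of that scheme, not the ERC5 implementation.

```python
def pkcs7_pad(data: bytes, block_size: int = 16) -> bytes:
    """Pad to a multiple of block_size; each pad byte equals the pad length.

    A full block of padding is added when data already fits exactly, so
    unpadding is never ambiguous.
    """
    pad_len = block_size - (len(data) % block_size)
    return data + bytes([pad_len]) * pad_len

def pkcs7_unpad(data: bytes) -> bytes:
    """Strip PKCS#7 padding, validating every pad byte."""
    pad_len = data[-1]
    if pad_len < 1 or pad_len > len(data) or data[-pad_len:] != bytes([pad_len]) * pad_len:
        raise ValueError("invalid PKCS#7 padding")
    return data[:-pad_len]

msg = b"RC5 input block"              # 15 bytes -> one padding byte of value 1
padded = pkcs7_pad(msg, 16)
print(len(padded), padded[-1], pkcs7_unpad(padded) == msg)
```

The padded output is then a whole number of cipher blocks, which is what lets a block cipher like RC5 process inputs of arbitrary length.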
Agent-based modeling is currently used extensively to analyze complex systems. Its growth has been supported by its ability to convey distinct levels of interaction in a complex, detailed environment. Meanwhile, agent-based models tend to become progressively more complex, so powerful modeling and simulation techniques are needed to address this rise in complexity. In recent years, a number of platforms for developing agent-based models have been developed. In practice, most agent-based models present a discrete representation of the environment and a single level of interaction; two or three levels are hardly ever considered. The key issue is that modellers' work in these areas is not assisted by simulation platforms…
Iron is one of the abundant elements on earth; it is essential for humans but may be a troublesome element in water supplies. In this research, an ANN model was developed to predict iron concentrations at the Al-Wahda water treatment plant in Baghdad by assessing iron concentrations at seven WTPs upstream on the Tigris River. SPSS software was used to build the ANN model. The input data were iron concentrations in the raw water for the period 2004-2011. The results indicated that the best model predicted iron concentrations at the Al-Wahda WTP with a coefficient of determination of 0.9142. The model used one hidden layer with two nodes, and the testing error was 0.834. The ANN model could…
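The coefficient of determination quoted above (0.9142) is the standard goodness-of-fit measure R² = 1 − SS_res/SS_tot. A minimal sketch of the computation, with invented iron concentrations standing in for the study's data:

```python
def r_squared(observed, predicted):
    """Coefficient of determination: R^2 = 1 - SS_res / SS_tot."""
    mean = sum(observed) / len(observed)
    ss_tot = sum((y - mean) ** 2 for y in observed)          # total variance
    ss_res = sum((y - p) ** 2 for y, p in zip(observed, predicted))  # residuals
    return 1.0 - ss_res / ss_tot

# Illustrative iron concentrations (mg/L) and hypothetical model predictions.
obs = [0.12, 0.18, 0.25, 0.31, 0.40]
pred = [0.13, 0.17, 0.26, 0.30, 0.38]
print(f"R^2 = {r_squared(obs, pred):.4f}")
```

An R² near 1 means the model explains almost all of the variance in the observed concentrations, which is how the 0.9142 figure should be read.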
The combined electrocoagulation (EC) and electro-oxidation (EO) system is one of the most promising methods for dye removal. In this work, a 200 mg/l solution of Congo red was used to examine anionic dye removal in an EC-EO system with three stainless steel electrodes as auxiliary electrodes, an aluminum electrode as the anode for the EC process, and a Cu-Mn-Ni nanocomposite as the anode for the EO process. The composite oxide was synthesized by simultaneous anodic and cathodic deposition of Cu(NO3)2, MnCl2, and Ni(NO3)2 salts, each at a concentration of 0.075 M with a fixed molar ratio (1:1:1), at a constant current density of 25 mA/cm2. The structure and surface morphology of the deposited…
Most companies use social media data for business. Sentiment analysis automatically gathers, analyzes, and summarizes this type of data. Managing unstructured social media data is difficult, and noisy data is a particular challenge for sentiment analysis. Since over 50% of the sentiment analysis process is data pre-processing, processing big social media data is challenging too. If pre-processing is carried out correctly, data accuracy may improve. The stages of the sentiment analysis workflow are also highly interdependent. Because no pre-processing technique works well in all situations or with all data sources, choosing the most important ones is crucial, and prioritization is an excellent technique for doing so. As one of many Multi-Criteria Decision-Making…
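As a sketch of the kind of pre-processing step the abstract describes, the function below applies a few common operations to a tweet: lowercasing, stripping URLs, @mentions, and punctuation, then tokenizing and removing stopwords. The stopword list and example tweet are illustrative, not drawn from the paper.

```python
import re

# Illustrative stopword subset; real pipelines use much larger lists.
STOPWORDS = {"the", "is", "a", "an", "to", "and", "of"}

def preprocess(tweet: str) -> list:
    """Common sentiment pre-processing: lowercase, strip noise, tokenize."""
    text = tweet.lower()
    text = re.sub(r"https?://\S+|@\w+|#", " ", text)   # URLs, mentions, '#'
    text = re.sub(r"[^a-z\s]", " ", text)              # punctuation, digits
    tokens = text.split()
    return [t for t in tokens if t not in STOPWORDS]

print(preprocess("Loving the new phone!! @shop https://x.co #happy"))
```

Each of these operations is one of the candidate techniques a prioritization method would rank; the point of the abstract is deciding which such steps matter most for a given data source.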