The permeability distribution map is one of the most essential products of geologic model building because permeability governs fluid flow through the reservoir, making it the parameter with the greatest influence on history matching. For that reason, it is the petrophysical property most frequently tuned during history matching. Unfortunately, predicting the relationship between static petrophysics (porosity) and dynamic petrophysics (permeability) from conventional well logs is a difficult problem for conventional statistical methods in heterogeneous formations. This paper therefore examines the ability and performance of an artificial intelligence method in permeability prediction and compares its results with the flow zone indicator method for a heterogeneous Iraqi carbonate formation. The methodology can be summarized as follows: permeability was estimated by two methods, flow zone indicator and artificial intelligence; two reservoir models were built that differ only in the permeability estimation method; simulation runs were conducted on both models; and the two estimation methods were evaluated by comparing their effect on the model history matching. The results showed that the model with permeability predicted by artificial intelligence matched the observed data for the different reservoir responses more accurately than the model with permeability predicted by the flow zone indicator method, as reflected in the closer agreement between observed data and simulated results across all reservoir responses.
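The flow zone indicator (FZI) approach mentioned above relates permeability to porosity through the reservoir quality index (RQI) of Amaefule et al.; a minimal sketch of the standard relations (the function names and sample values are illustrative, not taken from the paper):

```python
import math

def fzi_from_core(k_md: float, phi: float) -> float:
    """FZI = RQI / phi_z, with RQI = 0.0314 * sqrt(k / phi) (k in mD, phi as a
    fraction) and normalized porosity phi_z = phi / (1 - phi)."""
    rqi = 0.0314 * math.sqrt(k_md / phi)
    return rqi * (1.0 - phi) / phi

def permeability_from_fzi(phi: float, fzi: float) -> float:
    """Invert the FZI relation to predict permeability (mD):
    k = FZI^2 * phi^3 / (0.0314^2 * (1 - phi)^2), i.e. ~1014 * FZI^2 * phi^3 / (1-phi)^2."""
    return fzi**2 * phi**3 / (0.0314**2 * (1.0 - phi)**2)
```

In FZI-based workflows, core samples are grouped into hydraulic flow units by their FZI values, and each unit's mean FZI is used with log-derived porosity to predict permeability in uncored wells.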
A data-centric technique, data aggregation via a modified algorithm based on fuzzy clustering with a Voronoi diagram, called the modified Voronoi Fuzzy Clustering Algorithm (VFCA), is presented in this paper. In the modified algorithm, the sensed area is divided into a number of Voronoi cells by applying a Voronoi diagram, and these cells are clustered by the fuzzy C-means method (FCM) to reduce the transmission distance. An appropriate cluster head (CH) is then elected for each cluster. Three parameters are used in this election process: the energy, the distance between the CH and its neighboring sensors, and the packet loss value. Furthermore, data aggregation is employed in each CH to reduce the amount of data transmission, which le
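The FCM step used to group the Voronoi cells can be sketched with the standard fuzzy C-means update equations; this is a generic NumPy illustration, not the paper's VFCA implementation, and the parameter names are assumptions:

```python
import numpy as np

def fuzzy_c_means(points, c=2, m=2.0, iters=50, seed=0):
    """Minimal fuzzy C-means over (n, d) coordinates (e.g. Voronoi cell sites).
    Returns (centers, memberships); each row of the membership matrix sums to 1."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(points), c))
    u /= u.sum(axis=1, keepdims=True)                  # random fuzzy partition
    p = 2.0 / (m - 1.0)                                # exponent in the update rule
    for _ in range(iters):
        um = u ** m
        centers = um.T @ points / um.sum(axis=0)[:, None]          # weighted means
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        u = d ** (-p) / np.sum(d ** (-p), axis=1, keepdims=True)   # membership update
    return centers, u
```

Hard cluster assignments (for CH election within each cluster) follow by taking the arg-max membership per point.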
The production of power using the pressure-retarded osmosis (PRO) process has been studied both experimentally and theoretically for simulated seawater versus river water and deionized water under two cases: the first simulates real conditions of seawater and river water, and the second uses a low brine-solution concentration to examine the full power-pressure profile. The influence of concentration polarization (CP) on water flux has been examined as well.
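The power-pressure profile examined above follows, in the ideal (CP-free) case, from the textbook PRO relations; a minimal sketch using assumed symbols (A: membrane water permeability, Δπ: osmotic pressure difference, ΔP: applied hydraulic pressure), which peaks at ΔP = Δπ/2:

```python
def pro_power_density(A, delta_pi, delta_p):
    """Ideal PRO power density W = Jw * dP, with water flux Jw = A * (d_pi - dP).
    Concentration polarization is ignored in this ideal sketch; in practice CP
    lowers the effective osmotic driving force and hence the flux."""
    jw = A * (delta_pi - delta_p)   # water flux across the membrane
    return jw * delta_p             # power density = flux * applied pressure
```

With A = 1 and Δπ = 10 (arbitrary consistent units), the maximum W = A·Δπ²/4 = 25 occurs at ΔP = 5.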
Wildfire risk has increased globally during the past few years due to several factors. An efficient and fast response to wildfires is extremely important to reduce their damaging effect on humans and wildlife. This work introduces a methodology for designing an efficient machine learning system to detect wildfires using satellite imagery. A convolutional neural network (CNN) model is optimized to reduce the required computational resources. Because images containing fire are limited and scenes vary seasonally, an image augmentation process is used to develop adequate training samples covering the change in the forest's visual features and the seasonal wind direction at the study area during the fire season. The selected CNN model (Mob
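Image augmentation of the kind described can be as simple as geometric transforms of each scarce fire image; a generic NumPy sketch (not the paper's pipeline, the function name is an assumption):

```python
import numpy as np

def augment(image):
    """Produce simple geometric variants of one training image: flips and
    90-degree rotations, mimicking changes in viewing geometry and in the
    direction of wind-driven fire/smoke plumes."""
    return [image,
            np.fliplr(image),      # mirror left-right
            np.flipud(image),      # mirror up-down
            np.rot90(image, 1),    # rotate 90 degrees
            np.rot90(image, 2),    # rotate 180 degrees
            np.rot90(image, 3)]    # rotate 270 degrees
```

Each labeled fire image thus yields six training samples, partially compensating for the limited number of real fire scenes.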
Malaysia's growing population and industrialisation have increased solid-waste accumulation in landfills, leading to a rise in leachate production. Leachate, a highly contaminated liquid from landfills, poses environmental risks and degrades water quality. Conventional leachate treatments are costly and time-consuming because they require additional chemicals. The electrocoagulation process can therefore be used as an alternative method. Electrocoagulation is an electrochemical water-treatment method that eliminates impurities by applying an electric current. In the present study, the optimisation of contaminant removal was investigated using response surface methodology. Three parameters were considered for optimisation: the curr
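Response surface methodology typically fits a second-order polynomial of the process factors to the measured response; a generic two-factor sketch via least squares (illustrative only, not the study's actual model or factor set):

```python
import numpy as np

def fit_quadratic_rsm(x1, x2, y):
    """Fit the second-order RSM model
    y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
    by ordinary least squares; x1, x2, y are 1-D arrays of design points."""
    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta
```

The fitted coefficients define the response surface whose stationary point gives the optimal factor settings (e.g. the current density and electrolysis time maximizing removal efficiency).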
Amplitude variation with offset (AVO) analysis is an efficient tool for hydrocarbon detection and for identifying elastic rock properties and fluid types. It has been applied in the present study using reprocessed pre-stack 2D seismic data (1992, Caulerpa) from the north-west of the Bonaparte Basin, Australia. The AVO response along the 2D pre-stack seismic data in the Laminaria High, NW shelf of Australia, was also investigated. Three hypotheses were proposed to investigate the AVO behaviour of the amplitude anomalies, in which three different factors (fluid substitution, porosity, and thickness (wedge model)) were tested. The AVO models with the synthetic gathers were analysed using log information to find which of these is the
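AVO behaviour is commonly summarized by the two-term Shuey approximation, in which an intercept and a gradient describe how reflectivity changes with incidence angle; a minimal sketch (a standard relation, not code from the study):

```python
import math

def shuey_two_term(r0, g, theta_deg):
    """Two-term Shuey approximation R(theta) = R0 + G * sin^2(theta):
    R0 is the normal-incidence reflectivity (intercept) and G the AVO gradient.
    Intercept-gradient crossplots of these two attributes underpin AVO
    classification of amplitude anomalies."""
    s = math.sin(math.radians(theta_deg))
    return r0 + g * s * s
```

A Class III gas-sand response, for example, shows a negative R0 that becomes more negative with angle (negative G).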
Steganography is defined as hiding confidential information in some chosen media without leaving any clear evidence of changing the media's features. Most traditional hiding methods embed the message directly in the cover media (text, image, audio, or video). Some hiding techniques leave a negative effect on the cover image, so the change in the carrier medium can sometimes be detected by humans or machines. The purpose of the suggested hiding method is to make this change undetectable. The current research focuses on a complex method, based on spiral search, to prevent detection of the hidden information by humans and machines; the Structural Similarity Index (SSIM) metric is used to assess the accuracy and quality
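The SSIM metric used to quantify cover-image quality compares luminance, contrast, and structure between the cover and stego images; a single-window (global) NumPy sketch of the standard formula, not the paper's implementation:

```python
import numpy as np

def ssim_global(x, y, data_range=255.0):
    """Global SSIM between two equal-size grayscale images:
    SSIM = ((2*mx*my + C1) * (2*cov + C2)) / ((mx^2 + my^2 + C1) * (vx + vy + C2)),
    with stabilizers C1 = (0.01*L)^2 and C2 = (0.03*L)^2 for dynamic range L.
    A value of 1.0 means the stego image is indistinguishable from the cover."""
    x = x.astype(float)
    y = y.astype(float)
    c1, c2 = (0.01 * data_range) ** 2, (0.03 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / ((mx * mx + my * my + c1) * (vx + vy + c2))
```

In practice SSIM is usually computed over sliding local windows and averaged, but the global form above conveys the idea.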
Logistic regression is considered one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X), where the dependent variable is a binary response taking two values (one when a specific event occurred and zero when that event did not happen), such as (injured and uninjured, married and unmarried). A large number of explanatory variables leads to the emergence of the multicollinearity problem, which makes the estimates inaccurate; the maximum likelihood method and the ridge regression method were used in estimating the binary-response logistic regression model by adopting the Jackna
The support vector machine (SVM) is a type of supervised learning model that can be used for classification or regression depending on the dataset. SVM classifies data points by determining the best hyperplane between two or more groups. Working with enormous datasets, on the other hand, can cause a variety of issues, including poor accuracy and long run times. In this research, SVM was extended by applying several kernel transformations, namely linear, polynomial, radial basis, and multi-layer kernels. The non-linear SVM classification model was illustrated and summarized in an algorithm using the kernel trick. The proposed method was examined using three simulated datasets with different sample
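The four kernels named above each replace the inner product in the SVM decision function with a similarity measure; minimal NumPy definitions of the standard forms (hyperparameter values are illustrative, not from the paper):

```python
import numpy as np

def linear_kernel(x, y):
    """Plain inner product: k(x, y) = <x, y>."""
    return x @ y

def polynomial_kernel(x, y, degree=3, coef0=1.0):
    """Polynomial kernel: k(x, y) = (<x, y> + c)^d."""
    return (x @ y + coef0) ** degree

def rbf_kernel(x, y, gamma=0.5):
    """Radial basis (Gaussian) kernel: k(x, y) = exp(-gamma * ||x - y||^2)."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

def sigmoid_kernel(x, y, alpha=0.01, coef0=0.0):
    """Multi-layer (sigmoid/MLP) kernel: k(x, y) = tanh(alpha * <x, y> + c)."""
    return np.tanh(alpha * (x @ y) + coef0)
```

The kernel trick lets the SVM separate classes with a hyperplane in the kernel-induced feature space without ever computing the mapping explicitly.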