Reservoir permeability plays a crucial role in reservoir characterization and in predicting the present and future production of hydrocarbon reservoirs. Well logging is an effective tool for obtaining a continuous permeability curve over the entire well section. Nuclear magnetic resonance (NMR) logging measurements are minimally influenced by lithology and offer significant benefits for permeability interpretation; the Schlumberger-Doll-Research (SDR) model uses NMR logging to estimate permeability accurately. The approach of this study is to apply artificial neural networks and core data to predict permeability in wells that lack an NMR log. The SDR permeability is used to train the model, and the model's predictions are validated against core permeability. Seven well logs were used as input parameters, and the model was constructed in Techlog software. The predicted permeability was cross-plotted against the SDR permeability, yielding a correlation coefficient of 94%, and was validated against the well's core permeability with good agreement (R2 = 80%). The model was then used to forecast permeability in a well without an NMR log, and the predicted permeability was cross-plotted against core permeability as a validation step, giving a correlation coefficient of 77%. The lower match is attributed to data limitations, demonstrating that precision improves as the amount of training data increases.
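The workflow above can be sketched as a small regression network mapping seven log curves to (log-scaled) permeability. This is a minimal illustration with synthetic data: the log names in the comment and the target relation are assumptions, not the study's actual dataset or Techlog model.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 500
# Stand-ins for seven log curves (e.g. GR, RHOB, NPHI, DT, RT, PEF, CALI).
logs = rng.normal(size=(n, 7))
# Synthetic "SDR permeability" target in log10 mD — a made-up relation.
log_perm = 0.8 * logs[:, 1] - 0.5 * logs[:, 2] + 0.1 * rng.normal(size=n)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8),
                                   max_iter=2000, random_state=0))
model.fit(logs[:400], log_perm[:400])
# Validate on held-out depths, as the study validates against core data.
r = np.corrcoef(model.predict(logs[400:]), log_perm[400:])[0, 1]
print(f"validation correlation: {r:.2f}")
```

In practice the held-out comparison would be a cross plot of predicted versus core permeability, as described in the abstract.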
The offline handwritten signature is an image-based behavioral biometric. Its main verification challenge is that an individual seldom signs the same signature twice; this is referred to as intra-user variability. This research aims to improve the recognition accuracy of offline signatures. The proposed method combines signature length normalization with the histogram of oriented gradients (HOG) to improve accuracy. For verification, a deep-learning technique using a convolutional neural network (CNN) is exploited to build the reference model for future prediction. Experiments are conducted using 4,000 genuine and 2,000 skilled forged signatures …
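The HOG step named above can be sketched in plain NumPy. This is a simplified per-cell version (no block normalization, unlike full HOG implementations such as scikit-image's), and the 32x64 "signature" image is a synthetic stand-in for a length-normalized signature.

```python
import numpy as np

def hog_descriptor(img, cell=8, bins=9):
    """Minimal HOG: per-cell, magnitude-weighted orientation histograms."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0   # unsigned orientation
    h, w = img.shape
    feats = []
    for i in range(0, h - cell + 1, cell):
        for j in range(0, w - cell + 1, cell):
            a = ang[i:i + cell, j:j + cell].ravel()
            m = mag[i:i + cell, j:j + cell].ravel()
            hist, _ = np.histogram(a, bins=bins, range=(0, 180), weights=m)
            feats.append(hist / (np.linalg.norm(hist) + 1e-6))  # L2-normalize
    return np.concatenate(feats)

# A 32x64 stand-in for a length-normalized signature image.
sig = np.zeros((32, 64))
sig[10:22, 8:56] = 1.0            # a crude rectangular "stroke"
desc = hog_descriptor(sig)
print(desc.shape)                 # 4 x 8 cells x 9 bins = (288,)
```

The resulting fixed-length descriptor is what a CNN (or any classifier) would consume as input or auxiliary features.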
Content-based image retrieval (CBIR) is a technique used to retrieve images from an image database. However, the CBIR process suffers from low accuracy when retrieving images from an extensive image database and from weak privacy guarantees for the images. This paper addresses the accuracy issue using deep-learning techniques, specifically the convolutional neural network (CNN), and provides the necessary privacy for images using the fully homomorphic encryption scheme of Cheon, Kim, Kim, and Song (CKKS). To achieve these aims, a system named RCNN_CKKS is proposed, comprising two parts. The first part (offline processing) extracts automated high-level features from the flattening layer of a CNN and then stores these features in a …
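The retrieval step can be sketched as nearest-neighbor search over CNN feature vectors. Here random vectors stand in for features from a CNN flattening layer, and the CKKS encryption layer is omitted entirely — in the proposed system these features would be encrypted before storage and the similarity evaluated homomorphically.

```python
import numpy as np

rng = np.random.default_rng(1)
# 100 stored feature vectors (stand-ins for CNN flatten-layer features).
db = rng.normal(size=(100, 512))
db /= np.linalg.norm(db, axis=1, keepdims=True)

# Query: a noisy copy of stored image 42, as if re-photographed.
query = db[42] + 0.05 * rng.normal(size=512)
query /= np.linalg.norm(query)

scores = db @ query                    # cosine similarity to every entry
best = int(np.argmax(scores))
print(best)                            # retrieves index 42
```

With CKKS, the dot products above are exactly the kind of arithmetic that can be evaluated on encrypted vectors, which is why feature-based retrieval pairs naturally with that scheme.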
The optical fiber sensor based on surface plasmon resonance (SPR) technology has demonstrated successful sensing performance and high sensitivity. This thesis investigates the performance of several SPR sensor structures in refractive-index and chemical-sensing applications. A multi-mode fiber - single-mode fiber - multi-mode fiber (MMF-SMF-MMF) structure …
The log-logistic distribution is one of the important statistical distributions, as it can be applied in many fields, including biological and other experiments; its importance stems from the need to determine the survival function of those experiments. This research compares the maximum likelihood method, the least squares method, and the weighted least squares method for estimating the parameters and survival function of the log-logistic distribution, using the comparison criteria MSE, MAPE, and IMSE, and applies them to real data for breast cancer patients. The results showed that the maximum likelihood method was best for estimating the paramete …
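The maximum-likelihood step can be sketched with SciPy, where the log-logistic is available as `scipy.stats.fisk` with survival function S(t) = 1 / (1 + (t/scale)^c). The simulated survival times below are illustrative, not the study's patient data.

```python
import numpy as np
from scipy.stats import fisk   # SciPy's name for the log-logistic

rng = np.random.default_rng(2)
# Simulated survival times (e.g. months), standing in for real data.
true_c, true_scale = 2.0, 30.0
data = fisk.rvs(true_c, scale=true_scale, size=300, random_state=rng)

# Maximum-likelihood estimation, with the location fixed at zero.
c_hat, loc, scale_hat = fisk.fit(data, floc=0)
print(f"shape ~ {c_hat:.2f}, scale ~ {scale_hat:.1f}")

# Estimated survival function S(t) at t = 30.
print(f"S(30) ~ {fisk.sf(30, c_hat, scale=scale_hat):.2f}")
```

Least-squares and weighted least-squares fits would instead minimize residuals between the empirical and model survival curves; criteria such as MSE or IMSE then compare the estimated S(t) against the empirical one.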
In data mining, classification is a form of data analysis used to extract models describing important data classes. Two well-known algorithms used in data-mining classification are the Backpropagation Neural Network (BNN) and Naïve Bayesian (NB). This paper investigates the performance of these two classification methods on the Car Evaluation dataset. Models were built for both algorithms and the results were compared. Our experimental results indicate that the BNN classifier yields higher accuracy than the NB classifier but is less efficient, because it is time-consuming and difficult to analyze due to its black-box implementation.
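The comparison can be sketched with scikit-learn, using a backpropagation-trained MLP versus Gaussian Naïve Bayes. A synthetic dataset stands in for the Car Evaluation data, so the accuracy figures are illustrative only; the timing contrast (iterative training versus closed-form fitting) is the point.

```python
import time
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

# Synthetic 4-class stand-in for the Car Evaluation dataset.
X, y = make_classification(n_samples=1000, n_features=6, n_informative=4,
                           n_classes=4, n_clusters_per_class=1,
                           random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

results = {}
for name, clf in [("BNN", MLPClassifier(hidden_layer_sizes=(20,),
                                        max_iter=1000, random_state=0)),
                  ("NB", GaussianNB())]:
    t0 = time.perf_counter()
    clf.fit(Xtr, ytr)                       # NB fits in closed form; the
    elapsed = time.perf_counter() - t0      # MLP iterates via backprop
    results[name] = accuracy_score(yte, clf.predict(Xte))
    print(f"{name}: accuracy={results[name]:.2f}, fit time={elapsed:.3f}s")
```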
Software Defined Networking (SDN) with centralized control provides a global view and efficient management of network resources. However, centralized controllers have several limitations related to scalability and performance, especially with the exponential growth of 5G communication. This paper proposes a novel traffic-scheduling algorithm to avoid congestion in the control plane. The Packet-In messages received from different 5G devices are classified into two classes, critical and non-critical 5G communication, by adopting a Dual-Spike Neural Network (DSNN) classifier implemented on a Virtualized Network Function (VNF). Dual spikes identify each class to increase the reliability of the classification …
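The scheduling side of the idea can be sketched as a two-level priority queue: classified-critical Packet-In messages are served before non-critical ones, FIFO within each class. A trivial type-lookup rule stands in here for the DSNN classifier, and the traffic-type names are assumptions for illustration.

```python
import heapq
from itertools import count

CRITICAL_TYPES = {"urllc", "emergency"}      # assumed critical 5G traffic

def classify(msg):
    """Stand-in for the DSNN classifier: 0 = critical, 1 = non-critical."""
    return 0 if msg["type"] in CRITICAL_TYPES else 1

queue, seq = [], count()                     # seq preserves FIFO per class
for msg in [{"id": "a", "type": "embb"},
            {"id": "b", "type": "urllc"},
            {"id": "c", "type": "mmtc"},
            {"id": "d", "type": "emergency"}]:
    heapq.heappush(queue, (classify(msg), next(seq), msg))

order = [heapq.heappop(queue)[2]["id"] for _ in range(len(queue))]
print(order)   # critical messages drain first: ['b', 'd', 'a', 'c']
```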
The present study aimed to use magnetic fields and nanotechnology in water purification, offering high efficiency in removing biological contaminants such as viruses and bacteria, as an alternative to chemical and physical treatments such as chlorine, bromine, ultraviolet light, boiling, sedimentation, distillation, and ozone, which have a direct negative impact on human safety and the environment. The presence of coliphages in the water samples under study was investigated using the single-agar-layer method, and the phage-positive samples were then treated with three fixed magnetic-field configurations (north pole, south pole, and bipolar) to compare the re…
Deep-learning algorithms have recently achieved considerable success, especially in computer vision. This research describes a classification method applied to a dataset of multiple image types (Synthetic Aperture Radar (SAR) images and non-SAR images). The classification used transfer learning followed by fine-tuning, with architectures pre-trained on the well-known ImageNet database. The VGG16 model was used as a feature extractor, and a new classifier was trained on the extracted features. The input dataset consists of five classes: the SAR image class (houses) and the non-SAR image classes (cats, dogs, horses, and humans). The Conv…
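The transfer-learning pattern — frozen feature extractor, newly trained classifier head — can be sketched without downloading VGG16 weights. A fixed random projection stands in for the pre-trained convolutional base, and the toy images with class-dependent intensity are assumptions purely for illustration; only the frozen-extractor-plus-trained-head structure mirrors the abstract.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
classes = ["SAR_houses", "cats", "dogs", "horses", "humans"]

W = rng.normal(size=(64 * 64, 256))   # "pre-trained" weights, never updated

def extract(images):
    """Frozen extractor: flatten, project, ReLU (stand-in for VGG16)."""
    return np.maximum(images.reshape(len(images), -1) @ W, 0)

# Toy images: each class gets a distinct mean intensity.
X = np.concatenate([rng.normal(loc=k, scale=0.5, size=(40, 64, 64))
                    for k in range(len(classes))])
y = np.repeat(np.arange(len(classes)), 40)

feats = extract(X)                    # only the head below is trained
head = make_pipeline(StandardScaler(),
                     LogisticRegression(max_iter=1000))
head.fit(feats[::2], y[::2])
acc = head.score(feats[1::2], y[1::2])
print(f"held-out accuracy: {acc:.2f}")
```

Fine-tuning, as described in the abstract, would additionally unfreeze and update some extractor layers, which this sketch does not do.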
Prediction of daily rainfall is important for flood forecasting, reservoir operation, and many other hydrological applications. Artificial intelligence (AI) algorithms are generally used for stochastic rainfall forecasting but cannot simulate unseen extreme rainfall events, which are becoming common due to climate change. A new model is developed in this study for predicting daily rainfall at different lead times based on sea-level pressure (SLP), which is physically related to rainfall on land and thus able to predict unseen rainfall events. Daily rainfall on the east coast of Peninsular Malaysia (PM) was predicted using SLP data over the climate domain. Five advanced AI algorithms, such as the extreme learning machine (ELM), Bay…
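Of the algorithms named, the extreme learning machine (ELM) is compact enough to sketch directly: random, fixed hidden-layer weights with only the output layer solved by least squares. The five-predictor synthetic data below is a stand-in for the SLP fields, not the study's dataset.

```python
import numpy as np

rng = np.random.default_rng(4)

def elm_fit(X, y, hidden=50):
    """ELM training: random hidden layer, least-squares output layer."""
    W = rng.normal(size=(X.shape[1], hidden))      # fixed input weights
    b = rng.normal(size=hidden)                    # fixed biases
    H = np.tanh(X @ W + b)                         # hidden activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)   # solve output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

X = rng.normal(size=(300, 5))                      # 5 predictor series
y = np.sin(X[:, 0]) + 0.3 * X[:, 1]                # nonlinear target
W, b, beta = elm_fit(X[:200], y[:200])
pred = elm_predict(X[200:], W, b, beta)
r = np.corrcoef(pred, y[200:])[0, 1]
print(f"test correlation: {r:.2f}")
```

Because only a linear system is solved, ELM trains orders of magnitude faster than backpropagation, which is why it is popular for daily-scale hydrological forecasting.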