Porosity plays an essential role in petroleum engineering. It controls fluid storage in aquifers, while the connectivity of the pore structure controls fluid flow through reservoir formations. To quantify the relationships between porosity, storage, transport, and rock properties, however, the pore structure must be measured and quantitatively described. Porosity estimation from digital images using image processing is essential for reservoir rock analysis, since it provides a concise description of the sample's 2D porosity. The usual procedure relies on binarization, which applies a pixel-value threshold to convert color and grayscale images to binary images. The idea is to associate the blue-dyed regions entirely with pores and convert them to white in the resulting binary image. This paper presents the possibilities of using image processing to determine the 2D porosity of digital rock samples from carbonate reservoir rocks. MATLAB code was created that automatically segments the images and determines the digital rock porosity based on Otsu's thresholding algorithm. In this work, twenty-two 2D thin-section petrographic images of reservoir rocks from one Iraqi oil field are studied. The thin-section images are processed and digitized using MATLAB. The present study focuses on determining the micro- and macroporosity of the digital images. Some pore-void characteristics, such as area and perimeter, were also calculated. The digital 2D image analysis results are compared with laboratory core investigation results to determine the strengths and limitations of the digital image interpretation techniques. Porosity determined from the thin microscopic images using the Otsu technique showed a moderate match with core porosity.
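As a rough illustration of this kind of Otsu-based porosity workflow (the original work uses MATLAB; the Python/scikit-image sketch below, the file name, and the assumption that the blue-dyed pore space ends up as the darker phase after grayscale conversion are not from the paper):

```python
from skimage import io, color
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

def porosity_from_thin_section(path):
    """Estimate 2D porosity of a thin-section image via Otsu thresholding.

    Assumes pores are the darker phase after grayscale conversion;
    'path' is a hypothetical image file name.
    """
    rgb = io.imread(path)
    gray = color.rgb2gray(rgb)        # grayscale values in [0, 1]
    t = threshold_otsu(gray)          # global Otsu threshold
    pore_mask = gray < t              # True where a pixel belongs to pore space
    porosity = pore_mask.mean()       # pore pixels / total pixels
    return porosity, pore_mask

def pore_geometry(pore_mask):
    """Area and perimeter of each connected pore region (pixel units)."""
    regions = regionprops(label(pore_mask))
    return [(r.area, r.perimeter) for r in regions]
```

The same pore mask can then feed the per-pore area and perimeter measurements mentioned in the abstract.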
In this study, a new technique is considered for solving linear fractional Volterra-Fredholm integro-differential equations (LFVFIDEs) with the fractional derivative defined in the Caputo sense. The method is based on three types of Lagrange polynomials (LPs): the original Lagrange polynomial (OLP), the barycentric Lagrange polynomial (BLP), and the modified Lagrange polynomial (MLP). A general algorithm is suggested, and examples are included to assess the effectiveness and implementation of these types. Also, as a special case, a fractional differential equation is solved to evaluate the validity of the proposed method. Finally, a comparison between the proposed method and other methods is presented to demonstrate the effectiveness of the proposed method.
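The barycentric form is the computational core of the BLP variant. Below is a minimal sketch of barycentric Lagrange interpolation only; it shows the interpolation building block, not the full collocation scheme for the Caputo-fractional equations, and the Chebyshev nodes in the example are an illustrative choice:

```python
import numpy as np

def barycentric_weights(x):
    """w_j = 1 / prod_{k != j} (x_j - x_k) for distinct nodes x."""
    n = len(x)
    w = np.ones(n)
    for j in range(n):
        for k in range(n):
            if k != j:
                w[j] /= (x[j] - x[k])
    return w

def barycentric_eval(x, f, w, t):
    """Evaluate the barycentric Lagrange interpolant of data (x, f) at scalar t."""
    exact = np.where(np.isclose(t, x))[0]
    if exact.size:                     # t coincides with an interpolation node
        return f[exact[0]]
    tmp = w / (t - x)
    return np.dot(tmp, f) / np.sum(tmp)

# Example: interpolate sin(x) on 11 Chebyshev nodes and evaluate at t = 0.3
x = np.cos(np.pi * np.arange(11) / 10)
w = barycentric_weights(x)
print(barycentric_eval(x, np.sin(x), w, 0.3))   # close to sin(0.3)
```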
Meloxicam (MLX) is a non-steroidal anti-inflammatory, poorly water-soluble, highly permeable drug, and the rate of its oral absorption is often controlled by the dissolution rate in the gastrointestinal tract. Solid dispersion (SD) is an effective technique for enhancing the solubility and dissolution rate of such drugs.
The present study aims to enhance the solubility and dissolution rate of MLX by the SD technique, prepared by the solvent evaporation method using sodium alginate (SA), hyaluronic acid (HA), collagen, and xyloglucan (XG) as gastro-protective hydrophilic natural polymers.
Twelve formulas were prepared at different drug:polymer ratios and evaluated for their percentage yield, drug content, and water solubility…
A data-centric technique, data aggregation via a modified algorithm based on fuzzy clustering with a Voronoi diagram, called the modified Voronoi Fuzzy Clustering Algorithm (VFCA), is presented in this paper. In the modified algorithm, the sensed area is divided into a number of Voronoi cells by applying a Voronoi diagram, and these cells are clustered by the fuzzy C-means (FCM) method to reduce the transmission distance. An appropriate cluster head (CH) is then elected for each cluster. Three parameters are used in this election process: the energy, the distance between the CH and its neighboring sensors, and the packet loss values. Furthermore, data aggregation is employed in each CH to reduce the amount of data transmission, which le…
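A minimal sketch of the two steps the abstract names, fuzzy C-means clustering of the nodes and a score-based cluster-head election, is given below; the scoring rule and its weights are illustrative assumptions rather than the paper's VFCA formulation, and the Voronoi partitioning step is omitted:

```python
import numpy as np

def fuzzy_c_means(points, n_clusters, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy C-means: returns cluster centres and the membership matrix."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(points), n_clusters))
    u /= u.sum(axis=1, keepdims=True)            # memberships sum to 1 per node
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ points) / um.sum(axis=0)[:, None]
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        d = np.fmax(d, 1e-12)                    # guard against zero distances
        u = 1.0 / ((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0))).sum(axis=2)
    return centers, u

def elect_cluster_head(node_xy, energy, packet_loss, center, w=(0.5, 0.3, 0.2)):
    """Illustrative CH score: favour residual energy, penalise distance and packet loss."""
    dist = np.linalg.norm(node_xy - center, axis=1)
    score = w[0] * energy - w[1] * dist - w[2] * packet_loss
    return int(np.argmax(score))                 # index of the elected node

# Toy example: 50 sensor nodes in a 100 m x 100 m field, 4 clusters
rng = np.random.default_rng(1)
xy = rng.uniform(0, 100, size=(50, 2))
centers, u = fuzzy_c_means(xy, n_clusters=4)
members = np.argmax(u, axis=1)                   # hard assignment per node
in0 = members == 0
ch = elect_cluster_head(xy[in0], rng.random(in0.sum()), rng.random(in0.sum()), centers[0])
```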
The support vector machine (SVM) is a type of supervised learning model that can be used for classification or regression, depending on the dataset. SVM classifies data points by determining the best separating hyperplane between two or more groups. Working with enormous datasets, however, can cause a variety of issues, including poor accuracy and long computation times. In this research, SVM was extended by applying several kernel transformations: linear, polynomial, radial basis, and multi-layer kernels. The non-linear SVM classification model was illustrated and summarized in an algorithm using kernel tricks. The proposed method was examined using three simulation datasets with different sample…
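A brief sketch of kernel selection with scikit-learn is shown below; the synthetic dataset stands in for the paper's simulation data, and the sigmoid kernel is used here as a stand-in for the "multi-layer" kernel named in the abstract:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Synthetic data standing in for the paper's simulation datasets
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Kernel tricks: fit one SVM per kernel and compare test accuracy
for kernel in ("linear", "poly", "rbf", "sigmoid"):
    clf = SVC(kernel=kernel, C=1.0, gamma="scale").fit(X_train, y_train)
    print(f"{kernel:8s} accuracy = {accuracy_score(y_test, clf.predict(X_test)):.3f}")
```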
Drip irrigation is one of the water-conserving irrigation techniques, since it supplies water directly onto the soil through an emitter and can deliver water and fertilizer directly into the root zone. An equation to estimate the wetted area in unsaturated soil, taking into account water absorption by roots, is simulated numerically using the HYDRUS (2D/3D) software. In this paper, HYDRUS provides analytical models for estimating the different soil hydraulic properties. One soil type, sandy loam, was used with three types of crops (corn, tomato, and sweet sorghum), different drip discharges, different assumed initial soil moisture contents, and different time durations. The relative error for the different hydraulic…
For modeling a photovoltaic module, it is necessary to calculate the basic parameters that control the current-voltage characteristic curves and are not provided by the manufacturer. For a monocrystalline silicon module, the shunt resistance is generally high, so it is neglected in this model. In this study, three methods are presented for the four-parameter model: an explicit simplified method based on an analytical solution, a slope method based on manufacturer data, and an iterative method based on numerical resolution. The results obtained with these methods were compared with experimentally measured data. The iterative method was more accurate than the other two methods but more complex. The average deviation of…
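For context, a minimal sketch of the four-parameter (single-diode, shunt resistance neglected) model solved by simple fixed-point iteration is given below; the parameter values and the iteration scheme are illustrative and not taken from the paper:

```python
import numpy as np

K_B, Q_E = 1.380649e-23, 1.602176634e-19     # Boltzmann constant, electron charge

def module_current(V, Iph, I0, Rs, n, Ns=36, T=298.15, n_iter=50):
    """Four-parameter model: I = Iph - I0*(exp((V + I*Rs)/a) - 1), a = n*Ns*k*T/q.

    The implicit I-V relation is solved for I by fixed-point iteration
    (an illustrative numerical choice, not necessarily the paper's scheme).
    """
    a = n * Ns * K_B * T / Q_E               # modified ideality factor [V]
    V = np.atleast_1d(np.asarray(V, dtype=float))
    I = np.full_like(V, Iph)                 # start from the photocurrent
    for _ in range(n_iter):
        I = Iph - I0 * (np.exp((V + I * Rs) / a) - 1.0)
    return I

# Hypothetical parameters for a 36-cell module, for illustration only
V = np.linspace(0.0, 21.0, 50)
I = module_current(V, Iph=5.0, I0=1e-9, Rs=0.3, n=1.3)
```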
Text categorization refers to the process of grouping texts or documents into classes or categories according to their content. The text categorization process consists of three phases: preprocessing, feature extraction, and classification. In comparison with English, only a few studies have been done on categorizing and classifying Arabic text. For a variety of applications, such as text classification and clustering, Arabic text representation is a difficult task because the Arabic language is noted for its richness, diversity, and complicated morphology. This paper presents a comprehensive analysis and comparison of research from the last five years based on the dataset, year, algorithms, and the accuracy th…
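The three-phase pipeline can be illustrated with a tiny scikit-learn example; the corpus, labels, and the TF-IDF with Naive Bayes combination below are illustrative assumptions rather than any specific system surveyed in the paper:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny made-up Arabic corpus with made-up labels, purely for illustration
docs = ["الاقتصاد العراقي ينمو", "فاز الفريق بالمباراة",
        "ارتفاع أسعار النفط", "تأهل المنتخب إلى النهائي"]
labels = ["economy", "sport", "economy", "sport"]

# Feature extraction (TF-IDF) followed by classification (multinomial Naive Bayes)
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(docs, labels)
print(model.predict(["انخفاض أسعار النفط"]))   # expected: 'economy'
```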
Wildfire risk has increased globally during the past few years due to several factors. An efficient and fast response to wildfires is extremely important to reduce their damaging effects on humans and wildlife. This work introduces a methodology for designing an efficient machine learning system to detect wildfires using satellite imagery. A convolutional neural network (CNN) model is optimized to reduce the required computational resources. Because of the limited number of images containing fire and the effect of seasonal variations, an image augmentation process is used to develop adequate training samples that cover the changes in the forest's visual features and the seasonal wind direction at the study area during the fire season. The selected CNN model (Mob…
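A hedged sketch of such an augmentation-plus-lightweight-CNN setup in Keras is shown below; the specific augmentations, the MobileNetV2 backbone (the abstract's model name is truncated), and the binary fire/no-fire head are assumptions, not the paper's exact configuration:

```python
import tensorflow as tf

# Illustrative augmentation; the paper's exact transformations are not given here
augment = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal"),
    tf.keras.layers.RandomRotation(0.1),
    tf.keras.layers.RandomZoom(0.1),
])

# Lightweight backbone; MobileNetV2 is an assumed stand-in for the truncated model name
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False                                   # transfer learning: freeze backbone

model = tf.keras.Sequential([
    tf.keras.Input(shape=(224, 224, 3)),
    augment,
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1.0), # MobileNetV2 expects inputs in [-1, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),      # fire / no-fire output
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```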