The main aim of this study is to evaluate the remaining oil in previously produced zones, locate the water-productive zones, and look for bypassed oil behind casing in intervals that were not previously perforated. Initial water saturation was calculated from digitized open-hole logs using a cut-off value of 10% for irreducible water saturation. An integrated analysis of the thermal capture cross section (Sigma) and the carbon/oxygen (C/O) ratio was conducted and summarized under both well shut-in and flowing conditions. The logging passes ran through the Zubair sandstone formation in the North Rumaila oil field. Zones where both the Sigma and the C/O analyses show high remaining oil saturation, similar to the open-hole oil saturation, could be good oil zones that do not appear to be water flooded. Zones where the Sigma analysis shows high residual oil saturation close to the open-hole value while the C/O analysis shows medium residual oil saturation may have been fresh-water flooded to a certain extent and still retain some residual oil. If the C/O analysis shows low residual oil saturation, those zones were probably thoroughly fresh-water flooded. If both the Sigma and the C/O analyses show medium residual oil saturation, those zones were most probably saline-water flooded to a certain extent and still retain some residual oil.
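The four interpretation cases above can be sketched as a simple decision function. This is a minimal illustration only: the tolerance and the "medium" saturation band are hypothetical values, not thresholds given in the study.

```python
def classify_zone(sigma_so, co_so, openhole_so, tol=0.1):
    """Classify a zone from Sigma- and C/O-derived oil saturations.

    sigma_so, co_so, openhole_so: oil-saturation fractions (0..1).
    tol: hypothetical tolerance for 'close to the open-hole saturation'.
    """
    sigma_high = sigma_so >= openhole_so - tol
    co_high = co_so >= openhole_so - tol
    co_medium = 0.3 <= co_so < openhole_so - tol  # assumed 'medium' band
    if sigma_high and co_high:
        return "good oil zone, not water flooded"
    if sigma_high and co_medium:
        return "partially fresh-water flooded, residual oil remains"
    if co_so < 0.3:
        return "thoroughly fresh-water flooded"
    return "partially saline-water flooded, residual oil remains"
```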
Recently, the development and application of hydrological models based on Geographic Information Systems (GIS) has increased around the world. One of the most important applications of GIS is mapping the Curve Number (CN) of a catchment. In this research, three software tools, ArcView GIS 9.3 with ArcInfo, the Arc Hydro Tool, and the Geospatial Hydrologic Modeling Extension (HEC-GeoHMS) for ArcView GIS 9.3, were used to calculate the CN of the 19,210 ha Salt Creek (SC) watershed, located in Osage County, Oklahoma, USA. Multiple layers were combined and examined using the Environmental Systems Research Institute (ESRI) ArcMap 2009. These layers include a soil layer (Soil Survey Geographic database, SSURGO) and a 30 m x 30 m resolution Digital Elevation
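Whatever the GIS toolchain, the composite CN of a watershed reduces to an area-weighted average over the soil/land-use overlay polygons. A minimal sketch, with illustrative areas and CN values that are not the Salt Creek data:

```python
def composite_cn(polygons):
    """Area-weighted curve number for a watershed.

    polygons: list of (area_ha, cn) tuples from the soil/land-use overlay.
    """
    total_area = sum(area for area, _ in polygons)
    return sum(area * cn for area, cn in polygons) / total_area

# illustrative overlay result, not the SSURGO-derived data from the study
cn = composite_cn([(12000, 70), (7210, 85)])
```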
In this study, a fast block-matching search algorithm based on block descriptors and multilevel block filtering is introduced. The descriptors used are the mean and a set of centralized low-order moments. Hierarchical filtering and the MAE similarity measure were adopted to nominate the best similar blocks lying within the pool of neighboring blocks. As a next step after block nomination, the similarity of the mean and moments is used to classify the nominated blocks into one of three sub-pools, each representing a certain nomination priority level (i.e., most, less, and least). The main reason for introducing the nomination and classification steps is the significant reduction in the number of matching operations on the pixels belonging to the c
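The core idea, a cheap descriptor test filtering candidates before the expensive full-pixel MAE comparison, can be sketched as follows. This is a simplified illustration with a single mean descriptor and a hypothetical tolerance; the study uses additional moment descriptors and three priority sub-pools.

```python
import numpy as np

def mae(a, b):
    """Mean absolute error between two equally sized blocks."""
    return np.mean(np.abs(a.astype(float) - b.astype(float)))

def nominate(target, candidates, mean_tol=5.0):
    """Multilevel filtering: cheap mean test first, full MAE on survivors.

    target: 2-D block; candidates: list of 2-D blocks of the same shape.
    mean_tol: hypothetical threshold on the mean-descriptor difference.
    """
    t_mean = target.mean()
    survivors = [c for c in candidates if abs(c.mean() - t_mean) <= mean_tol]
    # the expensive per-pixel MAE is computed only for nominated blocks
    return min(survivors, key=lambda c: mae(target, c)) if survivors else None
```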
Plagiarism is described as using someone else's ideas or work without permission or acknowledgment. Using lexical and semantic text-similarity notions, this paper presents a plagiarism detection system for examining suspicious texts against available sources on the Web. The user can upload suspicious files in PDF or DOCX format. The system searches three popular search engines (Google, Bing, and Yahoo) for the source text and tries to identify the top five results for each search engine on the first retrieved page. The corpus is made up of the downloaded files and the scraped web-page text of the search engines' results. The corpus text and the suspicious documents are then encoded as vectors. For lexical plagiarism detection, the system will
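A common way to compare such vector encodings for lexical similarity is the cosine measure over term-frequency vectors; a minimal sketch, assuming simple whitespace tokenization (the paper's exact encoding and similarity measure may differ):

```python
from collections import Counter
import math

def cosine(text_a, text_b):
    """Cosine similarity between two bag-of-words term-frequency vectors."""
    ca, cb = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(ca[w] * cb[w] for w in ca)
    norm_a = math.sqrt(sum(v * v for v in ca.values()))
    norm_b = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0
```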
The goal of this research is to develop a numerical model that can be used to simulate the sedimentation process under two scenarios: first, with the flocculation unit on duty, and second, with the flocculation unit out of commission. The general equations of flow and sediment transport were solved using the finite difference method and then coded in Matlab. The difference in removal efficiency between the coded model and the operational model for each particle-size dataset was very small, with a difference of +3.01%, indicating that the model can be used to predict the removal efficiency of a rectangular sedimentation basin. The study also revealed
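To illustrate the explicit finite-difference idea in one dimension, the sketch below advances a settling-column concentration profile with an upwind scheme. This is a drastic simplification of the study's coupled flow and sediment-transport equations; the grid, settling velocity, and boundary treatment here are illustrative assumptions.

```python
def settle_column(c, vs, dz, dt, steps):
    """Explicit upwind finite-difference steps for dc/dt + vs*dc/dz = 0
    (z positive downward, clear water entering at the surface).
    Stable when the CFL number vs*dt/dz <= 1."""
    r = vs * dt / dz
    assert r <= 1.0, "CFL condition violated"
    c = list(c)
    for _ in range(steps):
        new = c[:]
        new[0] = c[0] - r * c[0]          # surface node: inflow is clear
        for i in range(1, len(c)):
            new[i] = c[i] - r * (c[i] - c[i - 1])
        c = new
    return c
```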
The aim of this research is to compare traditional and modern methods of obtaining the optimal solution, using dynamic programming and intelligent algorithms to solve project management problems.
It shows the possible ways in which these problems can be addressed, drawing on a schedule of interrelated and sequential activities. It clarifies the relationships between the activities to determine the beginning and end of each one, determines the duration and total cost of the project, estimates the time used by each activity, and identifies the objectives the project pursues through planning, implementation, and monitoring to stay within the assessed budget.
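The dynamic-programming view of such a schedule is a forward pass over the activity network: each activity's earliest finish is its duration plus the maximum finish time among its predecessors. A minimal critical-path sketch (the task data are illustrative, not from the study):

```python
def cpm_forward(tasks):
    """Forward-pass dynamic programming over an activity network.

    tasks: dict name -> (duration, [predecessor names]); assumed acyclic,
    with every predecessor listed before its successors.
    Returns earliest-finish times per activity and the project duration.
    """
    finish = {}
    for name, (duration, preds) in tasks.items():
        earliest_start = max((finish[p] for p in preds), default=0)
        finish[name] = earliest_start + duration
    return finish, max(finish.values())

# illustrative network: A precedes B and C, which both precede D
schedule = {"A": (3, []), "B": (2, ["A"]), "C": (4, ["A"]), "D": (1, ["B", "C"])}
```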
A superpixel can be defined as a group of pixels that share similar characteristics, which can be very helpful for image segmentation. It is generally a color-based segmentation, though other features such as texture and statistics can also be used. Many algorithms are available for superpixel segmentation, such as Simple Linear Iterative Clustering (SLIC) and Density-Based Spatial Clustering of Applications with Noise (DBSCAN). The SLIC algorithm essentially relies on choosing N random or regular seed points covering the image to be segmented. In this paper, a split-and-merge algorithm was used instead, to avoid having to determine the seed points' locations and number as well as other parameters. The overall results were better than those of the SLIC
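The split half of a split-and-merge scheme can be sketched as a recursive quadtree subdivision; this illustration uses intensity range as the homogeneity test, which is an assumption, since the paper does not specify its criterion here.

```python
import numpy as np

def quadtree_split(img, y, x, h, w, thresh, regions):
    """Recursive quadtree split: keep a block whole when its intensity
    range is within thresh, otherwise split it into four quadrants.
    Homogeneous leaf blocks are appended to regions as (y, x, h, w)."""
    block = img[y:y + h, x:x + w]
    if (block.max() - block.min() <= thresh) or h <= 1 or w <= 1:
        regions.append((y, x, h, w))
        return
    h2, w2 = h // 2, w // 2
    quadtree_split(img, y, x, h2, w2, thresh, regions)
    quadtree_split(img, y, x + w2, h2, w - w2, thresh, regions)
    quadtree_split(img, y + h2, x, h - h2, w2, thresh, regions)
    quadtree_split(img, y + h2, x + w2, h - h2, w - w2, thresh, regions)
```

A subsequent merge pass would then join adjacent leaf blocks whose descriptors are similar, yielding the final superpixels.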
Digital audio requires transmitting large amounts of audio information through the most common communication systems, which in turn leads to challenges in both storage and archiving. In this paper, an efficient audio compression scheme is proposed. It depends on a combined transform coding scheme consisting of: i) a bi-orthogonal (tap 9/7) wavelet transform to decompose the audio signal into low and multiple high sub-bands; ii) a DCT applied to the produced sub-bands to de-correlate the signal; iii) progressive hierarchical quantization of the combined-transform output, followed by traditional run-length encoding (RLE); iv) and finally LZW coding to generate the output bitstream.
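Of the stages above, the run-length step is simple enough to sketch without a signal-processing library; the wavelet, DCT, and quantization stages are omitted here. A minimal RLE encoder over the quantized coefficient stream:

```python
def rle_encode(seq):
    """Traditional run-length encoding: collapse runs into (value, count)."""
    runs = []
    for v in seq:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([v, 1])       # start a new run
    return [tuple(r) for r in runs]

# quantized transform coefficients tend to contain long zero runs,
# which is what makes RLE effective before the final LZW stage
encoded = rle_encode([0, 0, 0, 5, 5, 1])
```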