Reservoir permeability plays a crucial role in characterizing reservoirs and predicting the present and future production of hydrocarbon reservoirs. Well log data are a good tool for obtaining a continuous permeability curve over the entire oil well section. Nuclear magnetic resonance logging measurements are minimally influenced by lithology and offer significant benefits in interpreting permeability. The Schlumberger-Doll-Research model, which uses nuclear magnetic resonance logging, estimates permeability values accurately. The approach of this investigation is to apply artificial neural networks and core data to predict permeability in wells without a nuclear magnetic resonance log. The Schlumberger-Doll-Research permeability is used to train the model, and the model's predictions are validated against core permeability. Seven oil well logs were used as input parameters, and the model was constructed with Techlog software. The permeability predicted by the model was cross-plotted against the Schlumberger-Doll-Research permeability, giving a correlation coefficient of 94%, and it was validated against the core permeability of the same well with good agreement (R2 = 80%). The model was then used to forecast permeability in a well that did not have a nuclear magnetic resonance log, and the predicted permeability was cross-plotted against core permeability as a validation step, yielding a correlation coefficient of 77%. The lower match is attributed to data limitations, which demonstrates that precision improves as the amount of data used to train the model increases.
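As a rough illustration of this workflow, the sketch below trains a small neural network on well-log inputs against Schlumberger-Doll-Research permeability and then applies it to a well without a nuclear magnetic resonance log; the scikit-learn MLP, the seven log mnemonics, the file names, and the network size are illustrative assumptions, not the configuration actually built in Techlog.

```python
# Hypothetical sketch: train an ANN on well logs to predict SDR (NMR) permeability,
# then validate against core permeability. Column names and files are assumptions.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

LOG_INPUTS = ["GR", "RHOB", "NPHI", "DT", "RT", "PEF", "CALI"]   # assumed 7 input logs

train = pd.read_csv("well_with_nmr.csv")        # well that has an NMR log (SDR permeability)
X = train[LOG_INPUTS].values
y = np.log10(train["K_SDR"].values)             # permeability modelled in log space

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
)
model.fit(X_tr, y_tr)
print("R^2 vs. SDR permeability (hold-out):", model.score(X_te, y_te))

# Apply to a well without an NMR log and compare against core measurements.
blind = pd.read_csv("well_without_nmr.csv")
k_pred = 10 ** model.predict(blind[LOG_INPUTS].values)
r = np.corrcoef(np.log10(k_pred), np.log10(blind["K_CORE"].values))[0, 1]
print("Correlation with core permeability:", r)
```

Training in log-permeability space is a common choice because permeability spans several orders of magnitude.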
This research aims to clarify the importance of an accounting information system that uses artificial intelligence to detect earnings manipulation. The research problem stems from the widespread manipulation of earnings in economic entities, especially at the local level, exacerbated by the high rates of financial and administrative corruption in Iraq due to fraudulent accounting practices. Since earnings manipulation involves intentional fraudulent acts, preventive measures are necessary to detect and deter such practices. The main hypothesis of the research assumes that an accounting information system based on artificial intelligence cannot effectively detect the manipulation of profits in Iraqi economic entities. The researcher
The traditional centralized network management approach presents severe efficiency and scalability limitations in large-scale networks. The process of data collection and analysis typically involves huge transfers of management data to the manager, which consumes considerable network throughput and creates bottlenecks at the manager side. These problems are addressed using agent technology as a solution to distribute the management functionality over the network elements. The proposed system consists of a server agent that works together with client agents to monitor the logging (off, on) of the client computers and which user is working on each of them. A file system watcher mechanism is used to indicate any change in files. The results were presented
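As a hedged sketch of the client-side role described above (not the system's actual code), the following uses the Python watchdog package to report the logged-on user and to flag file-system changes; the package choice, watched path, and print-based reporting are assumptions.

```python
# Hypothetical client-agent sketch: report the logged-in user and watch a
# directory for file changes (the "file system watcher" role). The watchdog
# package and the watched path are assumptions for illustration only.
import getpass
import time
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

WATCH_PATH = "/home/shared"   # assumed directory monitored on the client

class ChangeReporter(FileSystemEventHandler):
    def on_any_event(self, event):
        # In the real system this report would be sent to the server agent.
        print(f"[{getpass.getuser()}] {event.event_type}: {event.src_path}")

if __name__ == "__main__":
    print(f"Client agent started; user logged on: {getpass.getuser()}")
    observer = Observer()
    observer.schedule(ChangeReporter(), WATCH_PATH, recursive=True)
    observer.start()
    try:
        while True:
            time.sleep(1)          # keep the agent alive
    except KeyboardInterrupt:
        observer.stop()
    observer.join()
```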
This paper explores VANET topics: architecture, characteristics, security, routing protocols, applications, simulators, and 5G integration. We update, edit, and summarize some of the published data as we analyze each notion. For ease of comprehension and clarity, we present part of the data as tables and figures. This survey also raises issues for potential future research, such as how to integrate VANET with a 5G cellular network and how to use trust mechanisms to enhance security, scalability, effectiveness, and other VANET features and services. In short, this review may aid academics and developers in choosing the key VANET characteristics for their objectives in a single document.
In recent years there has been a profound evolution in computer science and technology that has touched several fields. Within this evolution, Content-Based Image Retrieval (CBIR) belongs to the image processing field. Several image retrieval methods can now extract features easily as a result of progress in image retrieval techniques. Finding efficient image retrieval tools has therefore become an extensive area of concern for researchers. Image retrieval refers to a system used to search for and retrieve images from a huge database of digital images. In this paper, the author focuses on proposing a new method for image retrieval. For multiple representations of the image in a Convolutional Neural Network (CNN),
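A minimal sketch of the general CNN-based retrieval idea, assuming a pretrained torchvision ResNet-18 as a generic feature extractor and cosine similarity for ranking; the backbone, layer choice, and image paths are assumptions and do not reproduce the paper's proposed method.

```python
# Hypothetical CBIR sketch: use a pretrained CNN as a feature extractor and rank
# database images by cosine similarity to the query. The backbone (ResNet-18)
# and image paths are illustrative assumptions.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()          # drop the classifier; keep 512-d features
backbone.eval().to(device)

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def embed(path: str) -> torch.Tensor:
    img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0).to(device)
    feat = backbone(img).squeeze(0)
    return feat / feat.norm()              # L2-normalize so dot product = cosine similarity

database = ["db/img_001.jpg", "db/img_002.jpg", "db/img_003.jpg"]   # assumed paths
db_feats = torch.stack([embed(p) for p in database])

query_feat = embed("query.jpg")
scores = db_feats @ query_feat             # cosine similarities
for idx in scores.argsort(descending=True):
    print(database[idx], float(scores[idx]))
```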
... Show Morein this paper the notion of threshold relations by using resemblance relation are introduced to get a similarity relation from a resemnblance relation R
This study aims at discussing the theoretical and practical aspects of reading comprehension to help teachers of Arabic. The study shows that students have a general weakness in reading comprehension. The researcher handles issues related to reading comprehension and to practicing exercises based on training strategies such as dialogue, discussion, discussion questions, continuous training, and group work. These skills are used to analyze a poem to assess the level of the students' reading comprehension and to develop their skills. The study answers two questions:
1- What is reading comprehension from a theoretical perspective?
2- How can we develop the skills of reading comprehension from the practical perspective?
The purpose of this paper is to apply robustness in linear programming (LP) to overcome the problem of uncertainty in constraint parameters and to find the robust optimal solution that maximizes the profits of the general productive company of vegetable oils for the year 2019. This is done by modifying a linear programming model in which some parameters have uncertain values and processing it with the robust counterpart of linear programming, so as to obtain results that are robust against the random changes occurring in the uncertain values of the problem, assuming these values belong to the uncertainty set and selecting the values that cause the worst results and to depend buil
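For illustration only, the textbook robust counterpart of a single constraint under interval (box) uncertainty shows the idea of guarding against the worst values in the uncertainty set; this generic form is an assumption and not the company's actual model:

```latex
% Textbook (Soyster-style) illustration of a robust counterpart, not the actual model.
% Uncertain constraint with interval coefficients and nonnegative variables:
\[
  \sum_{j} a_j x_j \le b
  \quad \text{for all } a_j \in [\bar a_j - \hat a_j,\; \bar a_j + \hat a_j],
  \qquad x_j \ge 0 .
\]
% The worst case places every coefficient at its upper bound, so the constraint is
% replaced by the deterministic robust counterpart
\[
  \sum_{j} (\bar a_j + \hat a_j)\, x_j \le b ,
\]
% which protects the solution against the values in the uncertainty set that cause
% the worst result.
```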
Enhanced-quality image fusion is proposed using new algorithms for auto-focus image fusion. The first algorithm is based on the standard deviation as the criterion for combining two images. The second algorithm concentrates on the contrast at edge points and a correlation method as the criterion for the quality of the resulting image. This algorithm considers three blocks of different sizes in a homogeneous region and moves each block 10 pixels within the same region. These blocks examine the statistical properties of the region and automatically decide the next step. The resulting combined image is better in contrast
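As a rough sketch of the first (standard-deviation) criterion only, assuming two registered grayscale inputs and a fixed block size; the block size, tie handling, and NumPy implementation are assumptions rather than the paper's algorithm.

```python
# Hypothetical sketch of the standard-deviation fusion criterion: for each block,
# keep the pixels from whichever source image is locally sharper (higher std-dev).
# Block size, images, and the grayscale assumption are illustrative only.
import numpy as np

def fuse_by_std(img_a: np.ndarray, img_b: np.ndarray, block: int = 16) -> np.ndarray:
    assert img_a.shape == img_b.shape, "images must be registered and equal-sized"
    fused = np.empty_like(img_a)
    h, w = img_a.shape
    for i in range(0, h, block):
        for j in range(0, w, block):
            a = img_a[i:i + block, j:j + block]
            b = img_b[i:i + block, j:j + block]
            # higher local standard deviation ~ more in-focus detail
            fused[i:i + block, j:j + block] = a if a.std() >= b.std() else b
    return fused

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    near_focus = rng.random((128, 128))      # stand-ins for the two auto-focus inputs
    far_focus = rng.random((128, 128))
    print(fuse_by_std(near_focus, far_focus, block=16).shape)
```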