Machine learning techniques for addressing missing well-log data have attracted considerable interest recently, as the oil and gas sector pursues new approaches to improve data interpretation and reservoir characterization. For wells that have been in operation for many years, conventional measurement techniques frequently face availability challenges, including missing well-log data, cost considerations, and precision issues. This study aims to enhance reservoir characterization by automating well-log generation with machine-learning techniques, among them multi-resolution graph-based clustering and the similarity threshold method. Our methodology shows a notable improvement in the precision and effectiveness of well-log predictions. Standard well logs from a reference well were used to train the machine-learning models, and conventional wireline logs served as input to estimate facies for unclassified wells lacking core data. R-squared analysis and goodness-of-fit tests provide a numerical assessment of model performance, strengthening the validation process. The multi-resolution graph-based clustering and similarity threshold approaches achieved an accuracy of nearly 98%. Applied to data from eighteen wells, these techniques produced precise results, demonstrating the effectiveness of our approach in enhancing the reliability and quality of well-log generation.
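The R-squared validation step mentioned above can be sketched as follows. This is a minimal illustration of the coefficient of determination, not the authors' actual validation pipeline; the function name and inputs are illustrative.

```python
def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot.
    A perfect prediction gives 1.0; predicting the mean gives 0.0."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))  # residual sum of squares
    ss_tot = sum((t - mean) ** 2 for t in y_true)               # total sum of squares
    return 1.0 - ss_res / ss_tot
```

Values near 1 indicate that the predicted logs explain most of the variance in the measured logs.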
The availability of different processing levels for satellite images makes it important to measure their suitability for classification tasks. This study investigates the impact of the Landsat data processing level on the accuracy of land cover classification using a support vector machine (SVM) classifier. Classification accuracy for Landsat 8 (LS8) and Landsat 9 (LS9) data varies notably across processing levels. For LS9, Collection 2 Level 2 (C2L2) achieved the highest accuracy (86.55%) with the polynomial kernel of the SVM classifier, surpassing the Fast Line-of-Sight Atmospheric Analysis of Spectral Hypercubes (FLAASH) product (85.31%) and Collection 2 Level 1 (C2L1) (84.93%). The LS8 data exhibit similar behavior.
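The polynomial kernel that gave the best SVM accuracy above computes a similarity of the form K(x, y) = (gamma * x·y + coef0)^degree between two pixel feature vectors. A minimal sketch of that kernel function (the parameter names follow common SVM conventions and are assumptions, not values from the study):

```python
def poly_kernel(x, y, degree=3, gamma=1.0, coef0=1.0):
    """Polynomial kernel used by SVM classifiers:
    K(x, y) = (gamma * <x, y> + coef0) ** degree."""
    dot = sum(a * b for a, b in zip(x, y))  # inner product of feature vectors
    return (gamma * dot + coef0) ** degree
```

In a full classifier these kernel values form the Gram matrix over the training pixels; the SVM then separates land-cover classes in the implicit higher-dimensional feature space.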
The study aimed to determine the effect of the energy of the north and south magnetic poles on the growth of bacteria isolated from cases of tooth decay. Sixty-eight swabs were collected from the surfaces of decayed teeth, and Staphylococcus aureus was detected.
The research aims to achieve a set of objectives, the most important of which is determining the extent to which the auditors of the research sample in the Federal Bureau of Financial Supervision adhere to the requirements of the quality control system according to Iraqi Audit Manual No. 7. The researcher seeks to test the main research hypothesis and its sub-hypotheses. To achieve this, a questionnaire was designed with Google Forms and distributed electronically to the members of the research sample, and the results were analysed with the statistical package SPSS. In light of the applied
This paper studies investment project evaluation under conditions of uncertainty. Evaluating an investment project under risk and uncertainty can be carried out with various methods and techniques; the best known are the risk-adjusted discount rate, the certainty equivalent method, sensitivity analysis, and simulation. The objective of this study is to use sensitivity analysis to evaluate a glass bottles project in Anbar province under conditions of risk and uncertainty.
After applying sensitivity analysis, we found that the glass bottles project is sensitive to the following factors: cash flow, the cost of investment, and the pro
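The one-at-a-time sensitivity analysis described above can be sketched in a few lines: compute a base-case net present value, then perturb each factor (cash flow, investment cost) and observe the change. The numbers here are purely illustrative, not the project's data.

```python
def npv(rate, cashflows, investment):
    """Net present value: -investment + sum of discounted annual cash flows."""
    return -investment + sum(cf / (1 + rate) ** t
                             for t, cf in enumerate(cashflows, start=1))

# Hypothetical base case (illustrative figures only)
base_rate, base_cfs, base_inv = 0.10, [300.0] * 5, 1000.0
base = npv(base_rate, base_cfs, base_inv)

# Perturb one factor at a time by 10% and recompute NPV
scenarios = {
    "cash flow -10%":  npv(base_rate, [cf * 0.9 for cf in base_cfs], base_inv),
    "investment +10%": npv(base_rate, base_cfs, base_inv * 1.1),
}
```

A factor is judged sensitive when a small perturbation produces a large swing in NPV relative to the base case.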
Abstract
Objective: this study aims to improve the transdermal permeability of methotrexate using eucalyptus oil, olive oil, and peppermint oil as penetration enhancers.
Method: eucalyptus oil (2% and 4%), peppermint oil (2% and 4%), and olive oil (2% and 4%) were used as natural enhancers to improve the transdermal permeability of methotrexate via a gel formulation. The gel was subjected to several physicochemical tests. In-vitro release and permeability studies of the drug were carried out with a Franz diffusion cell across a synthetic membrane, and the release kinetics were modelled with the Korsmeyer-Peppas equation.
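The Korsmeyer-Peppas model fits the fraction of drug released as Mt/Minf = k * t^n, so taking logarithms gives a straight line whose slope is the release exponent n. A minimal sketch of that log-log least-squares fit (function name and sample data are illustrative, not from the study):

```python
import math

def korsmeyer_peppas_fit(times, fractions):
    """Fit log(Mt/Minf) = log(k) + n*log(t) by ordinary least squares.
    Returns (k, n): release constant and release exponent."""
    xs = [math.log(t) for t in times]
    ys = [math.log(f) for f in fractions]
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    n = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))     # slope = release exponent
    k = math.exp(my - n * mx)                  # intercept gives log(k)
    return k, n
```

For a thin film, an exponent n near 0.5 suggests Fickian diffusion, while larger values indicate anomalous or case-II transport.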
Result: the results demonstrate that a gel that is safe, non-irritant, does not cause necrosis to rats' skin, and is stable for up to 60 days was successfully formulated.
The research problem is summarized in what may be called the accumulating experience of failure in comprehending study material: students come to fear repeated failure in the future and therefore resort to blind rote memorization of the material, which is harmful because it leads to forgetting later. Another side of the research problem is that many contradictory research results exist concerning learning styles, which imposes the necessity of finding results that lessen this contradiction. The importance of the research lies in the importance of the subject under study, in that the researcher (to the best of her knowledge) did not find a thesis tackling the subject of learning styles among distinguished students.
In this study, we briefly review the ARIMA(p, d, q), EWMA, and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation structure of the data. We consider recursive estimation and prediction algorithms based on Bayesian and Kalman filtering (KF) techniques for correlated observations. We investigate the effect on the mean squared error (MSE) of these procedures and compare them using generated data.
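The recursive Bayes/Kalman prediction step can be sketched for the simplest DLM, the local-level model (observation = level + noise, level follows a random walk). This is a generic textbook recursion offered as illustration, not the study's exact model; variance values are assumptions.

```python
def kalman_local_level(ys, V=1.0, W=0.1, m0=0.0, C0=1e6):
    """Kalman filter for the local-level DLM:
        y_t  = mu_t + v_t,      v_t ~ N(0, V)   (observation)
        mu_t = mu_{t-1} + w_t,  w_t ~ N(0, W)   (state evolution)
    Returns (filtered means, one-step-ahead forecasts)."""
    m, C = m0, C0                # prior mean and variance of the level
    means, forecasts = [], []
    for y in ys:
        a, R = m, C + W          # predict: prior for mu_t
        forecasts.append(a)      # one-step-ahead forecast of y_t
        Q = R + V                # forecast variance
        K = R / Q                # Kalman gain
        m = a + K * (y - a)      # update mean with forecast error
        C = (1 - K) * R          # update variance
        means.append(m)
    return means, forecasts
```

The MSE comparison in the study amounts to accumulating the squared one-step forecast errors `(y - a)**2` under each procedure on the same generated series.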
Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively, so it makes sense to pursue research on algorithms that use the network most efficiently. It is also important to consider security, since the transmitted data are vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same set of data, performing the two operations simultaneously. This is achieved by embedding encryption into the compression algorithm, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p
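The paper's module embeds encryption inside the entropy coder itself; as a simpler point of comparison, the sequential combination (compress, then encrypt) can be sketched as below. The XOR keystream here is a deliberately insecure toy used only to show why the order matters; it is not the paper's scheme or a real cipher.

```python
import hashlib
import zlib

def keystream(key: bytes, n: int) -> bytes:
    """Toy keystream from counter-mode SHA-256 (illustration only, NOT secure)."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def compress_encrypt(data: bytes, key: bytes) -> bytes:
    """Compress first, then XOR-encrypt: ciphertext looks random,
    so encrypting before compressing would destroy compressibility."""
    packed = zlib.compress(data)
    ks = keystream(key, len(packed))
    return bytes(a ^ b for a, b in zip(packed, ks))

def decrypt_decompress(blob: bytes, key: bytes) -> bytes:
    ks = keystream(key, len(blob))
    return zlib.decompress(bytes(a ^ b for a, b in zip(blob, ks)))
```

Merging the two stages into one pass over the data, as the paper proposes, avoids buffering the intermediate compressed stream.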