Machine learning techniques for addressing missing well-log data have attracted considerable interest recently, especially as the oil and gas sector pursues novel approaches to improve data interpretation and reservoir characterization. For wells that have been in operation for many years, however, conventional measurement techniques frequently face challenges of availability, including missing well-log data, cost, and precision. The objective of this study is to enhance reservoir characterization by automating well-log generation using machine-learning techniques, among them multi-resolution graph-based clustering (MRGC) and the similarity threshold method. Our methodology shows a notable improvement in the precision and effectiveness of well-log predictions. Standard well logs from a reference well were used to train the machine learning models, and conventional wireline logs were used as input to estimate facies for unclassified wells lacking core data. R-squared analysis and goodness-of-fit tests provide a numerical assessment of model performance, strengthening the validation process. The multi-resolution graph-based clustering and similarity threshold approaches achieved an accuracy of nearly 98%. Applying these techniques to data from eighteen wells produced precise results, demonstrating the effectiveness of our approach in enhancing the reliability and quality of well-log production.
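The abstract does not specify the regression model, but the R-squared validation step it mentions is standard; a minimal sketch (the depth samples and gamma-ray values below are hypothetical, not from the study) is:

```python
# Hypothetical sketch: R-squared (coefficient of determination) used to
# validate predicted well-log values against measured ones. The log type
# and values are illustrative, not the study's data.

def r_squared(measured, predicted):
    """R^2 = 1 - SS_res / SS_tot for paired log samples."""
    mean = sum(measured) / len(measured)
    ss_tot = sum((m - mean) ** 2 for m in measured)
    ss_res = sum((m - p) ** 2 for m, p in zip(measured, predicted))
    return 1.0 - ss_res / ss_tot

# Example: gamma-ray readings (API units) at five depths of a test well.
measured  = [45.0, 60.0, 75.0, 80.0, 95.0]
predicted = [44.0, 62.0, 74.0, 79.0, 96.0]
print(round(r_squared(measured, predicted), 3))  # -> 0.995
```

A value near 1 indicates the predicted log closely tracks the measured one, which is the goodness-of-fit criterion the study reports.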
The progress of science in all its branches and levels has brought great civilizational change to modern societies. This change is the result of the vast growth of knowledge, the increasing number of students, and rising community awareness of the importance of education in schools and universities. It has therefore become necessary for educators to view science from a new perspective, based on the scientific development of curricula, teaching methods, and educational media, and of the classroom environment as a whole. The use of computers and the Internet in education has led to the emergence of the term educational technology, which relies on modern technology to deliver educational content. This research examines e-learning applications according to the levels of STEM literacy among physics teachers in the secondary stage. The sample consists of 400 teachers, 200 males (50%) and 200 females (50%), distributed over six directorates of education in Baghdad governorate on both the Rusafa and Karkh sides. To meet the research goals, the researcher built a scale of e-learning applications according to the levels of STEM literacy, consisting of 50 items distributed over five levels. The face validity of the scale was verified, and its reliability was established by extracting the reliability coefficient through the internal-consistency method (Cronbach's alpha). The following statistical tools were used: the Pearson correlation coefficient, …
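The reliability coefficient mentioned above, Cronbach's alpha, is computed from the item variances and the variance of the total score. A minimal sketch (the response matrix below is hypothetical, not the study's data):

```python
# Hypothetical sketch of Cronbach's alpha (internal consistency).
# Rows of 'responses' are respondents, columns are scale items;
# the numbers are illustrative, not the study's data.

def cronbach_alpha(responses):
    k = len(responses[0])                      # number of items
    n = len(responses)                         # number of respondents

    def variance(values):                      # population variance
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / len(values)

    item_vars = [variance([row[i] for row in responses]) for i in range(k)]
    total_var = variance([sum(row) for row in responses])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

responses = [
    [4, 5, 4, 4],
    [3, 4, 3, 3],
    [5, 5, 4, 5],
    [2, 3, 2, 2],
]
print(round(cronbach_alpha(responses), 3))  # -> 0.983
```

Values above roughly 0.7 are conventionally taken to indicate acceptable internal consistency for a scale of this kind.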
In this study, the harvest of maize silage sown with the cross double-row method was tested with a single-row disc silage machine at two different PTO settings (540 and 540E min-1) and two different working speeds, V1 and V2 (1.8 and 2.5 km h-1). The possibilities of harvesting with a single-row machine were evaluated, and performance characteristics such as hourly fuel consumption, field-product fuel consumption, and PTO power consumption were determined in the trials. The best results in terms of hourly fuel consumption and PTO power consumption were obtained with the 540E PTO setting and the V1 working speed. The best field-product fuel consumption was obtained with the V2 working speed and the 540E PTO setting. As …
Finding orthogonal matrices of different sizes is complex and important because they can be used in applications such as image processing and communications (e.g., CDMA and OFDM). In this paper we introduce a new method for finding orthogonal matrices using tensor products of two or more orthogonal matrices of real and imaginary numbers, and apply it to image and communication-signal processing. The output matrices are orthogonal as well, and processing with our new method is simple compared with classical methods that rely on basic proofs. The results on communication signals and images are acceptable, but further research is needed.
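The key property used above, that the tensor (Kronecker) product of orthogonal matrices is itself orthogonal, can be checked numerically; a small sketch with illustrative matrices (not the paper's constructions):

```python
# Sketch: the Kronecker product of two orthogonal matrices is orthogonal,
# i.e. (A (x) B)^T (A (x) B) = I. The matrices here are illustrative examples.
import math

def kron(a, b):
    """Kronecker product of two square matrices given as lists of lists."""
    m = len(b)
    n = len(a) * m
    return [[a[i // m][j // m] * b[i % m][j % m]
             for j in range(n)] for i in range(n)]

def is_orthogonal(mat, tol=1e-9):
    """Check that M^T M equals the identity within a tolerance."""
    n = len(mat)
    for i in range(n):
        for j in range(n):
            dot = sum(mat[k][i] * mat[k][j] for k in range(n))
            if abs(dot - (1.0 if i == j else 0.0)) > tol:
                return False
    return True

s = 1 / math.sqrt(2)
A = [[s, s], [s, -s]]          # 2x2 Hadamard-type orthogonal matrix
B = [[0.0, 1.0], [-1.0, 0.0]]  # 90-degree rotation, also orthogonal
C = kron(A, B)                 # 4x4 tensor product
print(is_orthogonal(C))        # -> True: the product stays orthogonal
```

This is why the method scales: larger orthogonal matrices can be generated from small, easily verified ones without re-proving orthogonality each time.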
Cloud storage provides scalable, low-cost resources that achieve economies of scale through a cross-user architecture. As the amount of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. Data storage is the most important cloud service, and to protect the privacy of data owners, data are stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which is crucial for data storage. Traditional deduplication schemes cannot operate on encrypted data, and existing solutions for encrypted-data deduplication suffer from security weaknesses. This paper proposes combining compressive sensing and video deduplication to maximize …
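The abstract does not detail its scheme, but the standard workaround that lets a server deduplicate ciphertext is convergent encryption, where the key is derived from the content itself so identical plaintexts always produce identical ciphertexts. A minimal sketch (a hash-derived XOR keystream stands in for a real cipher; this illustrates the general idea, not the paper's method):

```python
# Sketch of convergent encryption for encrypted deduplication: the key is
# a hash of the content, so identical chunks from different users yield
# identical ciphertexts and the cloud can deduplicate without seeing
# plaintext. XOR with a hash-derived keystream stands in for a real block
# cipher; this is illustrative, not the paper's scheme.
import hashlib

def convergent_encrypt(data):
    key = hashlib.sha256(data).digest()            # content-derived key
    stream = b""
    counter = 0
    while len(stream) < len(data):                 # hash-based keystream
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    cipher = bytes(d ^ s for d, s in zip(data, stream))
    return key, cipher

chunk = b"same video chunk uploaded by two users"
key1, c1 = convergent_encrypt(chunk)
key2, c2 = convergent_encrypt(chunk)
print(c1 == c2)   # -> True: identical ciphertexts, so the server deduplicates
```

The trade-off is that convergent encryption leaks chunk equality by design, which is one source of the security weaknesses the abstract alludes to.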
A major disadvantage of dose reconstruction by means of thermoluminescence (TL) is that during readout of any TL material exposed to ionizing radiation (i.e., during measurement of the glow curve), the radiation-induced signal is lost. Applying the photo-transferred thermoluminescence (PTTL) phenomenon may offer a solution to this problem. In PTTL, the residual signal that is not destroyed by conventional TL readout (because it originates from deeper electron traps) can be read out through simultaneous stimulation by UV light and heating, allowing information about the absorbed dose to be obtained in a second run. The present paper describes the application of PTTL to emergency dose assessment. For …
Electronic learning was used as a substitute method of learning during the COVID-19 pandemic to deliver scientific material and perform student assessment; this study aimed to investigate academic staff opinions of electronic education. A cross-sectional study was conducted with a web-based questionnaire distributed to academic staff in different medical colleges in Iraq. After de-identification, data were collected and analyzed with statistical software to determine the significance of differences between variables. A total of 256 participants were enrolled in the study: 83% were unsatisfied with or neutral toward online learning, and 80% reported poor benefit from delivery of practical electronic knowledge versus 25% for theoretical sessions, with a significant difference …
Background: For patients with coronavirus disease (COVID-19), continuous positive airway pressure (CPAP) has been considered a useful treatment. The goal of CPAP therapy is to enhance oxygenation, relieve breathing-muscle strain, and possibly avoid intubation. If applied in a medical ward with a multidisciplinary approach, CPAP has the potential to reduce the burden on intensive care units. Methods: A cross-sectional design was conducted at the ALSHEFAA center for crises in Baghdad. Data were collected through a questionnaire filled in by 80 nurses working in the Respiratory Isolation Unit, chosen by non-probability (purposive) selection. The researcher then used an observational checklist to evaluate the nurses' practice. The data were analyzed us…
In this study, the response and behavior of machine foundations resting on dry and saturated sand were investigated experimentally. In order to investigate the response of soil and footing to steady-state dynamic loading, a physical model was manufactured that could simulate a steady-state harmonic load at different operating frequencies. A total of 84 physical model tests were performed. The parameters taken into consideration include loading frequency, footing size, and different soil conditions. The footing parameters were related to the size of the rectangular footing and the depth of embedment. Two sizes of rectangular steel model footing were used: 100 × 200 × 12.5 mm and 200 × 400 × 5.0 mm.
The support vector machine (SVM) is a type of supervised learning model that can be used for classification or regression depending on the dataset. SVM classifies data points by determining the best hyperplane between two or more groups. Working with enormous datasets, however, can lead to a variety of issues, including poor accuracy and long runtimes. In this research, SVM was extended by applying several kernel transformations: linear, polynomial, radial basis, and multi-layer kernels. The non-linear SVM classification model was illustrated and summarized in an algorithm using kernel tricks. The proposed method was examined using three simulation datasets with different sample …
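The kernel trick behind the transformations named above replaces the inner product x·y with a kernel function k(x, y), so the linear separator effectively operates in a higher-dimensional feature space. A small sketch of the four kernels (hyperparameter values are illustrative defaults, not the study's settings):

```python
# Sketch of the kernel functions named above. Hyperparameters (degree,
# gamma, coef0) are illustrative defaults, not the study's settings.
import math

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def linear_kernel(x, y):
    return dot(x, y)

def polynomial_kernel(x, y, degree=2, coef0=1.0):
    return (dot(x, y) + coef0) ** degree

def rbf_kernel(x, y, gamma=0.5):
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)     # radial basis function

def sigmoid_kernel(x, y, gamma=0.1, coef0=0.0):
    # multi-layer (multilayer-perceptron / sigmoid) kernel
    return math.tanh(gamma * dot(x, y) + coef0)

x, y = [1.0, 2.0], [2.0, 0.5]
print(linear_kernel(x, y))        # -> 3.0
print(polynomial_kernel(x, y))    # -> (3 + 1)^2 = 16.0
print(round(rbf_kernel(x, y), 4))
```

Because each kernel only needs pairwise evaluations, the same SVM algorithm handles all four cases; the choice of kernel determines the shape of the decision boundary.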