The investigation of machine learning techniques for addressing missing well-log data has garnered considerable interest recently, especially as the oil and gas sector pursues novel approaches to improve data interpretation and reservoir characterization. However, for wells that have been in operation for several years, conventional measurement techniques frequently face availability challenges, including missing well-log data, high cost, and limited precision. This study aims to enhance reservoir characterization by automating well-log generation using machine-learning techniques, among them multi-resolution graph-based clustering and the similarity threshold method. By using cutti…
Cloud storage provides scalable, low-cost resources that achieve economies of scale through a cross-user architecture. As the volume of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. Data storage is the most important cloud service, and to protect data owners' privacy, data are stored in the cloud in encrypted form. However, encrypted data introduces new challenges for cloud data deduplication, which is crucial for data storage: traditional deduplication schemes cannot operate on encrypted data, and existing encrypted-data deduplication solutions suffer from security weaknesses. This paper proposes a combined compressive sensing and video deduplication scheme to maximize…
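The core deduplication idea the abstract builds on can be sketched as a content-addressed store: chunks are fingerprinted by a cryptographic hash, and a chunk already present is never stored twice. This is a minimal illustrative sketch of generic deduplication, not the paper's compressive-sensing video scheme; the class and names are hypothetical.

```python
import hashlib

class DedupStore:
    """Minimal content-addressed store: identical chunks are kept once."""

    def __init__(self):
        self.chunks = {}    # fingerprint -> chunk bytes (stored once)
        self.refcount = {}  # fingerprint -> number of logical owners

    def put(self, data: bytes) -> str:
        fp = hashlib.sha256(data).hexdigest()  # chunk fingerprint
        if fp not in self.chunks:              # store only unseen chunks
            self.chunks[fp] = data
        self.refcount[fp] = self.refcount.get(fp, 0) + 1
        return fp

    def get(self, fp: str) -> bytes:
        return self.chunks[fp]

store = DedupStore()
a = store.put(b"frame-0001")
b = store.put(b"frame-0001")  # duplicate upload: no new storage consumed
assert a == b and len(store.chunks) == 1 and store.refcount[a] == 2
```

Note that this only works on plaintext; as the abstract points out, once each user encrypts with a different key, identical plaintexts yield different ciphertexts and the fingerprints no longer collide, which is exactly the difficulty encrypted deduplication schemes must address.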
The main challenge is to protect the environment from future deterioration caused by pollution and the depletion of natural resources. One of the most important problems to address, and whose negative impact must be eliminated, is solid waste. Solid waste is a double-edged sword depending on how it is handled: neglecting it creates a serious environmental risk through water, air, and soil pollution, while handling it properly turns it into an important resource for preserving the environment. Accordingly, the proper management of solid waste, through reuse or recycling, is the most important factor. Attention has therefore turned to using solid waste in different ways, the most common being its use as an alternative…
Abstract. Full-waveform airborne laser scanning data has shown its potential to enhance available segmentation and classification approaches through the additional information it provides. However, this additional information cannot directly yield a valid physical representation of surface features, because many variables affect the backscattered energy as it travels between the sensor and the target; in effect, this produces a mismatch between signals from overlapping flightlines. Direct use of this information is therefore not recommended without adopting a comprehensive radiometric calibration strategy that accounts for all these effects. This paper presents a practical and reliable radiometric calibration r…
The influx of data in bioinformatics is primarily in the form of DNA, RNA, and protein sequences, which places a significant burden on scientists and computers. Some genomics studies depend on clustering techniques to group similarly expressed genes into one cluster. Clustering is a type of unsupervised learning that divides unlabeled data into clusters; it partitions an input space into several homogeneous zones and can be achieved with a variety of algorithms, the k-means and fuzzy c-means (FCM) algorithms being two examples. This study used three models to cluster a brain tumor dataset. The first model uses FCM, whic…
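The FCM algorithm the abstract mentions can be sketched in a few lines: each point gets a fuzzy membership in every cluster, and centroids and memberships are updated alternately. This is a generic illustration of the algorithm on synthetic data, not the study's brain-tumor pipeline; the fuzzifier m = 2 and the toy data are assumptions.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means: alternate weighted-centroid and membership updates."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per point
    for _ in range(iters):
        W = U ** m                             # fuzzified weights
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.fmax(d, 1e-10)                  # guard against division by zero
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U

# two well-separated synthetic blobs around 0 and 10
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(10, 0.1, (20, 2))])
centers, U = fuzzy_c_means(X, c=2)
```

Unlike k-means, the membership matrix `U` lets a point belong partially to several clusters, which is often argued to suit gene-expression data where group boundaries are soft.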
Attacks on data transferred over a network happen millions of times a day. To address this problem, a secure scheme is proposed for protecting data in transit. The proposed scheme uses two techniques to guarantee secure transfer of a message: the message is first encrypted, and then it is hidden in a video cover. The encryption technique is the RC4 stream cipher, which increases the message's confidentiality, while the least significant bit (LSB) embedding algorithm is improved by adding an additional layer of security. The improvement to the LSB method replaces the usual sequential selection with random selection of the frames and the pixels wit…
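The two stages described above, RC4 encryption followed by LSB embedding at randomized rather than sequential positions, can be sketched as follows. This is a minimal illustration under assumed details (a byte array stands in for the video frames, and a seeded shuffle stands in for the paper's random selection mechanism); it is not the authors' exact implementation.

```python
import random

def rc4(key: bytes, data: bytes) -> bytes:
    """RC4 stream cipher; encryption and decryption are the same operation."""
    S = list(range(256))
    j = 0
    for i in range(256):                       # key-scheduling (KSA)
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    out, i, j = bytearray(), 0, 0
    for byte in data:                          # keystream generation (PRGA)
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

def embed(cover: bytearray, payload: bytes, seed: int) -> None:
    """Write payload bits into LSBs at seed-determined random positions."""
    bits = [(b >> k) & 1 for b in payload for k in range(8)]
    pos = list(range(len(cover)))
    random.Random(seed).shuffle(pos)           # random, not sequential, order
    for p, bit in zip(pos, bits):
        cover[p] = (cover[p] & 0xFE) | bit

def extract(cover: bytes, n_bytes: int, seed: int) -> bytes:
    """Read LSBs back in the same seed-determined order."""
    pos = list(range(len(cover)))
    random.Random(seed).shuffle(pos)
    bits = [cover[p] & 1 for p in pos[:n_bytes * 8]]
    return bytes(sum(bits[i * 8 + k] << k for k in range(8))
                 for i in range(n_bytes))

message, key = b"secret", b"k3y"
cover = bytearray(range(200))                  # stand-in for video pixel bytes
cipher = rc4(key, message)
embed(cover, cipher, seed=42)
recovered = rc4(key, extract(bytes(cover), len(cipher), seed=42))
assert recovered == message
```

The extra security layer comes from the shuffle seed acting as a second shared secret: without it, an attacker who suspects LSB embedding still does not know which positions carry payload bits.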
This study compared in vitro the microleakage of a new low-shrink silorane-based posterior composite (Filtek™ P90) and two methacrylate-based composites: a packable posterior composite (Filtek™ P60) and a nanofill composite (Filtek™ Supreme XT), using a dye penetration test. Thirty sound human upper premolars were used in this study. Standardized class V cavities were prepared on the buccal surface of each tooth. The teeth were then divided into three groups of ten teeth each (Group 1: restored with Filtek™ P90; Group 2: restored with Filtek™ P60; Group 3: restored with Filtek™ Supreme XT). Each composite system was used according to the manufacturer's instructions with its corresponding adhesive system. The teeth were th…
Purpose: The purpose of this study is to estimate the risk premium, interest rate, inflation, and FDI in the MENA countries during the coronavirus pandemic. Theoretical framework: The theoretical framework covers the main variables, namely risk premium, interest rate, inflation, and foreign direct investment, during the coronavirus pandemic. Design/methodology/approach: Concentrating on COVID-19 as a factor affecting foreign direct investment (FDI), data from the MENA (Middle East and North Africa) countries from 2000 to 2021 are employed to investigate the impact of COVID-19 and of financial and macroeconomic indicators on FDI, relying on the analytic approach of static panel data regression, includ…
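The static panel regression named in the approach is commonly estimated with the within (fixed-effects) transformation: demean each variable inside each country, then run OLS on the demeaned data so country-specific intercepts drop out. The sketch below uses synthetic data with hypothetical dimensions (15 countries, 22 years, echoing 2000-2021) and a made-up coefficient; it illustrates the estimator only, not the study's dataset or results.

```python
import numpy as np

rng = np.random.default_rng(1)
n_countries, n_years, beta_true = 15, 22, 0.5    # hypothetical panel sizes
country = np.repeat(np.arange(n_countries), n_years)
alpha = rng.normal(size=n_countries)[country]    # unobserved country effects
x = rng.normal(size=n_countries * n_years)       # stand-in regressor
y = alpha + beta_true * x + rng.normal(scale=0.1, size=x.size)

def within(v, g):
    """Demean within each group: the fixed-effects 'within' transformation."""
    means = np.bincount(g, weights=v) / np.bincount(g)
    return v - means[g]

xd, yd = within(x, country), within(y, country)
beta_hat = (xd @ yd) / (xd @ xd)                 # OLS on demeaned data
```

Because `alpha` is constant within each country, demeaning removes it entirely, so `beta_hat` recovers the slope even though the country effects correlate with nothing observable here; a Hausman test is the usual way to decide between this fixed-effects form and a random-effects alternative.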
In this research, a nonparametric technique is presented to estimate the time-varying coefficient functions for balanced longitudinal data, characterized by observations obtained from (n) independent subjects, each measured repeatedly at a set of (m) specific time points. Although the measurements are independent across different subjects, they are mostly correlated within each subject. The technique applied is the local linear kernel (LLPK) technique. To avoid the problems of dimensionality and heavy computation, a two-step method is used to estimate the coefficient functions with the aforementioned technique. Since the two-…
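The local linear kernel estimator at the heart of the LLPK technique fits, at each target time t0, a weighted straight line whose weights decay with distance from t0; the fitted intercept is the estimate at t0. The sketch below is a generic one-dimensional illustration with an assumed Gaussian kernel and bandwidth, not the paper's two-step longitudinal procedure.

```python
import numpy as np

def local_linear(t, y, t0, h):
    """Local linear kernel estimate of E[y | t = t0]."""
    u = t - t0
    w = np.exp(-0.5 * (u / h) ** 2)            # Gaussian kernel weights
    X = np.column_stack([np.ones_like(u), u])  # local linear design matrix
    WX = X * w[:, None]
    beta = np.linalg.solve(X.T @ WX, WX.T @ y) # weighted least squares
    return beta[0]                             # intercept = fit at t0

t = np.linspace(0.0, 1.0, 50)
y = 2.0 * t + 1.0                              # noiseless linear trend
est = local_linear(t, y, t0=0.5, h=0.1)        # recovers 2*0.5 + 1 = 2 exactly
```

A useful property visible here is that the local linear fit reproduces linear trends exactly (zero boundary bias to first order), which is a standard reason to prefer it over the local constant (Nadaraya-Watson) smoother.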