Machine learning offers significant advantages for many problems in the oil and gas industry, especially for resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. This study presents the workflow methodology alongside comprehensive models. Its purpose is to provide a more robust technique for predicting permeability: previous studies on the Bazirgan field attempted this, but their estimates were vague, and the methods they employed are obsolete and poorly suited to the practical demands of permeability computation. To verify the reliability of the training data for zone-by-zone modeling, we designed two scenarios and applied them to data from seven wells. All wellbore intervals were processed, including all five units of the Mishrif formation. The findings show that the more data available, the more accurate the forecasting model becomes. Multi-resolution graph-based clustering demonstrated stable forecasting in both scenarios when compared against five other machine learning models.
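As a hedged illustration of the kind of baseline such machine-learning models are compared against (not the paper's actual model or data), the classical porosity-permeability transform log10(k) = a·φ + b can be fitted to core data by least squares; all numbers below are synthetic:

```python
# Illustrative sketch only: fitting log10(k) = a*phi + b by ordinary least
# squares, a conventional poro-perm baseline. Data are synthetic, not from
# the Bazirgan field study.

def fit_poro_perm(phi, log10_k):
    """Least-squares fit of log10 permeability against porosity."""
    n = len(phi)
    mean_x = sum(phi) / n
    mean_y = sum(log10_k) / n
    sxx = sum((x - mean_x) ** 2 for x in phi)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(phi, log10_k))
    a = sxy / sxx
    b = mean_y - a * mean_x
    return a, b

# Synthetic core points: porosity (fraction) vs log10 permeability (mD)
phi = [0.08, 0.12, 0.15, 0.20, 0.25]
log10_k = [0.1, 0.8, 1.3, 2.1, 2.9]

a, b = fit_poro_perm(phi, log10_k)
predicted = [a * x + b for x in phi]
```

A machine-learning workflow replaces this single-variable transform with multi-log inputs and nonlinear models, which is the gap such studies aim to close.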
The Internet provides vital communications between millions of individuals and is increasingly used as a commerce tool; security is therefore of high importance for securing communications and protecting vital information. Cryptographic algorithms are essential in the field of security. Brute-force attacks are the principal attacks on the Data Encryption Standard (DES), which is the main reason an improved DES structure is needed. This paper proposes a new, improved structure for DES that makes it more secure and resistant to attack. The improved structure was accomplished using standard DES with a new way of two key gene
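For context, DES is built on the Feistel structure, whose defining property is that decryption simply runs the rounds with the subkeys reversed. The toy sketch below illustrates that structure only; the round function and subkeys here are placeholders, not DES's actual round function or the paper's key-generation scheme:

```python
# Minimal Feistel-network sketch (illustrative only; NOT real DES).
# toy_round stands in for DES's expansion/S-box/permutation step, and the
# subkey list stands in for a key schedule such as the paper's two-key idea.

def toy_round(half, subkey):
    # Placeholder round function on 32-bit halves
    return (half * 31 + subkey) & 0xFFFFFFFF

def feistel_encrypt(left, right, subkeys):
    for k in subkeys:
        left, right = right, left ^ toy_round(right, k)
    return left, right

def feistel_decrypt(left, right, subkeys):
    for k in reversed(subkeys):
        right, left = left, right ^ toy_round(left, k)
    return left, right

subkeys = [0x0F0F, 0x3C3C, 0xA5A5, 0x5A5A]
ct = feistel_encrypt(0x01234567, 0x89ABCDEF, subkeys)
pt = feistel_decrypt(*ct, subkeys)
assert pt == (0x01234567, 0x89ABCDEF)  # rounds invert exactly
```

Because each round only XORs one half with a function of the other, the structure is invertible regardless of what the round function is, which is why changes to key generation can strengthen DES without redesigning the cipher's skeleton.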
Merging biometrics with cryptography has become more familiar, and a substantial scientific field has grown up around it for researchers. Biometrics adds a distinctive property to security systems, because biometric features are unique to each individual. In this study, a new method is presented for ciphering data based on fingerprint features. The plaintext message is addressed, according to the positions of minutiae extracted from a fingerprint, into a generated random text file regardless of the size of the data. The proposed method can be explained in three scenarios. In the first scenario, the message was placed directly inside the random text at the minutiae positions; in the second scenario, the message was encrypted with a chosen word before ciphering
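A minimal sketch of the first scenario described above, assuming minutiae have already been reduced to integer indices (the coordinates below are invented for illustration; a real system would extract them from a fingerprint image):

```python
# Hedged sketch: hide message characters inside a random cover text at
# positions derived from fingerprint minutiae. Positions here are made up.
import random

def embed(message, minutiae_positions, cover_len=200, seed=42):
    rng = random.Random(seed)
    cover = [rng.choice("abcdefghijklmnopqrstuvwxyz") for _ in range(cover_len)]
    for ch, pos in zip(message, minutiae_positions):
        cover[pos % cover_len] = ch        # place one character per minutia
    return "".join(cover)

def extract(stego, minutiae_positions, msg_len):
    # Reading the same positions back recovers the message
    return "".join(stego[p % len(stego)] for p in minutiae_positions[:msg_len])

positions = [17, 54, 93, 121, 160, 38, 77]   # hypothetical minutiae indices
stego = embed("SECRET", positions)
recovered = extract(stego, positions, 6)      # -> "SECRET"
```

The security rests on the minutiae positions acting as a key only the fingerprint owner can reproduce; the second scenario would additionally encrypt the message before embedding.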
In the last two decades, networks have changed in response to rapidly evolving requirements. Current Data Center Networks (DCNs) contain large numbers of hosts (tens of thousands) with special bandwidth needs, as cloud networking and multimedia content computing grow. Conventional DCNs are strained by the increasing number of users and bandwidth requirements, which in turn expose many implementation limitations. Current networking devices, with their coupled control and forwarding planes, produce network architectures that are not suitable for dynamic computing and storage needs. Software-Defined Networking (SDN) was introduced to change this notion of traditional networks by decoupling control and
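The control/data-plane split can be sketched conceptually as follows (a toy model, not a real controller or the OpenFlow protocol): the controller computes forwarding rules, while switches only match packets against the flow table the controller installed.

```python
# Conceptual SDN sketch: the switch holds only a match-action flow table;
# all routing intelligence lives in the controller. Class and field names
# are illustrative, not any real SDN framework's API.

class Switch:
    def __init__(self):
        self.flow_table = {}               # match (dst address) -> out port

    def install_rule(self, dst, out_port):
        self.flow_table[dst] = out_port

    def forward(self, packet):
        # A real switch would punt unmatched packets to the controller
        return self.flow_table.get(packet["dst"], "to-controller")

class Controller:
    def push_routes(self, switch, routes):
        for dst, port in routes.items():
            switch.install_rule(dst, port)

ctrl, sw = Controller(), Switch()
ctrl.push_routes(sw, {"10.0.0.1": 1, "10.0.0.2": 2})
port = sw.forward({"dst": "10.0.0.2"})     # matched: out port 2
miss = sw.forward({"dst": "10.0.0.9"})     # unmatched: sent to controller
```

Because the forwarding state is just data pushed from a central program, the network can be reprogrammed for changing workloads without touching device firmware, which is the limitation of coupled-plane devices the abstract describes.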
Diabetes is one of the deadliest diseases in the world and can lead to stroke, blindness, organ failure, and amputation of lower limbs. Research indicates that diabetes can be controlled if it is detected at an early stage. Scientists are becoming more interested in classification algorithms for diagnosing diseases. In this study, we analyzed the performance of five classification algorithms, namely naïve Bayes, support vector machine, multilayer perceptron artificial neural network, decision tree, and random forest, on a diabetes dataset containing the information of 2000 female patients. Various metrics were applied in evaluating the performance of the classifiers, such as precision, area under the c
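The evaluation metrics named above are all derived from the binary confusion matrix. A small self-contained illustration (the prediction vectors are invented, not the study's results):

```python
# Computing precision, recall, and accuracy from a binary confusion matrix.
# y_true / y_pred below are synthetic examples, not the paper's data.

def confusion(y_true, y_pred):
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    return tp, fp, fn, tn

def metrics(y_true, y_pred):
    tp, fp, fn, tn = confusion(y_true, y_pred)
    precision = tp / (tp + fp)          # of predicted positives, how many real
    recall = tp / (tp + fn)             # of real positives, how many found
    accuracy = (tp + tn) / len(y_true)
    return precision, recall, accuracy

y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]
p, r, a = metrics(y_true, y_pred)       # -> 0.75, 0.75, 0.75
```

Comparing five classifiers means computing these (plus ranking metrics such as area under the ROC curve) on a held-out split for each model.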
Information about soil consolidation is essential in geotechnical design. Because of the time and expense involved in performing consolidation tests, equations are needed to estimate the compression index from soil index properties. Although many empirical equations relating soil properties have been proposed, such equations may not be appropriate for local conditions. The aim of this study is to investigate the consolidation and physical properties of cohesive soil. An Artificial Neural Network (ANN) was adopted in this investigation to predict the compression index and compression ratio from basic index properties. One hundred and ninety-five consolidation results for soils tested at different construction sites
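One widely used example of the empirical equations the abstract refers to is Terzaghi and Peck's single-variable relation Cc = 0.009 (LL − 10), where LL is the liquid limit in percent. Shown here as a baseline sketch; an ANN generalizes this to several index properties at once:

```python
# Classical empirical estimate of compression index from liquid limit
# (Terzaghi & Peck). This is a textbook baseline, not the paper's ANN model.

def compression_index_terzaghi_peck(liquid_limit):
    """Estimate compression index Cc from liquid limit LL (%)."""
    return 0.009 * (liquid_limit - 10.0)

cc = compression_index_terzaghi_peck(50.0)   # LL = 50% -> Cc = 0.36
```

Such fixed coefficients were calibrated on particular soils, which is exactly why the abstract cautions that generic equations may not suit local conditions and motivates a locally trained ANN.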
Background: Techniques of image analysis have been used extensively to minimize interobserver variation in immunohistochemical scoring; yet image acquisition procedures are often demanding, expensive, and laborious. This study aims to assess the validity of image analysis for predicting a human observer's score with a simplified image acquisition technique. Materials and methods: Formalin-fixed, paraffin-embedded tissue sections of ameloblastomas and basal cell carcinomas were immunohistochemically stained with monoclonal antibodies to MMP-2 and MMP-9. The extent of antibody positivity was quantified using an ImageJ®-based application on low-power photomicrographs obtained with a conventional camera. The results of the software were employed
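In essence, such quantification reduces to the fraction of pixels whose staining intensity crosses a threshold. A simplified sketch of that computation (the pixel grid and threshold are synthetic; a real pipeline such as ImageJ operates on actual stained-section photomicrographs):

```python
# Simplified positivity quantification: fraction of pixels at or above an
# intensity threshold. Synthetic 2-D grid, not real immunohistochemistry data.

def positivity_fraction(pixels, threshold):
    flat = [v for row in pixels for v in row]
    positive = sum(v >= threshold for v in flat)
    return positive / len(flat)

image = [
    [120, 200,  90, 210],
    [180,  60, 220,  30],
    [ 40, 190,  70, 205],
]
frac = positivity_fraction(image, threshold=180)   # 6 of 12 pixels -> 0.5
```

Correlating such software-derived fractions with pathologists' manual scores is what validating the method against a human observer amounts to.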