Predicting vertical stress is useful for controlling geomechanical issues, since it allows the formation pore pressure to be computed and fault regimes to be classified. This study provides an in-depth examination of vertical stress prediction using several approaches in the Techlog 2015 software. Gardner's method yields incorrect vertical stress values because it does not start from the surface and relies only on sonic log data. The Amoco, Wendt non-acoustic, Traugott, and average techniques require only a density log as input and represent the observed density as a straight line, which is inaccurate for the vertical stress computation; in effect, these methods replace the real density with an average. The extrapolated-density method gives a much better gradient at shallow depth for the vertical stress calculation, while the Miller density method fits the real density very well at great depth. Calculating vertical stress has been essential for the past 40 years because pore pressure calculations and geomechanical model building use it as an input, and bulk density may be its strongest predictor. According to these results, the Miller and extrapolated techniques may be the two best methods for determining vertical stress; still, the extrapolated method's gradient is better than the Miller method's at shallow depth. The extrapolated-density approach may therefore produce the more satisfactory results for vertical stress, with Miller values falling below the extrapolated ones, likely because of the Miller method's poor gradient at shallow depths. Gardner's approach incorrectly displays minimum values of about 4000 psi at great depths, whereas the other methods give numbers similar to one another because they use constant bulk-density values from the surface down to the target depth, which is likewise incorrect.
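The overburden (vertical) stress discussed in this abstract is conventionally obtained by integrating bulk density times gravity over depth. A minimal sketch of that integration, using a synthetic density log (the density values and trend below are illustrative assumptions, not data from the study):

```python
import numpy as np

# Synthetic bulk-density log: illustrative values, not study data.
depth_m = np.linspace(0.0, 3000.0, 301)     # depth below surface, m
rho_kgm3 = 1800.0 + 0.3 * depth_m           # bulk density increasing with depth
g = 9.81                                    # gravitational acceleration, m/s^2

# Vertical stress sigma_v(z) = integral of rho(z)*g dz, trapezoidal rule,
# starting from the surface (the step Gardner's method cannot take).
segments = 0.5 * (rho_kgm3[1:] + rho_kgm3[:-1]) * g * np.diff(depth_m)
sigma_v_pa = np.concatenate(([0.0], np.cumsum(segments)))
sigma_v_psi = sigma_v_pa * 1.45038e-4       # Pa -> psi

print(f"Vertical stress at {depth_m[-1]:.0f} m: {sigma_v_psi[-1]:.0f} psi")
```

The quality of the result depends entirely on how well the density curve matches the true formation density, which is why the abstract's comparison of extrapolated, Miller, and constant-density inputs matters.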
Laurylamine hydrochloride, CH3(CH2)11NH3–Cl, was chosen from among the cationic surfactants to produce secondary oil recovery using the laboratory model shown in Fig. 1. The relationships between interfacial tension and temperature, salinity, and solution concentration were studied, as shown in Figs. 2, 3, and 4, respectively, and the optimum value of each of these three variables (the value giving the lowest interfacial tension) was taken. Saturation, permeability, and porosity were measured in the laboratory. Primary oil recovery was carried out by water injection until no more oil could be obtained; laurylamine hydrochloride was then injected for secondary oil recovery. The total oil recovery is 96.6%, i.e., 88.8% of the residual oil has been recovered by this technique, as shown
Natural gas and oil are among the mainstays of the global economy. However, many issues surround the pipelines that transport these resources, including aging infrastructure, environmental impacts, and vulnerability to sabotage. Such issues can result in leakages that require significant effort to detect and pinpoint. The objective of this project is to develop and implement a method for detecting oil spills caused by leaking oil pipelines using aerial images captured by a drone equipped with a Raspberry Pi 4. Using the Message Queuing Telemetry Transport Internet of Things (MQTT IoT) protocol, the acquired images and the global positioning system (GPS) coordinates of the images' acquisition are
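A pipeline like the one this abstract describes typically serializes each image together with its GPS fix before publishing over MQTT. A minimal sketch under stated assumptions: the JSON payload shape, broker address, and topic name below are hypothetical, not from the study, and `publish_report` uses the third-party `paho-mqtt` package:

```python
import base64
import json

def build_payload(lat: float, lon: float, image_bytes: bytes) -> str:
    """Bundle an aerial image with the GPS fix of its acquisition as JSON."""
    return json.dumps({
        "gps": {"lat": lat, "lon": lon},
        "image_b64": base64.b64encode(image_bytes).decode("ascii"),
    })

def publish_report(payload: str) -> None:
    """Publish one leak report over MQTT. Requires the third-party paho-mqtt
    package; broker address and topic are illustrative assumptions."""
    import paho.mqtt.client as mqtt
    client = mqtt.Client()
    client.connect("broker.example.com", 1883)
    client.publish("pipeline/leak-reports", payload)
    client.disconnect()

# Build (but do not send) an example report for a hypothetical fix in Iraq.
print(build_payload(33.3, 44.4, b"\x89PNG..."))
```

Base64-encoding keeps the binary image valid inside a JSON payload; MQTT itself accepts arbitrary bytes, so raw binary topics are an equally valid design.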
In this study, a new type of circulating three-phase fluidized bed reactor was constructed by adding a spiral path, named the spiral three-phase fluidized bed reactor (TPFB-S), to investigate the possibility of removing engine oil (virgin and waste forms) from synthetic wastewater using Ricinus communis (RC) leaves, both natural and activated by KOH. The biosorption process was studied by varying the particle diameter over the ranges 150–300 and 300–600 µm, the liquid flow rate over the range 2.5–4.5 L/min, and the gas flow rate over the range 0–1 L/min, while the other parameters (initial oil emulsion concentration, pH, adsorbent concentration, agitation speed, and contact time) were kept constant at 2000 mg/L, 2,
The COVID-19 pandemic has necessitated new methods for controlling the spread of the virus, and machine learning (ML) holds promise in this regard. Our study aims to explore the latest ML algorithms utilized for COVID-19 prediction, with a focus on their potential to optimize decision-making and resource allocation during peak periods of the pandemic. Our review stands out from others in that it concentrates primarily on ML methods for disease prediction. To conduct this scoping review, we performed a Google Scholar literature search using "COVID-19," "prediction," and "machine learning" as keywords, with a custom date range from 2020 to 2022. Of the 99 articles screened for eligibility, we selected 20 for the final review. Our system
COVID-19 has spread rapidly around the world due to the lack of a suitable vaccine; the early prediction of those infected with the virus is therefore extremely important for controlling it by quarantining infected people and giving them whatever medical attention is possible to limit its spread. This work suggests a model for predicting COVID-19 infection using feature selection techniques. The proposed model consists of three stages: a preprocessing stage, a feature selection stage, and a classification stage. This work uses a data set consisting of 8571 records, with forty features, for patients from different countries. Two feature selection techniques are used in
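The three-stage pipeline this abstract outlines (preprocessing, feature selection, classification) maps directly onto a scikit-learn `Pipeline`. A hedged sketch: the abstract is cut off before naming its two selection techniques and its classifier, so the mutual-information selector, random forest, and synthetic data below are illustrative stand-ins, not the study's actual choices:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler

# Synthetic stand-in for the 8571-record, forty-feature patient data set.
X, y = make_classification(n_samples=8571, n_features=40, n_informative=10,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Stage 1: preprocessing -> Stage 2: feature selection -> Stage 3: classification
pipe = Pipeline([
    ("scale", MinMaxScaler()),
    ("select", SelectKBest(mutual_info_classif, k=15)),
    ("clf", RandomForestClassifier(random_state=0)),
])
pipe.fit(X_tr, y_tr)
acc = pipe.score(X_te, y_te)
print(f"held-out accuracy: {acc:.3f}")
```

Wrapping all three stages in one `Pipeline` ensures the selector is fitted only on training folds, avoiding feature-selection leakage into the evaluation.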
<span lang="EN-US">Diabetes is one of the deadliest diseases in the world; it can lead to stroke, blindness, organ failure, and amputation of the lower limbs. Research indicates that diabetes can be controlled if it is detected at an early stage, and scientists are becoming increasingly interested in classification algorithms for diagnosing diseases. In this study, we have analyzed the performance of five classification algorithms, namely naïve Bayes, support vector machine, multilayer perceptron artificial neural network, decision tree, and random forest, using a diabetes dataset containing the information of 2000 female patients. Various metrics were applied in evaluating the performance of the classifiers, such as precision and area under the c
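Benchmarking the five classifier families this abstract names is straightforward with cross-validation. A minimal sketch under stated assumptions: the synthetic 2000-sample, 8-feature data set, the hyperparameters, and the choice of ROC-AUC as the comparison metric are illustrative, not taken from the study:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the 2000-patient diabetes data set.
X, y = make_classification(n_samples=2000, n_features=8, n_informative=5,
                           random_state=1)

# The five algorithm families compared in the study (default-ish settings).
models = {
    "naive Bayes": GaussianNB(),
    "support vector machine": SVC(),
    "MLP neural network": MLPClassifier(max_iter=1000, random_state=1),
    "decision tree": DecisionTreeClassifier(random_state=1),
    "random forest": RandomForestClassifier(random_state=1),
}

# 5-fold cross-validated ROC-AUC for each model.
scores = {name: cross_val_score(m, X, y, cv=5, scoring="roc_auc").mean()
          for name, m in models.items()}
for name, auc in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: mean AUC = {auc:.3f}")
```

Cross-validation gives each model the same folds, so the ranking reflects the algorithms rather than a lucky train/test split.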
Background: image analysis techniques have been used extensively to minimize interobserver variation in immunohistochemical scoring, yet image acquisition procedures are often demanding, expensive, and laborious. This study aims to assess the validity of image analysis for predicting human observers' scores with a simplified image acquisition technique. Materials and methods: formalin-fixed, paraffin-embedded tissue sections of ameloblastomas and basal cell carcinomas were immunohistochemically stained with monoclonal antibodies against MMP-2 and MMP-9. The extent of antibody positivity was quantified using an ImageJ®-based application on low-power photomicrographs obtained with a conventional camera. The results of the software were employed