An oil spill is a leakage from pipelines, vessels, oil rigs, or tankers that releases petroleum products into the marine environment or onto land; it may occur naturally or through human action and results in severe damage and financial loss. Satellite imagery is one of the most powerful tools currently used to capture vital information about the Earth's surface, but the complexity and sheer volume of the data make it challenging and time-consuming for humans to process. With the advancement of deep learning techniques, these processes can now be automated to extract vital information from real-time satellite images. This paper applies three deep learning algorithms to satellite image classification: ResNet50, VGG19, and InceptionV4. They were trained and tested on an open-source satellite image dataset to analyze their efficiency and performance, and their classification accuracy, precision, recall, and F1-score were compared. The results show that InceptionV4 gives the best classification accuracy of 97% across the cloudy, desert, green-area, and water classes, followed by VGG19 with approximately 96% and ResNet50 with 93%. The findings show that the InceptionV4 algorithm is suitable for classifying oil spill and no-spill conditions in satellite images on a validated dataset.
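For reference, the reported evaluation metrics follow their standard definitions in terms of true positives (TP), false positives (FP), true negatives (TN), and false negatives (FN), typically computed per class and averaged in a multi-class setting such as this one:

\mathrm{Accuracy} = \frac{TP + TN}{TP + TN + FP + FN}, \qquad \mathrm{Precision} = \frac{TP}{TP + FP}, \qquad \mathrm{Recall} = \frac{TP}{TP + FN}, \qquad F_1 = \frac{2\,\mathrm{Precision}\cdot\mathrm{Recall}}{\mathrm{Precision} + \mathrm{Recall}}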
Laurylamine hydrochloride, CH3(CH2)11NH3+Cl-, was chosen from among the cationic surfactants to produce secondary oil recovery using the laboratory model shown in Fig. 1. The relationships between interfacial tension and temperature, salinity, and solution concentration were studied, as shown in Figs. 2, 3, and 4, respectively. The optimum values of these three variables were taken as those giving the lowest interfacial tension. Saturation, permeability, and porosity were measured in the laboratory. The primary oil was displaced by water injection until no more oil could be obtained; laurylamine hydrochloride was then injected as a secondary oil recovery stage. The total oil recovery is 96.6%, i.e. 88.8% of the residual oil has been recovered by this technique, as shown …
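A rough consistency check, under the reading that the 88.8% figure refers to the oil remaining in place after waterflooding: if R_w denotes the waterflood recovery as a fraction of the original oil in place, then

R_w + 0.888\,(1 - R_w) = 0.966 \;\Rightarrow\; R_w = \frac{0.966 - 0.888}{1 - 0.888} \approx 0.70,

i.e. the implied waterflood recovery would be roughly 70% of the original oil in place under that reading.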
Although the one-factor-at-a-time method is widely used in microbial culture work, it fails to find the true optimum because it does not account for interactions between the optimized parameters. Consequently, finding the true optimum conditions requires repeating the one-factor-at-a-time method over many sequential experimental runs, which is extremely time-consuming and expensive when many variables are involved. This work is an attempt to enhance bioactive yellow pigment production by Streptomyces thinghirensis based on a statistical design. The yellow pigment demonstrated inhibitory effects against Escherichia coli and Staphylococcus aureus and was characterized by UV-Vis spectroscopy, which showed a lambda maximum of …
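As a minimal illustrative sketch only (the actual design, factors, and levels used in the study are not stated here; those below are assumptions), a two-level full factorial design enumerates every combination of factor levels, so the interaction effects that the one-factor-at-a-time method misses can be estimated from the runs:

    from itertools import product

    # Hypothetical culture factors with (low, high) levels; assumed for illustration,
    # not taken from the Streptomyces thinghirensis study.
    factors = {
        "temperature_C": (28, 37),
        "pH": (6.0, 8.0),
        "glucose_g_per_L": (5, 15),
    }

    # Full factorial design: every combination of levels, allowing both main
    # effects and factor interactions to be estimated.
    runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]

    for i, run in enumerate(runs, start=1):
        print(f"Run {i}: {run}")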
Within the framework of big data, energy issues are highly significant. Despite this significance, theoretical studies focusing primarily on the issue of energy within big data analytics in relation to computational intelligence algorithms are scarce. The purpose of this study is to explore the theoretical aspects of energy issues in big data analytics in relation to computational intelligence algorithms, since this is critical for exploring the empirical aspects of big data. In this chapter, we present a theoretical study of energy issues related to applications of computational intelligence algorithms in big data analytics. This work highlights that big data analytics using computational intelligence algorithms generates a very high amount …
This paper aims to build a modern vision for Islamic banks to ensure sustainability and growth, and to highlight the positive Iraqi steps in the Islamic banking sector. To build this vision, several scientific research approaches were adopted (quantitative, descriptive-analytical, and descriptive). The research community consisted of all Iraqi private commercial banks, including Islamic banks. The research samples varied according to the diversity of methods and the availability of data. A questionnaire was constructed and administered, and its internal and external validity was measured. Fifty questionnaires were distributed to Iraqi academics specialized in Islamic banking. All distributed forms were subject to a thorough analysis …
In this article, we develop a new loss function as a modification of the linear exponential (LINEX) loss function, obtained by weighting the LINEX function. We derive estimators of the scale parameter, the reliability function, and the hazard function based on upper record values from the Lomax distribution (LD). To study the small-sample performance of the proposed loss function, a Monte Carlo simulation is used to compare the maximum likelihood estimator, the Bayesian estimator under the LINEX loss function, and the Bayesian estimator under the squared error (SE) loss function. The results show that the modified method is the best for estimating the scale parameter, reliability function, and hazard function.
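For context, the standard (unweighted) LINEX loss in the estimation error \Delta = \hat{\theta} - \theta takes the well-known form below; the weighted variant developed in the article is not reproduced here:

\mathrm{L}(\Delta) = b\left(e^{a\Delta} - a\Delta - 1\right), \qquad a \neq 0,\; b > 0,

which penalizes over-estimation and under-estimation asymmetrically according to the sign of a.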
Polarization manipulation elements operating at visible wavelengths represent a critical component of quantum communication sub-systems, equivalent to their telecom-wavelength counterparts. The proposed method rotates the optic axis of the polarized input light by an angle of 45 degrees, thereby converting the fundamental transverse electric (TE0) mode into the fundamental transverse magnetic (TM0) mode. This paper outlines an integrated gallium phosphide waveguide polarization rotator that relies on a horizontal slot rotated by 45 degrees at a wavelength of 700 nm. This will ultimately lead to the conception of a mode hybridization phenomenon …
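As a generic relation only (not taken from this paper), rotators based on mode hybridization achieve full TE0-to-TM0 conversion over a half-beat length set by the effective-index difference of the two hybrid modes supported by the rotated-slot section,

L_{\pi} = \frac{\lambda}{2\left(n_{\mathrm{eff},1} - n_{\mathrm{eff},2}\right)},

evaluated here at the design wavelength of 700 nm.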
The nuclear structure of the 20,22Ne isotopes has been studied via the shell model with Skyrme-Hartree-Fock calculations. In particular, the transitions to the low-lying positive- and negative-parity excited states have been investigated within three shell-model spaces: sd for the positive-parity states, and the spsdpf large-basis (no-core) and ZBME model spaces for the negative-parity states. Excitation energies, reduced transition probabilities, and elastic and inelastic form factors were estimated and compared with the available experimental data. The Skyrme interaction was used to generate a one-body potential in the Hartree-Fock calculations for each selected excited state, which was then used to calculate the single-particle matrix elements. The Skyrme interaction …
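The reduced transition probabilities referred to above follow the standard definition in terms of the reduced matrix element of the electromagnetic transition operator of multipolarity EL:

B(EL;\, J_i \rightarrow J_f) = \frac{\bigl|\langle J_f \,\|\, \hat{T}(EL) \,\|\, J_i \rangle\bigr|^{2}}{2J_i + 1}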
Bipedal robotic mechanisms are unstable because of the unilateral-contact passive joint between the sole and the ground. Given the system's many degrees of freedom, hierarchical control layers are crucial for creating walking patterns, stabilizing locomotion, and ensuring correct angular trajectories for the bipedal joints. This work provides a hierarchical control scheme for a bipedal robot that focuses on balance (stabilization) and low-level tracking control while taking flexible joints into account. The stabilization control method uses the Newton–Euler formulation to establish a mathematical relationship between the zero-moment point (ZMP) and the center of mass (COM), resulting in highly nonlinear and coupled dynamic equations. Adaptive …
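For intuition only (a standard simplification, not the full Newton–Euler model used in this work), the cart-table/linear inverted pendulum approximation with constant COM height z_c relates the sagittal ZMP and COM coordinates as

x_{\mathrm{ZMP}} = x_{\mathrm{COM}} - \frac{z_c}{g}\,\ddot{x}_{\mathrm{COM}},

with an analogous relation in the lateral direction.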