The petroleum industry, one of the pillars of the national economy, has the potential to generate vast wealth and employment opportunities. The transportation of petroleum products is complicated and variable because of the hazards caused by corrosion. Hazardous chemical leaks caused by natural disasters may harm the environment and result in significant economic losses, posing a serious threat to sustainable development goals. As a result, determining the likelihood of leakage and the potential for environmental harm becomes a top priority for decision-makers as they develop maintenance plans. This study aims to provide an in-depth understanding of the risks associated with oil and gas pipelines. It also seeks to identify essential risk factors in flowline projects, along with their likelihood and severity, in order to reduce the loss of life and increased expenditure that result from safety issues. Monetary quantification was used to determine the leakage-induced environmental losses. The level of environmental risk was evaluated using a 5-by-5 probability-currency matrix, and safety and risk-based inspection (RBI) were assessed through specific schedules that determine the likelihood of failure (LOF) and the consequence of failure (COF). The resulting risk level appears in the matrix, and appropriate maintenance steps should then be taken to reduce risk, such as injecting corrosion inhibitors to protect the pipelines, activating cathodic protection, or applying coatings. Overall, this research contributes to the prevention of petroleum product leakage due to corrosion in the transportation sector. It also encourages decision-makers outside the environmental-risk field to gain a better understanding of risk levels.
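As a rough illustration of how such a matrix lookup might work, the following Python sketch maps LOF and COF scores (each assumed to be on a 1-to-5 scale) onto a qualitative risk level; the thresholds are illustrative, not the study's actual scoring schedules.

# Minimal sketch of a 5x5 risk-matrix lookup (illustrative thresholds,
# not the study's actual LOF/COF scoring schedules).
def risk_level(lof: int, cof: int) -> str:
    """Map likelihood of failure (LOF) and consequence of failure (COF),
    each scored 1-5, onto a qualitative risk level."""
    if not (1 <= lof <= 5 and 1 <= cof <= 5):
        raise ValueError("LOF and COF must be integers from 1 to 5")
    score = lof * cof                # ranges from 1 to 25
    if score <= 4:
        return "low"
    elif score <= 9:
        return "medium"
    elif score <= 16:
        return "high"
    return "very high"

# Example: a pipeline segment with LOF = 4 and COF = 3
print(risk_level(4, 3))              # -> "high"

A segment rated "high" or above would then be a candidate for mitigation such as inhibitor injection, cathodic protection, or coating.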
The influx of data in bioinformatics is primarily in the form of DNA, RNA, and protein sequences, which places a significant burden on scientists and computing resources. Some genomics studies depend on clustering techniques to group similarly expressed genes into one cluster. Clustering is a type of unsupervised learning that can be used to divide unlabeled data into clusters; k-means and fuzzy c-means (FCM) are two algorithms commonly used for this purpose. In general, clustering divides an input space into several homogeneous zones and can be achieved with a variety of algorithms. This study used three models to cluster a brain tumor dataset. The first model uses FCM, which …
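As an illustration of the FCM step, the sketch below implements the standard fuzzy c-means update in plain NumPy on synthetic data; it is a minimal sketch of the textbook algorithm, not the study's exact pipeline or dataset.

import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Standard fuzzy c-means: returns cluster centers and the
    membership matrix U (n_samples x c)."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)            # memberships sum to 1
    for _ in range(max_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.fmax(d, 1e-12)                    # avoid division by zero
        U_new = 1.0 / ((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1))).sum(axis=2)
        if np.linalg.norm(U_new - U) < tol:
            U = U_new
            break
        U = U_new
    return centers, U

# Two well-separated synthetic blobs standing in for real data
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(6, 1, (50, 2))])
centers, U = fuzzy_c_means(X, c=2)
labels = U.argmax(axis=1)                        # hard labels if needed
print(centers)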
Classification of imbalanced data is an important issue. Many algorithms have been developed for classification, such as Back Propagation (BP) neural networks, decision trees, and Bayesian networks, and they have been used repeatedly in many fields. These algorithms suffer from the problem of imbalanced data, where some classes have far more instances than others. Imbalanced data result in poor performance and a bias toward the majority class at the expense of the others. In this paper, we propose three techniques based on over-sampling (O.S.) for processing an imbalanced dataset, redistributing it, and converting it into a balanced one. These techniques are the Improved Synthetic Minority Over-Sampling Technique (Improved SMOTE), Borderline-SMOTE …
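For context, a baseline over-sampling run using the standard SMOTE and Borderline-SMOTE implementations from the imbalanced-learn library is sketched below; these are the conventional variants, not the improved techniques proposed in the paper.

# Baseline over-sampling with standard SMOTE and Borderline-SMOTE from
# imbalanced-learn -- not the paper's improved variants.
from collections import Counter
from sklearn.datasets import make_classification
from imblearn.over_sampling import SMOTE, BorderlineSMOTE

# Synthetic 2-class dataset with a roughly 9:1 imbalance
X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)
print("before:", Counter(y))

for sampler in (SMOTE(random_state=0), BorderlineSMOTE(random_state=0)):
    X_res, y_res = sampler.fit_resample(X, y)
    print(type(sampler).__name__, "->", Counter(y_res))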
Geographic Information Systems (GIS) are gaining a significant role in handling strategic applications in which data are organized as records of multiple layers in a database. Furthermore, GIS provides multiple functions, such as data collection, analysis, and presentation. Geographic information systems have proven their competence in diverse fields of study by handling various problems for numerous applications. However, handling a large volume of data in GIS remains an important issue. The biggest obstacle is designing a GIS-centered spatial decision-making framework that manages a broad range of specific data while achieving the right performance. It is very useful to support decision-makers by providing GIS-based decision support systems …
This study presents an adaptive control scheme based on synergetic control theory for suppressing the vibration of building structures due to earthquakes. The proposed controller acts through a magneto-rheological (MR) damper that supports the building. Based on Lyapunov stability analysis, an adaptive synergetic control (ASC) strategy was established under variation of the stiffness and viscosity coefficients of the vibrated building. The control and adaptive laws of the ASC were developed to ensure the stability of the controlled structure. The proposed controller addresses the suppression problem for a single-degree-of-freedom (SDOF) building model, and an earthquake control scenario was conducted and simulated …
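A minimal sketch of the synergetic control idea on an SDOF model is given below. It defines a macro-variable psi and enforces the manifold dynamics T*psi_dot + psi = 0; the parameters are illustrative, the MR damper is idealized as a direct force input, and the adaptive laws of the paper's ASC are omitted.

import numpy as np

# Minimal sketch of a synergetic controller for an SDOF building model
# (illustrative parameters; the MR damper is idealized as a direct
# control force u, and no adaptation is included).
m, c, k = 1000.0, 200.0, 50000.0     # mass [kg], damping, stiffness
lam, T = 5.0, 0.05                   # macro-variable slope, synergetic time const

def quake(t):
    """Toy ground acceleration (stand-in for an earthquake record)."""
    return 2.0 * np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.2 * t)

dt, steps = 1e-3, 10000
x, v = 0.0, 0.0                      # displacement, velocity
for i in range(steps):
    t = i * dt
    psi = lam * x + v                # macro-variable psi = lam*x + xdot
    # Enforce T*psi_dot + psi = 0 and solve the dynamics for u:
    u = m * (-psi / T - lam * v + quake(t)) + c * v + k * x
    a = (u - c * v - k * x) / m - quake(t)
    x, v = x + v * dt, v + a * dt    # explicit Euler step
print(f"final |x| = {abs(x):.2e} m")  # decays toward zero

On the manifold psi = 0, the displacement obeys x_dot = -lam*x and decays exponentially, which is what drives the vibration suppression.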
The flexible joint robot (FJR) typically experiences parametric variations, nonlinearities, underactuation, noise propagation, and external disturbances, which seriously degrade its tracking performance. This article proposes an adaptive integral sliding mode controller (AISMC) based on a singular perturbation method and two state observers for the FJR to achieve high performance. First, the underactuated FJR is modeled as two simple second-order fast and slow subsystems by using the Olfati transformation and the singular perturbation method, which handles underactuation while reducing noise amplification. Then, the AISMC is proposed to effectively accomplish the desired tracking performance, in which the integral sliding surface is designed to reduce chattering …
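The integral sliding surface idea can be sketched for a generic second-order subsystem x_ddot = u + d as follows; the gains are illustrative, and the observers, Olfati transformation, and adaptive laws of the full AISMC are omitted.

import numpy as np

# Minimal sketch of an integral sliding-mode regulator for a
# second-order subsystem xdd = u + d (illustrative gains only).
lam, ki, K = 4.0, 2.0, 10.0          # surface slopes and switching gain
dt, steps = 1e-3, 5000
x, v, ie = 1.0, 0.0, 0.0             # state, velocity, integral of error
x_ref = 0.0                          # regulate to the origin
for i in range(steps):
    e, de = x - x_ref, v
    ie += e * dt
    s = de + lam * e + ki * ie       # integral sliding surface
    u = -lam * de - ki * e - K * np.tanh(s / 0.05)  # smoothed sign()
    d = 0.5 * np.sin(5 * i * dt)     # bounded matched disturbance
    a = u + d
    x, v = x + v * dt, v + a * dt    # explicit Euler step
print(f"|e| after {steps*dt:.0f} s: {abs(x):.2e}")

Replacing the discontinuous sign() with tanh() inside a thin boundary layer is one common way to reduce the chattering that the sliding-surface design targets.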
Computer-aided diagnosis (CAD) has proved to be an effective and accurate method for diagnostic prediction over the years. This article focuses on the development of an automated CAD system intended to perform diagnosis as accurately as possible. Deep learning methods have been able to produce impressive results on medical image datasets. This study employs deep learning methods in conjunction with meta-heuristic algorithms and supervised machine-learning algorithms to perform an accurate diagnosis. Pre-trained convolutional neural networks (CNNs) or auto-encoders are used for feature extraction, whereas feature selection is performed using an ant colony optimization (ACO) algorithm. Ant colony optimization helps to search for the best …
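A toy version of ACO-style feature selection is sketched below: each feature carries a pheromone value that biases its inclusion probability, ants sample candidate subsets, and pheromone is reinforced along the best subset found. This is an illustrative scheme on a small tabular dataset (standing in for CNN-extracted features), not the paper's exact ACO variant.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Toy ant-colony feature selection: pheromone per feature biases its
# selection probability; the best subset per run reinforces pheromone.
rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
n_feat = X.shape[1]
tau = np.full(n_feat, 0.5)           # pheromone per feature
rho = 0.1                            # evaporation rate

def fitness(mask):
    """Cross-validated accuracy of a k-NN classifier on the subset."""
    if not mask.any():
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=5)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

best_mask, best_fit = None, -1.0
for it in range(15):                 # iterations
    for _ in range(10):              # ants per iteration
        mask = rng.random(n_feat) < tau
        f = fitness(mask)
        if f > best_fit:
            best_fit, best_mask = f, mask
    tau = (1 - rho) * tau            # evaporation
    tau[best_mask] += rho * best_fit # reinforce the best subset
    tau = np.clip(tau, 0.05, 0.95)
print(f"selected {best_mask.sum()} features, CV accuracy {best_fit:.3f}")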
... Show MoreAbstract
The grey system model GM(1,1) is a time-series prediction model and the basis of grey theory. This research presents four methods for estimating the parameters of the grey model GM(1,1): the accumulative method (ACC), the exponential method (EXP), the modified exponential method (Mod EXP), and the particle swarm optimization method (PSO). These methods were compared on the basis of the mean square error (MSE) and the mean absolute percentage error (MAPE), and simulation was adopted to select the best of the four. The best method was then applied to real data representing the consumption rate of two types of oils …
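For reference, the classic least-squares GM(1,1) fit (to which the paper's ACC, EXP, Mod EXP, and PSO estimators are alternatives) can be sketched as follows; the data series is made up for illustration.

import numpy as np

# Classic least-squares GM(1,1) fit and forecast.
def gm11(x0, horizon=3):
    """Fit GM(1,1) to a 1-D series x0 and forecast `horizon` steps ahead."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                        # accumulated series (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])             # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    Y = x0[1:]
    (a, b), *_ = np.linalg.lstsq(B, Y, rcond=None)  # developing/grey coeffs
    k = np.arange(len(x0) + horizon)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.diff(x1_hat, prepend=x1_hat[0])     # inverse AGO
    x0_hat[0] = x0[0]
    return x0_hat

series = [142.0, 151.3, 160.2, 168.9, 178.1]    # illustrative data only
print(gm11(series))  # fitted values followed by a 3-step forecast

The fitted values can then be scored against the observations with MSE and MAPE, which is the comparison basis the study uses.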
Rationale, aims and objectives: A review of studies published over the last six years provides an update on this topic. In the midst of the COVID-19 pandemic, the findings of this study can help us understand how populations may perceive vaccinations. The objectives of this study were to review the literature covering perceptions of influenza vaccines and to determine the factors influencing acceptance of vaccination using the Health Belief Model (HBM). Methods: A comprehensive literature search was performed using the PubMed and Google Scholar databases. Three keywords were used: influenza vaccine, perceptions, and Middle East. Empirical studies that dealt with people's or healthcare workers' (HCW) perceptions of the influenza vaccine in the Middle East and written …