Peak ground acceleration (PGA) is one of the critical factors in determining earthquake intensity. PGA is generally used to describe ground motion in a particular zone and can efficiently predict site ground-motion parameters for the design of engineering structures. Therefore, novel models were developed to forecast PGA for an Iraqi database using the particle swarm optimization (PSO) approach. A data set of 187 historical ground-motion recordings in Iraq's tectonic regions was used to build the proposed explicit models. The proposed PGA models relate to different seismic parameters, including earthquake magnitude (Mw), average shear-wave velocity (VS30), focal depth (FD), and the nearest epicentral distance (REPi) to a seismic station. The derived PGA models are remarkably simple and straightforward and can be used reliably for pre-design purposes. The proposed PGA models (i.e., models I and II), obtained via the explicit formulas produced using the PSO method, correlate highly with the actual PGA records, as indicated by low coefficients of variation (CoV) of approximately 2.12% and 2.06% and mean values close to 1.0 (approximately 1.005 and 1.004). Lastly, a high frequency of low absolute relative error (ARE), below 5%, is recorded for the proposed models, showing an acceptable error distribution.
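The idea of fitting an explicit ground-motion formula with PSO can be sketched as follows. This is a minimal illustration, not the paper's method: the attenuation form, the synthetic records, and the swarm parameters are all assumptions chosen only to show how PSO searches for model coefficients.

```python
import math
import random

random.seed(42)

# Hypothetical attenuation form (an assumption, not the paper's formula):
#   ln(PGA) = c1 + c2*Mw - c3*ln(R + 10)
def predict(c, mw, r):
    return c[0] + c[1] * mw - c[2] * math.log(r + 10.0)

# Synthetic (Mw, epicentral distance km, ln PGA) records for illustration only.
true_c = (1.2, 0.8, 1.1)
records = [(mw, r, predict(true_c, mw, r))
           for mw in (4.5, 5.0, 5.5, 6.0, 6.5) for r in (20, 60, 120)]

def mse(c):
    return sum((predict(c, mw, r) - y) ** 2 for mw, r, y in records) / len(records)

def pso(n_particles=20, iters=300, w=0.7, c1=1.5, c2=1.5):
    dim = 3
    pos = [[random.uniform(-2, 2) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # per-particle best positions
    pbest_f = [mse(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]    # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # Velocity update: inertia + cognitive + social terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            f = mse(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

coeffs, err = pso()
print(coeffs, err)
```

Because the model is linear in its coefficients, the error surface is convex and the swarm converges quickly; real ground-motion forms are typically nonlinear in some parameters, which is where PSO's derivative-free search pays off.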
The petroleum industry, one of the pillars of the national economy, has the potential to generate vast wealth and employment opportunities. The transportation of petroleum products is complicated and changeable because of the hazards caused by corrosion. Hazardous chemical leaks triggered by natural disasters may harm the environment, resulting in significant economic losses and posing a serious threat to sustainable development goals. As a result, determining the likelihood of leakage and the potential for environmental harm becomes a top priority for decision-makers as they develop maintenance plans. This study aims to provide an in-depth understanding of the risks associated with oil and gas pipelines
This paper proposes a new encryption method. It combines two cipher algorithms, DES and AES, to generate hybrid keys. This combination strengthens the proposed W-method by generating highly randomized keys. The reliability of any encryption technique rests on two points. The first is key generation; accordingly, our approach merges 64 bits of DES with 64 bits of AES to produce 128 bits as a root key for the remaining 15 keys. This complexity raises the level of the ciphering process, while each key derivation shifts the bits by only one position to the right. The second is the nature of the encryption process itself, which involves two keys and mixes one round of DES with one round of AES to reduce the processing time. The W-method deals with
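The key-schedule idea described above can be sketched as follows. This is a hedged illustration only: the function names, the bit layout of the merge, and the sample half-key values are assumptions, and the DES- and AES-derived halves are taken as given rather than computed from the real ciphers.

```python
# Sketch of the hybrid key schedule; layout and names are assumptions,
# not the paper's exact W-method specification.
def merge_root_key(des_half: int, aes_half: int) -> int:
    """Concatenate a 64-bit DES-derived half and a 64-bit AES-derived half
    into a single 128-bit root key (DES half in the high bits)."""
    assert des_half < 2 ** 64 and aes_half < 2 ** 64
    return (des_half << 64) | aes_half

def rotate_right_128(key: int, n: int = 1) -> int:
    """Circularly shift a 128-bit value right by n bits."""
    n %= 128
    return ((key >> n) | (key << (128 - n))) & ((1 << 128) - 1)

def derive_round_keys(root: int, count: int = 15):
    """Each subsequent key is the previous one rotated right by one bit."""
    keys, k = [], root
    for _ in range(count):
        k = rotate_right_128(k)
        keys.append(k)
    return keys

# Placeholder half-keys for illustration; real halves would come from
# the DES and AES key-generation stages.
root = merge_root_key(0x0123456789ABCDEF, 0xFEDCBA9876543210)
round_keys = derive_round_keys(root)
```

A circular shift (rather than a plain shift) keeps all 128 bits of entropy in every derived key, which is presumably why a rotation is used for the schedule.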
Non-uniform channelization is a crucial task in cognitive radio receivers for obtaining separate channels from the digitized wideband input signal at different intervals of time. The two main requirements in the channelizer are reconfigurability and low complexity. In this paper, a reconfigurable architecture based on a combination of the Improved Coefficient Decimation Method (ICDM) and the Coefficient Interpolation Method (CIM) is proposed. The proposed Hybrid Coefficient Decimation-Interpolation Method (HCDIM) based filter bank (FB) can realize the same number of channels as the ICDM, but with the maximum decimation factor divided by the interpolation factor (L), which leads to less deterioration in the stop band at
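The two coefficient operations that the HCDIM combines can be sketched on a prototype FIR filter's taps. This is a minimal illustration under assumptions: the toy tap values are placeholders for a designed low-pass prototype, and only the coefficient manipulations are shown, not the resulting frequency responses.

```python
# Sketch of coefficient decimation/interpolation on placeholder FIR taps.
def cdm_mask(h, d):
    """CDM-I: zero all but every d-th coefficient, yielding a multiband
    response with passbands at multiples of 2*pi/d."""
    return [c if i % d == 0 else 0.0 for i, c in enumerate(h)]

def cdm_decimate(h, d):
    """CDM-II: retain only every d-th coefficient, stretching the
    prototype's response (wider passband and transition band)."""
    return [c for i, c in enumerate(h) if i % d == 0]

def cim_interpolate(h, l):
    """CIM: insert l-1 zeros between coefficients, compressing the
    response (narrower, image-producing bands)."""
    out = []
    for c in h:
        out.append(c)
        out.extend([0.0] * (l - 1))
    return out[: len(out) - (l - 1)]  # drop trailing padding zeros

proto = [0.05, 0.12, 0.20, 0.26, 0.20, 0.12, 0.05]  # toy low-pass taps
```

The hybrid method's benefit follows from composing these: interpolating by L before decimating lets the bank reach a given channel count with an effective decimation factor reduced by L, which is what limits the stop-band degradation.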
Heart disease is a significant and impactful health condition and the leading cause of death in many countries. To aid physicians in diagnosing cardiovascular diseases, clinical datasets are available for reference. However, with the rise of big data and large medical datasets, it has become increasingly challenging for medical practitioners to accurately predict heart disease, because an abundance of unrelated and redundant features increases computational complexity and degrades accuracy. As such, this study aims to identify the most discriminative features within high-dimensional datasets while minimizing complexity and improving accuracy through an Extra Tree feature-selection-based technique. The study assesses the efficacy
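The selection step itself reduces to ranking features by an importance score and keeping the top few. A minimal sketch, assuming the scores are given: in practice they would come from a fitted Extra Trees (extremely randomized trees) model, and the values and feature count below are made up for illustration.

```python
# Hedged sketch: importance scores are hypothetical placeholders for what an
# Extra Trees classifier's feature importances would provide.
def select_top_k(importances, k):
    """Return the indices of the k most discriminative features,
    ordered from most to least important."""
    ranked = sorted(range(len(importances)),
                    key=lambda i: importances[i], reverse=True)
    return ranked[:k]

imp = [0.02, 0.31, 0.07, 0.25, 0.01, 0.18]  # assumed scores for 6 features
top = select_top_k(imp, 3)  # indices of the 3 strongest features
```

Training the downstream classifier on only these columns is what cuts both the computational cost and the noise from redundant attributes.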
Fuzzy logic is used to solve the load flow and contingency analysis problems, decreasing computing time and making it a better choice than the traditional methods. The proposed method is very accurate with outstanding computation time, which makes the fuzzy load flow (FLF) suitable for real-time application to small- as well as large-scale power systems. In addition, the FLF can efficiently solve the load flow problem of ill-conditioned power systems and perform contingency analysis. The FLF method using a Gaussian membership function requires fewer iterations and less computing time than the FLF method using a triangular membership function. Using a sparsity technique for the input Ybus sparse matrix data gi
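The two membership-function shapes being compared can be sketched directly. This is an illustration only: the centers, widths, and breakpoints below are arbitrary, not the tuning used in the FLF study.

```python
import math

# Sketch of the two membership-function shapes compared in the study;
# parameter values here are illustrative, not the paper's tuning.
def gaussian_mf(x, c, sigma):
    """Gaussian membership: exp(-(x - c)^2 / (2*sigma^2)),
    peaking at 1.0 when x == c."""
    return math.exp(-((x - c) ** 2) / (2.0 * sigma ** 2))

def triangular_mf(x, a, b, c):
    """Triangular membership with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)
```

The Gaussian curve is smooth everywhere and never reaches exactly zero, which gives the fuzzy solver non-vanishing gradations far from the center; the triangular curve is cheaper per evaluation but piecewise linear, a plausible reason the Gaussian variant converges in fewer iterations.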
The continuous advancement in the use of the IoT has greatly transformed industries, though at the same time it has made IoT networks vulnerable to highly advanced cybercrime. Traditional security measures for the IoT have several limitations; protecting distributed and adaptive IoT systems requires new approaches. This research presents a novel threat-intelligence approach for IoT networks based on deep learning that maintains compliance with IEEE standards. The goal of the study is to interweave artificial intelligence with standardization frameworks and thereby improve the identification, protection, and mitigation of cyber threats affecting IoT environments. The study is systematic and begins by examining IoT-specific threats