Cryptography algorithms play a critical role in information technology, defending against the various attacks witnessed in the digital era. Many studies and algorithms have been developed to address the security issues of information systems. Traditional cryptography algorithms are characterized by highly complex computational operations. Lightweight algorithms, on the other hand, are a way to solve most of the security issues that arise when applying traditional cryptography in constrained devices. In particular, symmetric ciphers are widely applied for ensuring the security of data communication in constrained devices. In this study, we propose a hybrid algorithm based on two cryptography algorithms, PRESENT and Salsa20. Also, a 2D logistic map of a chaotic system is a
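The abstract is cut off before describing how the chaotic map feeds the hybrid cipher. As a minimal sketch only, the idea of deriving a keystream from chaotic iterates can be illustrated with the classic 1D logistic map (the paper uses a 2D variant and combines it with PRESENT/Salsa20; the function names and parameters below are hypothetical):

```python
# Minimal sketch of a chaotic keystream generator, in the spirit of the
# hybrid cipher described above. The study pairs PRESENT/Salsa20 with a
# 2D logistic map; for brevity this sketch uses the classic 1D logistic
# map x_{n+1} = r * x_n * (1 - x_n) with illustrative parameters.

def logistic_keystream(x0: float, r: float, n: int) -> bytes:
    """Derive n pseudo-random bytes from logistic-map iterates."""
    assert 0.0 < x0 < 1.0
    x = x0
    out = bytearray()
    for _ in range(n):
        x = r * x * (1.0 - x)           # one chaotic iteration
        out.append(int(x * 256) % 256)  # quantize the iterate to a byte
    return bytes(out)

def xor_cipher(data: bytes, x0: float = 0.3, r: float = 3.99) -> bytes:
    """XOR data with the chaotic keystream (same call encrypts and decrypts)."""
    ks = logistic_keystream(x0, r, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

ct = xor_cipher(b"constrained-device payload")
pt = xor_cipher(ct)  # same (x0, r) acts as the shared key
```

The sensitivity of the map to `x0` and `r` is what makes the keystream key-dependent; a real design would, as in the study, combine this with an established cipher rather than rely on the map alone.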
Air-conditioning systems (ACs) are essential in hot and humid climates to ensure acceptable ambient air quality as well as thermal comfort for building users. It is essential to improve refrigeration system performance without increasing global warming potential (GWP) or ozone depletion potential (ODP). The main objective of this study is to evaluate the performance of an air-conditioning system that operates with a liquid suction heat exchanger (LSHX), implementing refrigerants with zero ODP and low GWP (i.e., R134a and R1234yf). A liquid suction heat exchanger (LSHX) was added to an automobile air-conditioning system (AACS). When the LSHX was added to the cycle, preliminary results indicated t
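To make the LSHX mechanism concrete: the exchanger moves heat from the liquid line to the suction line, subcooling the refrigerant before the expansion device and thereby raising the refrigerating effect per unit mass. A hedged back-of-the-envelope sketch, using hypothetical enthalpy values (kJ/kg) that are not taken from the study:

```python
# Hedged illustration of the LSHX effect on COP. All enthalpies below
# are hypothetical round numbers, NOT state points from the study.

def cop(q_evap: float, w_comp: float) -> float:
    """COP = refrigerating effect / compressor work (per unit mass)."""
    return q_evap / w_comp

# Baseline cycle (hypothetical enthalpies, kJ/kg).
h_evap_out, h_evap_in, w_comp = 400.0, 250.0, 40.0
cop_base = cop(h_evap_out - h_evap_in, w_comp)

# The LSHX transfers dh from the liquid line to the suction line: the
# evaporator inlet is subcooled by dh, raising the refrigerating effect.
# Compressor work per unit mass is assumed unchanged for simplicity.
dh = 10.0
cop_lshx = cop(h_evap_out - (h_evap_in - dh), w_comp)
```

In practice the suction-gas superheating also shifts the compressor work, so whether the LSHX pays off depends on the refrigerant, which is precisely what the study's R134a/R1234yf comparison examines.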
New, simple, and sensitive batch and flow-injection spectrophotometric methods for the determination of Thymol in pure form and in mouthwash preparations have been proposed in this study. These methods are based on a diazotization and coupling reaction between Thymol and diazotized procaine HCl in alkaline medium to form an intense orange-red, water-soluble dye that is stable and has a maximum absorption at 474 nm. Graphs of absorbance versus concentration show that Beer's law is obeyed over the concentration ranges of 0.4-4.8 and 4-80 µg.ml-1 of Thymol, with detection limits of 0.072 and 1.807 µg.ml-1 of Thymol for the batch and FIA methods, respectively. The FIA procedure's sample throughput was 80 h-1. All different chemical and physical e
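The Beer's-law step above (absorbance linear in concentration, A = εbc) can be sketched as an ordinary least-squares calibration. The readings below are hypothetical placeholders, not data from the study:

```python
# Hedged sketch: Beer's-law calibration (A = eps * b * c) fitted by
# least squares over the linear range. Absorbance values are invented
# for the demo, NOT measurements from the study above.

def linear_fit(xs, ys):
    """Return (slope, intercept) of the least-squares line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical calibration: concentration (ug/ml) vs absorbance at 474 nm.
conc = [0.4, 1.2, 2.4, 3.6, 4.8]
absb = [0.05, 0.15, 0.30, 0.45, 0.60]  # perfectly linear for the demo
slope, intercept = linear_fit(conc, absb)

def concentration(a: float) -> float:
    """Invert the calibration line to read concentration from absorbance."""
    return (a - intercept) / slope
```

A real calibration would additionally report the correlation coefficient and use the residual scatter to derive the detection limits quoted above.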
The electrocardiogram (ECG) is the recording of the heart's electrical potential versus time. The analysis of ECG signals has been widely used in cardiac pathology to detect heart disease. ECGs are non-stationary signals that are often contaminated by different types of noise from different sources. In this study, simulated noise models were proposed for power-line interference (PLI), electromyogram (EMG) noise, baseline wander (BW), white Gaussian noise (WGN), and composite noise. Various processing techniques have recently been proposed for suppressing noise and extracting the efficient morphology of an ECG signal. In this paper, the wavelet transform (WT) is applied to noisy ECG signals. The graphical user interface (GUI)
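The simulated noise models named above are conventionally built from simple generators: a mains-frequency sinusoid for PLI, a slow sinusoid for baseline wander, Gaussian samples for WGN, and their sum for composite noise. A minimal sketch with illustrative amplitudes (the study's actual parameter values are not given in the abstract):

```python
import math
import random

# Hedged sketch of the simulated ECG noise models listed above.
# Amplitudes and frequencies are illustrative, not the study's values.
FS = 360.0  # sampling rate in Hz, a common choice for ECG recordings

def pli(n, amp=0.1, f=50.0):
    """Power-line interference: a mains-frequency (50 Hz) sinusoid."""
    return [amp * math.sin(2 * math.pi * f * i / FS) for i in range(n)]

def baseline_wander(n, amp=0.2, f=0.3):
    """Slow sinusoidal drift approximating respiration-driven BW."""
    return [amp * math.sin(2 * math.pi * f * i / FS) for i in range(n)]

def wgn(n, sigma=0.05, seed=0):
    """White Gaussian noise with standard deviation sigma."""
    rng = random.Random(seed)
    return [rng.gauss(0.0, sigma) for _ in range(n)]

def composite(n):
    """Composite noise: sample-wise sum of the individual models."""
    return [a + b + c
            for a, b, c in zip(pli(n), baseline_wander(n), wgn(n))]
```

Adding `composite(n)` to a clean ECG trace yields the contaminated test signal to which the wavelet-transform denoising is then applied.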
This article showcases the development and use of a side-polished fiber optic sensor that can identify changes in the refractive index of a glucose solution by exploiting the surface plasmon resonance (SPR) effect. The aim was to enhance efficiency by depositing a 50 nm-thick layer of gold on the D-shaped fiber sensing area. The detector was fabricated from a silica optical fiber (SOF), which underwent a cladding-stripping process that produced three distinct lengths, followed by a polishing step to remove a portion of the fiber diameter and produce a D-shaped cross-section. During experimentation with the glucose solution, the side-polished fiber optic sensor revealed an adept detection
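The sensing principle can be sketched from the standard SPR coupling condition, sin(θ_spr) = √(ε_m·n_s² / (ε_m + n_s²)) / n_p: a higher sample index shifts the resonance. The permittivity and coupling-glass index below are typical textbook values, not the study's parameters:

```python
import math

# Hedged sketch of the SPR resonance condition,
#   sin(theta_spr) = sqrt(eps_m * n_s**2 / (eps_m + n_s**2)) / n_p,
# with illustrative values: eps_m ~ -11.6 (gold near 633 nm) and
# n_p = 1.515 (typical coupling glass). NOT the study's parameters.

def spr_angle_deg(n_sample, eps_metal=-11.6, n_prism=1.515):
    """Angle (degrees) at which light couples to the surface plasmon."""
    k = math.sqrt(eps_metal * n_sample ** 2 / (eps_metal + n_sample ** 2))
    return math.degrees(math.asin(k / n_prism))

theta_water = spr_angle_deg(1.333)    # plain water
theta_glucose = spr_angle_deg(1.360)  # glucose solution (higher index)
# The resonance moves to a larger angle as the index rises, which is
# the shift the side-polished fiber sensor reads out.
```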
This paper presents a study of the effect of random phase and of expanding the scale sampling factors on improving monochrome image holograms, and compares the result with previously produced ones. Matlab software is used to synthesize and reconstruct the holograms.
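The random-phase step mentioned above is conventionally applied before hologram synthesis: each object sample is multiplied by exp(iφ) with φ uniform in [0, 2π), which spreads the object's spectrum without altering its amplitude. A minimal sketch (the function name is hypothetical; the study itself works in Matlab):

```python
import cmath
import random

# Hedged sketch of the random-phase technique: attach a uniform random
# phase to each sample of a complex object field before computing the
# hologram. Magnitudes are preserved; only the phases change.

def apply_random_phase(field, seed=0):
    """Multiply each complex sample by exp(i*phi), phi ~ U[0, 2*pi)."""
    rng = random.Random(seed)
    return [v * cmath.exp(1j * rng.uniform(0.0, 2.0 * cmath.pi))
            for v in field]
```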
Feature selection (FS) constitutes a series of processes used to decide which relevant features/attributes to include and which irrelevant features to exclude for predictive modeling. It is a crucial task that aids machine learning classifiers in reducing error rates, computation time, and overfitting, and in improving classification accuracy. It has demonstrated its efficacy in myriad domains, ranging from text classification (TC) and text mining to image recognition. While there are many traditional FS methods, recent research efforts have been devoted to applying metaheuristic algorithms as FS techniques for the TC task. However, there are few literature reviews concerning TC. Therefore, a comprehensive overview was systematicall
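Among the traditional FS methods the survey contrasts with metaheuristics, greedy forward selection is a simple wrapper baseline: repeatedly add the single feature that most improves a classifier's score. A toy sketch (the scoring function and data are illustrative, not drawn from the survey):

```python
# Hedged sketch of wrapper-style forward feature selection on toy data.

def score(subset, X, y):
    """Leave-one-out 1-NN accuracy restricted to the chosen features."""
    if not subset:
        return 0.0
    correct = 0
    for i, (xi, yi) in enumerate(zip(X, y)):
        best, pred = None, None
        for j, (xj, yj) in enumerate(zip(X, y)):
            if i == j:
                continue
            d = sum((xi[f] - xj[f]) ** 2 for f in subset)
            if best is None or d < best:
                best, pred = d, yj
        correct += pred == yi
    return correct / len(X)

def forward_select(X, y, n_features):
    """Greedily add the feature that most improves the score."""
    chosen = []
    while True:
        candidates = [f for f in range(n_features) if f not in chosen]
        if not candidates:
            return chosen
        best_gain, best_f = max((score(chosen + [f], X, y), f)
                                for f in candidates)
        if best_gain <= score(chosen, X, y):
            return chosen  # no candidate improves the score
        chosen.append(best_f)

# Toy data: only feature 0 separates the classes; 1 and 2 are noise.
X = [(0.0, 5.0, 1.0), (0.1, 1.0, 9.0), (0.9, 5.1, 1.2), (1.0, 0.9, 8.8)]
y = [0, 0, 1, 1]
```

Metaheuristic FS methods replace this greedy loop with population-based search (e.g., genetic algorithms or swarm optimizers) over feature subsets, which scales better to the high-dimensional spaces typical of TC.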