Developing a lightweight cryptographic algorithm based on DNA computing

This work aims to develop a secure lightweight cipher algorithm for constrained devices. Secure communication among constrained devices is a critical issue during data transmission from client to server devices. Lightweight cipher algorithms are a security solution for constrained devices because they require little computation and small memory. However, most lightweight algorithms suffer from a trade-off between complexity and speed when producing a robust cipher. The PRESENT cipher has proven successful as a lightweight cryptographic algorithm, outperforming other ciphers because its computation requires only low-complexity operations. Its mathematical model is simple, and its operations need only a short execution time to encrypt and decrypt sensed data. Hence, a developed algorithm called DPRESENT was introduced, combining the PRESENT algorithm with a DNA cryptography technique to increase the complexity of the cipher text while keeping the algorithm lightweight. The NIST test suite showed that the proposed algorithm achieves a high level of randomness and complexity, while its execution time remains as low as that of the original cipher. The developed algorithm represents a new direction that can be applied to different lightweight cryptosystems to balance complexity and speed in a robust cipher.
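The abstract does not spell out the DNA step, so the Python sketch below only illustrates one widely used DNA-encoding convention (two cipher-text bits per nucleotide); the mapping table and its pairing with PRESENT are assumptions for illustration, not the DPRESENT specification.

    # Illustrative sketch only: a common DNA-encoding step used in DNA
    # cryptography, mapping every 2 bits of cipher text to one nucleotide.
    # The mapping table and its combination with PRESENT are assumptions,
    # not the DPRESENT specification.
    BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
    BASE_TO_BITS = {v: k for k, v in BITS_TO_BASE.items()}

    def bytes_to_dna(data: bytes) -> str:
        """Encode a byte string as a DNA sequence (2 bits per base)."""
        bits = "".join(f"{byte:08b}" for byte in data)
        return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

    def dna_to_bytes(seq: str) -> bytes:
        """Invert bytes_to_dna."""
        bits = "".join(BASE_TO_BITS[base] for base in seq)
        return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

    # Example: a 64-bit PRESENT-sized cipher-text block rendered as DNA bases.
    block = bytes.fromhex("0123456789abcdef")
    dna = bytes_to_dna(block)          # 'AAACAGAT...' (32 bases)
    assert dna_to_bytes(dna) == block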

Publication Date
Thu May 10 2018
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
New Steganography System Based on Huffman Coding and Fibonacci Decomposition

Hiding secret information in an image is a challenging and painstaking task in computer security and steganography systems; indeed, the sheer intricacy of attacks on security systems makes the field all the more compelling. In this steganography system for information hiding, Huffman coding is used to compress the secret message before embedding, which provides high capacity and some security. Fibonacci decomposition is used to represent the pixels of the cover image, which increases the robustness of the system. One byte is used to map all the pixel properties, which raises the PSNR of the system thanks to the random distribution of the embedded bits. Finally, three kinds of evaluation are applied, such as PSNR, the chi-square attack, and …
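As a rough illustration of the Fibonacci side of the scheme, the Python sketch below performs a Zeckendorf (Fibonacci) decomposition of an 8-bit pixel value; the 12-plane representation and greedy rule follow common practice and are only assumed here, not taken from the paper.

    # Minimal sketch: Zeckendorf (Fibonacci) decomposition of an 8-bit pixel
    # value into a 12-"bitplane" Fibonacci representation. The exact plane
    # count and embedding rule used by the paper are not reproduced here.
    FIBS = [1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233]   # covers 0..255

    def pixel_to_fibonacci(value: int) -> list[int]:
        """Greedy Zeckendorf decomposition: no two consecutive 1s."""
        planes = [0] * len(FIBS)
        for i in range(len(FIBS) - 1, -1, -1):
            if FIBS[i] <= value:
                planes[i] = 1
                value -= FIBS[i]
        return planes            # least significant plane first

    def fibonacci_to_pixel(planes: list[int]) -> int:
        return sum(f * b for f, b in zip(FIBS, planes))

    p = pixel_to_fibonacci(200)          # 200 = 144 + 55 + 1
    assert fibonacci_to_pixel(p) == 200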

Publication Date
Tue Sep 01 2020
Journal Name
Al-khwarizmi Engineering Journal
UAV Control Based on Dual LQR and Fuzzy-PID Controller

This paper presents the design of a longitudinal controller for an autonomous unmanned aerial vehicle (UAV). A dual-loop (inner-outer loop) control scheme based on an intelligent algorithm is proposed. The inner feedback-loop controller is a Linear Quadratic Regulator (LQR) that provides robust (adaptive) stability, while the outer-loop controller is based on a Fuzzy-PID (Proportional, Integral, and Derivative) algorithm that provides reference-signal tracking. The proposed dual controller regulates the position (altitude) and velocity (airspeed) of the aircraft. An adaptive Unscented Kalman Filter (AUKF) is employed to track the reference signal and reduce the Gaussian noise. The mathematical model of the aircraft …
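A minimal Python sketch of the inner-loop LQR design is given below; the state-space matrices and weights are toy placeholders, not the paper's aircraft model.

    # Sketch of the inner-loop LQR design only; A, B, Q, R below are toy
    # placeholders, not the paper's aircraft model or weights.
    import numpy as np
    from scipy.linalg import solve_continuous_are

    A = np.array([[0.0, 1.0],
                  [-2.0, -3.0]])     # hypothetical longitudinal dynamics
    B = np.array([[0.0],
                  [1.0]])
    Q = np.diag([10.0, 1.0])         # state weighting (altitude, rate)
    R = np.array([[1.0]])            # control-effort weighting

    # Solve the continuous-time algebraic Riccati equation and form K.
    P = solve_continuous_are(A, B, Q, R)
    K = np.linalg.inv(R) @ B.T @ P   # optimal state-feedback gain, u = -K x

    # The outer loop (Fuzzy-PID in the paper) would then shape the
    # reference fed to this inner LQR loop.
    x = np.array([[1.0], [0.0]])     # example state error
    u = -K @ x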

Publication Date
Tue Mar 12 2019
Journal Name
Al-khwarizmi Engineering Journal
Optimization Drilling Parameters of Aluminum Alloy Based on Taguchi Method

This paper focuses on the optimization of drilling parameters using the Taguchi method to obtain the minimum surface roughness. Nine drilling experiments were performed on Al 5050 alloy using high-speed steel twist drills. Three drilling parameters (feed rate, cutting speed, and cutting tool) were used as control factors, and an L9 (3^3) orthogonal array was specified for the experimental trials. The signal-to-noise (S/N) ratio and analysis of variance (ANOVA) were used to determine the optimum control factors that minimize surface roughness. The results were analyzed with the statistical software package MINITAB-17. After the experimental trials, the tool diameter was found to be the most important factor …
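For reference, the smaller-is-better S/N ratio typically used when minimizing surface roughness can be computed as in the Python sketch below; the roughness replicates are invented for illustration, not taken from the paper's L9 experiments.

    # Smaller-is-better signal-to-noise ratio, the criterion normally used
    # when minimizing surface roughness; the Ra values here are invented
    # for illustration, not the paper's measurements.
    import math

    def sn_smaller_is_better(values):
        """S/N = -10 * log10(mean(y^2)) over replicate measurements."""
        mean_sq = sum(y * y for y in values) / len(values)
        return -10.0 * math.log10(mean_sq)

    # One hypothetical trial with three roughness replicates (micrometres).
    ra_replicates = [1.82, 1.75, 1.90]
    print(f"S/N ratio: {sn_smaller_is_better(ra_replicates):.3f} dB")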

Publication Date
Sun Jan 01 2017
Journal Name
Ieee Access
Low-Distortion MMSE Speech Enhancement Estimator Based on Laplacian Prior

Publication Date
Fri Sep 03 2021
Journal Name
Entropy
Reliable Recurrence Algorithm for High-Order Krawtchouk Polynomials

Krawtchouk polynomials (KPs) and their moments are promising tools for information theory, coding theory, and signal processing, owing to the special capabilities of KPs in feature extraction and classification. The main challenge in existing KP recurrence algorithms is numerical error, which arises when computing the coefficients for large polynomial sizes, particularly when the KP parameter (p) deviates from 0.5 toward 0 or 1. To this end, this paper proposes a new recurrence relation for computing the coefficients of high-order KPs. In particular, the paper develops a new algorithm and presents a new mathematical model for computing the …
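For context, the Python sketch below evaluates the classical three-term recurrence in the order n; it is shown only to illustrate the computation that becomes unstable at high orders and for p far from 0.5, and it is not the new recurrence proposed in the paper.

    # Classical three-term recurrence in the order n for Krawtchouk
    # polynomials K_n(x; p, N); this is the textbook relation, shown here
    # to illustrate the kind of computation the paper stabilizes -- it is
    # NOT the new recurrence proposed by the authors.
    def krawtchouk(n: int, x: float, p: float, N: int) -> float:
        """Evaluate K_n(x; p, N) with K_0 = 1 and K_1 = 1 - x/(pN)."""
        if n == 0:
            return 1.0
        k_prev, k_curr = 1.0, 1.0 - x / (p * N)
        for m in range(1, n):
            # p(N-m) K_{m+1} = [p(N-m) + m(1-p) - x] K_m - m(1-p) K_{m-1}
            k_next = ((p * (N - m) + m * (1.0 - p) - x) * k_curr
                      - m * (1.0 - p) * k_prev) / (p * (N - m))
            k_prev, k_curr = k_curr, k_next
        return k_curr

    # For large N and p far from 0.5 this loop loses precision -- the
    # motivation for the paper's improved recurrence.
    print(krawtchouk(3, 2.0, 0.5, 8))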

Publication Date
Sun Dec 01 2002
Journal Name
Iraqi Journal Of Physics
An edge detection algorithm matching visual contour perception

For several applications, it is very important to have an edge detection technique that matches human visual contour perception and is less sensitive to noise. The edge detection algorithm described in this paper is based on the results obtained by Maximum a Posteriori (MAP) and Maximum Entropy (ME) deblurring algorithms. The technique makes a trade-off between sharpening and smoothing the noisy image. One advantage of the described algorithm is that it is less sensitive to noise than the Marr and Geuen techniques, which are considered among the best edge detection algorithms in terms of matching human visual contour perception.
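As a point of reference only, the Python sketch below implements a Marr-style Laplacian-of-Gaussian zero-crossing detector, i.e. the kind of baseline the described algorithm is compared against; the MAP/ME-based detector itself is not reproduced here.

    # Marr-style baseline: zero crossings of the Laplacian-of-Gaussian
    # response. Shown only as a reference point for the comparison above.
    import numpy as np
    from scipy import ndimage

    def log_edges(image: np.ndarray, sigma: float = 2.0) -> np.ndarray:
        """Return a boolean edge map from zero crossings of the LoG response."""
        log = ndimage.gaussian_laplace(image.astype(float), sigma=sigma)
        signs = np.sign(log)
        # A zero crossing exists where the sign changes between neighbours.
        cross_h = signs[:, :-1] * signs[:, 1:] < 0
        cross_v = signs[:-1, :] * signs[1:, :] < 0
        edges = np.zeros(image.shape, dtype=bool)
        edges[:, :-1] |= cross_h
        edges[:-1, :] |= cross_v
        return edges

    # Example on a synthetic step image.
    img = np.zeros((64, 64))
    img[:, 32:] = 255.0
    print(log_edges(img, sigma=2.0).sum(), "edge pixels")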

Publication Date
Sun Jun 20 2021
Journal Name
Baghdad Science Journal
Multifactor Algorithm for Test Case Selection and Ordering

Regression testing is expensive and therefore calls for optimization. Typically, test-case optimization means selecting a reduced subset of test cases or prioritizing them so that potential faults are detected at an earlier phase. Many earlier studies relied on heuristic mechanisms to attain optimality while reducing or prioritizing test cases; nevertheless, those studies lacked systematic procedures for handling tied test cases. Moreover, evolutionary algorithms such as the genetic algorithm often help to reduce test cases while simultaneously decreasing computational runtime. However, when the fault detection capacity must be examined alongside other parameters, the method falls short …
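To make the selection and ordering problem concrete, the Python sketch below shows a generic greedy additional-coverage prioritization with an explicit tie-break on cost; it is only an illustrative baseline, not the paper's multifactor algorithm.

    # Minimal greedy "additional coverage" prioritization with an explicit
    # tie-break rule, to illustrate the selection/ordering problem; this is
    # a generic sketch, not the paper's multifactor algorithm.
    def prioritize(test_cases: dict[str, set[str]], cost: dict[str, float]) -> list[str]:
        """Order tests by uncovered requirements gained, breaking ties by lower cost."""
        remaining = dict(test_cases)
        uncovered = set().union(*test_cases.values())
        order = []
        while remaining:
            # Pick the test covering the most still-uncovered requirements;
            # on a tie, prefer the cheaper test (a simple tie-handling rule).
            best = max(remaining,
                       key=lambda t: (len(remaining[t] & uncovered), -cost[t]))
            order.append(best)
            uncovered -= remaining.pop(best)
        return order

    tests = {"T1": {"r1", "r2"}, "T2": {"r2", "r3"}, "T3": {"r3"}}
    costs = {"T1": 1.0, "T2": 2.0, "T3": 0.5}
    print(prioritize(tests, costs))    # ['T1', 'T3', 'T2']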

Publication Date
Fri Jan 01 2016
Journal Name
Engineering And Technology Journal
Face Retrieval Using Image Moments and Genetic Algorithm

Publication Date
Fri Apr 20 2012
Journal Name
International Journal Of Computer And Information Engineering
An Optimal Algorithm for HTML Page Building Process


Publication Date
Mon Aug 01 2022
Journal Name
Bulletin Of Electrical Engineering And Informatics
Solid waste recycling and management cost optimization algorithm

Solid waste is a major issue in today's world and can contribute to pollution and the spread of vector-borne diseases. Because of its complicated nonlinear processes, this problem is difficult to model and optimize using traditional methods. In this study, a mathematical model was developed to optimize the cost of solid waste recycling and management. In the optimization phase, the salp swarm algorithm (SSA) is used to determine the levels of discarded and reclaimed solid waste. SSA is a recent technique for finding the optimal solution of a mathematical relationship based on leaders and followers. It starts from a large number of random solutions, together with their outward or inward fluctuations, …
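The Python sketch below shows the standard SSA leader/follower updates on a stand-in objective; the sphere function and parameter choices are assumptions for illustration, not the paper's waste-management cost model.

    # Standard salp swarm algorithm update rules (leader/follower chain);
    # the sphere function below is a stand-in objective, not the paper's
    # waste-management cost model.
    import numpy as np

    def ssa(cost, dim, lb, ub, n_salps=30, n_iter=200, seed=0):
        rng = np.random.default_rng(seed)
        X = rng.uniform(lb, ub, size=(n_salps, dim))      # random initial salps
        food = min(X, key=cost).copy()                    # best solution so far
        for l in range(1, n_iter + 1):
            c1 = 2.0 * np.exp(-(4.0 * l / n_iter) ** 2)   # exploration/exploitation balance
            for i in range(n_salps):
                if i == 0:                                # leader moves around the food source
                    c2 = rng.uniform(0, 1, dim)
                    c3 = rng.uniform(0, 1, dim)
                    step = c1 * ((ub - lb) * c2 + lb)
                    X[i] = np.where(c3 < 0.5, food + step, food - step)
                else:                                     # followers track the salp ahead
                    X[i] = 0.5 * (X[i] + X[i - 1])
                X[i] = np.clip(X[i], lb, ub)
                if cost(X[i]) < cost(food):
                    food = X[i].copy()
        return food, cost(food)

    best, best_cost = ssa(lambda x: float(np.sum(x ** 2)), dim=5, lb=-10.0, ub=10.0)
    print(best_cost)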
