This work aims to develop a secure lightweight cipher algorithm for constrained devices. Secure communication among constrained devices is a critical issue during data transmission from client to server devices. Lightweight cipher algorithms offer a secure solution for constrained devices, which have limited computational capacity and small memory. However, most lightweight algorithms suffer from a trade-off between complexity and speed when trying to produce a robust cipher. The PRESENT cipher has been studied extensively as a lightweight cryptographic algorithm and surpasses other ciphers in that its computational processing requires only low-complexity operations. Its mathematical model is simple, and its operations require little execution time for encrypting and decrypting sensed data. Hence, a new algorithm called DPRESENT is introduced, which combines the PRESENT algorithm with a DNA cryptography technique to improve ciphertext complexity while remaining lightweight. NIST suite tests showed that the proposed algorithm achieves a high level of randomness and complexity, while its execution time is kept as low as that of the existing cipher. The developed algorithm represents a new direction that can be applied to different lightweight cryptosystems to achieve the trade-off between complexity and speed in a robust cipher.
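To make the DNA step concrete, the sketch below shows one common DNA-encoding stage in which each 2-bit pair of a ciphertext block is mapped to a nucleotide; the exact DPRESENT construction is not detailed in the abstract, so the base alphabet and its pairing with PRESENT output are illustrative assumptions.

```python
# Illustrative DNA-encoding stage (assumed mapping, not the paper's
# exact DPRESENT construction): each 2-bit pair becomes a nucleotide.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def dna_encode(data: bytes) -> str:
    """Encode a byte string as a DNA strand, 2 bits per base."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def dna_decode(strand: str) -> bytes:
    """Invert dna_encode."""
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

# A 64-bit PRESENT ciphertext block becomes a 32-base strand.
block = bytes.fromhex("0123456789abcdef")
strand = dna_encode(block)
assert dna_decode(strand) == block
```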
Hiding secret information in an image is a challenging and painstaking task in computer security and steganography. Indeed, the sheer intricacy of attacks on security systems makes the problem all the more compelling. In this research on an information-hiding steganography system, Huffman coding is used to compress the secret message before embedding, which provides high capacity and some security. Fibonacci decomposition is used to represent the pixels of the cover image, which increases the robustness of the system. One byte is used to map all the pixel properties; this raises the PSNR of the system because of the random distribution of the embedded bits. Finally, three kinds of evaluation are applied, such as PSNR, the chi-square attack, a ...
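As a minimal sketch of the Fibonacci-decomposition step, the code below computes the Zeckendorf representation of an 8-bit pixel value, i.e., its unique expansion over non-consecutive Fibonacci numbers; how the scheme then selects a bit-plane for embedding is not specified in the abstract, so only the decomposition itself is shown.

```python
# Fibonacci (Zeckendorf) decomposition of an 8-bit pixel value: every
# integer is a unique sum of non-consecutive Fibonacci numbers, giving
# "Fibonacci bit-planes" in place of the usual binary ones.
FIBS = [1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233]  # enough for 0..255

def zeckendorf(value: int) -> list[int]:
    """Return Fibonacci digits (most significant first); the greedy
    choice guarantees no two consecutive ones."""
    digits = []
    for f in reversed(FIBS):
        if f <= value:
            digits.append(1)
            value -= f
        else:
            digits.append(0)
    return digits

print(zeckendorf(200))  # 200 = 144 + 55 + 1
```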
This paper presents the design of a longitudinal controller for an autonomous unmanned aerial vehicle (UAV). It proposes a dual-loop (inner-outer loop) control scheme based on an intelligent algorithm. The inner feedback-loop controller is a Linear Quadratic Regulator (LQR) that provides robust (adaptive) stability, while the outer-loop controller is based on a Fuzzy-PID (Proportional, Integral, and Derivative) algorithm that provides reference-signal tracking. The proposed dual controller regulates the position (altitude) and velocity (airspeed) of the aircraft. An adaptive Unscented Kalman Filter (AUKF) is employed to track the reference signal and reduce the Gaussian noise. The mathematical model of the aircraft ...
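For the inner loop, a minimal LQR sketch is shown below using an assumed two-state longitudinal model (altitude error and climb rate) with assumed weights; the paper's actual aircraft model, Fuzzy-PID outer loop, and AUKF are not reproduced here.

```python
# Inner-loop LQR sketch on an assumed toy longitudinal model
# (states: altitude error, climb rate); not the paper's aircraft model.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0],
              [0.0, -0.5]])   # assumed dynamics
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])      # penalize altitude error most (assumed weights)
R = np.array([[1.0]])

P = solve_continuous_are(A, B, Q, R)   # Riccati: A'P + PA - PB R^-1 B'P + Q = 0
K = np.linalg.inv(R) @ B.T @ P         # state-feedback gain, u = -K x

x = np.array([5.0, 0.0])               # 5 m altitude error, zero climb rate
u = -K @ x                             # control action driving the error to zero
print(K, u)
```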
This paper focuses on the optimization of drilling parameters using the Taguchi method to obtain minimum surface roughness. Nine drilling experiments were performed on Al 5050 alloy using high-speed steel twist drills. Three drilling parameters (feed rate, cutting speed, and cutting tool) were used as control factors, and an L9 (3³) orthogonal array was specified for the experimental trials. The signal-to-noise (S/N) ratio and analysis of variance (ANOVA) were utilized to determine the optimum control factors that minimize surface roughness. The results were analysed with the aid of the statistical software package MINITAB-17. After the experimental trials, the tool diameter was found to be the most important factor ...
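Since surface roughness should be minimized, the relevant Taguchi criterion is the "smaller-the-better" S/N ratio; the sketch below computes it for one trial's replicates, using hypothetical roughness values rather than the paper's measurements.

```python
# Taguchi "smaller-the-better" signal-to-noise ratio:
#   S/N = -10 * log10( (1/n) * sum(y_i^2) )
# Higher S/N means lower, more consistent roughness.
import math

def sn_smaller_is_better(values):
    return -10.0 * math.log10(sum(v * v for v in values) / len(values))

trial_ra = [1.82, 1.75, 1.90]   # hypothetical Ra replicates (um) for one L9 trial
print(f"S/N = {sn_smaller_is_better(trial_ra):.2f} dB")
```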
Krawtchouk polynomials (KPs) and their moments are promising techniques for applications in information theory, coding theory, and signal processing, owing to the special capabilities of KPs in feature-extraction and classification processes. The main challenge in existing KP recurrence algorithms is numerical error, which occurs during the computation of the coefficients at large polynomial sizes, particularly when the KP parameter (p) deviates from 0.5 toward 0 or 1. To this end, this paper proposes a new recurrence relation for computing the coefficients of KPs at high orders. In particular, the paper discusses the development of a new algorithm and presents a new mathematical model for computing the ...
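The paper's new recurrence is not reproduced in this excerpt; for reference, the sketch below implements the classical three-term recurrence for (unnormalized) Krawtchouk polynomials, which is precisely where round-off error accumulates at high orders and extreme values of p.

```python
# Classical three-term recurrence for Krawtchouk polynomials K_n(x; p, N):
#   p(N-n) K_{n+1} = (p(N-n) + n(1-p) - x) K_n - n(1-p) K_{n-1},
# with K_0 = 1 and K_1 = 1 - x/(pN). (Not the paper's new relation.)
def krawtchouk(n_max: int, x: float, p: float, N: int) -> list[float]:
    K = [1.0, 1.0 - x / (p * N)]
    for n in range(1, n_max):
        a = p * (N - n)
        b = a + n * (1.0 - p) - x
        c = n * (1.0 - p)
        K.append((b * K[n] - c * K[n - 1]) / a)
    return K[:n_max + 1]

print(krawtchouk(4, x=3.0, p=0.5, N=8))  # K_0..K_4; errors grow as p -> 0 or 1
```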
For several applications, it is very important to have an edge-detection technique that matches human visual contour perception and is less sensitive to noise. The edge-detection algorithm described in this paper is based on the results obtained by Maximum a Posteriori (MAP) and Maximum Entropy (ME) deblurring algorithms. The technique makes a trade-off between sharpening and smoothing the noisy image. One advantage of the described algorithm is that it is less sensitive to noise than the Marr and Geuen techniques, which are considered among the best edge-detection algorithms in terms of matching human visual contour perception.
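The MAP/ME deblurring stage is not reproduced here; as a rough illustration of the sharpening-smoothing trade-off the abstract refers to, the sketch below detects edges as the gradient magnitude of a Gaussian-smoothed image, where the smoothing scale sigma trades noise suppression against contour sharpness.

```python
# Illustration of the sharpen-vs-smooth trade-off only (the MAP/ME
# deblurring stage is not shown): larger sigma suppresses noise but
# blurs contours; smaller sigma keeps contours sharp but admits noise.
import numpy as np
from scipy import ndimage

def edges(image: np.ndarray, sigma: float) -> np.ndarray:
    """Gradient magnitude of a Gaussian-smoothed image."""
    smoothed = ndimage.gaussian_filter(image.astype(float), sigma)
    gy, gx = np.gradient(smoothed)
    return np.hypot(gx, gy)

step = np.zeros((64, 64)); step[:, 32:] = 1.0          # synthetic vertical edge
noisy = step + np.random.default_rng(0).normal(0, 0.2, step.shape)
print(edges(noisy, 0.5).max(), edges(noisy, 3.0).max())
```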
Regression testing, being expensive, calls for optimization. Typically, optimizing test cases means selecting a reduced subset of test cases or prioritizing them so that potential faults are detected at an earlier phase. Many former studies proposed heuristic-dependent mechanisms to attain optimality while reducing or prioritizing test cases. Nevertheless, those studies lacked systematic procedures for managing the tied-test-cases issue. Moreover, evolutionary algorithms such as the genetic process often help in reducing test cases, together with a concurrent decrease in computational runtime. However, when the fault-detection capacity must be examined along with other parameters, the method falls short ...
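One simple, concrete way to handle the tie problem the abstract highlights is to attach an explicit tie-breaking key to a greedy coverage-based prioritization, as sketched below; the coverage sets and runtimes are hypothetical, and this greedy scheme is an illustration rather than the study's method.

```python
# Greedy coverage-based test prioritization with an explicit tie-break:
# order by additional coverage gained, and among tied tests prefer the
# cheaper (shorter-runtime) one. Data below are hypothetical.
def prioritize(tests: dict[str, set], runtime: dict[str, float]) -> list[str]:
    remaining, covered, order = dict(tests), set(), []
    while remaining:
        best = max(remaining,
                   key=lambda t: (len(remaining[t] - covered), -runtime[t]))
        covered |= remaining.pop(best)
        order.append(best)
    return order

tests = {"t1": {1, 2}, "t2": {2, 3}, "t3": {4}, "t4": {1, 4}}
runtime = {"t1": 2.0, "t2": 1.0, "t3": 0.5, "t4": 3.0}
print(prioritize(tests, runtime))   # ['t2', 't4', 't3', 't1']
```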
An Optimal Algorithm for HTML Page Building Process
Solid waste is a major issue in today's world and can be a contributing factor to pollution and the spread of vector-borne diseases. Because of its complicated nonlinear processes, this problem is difficult to model and optimize using traditional methods. In this study, a mathematical model was developed to optimize the cost of solid-waste recycling and management. In the optimization phase, the salp swarm algorithm (SSA) is utilized to determine the levels of discarded solid waste and reclaimed solid waste. SSA is a new optimization technique for finding the ideal solution of a mathematical relationship, based on leaders and followers. It takes a lot of random solutions, as well as their outward or inward fluctuations, t ...
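The sketch below shows the core SSA mechanics (a leader salp exploring around the food source while followers track the chain) on a hypothetical stand-in cost function, since the paper's waste-management model is not reproduced here.

```python
# Core salp swarm algorithm updates on a hypothetical cost function
# (a stand-in for the paper's waste-management model).
import numpy as np

def ssa_minimize(f, lb, ub, n_salps=20, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (n_salps, len(lb)))      # random initial salps
    food = min(X, key=f).copy()                      # best solution so far
    for l in range(1, iters + 1):
        c1 = 2.0 * np.exp(-((4.0 * l / iters) ** 2)) # exploration decays over time
        for j in range(len(lb)):                     # leader moves around the food
            step = c1 * ((ub[j] - lb[j]) * rng.random() + lb[j])
            X[0, j] = food[j] + step if rng.random() < 0.5 else food[j] - step
        X[1:] = (X[1:] + X[:-1]) / 2.0               # followers average along the chain
        X = np.clip(X, lb, ub)
        best = min(X, key=f)
        if f(best) < f(food):
            food = best.copy()
    return food

cost = lambda x: (x[0] - 3.0) ** 2 + (x[1] - 1.0) ** 2   # hypothetical cost
print(ssa_minimize(cost, lb=np.array([0.0, 0.0]), ub=np.array([10.0, 10.0])))
```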