Articles
Developing a lightweight cryptographic algorithm based on DNA computing

This work aims to develop a secure lightweight cipher algorithm for constrained devices. Secure communication among constrained devices is a critical issue during data transmission from client to server devices. Lightweight cipher algorithms are a recognized solution for constrained devices, requiring only low-cost computational functions and small memory. However, most lightweight algorithms suffer from a trade-off between complexity and speed when producing a robust cipher. The PRESENT cipher has proven successful as a lightweight cryptographic algorithm, surpassing other ciphers in computational cost by requiring only low-complexity operations. The mathematical model of the PRESENT algorithm is simple, and its operations require little execution time for encrypting and decrypting sensing data. Hence, a developed algorithm called DPRESENT is introduced, which improves ciphertext complexity by combining the PRESENT algorithm with a DNA cryptography technique to build a lightweight cipher. The NIST test suite showed that the proposed algorithm exhibits a high level of randomness and complexity, while its execution time remains as minimal as that of the original cipher. The developed algorithm represents a new direction that can be applied to different lightweight cryptosystems to resolve the trade-off between complexity and speed in a robust cipher algorithm.
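The DNA-coding step the abstract alludes to is commonly built on a two-bit-per-nucleotide mapping. Below is a minimal sketch of that encoding, assuming the conventional 00/01/10/11 → A/C/G/T table; the actual DPRESENT coding rules are not specified here.

```python
# Hypothetical 2-bit-to-nucleotide table often used in DNA cryptography.
# The concrete mapping used by DPRESENT is an assumption for this sketch.
BASE_FOR_BITS = {"00": "A", "01": "C", "10": "G", "11": "T"}
BITS_FOR_BASE = {b: k for k, b in BASE_FOR_BITS.items()}

def bits_to_dna(bits):
    # Encode an even-length bit string as a DNA strand, two bits per base.
    assert len(bits) % 2 == 0
    return "".join(BASE_FOR_BITS[bits[i:i + 2]] for i in range(0, len(bits), 2))

def dna_to_bits(strand):
    # Inverse mapping: recover the bit string from the strand.
    return "".join(BITS_FOR_BASE[b] for b in strand)

strand = bits_to_dna("0111001000")   # -> "CTAGA"
```

Because the mapping is a bijection on 2-bit pairs, decoding is exact, which is what lets such a layer be added to a cipher without losing information.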

Publication Date
Sat Aug 01 2015
Journal Name
International Journal Of Computer Science And Mobile Computing
Image Compression based on Non-Linear Polynomial Prediction Model

Publication Date
Mon Feb 04 2019
Journal Name
Journal Of The College Of Education For Women
Image Watermarking based on Huffman Coding and Laplace Sharpening

In this paper, an algorithm is introduced through which more data can be embedded than with regular spatial-domain methods. The secret data are compressed using Huffman coding, and the compressed data are then embedded using the Laplacian sharpening method. Laplace filters are used to determine effective hiding places: based on a threshold value, the positions with the highest filter responses are selected for embedding the watermark. The aim of this work is to increase the capacity of the embedded information by using Huffman coding while at the same time increasing the security of the algorithm by hiding the data in the strongest, least noticeable edge locations.
The perform…
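The edge-guided embedding described above can be sketched as follows; the 3×3 Laplacian kernel, the threshold value, and plain LSB substitution are illustrative assumptions for the example, not the paper's exact procedure.

```python
# Illustrative sketch: rank pixels by the magnitude of a 3x3 Laplacian
# response, then hide watermark bits in the least significant bit of the
# strongest-edge pixels. Kernel and threshold are assumptions.
LAPLACIAN = [[0, 1, 0], [1, -4, 1], [0, 1, 0]]

def laplacian_response(img, y, x):
    # 3x3 convolution at an interior pixel (img is a 2D list of ints).
    return sum(LAPLACIAN[dy][dx] * img[y - 1 + dy][x - 1 + dx]
               for dy in range(3) for dx in range(3))

def embed_lsb(img, bits, threshold=30):
    # Collect interior pixels whose edge response exceeds the threshold,
    # strongest first, and embed one bit per pixel in the LSB.
    h, w = len(img), len(img[0])
    spots = [(y, x) for y in range(1, h - 1) for x in range(1, w - 1)
             if abs(laplacian_response(img, y, x)) > threshold]
    spots.sort(key=lambda p: -abs(laplacian_response(img, *p)))
    out = [row[:] for row in img]
    for bit, (y, x) in zip(bits, spots):
        out[y][x] = (out[y][x] & ~1) | bit
    return out
```

Modifying only the LSB at strong edges keeps the distortion where the human visual system is least sensitive, which is the security argument the abstract makes.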

Publication Date
Sat Jul 01 2023
Journal Name
International Journal Of Computing And Digital Systems
Human Identification Based on SIFT Features of Hand Image

Publication Date
Thu Jan 30 2020
Journal Name
Journal Of Engineering
Design and Analysis WIMAX Network Based on Coverage Planning

In this paper, a wireless network is planned based on the IEEE 802.16e (WiMAX) standard. The targets of this paper are maximizing coverage and service at low operational fees. The WiMAX network is planned through three approaches. In approach one, network coverage is maximized by extending cell coverage and selecting the best sites (with a Band Width (BW) of 5 MHz, 20 MHz per sector, and four sectors per cell). In approach two, interference is analyzed in CNIR mode. In approach three, Quality of Service (QoS) is tested and evaluated. ATDI ICS (Interference Cancellation System) software is used to perform the planning. The results show that the planned area covers 90.49% of Baghdad City and used 1000 mob…

Publication Date
Tue Oct 04 2022
Journal Name
Ieee Access
Plain, Edge, and Texture Detection Based on Orthogonal Moment

Image pattern classification is considered a significant step for image and video processing. Although various image pattern algorithms have been proposed so far that achieve adequate classification, achieving higher accuracy while reducing the computation time remains challenging to date. A robust image pattern classification method is essential to obtain the desired accuracy. Such a method can accurately classify image blocks into plain, edge, and texture (PET) using an efficient feature extraction mechanism. Moreover, to date, most existing studies have focused on evaluating their methods based on specific orthogonal moments, which limits the understanding of their potential application to various Discrete Orthogonal Moments (DOMs). The…
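As background, classifying a block into the three PET classes can be illustrated with simple statistics; the variance and gradient thresholds below are assumptions invented for the example, not the paper's moment-based features.

```python
# Simplified sketch: label an image block as plain, edge, or texture.
# The paper derives features from orthogonal moments; here a variance
# test and a crude transition count stand in for illustration.
def classify_block(block, var_plain=25.0, edge_ratio=0.5):
    n = len(block) * len(block[0])
    mean = sum(sum(row) for row in block) / n
    var = sum((p - mean) ** 2 for row in block for p in row) / n
    if var < var_plain:
        return "plain"          # nearly uniform intensities
    # Count strong horizontal intensity jumps: a few localized jumps
    # suggest an edge, jumps everywhere suggest texture.
    jumps = sum(1 for row in block for a, b in zip(row, row[1:])
                if abs(a - b) > 40)
    cols = len(block[0]) - 1
    return "edge" if jumps / (len(block) * cols) < edge_ratio else "texture"
```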

Publication Date
Fri May 17 2013
Journal Name
International Journal Of Computer Applications
Fast Lossless Compression of Medical Images based on Polynomial

In this paper, a fast lossless image compression method is introduced for compressing medical images. It is based on splitting the image into blocks according to their nature, using polynomial approximation to decompose the image signal, and then applying run-length coding to the residual part of the image, which represents the error caused by the polynomial approximation. Finally, Huffman coding is applied as a last stage to encode the polynomial coefficients and the run-length codes. The test results indicate that the suggested method achieves promising performance.
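The middle stages of this pipeline can be sketched as follows, assuming a first-order (left-neighbour) predictor in place of the paper's full polynomial model and block splitting; Huffman coding of the outputs would follow as the last stage.

```python
# Sketch: prediction residuals followed by run-length coding.
# A first-order predictor stands in for the polynomial model here.
def residuals(samples):
    prev, out = 0, []
    for s in samples:
        out.append(s - prev)   # prediction error; invertible, so lossless
        prev = s
    return out

def run_length(values):
    # Collapse consecutive equal values into (value, count) pairs.
    runs, count = [], 1
    for a, b in zip(values, values[1:]):
        if b == a:
            count += 1
        else:
            runs.append((a, count))
            count = 1
    runs.append((values[-1], count))
    return runs

res = residuals([100, 100, 100, 102, 104])   # [100, 0, 0, 2, 2]
rle = run_length(res)                        # [(100, 1), (0, 2), (2, 2)]
```

Smooth medical-image regions predict well, so the residual stream is dominated by zeros and short values, which is exactly what run-length and Huffman coding exploit.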

Publication Date
Wed Jan 01 2025
Journal Name
Baghdad Science Journal
Image Encryption based on Chaotic Blocks Shuffling and RC4

Publication Date
Fri Jan 01 2010
Journal Name
Iraqi Journal Of Science
RETRIEVING DOCUMENT WITH COMPACT GENETIC ALGORITHM (CGA)

Publication Date
Sun Mar 01 2015
Journal Name
Computer Systems Science & Engineering
Parameters' fine tuning of differential evolution algorithm

The performance of most heuristic search methods depends on parameter choices. These parameter settings govern how new candidate solutions are generated and then applied by the algorithm, and they play a key role in determining both the quality of the solution obtained and the efficiency of the search. Their fine-tuning techniques are still an ongoing research area. The Differential Evolution (DE) algorithm is a very powerful optimization method that has become popular in many fields. Based on the prolonged research work on DE, it is now arguably one of the most outstanding stochastic optimization algorithms for real-parameter optimization. One reason for its popularity is its widely appreciated property of having only a small number of par…
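To make those parameters concrete, here is a sketch of one classic DE/rand/1/bin generation, showing the usual tuning knobs: NP (population size), F (differential weight), and CR (crossover rate). The bounds and the sphere objective are illustrative choices, not from the paper.

```python
import random

def de_step(pop, fitness, f=0.5, cr=0.9):
    # One DE/rand/1/bin generation: mutate, binomially cross over,
    # then greedily select between trial and target vectors.
    dim = len(pop[0])
    new_pop = []
    for i, target in enumerate(pop):
        others = [p for j, p in enumerate(pop) if j != i]
        a, b, c = random.sample(others, 3)      # three distinct donors
        j_rand = random.randrange(dim)          # force >= 1 mutated gene
        trial = [a[k] + f * (b[k] - c[k])
                 if (random.random() < cr or k == j_rand) else target[k]
                 for k in range(dim)]
        # Greedy selection: keep whichever vector is fitter.
        new_pop.append(trial if fitness(trial) <= fitness(target) else target)
    return new_pop

# Minimize the sphere function with a tiny population (NP = 10).
sphere = lambda x: sum(v * v for v in x)
pop = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(10)]
for _ in range(50):
    pop = de_step(pop, sphere)
best = min(pop, key=sphere)
```

Because selection is greedy per individual, the best fitness in the population is monotonically non-increasing, yet the behaviour still hinges on the F/CR/NP choices the abstract discusses.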

Publication Date
Mon Jan 01 2018
Journal Name
International Journal Of Data Mining, Modelling And Management
Association rules mining using cuckoo search algorithm

Association rules mining (ARM) is a fundamental and widely used data mining technique for extracting useful information from data. Traditional ARM algorithms degrade computational efficiency by mining too many association rules that are not appropriate for a given user. Recent ARM research investigates metaheuristic algorithms that search for only a subset of high-quality rules. In this paper, a modified discrete cuckoo search algorithm for association rules mining (DCS-ARM) is proposed for this purpose. The effectiveness of the algorithm is tested against a set of well-known transactional databases. Results indicate that the proposed algorithm outperforms existing metaheuristic methods.
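As background, the standard rule-quality measures that such metaheuristics optimise, support and confidence, can be computed as follows; the transactions and the example rule are invented for illustration.

```python
# Support: fraction of transactions containing an itemset.
# Confidence: support(A ∪ C) / support(A) for a rule A => C.
def support(itemset, transactions):
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent, transactions):
    return (support(antecedent | consequent, transactions)
            / support(antecedent, transactions))

transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"bread", "milk", "butter"},
    {"milk"},
]
s = support({"bread", "milk"}, transactions)       # 2/4 = 0.5
c = confidence({"bread"}, {"milk"}, transactions)  # 0.5 / 0.75 = 2/3
```

A metaheuristic such as DCS-ARM searches the space of candidate rules for those scoring highly on measures like these, instead of enumerating every frequent itemset.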
