The principal goal guiding any designed encryption algorithm must be security against unauthorized attackers. Within the last decade, there has been a vast increase in the communication of digital computer data in both the private and public sectors. Much of this information has significant value and therefore requires protection by a cipher algorithm of sufficient design strength. Such an algorithm defines the mathematical steps required to transform data into a cryptographic cipher and to transform the cipher back to the original form. Performance and security level are the main characteristics that differentiate one encryption algorithm from another. This paper suggests a new technique to enhance the performance of the Data Encryption Standard (DES) algorithm by generating its key from random bitmap images, exploiting the increased randomness of the pixel colours; this yields a (clipped) key with very high randomness according to the known randomness tests and adds a new level of protection strength and more robustness against breaking methods.
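A minimal sketch of the general idea follows, assuming a grayscale-or-RGB bitmap and a simple bit-extraction rule (XOR-folding the least significant bits of the pixel colours); the selection rule and function names are illustrative, not the authors' exact procedure.

```python
# Sketch: deriving a DES-style key from a random bitmap image.
# The LSB XOR-folding rule below is an illustrative assumption.
from PIL import Image

def key_from_bitmap(path, key_bits=64):
    img = Image.open(path).convert("RGB")
    bits = []
    for (r, g, b) in img.getdata():
        # use the least significant bit of the combined colour channels,
        # where most of the pixel-level randomness resides
        bits.append((r ^ g ^ b) & 1)
    # fold the bit stream down to the requested key length
    key = [0] * key_bits
    for i, bit in enumerate(bits):
        key[i % key_bits] ^= bit
    return bytes(
        int("".join(str(b) for b in key[i:i + 8]), 2)
        for i in range(0, key_bits, 8)
    )

key = key_from_bitmap("random_bitmap.bmp")  # 8-byte DES key (56 effective bits)
```
The resulting key bytes would still be subjected to the standard randomness tests (frequency, runs, etc.) before use, as the abstract indicates.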
Fractal image compression depends on representing an image using affine transformations. The main concern for researchers in the discipline of fractal image compression (FIC) is to decrease the encoding time needed to compress image data. The basic principle is that each portion of the image is similar to other portions of the same image. Many models have been developed for this process. The presence of fractals was initially noticed and handled using the Iterated Function System (IFS), which is used for encoding images. In this paper, a review of fractal image compression and its variants is presented along with other techniques. A summarized review of contributions is given to determine the fulfillment of fractal image compression.
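A minimal sketch of the core FIC encoding step is given below: every range block is approximated by a contracted, intensity-transformed domain block of the same image. The block sizes, the brute-force domain search, and the least-squares fit are illustrative simplifications of the IFS-based schemes the review covers.

```python
# Sketch of baseline fractal encoding (grayscale image, non-overlapping ranges).
import numpy as np

def encode(img, r=4):
    h, w = img.shape
    d = 2 * r                                  # domain blocks are twice the range size
    transforms = []
    for ry in range(0, h, r):
        for rx in range(0, w, r):
            R = img[ry:ry + r, rx:rx + r].astype(float)
            best = None
            for dy in range(0, h - d + 1, d):
                for dx in range(0, w - d + 1, d):
                    D = img[dy:dy + d, dx:dx + d].astype(float)
                    D = D.reshape(r, 2, r, 2).mean(axis=(1, 3))   # contract by 2
                    # least-squares contrast s and brightness o so that R ~ s*D + o
                    s = np.cov(D.ravel(), R.ravel())[0, 1] / (np.var(D.ravel()) + 1e-9)
                    o = R.mean() - s * D.mean()
                    err = np.mean((s * D + o - R) ** 2)
                    if best is None or err < best[0]:
                        best = (err, dy, dx, s, o)
            transforms.append((ry, rx) + best[1:])
    return transforms   # one affine map per range block = the compressed code
```
The exhaustive domain search in this sketch is exactly the cost that the reviewed variants try to reduce.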
Most Internet of Things (IoT), cell phone, and Radio Frequency Identification (RFID) applications need high speed in the execution and processing of data. This is achieved by reducing system energy consumption, latency, and processing time and by increasing throughput. However, it can work against the security of such devices, which may then be attacked by malicious programs. Lightweight cryptographic algorithms are among the most suitable methods for securing these IoT applications. Cryptography obfuscates data and removes the ability to capture key information patterns, ensuring that all data transfers are safe, accurate, verified, legal, and undeniable. Fortunately, various lightweight encryption algorithms can be used to increase the defense against various attacks.
The block cipher technique is one of the cryptographic techniques that encrypts data block by block. Serpent is one of the AES candidates; it encrypts a 128-bit block using 32 rounds of a similar computation built from permutations and substitutions. Since its permutations and substitutions are static, this paper proposes dynamic methods for permutation, substitution, and key generation based on chaotic maps to obtain more security. The proposed methods were analyzed, and the results showed that they overcome the weakness resulting from the use of static permutation and substitution boxes in the original algorithm and can also reduce the number of rounds and the time usage compared with the classical Serpent block cipher.
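A minimal sketch of how a chaotic map can drive dynamic, key-dependent boxes is shown below, using a logistic map and a sort-by-chaotic-value permutation; the map parameters and the construction rule are illustrative assumptions, not the paper's exact design.

```python
# Sketch: dynamic 4-bit S-box and 128-bit block permutation from a logistic map.
def logistic_sequence(x0, n, r=3.99):
    seq, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)          # logistic chaotic map
        seq.append(x)
    return seq

def chaotic_permutation(size, x0):
    # sorting indices by their chaotic values gives a key-dependent permutation
    vals = logistic_sequence(x0, size)
    return sorted(range(size), key=lambda i: vals[i])

sbox = chaotic_permutation(16, x0=0.3141)        # dynamic substitution box (4-bit values)
inv_sbox = [0] * 16
for i, v in enumerate(sbox):
    inv_sbox[v] = i                              # inverse box for decryption
bit_perm = chaotic_permutation(128, x0=0.2718)   # dynamic permutation of the 128-bit block
```
Because the boxes depend on the chaotic seeds, two different keys produce entirely different substitution and permutation layers, which is the source of the added strength the abstract claims.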
Three-dimensional (3D) image and medical image processing, which are considered big data analysis, have attracted significant attention during the last few years. To this end, efficient 3D object recognition techniques could be beneficial to such image and medical image processing. However, to date, most of the proposed methods for 3D object recognition face major challenges in terms of high computational complexity. This is because the computational complexity and execution time increase as the dimensions of the object increase, which is the case in 3D object recognition. Therefore, finding an efficient method that obtains high recognition accuracy with low computational complexity is essential.
Recently, image enhancement techniques have become one of the most significant topics in the field of digital image processing. The basic problem in enhancement methods is how to remove noise or improve the details of a digital image. In the current research, a method for digital image de-noising and detail sharpening/highlighting is proposed. The proposed approach uses a fuzzy logic technique to process each pixel of the entire image and then decide whether it is noisy or needs further processing for highlighting. This decision is made by examining the degree of association with the neighboring elements based on the fuzzy algorithm. The proposed de-noising approach was evaluated on some standard images after corrupting them with impulse noise.
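A minimal sketch of the fuzzy decision step follows: a pixel's membership in the "consistent with its neighbourhood" set is derived from its differences to the eight neighbours, and low membership marks it as impulse noise. The triangular membership shape, the thresholds, and the median replacement are illustrative assumptions.

```python
# Sketch: fuzzy association test per pixel, median replacement for noisy pixels.
import numpy as np

def denoise(img, low=10.0, high=60.0, cut=0.5):
    out = img.astype(float).copy()
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            win = img[y - 1:y + 2, x - 1:x + 2].astype(float)
            diffs = np.abs(win - img[y, x]).ravel()
            d = np.delete(diffs, 4).mean()        # mean difference to the 8 neighbours
            # triangular membership: 1 = fully associated, 0 = isolated (likely noise)
            mu = 1.0 if d <= low else 0.0 if d >= high else (high - d) / (high - low)
            if mu < cut:                          # weak association -> treat as impulse noise
                out[y, x] = np.median(np.delete(win.ravel(), 4))
    return out.astype(img.dtype)
```
Pixels with intermediate membership could instead be routed to the sharpening/highlighting branch rather than replaced, in line with the two-way decision the abstract describes.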
Human posture estimation is a crucial topic in the computer vision field and has become a hotspot for research in many works related to human behavior. Human pose estimation can be understood as the problem of recognizing and connecting human key points. The paper presents an optimized symmetric spatial transformer network designed to connect with a single-person pose estimation network to propose high-quality human target frames from inaccurate human bounding boxes; it introduces parametric pose non-maximum suppression to eliminate redundant pose estimates and applies an elimination rule to remove similar poses and obtain unique human pose estimation results. The experimental outcomes demonstrate that the proposed technique can …
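A minimal sketch of the pose non-maximum suppression idea is given below: keep the highest-scoring pose and drop any pose whose keypoints are, on average, closer than a threshold to an already kept pose. The plain mean keypoint distance used here is a simplification of the parametric similarity the paper refers to.

```python
# Sketch: greedy pose NMS over (N, K, 2) keypoint arrays with one score per pose.
import numpy as np

def pose_nms(poses, scores, dist_thresh=10.0):
    order = np.argsort(scores)[::-1]          # process poses from best to worst
    keep = []
    for i in order:
        redundant = any(
            np.linalg.norm(poses[i] - poses[j], axis=1).mean() < dist_thresh
            for j in keep
        )
        if not redundant:
            keep.append(i)
    return keep                                # indices of the unique pose estimates
```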
Fractal image compression gives some desirable properties like fast image decoding and very good rate-distortion curves, but it suffers from a high encoding time. In fractal image compression, a partitioning of the image into ranges is required. In this work, we introduce a good partitioning process by means of a merge approach, since some ranges are connected to others. This paper presents a method to reduce the encoding time of this technique by reducing the number of range blocks based on computing statistical measures between them. Experimental results on standard images show that the proposed method decreases the encoding time while the quality of the results remains visually acceptable.
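A minimal sketch of the reduction idea follows: neighbouring range blocks whose statistical measures (here, mean and standard deviation) are close are merged into one group, so only one representative search per group is needed. The specific measures and thresholds are illustrative assumptions.

```python
# Sketch: grouping range blocks by similarity of their statistics before encoding.
import numpy as np

def group_ranges(img, r=4, mean_t=4.0, std_t=3.0):
    h, w = img.shape
    stats, groups = {}, []
    for ry in range(0, h, r):
        for rx in range(0, w, r):
            blk = img[ry:ry + r, rx:rx + r].astype(float)
            stats[(ry, rx)] = (blk.mean(), blk.std())
    for pos, (m, s) in stats.items():
        for g in groups:
            gm, gs = stats[g[0]]                 # compare against the group's seed block
            if abs(m - gm) < mean_t and abs(s - gs) < std_t:
                g.append(pos)                    # merge into an existing group
                break
        else:
            groups.append([pos])                 # start a new group
    return groups   # each group needs only one domain search, cutting the encoding time
```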
In this paper, we designed a new efficient stream cipher cryptosystem that depends on a chaotic map to encrypt (decrypt) different types of digital images. The designed encryption system passed all basic efficiency criteria (such as randomness, MSE, PSNR, histogram analysis, and key space) that were applied to the key extracted from the random generator as well as to the digital images after completing the encryption process.
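A minimal sketch of such a chaotic stream cipher is shown below: a logistic map seeded by the key produces one keystream byte per pixel, which is XORed with the image, so the same routine performs encryption and decryption. The map constant and the byte-quantization rule are illustrative choices, not necessarily those of the designed system.

```python
# Sketch: logistic-map keystream XORed with a uint8 image.
import numpy as np

def keystream(x0, n, r=3.99):
    ks, x = np.empty(n, dtype=np.uint8), x0
    for i in range(n):
        x = r * x * (1 - x)                 # iterate the chaotic map
        ks[i] = int(x * 256) & 0xFF         # quantize the chaotic value to one byte
    return ks

def crypt(image, x0):
    flat = image.reshape(-1)                # image is expected as a uint8 array
    ks = keystream(x0, flat.size)
    return (flat ^ ks).reshape(image.shape) # XOR: encryption and decryption are identical

# usage: cipher = crypt(plain_img, x0=0.654321); plain = crypt(cipher, x0=0.654321)
```
The extracted keystream is exactly what the randomness tests, and the cipher images what the MSE/PSNR and histogram analyses, would be applied to.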
In this study, a fast block matching search algorithm based on block descriptors and multilevel block filtering is introduced. The used descriptors are the mean and a set of centralized low-order moments. Hierarchical filtering and the MAE similarity measure were adopted to nominate the best similar blocks lying within the pool of neighboring blocks. As the next step after block nomination, the similarity of the mean and moments is used to classify the nominated blocks and put them into one of three sub-pools, each representing a certain nomination priority level (i.e., most, less, and least). The main reason for introducing the nomination and classification steps is a significant reduction in the number of matching instances of the pixels belonging to the c…
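A minimal sketch of the descriptor-based filtering is given below: each block is summarized by its mean and a few centralized moments, candidates whose descriptors differ by more than per-level thresholds are discarded, and the exact MAE comparison is run only on the surviving (nominated) blocks. The moment orders and thresholds are illustrative assumptions.

```python
# Sketch: descriptor filtering followed by MAE matching on the nominated blocks.
import numpy as np

def descriptors(block):
    b = block.astype(float)
    m = b.mean()
    # mean plus two centralized low-order moments
    return np.array([m, ((b - m) ** 2).mean(), (np.abs(b - m) ** 3).mean()])

def mae(a, b):
    return np.abs(a.astype(float) - b.astype(float)).mean()

def best_match(target, candidates, thresholds=(6.0, 40.0, 400.0)):
    t_desc = descriptors(target)
    best, best_err = None, np.inf
    for idx, cand in enumerate(candidates):
        d = np.abs(descriptors(cand) - t_desc)
        if np.any(d > np.array(thresholds)):   # hierarchical filtering on the descriptors
            continue                           # skip the costly pixel-level comparison
        err = mae(target, cand)                # exact MAE only for nominated blocks
        if err < best_err:
            best, best_err = idx, err
    return best, best_err
```
Splitting the surviving candidates into the three priority sub-pools would simply mean using three threshold tiers instead of the single pass shown here.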