In today's digital realm, images constitute a massive share of social media content but suffer from two issues, storage size and transmission cost, for which compression is the natural solution. Pixel-based techniques are modern, spatially optimized modeling techniques with deterministic and probabilistic bases that employ mean, index, and residual components. This paper introduces adaptive pixel-based coding techniques for the probabilistic part of a lossy scheme by incorporating the MMSA of the C321 base, while the deterministic part is coded losslessly. The experimental results achieved higher size-reduction performance than traditional pixel-based techniques and standard JPEG by about 40% and 50%, respectively, with pleasing quality exceeding 45 dB.
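As a rough illustration of the mean/residual split that pixel-based coders rely on, the sketch below decomposes an image into per-block means (the deterministic part) and residuals (the probabilistic part). The block size and rounding are illustrative assumptions; the MMSA and C321-base specifics of the paper are not reproduced here.

```python
import numpy as np

def block_mean_residual(image, block=4):
    """Split an image into per-block means and residuals, the
    deterministic/probabilistic decomposition used by pixel-based coders.
    Illustrative only: block size and rounding are assumptions."""
    h, w = image.shape
    h, w = h - h % block, w - w % block              # crop to whole blocks
    img = image[:h, :w].astype(np.int16)
    means = np.zeros((h // block, w // block), dtype=np.int16)
    residual = np.empty_like(img)
    for i in range(0, h, block):
        for j in range(0, w, block):
            m = int(round(img[i:i+block, j:j+block].mean()))
            means[i // block, j // block] = m
            residual[i:i+block, j:j+block] = img[i:i+block, j:j+block] - m
    return means, residual   # means coded losslessly, residuals lossily
```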
The present work studies the effect of using an automatic thresholding technique to convert the feature edges of images into binary images in order to separate the object from its background. The feature edges of the sampled images are obtained from first-order edge detection operators (Roberts, Prewitt, and Sobel) and second-order edge detection operators (Laplacian operators). The optimum automatic threshold is calculated using the fast Otsu method. The study is applied to a personal image (Roben) and a satellite image to examine the compatibility of this procedure with two different kinds of images. The obtained results are discussed.
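A minimal sketch of the pipeline described above, assuming scikit-image is available: a first-order operator produces the edge map, and Otsu's method picks the binarization threshold automatically. The file name is a placeholder, and second-order operators (filters.laplace) slot in the same way.

```python
from skimage import io, filters

# Load a grayscale image (the path is illustrative).
img = io.imread('sample.png', as_gray=True)

# First-order edge magnitude (Sobel); filters.roberts and
# filters.prewitt are drop-in alternatives.
edges = filters.sobel(img)

# Otsu's method chooses the threshold that maximizes the
# between-class variance of the edge-magnitude histogram.
t = filters.threshold_otsu(edges)
binary = edges > t   # object/background split of the edge map
```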
In this paper, a new modification is proposed to enhance the security level of the Blowfish algorithm by increasing the difficulty of cracking the original message, making it safer against unauthorized attack. Blowfish is a symmetric, variable-length-key, 64-bit block cipher, and it is implemented here on grayscale images of different sizes. Instead of using a single key in the cipher operation, the proposed algorithm uses an additional one-byte key (KEY2), which takes part in the Feistel function in the first round of both the encryption and decryption processes. In addition, the proposed modified Blowfish algorithm uses five S-boxes instead of four; the additional key (KEY2) is selected randomly from the additional S-box.
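The sketch below shows one way a one-byte KEY2 could enter the Blowfish F-function in the first round, in the spirit of the modification described above; the exact injection point, the placeholder S-boxes, and the omission of the fifth S-box are assumptions for illustration only.

```python
MASK32 = 0xFFFFFFFF
# Placeholder S-boxes; real Blowfish initializes them from the digits of pi.
S = [list(range(256)) for _ in range(4)]

def F(x, key2=None, first_round=False):
    """Blowfish F-function with an assumed KEY2 injection in round one."""
    a, b = (x >> 24) & 0xFF, (x >> 16) & 0xFF
    c, d = (x >> 8) & 0xFF, x & 0xFF
    if first_round and key2 is not None:
        a ^= key2   # assumed injection point for the one-byte KEY2
    return ((((S[0][a] + S[1][b]) & MASK32) ^ S[2][c]) + S[3][d]) & MASK32

print(hex(F(0x01020304, key2=0xAB, first_round=True)))
```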
Digital multimedia systems have become standard today because of their rich sensory effects and the advanced development of the corresponding technology. Recently, biological techniques have been applied to several varieties of applications such as authentication protocols, organic chemistry, and cryptography. Deoxyribonucleic acid (DNA) is a tool for hiding key information in multimedia platforms.
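As an illustration of how binary key material can map onto DNA symbols, the sketch below uses the common two-bits-per-nucleotide convention (00/01/10/11 to A/C/G/T); the encoding rule actually used in this work may differ.

```python
# Common 2-bit-per-nucleotide DNA encoding; one convention among several.
CODE = {'00': 'A', '01': 'C', '10': 'G', '11': 'T'}

def key_to_dna(key: bytes) -> str:
    """Encode key bytes as a DNA string, two bits per nucleotide."""
    bits = ''.join(f'{b:08b}' for b in key)
    return ''.join(CODE[bits[i:i+2]] for i in range(0, len(bits), 2))

print(key_to_dna(b'K'))   # 0x4B = 01 00 10 11 -> 'CAGT'
```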
In this paper, an embedding algorithm is introduced. First, the image is divided into equally sized blocks, and each of the separated blocks is checked for a small amount of color. The selected blocks are used to localize the necessary image information. In the second stage, a comparison is made between the initial image pixel
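Although the abstract is truncated at this point, the first stage (block partitioning and selection) can be sketched as follows; the block size and the variance test standing in for "a small amount of color" are assumptions.

```python
import numpy as np

def select_blocks(img, block=8, var_thresh=25.0):
    """Partition the image into equal blocks and flag the low-color ones.
    The variance criterion and block size are illustrative assumptions."""
    h, w = img.shape[:2]
    selected = []
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            patch = img[i:i+block, j:j+block]
            if patch.var() < var_thresh:   # nearly uniform block
                selected.append((i, j))
    return selected
```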
In this paper a new structure for the AVR of the power system exciter is proposed and designed using a digital-based LQR. With two weighting matrices, Q and R, this method produces an optimal regulator that is used to generate the feedback control law. These matrices, called the state and control weighting matrices, balance the relative importance of the states and the input in the cost function being optimized. A sample power system composed of a single machine connected to an infinite bus bar (SMIB) is simulated with both a conventional AVR and the proposed digital AVR (DAVR). Evaluation results show that the DAVR damps the oscillations of the terminal voltage well and presents a faster response.
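A minimal sketch of the digital LQR design step using SciPy's discrete Riccati solver; the plant matrices below are placeholders, not the paper's SMIB model.

```python
import numpy as np
from scipy.linalg import solve_discrete_are

A = np.array([[1.0, 0.1], [0.0, 0.9]])   # assumed discrete-time plant
B = np.array([[0.0], [0.1]])
Q = np.diag([10.0, 1.0])                  # state weighting matrix
R = np.array([[1.0]])                     # control weighting matrix

P = solve_discrete_are(A, B, Q, R)        # discrete algebraic Riccati solution
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)  # optimal feedback gain
# Feedback control law: u[k] = -K @ x[k]
print(K)
```

Raising entries of Q penalizes state (e.g., terminal-voltage) deviations more heavily, while raising R penalizes control effort; the balance between them shapes the damping and speed of the closed-loop response.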
This paper introduces an algorithm for lossless compression of natural and medical images. It is based on utilizing various causal fixed predictors of one or two dimensions to remove the correlation, or spatial redundancy, between image pixel values; a recursive polynomial model with a linear basis is then applied.
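A minimal sketch of the prediction stage, assuming a simple two-dimensional causal predictor (the average of the left and upper neighbours); the paper's specific predictor set and the recursive polynomial stage are not reproduced here.

```python
import numpy as np

def predict_residual(img):
    """Decorrelate an image with a 2-D causal fixed predictor.
    Each pixel is predicted from already-coded neighbours, so the
    decoder can invert the mapping exactly (lossless)."""
    img = img.astype(np.int16)            # avoid uint8 overflow
    pred = np.zeros_like(img)
    pred[1:, 1:] = (img[1:, :-1] + img[:-1, 1:]) // 2  # (left + up) / 2
    pred[0, 1:] = img[0, :-1]             # first row: left neighbour
    pred[1:, 0] = img[:-1, 0]             # first column: upper neighbour
    return img - pred                     # residuals for entropy coding
```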
The experimental results of the proposed compression method are promising in terms of preserving the details and quality of the reconstructed images as well as improving the compression ratio compared with the results of a traditional linear predictive coding system.