Recently, a new secure steganography algorithm has been proposed: the secure Block Permutation Image Steganography (BPIS) algorithm. The algorithm consists of five main steps: convert the secret message to a binary sequence, divide the binary sequence into blocks, permute each block using a key-based randomly generated permutation, concatenate the permuted blocks to form a permuted binary sequence, and then use a plane-based Least-Significant-Bit (LSB) approach to embed the permuted binary sequence into a BMP image file. The performance of the algorithm was given a preliminary evaluation by estimating the PSNR (Peak Signal-to-Noise Ratio) of the stego image for a limited number of experiments that involved hiding text files of various sizes in BMP images. This paper presents a deeper evaluation of the algorithm's performance; in particular, it evaluates the effects of permutation length and occupation ratio on stego-image quality and steganography processing time. Furthermore, it evaluates the algorithm's performance when concealing different types of secret media, such as MS Office files, image files, PDF files, executable files, and compressed files.
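To make the five steps concrete, here is a minimal Python sketch of BPIS-style embedding. It assumes a key-seeded pseudo-random permutation per block and one payload bit per cover byte; the function name and block length are illustrative, not the authors' implementation.

import random

def bpis_embed(secret: bytes, cover: bytearray, key: int, block_len: int = 64) -> bytearray:
    """Sketch of BPIS-style embedding: permute bit blocks with a key-seeded
    PRNG, then hide the permuted bits in the LSBs of the cover bytes."""
    # Steps 1-2: convert the secret to a bit sequence and split it into blocks.
    bits = [(byte >> i) & 1 for byte in secret for i in range(7, -1, -1)]
    blocks = [bits[i:i + block_len] for i in range(0, len(bits), block_len)]

    # Steps 3-4: permute each block with a key-based permutation and concatenate.
    rng = random.Random(key)
    permuted = []
    for block in blocks:
        order = list(range(len(block)))
        rng.shuffle(order)
        permuted.extend(block[j] for j in order)

    # Step 5: LSB-embed the permuted bit sequence into the cover bytes.
    if len(permuted) > len(cover):
        raise ValueError("cover image too small for this payload")
    stego = bytearray(cover)
    for i, bit in enumerate(permuted):
        stego[i] = (stego[i] & 0xFE) | bit
    return stego

Extraction reverses the same steps: read the LSBs, invert the key-seeded permutation per block, and reassemble the bytes.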
With the increasing rate of unauthorized access and attacks, the security of confidential data is of utmost importance. Cryptography only encrypts the data; because the communication takes place in the presence of third parties, the encrypted text can be decrypted or easily destroyed. Steganography, on the other hand, hides the confidential data in a cover source so that the very existence of the data is concealed, which does not arouse suspicion about the communication taking place between the two parties. This paper presents a scheme for transferring secret data embedded in a master file (cover image) to obtain a new image (stego image) that is practically indistinguishable from the original image, so that no one other than the intended user is aware of the hidden data.
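"Practically indistinguishable" is usually quantified with the PSNR measure used in the evaluation above. A minimal sketch, assuming 8-bit images held as NumPy arrays:

import numpy as np

def psnr(cover: np.ndarray, stego: np.ndarray, peak: float = 255.0) -> float:
    """Peak Signal-to-Noise Ratio between a cover image and its stego version."""
    mse = np.mean((cover.astype(np.float64) - stego.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(peak ** 2 / mse)

Higher PSNR means smaller embedding distortion; values above roughly 40 dB are generally considered visually imperceptible.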
In this research, an analysis of the behaviour of the standard Hueckel edge detection algorithm is presented using three-dimensional representations of the edge goodness criterion, after applying the algorithm to a real high-texture satellite image; the edge goodness criterion is analysed statistically. The Hueckel edge detection algorithm showed a direct exponential relationship between execution time and the disk radius used. The restrictions Hueckel stated in his papers are adopted in this research. A discussion of the resulting edge shape and malformation is presented, since this is the first practical study to apply the Hueckel edge detection algorithm to a real high-texture image containing ramp edges (a satellite image).
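The execution-time-versus-radius relationship can be measured with a simple benchmark loop such as the sketch below. The disk-neighbourhood operator here is a placeholder mean filter, not Hueckel's edge-fitting procedure; it only illustrates how the timing experiment can be set up.

import time
import numpy as np

def disk_mask(radius: int) -> np.ndarray:
    """Boolean mask of the pixels inside a disk of the given radius."""
    y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
    return x * x + y * y <= radius * radius

def time_disk_operator(image: np.ndarray, radius: int) -> float:
    """Time a placeholder disk-neighbourhood operator (not Hueckel's fit)
    applied at every interior pixel, to relate run time to disk radius."""
    mask = disk_mask(radius)
    h, w = image.shape
    start = time.perf_counter()
    for r in range(radius, h - radius):
        for c in range(radius, w - radius):
            window = image[r - radius:r + radius + 1, c - radius:c + radius + 1]
            window[mask].mean()  # stand-in for the edge-goodness computation
    return time.perf_counter() - start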
Haplotype association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, rather than individual variants, with the disease. Such an analysis typically involves a list of unphased multiple-locus genotypes with potentially sparse frequencies in cases and controls. It starts with inferring haplotypes from genotypes, followed by haplotype co-classification and marginal screening for disease-associated haplotypes. Unfortunately, phasing uncertainty may have strong effects on the haplotype co-classification and therefore on the accuracy of predicting risk haplotypes. Here, to address this issue, we propose an alternative approach: in Stage 1, we select potential risk genotypes instead.
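The marginal-screening step can be illustrated with a simple per-haplotype 2x2 chi-square test of haplotype counts in cases versus controls. This sketch assumes haplotypes have already been inferred and ignores phasing uncertainty, so it shows the standard screening being discussed, not the authors' two-stage genotype-based alternative; SciPy is assumed, and the sparsity cutoff is illustrative.

from collections import Counter
from scipy.stats import chi2_contingency

def screen_haplotypes(case_haps, control_haps, min_count=5):
    """Marginal screening sketch: test each haplotype for association with
    disease using a 2x2 chi-square test of its counts in cases vs controls."""
    case_counts, control_counts = Counter(case_haps), Counter(control_haps)
    n_case, n_control = len(case_haps), len(control_haps)
    results = {}
    for hap in set(case_counts) | set(control_counts):
        a, b = case_counts[hap], control_counts[hap]
        if a + b < min_count:
            continue  # skip haplotypes with sparse frequencies
        table = [[a, n_case - a], [b, n_control - b]]
        _, p_value, _, _ = chi2_contingency(table)
        results[hap] = p_value
    return results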
There are many techniques for estimating spray quality traits such as spray coverage, droplet density, droplet count, and droplet diameter. One of the most common is to use water sensitive papers (WSP) as spray collectors under field conditions and to analyze them with software. However, some droplets may merge after they deposit on the WSP, which can affect the accuracy of the results. In this research, an image processing technique was used to better estimate the spray traits and to overcome the problem of droplet merger. The droplets were classified as non-merged and merged based on their roundness, and the merged droplets were then separated based on the average size of the non-merged droplets.
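A minimal sketch of this classification idea: compute the roundness (circularity) 4πA/P² of each stain, split stains at an assumed roundness threshold, and estimate how many droplets each merged stain contains from the mean non-merged area. The threshold value and the area-ratio rule are illustrative assumptions, not the paper's exact procedure.

import math

def roundness(area: float, perimeter: float) -> float:
    """Roundness (circularity) of a droplet stain: 1.0 for a perfect circle."""
    return 4.0 * math.pi * area / (perimeter ** 2)

def split_merged(droplets, threshold=0.8):
    """Classify droplet stains by roundness, then estimate how many single
    droplets each merged stain contains from the mean non-merged area."""
    single = [d for d in droplets if roundness(d["area"], d["perimeter"]) >= threshold]
    merged = [d for d in droplets if roundness(d["area"], d["perimeter"]) < threshold]
    if not single:
        raise ValueError("no non-merged droplets to calibrate against")
    mean_area = sum(d["area"] for d in single) / len(single)
    counts = [max(1, round(d["area"] / mean_area)) for d in merged]
    return single, merged, counts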
The objective of the research is to identify the level of supervisory performance of the educational supervisor from the point of view of secondary-school headmasters. The problem was the need to evaluate this performance. A sample of 97 school headmasters, representing 38% of the total community, was chosen to collect the needed data. The researcher designed a questionnaire consisting of 43 items covering five areas. The results showed a good level of performance among supervisors; there were no significant differences with respect to the certificate variable, while there were significant differences by gender in favour of males. The research concluded with a number of recommendations and suggestions.
Cloud storage provides scalable, low-cost resources and achieves economies of scale through a cross-user architecture. As the amount of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. The most important cloud service is data storage. To protect the privacy of data owners, data are stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which is crucial for data storage. Traditional deduplication schemes cannot work on encrypted data, and existing solutions for encrypted data deduplication suffer from security weaknesses. This paper proposes a combined compressive sensing and video deduplication scheme to maximize storage efficiency.
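A common way to reconcile encryption with deduplication is convergent encryption, where the key is derived from the content itself so that identical files produce identical ciphertexts that the cloud can deduplicate. The sketch below illustrates that generic idea only; it is not the compressive-sensing scheme proposed in this paper, and the toy SHA-256 XOR keystream is for illustration, not production cryptography.

import hashlib

def convergent_encrypt(plaintext: bytes) -> tuple[bytes, str]:
    """Convergent-encryption sketch: the key is the hash of the content, so
    identical plaintexts yield identical ciphertexts and can be deduplicated."""
    key = hashlib.sha256(plaintext).digest()
    keystream = b""
    counter = 0
    while len(keystream) < len(plaintext):
        keystream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, keystream))
    tag = hashlib.sha256(ciphertext).hexdigest()  # dedup index checked by the cloud
    return ciphertext, tag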
The need to exchange large amounts of real-time data is constantly increasing in wireless communication. Traditional radio transceivers are not cost-effective and require their components to be integrated, whereas software-defined radio (SDR) transceivers have opened up a new class of wireless technologies with high security. This study designs an SDR transceiver built around one modulation scheme, 16-QAM, and adds a security subsystem based on one type of chaotic map, the logistic map: a very simple nonlinear dynamical equation that generates a random key which is XORed with the transmitted data to protect it during transmission.
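A minimal sketch of the logistic-map keying idea: iterate x <- r·x·(1 − x), quantize each state to a byte, and XOR the resulting keystream with the payload. The initial condition x0, the parameter r, and the byte quantization are illustrative assumptions, not the authors' exact key-generation rule.

def logistic_keystream(x0: float, r: float, n_bytes: int) -> bytes:
    """Generate a chaotic keystream from the logistic map x <- r*x*(1-x)."""
    x, out = x0, bytearray()
    for _ in range(n_bytes):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) & 0xFF)  # quantize each state to one byte
    return bytes(out)

def xor_cipher(data: bytes, x0: float = 0.613, r: float = 3.99) -> bytes:
    """XOR the payload with the logistic-map keystream; applying it twice
    with the same (x0, r) parameters recovers the original data."""
    key = logistic_keystream(x0, r, len(data))
    return bytes(d ^ k for d, k in zip(data, key))

Because XOR is its own inverse, the receiver only needs the shared (x0, r) parameters to regenerate the keystream and recover the data.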
The General Directorate of Surveying is considered one of the most important sources of maps in Iraq; over the last six years it has produced digital maps covering the whole country. These maps are produced from different data sources with unknown accuracy; therefore, their quality needs to be assessed. The main aim of this study is to evaluate the positional accuracy of the digital maps produced by the General Directorate of Surveying. Two study areas were selected, AL-Rusafa and AL-Karkh in Baghdad, Iraq, with areas of 172.826 and 135.106 square kilometers, respectively. Different statistical analyses were conducted to calculate the elements of the positional accuracy assessment (mean µ, root mean square error RMSE, minimum, and other measures).
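A minimal sketch of how such horizontal accuracy statistics can be computed, assuming each map point has a higher-accuracy reference coordinate (for example, a surveyed checkpoint); the exact set of statistics reported in the study may differ.

import math

def positional_accuracy(map_pts, ref_pts):
    """Horizontal positional accuracy: mean error and RMSE of the distances
    between map coordinates and higher-accuracy reference coordinates."""
    errors = [math.hypot(mx - rx, my - ry)
              for (mx, my), (rx, ry) in zip(map_pts, ref_pts)]
    mean_error = sum(errors) / len(errors)
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    return mean_error, rmse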