Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively, so it makes sense to pursue research on algorithms that exploit the available network capacity most efficiently. It is also important to consider security, since the data being transmitted is vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same set of data so that the two operations are performed simultaneously. This is achieved by embedding encryption into the compression algorithm, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is preprocessed and transformed into an intermediate form that can be compressed with better efficiency and security. This addresses problems with common encryption methods, which generally manipulate an entire data set; most encryption algorithms therefore tend to make the transfer of information more costly in terms of time and sometimes bandwidth.
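As a rough, hedged illustration of coupling compression with encryption (not the paper's coder-embedded module), the Python sketch below compresses a byte string with zlib and then masks it with a toy keystream; the keystream construction (SHA-256 in counter mode) and key handling are assumptions made purely for demonstration, not a vetted cipher.

```python
# Sketch only: compress first to remove redundancy, then mask the bytes.
# The keystream below is a toy construction for illustration.
import zlib
import hashlib

def _keystream(key: bytes, length: int) -> bytes:
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def compress_and_encrypt(plaintext: bytes, key: bytes) -> bytes:
    compressed = zlib.compress(plaintext, 9)              # remove redundancy first
    ks = _keystream(key, len(compressed))
    return bytes(c ^ k for c, k in zip(compressed, ks))   # then mask the bytes

def decrypt_and_decompress(ciphertext: bytes, key: bytes) -> bytes:
    ks = _keystream(key, len(ciphertext))
    return zlib.decompress(bytes(c ^ k for c, k in zip(ciphertext, ks)))

if __name__ == "__main__":
    key = b"shared secret key"
    msg = b"some repetitive text " * 50
    sealed = compress_and_encrypt(msg, key)
    assert decrypt_and_decompress(sealed, key) == msg
    print(len(msg), "->", len(sealed), "bytes")
```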
A strong sign language recognition system can break down the barriers that separate hearing and speaking members of society from speechless members. A novel, fast recognition system with low computational cost for digits in American Sign Language (ASL) is introduced in this research. Different image processing techniques are used to optimize and extract the shape of the hand fingers in each sign. The feature extraction stage includes determining an optimal threshold on a statistical basis, then recognizing the gap area in the zero sign and calculating the height of each finger in the other digits. The classification stage depends on the gap area in the zero sign and the number of opened fingers in the other signs, as well as
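The snippet below is a minimal sketch of this kind of pipeline, assuming a grayscale input array; the statistical threshold (mean plus a multiple of the standard deviation) and the column-profile finger counter are illustrative stand-ins, not the authors' exact feature extraction.

```python
# Hedged sketch, not the paper's method: segment the hand with a
# statistically chosen threshold, then count raised fingers from the
# column profile of the binary silhouette.
import numpy as np

def segment_hand(gray: np.ndarray, k: float = 0.5) -> np.ndarray:
    """Binarise with a mean + k*std threshold (statistical threshold)."""
    t = gray.mean() + k * gray.std()
    return (gray > t).astype(np.uint8)

def count_fingers(binary: np.ndarray, min_height: int = 10) -> int:
    """Count separated vertical runs whose silhouette height exceeds min_height."""
    heights = binary.sum(axis=0)          # per-column silhouette height
    raised = heights > min_height
    # number of contiguous True runs = rising edges, plus one if profile starts raised
    rises = np.count_nonzero(np.diff(raised.astype(int)) == 1)
    return int(rises + int(raised[0]))
```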
Uncompressed digital images need a very large amount of storage capacity and, as a consequence, require large communication bandwidth for transmission over the network. Image compression techniques not only minimize the image storage space but also preserve the quality of the image. This paper presents an image compression technique that uses a distinct image coding scheme based on the wavelet transform and combines effective compression algorithms for further compression. EZW and SPIHT are significant techniques available for lossy image compression. EZW coding is a worthwhile, simple and efficient algorithm. SPIHT is a more powerful technique used for image compression.
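For orientation, the sketch below shows the underlying idea that EZW and SPIHT refine, namely discarding insignificant wavelet coefficients; it uses PyWavelets with a simple magnitude threshold and is not an implementation of either coder.

```python
# Illustrative sketch only: wavelet decomposition followed by hard
# thresholding of small coefficients (EZW/SPIHT add clever significance
# ordering and bit-plane coding on top of this idea).
import numpy as np
import pywt

def wavelet_compress(image: np.ndarray, wavelet: str = "haar",
                     level: int = 3, keep: float = 0.05) -> np.ndarray:
    coeffs = pywt.wavedec2(image.astype(float), wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    # keep only the largest `keep` fraction of coefficients by magnitude
    thresh = np.quantile(np.abs(arr), 1.0 - keep)
    arr[np.abs(arr) < thresh] = 0.0
    coeffs = pywt.array_to_coeffs(arr, slices, output_format="wavedec2")
    return pywt.waverec2(coeffs, wavelet)
```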
Recently, the internet has made it easy for users to transmit digital media. Despite this convenience, it may lead to several threats concerning the confidentiality of the transferred media content, such as media authentication and integrity verification. For these reasons, data hiding methods and cryptography are used to protect the contents of digital media. In this paper, an enhanced method of image steganography combined with visual cryptography is proposed. A secret logo (a binary image) of size 128x128 is encrypted by applying (2-out-of-2 share) visual cryptography to it, generating two secret shares. During the embedding process, a cover red, green, and blue (RGB) image of size (512
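A minimal sketch of (2, 2) visual cryptography share generation with 2x2 pixel expansion is given below; it illustrates the general technique, while the particular pattern set and the NumPy interface are assumptions rather than the paper's exact construction.

```python
# (2, 2) visual cryptography sketch: each secret pixel expands to a 2x2
# block; a white pixel gets identical random patterns in both shares,
# a black pixel gets complementary patterns. secret: 1 = black, 0 = white.
import numpy as np

PATTERNS = np.array([
    [[1, 0], [0, 1]], [[0, 1], [1, 0]],
    [[1, 1], [0, 0]], [[0, 0], [1, 1]],
    [[1, 0], [1, 0]], [[0, 1], [0, 1]],
], dtype=np.uint8)

def make_shares(secret: np.ndarray, rng=np.random.default_rng()):
    h, w = secret.shape
    share1 = np.zeros((2 * h, 2 * w), dtype=np.uint8)
    share2 = np.zeros_like(share1)
    for i in range(h):
        for j in range(w):
            p = PATTERNS[rng.integers(len(PATTERNS))]
            share1[2*i:2*i+2, 2*j:2*j+2] = p
            share2[2*i:2*i+2, 2*j:2*j+2] = p if secret[i, j] == 0 else 1 - p
    return share1, share2

# Stacking the shares (logical OR of black subpixels) reveals the secret:
# reconstructed = np.maximum(share1, share2)
```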
This dissertation studies the application of the equivalence theory developed by Mona Baker to translation from Persian to Arabic. Among various translation methodologies, Mona Baker’s bottom-up equivalence approach is unique in several ways. Baker’s translation approach is a multistep process: it starts with studying the smallest linguistic unit, “the word”, and then moves above the level of words, leading to the translation of the entire text. Equivalence at the word level, i.e., the word-for-word method, is the core of Baker’s approach.
This study evaluates the use of Baker’s approach in translation from Persian to Arabic, mainly because finding the correct equivalence is a major challenge in this translation. Additionall
In this paper, the density of states (DOS) at an Fe metal contact to the titanium dioxide semiconductor (TiO2) has been studied and investigated using quantum consideration approaches. The study and calculation of the DOS depended on the orientation and driving energies, which were a function of the refractive index and dielectric constant of the TiO2 and Fe materials. Attention has focused on their effect on the characteristics of the DOS, which increased with increasing refractive index and dielectric constant of the Fe metal and vice versa. The results for the DOS and its relation to these quantities for the system have been discussed. As the corresponding value for the contact system is increased, the DOS values increase at first, but the relation is later disturbed and transforms into an inverse one.
Historic centers are often subject to urban renewal without prior knowledge of the extent of their inhabitants' cohesion and attachment to place. Identifying the rates of cohesion and place attachment can help urban designers avoid decisions that clash with the reality of the social groups inhabiting the neighborhoods of the historic center. The research therefore aimed to measure cohesion and place attachment through a methodological approach based on a psychological instrument used in previous studies. The measurements were applied through a questionnaire given to the residents of six selected neighborhoods forming the historic center of Al-Adhamiya. The research assumed relative disparity in the rates of cohesion and place attachment
The Hartley transform generalizes to the fractional Hartley transform (FRHT), which has various uses in different fields of image encryption. Unfortunately, the available literature on the fractional Hartley transform does not provide its inversion theorem, so the original function cannot be retrieved directly, which restricts its applications. The intention of this paper is to propose an inversion theorem for the fractional Hartley transform to overcome this drawback. Moreover, some properties of the fractional Hartley transform are discussed in this paper.
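The fractional transform itself is not reproduced here; as a hedged illustration of the self-inversion property that an FRHT inversion theorem generalises, the sketch below shows that the ordinary discrete Hartley transform with symmetric normalisation, built from the cas kernel cos + sin, is its own inverse.

```python
# Ordinary (non-fractional) discrete Hartley transform: applying the
# symmetrically normalised DHT twice recovers the original signal.
import numpy as np

def dht(x: np.ndarray) -> np.ndarray:
    """Discrete Hartley transform using the cas kernel cos + sin."""
    n = len(x)
    k = np.arange(n)
    arg = 2.0 * np.pi * np.outer(k, k) / n
    cas = np.cos(arg) + np.sin(arg)
    return cas @ x / np.sqrt(n)

x = np.random.rand(64)
assert np.allclose(dht(dht(x)), x)   # the DHT is involutory
```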
The aim of this research is to construct an educational program in light of cognitive behavioral theory and to study its impact on the development of efficient response in students affected by crises (the centers of "your right to education"). To achieve the objectives of the research, two scales were developed by the researcher, and two equivalent hypotheses were formulated. The scale contains (26) items divided into five fields, and its validity and reliability were established. Based on the efficient response scale, an educational program grounded in cognitive behavioral theory was constructed. The test and the educational program were applied to a sample of (60) students from the centers of "your right to education", divided into experimental
In this research, the effect of adding two different types of reinforcing particles was investigated: nano-zirconia (nano-ZrO2) particles and micro-lignin particles, added at different volume fractions of 0.5%, 1%, 1.5% and 2%, on the mechanical properties of polymer composite materials. The composites were prepared in this research as complete prosthesis and partial denture base materials, using a cold-cure poly(methyl methacrylate) (PMMA) resin matrix. The composite specimens consist of two groups according to the type of reinforcing particles and were prepared using a casting method of the hand lay-up type. The first group consists of PMMA resin reinforced by (nano-ZrO2)
A new algorithm is proposed to compress speech signals using the wavelet transform and linear predictive coding. Signal compression is based on the concept of selecting a small number of approximation coefficients produced by wavelet decomposition (Haar and db4) at a suitably chosen level while ignoring the detail coefficients; the approximation coefficients are then windowed by a rectangular window and fed to the linear predictor. The Levinson-Durbin algorithm is used to compute the LP coefficients, reflection coefficients and predictor error. The compressed files contain the LP coefficients and the previous sample; these files are very small compared to the original signals. The compression ratio is calculated from the size of the
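A sketch of the Levinson-Durbin recursion named above is given below; it computes LP coefficients, reflection coefficients and the prediction error from a frame's autocorrelation sequence. The frame length, window and predictor order in the usage example are illustrative assumptions, not the paper's settings.

```python
# Levinson-Durbin recursion for linear prediction (sketch).
import numpy as np

def levinson_durbin(r: np.ndarray, order: int):
    """r: autocorrelation sequence (r[0] > 0).
    Returns LP coefficients a (with a[0] = 1), reflection coefficients k,
    and the final prediction error."""
    a = np.zeros(order + 1)
    a[0] = 1.0
    k = np.zeros(order)
    err = float(r[0])
    for i in range(1, order + 1):
        acc = r[i] + np.dot(a[1:i], r[i-1:0:-1])
        k[i-1] = -acc / err
        a_prev = a.copy()
        a[1:i] = a_prev[1:i] + k[i-1] * a_prev[i-1:0:-1]
        a[i] = k[i-1]
        err *= 1.0 - k[i-1] ** 2
    return a, k, err

# Usage example (illustrative frame and order, not the paper's settings):
frame = np.hanning(240) * np.random.randn(240)
r = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
lpc, refl, pred_err = levinson_durbin(r, order=10)
```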