A hiding technique for dynamically encrypted text using an encoding table and a symmetric encryption method (the AES algorithm) is presented in this paper. The encoding table is generated dynamically from the MSBs of the cover-image pixels and serves as the first phase of encryption. The Harris corner detector is applied to the cover image to generate corner points, which are used to derive a dynamic AES key for the second phase of text encryption. The embedding process writes to the LSBs of the image pixels, excluding the Harris corner points, for greater robustness. Experimental results demonstrate that the proposed scheme achieves good embedding quality, error-free text recovery, and a high PSNR value.
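The embedding step described above can be sketched as follows; the corner mask, the message framing, and the flat grayscale representation are illustrative assumptions, not the paper's exact procedure:

```python
def embed_lsb(pixels, corner_mask, message_bits):
    """Embed bits in the LSBs of non-corner pixels (flat grayscale list)."""
    stego = list(pixels)
    bit_iter = iter(message_bits)
    for i, is_corner in enumerate(corner_mask):
        if is_corner:
            continue  # leave corner pixels untouched for robustness
        try:
            bit = next(bit_iter)
        except StopIteration:
            break  # whole message embedded
        stego[i] = (stego[i] & 0xFE) | bit  # clear LSB, write message bit
    return stego

def extract_lsb(pixels, corner_mask, n_bits):
    """Recover n_bits from the LSBs of non-corner pixels, in embed order."""
    bits = []
    for i, is_corner in enumerate(corner_mask):
        if is_corner or len(bits) == n_bits:
            continue
        bits.append(pixels[i] & 1)
    return bits
```

Because corner pixels are skipped identically on both sides, extraction needs only the same corner mask, which the receiver can regenerate by running the Harris detector on the stego image.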
In today's digital era, the importance of securing information has reached critical levels. Steganography is one of the methods used for this purpose by hiding sensitive data within other files. This study introduces an approach utilizing a chaotic dynamic system as a random key generator, governing both the selection of hiding locations within an image and the amount of data concealed in each location. The security of the steganography approach is considerably improved by using this random procedure. A 3D dynamic system with nine parameters influencing its behavior was carefully chosen. For each parameter, suitable interval values were determined to guarantee the system's chaotic behavior. Analysis of chaotic performance is given using the
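The study's 3D nine-parameter system is not reproduced here; as a stand-in, the one-dimensional logistic map below illustrates how a chaotic iteration can act as a deterministic, key-dependent random generator. The transient `skip` and the byte quantization are assumptions for illustration only:

```python
def logistic_keystream(x0, r, n, skip=100):
    """Generate n pseudo-random bytes from the logistic map x -> r*x*(1-x).

    (x0, r) play the role of the secret key; r near 4 keeps the map chaotic.
    """
    x = x0
    for _ in range(skip):          # discard the transient so output starts
        x = r * x * (1 - x)        # well inside the chaotic regime
    out = []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)  # quantize state to one byte
    return out
```

Sensitivity to the seed is what makes such a stream usable for selecting hiding locations: two nearby keys diverge to unrelated sequences after a few iterations.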
Due to the widespread use of digital images, the rapid evolution of computer science, and especially the use of images in social networks, attention has focused on securing these images and protecting them against attackers, and many techniques have been proposed to achieve this goal. In this paper we propose a new chaotic method that enhances AES (Advanced Encryption Standard) by eliminating the Mix-Columns transformation to reduce time consumption, and that uses palmprint biometrics and the Lorenz chaotic system to strengthen authentication and the security of the image; the chaotic system adds more sensitivity to the encryption and authentication system.
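As a rough illustration of how a Lorenz system can drive key material, the sketch below integrates the Lorenz equations with forward Euler and quantizes the trajectory to bytes. The step size, the quantization, and the parameter defaults are assumptions, not the paper's construction:

```python
def lorenz_stream(n, x=0.1, y=0.0, z=0.0,
                  sigma=10.0, rho=28.0, beta=8.0 / 3.0, dt=0.01):
    """Integrate dx=sigma(y-x), dy=x(rho-z)-y, dz=xy-beta*z with forward
    Euler and quantize the x-trajectory to n bytes; (x, y, z) is the key."""
    out = []
    for _ in range(n):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        out.append(int(abs(x) * 1e6) % 256)  # magnify, then take one byte
    return out
```

The stream is fully determined by the initial condition, so sender and receiver sharing (x, y, z) regenerate identical key bytes, while any change to the initial condition yields an unrelated stream.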
In this paper, we investigate the automatic recognition of emotion in text. We perform experiments with a new method of classification based on the PPM character-based text compression scheme. These experiments involve both coarse-grained classification (whether a text is emotional or not) and fine-grained classification, such as recognising Ekman's six basic emotions (Anger, Disgust, Fear, Happiness, Sadness, Surprise). Experimental results with three datasets show that the new method significantly outperforms traditional word-based text classification methods. The results show that the PPM compression-based classification method is able to distinguish between emotional and non-emotional text with high accuracy, between texts invo
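PPM is not available in the Python standard library, so the sketch below uses `zlib` purely as a stand-in to illustrate the compression-based classification idea: assign a text to the class whose training corpus lets it compress best (the corpora and the additive distance are toy assumptions):

```python
import zlib

def compression_distance(text, class_corpus):
    """Extra compressed bytes needed for `text` after the class corpus.
    A small increase means the text looks like the class's training data."""
    base = len(zlib.compress(class_corpus.encode()))
    joint = len(zlib.compress((class_corpus + " " + text).encode()))
    return joint - base

def classify(text, corpora):
    """Assign `text` to the class whose corpus compresses it best."""
    return min(corpora, key=lambda label: compression_distance(text, corpora[label]))
```

The appeal of the approach, as in the character-based PPM method above, is that it needs no tokenization or feature engineering: the compressor's model of character sequences does the work.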
Dynamic Thermal Management (DTM) emerged as a solution to the reliability challenges posed by thermal hotspots and unbalanced temperatures. DTM efficiency is strongly affected by the accuracy of the temperature information presented to the DTM manager. This work investigates the effect on DTM efficiency of the inaccuracy caused by deep sub-micron (DSM) noise during the transmission of temperature information to the manager. A simulation framework has been developed, and results show up to 38% DTM performance degradation and 18% unattended cycles at emergency temperature under DSM noise. These findings highlight the importance of further research into reliable on-chip data transmission for DTM applications.
Text categorization refers to the process of grouping texts or documents into classes or categories according to their content. The text categorization process consists of three phases: preprocessing, feature extraction, and classification. In comparison to English, only a few studies have been conducted to categorize and classify the Arabic language. For a variety of applications, such as text classification and clustering, Arabic text representation is a difficult task because the Arabic language is noted for its richness, diversity, and complicated morphology. This paper presents a comprehensive analysis and comparison of work by researchers over the last five years based on the dataset, year, algorithms, and the accuracy th
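The three phases can be illustrated with a deliberately minimal bag-of-words pipeline; the regex tokenizer and dot-product score are toy assumptions, and real Arabic categorization would additionally need morphological analysis:

```python
import re
from collections import Counter

def preprocess(text):
    """Phase 1: normalize case and tokenize on word characters (toy)."""
    return re.findall(r"\w+", text.lower())

def features(tokens):
    """Phase 2: bag-of-words term frequencies."""
    return Counter(tokens)

def classify(doc, labeled_docs):
    """Phase 3: pick the label whose training text has the largest
    dot-product overlap with the document's term frequencies."""
    doc_f = features(preprocess(doc))
    def score(train_text):
        f = features(preprocess(train_text))
        return sum(doc_f[w] * f[w] for w in doc_f)
    return max(labeled_docs, key=lambda lbl: score(labeled_docs[lbl]))
```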
The design of the future will remain a confusing and puzzling issue, with misgivings that arouse worry yet lead to a spirit of adventure, to making progress, and to finding ways of revival, creativity, and modernism. Whether a certain culture or a certain product prevails in design depends on the given and available techniques; because the computer and its artistic techniques have become very important and vital for reinforcing the image in a design, it is necessary to link these techniques in a suitable way to reform the mentality by which design is carried out. From what has been said, (there has been no utilization of all the modern and available graphic techniques in the design proce
Cognitive radios have the potential to greatly improve spectral efficiency in wireless networks. Cognitive radios are considered lower-priority, or secondary, users of spectrum allocated to a primary user. Their fundamental requirement is to avoid interference to potential primary users in their vicinity. Spectrum sensing has been identified as a key enabling functionality to ensure that cognitive radios do not interfere with primary users, by reliably detecting primary-user signals. In addition, reliable sensing creates spectrum opportunities for capacity increases in cognitive networks. One of the key challenges in spectrum sensing is the robust detection of primary signals in highly negative signal-to-noise-ratio (SNR) regimes. In this paper,
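A common baseline for the spectrum sensing described above is the energy detector. The toy sketch below assumes a known noise power and an ad hoc threshold factor; practical detectors instead derive the threshold from a target false-alarm probability, and energy detection is precisely what struggles in the highly negative SNR regimes the abstract mentions:

```python
import math
import random

def energy_detect(samples, noise_power, threshold_factor=1.5):
    """Declare a primary user present if the average sample energy
    exceeds threshold_factor times the known noise power."""
    energy = sum(s * s for s in samples) / len(samples)
    return energy > threshold_factor * noise_power

# Illustrative data: pure noise vs. noise plus a sinusoidal primary signal.
random.seed(0)
noise = [random.gauss(0.0, 1.0) for _ in range(4000)]
signal = [n + 1.5 * math.sin(0.1 * i) for i, n in enumerate(noise)]
```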
To ensure that a software/hardware product is of sufficient quality and functionality, it is essential to conduct thorough testing and evaluation of the numerous individual software components that make up the application. Many different approaches exist for testing software, including combinatorial testing and covering arrays. The difficulty of dealing with problems such as the two-way combinatorial explosion raises yet another issue: time. Using a client-server architecture, this research introduces a parallel implementation of the TWGH algorithm. Several experiments were conducted to demonstrate the efficiency of this technique. The findings of these experiments were used to determine the increase in speed and co
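The TWGH algorithm itself is not reproduced here; the sketch below only illustrates the notion of two-way (pairwise) coverage that such covering-array generators aim to satisfy, namely that every value pair of every two parameters appears in at least one test:

```python
from itertools import combinations, product

def pairwise_coverage(suite, domains):
    """Fraction of all 2-way parameter-value pairs covered by `suite`.

    `domains` lists the allowed values per parameter; each test in `suite`
    is a tuple with one value per parameter.
    """
    needed = set()
    for (i, di), (j, dj) in combinations(enumerate(domains), 2):
        for vi, vj in product(di, dj):
            needed.add((i, vi, j, vj))           # every required pair
    covered = set()
    for test in suite:
        for i, j in combinations(range(len(test)), 2):
            covered.add((i, test[i], j, test[j]))  # pairs this test hits
    return len(covered & needed) / len(needed)
```

For three boolean parameters, four well-chosen tests already reach full pairwise coverage, versus eight for the exhaustive product; the gap is what makes covering arrays worthwhile, and what the parallel implementation above accelerates at larger scales.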
In the present work, a dynamic analysis technique has been developed to investigate and characterize the amount of elastic modulus degradation in cracked cantilever plates due to the presence of a defect, such as a surface or internal crack, under free vibration. This new generalized technique represents the first step in developing a health-monitoring system; the effect of such defects on the modal frequencies is the main key to quantifying the elasticity moduli in the presence of any type of invisible defect. In this paper, the finite element method has been used to determine the free-vibration characteristics of a cracked cantilever plate (internal flaws); the present work considers different crack positions. Stiffness re
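As a one-degree-of-freedom caricature of the idea, a measured drop in natural frequency can be inverted for a stiffness (and hence modulus) estimate; the finite-element treatment above is far richer than this sketch, which assumes a single mode and an unchanged mass:

```python
import math

def natural_frequency(k, m):
    """Undamped natural frequency f = sqrt(k/m) / (2*pi) of a 1-DOF model."""
    return math.sqrt(k / m) / (2.0 * math.pi)

def stiffness_from_frequency_ratio(k_intact, f_cracked, f_intact):
    """Invert f ~ sqrt(k): a frequency drop implies k scales as (f2/f1)^2."""
    return k_intact * (f_cracked / f_intact) ** 2
```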