Steganography conceals information by embedding data within cover media and can be categorized into two main domains: spatial and frequency. This paper presents two distinct methods. The first operates in the spatial domain, using the least significant bits (LSBs) of pixels to conceal a secret message. The second operates in the frequency domain, hiding the secret message within the LSBs of the middle-frequency band of the discrete cosine transform (DCT) coefficients. Both methods enhance obfuscation through two layers of randomness: random pixel selection and random bit positions within each pixel. Unlike other available methods, which embed data sequentially in fixed amounts, these methods embed the data at random locations in random amounts, further enhancing the level of obfuscation. A pseudo-random binary key generated by a nonlinear combination of eight Linear Feedback Shift Registers (LFSRs) controls this randomness. The experiments use various 512x512 cover images. The first method achieves an average PSNR of 43.5292 dB with a payload capacity of up to 16% of the cover image; the second yields an average PSNR of 38.4092 dB with a payload capacity of up to 8%. The performance analysis demonstrates that the LSB-based method can conceal more data with less visible distortion but is vulnerable to simple image manipulation, whereas the DCT-based method offers lower capacity with more visible distortion but greater robustness.
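The two randomness layers described above (a pseudo-random pixel order, plus a keystream-chosen bit position inside each chosen pixel) can be sketched roughly as follows. The paper derives its keystream from a nonlinear combination of eight LFSRs; this simplified sketch uses a single 16-bit LFSR, and all names (`lfsr_bits`, `embed`), taps, and parameters are illustrative assumptions rather than the authors' implementation:

```python
import numpy as np

def lfsr_bits(seed, taps, n):
    """Fibonacci LFSR keystream with a 16-bit state; the paper combines
    eight LFSRs nonlinearly -- one register is used here for brevity."""
    state, out = seed, []
    for _ in range(n):
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1
        out.append(state & 1)
        state = (state >> 1) | (fb << 15)
    return out

def embed(cover, message_bits, key_bits):
    """Hide message_bits in pseudo-random pixels, at a keystream-chosen
    bit position (bit 0 or bit 1) inside each chosen pixel."""
    flat = cover.flatten().astype(np.uint8)
    seed = int("".join(map(str, key_bits[:32])), 2)   # layer 1: pixel order
    positions = np.random.default_rng(seed).permutation(flat.size)[:len(message_bits)]
    for i, (pos, bit) in enumerate(zip(positions, message_bits)):
        plane = key_bits[32 + i]                      # layer 2: bit within pixel
        flat[pos] = (flat[pos] & (0xFF ^ (1 << plane))) | (bit << plane)
    return flat.reshape(cover.shape)
```

Extraction would regenerate the same keystream from the shared key and read the same positions and bit planes back in order.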
Subcutaneous vascularization has emerged as a new solution for identity management over the past few years. Systems based on dorsal hand veins are particularly promising for high-security settings. A dorsal hand vein recognition system comprises the following steps: acquiring images from the database and preprocessing them, locating the region of interest, and extracting and recognizing information from the dorsal hand vein pattern. This paper reviews several techniques for obtaining the dorsal hand vein region and identifying a person, providing a comprehensive survey of existing theories. This model aims to improve on the accuracy rates reported in previous studies and
JPEG is the most popular image compression and encoding technique and is widely used in many applications (images, videos, and 3D animations). Researchers are therefore keen to develop this widely deployed technique to compress images at higher compression ratios while preserving image quality as much as possible. For this reason, this paper introduces a developed JPEG codec based on a fast DCT that removes most of the zeros in a transformed block while keeping their positions. Additionally, arithmetic coding is applied rather than Huffman coding. The results show that the proposed developed JPEG algorithm yields better image quality than the traditional JPEG technique.
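As a rough sketch of the block pipeline described above (2-D DCT, quantisation, then storing only the non-zero coefficients together with their positions), the following assumes an 8x8 block, a plain matrix DCT rather than the paper's fast DCT, and a single uniform quantiser `q`; the arithmetic-coding stage is omitted and all names are illustrative:

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal DCT-II basis matrix."""
    k = np.arange(n)
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C[0] /= np.sqrt(2)
    return C * np.sqrt(2.0 / n)

def compress_block(block, q=16):
    """Transform, quantise, and keep only non-zero coefficients + positions."""
    C = dct_matrix()
    coeffs = C @ (block - 128.0) @ C.T       # forward 2-D DCT
    quant = np.round(coeffs / q).astype(int)
    pos = np.flatnonzero(quant)              # positions of surviving coefficients
    return pos, quant.flat[pos]              # these two arrays go to the entropy coder

def decompress_block(pos, vals, q=16):
    C = dct_matrix()
    quant = np.zeros(64)
    quant[pos] = vals
    return C.T @ (quant.reshape(8, 8) * q) @ C + 128.0
```

Storing `(pos, vals)` instead of the full 64-entry block is the "remove zeros, keep positions" idea; on smooth blocks most coefficients quantise to zero, so the pair is much smaller than the block.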
Bioinformatics is a sub-discipline of computer science and biology concerned with the processes applied to biological data, such as gathering, processing, storing, and analyzing it. Biological data (ribonucleic acid (RNA), deoxyribonucleic acid (DNA), and protein sequences) have many applications and uses in many fields (data security, data segmentation, feature extraction, etc.). DNA sequences are used in cryptography, with the properties of biomolecules serving as carriers of the data. Messenger RNA (mRNA) is a single strand containing genetic information that is used to make proteins; it carries messages from DNA to ribosomes in the cytosol. In this paper, a new encryption technique bas
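The excerpt cuts off before the encryption details. As background, a common way to use DNA sequences as data carriers is a fixed 2-bit-per-base mapping; the sketch below uses one such convention (A=00, C=01, G=10, T=11), and both the mapping and the function names are illustrative, not the paper's scheme:

```python
BASES = "ACGT"  # one common 2-bit convention: A=00, C=01, G=10, T=11

def bytes_to_dna(data: bytes) -> str:
    """Encode each byte as four DNA bases, most significant bit pair first."""
    return "".join(BASES[(b >> s) & 0b11] for b in data for s in (6, 4, 2, 0))

def dna_to_bytes(seq: str) -> bytes:
    """Invert the mapping: four bases back into one byte."""
    vals = [BASES.index(c) for c in seq]
    return bytes((vals[i] << 6) | (vals[i + 1] << 4) | (vals[i + 2] << 2) | vals[i + 3]
                 for i in range(0, len(vals), 4))
```

A cipher layered on top would typically permute or substitute bases using a key before the sequence is stored or transmitted.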
The Wiener filter is widely used in image de-noising, particularly to reduce Gaussian noise. Although the Wiener filter removes noise from the image, it causes a loss of edge detail, resulting in a blurred image: edge details are high-frequency components, which the Wiener filter cannot reconstruct. In this paper, a filter for medical images based on the Wiener filter and the high-boost filter is proposed and applied to the degraded image. First, the degraded image and the high-boost filter are converted to the frequency domain using the Fourier transform. Secondly, the Wiener filter is applied to the image along with the high-boost filter. Thirdly
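The frequency-domain step described above can be sketched as follows. The excerpt does not specify the high-pass kernel, so this sketch assumes a Gaussian high-pass inside the standard high-boost transfer function H(u,v) = (A - 1) + H_hp(u,v); the Wiener denoising stage (e.g. `scipy.signal.wiener`) would run alongside it and is omitted here, and the parameter values are illustrative:

```python
import numpy as np

def high_boost_freq(img, A=1.5, d0=0.15):
    """High-boost filtering in the frequency domain:
    H(u,v) = (A - 1) + H_hp(u,v), with a Gaussian high-pass H_hp."""
    F = np.fft.fftshift(np.fft.fft2(img))
    rows, cols = img.shape
    u = np.arange(rows) - rows // 2
    v = np.arange(cols) - cols // 2
    D2 = u[:, None] ** 2 + v[None, :] ** 2          # squared distance from DC
    H_hp = 1 - np.exp(-D2 / (2 * (d0 * min(rows, cols)) ** 2))
    H = (A - 1) + H_hp                              # boost edges, keep a scaled base
    return np.fft.ifft2(np.fft.ifftshift(F * H)).real
```

Because H_hp vanishes at DC, flat regions are only scaled by (A - 1) while edges (high frequencies) pass through amplified, which is the sharpening behaviour the proposed filter pairs with Wiener denoising.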
A substantial concern in exchanging confidential messages over the internet is transmitting information safely. For example, consumers and producers of digital products need to know that those products are genuine and distinguishable from worthless counterfeits. The science of encryption can be defined as the technique of embedding data in image, audio, or video files in a way that meets the safety requirements. Steganography is a branch of data-concealment science that aims to reach a desired level of security in the exchange of private commercial and military data. This research offers a novel steganography technique based on hiding data inside the clusters that result from fuzzy clustering. T
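The abstract is cut off before the embedding details. One plausible reading, offered purely as an illustrative sketch and not as the authors' method, is to run fuzzy c-means on pixel intensities and then restrict LSB embedding to pixels of a single cluster:

```python
import numpy as np

def fuzzy_cmeans(x, c=2, m=2.0, iters=30):
    """Minimal fuzzy c-means on a 1-D feature (pixel intensities)."""
    centers = np.linspace(x.min(), x.max(), c)       # deterministic spread init
    for _ in range(iters):
        d = np.abs(x[None, :] - centers[:, None]) + 1e-9
        u = d ** (-2.0 / (m - 1.0))
        u /= u.sum(axis=0)                           # memberships sum to 1 per pixel
        um = u ** m
        centers = (um @ x) / um.sum(axis=1)          # membership-weighted means
    return centers, u

def embed_in_cluster(img, bits, cluster=0):
    """Hide bits in the LSBs of pixels whose hardened membership is `cluster`."""
    flat = img.flatten().astype(np.uint8)
    _, u = fuzzy_cmeans(flat.astype(float))
    idx = np.flatnonzero(u.argmax(axis=0) == cluster)[:len(bits)]
    flat[idx] = (flat[idx] & 0xFE) | np.array(bits, dtype=np.uint8)
    return flat.reshape(img.shape)
```

Because flipping an LSB shifts an intensity by at most 1, re-clustering the stego image assigns the pixels to the same clusters, so a receiver who repeats the clustering can locate the carrier pixels without side information.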
This research studies the linear regression model when the random errors are autocorrelated, departing from the normal-distribution assumption used in linear regression analysis of the relationship between variables; through this relationship, the value of one variable can be predicted from the values of the others. Four estimation methods were compared (the least squares method, the unweighted average method, Thiel's method, and Laplace's method) using the mean square error (MSE) criterion and simulation, with four sample sizes (15, 30, 60, 100). The results showed that the least squares method is best. The four methods were then applied to data on buckwheat production and cultivated area for the provinces of Iraq for the years (2010), (2011), (2012),
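The comparison above can be mimicked with a small Monte Carlo sketch. Only two of the four estimators are shown (ordinary least squares and Thiel's pairwise-median slope, in its common Theil-Sen form); the AR(1) error process, sample size, and repetition count are illustrative assumptions, not the study's settings:

```python
import numpy as np

def ols_slope(x, y):
    """Least squares slope estimate."""
    return np.cov(x, y, bias=True)[0, 1] / np.var(x)

def theil_slope(x, y):
    """Median of pairwise slopes (Theil's estimator)."""
    n = len(x)
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i in range(n) for j in range(i + 1, n) if x[j] != x[i]]
    return np.median(slopes)

def compare_mse(n=30, beta=2.0, reps=300, seed=1):
    """Simulated MSE of each slope estimator under AR(1)-autocorrelated errors."""
    rng = np.random.default_rng(seed)
    sq_err = {"ols": [], "theil": []}
    for _ in range(reps):
        x = rng.uniform(0.0, 10.0, n)
        e = np.zeros(n)
        for t in range(1, n):                 # autocorrelated disturbances
            e[t] = 0.6 * e[t - 1] + rng.normal()
        y = 1.0 + beta * x + e
        sq_err["ols"].append((ols_slope(x, y) - beta) ** 2)
        sq_err["theil"].append((theil_slope(x, y) - beta) ** 2)
    return {k: float(np.mean(v)) for k, v in sq_err.items()}
```

Averaging squared estimation errors over many replications is exactly the MSE criterion the study uses to rank the four methods across its sample sizes.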
Chemical pollution is a very important issue that affects public health and the health of future generations. Consequently, it must be studied in order to build suitable models and find descriptions that predict its behavior in the forthcoming years. Chemical pollution data in Iraq cover a wide scope and come from manifold sources and kinds, which qualifies them as Big Data requiring novel statistical methods. This research uses a proposed nonparametric procedure (the NP method) to develop an (OCMT) test procedure for estimating the parameters of a linear regression model on large data sets (Big Data) comprising many indicators associated with chemi
The Density Functional Theory (DFT) method of type B3LYP with a Gaussian basis set (6-311G) was applied to calculate the vibration frequencies and absorption intensities for the normal coordinates (3N-6) at the equilibrium geometry of di- and tetra-ring-layer (6, 0) zigzag single-wall carbon nanotubes (SWCNTs), using the Gaussian-09 program. Both were found to have the same D6d point-group symmetry, with C-C bond alternation in all tube rings (for the axial bonds, the vertical C-Ca bonds in the ring layers, and for the circumferential C-Cc bonds in the outer and mid rings). Assignments of the IR-active and inactive vibration frequ
DeepFake is a concern for celebrities and everyone else because it is simple to create. DeepFake images, especially high-quality ones, are difficult to detect by people, by local descriptors, and by current approaches. Video manipulation detection, on the other hand, is more accessible than image detection, and many state-of-the-art systems address it; moreover, detecting video manipulation depends entirely on detection through images. Many have worked on DeepFake detection in images, but their methods involve complex mathematical calculations in the preprocessing steps and suffer many limitations, including that the face must be frontal, the eyes open, and the mouth open with teeth visible, etc. Also, the accuracy of their counterfeit detectio
Production sites suffer from stagnation in the marketing of their products because of the lack of efficient systems that analyze and track customers' evaluations of products; as a result, some products remain untargeted despite their good quality. This research aims to build a modest model that takes two aspects into consideration. The first is identifying dependable users on the site based on the number of products they have evaluated and their positive impact on ratings. The second is identifying products with low weights (unknown products) to be generated and recommended to users based on a logarithmic equation and the number of co-rated users. Collaborative filtering is one of the most used knowledge discovery techniques; positive