Strong Triple Data Encryption Standard Algorithm using Nth Degree Truncated Polynomial Ring Unit

Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important component of secret-key cryptography, is the key itself: a higher level of secure communication depends on it. To raise the level of security in any communication, both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weakened by its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong; encrypting the key enhances the security of the Triple Data Encryption Standard. This paper proposes a combination of two efficient encryption algorithms to serve information security by adding a new level of security to the Triple Data Encryption Standard using the Nth Degree Truncated Polynomial Ring Unit (NTRU) algorithm. This aim is achieved by adding two new key functions, Enckey() for encrypting and Deckey() for decrypting the Triple Data Encryption Standard key, making the algorithm stronger. The obtained results also show good resistance against brute-force attack, which makes the system more effective, by applying the NTRU algorithm to encrypt and decrypt the Triple Data Encryption Standard key. These modifications also raise the degree of complexity, enlarge the key search space, and make the ciphered message harder for an attacker to crack.
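The NTRU layer described above rests on arithmetic in the truncated polynomial ring Z_q[x]/(x^N − 1). As an illustrative sketch only (not the paper's Enckey()/Deckey() implementation, whose details are not given here), the ring's core operation is multiplication, i.e. cyclic convolution of coefficient vectors modulo q:

```python
# Illustrative sketch: multiplication in the truncated polynomial ring
# Z_q[x]/(x^N - 1), the operation underlying NTRU. Not the paper's code.

def ring_multiply(a, b, N, q):
    """Multiply polynomials a and b (coefficient lists of length N)
    in Z_q[x]/(x^N - 1): plain convolution with indices wrapped mod N."""
    c = [0] * N
    for i in range(N):
        for j in range(N):
            c[(i + j) % N] = (c[(i + j) % N] + a[i] * b[j]) % q
    return c

# Tiny example: (1 + x)(1 + x + x^2) = 1 + 2x + 2x^2 + x^3,
# and since x^3 == 1 in Z_32[x]/(x^3 - 1), this is 2 + 2x + 2x^2.
print(ring_multiply([1, 1, 0], [1, 1, 1], N=3, q=32))  # [2, 2, 2]
```

Real NTRU parameters use much larger N and carefully chosen small-coefficient polynomials; the toy values above only show the wrap-around arithmetic.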

Publication Date
Sun Mar 06 2011
Journal Name
Baghdad Science Journal
Numeral Recognition Using Statistical Methods Comparison Study

The area of character recognition has received considerable attention from researchers all over the world during the last three decades. This research explores the best sets of feature extraction techniques and studies the accuracy of well-known classifiers for Arabic numerals using statistical methods, in two approaches, with a comparison study between them. The first method, a linear discriminant function, yields accuracy as high as 90% of the original grouped cases correctly classified. In the second method, a new algorithm is proposed; the results show the efficiency of the proposed algorithms, which achieve recognition accuracies of 92.9% and 91.4%, outperforming the first method.
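The first method above is a linear discriminant function. As a minimal sketch under assumed toy weights (the paper's trained parameters and feature sets are not given), classification assigns a sample to the class with the largest linear score:

```python
# Generic linear discriminant classifier sketch. The weights and biases
# below are toy values for illustration, not the paper's trained model.

def classify(x, weights, biases):
    """Assign feature vector x to the class whose linear discriminant
    g_k(x) = w_k . x + b_k is largest."""
    scores = [sum(w_i * x_i for w_i, x_i in zip(w, x)) + b
              for w, b in zip(weights, biases)]
    return max(range(len(scores)), key=lambda k: scores[k])

# Two hypothetical classes in a 2-D feature space
weights = [[1.0, 0.0], [0.0, 1.0]]
biases = [0.0, 0.0]
print(classify([2.0, 1.0], weights, biases))  # class 0 (score 2.0 > 1.0)
```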

Publication Date
Thu Apr 27 2017
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Detection of Human Remain Using GPR Technique

In this work, animal bones of different shapes and sizes were used to study the characteristics of the ground penetrating radar (GPR) waves reflected by these bones. The bones were buried underground at different depths and in different surrounding media. The resulting data showed that the detection of buried bones with the GPR technique depends strongly on the medium in which the bones are buried. Humidity is the main source of signal loss in such an application, because it lowers the signal-to-noise ratio, making it impossible to distinguish the signal reflected by the bones from that reflected by other materials in the medium, such as rock.

Publication Date
Tue Jan 29 2019
Journal Name
Journal Of The College Of Education For Women
Object Filling Using Table Based Boundary Tracking

The feature extraction step plays a major role in proper object classification and recognition. This step depends mainly on correct object detection in the given scene, but object detection algorithms may produce noise that affects the final object shape. A novel approach is introduced in this paper for filling the holes in the detected object, for better object detection and correct feature extraction. The method is based on the definition of a hole as a black pixel surrounded by a connected boundary region; it therefore tries to find a connected contour region that surrounds the background pixel using a roadmap tracing algorithm. The method shows good results on objects in 2D space.
Keywords: object filling, object detection, objec
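The hole definition above (background pixels fully enclosed by an object boundary) can be illustrated with a standard alternative to the paper's table-based boundary tracking: flood-fill the background from the image border, and fill any background pixel the flood cannot reach. This is a sketch of the general idea, not the paper's algorithm:

```python
# Hole filling by flood-filling the background from the image border:
# background pixels unreachable from the border lie inside an object.
from collections import deque

def fill_holes(img):
    """img is a binary image as a list of rows of 0 (background) / 1 (object)."""
    h, w = len(img), len(img[0])
    reached = [[False] * w for _ in range(h)]
    # Seed the flood with every background pixel on the border.
    q = deque((r, c) for r in range(h) for c in range(w)
              if (r in (0, h - 1) or c in (0, w - 1)) and img[r][c] == 0)
    for r, c in q:
        reached[r][c] = True
    while q:
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and img[nr][nc] == 0 and not reached[nr][nc]:
                reached[nr][nc] = True
                q.append((nr, nc))
    # Any unreached background pixel is a hole: fill it with 1.
    return [[1 if img[r][c] == 1 or not reached[r][c] else 0
             for c in range(w)] for r in range(h)]

ring = [[0, 0, 0, 0, 0],
        [0, 1, 1, 1, 0],
        [0, 1, 0, 1, 0],
        [0, 1, 1, 1, 0],
        [0, 0, 0, 0, 0]]
print(fill_holes(ring)[2])  # the enclosed hole is filled: [0, 1, 1, 1, 0]
```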

Publication Date
Sat Dec 30 2017
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Using Image Processing of a Fabry-Perot Interferometer to Determine the Defects of Lenses by Using a He-Ne Laser

A Fabry-Perot interferometer system was designed and constructed to determine the precise value of the wavelength required in spectral studies, depending on varying the medium pressure, where the refractive index was a function of pressure at a constant distance between the two mirrors, using a He-Ne laser (632.8 nm) as a coherent source. The finesse (t), the coefficient of finesse (F), and the visibility of the fringes (V) were calculated. Image processing was used, and its results can be relied on for verification

Publication Date
Thu Mar 09 2017
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Improve Performance of Solar Cell by using Grooves which Have Semicircular Shape on The Surface by using Program (ZEMAX)

In this work, a silicon solar cell with semicircular grooves has been used to improve its efficiency by reducing the reflection of rays and increasing the optical path through the cell. The optical design software ZEMAX has been used in ray-tracing mode to evaluate the prototype's efficiency with a detector placed beneath the cell. The prototype has an aspect ratio (A.R = 0.2), which gives the best efficiency at an incident angle (θ = 0°) and the best acceptance angle (θ = 50°).

Publication Date
Sat Mar 31 2018
Journal Name
Journal Of Engineering
Effect of Using Extra Fins on the Pin Fin Classic Geometry for Enhancement Heat Sink Performance using EGM Method

In the present study, the effect of new cross-section fin geometries on overall thermal/fluid performance has been investigated. The cross-sections comprise the original base geometries (triangular, square, circular, and elliptical pin fins) with exterior extra fins added along the sides of the original fins. The extra fins include a rectangular extra fin of 2 mm (height) by 4 mm (width) and a triangular extra fin of 2 mm (base) by 4 mm (height). The entropy generation minimization (EGM) method allows the combined effect of thermal resistance and pressure drop to be assessed through their simultaneous interaction with the heat sink. A general dimensionless expression for the entropy generation rate is obtained by con
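The abstract breaks off while introducing its entropy generation expression. For context only, a standard form used in EGM analyses of pin-fin heat sinks (an assumed reference form, not necessarily the paper's exact expression) combines a heat-transfer term and a fluid-friction term:

```latex
% Standard EGM objective for a heat sink (assumed form, not taken from
% the truncated abstract above):
%   Q    - heat dissipated,  R_{hs} - sink thermal resistance,
%   T_a  - ambient temperature,  F_D - drag force,  V - approach velocity.
\dot{S}_{\mathrm{gen}} \;=\; \frac{Q^{2}\,R_{hs}}{T_a^{2}} \;+\; \frac{F_D\,V}{T_a}
```

Minimizing this sum trades the thermal-resistance term against the pressure-drop (drag) term, which is the "combined effect" the abstract describes.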

Publication Date
Wed Dec 26 2018
Journal Name
Iraqi Journal Of Science
Using the Basic Efficiency Criteria to Estimate the Security of the New Digital Algebraic Generator System (NDAGS)

A Multiplicative Cyclic Group has been used to construct a New Digital Algebraic Generator System (NDAGS). This cryptosystem can be classified as a stream cipher cryptosystem. In this paper we estimate the efficiency and security of the NDAGS using the Basic Efficiency Criteria (BEC). A comparison has been made between some known generators and the NDAGS. The results of applying the BEC and the comparison prove the high efficiency of the NDAGS.
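The abstract does not detail the NDAGS construction, but the underlying multiplicative-cyclic-group idea can be sketched hypothetically: walk the orbit of a generator g in (Z/pZ)* and emit one bit per group element. All parameters below are illustrative assumptions, not the paper's design:

```python
# Hypothetical keystream sketch built on a multiplicative cyclic group:
# successive powers of a generator g modulo a prime p, one bit each.
# This illustrates the group structure only, not the actual NDAGS.

def keystream_bits(g, p, seed, n):
    """Yield n bits from the orbit seed*g, seed*g^2, ... in (Z/pZ)*."""
    x = seed % p
    out = []
    for _ in range(n):
        x = (x * g) % p
        out.append(x & 1)  # least significant bit of the group element
    return out

# g = 3 generates (Z/7Z)*: the orbit of 1 is 3, 2, 6, 4, 5, 1
print(keystream_bits(3, 7, 1, 6))  # [1, 0, 0, 0, 1, 1]
```

A toy modulus like 7 gives a period of only p − 1 = 6; a real stream cipher needs a far larger group and a nonlinear output filter to resist prediction.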

Publication Date
Fri Dec 08 2023
Journal Name
Iraqi Journal Of Science
Video Image Compression Using Absolute Moment Block Truncation Method with Orthogonal Search Motion Estimation Technique

Image compression has become one of the most important applications of the image processing field because of the rapid growth in computer power, the corresponding growth in the multimedia market, and the advent of the World Wide Web, which makes the internet easily accessible to everyone. Since the early 1980s, digital image sequence processing has been an attractive research area, because an image sequence, as a collection of images, may allow much more compression than a single image frame. The increased computational complexity and memory space required for image sequence processing have, in fact, become more attainable. This research uses the Absolute Moment Block Truncation compression technique, which depends on adopting the good points of oth
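The abstract breaks off while introducing Absolute Moment Block Truncation. For reference, a minimal sketch of standard AMBTC coding of one pixel block (the textbook method, without the paper's orthogonal-search motion estimation): each block is reduced to a bitmap plus a low mean and a high mean.

```python
def ambtc_block(block):
    """Absolute Moment Block Truncation Coding of one flattened pixel
    block: a bitmap (pixel >= block mean) plus low and high means."""
    mean = sum(block) / len(block)
    bitmap = [1 if p >= mean else 0 for p in block]
    high = [p for p, b in zip(block, bitmap) if b]
    low = [p for p, b in zip(block, bitmap) if not b]
    hi = round(sum(high) / len(high)) if high else 0
    lo = round(sum(low) / len(low)) if low else 0
    return bitmap, lo, hi

def ambtc_decode(bitmap, lo, hi):
    """Reconstruct the block: hi where the bitmap is 1, lo elsewhere."""
    return [hi if b else lo for b in bitmap]

block = [10, 12, 200, 210]           # a 2x2 block, flattened
bitmap, lo, hi = ambtc_block(block)
print(bitmap, lo, hi)                # [0, 0, 1, 1] 11 205
print(ambtc_decode(bitmap, lo, hi))  # [11, 11, 205, 205]
```

The compression comes from storing only two means and one bit per pixel instead of a full byte per pixel.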

Publication Date
Wed Feb 08 2023
Journal Name
Iraqi Journal Of Science
Text Hiding in Color Images Using the Secret Key Transformation Function in GF(2^n)

Steganography is one of the most popular techniques for hiding data in different media such as image, audio, or video files. This paper introduces an improved technique to hide a secret message, using the LSB algorithm, inside an RGB true-color image after encrypting the message with a secret-key transformation function. The key is selected randomly from GF(2^n), with the condition that it has an inverse value so the encrypted message can be retrieved. Only two bits of the low byte in each pixel (the blue byte) are used to hide the secret message, since the blue color has a weak effect on human eyes. A message hidden by the suggested algorithm is less vulnerable to being stolen than in other similar applications.
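The embedding step described above can be sketched as follows. This is an illustration of the 2-bit blue-channel LSB idea only; the paper's GF(2^n) key transformation, which encrypts the message before embedding, is omitted here:

```python
# Sketch: hide message bits, two at a time, in the 2 least significant
# bits of each pixel's blue byte (the paper's key transform is omitted).

def embed(pixels, bit_pairs):
    """pixels: list of (r, g, b) tuples; bit_pairs: list of (b1, b0)."""
    out = []
    pairs = iter(bit_pairs)
    for r, g, b in pixels:
        pair = next(pairs, None)
        if pair is None:
            out.append((r, g, b))  # no more payload: pixel unchanged
        else:
            two = (pair[0] << 1) | pair[1]
            out.append((r, g, (b & 0b11111100) | two))
    return out

def extract(pixels, n_pairs):
    """Read back the 2 LSBs of the first n_pairs blue bytes."""
    return [((p[2] >> 1) & 1, p[2] & 1) for p in pixels[:n_pairs]]

pixels = [(10, 20, 40), (50, 60, 71)]
stego = embed(pixels, [(1, 1), (0, 1)])
print(stego)              # [(10, 20, 43), (50, 60, 69)]
print(extract(stego, 2))  # [(1, 1), (0, 1)]
```

Note that only the blue byte changes, and by at most 3 intensity levels, which is why the distortion is hard to see.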

Publication Date
Tue Sep 27 2022
Journal Name
Journal Of Engineering Research And Sciences
Images Compression using Combined Scheme of Transform Coding

Several problems must be solved in image compression to make the process workable and more efficient. Much work has been done in the field of lossy image compression based on the wavelet and Discrete Cosine Transform (DCT). In this paper, an efficient image compression scheme is proposed, based on a common encoding transform scheme. It consists of the following steps: 1) a bi-orthogonal (tap 9/7) wavelet transform to split the image data into sub-bands; 2) DCT to de-correlate the data; 3) scalar quantization of the combined transform stage's output, mapped to positive values; 4) LZW encoding to produce the compressed data. The peak signal-to-noise ratio (PSNR), compression ratio (CR), and compression gain (CG) measures were used t
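Step 4 above is LZW encoding. A minimal sketch of the classic LZW encoder (the standard algorithm, not necessarily the paper's exact implementation or dictionary limits):

```python
def lzw_encode(data):
    """Classic LZW: grow a dictionary of previously seen byte strings
    and emit one code per longest known prefix."""
    table = {bytes([i]): i for i in range(256)}  # single-byte seeds
    w, out = b"", []
    for byte in data:
        wc = w + bytes([byte])
        if wc in table:
            w = wc                      # keep extending the match
        else:
            out.append(table[w])        # emit code for longest match
            table[wc] = len(table)      # learn the new string
            w = bytes([byte])
    if w:
        out.append(table[w])
    return out

print(lzw_encode(b"ABABABA"))  # [65, 66, 256, 258]
```

Repeated patterns (common in quantized transform coefficients) collapse into single dictionary codes, which is why LZW suits this final stage.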
