Strong Triple Data Encryption Standard Algorithm using Nth Degree Truncated Polynomial Ring Unit

Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important component, of secret-key cryptography is the key itself: for a high level of secure communication, both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weakened by its key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong. This paper proposes a combination of two efficient encryption algorithms to satisfy the purpose of information security by adding a new level of security to the Triple Data Encryption Standard using the Nth Degree Truncated Polynomial Ring Unit (NTRU) algorithm. This aim is achieved by adding two new key functions, Enckey() for encrypting and Deckey() for decrypting the Triple Data Encryption Standard key, to make the algorithm stronger. The obtained results also show good resistance against brute-force attack, since the NTRU algorithm is applied to encrypt and decrypt the Triple Data Encryption Standard key. These modifications also increase the degree of complexity, enlarge the key search space, and make the ciphered message harder for an attacker to crack.
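The key-wrapping idea described in the abstract can be sketched as follows. This is not an NTRU implementation: a toy hash-based XOR keystream stands in for the NTRU step purely to illustrate the Enckey()/Deckey() control flow. The function names Enckey and Deckey come from the paper; everything else here is a hypothetical placeholder.

```python
# Sketch: the Triple DES key is itself encrypted (Enckey) before exchange
# and decrypted (Deckey) before use. A SHA-256-based XOR keystream stands
# in for NTRU -- illustrative only, not a secure or faithful implementation.
import hashlib

def _keystream(secret: bytes, n: int) -> bytes:
    # toy stand-in for the NTRU encryption step (assumption, not the paper's method)
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(secret + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def Enckey(tdes_key: bytes, wrap_secret: bytes) -> bytes:
    """Encrypt the 24-byte Triple DES key before it is shared."""
    ks = _keystream(wrap_secret, len(tdes_key))
    return bytes(a ^ b for a, b in zip(tdes_key, ks))

def Deckey(wrapped: bytes, wrap_secret: bytes) -> bytes:
    """Recover the Triple DES key on the receiving side."""
    return Enckey(wrapped, wrap_secret)  # the XOR stream is its own inverse
```

The point of the structure is that an attacker who captures the wrapped key still faces the outer cipher before any brute-force search over the Triple DES key space can begin.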

Publication Date
Fri Sep 22 2017
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Lossless Data Hiding Using LSB Method

A lossless (reversible) data hiding (embedding) method inside an image (cover medium) is presented in this work using the LSB (least significant bit) technique, which enables us to hide data in a host image using a secret key so that the hidden data is undetectable, without losing any data and without changing the size or the visible appearance of the image. The hidden data can then be extracted without loss by reversing the embedding process.
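A minimal sketch of the basic LSB mechanics follows. Note this is plain LSB substitution, which is not reversible on its own; the lossless scheme in the paper requires extra bookkeeping to restore the original bits, which is omitted here. Pixels are modeled as a flat list of 8-bit integers, and all names are illustrative.

```python
# Hide bytes in the least significant bit of successive pixel values.
def embed_lsb(pixels, data):
    # serialize data to bits, most significant bit first
    bits = [(byte >> i) & 1 for byte in data for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for this payload")
    stego = list(pixels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | bit  # overwrite only the lowest bit
    return stego

def extract_lsb(pixels, n_bytes):
    # collect the low bits back and reassemble bytes, MSB first
    bits = [p & 1 for p in pixels[:n_bytes * 8]]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[k:k + 8]))
        for k in range(0, len(bits), 8)
    )
```

Each pixel changes by at most 1 gray level, which is why LSB embedding leaves the visible scene of the image effectively unchanged.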

Publication Date
Sun Sep 04 2011
Journal Name
Baghdad Science Journal
An Embedded Data Using Slantlet Transform

Data hiding is the process of encoding extra information in an image by making small modifications to its pixels. To be practical, the hidden data must be perceptually invisible yet robust to common signal processing operations. This paper introduces a scheme for hiding a signature image that can be as large as 25% of the host image data, and hence can be used both for digital watermarking and for image/data hiding. The proposed algorithm applies an orthogonal discrete wavelet transform with two zero moments and improved time localization, called the discrete slantlet transform, to both the host and the signature image. A scaling factor in the frequency domain controls the quality of the watermarked images. Experimental results of the signature image …
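The slantlet transform itself is not reproduced here. As a rough illustration of transform-domain additive embedding, the sketch below substitutes a single-level Haar transform (the simplest wavelet) and assumes non-blind extraction, i.e. the original host is available at the detector. The parameter alpha plays the role of the quality-controlling scaling factor mentioned above; all names are illustrative.

```python
# Additive watermarking in a wavelet (here: Haar) coefficient domain.
def haar(signal):
    # single-level Haar transform: pairwise averages, then differences
    a = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    d = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return a + d

def ihaar(coeffs):
    half = len(coeffs) // 2
    out = []
    for av, dv in zip(coeffs[:half], coeffs[half:]):
        out += [av + dv, av - dv]
    return out

def embed(host, signature, alpha=0.5):
    h = haar(host)
    padded = signature + [0.0] * (len(h) - len(signature))
    return ihaar([c + alpha * s for c, s in zip(h, padded)])

def extract(watermarked, host, n, alpha=0.5):
    # non-blind: subtract the host's coefficients to recover the signature
    return [(a - b) / alpha for a, b in zip(haar(watermarked), haar(host))][:n]
```

A larger alpha makes the watermark more robust but degrades the host image quality, the same trade-off the scaling factor controls in the paper.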

Publication Date
Thu Apr 06 2017
Journal Name
Global Journal Of Engineering Science And Researches 4(2348-8034):48-52
ON SEMI-STRONG (WEAK) CJ-TOPOLOGICAL SPACES

Publication Date
Thu May 11 2017
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Approximation Properties of the Strong Difference Operators

In this paper, we study some approximation properties of the strong difference and the relation between the strong difference and the weighted modulus of continuity.

Publication Date
Sun Jun 07 2015
Journal Name
Baghdad Science Journal
A Solution of Second Kind Volterra Integral Equations Using Third Order Non-Polynomial Spline Function

In this paper, a third-order non-polynomial spline function is used to solve second-kind Volterra integral equations. Numerical examples are presented to illustrate the application of this method and to compare the computed results with other known methods.
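For context, a second-kind Volterra equation u(t) = f(t) + ∫₀ᵗ K(t,s) u(s) ds can be solved numerically by marching forward in t. The sketch below uses simple trapezoidal quadrature rather than the paper's third-order non-polynomial spline, so it illustrates the class of problem, not the proposed method; all names are illustrative.

```python
# March forward in t, approximating the integral with the trapezoidal rule
# over the already-computed values and solving for the newest value.
def volterra2_trapezoid(f, K, T, n):
    h = T / n
    t = [i * h for i in range(n + 1)]
    u = [f(t[0])]  # at t=0 the integral vanishes, so u(0) = f(0)
    for i in range(1, n + 1):
        s = 0.5 * K(t[i], t[0]) * u[0]
        for j in range(1, i):
            s += K(t[i], t[j]) * u[j]
        # u_i appears on both sides; isolate it algebraically
        denom = 1.0 - 0.5 * h * K(t[i], t[i])
        u.append((f(t[i]) + h * s) / denom)
    return t, u
```

With f(t) = 1 and K(t,s) = 1 the exact solution is u(t) = eᵗ, which makes a convenient accuracy check for any such solver.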

Publication Date
Tue Jan 30 2024
Journal Name
Iraqi Journal Of Science
Predicting COVID-19 in Iraq using Frequent Weighting for Polynomial Regression in Optimization Curve Fitting

The worldwide coronavirus pandemic (COVID-19) is a new viral disease that spreads mostly through nasal discharge and saliva while coughing or sneezing. This highly infectious disease spreads quickly and can overwhelm healthcare systems if not controlled. The employment of machine learning algorithms to monitor analytical data has a substantial influence on the speed of decision-making in some government entities. However, ML algorithms trained on labeled patient symptoms cannot discriminate between diverse diseases such as COVID-19, because cough, fever, headache, sore throat, and shortness of breath are common symptoms of many bacterial and viral diseases.

This research focused on the nu…
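A plain (unweighted) polynomial least-squares fit of the kind underlying such curve fitting can be sketched as below, solving the normal equations by Gaussian elimination. The paper's frequent-weighting scheme is not reproduced, and all names are illustrative.

```python
# Fit y ~ c0 + c1*x + ... + cd*x^d by least squares via the normal
# equations (A^T A) c = A^T y, solved with partial-pivoting elimination.
def polyfit_ls(xs, ys, degree):
    m = degree + 1
    A = [[x ** j for j in range(m)] for x in xs]           # Vandermonde matrix
    AtA = [[sum(A[k][i] * A[k][j] for k in range(len(xs)))
            for j in range(m)] for i in range(m)]
    Aty = [sum(A[k][i] * ys[k] for k in range(len(xs))) for i in range(m)]
    # forward elimination with partial pivoting
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(AtA[r][col]))
        AtA[col], AtA[piv] = AtA[piv], AtA[col]
        Aty[col], Aty[piv] = Aty[piv], Aty[col]
        for r in range(col + 1, m):
            factor = AtA[r][col] / AtA[col][col]
            for c in range(col, m):
                AtA[r][c] -= factor * AtA[col][c]
            Aty[r] -= factor * Aty[col]
    # back substitution
    coeffs = [0.0] * m
    for r in range(m - 1, -1, -1):
        coeffs[r] = (Aty[r] - sum(AtA[r][c] * coeffs[c]
                                  for c in range(r + 1, m))) / AtA[r][r]
    return coeffs  # lowest-order coefficient first
```

Fitting daily case counts with such a polynomial and extrapolating the curve is the basic mechanism behind optimization-based curve fitting for epidemic prediction.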

Publication Date
Mon Jan 01 2018
Journal Name
International Journal Of Data Mining, Modelling And Management
Association rules mining using cuckoo search algorithm

Association rules mining (ARM) is a fundamental and widely used data mining technique for extracting useful information from data. Traditional ARM algorithms degrade computational efficiency by mining too many association rules, many of which are not appropriate for a given user. Recent research in ARM investigates metaheuristic algorithms that search for only a subset of high-quality rules. In this paper, a modified discrete cuckoo search algorithm for association rules mining (DCS-ARM) is proposed for this purpose. The effectiveness of the algorithm is tested against a set of well-known transactional databases; results indicate that the proposed algorithm outperforms existing metaheuristic methods.
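The rule-quality measures that typically serve as fitness components when a metaheuristic searches for high-quality rules can be sketched as follows; the discrete cuckoo search itself (Lévy flights, nest abandonment) is not reproduced here, and the example data is illustrative.

```python
# Standard ARM quality measures over a list of transactions (each a set).
def support(transactions, itemset):
    # fraction of transactions containing every item of the itemset
    itemset = set(itemset)
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(transactions, antecedent, consequent):
    # P(consequent | antecedent): support of the full rule over the antecedent
    sup_a = support(transactions, antecedent)
    if sup_a == 0:
        return 0.0
    return support(transactions, set(antecedent) | set(consequent)) / sup_a
```

A metaheuristic like DCS encodes a candidate rule, scores it with a fitness built from such measures, and keeps only candidates above the quality threshold instead of enumerating every rule.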

Publication Date
Fri Apr 01 2016
Journal Name
Iosr Journal Of Computer Engineering
Lossless and Lossy Polynomial Image Compression

Publication Date
Fri Mar 31 2017
Journal Name
Al-khwarizmi Engineering Journal
Big-data Management using Map Reduce on Cloud: Case study, EEG Images' Data

A database is characterized as an arrangement of data that is organized and distributed in a way that allows the client to access the stored data simply and conveniently. However, in the era of big data, traditional data analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study, and showed a clear enhancement for managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
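The Map-Reduce pattern the study relies on can be sketched in a few lines: a mapper emits key/value pairs, the pairs are grouped by key (the shuffle), and a reducer aggregates each group. The per-channel averaging example is a hypothetical stand-in, not the paper's EEG pipeline.

```python
# Minimal single-machine Map-Reduce skeleton.
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    # map phase: each record may emit any number of (key, value) pairs
    intermediate = defaultdict(list)
    for rec in records:
        for key, value in mapper(rec):
            intermediate[key].append(value)  # shuffle: group values by key
    # reduce phase: aggregate each key's values independently
    return {key: reducer(key, values) for key, values in intermediate.items()}

# Illustrative use: average signal amplitude per EEG channel
# (record format (channel, amplitude) is an assumption for this sketch).
records = [("C3", 1.0), ("C4", 2.0), ("C3", 3.0)]
averages = map_reduce(records,
                      lambda rec: [(rec[0], rec[1])],
                      lambda key, values: sum(values) / len(values))
```

Because each key's reduction is independent, a framework like Hadoop can spread the map and reduce phases across many nodes, which is where the reported response-time reduction comes from.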
