Text Multilevel Encryption Using New Key Exchange Protocol

Technological development in information and communication has been accompanied by security challenges related to the transmission of information, and encryption is a good solution. Encryption is one of the traditional methods of protecting plain text by converting it into an unintelligible form; it can be implemented using substitution techniques, shifting techniques, or mathematical operations. This paper proposes a method with two branches to encrypt text. The first branch is a new mathematical model to create and exchange keys; the proposed key exchange method is a development of Diffie-Hellman, a new mathematical model for exchanging keys based on prime numbers with the possibility of using integer numbers. The second branch of the proposal is a multi-key encryption algorithm. The algorithm provides the ability to use more than two keys; the keys can be any kind of integer number (at least the last key must be a prime number) and need not be of the same length. The encryption process is based on converting the text characters to suggested integer numbers and then transforming these numbers repeatedly with a multilevel mathematical model (the number of levels depends on the number of keys used), while the decryption process is a one-level process that uses just one key as the main key, with the other keys serving as secondary keys. Messages are encoded before encryption (by ASCII or any suggested system). The algorithm can use an unlimited number of keys of very large size (more than 7500 bytes), at least one of them prime, and exponentiation is applied to the keys to increase complexity. The experiments proved the robustness and security of the key exchange protocol and the encryption algorithm. Comparing the suggested method with other methods shows that it is more secure, more flexible, and easier to implement.
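The abstract describes a key exchange developed from Diffie-Hellman. The paper's modified protocol is not reproduced here; as a point of reference, a minimal sketch of classical Diffie-Hellman over a prime modulus (with toy parameters chosen purely for illustration) shows the baseline the proposal extends:

```python
# Classical Diffie-Hellman key exchange, the baseline the proposed protocol
# extends. Toy parameters; real deployments use primes of 2048 bits or more.
p = 0xFFFFFFFFFFFFFFC5  # a small prime modulus (2**64 - 59), illustrative only
g = 5                   # public generator

a = 123456789           # Alice's private key (kept secret)
b = 987654321           # Bob's private key (kept secret)

A = pow(g, a, p)        # Alice transmits A = g^a mod p
B = pow(g, b, p)        # Bob transmits B = g^b mod p

# Both sides derive the same shared secret without ever transmitting it.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob
```

Security rests on the difficulty of recovering `a` or `b` from the transmitted values (the discrete logarithm problem); the multi-key scheme described above layers additional integer keys on top of such an exchange.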

Publication Date
Sun Jun 01 2014
Journal Name
International Journal Of Advanced Research In Computer Science And Software Engineering
Medical Image Compression using Wavelet Quadrants of Polynomial Prediction Coding & Bit Plane Slicing

Publication Date
Sun Apr 08 2018
Journal Name
Al-khwarizmi Engineering Journal
Kinetic Study of the Leaching of Iraqi Akashat Phosphate Ore Using Lactic Acid

In the present work, a kinetic study was performed on the extraction of phosphate from Iraqi Akashat phosphate ore using an organic acid. Leaching with lactic acid was studied for the separation of calcareous materials (mainly calcite). Reaction conditions were 2% by weight acid concentration and a 5 ml/g ratio of acid volume to ore weight. Reaction time was varied from 2 to 30 minutes (in steps of 2 minutes) to determine the reaction rate constant k from the change in calcite concentration. A further investigation was carried out to determine the activation energy by varying the reaction temperature from 25 to 65 °C. From the kinetic data, it was found that the selective leaching was controlled by surface chemical reaction …
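The two fitting steps the abstract mentions (rate constant from concentration data, then activation energy from temperature variation) can be sketched as follows. The shrinking-core rate expression, the Weber of the fit, and all numerical data here are illustrative assumptions, not values from the paper:

```python
import math

# Shrinking-core model with surface-chemical-reaction control, a standard
# assumption for leaching "controlled by surface chemical reaction" (the
# paper's exact rate expression is not reproduced here):
#   g(X) = 1 - (1 - X)**(1/3) = k * t
def rate_constant(times, conversions):
    """Least-squares slope through the origin of g(X) versus t."""
    g = [1 - (1 - x) ** (1 / 3) for x in conversions]
    return sum(t * y for t, y in zip(times, g)) / sum(t * t for t in times)

def activation_energy(temps_kelvin, rate_constants):
    """Arrhenius fit: the slope of ln k versus 1/T equals -Ea/R."""
    xs = [1.0 / t for t in temps_kelvin]
    ys = [math.log(k) for k in rate_constants]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return -slope * 8.314  # J/mol (R = 8.314 J/mol.K)

# Hypothetical conversion data generated to be consistent with the model.
times = [2, 4, 6, 8, 10]  # minutes
ks = [rate_constant(times, [1 - (1 - k * t) ** 3 for t in times])
      for k in (0.010, 0.020, 0.040)]
Ea = activation_energy([298.15, 318.15, 338.15], ks)  # 25, 45, 65 degC
```

With these synthetic data the Arrhenius fit returns an activation energy of roughly 29 kJ/mol; a value in the tens of kJ/mol is the usual signature of chemical-reaction-controlled (rather than diffusion-controlled) leaching.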

Publication Date
Fri Jan 01 2016
Journal Name
Journal Of American Science
Morphohistological study of the tongue in local mice species by using special stain

Publication Date
Wed Jan 01 2020
Journal Name
Advances In Science, Technology And Engineering Systems Journal
Bayes Classification and Entropy Discretization of Large Datasets using Multi-Resolution Data Aggregation

Big data analysis has important applications in many areas such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms such as …
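The multi-resolution summarization structure itself is not reproduced here, but the core step of entropy discretization, choosing the cut point that minimises the class entropy of the resulting intervals, can be sketched on a toy attribute. The function names and data are illustrative, not the paper's algorithm:

```python
import math

def entropy(labels):
    """Shannon entropy of a class-label multiset, in bits."""
    n = len(labels)
    counts = {}
    for c in labels:
        counts[c] = counts.get(c, 0) + 1
    return -sum((m / n) * math.log2(m / n) for m in counts.values())

def best_split(values, labels):
    """Cut point minimising weighted child entropy (maximum information gain)."""
    pairs = sorted(zip(values, labels))
    n = len(pairs)
    best_cut, best_h = None, float("inf")
    for i in range(1, n):
        if pairs[i - 1][0] == pairs[i][0]:
            continue  # no valid boundary between equal attribute values
        left = [c for _, c in pairs[:i]]
        right = [c for _, c in pairs[i:]]
        h = (len(left) * entropy(left) + len(right) * entropy(right)) / n
        if h < best_h:
            best_h = h
            best_cut = (pairs[i - 1][0] + pairs[i][0]) / 2
    return best_cut

# Toy attribute: low values belong to class 'a', high values to class 'b',
# so the best cut falls midway between the two clusters.
cut = best_split([1, 2, 3, 10, 11, 12], ["a", "a", "a", "b", "b", "b"])
```

Recursive application of this split (with a stopping rule such as MDL) yields the discretization intervals that a Bayes classifier can then treat as categorical values.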

Publication Date
Mon Oct 02 2023
Journal Name
Journal Of Engineering
Control of Propagation of Salt Wedge by using Roughness Blocks having Different Inclination

The hydraulic conditions of a flow have previously been shown to change when large-scale geometric roughness elements are placed on the bed of an open channel, as these elements impose more resistance to the flow. The geometry of the roughness elements, their number, and their configuration are parameters that can affect the hydraulic flow characteristics. The target is to use inclined block elements to control the salt wedge propagation observed in most estuaries and prevent its negative effects. Computational fluid dynamics (CFD) software was used to simulate the two-phase flow in an estuary model. In this model, the block elements used have 2 cm by 3 cm cross-sections with a face inclined in the flow direction, with a length …

Publication Date
Sat Jul 20 2024
Journal Name
Sumer Journal For Pure Science
Classify the Nutritional Status of Iraqi children under Five Years Using Fuzzy Classification

Publication Date
Thu Mar 02 2023
Journal Name
Iar Journal Of Business Management
Reducing The Costs Of Transporting Multiple Products (Linear Transport Problems) Using Excel QM

The transportation model is a well-recognized algorithm applied in the distribution of products in enterprise logistics operations. Multiple algorithmic and technological forms of solution are applied to determine the optimal allocation of one type of product. In this research, the transportation model is formulated in general terms by means of linear programming, so that the optimal solution is integrated for different types of related products, and a dynamic, easy-to-follow digital illustration in the Excel QM program develops understanding of the computed solution and supports implementation of the model in the organization.
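The Excel QM workflow and the multi-product extension described above are not reproduced here, but the underlying transportation model can be illustrated with the standard minimum-cost (greedy) method for building an initial feasible shipping plan. The supply, demand, and cost figures below are hypothetical:

```python
# Minimum-cost (greedy) method for an initial feasible solution to the
# classical transportation problem. This yields a feasible plan, not
# necessarily the optimal one; solvers refine it (e.g. by the stepping-stone
# or MODI method) or solve the LP directly, as Excel QM does internally.
def min_cost_method(supply, demand, cost):
    supply, demand = supply[:], demand[:]          # work on copies
    alloc = [[0] * len(demand) for _ in supply]
    # Visit cells in order of increasing unit shipping cost.
    cells = sorted((cost[i][j], i, j)
                   for i in range(len(supply)) for j in range(len(demand)))
    for c, i, j in cells:
        q = min(supply[i], demand[j])              # ship as much as possible
        if q:
            alloc[i][j] = q
            supply[i] -= q
            demand[j] -= q
    return alloc

# Hypothetical balanced instance: 2 sources, 3 destinations,
# total supply (50) equals total demand (50).
supply = [20, 30]
demand = [10, 25, 15]
cost = [[8, 6, 10],
        [9, 12, 13]]
plan = min_cost_method(supply, demand, cost)
total = sum(cost[i][j] * plan[i][j] for i in range(2) for j in range(3))
```

For this instance the greedy plan ships all 20 units from source 1 to destination 2 (the cheapest route) and covers the remainder from source 2, at a total cost of 465.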

Publication Date
Thu Apr 30 2020
Journal Name
Journal Of Economics And Administrative Sciences
Comparison Between Tree regression (TR), and Negative binomial regression (NBR) by Using Simulation.

In this paper, a comparison is made between the tree regression model and negative binomial regression. These models represent two types of statistical methods: the first, a nonparametric method, is tree regression, which aims to divide the data set into subgroups; the second, a parametric method, is negative binomial regression, which is usually used with medical data, especially with large sample sizes. The methods are compared according to the mean squared error (MSE), using simulation experiments and taking different sample sizes …

Publication Date
Sat Aug 30 2025
Journal Name
Iraqi Journal Of Science
Improving the Performance and Finding Bitmap of the Compression Method Using Weber's law

Image compression is a suitable technique to reduce the storage space of an image, increase the available storage on a device, and speed up transmission. In this paper, a new idea for image compression is proposed to improve the performance of the Absolute Moment Block Truncation Coding (AMBTC) method, depending on Weber's law to distinguish uniform blocks (i.e., blocks of low and constant detail) from non-uniform blocks in the original images. All elements in the bitmap of each uniform block are then represented by zero, after which a lossless method, the Run-Length method, further compresses the bits representing the bitmaps of these uniform blocks. This simple idea improves …
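The bitmap step described above can be sketched in a few lines. The Weber fraction, block shape, and sample pixel values here are illustrative assumptions, not the paper's exact parameters:

```python
# Sketch of the AMBTC bitmap step with a Weber's-law uniformity test and
# run-length coding of uniform bitmaps. The 0.03 threshold is an assumed
# Weber fraction, not the paper's value.
def ambtc_bitmap(block, weber=0.03):
    mean = sum(block) / len(block)
    # Weber's law: a contrast smaller than a fixed fraction of the mean
    # intensity is not perceptible, so treat the block as uniform and
    # replace its whole bitmap with zeros.
    if mean > 0 and (max(block) - min(block)) / mean < weber:
        return [0] * len(block)
    # Standard AMBTC bitmap: 1 where the pixel is at or above the block mean.
    return [1 if p >= mean else 0 for p in block]

def run_length(bits):
    """Lossless pass: encode a bitmap as [value, count] runs."""
    runs = []
    for b in bits:
        if runs and runs[-1][0] == b:
            runs[-1][1] += 1
        else:
            runs.append([b, 1])
    return runs

flat = ambtc_bitmap([120, 121, 120, 122])  # low-detail block -> all zeros
edge = ambtc_bitmap([10, 10, 200, 200])    # high-detail block -> real bitmap
```

An all-zero bitmap collapses to a single run, which is where the extra compression over plain AMBTC comes from on images with many smooth regions.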

Publication Date
Fri Feb 01 2019
Journal Name
Journal Of Physics: Conference Series
Study the Electronic Properties of Boron Nitride Diamondoid Nanostructure using Ab-initio DFT
