Strong Triple Data Encryption Standard Algorithm using Nth Degree Truncated Polynomial Ring Unit

Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important part of secret-key cryptography, is the key itself: the key plays the central role in achieving a higher level of secure communication. To increase the level of security in any communication, both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong. Encrypting the key enhances the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms to satisfy the purpose of information security by adding a new level of security to the Triple Data Encryption Standard algorithm using the Nth Degree Truncated Polynomial Ring Unit algorithm. This aim is achieved by adding two new key functions, Enckey() for encrypting and Deckey() for decrypting the Triple Data Encryption Standard key, to make the algorithm stronger. The obtained results also show good resistance against brute-force attack, which makes the system more effective, by applying the Nth Degree Truncated Polynomial Ring Unit algorithm to encrypt and decrypt the Triple Data Encryption Standard key. These modifications also enhance the degree of complexity, increase the key search space, and make the ciphered message difficult for an attacker to crack.
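The paper's Enckey() and Deckey() functions are not reproduced here, but the core operation underlying the Nth Degree Truncated Polynomial Ring Unit (NTRU) scheme, multiplication in the truncated polynomial ring Z_q[x]/(x^N - 1), can be sketched as follows. This is a toy illustration only: the function name `ring_mul` and the tiny parameters N = 5, q = 7 are assumptions for the sketch, not a secure parameter set.

```python
def ring_mul(a, b, N, q):
    """Cyclic convolution: multiply two degree-<N polynomials in Z_q[x]/(x^N - 1).

    a and b are coefficient lists of length N; exponents wrap modulo N
    because x^N = 1 in the truncated ring.
    """
    c = [0] * N
    for i in range(N):
        for j in range(N):
            c[(i + j) % N] = (c[(i + j) % N] + a[i] * b[j]) % q
    return c

# Multiplying by the constant polynomial 1 leaves a polynomial unchanged,
# and multiplying by x rotates the coefficient vector by one position.
a = [1, 2, 0, 0, 3]
print(ring_mul(a, [1, 0, 0, 0, 0], 5, 7))  # identity element
print(ring_mul(a, [0, 1, 0, 0, 0], 5, 7))  # rotation by x
```

In NTRU proper, a ciphertext is formed as e = (r*h + m) mod q with much larger N and q; this convolution only illustrates the ring arithmetic on which such a key-encryption layer is built.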

Publication Date
Wed Jan 01 2020
Journal Name
Advances In Science, Technology And Engineering Systems Journal
Bayes Classification and Entropy Discretization of Large Datasets using Multi-Resolution Data Aggregation

Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms such a
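As a minimal illustration of the entropy-discretization side (not the authors' multi-resolution algorithm), a single binary cut point on a numeric attribute can be chosen by maximizing information gain. The helper names `entropy` and `best_split` are assumptions for this sketch:

```python
from math import log2
from collections import Counter

def entropy(labels):
    """Shannon entropy of a class-label list, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def best_split(values, labels):
    """Return (gain, cut) for the midpoint cut maximizing information gain."""
    pairs = sorted(zip(values, labels))
    base = entropy(labels)
    best = (0.0, None)
    for i in range(1, len(pairs)):
        if pairs[i - 1][0] == pairs[i][0]:
            continue  # no valid cut between equal attribute values
        left = [l for _, l in pairs[:i]]
        right = [l for _, l in pairs[i:]]
        gain = base - (len(left) * entropy(left)
                       + len(right) * entropy(right)) / len(pairs)
        cut = (pairs[i - 1][0] + pairs[i][0]) / 2
        if gain > best[0]:
            best = (gain, cut)
    return best

# A perfectly separable attribute yields gain 1.0 at the midpoint 6.5.
print(best_split([1, 2, 3, 10, 11, 12], ['a', 'a', 'a', 'b', 'b', 'b']))
```

Entropy-based methods such as MDLP apply this cut recursively; the single-cut version above is only the inner step.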

Publication Date
Sat Aug 01 2015
Journal Name
Journal Of Engineering
Analytical Approach for Load Capacity of Large Diameter Bored Piles Using Field Data

An analytical approach based on field data was used to determine the strength capacity of large-diameter bored piles. The deformations and settlements were also evaluated for both vertical and lateral loading. The analytical predictions were compared to field data obtained from a prototype test pile used at the Tharthar–Tigris canal bridge and were found to be in acceptable agreement, with a deviation of 12%.

Following ASTM standard D1143M-07e1 (2010), a test schedule of five loading cycles was proposed for vertical loads, together with a series of cyclic loads to simulate horizontal loading. The load test results and analytical data of 1.95

Publication Date
Sun Jan 01 2023
Journal Name
Petroleum And Coal
Analyzing of Production Data Using Combination of empirical Methods and Advanced Analytical Techniques

Publication Date
Wed Feb 06 2013
Journal Name
Eng. & Tech. Journal
A proposal to detect computer worms (malicious codes) using data mining classification algorithms

Malicious software (malware) performs a malicious function that compromises a computer system's security. Many methods have been developed to improve the security of computer system resources, among them the use of firewalls, encryption, and Intrusion Detection Systems (IDS). An IDS can detect newly unrecognized attack attempts and raise an early alarm to inform the system about the suspicious intrusion attempt. This paper proposes a hybrid IDS for detecting intrusions, especially malware, that considers both network packet and host features. The hybrid IDS is designed using Data Mining (DM) classification methods for their ability to detect new, previously unseen intrusions accurately and automatically. It uses both anomaly and misuse dete
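The hybrid of misuse and anomaly detection described above can be illustrated, in much-simplified form, as a signature match combined with a statistical outlier test. The signatures, the packet-rate feature, and the 3-sigma threshold below are invented for this sketch and are not the paper's trained DM classifiers:

```python
from statistics import mean, stdev

# Hypothetical payload substrings standing in for misuse signatures.
SIGNATURES = {"cmd.exe /c", "powershell -enc"}

def misuse_detect(payload):
    """Misuse detection: flag payloads matching a known signature."""
    return any(sig in payload for sig in SIGNATURES)

def anomaly_detect(value, baseline, k=3.0):
    """Anomaly detection: flag a feature value more than k sample
    standard deviations away from the baseline mean."""
    mu, sigma = mean(baseline), stdev(baseline)
    return abs(value - mu) > k * sigma

def hybrid_alert(payload, pkt_rate, baseline_rates):
    """Raise an alert if either detector fires."""
    return misuse_detect(payload) or anomaly_detect(pkt_rate, baseline_rates)

baseline = [10, 11, 9, 10, 12, 10]  # packets/sec observed in normal traffic
print(hybrid_alert("GET /index.html HTTP/1.1", 10, baseline))   # benign
print(hybrid_alert("cmd.exe /c whoami", 10, baseline))          # signature hit
print(hybrid_alert("GET /index.html HTTP/1.1", 500, baseline))  # rate anomaly
```

A real hybrid IDS would replace both branches with learned classifiers over many packet and host features; the "or" combination of the two verdicts is the point being illustrated.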

Publication Date
Wed Jul 01 2020
Journal Name
Indonesian Journal Of Electrical Engineering And Computer Science
Fast and robust approach for data security in communication channel using pascal matrix

This paper presents a fast and robust approach to English text encryption and decryption based on the Pascal matrix. The technique encrypts Arabic or English text, or both, and shows the result of applying the method to plain text (the original message): the intelligible plain text is transformed into an unintelligible form in order to secure information from unauthorized access and theft. An encryption scheme usually uses a pseudo-random encryption key generated by an algorithm; here, all of this is done using the Pascal matrix. Encryption and decryption are implemented in MATLAB as the programming language, with Notepad++ used to write the input text.
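A minimal sketch of the Pascal-matrix idea, in Python rather than the paper's MATLAB: the lower-triangular Pascal matrix has an exact integer inverse whose entries are alternating-sign binomial coefficients, so a character-code vector can be encoded by one matrix-vector product and recovered by another. The function names are illustrative, not the paper's:

```python
from math import comb

def pascal(n):
    """Lower-triangular Pascal matrix: P[i][j] = C(i, j)."""
    return [[comb(i, j) for j in range(n)] for i in range(n)]

def pascal_inv(n):
    """Exact integer inverse of pascal(n): (-1)^(i+j) * C(i, j)."""
    return [[(-1) ** (i + j) * comb(i, j) for j in range(n)] for i in range(n)]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(v))]

def encrypt(text):
    """Encode character codes as c = P * m (exact integer arithmetic)."""
    m = [ord(ch) for ch in text]
    return matvec(pascal(len(m)), m)

def decrypt(cipher):
    """Recover the message as m = P^{-1} * c."""
    m = matvec(pascal_inv(len(cipher)), cipher)
    return "".join(chr(x) for x in m)

print(decrypt(encrypt("HELLO")))
```

Note that this transform is linear and keyless, so on its own it is an encoding rather than cryptographically strong encryption; the paper combines it with a pseudo-random key.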

Publication Date
Sat Dec 30 2023
Journal Name
Journal Of Economics And Administrative Sciences
The Cluster Analysis by Using Nonparametric Cubic B-Spline Modeling for Longitudinal Data

Longitudinal data is becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.

In this research, the focus was on grouping and analyzing these data: cluster analysis plays an important role in identifying and grouping co-expressed sub-profiles over time, and these are fitted with the nonparametric smoothing cubic B-spline model. This model provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; it is also flexible and can capture more complex patterns and fluctuations in the data.

The longitudinal balanced data profile was compiled into subgroup
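The cubic B-spline basis underlying the smoothing model can be evaluated with the standard Cox-de Boor recursion. This sketch, with a uniform knot vector chosen purely for illustration, shows the basis whose continuous first and second derivatives give the smoothness noted above; it is not the paper's full fitting or clustering procedure:

```python
def bspline_basis(i, k, t, knots):
    """Cox-de Boor recursion: value of the B-spline basis B_{i,k}(t).

    k is the degree (k = 3 for cubic); knots is a non-decreasing sequence.
    """
    if k == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = right = 0.0
    d1 = knots[i + k] - knots[i]
    if d1 > 0:
        left = (t - knots[i]) / d1 * bspline_basis(i, k - 1, t, knots)
    d2 = knots[i + k + 1] - knots[i + 1]
    if d2 > 0:
        right = (knots[i + k + 1] - t) / d2 * bspline_basis(i + 1, k - 1, t, knots)
    return left + right

# With uniform knots 0..8 there are 5 cubic basis functions; inside the
# valid interval they are non-negative and sum to 1 (partition of unity).
knots = list(range(9))
vals = [bspline_basis(i, 3, 4.0, knots) for i in range(5)]
print(vals, sum(vals))
```

A smoothing fit then expresses each longitudinal profile as a linear combination of these basis functions, and clustering operates on the fitted coefficients or curves.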

Publication Date
Wed Jan 20 2021
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
The Necessary Condition for Optimal Boundary Control Problems for Triple Elliptic Partial Differential Equations

In this work, we prove that the triple linear partial differential equations (PDEs) of elliptic type (TLEPDEs) with a given classical continuous boundary control vector (CCBCVr) have a unique "state" solution vector (SSV) by utilizing Galerkin's method (GME). We also prove the existence of a classical continuous boundary optimal control vector (CCBOCVr) ruled by the TLEPDEs. We study the existence of a solution for the triple adjoint equations (TAJEs) associated with the triple state equations (TSEs). The Fréchet derivative (FDe) of the objective function is derived. Finally, we prove the necessary conditions theorem (NCTh) for optimality for the problem.

Publication Date
Wed May 17 2023
Journal Name
College Of Islamic Sciences
The meaning of the triple verb more with one letter in Diwan al-Shafi'i

The Diwan of Imam Al-Shafi'i is of great importance, as Al-Shafi'i is an authority in the language. Seeing that no one had preceded me in exploring its depths, I took up my tools and turned toward it, intending to study the triliteral verb in it. I pause at these verbs and their morphological forms, examining the significance of the triliteral verb augmented by one, two, or three letters. I found that they are many, and a study of this size cannot contain them all, so the choice fell on the triliteral verb augmented by one letter and the significance of that augmentation, since an increase in the form necessitates an increase in meaning. From here the study was limited to the triple

Publication Date
Sat Jan 01 2011
Journal Name
Communications In Computer And Information Science
The Use of Biorthogonal Wavelet, 2D Polynomial and Quadtree to Compress Color Images

In this paper, a compression system with a highly synthetic architecture is introduced. It is based on the wavelet transform, polynomial representation, and quadtree coding. The biorthogonal (tap 9/7) wavelet transform is used to decompose the image signal, and a 2D polynomial representation is utilized to prune the high-scale variation of the image signal. Quantization with quadtree coding, followed by shift coding, is applied to compress the detail bands and the residual part of the approximation subband. The test results indicate that the introduced system is simple and fast, and that it leads to better compression gain compared with using a first-order polynomial approximation.
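The quadtree side of such a scheme can be sketched as a recursive split driven by a simple intensity-range test. The threshold rule and leaf payload (the block mean) below are simplifications assumed for the sketch, not the paper's quantization-plus-shift-coding pipeline:

```python
def quadtree(img, x, y, size, thresh, leaves):
    """Recursively split a square block while its intensity range exceeds thresh.

    Homogeneous blocks become leaves (x, y, size, mean); others split into
    four quadrants. img is a list of rows; size must be a power of two.
    """
    vals = [img[y + r][x + c] for r in range(size) for c in range(size)]
    if max(vals) - min(vals) <= thresh or size == 1:
        leaves.append((x, y, size, sum(vals) // len(vals)))
        return
    h = size // 2
    for dy in (0, h):
        for dx in (0, h):
            quadtree(img, x + dx, y + dy, h, thresh, leaves)

# A flat block collapses to a single leaf; a half-and-half block splits
# into four homogeneous quadrants.
flat = [[5] * 4 for _ in range(4)]
leaves = []
quadtree(flat, 0, 0, 4, 0, leaves)
print(leaves)
```

In a full codec each leaf's payload would be quantized and entropy-coded (here, shift coding), and the tree structure itself is stored as part of the bitstream.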

Publication Date
Sun Nov 19 2017
Journal Name
Journal Of Al-qadisiyah For Computer Science And Mathematics
Image Compression based on Fixed Predictor Multiresolution Thresholding of Linear Polynomial Nearlossless Techniques

Image compression is a serious issue in computer storage and transmission that simply makes efficient use of the redundancy embedded within an image itself; in addition, it may exploit human vision or perception limitations to reduce imperceivable information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed: the first stage utilizes the lossy predictor model along with a multiresolution base and thresholding techniques; the second incorporates the near lossless com
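A minimal sketch of fixed-predictor near-lossless coding, under the assumption of a first-order (left-neighbor) predictor and a residual quantization step of 2t+1, which bounds the per-sample reconstruction error by t. This illustrates the general technique, not the paper's multiresolution-thresholding pipeline:

```python
def encode(row, t):
    """Closed-loop first-order predictor; residuals quantized with step 2t+1.

    Predicting from the decoder's reconstruction (prev) rather than the
    original keeps the error bounded by t per sample.
    """
    step = 2 * t + 1
    qres, prev = [], 0
    for x in row:
        q = round((x - prev) / step)   # quantized prediction residual
        qres.append(q)
        prev += q * step               # track the decoder's reconstruction
    return qres

def decode(qres, t):
    """Invert encode(): accumulate dequantized residuals."""
    step = 2 * t + 1
    row, prev = [], 0
    for q in qres:
        prev += q * step
        row.append(prev)
    return row

row = [100, 102, 105, 104, 90]
print(decode(encode(row, 2), 2))  # each value within +/-2 of the original
```

With t = 0 the step is 1 and the scheme degenerates to lossless predictive coding; the quantized residuals are small and peaked around zero, which is what makes the subsequent entropy-coding stage effective.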
