Articles
A New Methodology to Find Private Key of RSA Based on Euler Totient Function

The aim of this paper is to present a new methodology for finding the private key of RSA. A new initial value, generated from a new equation, is selected to speed up the process; once this value is found, a brute-force attack is used to discover the private key. For the proposed equation, the multiplier of the Euler totient function relating the public and private keys is assumed to be 1, which implies that the equation estimating the initial value is suitable only for a small multiplier. The experimental results show that if all prime factors of the modulus are larger than 3 and the multiplier is 1, the distance between the initial value and the private key decreases by about 66%. On the other hand, the distance decreases by less than 1% when the multiplier is larger than 66. Therefore, to resist the proposed attack, a multiplier larger than 66 should be chosen. Furthermore, it is shown that if the public key equals 3, the multiplier always equals 2.
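As a rough illustration of the search the abstract describes, the sketch below brute-forces d satisfying e·d = k·φ(n) + 1 on toy parameters. The paper's initial-value equation is not reproduced here, so the starting point and names are assumptions; k is the "multiplier" of the Euler totient function discussed above.

```python
def brute_force_private_key(e, p, q, start=1):
    """Brute-force d with e*d ≡ 1 (mod φ(n)); `start` stands in for the
    paper's estimated initial value (its exact equation is not shown here)."""
    phi = (p - 1) * (q - 1)
    d = start
    while (e * d) % phi != 1:
        d += 1
    k = (e * d - 1) // phi  # the multiplier of the Euler totient function
    return d, k

# toy primes, far too small for real RSA, chosen only for illustration
d, k = brute_force_private_key(17, 61, 53)   # n = 3233, φ(n) = 3120
d3, k3 = brute_force_private_key(3, 11, 23)  # public key 3 → multiplier 2
```

Note how the second call exhibits the abstract's closing claim: with e = 3 the recovered multiplier is 2.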

Scopus Clarivate Crossref
Publication Date
Tue Nov 01 2016
Journal Name
Journal Of Economics And Administrative Sciences
Proposal of Using Principle of Maximizing Entropy of Generalized Gamma Distribution to Estimate the Survival probabilities of the Population in Iraq

In this research we estimate the survival function for data affected by the disturbances and confusion of the Iraq Household Socio-Economic Survey (IHSES II 2012), with five-year age-group data following the Generalized Gamma (GG) distribution. Two methods were used for estimation and fitting: the Principle of Maximizing Entropy (POME), and bootstrapping with a nonparametric kernel smoothing function, to overcome the mathematical problems posed by the integrals in this distribution, in particular the incomplete gamma function; the traditional Maximum Likelihood (ML) method was used alongside them. The comparison is based on the method of the Cen…
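For context, the survival function of a generalized gamma variable can be approximated without the incomplete gamma integral the abstract flags as problematic, for example by integrating the density numerically. The Stacy parameterization below is an assumption, not necessarily the study's.

```python
import math

def gg_pdf(t, a, d, p):
    """Generalized gamma density, Stacy parameterization (assumed):
    f(t) = (p / a**d) * t**(d-1) * exp(-(t/a)**p) / Γ(d/p)."""
    return (p / a**d) * t**(d - 1) * math.exp(-((t / a) ** p)) / math.gamma(d / p)

def gg_survival(t, a, d, p, steps=20000):
    """S(t) = 1 − ∫₀ᵗ f(u) du via the composite trapezoidal rule,
    sidestepping the incomplete gamma function (assumes d ≥ 1 so the
    density is finite at the left endpoint)."""
    h = t / steps
    area = 0.5 * h * (gg_pdf(1e-12, a, d, p) + gg_pdf(t, a, d, p))
    for i in range(1, steps):
        area += h * gg_pdf(i * h, a, d, p)
    return 1.0 - area

# sanity check: GG(a=1, d=1, p=1) is Exponential(1), so S(1) ≈ e⁻¹
```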

Crossref
Publication Date
Thu Jun 30 2022
Journal Name
Journal Of Economics And Administrative Sciences
Estimation of Time of Survival Rate by Using Clayton Function for the Exponential Distribution with Practical Application

Each phenomenon contains several variables. By studying these variables we find a mathematical formula for the joint distribution, and the copula is a useful tool for measuring the amount of correlation; here the survival function was used to measure the relationship of age with the level of creatinine remaining in a person's blood. The SPSS program was used to extract the influential variables from a group of variables using factor analysis, and then the Clayton copula function was used to construct bivariate joint distributions from the multivariate distributions. The bivariate distribution was calculated, and then the survival function value was calculated for a sample of size 50 drawn from Yarmouk Ho…
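The Clayton construction described above can be sketched as follows. The exponential marginals and the parameter values are illustrative assumptions, not the study's fitted model.

```python
import math

def clayton(u, v, theta):
    """Clayton copula C(u, v) = (u^−θ + v^−θ − 1)^(−1/θ), θ > 0."""
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

def joint_survival(t1, t2, lam1, lam2, theta):
    """Bivariate survival obtained by coupling exponential marginal
    survival functions S_i(t) = exp(−λ_i t) through the Clayton copula
    (survival-copula construction; marginals are assumptions)."""
    return clayton(math.exp(-lam1 * t1), math.exp(-lam2 * t2), theta)
```

Larger θ induces stronger lower-tail dependence between the two survival times; θ → 0 approaches independence.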

Crossref
Publication Date
Fri Jan 01 2021
Journal Name
Computers, Materials & Continua
A New Hybrid Feature Selection Method Using T-test and Fitness Function

Scopus (10)
Crossref (7)
Scopus Clarivate Crossref
Publication Date
Sun Feb 25 2024
Journal Name
Baghdad Science Journal
An exploratory study of history-based test case prioritization techniques on different datasets

In regression testing, test case prioritization (TCP) is a technique for arranging all the available test cases. TCP techniques can improve fault-detection performance, which is measured by the average percentage of fault detection (APFD). History-based TCP is a family of TCP techniques that considers the history of past data when prioritizing test cases. The allocation of equal priority to test cases is a common problem for most TCP techniques; however, this problem has not been explored in history-based TCP techniques. To solve it in regression testing, most researchers resort to random sorting of test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement…
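The APFD measure the abstract relies on can be computed directly from an ordering and a test-by-fault detection matrix; the sketch below uses the standard formula APFD = 1 − (TF₁ + … + TF_m)/(n·m) + 1/(2n), with illustrative names and a toy matrix.

```python
def apfd(order, detects):
    """Average Percentage of Faults Detected.

    order   : list of test-case indices in execution order
    detects : detects[t][f] is True if test t reveals fault f
    """
    n = len(order)
    m = len(detects[0])
    total = 0
    for f in range(m):                       # TF_f: position of the first
        for pos, t in enumerate(order, 1):   # test revealing fault f
            if detects[t][f]:
                total += pos
                break
    return 1 - total / (n * m) + 1 / (2 * n)

# toy example: two tests, each revealing one distinct fault
toy = [[True, False], [False, True]]
```

With `toy`, either ordering of the two tests yields APFD = 0.5, which is exactly the equal-priority tie the study investigates: the metric alone cannot break it.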

Scopus (1)
Crossref (1)
Publication Date
Fri Mar 31 2017
Journal Name
Al-khwarizmi Engineering Journal
Design of Nonlinear PID Neural Controller for the Speed Control of a Permanent Magnet DC Motor Model based on Optimization Algorithm

In this paper, the speed control of a real DC motor is experimentally investigated using a nonlinear PID neural network controller. As simple, fast tuning algorithms, two optimization techniques are used, the trial-and-error method and the particle swarm optimization (PSO) algorithm, to tune the nonlinear PID neural controller's parameters and find the best speed response of the DC motor. To save time on the real system, a MATLAB simulation package is used to carry out these algorithms and find the best values of the nonlinear PID parameters. These parameters are then used in the real-time nonlinear PID controller system designed with LabVIEW. Simulation and experimental results are compared with each other and showed…
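The PSO tuning loop described above can be sketched as below. The first-order motor model, its constants, the cost function (integral of absolute error), and the search bounds are all assumptions for illustration; the paper's identified motor and neural controller are not reproduced.

```python
import math
import random

def motor_iae(kp, ki, kd, dt=0.01, steps=300):
    """Integral of absolute error for a unit-step speed command on a toy
    first-order motor model dω/dt = (K·u − ω)/τ (K, τ assumed)."""
    K, tau = 2.0, 0.5
    w = integ = iae = 0.0
    prev_err = 1.0
    for _ in range(steps):
        err = 1.0 - w
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv   # PID control signal
        w += dt * (K * u - w) / tau
        prev_err = err
        iae += abs(err) * dt
    return iae if math.isfinite(iae) else float("inf")  # unstable gains

def pso_tune_pid(n=15, iters=30, seed=1):
    """Minimal PSO over (Kp, Ki, Kd) in [0, 5]³ minimising the IAE cost."""
    rng = random.Random(seed)
    pos = [[rng.uniform(0.0, 5.0) for _ in range(3)] for _ in range(n)]
    vel = [[0.0] * 3 for _ in range(n)]
    pbest = [p[:] for p in pos]
    pcost = [motor_iae(*p) for p in pos]
    g = min(range(n), key=pcost.__getitem__)
    gbest, gcost = pbest[g][:], pcost[g]
    w, c1, c2 = 0.7, 1.5, 1.5   # inertia, cognitive and social pulls
    for _ in range(iters):
        for i in range(n):
            for d in range(3):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(5.0, max(0.0, pos[i][d] + vel[i][d]))
            cost = motor_iae(*pos[i])
            if cost < pcost[i]:
                pbest[i], pcost[i] = pos[i][:], cost
                if cost < gcost:
                    gbest, gcost = pos[i][:], cost
    return gbest, gcost
```

A trial-and-error baseline would simply call `motor_iae` by hand with guessed gains; PSO automates that search over the same cost.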

Publication Date
Mon Sep 23 2019
Journal Name
Baghdad Science Journal
Hazard Rate Estimation Using Varying Kernel Function for Censored Data Type I

In this research, several estimators are introduced. These estimators are closely related to the hazard function and use one of the nonparametric methods, namely the kernel function, for type-I censored data with varying bandwidth and kernel boundary. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four types of boundary kernel are used, namely Rectangle, Epanechnikov, Biquadratic and Triquadratic, and the proposed function was employed with all kernel functions. Two different simulation techniques are used in two experiments to compare these estimators. In most cases, the results show that the local bandwidth is the best for all the…
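A minimal sketch of a kernel hazard estimate with one of the kernels named above (Epanechnikov); this is a generic f̂/Ŝ construction under assumed names, not the paper's proposed estimator or its boundary correction.

```python
def epanechnikov(u):
    """Epanechnikov kernel, one of the four boundary kernels listed above."""
    return 0.75 * (1.0 - u * u) if abs(u) < 1.0 else 0.0

def kernel_hazard(t, times, censored, h):
    """Naive kernel hazard estimate ĥ(t) = f̂(t) / Ŝ(t) for type-I censored
    data: a kernel density over the observed (uncensored) event times
    divided by the empirical survival function (a sketch only)."""
    n = len(times)
    density = sum(epanechnikov((t - ti) / h) / h
                  for ti, c in zip(times, censored) if not c) / n
    surv = max(sum(1 for ti in times if ti > t) / n, 1.0 / n)  # floor at 1/n
    return density / surv
```

A local bandwidth, as the abstract favours, would replace the fixed `h` with a value chosen per evaluation point `t`.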

Publication Date
Fri Dec 01 2023
Journal Name
Baghdad Science Journal
Preparing New Ceramic Membranes from Syrian Zeolite Coated with Silver Nanoparticles to Treatment Wells Water

As a result of the worsening problem of water pollution, research has been directed towards treatment using ceramic membranes, which have proved highly effective in treating all water sources. This research aims to study the possibility of preparing a new type of ceramic membrane from Syrian zeolite, which has not previously been used in this field. Ceramic membranes were prepared from raw Syrian zeolite in several stages: the zeolite sample was characterized, ground, mixed with boric acid, pressed into disks, treated thermally according to the experimental program, and finally coated with silver nanoparticles. The specifications of the prepared membranes were determined according to reference methods; the effectiveness of the prepared…

Scopus (1)
Scopus Crossref
Publication Date
Tue Aug 01 2023
Journal Name
Baghdad Science Journal
Digital Data Encryption Using a Proposed W-Method Based on AES and DES Algorithms

This paper proposes a new encryption method that combines two cipher algorithms, DES and AES, to generate hybrid keys. This combination strengthens the proposed W-method by generating highly randomized keys. The reliability of any encryption technique rests on two points. The first is key generation: our approach merges 64 bits of DES with 64 bits of AES to produce 128 bits as a root key for all 15 remaining keys. This complexity increases the level of the ciphering process; moreover, each step shifts the key only one bit to the right. The second is the nature of the encryption process: it uses two keys and mixes one round of DES with one round of AES to reduce the processing time. The W-method deals with…
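A minimal sketch of the key-derivation idea just described: the two 64-bit inputs stand in for DES- and AES-derived material (the real key schedules are not reproduced), and the one-bit right shift is assumed to be a rotation so that no key bits are lost across the 15 derivations.

```python
def root_key(des64: int, aes64: int) -> int:
    """Concatenate 64 bits of DES-derived material with 64 bits of
    AES-derived material into a 128-bit root key (inputs are placeholders)."""
    assert 0 <= des64 < 2**64 and 0 <= aes64 < 2**64
    return (des64 << 64) | aes64

def derive_keys(root: int, rounds: int = 15) -> list:
    """Produce the 15 remaining keys, each a one-bit right rotation of the
    previous 128-bit key (rotation, not a plain shift, is an assumption)."""
    keys, k = [], root
    for _ in range(rounds):
        k = (k >> 1) | ((k & 1) << 127)
        keys.append(k)
    return keys
```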

Scopus (6)
Crossref (2)
Publication Date
Fri Feb 08 2019
Journal Name
Journal Of The College Of Education For Women
Software Protection by Combining Hash Function with Hardware Identifications

This paper presents a hybrid software copy-protection scheme. The scheme is applied to prevent illegal copying of software by producing a license key which is unique and easy to generate. This work employs the unique identification of the hard disk in a personal computer, which can be obtained by software, to create a license key after processing with the SHA-1 one-way hash function. Two main measures are used to evaluate the proposed method: complexity and processing time. SHA-1 ensures high enough complexity to prevent hackers from producing unauthorized copies; many experiments were executed using different sizes of software to measure the time consumed. The measures show high complexity and short execution time for the proposed…
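The core of the scheme reduces to hashing a hardware identifier; a minimal sketch follows. Reading the hard-disk identifier is platform-specific and omitted here, so `disk_id` is assumed to have been obtained already by the software.

```python
import hashlib

def license_key(disk_id: str) -> str:
    """License key = SHA-1 digest of the hard-disk identifier, mirroring
    the scheme above (identifier retrieval itself is not shown)."""
    return hashlib.sha1(disk_id.encode("utf-8")).hexdigest()
```

Because SHA-1 is one-way, the key is easy to generate on the licensed machine but hard to forge for a different disk identifier.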

Publication Date
Fri Feb 01 2019
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of estimations methods of the entropy function to the random coefficients for two models: the general regression and swamy of the panel data

In this study we focus on random-coefficient estimation for the general regression and Swamy models of panel data. Using this type of data gives a better chance of obtaining a better method and better indicators. Entropy methods have been used to estimate the random coefficients for the general regression and Swamy models of panel data in two ways: the first is the maximum dual entropy and the second is the general maximum entropy; a comparison between them was carried out using simulation to choose the optimal method.

The results have been compared using mean squared error and mean absolute percentage error for different cases in terms of the correlation valu…

Crossref