Modified PRESENT Encryption algorithm based on new 5D Chaotic system

Cryptography is a major concern in communication systems. IoE technology is a new trend of smart systems based on various constrained devices. Lightweight cryptographic algorithms mainly address the security concerns of constrained devices and IoE systems. On the other hand, most lightweight algorithms suffer from a trade-off between complexity and performance, and the strength of a cryptosystem involves both the speed of the algorithm and its complexity against cryptanalysis. A chaotic system is based on nonlinear dynamic equations that are sensitive to initial conditions and produce high randomness, which makes it a good choice for cryptosystems. In this work, we propose a new five-dimensional chaotic system for a lightweight cryptographic algorithm. The proposed chaotic system is considered super-chaotic. The proposed algorithm was examined with all 15 tests of the NIST suite and showed high randomness and complexity.
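
The paper's 5D equations are not reproduced in this listing, so the sketch below only illustrates the general idea with a hypothetical Lorenz-like five-dimensional flow: integrate the system, discard the transient, and quantize one state variable into a keystream. Every coefficient, the seed, and the quantization rule here are illustrative assumptions, not the paper's design (which modifies PRESENT).

```python
import numpy as np

# Hypothetical 5D flow (illustrative only; not the paper's equations).
def step(s, dt=0.001, a=10.0, b=28.0, c=8.0 / 3.0, d=2.0, e=8.0):
    x, y, z, u, v = s
    ds = np.array([a * (y - x) + u,
                   b * x - y - x * z + v,
                   x * y - c * z,
                   -d * u - x * z,
                   e * y - v])
    return s + dt * ds

def keystream(seed, n_bytes, burn_in=5000):
    s = np.asarray(seed, dtype=float)
    for _ in range(burn_in):                   # discard the transient
        s = step(s)
    out = bytearray()
    while len(out) < n_bytes:
        s = step(s)
        out.append(int(abs(s[0]) * 1e6) % 256)  # quantize x into a byte
    return bytes(out)

plaintext = b"lightweight IoE message"
ks = keystream([0.1, 0.2, 0.3, 0.4, 0.5], len(plaintext))
ciphertext = bytes(p ^ k for p, k in zip(plaintext, ks))
```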

Publication Date: Sun Sep 04 2016
Journal Name: Baghdad Science Journal
The Importance and Interaction Indices of Bi-Capacities Based on Ternary-Element Sets

Grabisch and Labreuche have recently proposed a generalization of capacities, called bi-capacities. Recently, a new approach for studying bi-capacities through the notion of ternary-element sets was proposed by the author. In this paper, we present several results, such as the bipolar Möbius transform, the importance index, and the interaction index of bi-capacities, based on our approach.
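
For orientation, the classical (unipolar) versions of two notions the abstract names are given below; the paper generalizes them to bi-capacities via ternary-element sets, and those bipolar formulas are not reproduced in this listing.

```latex
% Mobius transform of a capacity v on N = {1,...,n}, and its inverse:
m(A) = \sum_{B \subseteq A} (-1)^{|A \setminus B|}\, v(B),
\qquad
v(A) = \sum_{B \subseteq A} m(B).

% Shapley importance index of criterion i:
\phi(i) = \sum_{A \subseteq N \setminus \{i\}}
          \frac{(n - |A| - 1)!\, |A|!}{n!}
          \bigl( v(A \cup \{i\}) - v(A) \bigr).
```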

Publication Date: Sun Jun 01 2014
Journal Name: Baghdad Science Journal
Survival estimation for singly type one censored sample based on generalized Rayleigh distribution

This paper is concerned with estimating the unknown parameters of the generalized Rayleigh distribution model based on singly type-one censored samples. The probability density function of the generalized Rayleigh distribution is defined along with its properties. The maximum likelihood method is used to derive point estimates for all unknown parameters through an iterative procedure (the Newton-Raphson method), and confidence interval estimates are then derived from the Fisher information matrix. Finally, we test whether the current model (GRD) fits a set of real data, then compute the survival function and hazard function for these data.
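
A sketch of type-I censored maximum likelihood for the generalized Rayleigh distribution, assuming one common (Burr type X) parameterization, F(x) = (1 - e^{-(λx)²})^α; the paper's exact form may differ, and a quasi-Newton optimizer stands in here for its Newton-Raphson iteration.

```python
import numpy as np
from scipy.optimize import minimize

# Assumed parameterization: F(x) = (1 - exp(-(lam*x)**2))**alpha.
def log_pdf(x, alpha, lam):
    z = (lam * x) ** 2
    return (np.log(2.0 * alpha) + 2.0 * np.log(lam) + np.log(x)
            - z + (alpha - 1.0) * np.log1p(-np.exp(-z)))

def log_sf(t, alpha, lam):               # log P(X > t), for censored units
    return np.log1p(-(1.0 - np.exp(-(lam * t) ** 2)) ** alpha)

def neg_ll(theta, obs, t_cens, n_cens):  # singly type-I censored likelihood
    alpha, lam = np.exp(theta)           # log-parameters keep both positive
    return -(log_pdf(obs, alpha, lam).sum()
             + n_cens * log_sf(t_cens, alpha, lam))

rng = np.random.default_rng(1)
u = rng.uniform(size=200)
x = np.sqrt(-np.log(1.0 - u ** (1.0 / 1.5))) / 0.8   # inverse-CDF sample
t_cens = 1.5                                         # censoring time
obs = x[x <= t_cens]                                 # only failures before t_cens
res = minimize(neg_ll, x0=np.log([1.0, 1.0]),
               args=(obs, t_cens, len(x) - len(obs)))
alpha_hat, lam_hat = np.exp(res.x)
# res.hess_inv approximates the inverse observed information (in log-params),
# the ingredient for Fisher-information confidence intervals.
survival = lambda s: 1.0 - (1.0 - np.exp(-(lam_hat * s) ** 2)) ** alpha_hat
```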

Publication Date: Wed Nov 19 2025
Journal Name: Journal Of The College Of Basic Education
Fuzzy Nonparametric Regression Model Estimation Based on some Smoothing Techniques With Practical Application

In this research, fuzzy nonparametric methods based on several smoothing techniques were applied to real data from the Iraqi stock market, specifically data on the Baghdad Company for Soft Drinks for the year 2016 (1/1/2016-31/12/2016). A sample of 148 observations was used to construct a model of the relationship between the stock prices (low, high, modal) and the traded value. Comparing the goodness-of-fit (G.O.F.) criterion across three techniques, the lowest value of this criterion was obtained for the k-nearest neighbour technique with the Gaussian function.
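
The fuzzy machinery is not shown in this listing, so the sketch below reduces the idea to its crisp core: a kernel smoother whose bandwidth at each point is the distance to the k-th nearest neighbour, with Gaussian weights, plus a simple mean-squared-error stand-in for the G.O.F. criterion. Function names and the exact criterion are assumptions.

```python
import numpy as np

def knn_gaussian_smoother(x_train, y_train, x_eval, k=10):
    """Kernel smoother with an adaptive k-NN bandwidth and Gaussian weights
    (a crisp stand-in for the fuzzy nonparametric version in the paper)."""
    y_hat = np.empty_like(x_eval, dtype=float)
    for i, x0 in enumerate(x_eval):
        d = np.abs(x_train - x0)
        h = np.sort(d)[k - 1] + 1e-12          # distance to k-th neighbour
        w = np.exp(-0.5 * (d / h) ** 2)        # Gaussian weights
        y_hat[i] = np.sum(w * y_train) / np.sum(w)
    return y_hat

def gof(y, y_hat):                             # simple G.O.F. stand-in (MSE)
    return np.mean((y - y_hat) ** 2)

x = np.linspace(0.0, 10.0, 148)                # 148 points, as in the study
y = np.sin(x) + 0.1 * np.random.default_rng(0).normal(size=x.size)
print(gof(y, knn_gaussian_smoother(x, y, x, k=10)))
```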

Publication Date: Thu Jun 06 2024
Journal Name: Journal Of Applied Engineering And Technological Science (JAETS)
Deep Learning and Its Role in Diagnosing Heart Diseases Based on Electrocardiography (ECG)

Diagnosing heart disease has become a very important topic for researchers specializing in artificial intelligence, especially after the Corona pandemic, which pushed the world toward intelligent systems. The basic idea of this research is to shed light on the diagnosis of heart diseases by relying on deep learning with a pre-trained model (EfficientNet-B3), using the electrical signals of the electrocardiogram and resampling the signal in order to feed it to the neural network with only trimming operations, because it is an electrical signal whose parameters cannot be changed. The dataset (China Physiological Signal Challenge - cspsc2018) was ad…
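
A rough sketch of the pipeline the abstract describes, under stated assumptions: a 12-lead record is resampled to a fixed length, tiled into a 3-channel tensor, and passed to a pre-trained EfficientNet-B3 whose classifier head is replaced for nine rhythm classes. The input shaping, target length, and class count are assumptions, not the paper's exact preprocessing.

```python
import numpy as np
import torch
import torchvision
from scipy.signal import resample

NUM_CLASSES, TARGET_LEN = 9, 3000          # assumed: nine CPSC2018 classes

model = torchvision.models.efficientnet_b3(weights="IMAGENET1K_V1")
model.classifier[1] = torch.nn.Linear(model.classifier[1].in_features,
                                      NUM_CLASSES)
model.eval()

def preprocess(ecg):                        # ecg: numpy array of shape (12, n)
    x = resample(ecg, TARGET_LEN, axis=1)   # fixed-length resampling only
    x = torch.tensor(x, dtype=torch.float32)
    x = (x - x.mean()) / (x.std() + 1e-8)   # simple normalisation
    img = x.reshape(1, 120, 300).expand(3, -1, -1)  # tile leads into a 2D map
    return img.unsqueeze(0)                 # (1, 3, 120, 300)

record = np.random.randn(12, 5000)          # stand-in for one ECG record
with torch.no_grad():
    logits = model(preprocess(record))
```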

Publication Date: Sun Feb 25 2024
Journal Name: Baghdad Science Journal
An exploratory study of history-based test case prioritization techniques on different datasets

In regression testing, test case prioritization (TCP) is a technique to arrange all the available test cases. TCP techniques can improve fault-detection performance, which is measured by the average percentage of fault detection (APFD). History-based TCP is one of the TCP techniques; it considers the history of past data to prioritize test cases. The allocation of equal priority to test cases is a common problem for most TCP techniques; however, this problem has not been explored in history-based TCP techniques. To solve it in regression testing, most researchers resort to random ordering of test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement…
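
A minimal sketch of a history-based prioritizer with the equal-priority issue made explicit: tests are ranked by historical failure rate, and ties fall back to random order (the common workaround the abstract mentions). APFD is computed with its standard formula; the data shapes and names are assumptions.

```python
import random

def prioritize(history):
    """history: dict test_id -> list of past verdicts (1 = failed, 0 = passed).
    Rank by historical failure rate; ties break randomly."""
    def failure_rate(tid):
        runs = history[tid]
        return sum(runs) / len(runs) if runs else 0.0
    ids = list(history)
    random.shuffle(ids)          # stable sort then keeps a random tie order
    return sorted(ids, key=failure_rate, reverse=True)

def apfd(order, faults):
    """faults: dict fault_id -> set of test_ids that detect it.
    APFD = 1 - (TF_1 + ... + TF_m) / (n * m) + 1 / (2n)."""
    n, m = len(order), len(faults)
    pos = {t: i + 1 for i, t in enumerate(order)}
    tf = sum(min(pos[t] for t in detectors) for detectors in faults.values())
    return 1 - tf / (n * m) + 1 / (2 * n)

history = {"t1": [1, 0, 1], "t2": [0, 0, 0], "t3": [1, 0, 1]}  # t1/t3 tie
order = prioritize(history)
print(order, apfd(order, {"f1": {"t1"}, "f2": {"t2", "t3"}}))
```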

Publication Date: Tue Aug 10 2021
Journal Name: Design Engineering
Lossy Image Compression Using Hybrid Deep Learning Autoencoder Based On K-mean Clustering

Image compression plays an important role in reducing the size and storage of data while significantly increasing the speed of its transmission over the Internet. Image compression has been an important research topic for several decades, and recently, with the great successes achieved by deep learning in many areas of image processing, its use in image compression has been growing steadily. Deep neural networks have also achieved great success in processing and compressing various images of different sizes. In this paper, we present a structure for image compression based on a deep-learning Convolutional AutoEncoder (CAE), inspired by the diversity of the human eye…
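
A minimal convolutional autoencoder sketch; the hybrid k-means step (clustering/quantizing latent values before entropy coding) is indicated by a comment only, since the paper's actual architecture is not reproduced in this listing.

```python
import torch
import torch.nn as nn

class CAE(nn.Module):
    """Tiny convolutional autoencoder for lossy compression (illustrative)."""
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # H/2
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # H/4
            nn.Conv2d(64, 8, 3, padding=1),                        # bottleneck
        )
        self.dec = nn.Sequential(
            nn.ConvTranspose2d(8, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        z = self.enc(x)
        # a k-means codebook would be fit on z here, replacing each latent
        # value by its nearest centroid before entropy coding
        return self.dec(z)

model = CAE()
x = torch.rand(1, 3, 128, 128)
loss = nn.functional.mse_loss(model(x), x)   # rate term omitted in this sketch
```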

Publication Date: Tue Dec 01 2020
Journal Name: Minar International Journal Of Applied Sciences And Technology
Innovate Gestational Age Estimation Model for Iraqi Fetuses Based on Ultrasound Images Measurements

Imaging by ultrasound (US) is an accurate and useful modality for the assessment of gestational age (GA), estimation of fetal weight, and monitoring of fetal growth during pregnancy; it is a routine part of prenatal care and can greatly impact obstetric management. Estimation of GA is important in obstetric care, as making appropriate management decisions requires an accurate appraisal of GA. Accurate GA estimation may assist obstetricians in appropriately counseling women who are at risk of a preterm delivery about likely neonatal outcomes, and it is essential in the evaluation of fetal growth and the detection of intrauterine growth restriction. Many formulas are used to estimate fetal GA around the world, but they are not specified fo…
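
The abstract is cut off before the model itself, so the following is only a generic least-squares sketch assuming the standard biometric predictors (BPD, HC, AC, FL, in cm); the paper derives its own formula for Iraqi fetuses, which is not reproduced here.

```python
import numpy as np

def fit_ga_model(measurements, ga_weeks):
    """Ordinary least squares: GA (weeks) ~ intercept + BPD + HC + AC + FL."""
    X = np.column_stack([np.ones(len(ga_weeks)), measurements])
    coef, *_ = np.linalg.lstsq(X, ga_weeks, rcond=None)
    return coef

def predict_ga(coef, bpd, hc, ac, fl):
    return coef @ np.array([1.0, bpd, hc, ac, fl])

# toy data (hypothetical values, for shape only): rows of [BPD, HC, AC, FL]
meas = np.array([[4.6, 17.0, 14.8, 3.2],
                 [7.5, 28.0, 26.0, 5.9],
                 [9.0, 32.5, 33.0, 7.2]])
ga = np.array([19.0, 30.0, 37.0])
coef = fit_ga_model(meas, ga)
print(predict_ga(coef, 8.0, 30.0, 29.0, 6.4))
```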

Publication Date: Mon Oct 24 2022
Journal Name: Energies
Double-Slope Solar Still Productivity Based on the Number of Rubber Scraper Motions

In low-latitude areas (latitude angle less than 10°), the solar radiation entering the solar still increases as the cover slope approaches the latitude angle. However, the amount of water that condenses and then falls back toward the solar-still basin is also increased in this case. Consequently, the solar-still yield is significantly decreased and the accuracy of the prediction method is affected. This reduction in yield and prediction accuracy is inversely proportional to the time that the condensed water stays on the inner side of the condensing cover without collection, because more drops will fall back into the basin of the solar still. Different numbers of scraper motions per hour (NSM), that is…

Publication Date: Sun Nov 19 2017
Journal Name: Journal Of Al-Qadisiyah For Computer Science And Mathematics
Image Compression based on Fixed Predictor Multiresolution Thresholding of Linear Polynomial Nearlossless Techniques

Image compression is a serious issue in computer storage and transmission; it simply makes efficient use of the redundancy embedded within an image itself and, in addition, may exploit the limitations of human vision or perception to reduce imperceivable information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed: the first stage utilizes the lossy predictor model along with multiresolution base and thresholding techniques, and the second stage incorporates the near-lossless com…
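
A sketch of the near-lossless part under common conventions: a fixed linear predictor (mean of the north and west reconstructed neighbours) with residual quantization of step 2δ+1, which bounds the per-pixel reconstruction error by δ. The paper's multiresolution/thresholding stage is not reproduced here.

```python
import numpy as np

def encode(img, delta=2):
    """Fixed predictor + near-lossless residual quantization (|error| <= delta)."""
    h, w = img.shape
    rec = np.zeros((h, w), np.int32)        # decoder-visible reconstruction
    q = np.zeros((h, w), np.int32)
    step = 2 * delta + 1
    for i in range(h):
        for j in range(w):
            p = 0 if i == 0 or j == 0 else (rec[i - 1, j] + rec[i, j - 1]) // 2
            e = int(img[i, j]) - p
            q[i, j] = (e + delta) // step if e >= 0 else -((-e + delta) // step)
            rec[i, j] = p + q[i, j] * step  # predict from reconstructed pixels
    return q

def decode(q, delta=2):
    h, w = q.shape
    rec = np.zeros((h, w), np.int32)
    step = 2 * delta + 1
    for i in range(h):
        for j in range(w):
            p = 0 if i == 0 or j == 0 else (rec[i - 1, j] + rec[i, j - 1]) // 2
            rec[i, j] = p + q[i, j] * step
    return rec

img = np.random.randint(0, 256, (64, 64))
assert np.abs(decode(encode(img)) - img).max() <= 2   # near-lossless bound
```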

Publication Date: Mon Jan 01 2024
Journal Name: Fifth International Conference On Applied Sciences: ICAS2023
Facial deepfake performance evaluation based on three detection tools: MTCNN, Dlib, and MediaPipe
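
No abstract is shown for this entry, so the sketch below simply runs the three face detectors named in the title on one frame, using each library's public API. The file name and threshold are placeholders, and this is not the paper's evaluation protocol.

```python
import cv2
import dlib
import mediapipe as mp
from mtcnn import MTCNN

img_bgr = cv2.imread("frame.jpg")                 # placeholder path
img_rgb = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2RGB)

# MTCNN: list of dicts, each with 'box' = [x, y, width, height]
mtcnn_boxes = [f["box"] for f in MTCNN().detect_faces(img_rgb)]

# Dlib HOG frontal detector: rectangles with left/top/right/bottom
dlib_rects = dlib.get_frontal_face_detector()(img_rgb, 1)

# MediaPipe: relative bounding boxes normalised to [0, 1]
with mp.solutions.face_detection.FaceDetection(
        min_detection_confidence=0.5) as fd:
    mp_dets = fd.process(img_rgb).detections or []

print(len(mtcnn_boxes), len(dlib_rects), len(mp_dets))
```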
