A Novel Technique for Secure Data Cryptosystem Based on Chaotic Key Image Generation

The advancements in Information and Communication Technology (ICT) over the previous decades have significantly changed the way people transmit or store their information over the Internet and networks. One of the main challenges, therefore, is to keep this information safe against attacks. Many researchers and institutions have realized the importance and benefits of cryptography in achieving efficient and effective secure communication. This work adopts a novel technique for a secure data cryptosystem based on chaos theory. The proposed algorithm generates a two-dimensional key matrix with the same dimensions as the original image, filled with random numbers obtained from the one-dimensional logistic chaotic map for given control parameters; the fractional parts of these numbers are then converted, through a function, into a set of non-repeating numbers, which yields a vast number of unpredictable possibilities (the factorial of rows times columns). Double layers of row and column permutations are applied to the values for a specified number of stages. Then, XOR is performed between the key matrix and the original image, which provides an effective solution for encrypting any type of file (text, image, audio, video, etc.). The results proved that the proposed encryption technique is very promising when tested on more than 500 image samples according to security measurements: the histograms of cipher images are very flat compared with those of the original images, the average Mean Square Error is very high (10115.4), the average Peak Signal-to-Noise Ratio is very low (8.17), the correlation is near zero, and the entropy is close to 8 (7.9975).
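As a rough illustration of the pipeline described above (chaotic sequence → non-repeating numbers → XOR layer), the following sketch uses hypothetical control parameters (x0 = 0.3, r = 3.99) and a rank transform as the fractional-part conversion function, since the abstract specifies neither; the row/column permutation stages are omitted:

```python
import numpy as np

def logistic_key(shape, x0=0.3, r=3.99):
    # Iterate the 1-D logistic map x_{n+1} = r * x_n * (1 - x_n) and collect
    # the chaotic sequence for every pixel position of the key matrix.
    n = shape[0] * shape[1]
    seq = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        seq[i] = x
    # Rank the fractional values to obtain a set of non-repeating numbers
    # (a permutation of 0..n-1), then fold it into the 0-255 byte range.
    ranks = np.argsort(seq)
    return (ranks % 256).reshape(shape).astype(np.uint8)

def xor_layer(data, key):
    # XOR between the key matrix and the plaintext; the same call with
    # the same key decrypts, since XOR is its own inverse.
    return np.bitwise_xor(data, key)

img = np.random.randint(0, 256, (8, 8), dtype=np.uint8)
key = logistic_key(img.shape)
cipher = xor_layer(img, key)
assert np.array_equal(xor_layer(cipher, key), img)  # round trip restores the image
```

Because the XOR layer is an involution, decryption reuses the same key matrix, which is why the receiver only needs the control parameters to regenerate it.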

Publication Date
Wed Sep 23 2020
Journal Name
Artificial Intelligence Research
Hybrid approaches to feature subset selection for data classification in high-dimensional feature space

This paper proposes two hybrid feature subset selection approaches based on combining (by union or intersection) supervised and unsupervised filter approaches before using a wrapper, aiming to obtain low-dimensional features with high accuracy, high interpretability, and low time consumption. Experiments with the proposed hybrid approaches have been conducted on seven high-dimensional feature datasets. The classifiers adopted are support vector machine (SVM), linear discriminant analysis (LDA), and K-nearest neighbour (KNN). Experimental results have demonstrated the advantages and usefulness of the proposed methods for feature subset selection in high-dimensional space in terms of the number of selected features and time spent.
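The filter-combination idea can be sketched as follows; this is a minimal illustration with hypothetical filters (variance as the unsupervised criterion, absolute correlation with the label as the supervised one), and the subsequent wrapper search is omitted:

```python
import numpy as np

def variance_filter(X, k):
    # Unsupervised filter: indices of the k features with highest variance.
    return set(np.argsort(X.var(axis=0))[-k:])

def correlation_filter(X, y, k):
    # Supervised filter: indices of the k features most correlated with y.
    scores = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    return set(np.argsort(scores)[-k:])

def hybrid_select(X, y, k, mode="union"):
    # Combine the two filter rankings before handing the reduced subset
    # to a wrapper (the wrapper stage itself is not shown here).
    a, b = variance_filter(X, k), correlation_filter(X, y, k)
    return sorted(a | b) if mode == "union" else sorted(a & b)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 20))
y = (X[:, 3] + 0.1 * rng.normal(size=50) > 0).astype(float)
union_subset = hybrid_select(X, y, 5, mode="union")
inter_subset = hybrid_select(X, y, 5, mode="intersection")
```

The union keeps at most 2k candidate features for the wrapper, while the intersection keeps at most k, trading recall against search time.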
Publication Date
Tue Jan 01 2019
Journal Name
Baghdad Science Journal
Hazard Rate Estimation Using Varying Kernel Function for Censored Data Type I

In this research, several estimators are introduced for estimating the hazard function using a nonparametric method, namely the kernel function, for type I censored data with varying bandwidth and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four types of boundary kernel are used, namely Rectangle, Epanechnikov, Biquadratic, and Triquadratic, and the proposed function was employed with all kernel functions. Two different simulation techniques are also used in two experiments to compare these estimators. In most of the cases, the results have proved that the local bandwidth is the best for all types of boundary kernel functions.
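A standard kernel hazard estimator can illustrate the setup; this sketch uses a smoothed Nelson-Aalen form with the Epanechnikov kernel (one of the four named above) and a single global bandwidth, since the abstract does not give the proposed function itself:

```python
import numpy as np

def epanechnikov(u):
    # Epanechnikov kernel, one of the four boundary kernels named above.
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u ** 2), 0.0)

def kernel_hazard(t, times, censored, b):
    # Kernel-smoothed Nelson-Aalen hazard estimate for right-censored data:
    # h(t) = sum_i K((t - T_(i)) / b) / b * dN_i / Y(T_(i)),
    # where Y(T_(i)) is the number of subjects still at risk at T_(i).
    order = np.argsort(times)
    times, censored = times[order], censored[order]
    n = len(times)
    at_risk = n - np.arange(n)
    events = (~censored).astype(float)  # dN_i = 1 only for observed events
    return float(np.sum(epanechnikov((t - times) / b) / b * events / at_risk))

rng = np.random.default_rng(1)
times = rng.exponential(1.0, 200)   # true hazard of Exp(1) is constant at 1.0
censored = rng.random(200) > 0.8    # roughly 20% randomly censored (illustrative)
h = kernel_hazard(1.0, times, censored, b=0.5)
```

Making the bandwidth `b` depend on the evaluation point `t` (a local bandwidth) is the variation the abstract reports as performing best.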
Publication Date
Thu Nov 30 2023
Journal Name
Iraqi Geological Journal
Inverting Gravity Data to Density and Velocity Models for Selected Area in Southwestern Iraq

The gravity method measures relatively small variations in the Earth's gravitational field caused by lateral variations in rock density. In the current research, a new technique is applied to the Bouguer map of the gravity surveys conducted during 1940–1950 of the last century, for selected areas in the southwestern desert of Iraqi territory within the administrative boundaries of the Najaf and Anbar provinces. According to the theory of gravity inversion, gravity values can be related to density-contrast variations with depth; thus, gravity data inversion can be utilized to calculate density and velocity models at four selected depth slices: 9.63 km, 1.1 km, 0.682 km, and 0.407 km.
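Two common empirical tools behind gravity-to-density-to-velocity workflows, not necessarily the paper's exact inversion, can be sketched with hypothetical input values:

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def slab_density_contrast(delta_g_mgal, thickness_m):
    # Bouguer slab approximation: delta_g = 2 * pi * G * drho * h,
    # so drho = delta_g / (2 * pi * G * h); 1 mGal = 1e-5 m/s^2.
    return (delta_g_mgal * 1e-5) / (2 * math.pi * G * thickness_m)

def gardner_velocity(density_g_cc):
    # Gardner's empirical relation rho = 0.31 * V**0.25 (rho in g/cc, V in m/s),
    # inverted to V = (rho / 0.31)**4.
    return (density_g_cc / 0.31) ** 4

drho = slab_density_contrast(10.0, 1000.0)  # hypothetical 10 mGal anomaly, 1 km slab
v = gardner_velocity(2.4)                   # hypothetical density of 2.4 g/cc
```

The slab formula links an observed anomaly to a density contrast at a chosen depth slice, and a density-velocity relation then yields the velocity model.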
Publication Date
Sun Jun 05 2016
Journal Name
Baghdad Science Journal
Developing an Immune Negative Selection Algorithm for Intrusion Detection in NSL-KDD data Set

With the development of communication technologies for mobile devices and electronic communications, the world has moved toward e-government, e-commerce, and e-banking. It has become necessary to protect these activities from intrusion and misuse, so it is important to design powerful and efficient systems for this purpose. In this paper, several variants of the negative selection algorithm have been used: negative selection with real values, negative selection with fixed-radius detectors, and negative selection with variable-sized detectors, for misuse-type network intrusion detection, where the algorithm generates a set of detectors to distinguish the self samples. Practica…
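The core negative selection loop can be sketched as follows; this is a minimal real-valued, fixed-radius illustration on synthetic 2-D points, not the paper's NSL-KDD configuration:

```python
import numpy as np

def train_detectors(self_set, n_detectors, radius, rng):
    # Real-valued negative selection: propose random detectors and keep only
    # those that do NOT match any self sample within the given radius.
    dim = self_set.shape[1]
    detectors = []
    while len(detectors) < n_detectors:
        d = rng.random(dim)
        if np.min(np.linalg.norm(self_set - d, axis=1)) > radius:
            detectors.append(d)
    return np.array(detectors)

def is_intrusion(x, detectors, radius):
    # A sample is flagged as non-self when any detector covers it.
    return bool(np.any(np.linalg.norm(detectors - x, axis=1) <= radius))

rng = np.random.default_rng(2)
self_set = rng.random((100, 2)) * 0.4               # "self" traffic in [0, 0.4]^2
detectors = train_detectors(self_set, 200, 0.15, rng)
flag_anomaly = is_intrusion(np.array([0.9, 0.9]), detectors, 0.15)  # far from self
flag_self = is_intrusion(self_set[0], detectors, 0.15)              # a self sample
```

Because detectors are only accepted when they lie outside the self radius, self samples are never covered, while points far from the self region are likely to be flagged.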
Publication Date
Fri Jan 01 2016
Journal Name
Statistics And Its Interface
Search for risk haplotype segments with GWAS data by use of finite mixture models

Region-based association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, instead of individual variants, with the disease. Such an analysis typically involves a list of unphased multiple-locus genotypes with potentially sparse frequencies in cases and controls. To tackle the problem of the sparse distribution, a two-stage approach was proposed in the literature: in the first stage, haplotypes are computationally inferred from genotypes, followed by a haplotype coclassification; in the second stage, the association analysis is performed on the inferred haplotype groups. If a haplotype is unevenly distributed between the case and control samples, this haplotype is labeled…
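The "unevenly distributed" test in the second stage can be illustrated with a plain Pearson chi-square on a 2x2 contingency table of hypothetical haplotype counts (the paper's finite-mixture machinery is not reproduced here):

```python
def chi_square_2x2(a, b, c, d):
    # Pearson chi-square for a 2x2 table:
    # rows = (carries haplotype, does not), columns = (cases, controls).
    n = a + b + c + d
    expected = [(a + b) * (a + c) / n, (a + b) * (b + d) / n,
                (c + d) * (a + c) / n, (c + d) * (b + d) / n]
    return sum((o - e) ** 2 / e for o, e in zip([a, b, c, d], expected))

# Hypothetical counts: haplotype carried by 60/100 case chromosomes
# but only 30/100 control chromosomes.
stat = chi_square_2x2(60, 30, 40, 70)
uneven = stat > 3.84  # chi-square critical value at alpha = 0.05, 1 df
```

A statistic above the critical value marks the haplotype's distribution as uneven between cases and controls.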
Publication Date
Sat Jan 01 2022
Journal Name
Journal Of Petroleum Science And Engineering
Performance evaluation of analytical methods in linear flow data for hydraulically-fractured gas wells

Publication Date
Sat Feb 03 2018
Journal Name
Chinese Journal Of Physics
A true random number generator based on the photon arrival time registered in a coincidence window between two single-photon counting modules

True random number generators are essential components for communications to be confidentially secured. In this paper, a new method is proposed to generate random sequences of numbers based on the difference of the arrival times of photons detected in a coincidence window between two single-photon counting modules.
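The extraction step can be sketched as follows; the arrival times here are simulated (the hardware would supply the registered timestamps), and deriving one bit from the sign of each arrival-time difference is an illustrative choice, not necessarily the paper's exact encoding:

```python
import numpy as np

def bits_from_arrival_pairs(t1, t2):
    # One bit per coincidence: the sign of the difference between the photon
    # arrival times registered by the two single-photon counting modules.
    return (t1 - t2 > 0).astype(int)

rng = np.random.default_rng(3)
# Simulated (illustrative) arrival times inside the coincidence window.
t1 = rng.exponential(1.0, 10_000)
t2 = rng.exponential(1.0, 10_000)
bits = bits_from_arrival_pairs(t1, t2)
bias = abs(bits.mean() - 0.5)  # close to 0 for a balanced source
```

Since the two modules are statistically identical, the sign of the difference is equally likely to be positive or negative, giving an approximately unbiased bit stream.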

Publication Date
Mon Oct 23 2023
Journal Name
Journal Of Optics
Single mode optical fiber sensor based on surface plasmon resonance for the detection of the oil aging for the electrical transformers

This work presents a novel technique for the detection of oil aging in electrical transformers using a single-mode optical fiber sensor based on surface plasmon resonance (SPR). The aging of insulating oil is a critical issue in the maintenance and performance of electrical transformers, as it can lead to reduced insulation properties, an increased risk of electrical breakdown, and a decreased operational lifespan. Several parameters are calculated in this study to examine the efficiency of this sensor, such as sensitivity (S), signal-to-noise ratio (SNR), resolution (in refractive index units), and figure of merit (FOM): the figure of merit is 11.05, the signal-to-noise ratio is 20.3, the sensitivity is 6.63, and the resolution is 3…
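The common definitions of two of these figures can be written out; the input values below are hypothetical, chosen only so the ratios reproduce the reported S and FOM, since the abstract does not give the raw measurements or the exact definitions used:

```python
def sensitivity(d_lambda_res, d_n):
    # Common SPR definition: shift of the resonance wavelength per unit
    # change of the refractive index, S = d(lambda_res) / d(n).
    return d_lambda_res / d_n

def figure_of_merit(s, fwhm):
    # FOM = sensitivity divided by the full width at half maximum of the dip.
    return s / fwhm

# Hypothetical inputs chosen so the ratios match the reported S and FOM.
s = sensitivity(6.63e-2, 1e-2)
fom = figure_of_merit(s, 0.6)
```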
Publication Date
Mon Feb 13 2023
Journal Name
Journal Of Educational And Psychological Researches
Evaluation of English Language Textbooks for Fifth and Sixth Graders Based on the American Council Criteria for Teaching Foreign Languages (ACTFL)

Abstract

This study aims to identify the extent to which the criteria of the American Council for Teaching Foreign Languages (ACTFL) are included in the English language books for the fifth and sixth graders. To achieve the objective of the study, a content analysis card was prepared, in which the classification of language proficiency was divided into five main levels (beginner, intermediate, advanced, superior, and distinguished) across the four language skills (listening, speaking, reading, and writing). The content analysis card consisted of (89) indicators distributed over the four language skills as follows: listening (17), speaking (33), reading (15), and writing (26). The study sample consisted of Engl…
Publication Date
Sun Feb 25 2024
Journal Name
Baghdad Science Journal
Efficient Task Scheduling Approach in Edge-Cloud Continuum based on Flower Pollination and Improved Shuffled Frog Leaping Algorithm

The rise of edge-cloud continuum computing is a result of the growing significance of edge computing, which has become a complementary or substitute option for traditional cloud services. The convergence of networking and computing presents a notable challenge due to their distinct historical development. Task scheduling is a major challenge in the context of edge-cloud continuum computing. The selection of the execution location of tasks is crucial in meeting the quality-of-service (QoS) requirements of applications. An efficient scheduling strategy for distributing workloads among virtual machines in the edge-cloud continuum data center is mandatory to ensure the fulfilment of QoS requirements for both the customer and the service provider…
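The scheduling problem itself can be made concrete with a greedy earliest-finish baseline on hypothetical task lengths and VM speeds; the paper's flower-pollination / improved shuffled frog leaping hybrid searches the same assignment space with a metaheuristic rather than deciding task by task:

```python
def greedy_schedule(task_lengths, vm_speeds):
    # Greedy baseline: send each task to the VM that would finish it earliest,
    # tracking each VM's accumulated finish time.
    finish = [0.0] * len(vm_speeds)
    assignment = []
    for length in task_lengths:
        best = min(range(len(vm_speeds)),
                   key=lambda v: finish[v] + length / vm_speeds[v])
        finish[best] += length / vm_speeds[best]
        assignment.append(best)
    return assignment, max(finish)  # max(finish) is the makespan

# Hypothetical workload: task lengths in MI, VM speeds in MIPS.
assignment, makespan = greedy_schedule([400, 200, 600, 100], [100.0, 50.0])
```

Minimizing the makespan (and, in the continuum setting, balancing it against edge latency and cost) is the QoS objective the scheduler must optimize.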