Articles
Iris Data Compression Based on Hexa-Data Coding

Iris research focuses on developing techniques for identifying and locating relevant biometric features, with accurate segmentation and efficient computation that lend themselves to compression. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system when used in real time. This paper introduces a novel parameterized technique for iris segmentation. The method proceeds in several steps: converting the grayscale eye image to a bit-plane representation, selecting the most significant bit planes, and then parameterizing the iris location, resulting in an accurate segmentation of the iris from the original image. A lossless Hexadata encoding method is then applied to the data, based on reducing each set of six data items to a single encoded value. The tested results achieved acceptable byte-saving performance for the 21 square iris images of size 256x256 pixels, about 22.4 KB saved on average with an average decompression time of 0.79 sec, and high byte-saving performance for 2 non-square iris images of sizes 640x480 and 2048x1536, which reached 76 KB / 2.2 sec and 1630 KB / 4.71 sec respectively. Finally, the proposed technique outperforms the standard lossless JPEG2000 compression technique, with a reduction of about 1.2 times or more in saved kilobytes, implicitly demonstrating the power and efficiency of the suggested lossless biometric technique.
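The two concrete steps named above, bit-plane decomposition and an encoding that maps every six data items to one value, can be illustrated with a short sketch. This is not the paper's implementation: the image size, the choice of which planes to keep, and the six-bits-to-one-code packing rule are assumptions about what a "Hexadata" step might look like.

```python
# A minimal sketch (not the authors' implementation) of bit-plane decomposition
# of a grayscale eye image plus a "hexadata"-style step that packs each group of
# six binary items into a single encoded value (0..63).
import numpy as np

def bit_planes(gray_img: np.ndarray) -> np.ndarray:
    """Split an 8-bit grayscale image into its 8 bit planes (index 7 = MSB)."""
    return np.stack([(gray_img >> b) & 1 for b in range(8)], axis=0)

def pack_six(bits: np.ndarray) -> np.ndarray:
    """Encode each run of six binary items as one value in 0..63."""
    flat = bits.flatten()
    pad = (-len(flat)) % 6                      # pad so the length is a multiple of 6
    flat = np.concatenate([flat, np.zeros(pad, dtype=flat.dtype)])
    groups = flat.reshape(-1, 6)
    weights = 1 << np.arange(5, -1, -1)         # 32, 16, ..., 1
    return (groups * weights).sum(axis=1).astype(np.uint8)

if __name__ == "__main__":
    img = (np.random.rand(256, 256) * 255).astype(np.uint8)   # stand-in eye image
    planes = bit_planes(img)
    msb_planes = planes[6:]                     # keep the two most significant planes
    encoded = pack_six(msb_planes)
    print(planes.shape, encoded.shape)          # (8, 256, 256) (21846,)
```
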

Publication Date
Fri Apr 14 2023
Journal Name
Journal Of Big Data
A survey on deep learning tools dealing with data scarcity: definitions, challenges, solutions, tips, and applications
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance. Unfortunately, many applications have small or inadequate data to train DL frameworks. Usually, manual labeling is needed to provide labeled data, which typically involves human annotators with a vast background of knowledge. This annotation process is costly, time-consuming, and error-prone. Usually, every DL framework is fed a significant amount of labeled data to automatically learn representations. Ultimately, a larger amount of data generates a better DL model, and its performance is also application dependent. This issue is the main barrier for …
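One remedy that surveys of data scarcity commonly discuss is data augmentation: synthesizing additional labeled examples by transforming the ones already available. The torchvision pipeline below is only an illustration of that idea and is not taken from this survey.

```python
# A minimal augmentation pipeline: each epoch sees a randomly transformed view
# of every image, which often reduces overfitting when labeled data is scarce.
from torchvision import transforms

augment = transforms.Compose([
    transforms.RandomResizedCrop(224, scale=(0.8, 1.0)),  # random crop and rescale
    transforms.RandomHorizontalFlip(p=0.5),               # mirror half the images
    transforms.ColorJitter(brightness=0.2, contrast=0.2), # mild photometric noise
    transforms.ToTensor(),
])
# Wrap a small labeled dataset with `augment` as its transform to obtain a
# different view of each image on every pass through the data.
```
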
Publication Date
Wed Mar 10 2021
Journal Name
Baghdad Science Journal
A Comparison Between the Theoretical Cross Section Based on the Partial Level Density Formulae Calculated by the Exciton Model with the Experimental Data for the ¹⁹⁷₇₉Au Nucleus

In this paper, the theoretical cross section for a pre-equilibrium nuclear reaction on the ¹⁹⁷₇₉Au nucleus has been studied at an energy of 22.4 MeV. Ericson's formula for the partial level density (PLD) and its corrections (Williams' correction and the spin correction) have been substituted into the theoretical cross section and compared with the experimental data for the ¹⁹⁷₇₉Au nucleus. It has been found that the theoretical cross section with the one-component PLD from Ericson's formula does not agree with the experimental values for the parameter choices considered, and there is little agreement with the experimental cross section only at the high end of the energy range. The theoretical cross section that depends on the one-component Williams formula and the one-component formula corrected for spin …

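For context, the one-component Ericson partial level density that the abstract refers to, and Williams' Pauli-corrected variant, are commonly written as below. The notation (g for the single-particle level density, p particles, h holes, n = p + h) follows standard textbook usage and is not taken from this paper.

```latex
% One-component Ericson partial level density (standard form):
\[
  \omega(p,h,E) \;=\; \frac{g\,(gE)^{\,n-1}}{p!\,h!\,(n-1)!},
  \qquad n = p + h .
\]
% Williams' correction subtracts a Pauli-blocking term from gE
% (a commonly cited form of the correction):
\[
  \omega_{W}(p,h,E) \;=\; \frac{g\,\bigl(gE - A_{p,h}\bigr)^{\,n-1}}{p!\,h!\,(n-1)!},
  \qquad A_{p,h} = \frac{p^{2} + h^{2} + p - 3h}{4} .
\]
```
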
Publication Date
Sat Jun 01 2024
Journal Name
International Journal Of Advanced And Applied Sciences
High-accuracy models for iris recognition with merging features

Due to advancements in computer science and technology, impersonation has become more common. Today, biometric technology is widely used in various aspects of people's lives. Iris recognition, known for its high accuracy and speed, is a significant and challenging field of study. As a result, iris recognition technology and biometric systems are used for security in numerous applications, including human-computer interaction and surveillance systems. It is crucial to develop advanced models to combat impersonation crimes. This study proposes sophisticated artificial intelligence models with high accuracy and speed to eliminate such crimes. The models use linear discriminant analysis (LDA) for feature extraction and mutual information …

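A minimal sketch of the two ingredients the abstract names, mutual-information feature scoring and LDA-based feature extraction, is shown below. The dataset, the number of selected features, and the final SVM classifier are assumptions for illustration, not the study's models.

```python
# Mutual-information feature selection followed by LDA projection and a
# classifier; the digits dataset stands in for iris feature vectors.
from sklearn.datasets import load_digits
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)           # stand-in for iris feature vectors
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = make_pipeline(
    SelectKBest(mutual_info_classif, k=32),      # keep features with highest mutual information
    LinearDiscriminantAnalysis(n_components=9),  # project onto class-discriminant axes
    SVC(kernel="rbf"),                           # final classifier (an assumption)
)
model.fit(X_tr, y_tr)
print("accuracy:", model.score(X_te, y_te))
```
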
Publication Date
Wed Dec 01 2021
Journal Name
Baghdad Science Journal
Advanced Intelligent Data Hiding Using Video Stego and Convolutional Neural Networks

Steganography is a technique for concealing secret data within other quotidian files of the same or a different type. Hiding data has become essential to digital information security. This work aims to design a stego method that can effectively hide a message inside the frames of a video file. A video steganography model is proposed by training a model to hide a video (or images) within another video using convolutional neural networks (CNNs). Using a CNN in this approach can achieve two main goals of any steganographic method. The first is increased security (difficulty of being observed and broken by a steganalysis program), which was achieved in this work because the weights and the architecture are randomized. Thus, …

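The hide-and-reveal arrangement described above can be sketched as a pair of small convolutional networks. The architecture below is an assumed illustration; the layer sizes, activations, and the loss mentioned in the comments are not taken from the paper.

```python
# A "hide" net maps a cover frame plus a secret frame to a container frame,
# and a "reveal" net recovers the secret from the container.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    return nn.Sequential(nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU())

class HideNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(conv_block(6, 32), conv_block(32, 32),
                                 nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid())
    def forward(self, cover, secret):
        return self.net(torch.cat([cover, secret], dim=1))   # container frame

class RevealNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(conv_block(3, 32), conv_block(32, 32),
                                 nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid())
    def forward(self, container):
        return self.net(container)                            # recovered secret

hide, reveal = HideNet(), RevealNet()
cover = torch.rand(1, 3, 64, 64)      # one RGB video frame (stand-in data)
secret = torch.rand(1, 3, 64, 64)
container = hide(cover, secret)
recovered = reveal(container)
# Training (not shown) would minimise a weighted sum of
# ||container - cover||^2 and ||recovered - secret||^2.
print(container.shape, recovered.shape)
```
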
Publication Date
Fri Jun 01 2018
Journal Name
International Journal Of Computer Science Trends And Technology
Secure Video Data Deduplication in the Cloud Storage Using Compressive Sensing

Cloud storage provides scalable, low-cost resources featuring economies of scale based on a cross-user architecture. As the amount of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. The most important cloud service is data storage. To protect the privacy of the data owner, data are stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which becomes crucial for data storage. Traditional deduplication schemes cannot work on encrypted data, and existing solutions for encrypted data deduplication suffer from security weaknesses. This paper proposes combining compressive sensing and video deduplication to maximize …

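Only the compressive-sensing measurement step is sketched below (y = Φx with a random Gaussian measurement matrix); how the paper couples this with video deduplication is not given in the snippet, so the block sizes and matrix choice are assumptions.

```python
# Compressive sensing in one line: y = Phi @ x, where x is sparse and
# Phi has far fewer rows than columns, so y is a compressed representation.
import numpy as np

rng = np.random.default_rng(0)
n = 4096                      # length of one flattened video block (assumed)
m = 1024                      # number of compressive measurements (m << n)

x = np.zeros(n)
x[rng.choice(n, size=50, replace=False)] = rng.standard_normal(50)   # sparse signal

Phi = rng.standard_normal((m, n)) / np.sqrt(m)    # Gaussian measurement matrix
y = Phi @ x                                       # compressed block sent to storage
print(y.shape)                                    # (1024,) -- 4x smaller than x
# Recovering x from y requires a sparse solver (basis pursuit, OMP, ...),
# which is outside the scope of this sketch.
```
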
Publication Date
Sun Feb 10 2019
Journal Name
Journal Of The College Of Education For Women
IMPLEMENTATION OF THE SKIP LIST DATA STRUCTURE WITH ITS UPDATE OPERATIONS

A skip list data structure is essentially a probabilistic simulation of a binary search tree. Skip list algorithms are simpler, faster, and use less space. Conceptually, this data structure uses parallel sorted linked lists. Searching in a skip list is more involved than searching in a regular sorted linked list: because a skip list is a two-dimensional data structure, it is implemented using a two-dimensional network of nodes with four pointers. The search, insert, and delete operations take expected time of up to O(log n). The skip list can be modified to implement the order-statistic operations RANK and SEARCH BY RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.

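A compact sketch of the structure described above, with search and insert (delete follows the same pointer-splicing pattern), is given below. It is an illustration rather than the paper's implementation; the expected O(log n) behaviour comes from the random level assignment.

```python
# A minimal skip list: each node carries one forward pointer per level, and
# searches drop down from the highest level to level 0.
import random

MAX_LEVEL = 16
P = 0.5

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)   # one pointer per level

class SkipList:
    def __init__(self):
        self.head = Node(None, MAX_LEVEL)
        self.level = 0

    def _random_level(self):
        lvl = 0
        while random.random() < P and lvl < MAX_LEVEL:
            lvl += 1
        return lvl

    def search(self, key):
        node = self.head
        for i in range(self.level, -1, -1):            # descend level by level
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node is not None and node.key == key

    def insert(self, key):
        update = [self.head] * (MAX_LEVEL + 1)
        node = self.head
        for i in range(self.level, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node                           # last node before key on level i
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = Node(key, lvl)
        for i in range(lvl + 1):                       # splice the new node in
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new

sl = SkipList()
for k in (3, 7, 1, 9, 5):
    sl.insert(k)
print(sl.search(7), sl.search(4))   # True False
```
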
Publication Date
Sat Aug 01 2015
Journal Name
2015 IEEE Conference On Computational Intelligence In Bioinformatics And Computational Biology (CIBCB)
Granular computing approach for the design of medical data classification systems

Publication Date
Tue Mar 01 2022
Journal Name
Asian Journal Of Applied Sciences
Comparison between Expert Systems, Machine Learning, and Big Data: An Overview

Today, artificial intelligence has become one of the most important sciences for creating intelligent computer programs that simulate the human mind. The goal of artificial intelligence in the medical field is to assist doctors and health-care workers in diagnosing diseases and delivering clinical treatment, reducing the rate of medical error and saving lives. The main and most widely used technologies are expert systems, machine learning, and big data. This article provides a brief overview of these three techniques to make them and their importance easier for readers to understand.

Publication Date
Tue Mar 01 2022
Journal Name
International Journal Of Nonlinear Analysis And Applications
The suggested threshold to reduce data noise for a factorial experiment

In this research, a 4×4 factorial experiment applied in a completely randomized block design was studied, with a given number of observations. The design of experiments is used to study the effect of treatments on experimental units and thereby obtain data representing the observations of the experiment. Applying these treatments under different environmental and experimental conditions introduces noise that affects the observed values and thus increases the mean square error of the experiment. To reduce this noise, multiple wavelet shrinkage was used as a filter for the observations by suggesting an improved threshold that takes into account the different transformation levels based on the logarithm of the …

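The general wavelet-shrinkage idea the abstract builds on can be sketched as follows. The universal (VisuShrink) threshold used here is only a baseline for illustration, not the improved threshold the paper proposes, and the synthetic signal is a stand-in for the experiment's observations.

```python
# Wavelet shrinkage: decompose the noisy observations, soft-threshold the
# detail coefficients, and reconstruct.
import numpy as np
import pywt

rng = np.random.default_rng(1)
n = 128
clean = np.sin(np.linspace(0, 4 * np.pi, n))          # stand-in response curve
noisy = clean + 0.3 * rng.standard_normal(n)          # observations with added noise

coeffs = pywt.wavedec(noisy, "db4", level=3)          # multilevel DWT
sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate from finest details
thr = sigma * np.sqrt(2 * np.log(n))                  # universal (VisuShrink) threshold
coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")[:n]

print("MSE before:", np.mean((noisy - clean) ** 2),
      "after:", np.mean((denoised - clean) ** 2))
```
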
Publication Date
Tue Nov 01 2016
Journal Name
IOSR Journal Of Computer Engineering
Implementation of new Secure Mechanism for Data Deduplication in Hybrid Cloud

Cloud computing provides a huge amount of space for data storage, but with the increase in the number of users and the size of their data, the cloud storage environment faces serious problems such as saving storage space, managing this large volume of data, and ensuring the security and privacy of the data. One of the important methods for saving space in cloud storage is data deduplication, a compression technique that allows only one copy of the data to be saved and eliminates the extra copies. To offer security and privacy for the sensitive data while supporting deduplication, this work identifies attacks that exploit hybrid cloud deduplication and allow an attacker to gain access to the files of other users based on very small hash signatures of …

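The core of any deduplication scheme, fingerprinting content with a cryptographic hash and storing each fingerprint only once, can be sketched as follows. The chunk size and store layout are assumptions, and the sketch also hints at the weakness the abstract alludes to: with hash-only checks, anyone presenting a known fingerprint could claim the chunk.

```python
# Content-hash deduplication: each fixed-size chunk is fingerprinted with
# SHA-256 and stored only once; files are kept as lists of fingerprints.
import hashlib

CHUNK_SIZE = 4096
store: dict[str, bytes] = {}        # fingerprint -> chunk (stand-in for cloud storage)

def dedup_upload(data: bytes) -> list[str]:
    """Split data into chunks, store only unseen chunks, return the fingerprint list."""
    recipe = []
    for off in range(0, len(data), CHUNK_SIZE):
        chunk = data[off:off + CHUNK_SIZE]
        fp = hashlib.sha256(chunk).hexdigest()
        if fp not in store:          # duplicate chunks are never stored twice
            store[fp] = chunk
        recipe.append(fp)
    return recipe

def restore(recipe: list[str]) -> bytes:
    return b"".join(store[fp] for fp in recipe)

file_a = b"A" * 10000
file_b = b"A" * 10000 + b"tail"      # mostly identical to file_a
r_a, r_b = dedup_upload(file_a), dedup_upload(file_b)
print(len(store), restore(file_b) == file_b)   # 3 unique chunks for ~20 KB uploaded; True
```
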