Iris Data Compression Based on Hexa-Data Coding

Iris research is focused on developing techniques for identifying and locating relevant biometric features, accurate segmentation, and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system when used as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method consists of several steps: converting the grayscale eye image to a bit-plane representation, selecting the most significant bit planes, and then parameterizing the iris location, resulting in an accurate segmentation of the iris from the original image. A lossless Hexadata encoding method is then applied to the data, based on reducing each set of six data items to a single encoded value. The tested results achieved acceptable byte-saving performance for the 21 square iris images of size 256x256 pixels, about 22.4 KB on average with an average decompression time of 0.79 sec, and high byte-saving performance for the 2 non-square iris images of sizes 640x480 and 2048x1536, which reached 76 KB/2.2 sec and 1630 KB/4.71 sec respectively. Finally, the proposed technique was compared with the standard lossless JPEG2000 compression technique, achieving about 1.2 KB and more in additional byte savings, which implicitly demonstrates the power and efficiency of the suggested lossless biometric technique.
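Two steps of the described method can be illustrated in code: keeping the most significant bit planes of the grayscale eye image, and a lossless Hexadata step that maps each group of six data items to one encoded value. The sketch below is only an illustration under assumptions (which planes are kept, and byte-wise packing of six 8-bit samples per code); it is not the paper's exact implementation.

```python
# Illustrative sketch only: the paper's exact bit-plane selection and Hexadata
# grouping rules are not given here, so the packing below (six 8-bit samples
# per encoded value) is an assumption for demonstration.
import numpy as np

def significant_bit_planes(gray: np.ndarray, planes=(7, 6, 5)) -> np.ndarray:
    """Keep only the chosen most-significant bit planes of an 8-bit image."""
    mask = sum(1 << p for p in planes)
    return gray & mask

def hexa_encode(data: np.ndarray) -> list[int]:
    """Pack each run of six bytes into one integer (lossless, reversible)."""
    flat = data.flatten()
    pad = (-len(flat)) % 6
    flat = np.concatenate([flat, np.zeros(pad, dtype=flat.dtype)])
    groups = flat.reshape(-1, 6).astype(np.uint64)
    weights = np.array([256**5, 256**4, 256**3, 256**2, 256, 1], dtype=np.uint64)
    return (groups * weights).sum(axis=1).tolist()

def hexa_decode(codes: list[int], length: int) -> np.ndarray:
    """Invert hexa_encode: unpack each code back into six bytes."""
    out = []
    for c in codes:
        out.extend((c >> (8 * (5 - i))) & 0xFF for i in range(6))
    return np.array(out[:length], dtype=np.uint8)

if __name__ == "__main__":
    img = np.random.randint(0, 256, (256, 256), dtype=np.uint8)  # stand-in eye image
    planes = significant_bit_planes(img)
    codes = hexa_encode(planes)
    restored = hexa_decode(codes, planes.size).reshape(planes.shape)
    assert np.array_equal(planes, restored)  # round trip is lossless
```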

Crossref
View Publication
Publication Date
Wed Jul 06 2022
Journal Name
Journal Of Al-qadisiyah For Computer Science And Mathematics
Image Compression using Polynomial Coding Techniques: A review

Publication Date
Fri Jul 01 2016
Journal Name
International Journal Of Computer Science And Mobile Computing
Interpolative Absolute Block Truncation Coding for Image Compression

Publication Date
Wed Feb 01 2023
Journal Name
International Journal Of Engineering, Ije Transactions B: Applications
Adaptive Polynomial Coding of Multi-Base Hybrid Compression

Scopus (4)
Scopus
Publication Date
Mon Aug 01 2022
Journal Name
Baghdad Science Journal
A Novel Technique for Secure Data Cryptosystem Based on Chaotic Key Image Generation

The advancements in Information and Communication Technology (ICT) within the previous decades have significantly changed the way people transmit or store their information over the Internet or networks. So, one of the main challenges is to keep this information safe against attacks. Many researchers and institutions have realized the importance and benefits of cryptography in achieving the efficiency and effectiveness of various aspects of secure communication. This work adopts a novel technique for a secure data cryptosystem based on chaos theory. The proposed algorithm generates a 2-dimensional key matrix having the same dimensions as the original image, filled with random numbers obtained from the 1-dimensional logistic chaotic map for given con…
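A minimal sketch of the idea stated in the abstract: iterating the 1-dimensional logistic map to fill a key matrix with the same shape as the image, then combining it with the image. The seed, control parameter, byte quantization, and the XOR combination used here are assumptions, not the paper's exact algorithm.

```python
# Minimal sketch, not the paper's exact construction: the logistic map
# x_{n+1} = r * x_n * (1 - x_n) is iterated to fill a key matrix with the same
# shape as the image, then mixed with the image by XOR. The seed x0, control
# parameter r, and the quantization step are assumptions.
import numpy as np

def logistic_key(shape: tuple, x0: float = 0.4143, r: float = 3.9999) -> np.ndarray:
    """Generate a chaotic key matrix of 8-bit values from the logistic map."""
    n = shape[0] * shape[1]
    x = x0
    keys = np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1.0 - x)           # logistic map iteration
        keys[i] = int(x * 256) % 256    # quantize the chaotic value to a byte
    return keys.reshape(shape)

def encrypt(image: np.ndarray, x0: float, r: float) -> np.ndarray:
    """XOR the image with the chaotic key matrix (decryption is identical)."""
    return image ^ logistic_key(image.shape, x0, r)

if __name__ == "__main__":
    img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)   # stand-in image
    cipher = encrypt(img, 0.4143, 3.9999)
    plain = encrypt(cipher, 0.4143, 3.9999)                     # same key inverts it
    assert np.array_equal(img, plain)
```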

View Publication Preview PDF
Scopus (8)
Crossref (1)
Scopus Clarivate Crossref
Publication Date
Thu Jun 01 2023
Journal Name
Bulletin Of Electrical Engineering And Informatics
A missing data imputation method based on salp swarm algorithm for diabetes disease

Most medical datasets suffer from missing data, due to the expense of some tests or human error while recording them. This issue affects the performance of machine learning models because the values of some features will be missing. Therefore, specific methods are needed for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used for generating and imputing the missing values in the Pima Indian diabetes disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, which are support vector machine (SVM), K-nearest neighbour (KNN), and Naïve B…
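A minimal sketch of SSA-driven imputation under assumptions: scikit-learn's KNN for the fitness function, per-feature bounds from the observed values, and a textbook salp update rule. The paper's ISSA details are not reproduced here.

```python
# Minimal sketch, not the paper's ISSA: missing entries are treated as decision
# variables and a standard salp swarm algorithm searches for values that
# maximize KNN cross-validation accuracy. Fitness choice, bounds, and the
# scikit-learn usage are assumptions.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def ssa_impute(X, y, iters=50, pop=20, seed=0):
    rng = np.random.default_rng(seed)
    miss = np.isnan(X)
    lb, ub = np.nanmin(X, axis=0), np.nanmax(X, axis=0)   # per-feature bounds
    cols = np.where(miss)[1]                               # feature index of each gap
    dim = miss.sum()

    def fitness(cand):
        Xf = X.copy()
        Xf[miss] = cand
        return cross_val_score(KNeighborsClassifier(5), Xf, y, cv=3).mean()

    # initialise the salp population uniformly inside the feature bounds
    salps = rng.uniform(lb[cols], ub[cols], size=(pop, dim))
    scores = np.array([fitness(s) for s in salps])
    best, best_score = salps[scores.argmax()].copy(), scores.max()

    for t in range(iters):
        c1 = 2 * np.exp(-((4 * (t + 1) / iters) ** 2))     # exploration weight
        for i in range(pop):
            if i == 0:  # leader moves around the best food source found so far
                c2, c3 = rng.uniform(size=dim), rng.uniform(size=dim)
                step = c1 * ((ub[cols] - lb[cols]) * c2 + lb[cols])
                salps[i] = np.where(c3 < 0.5, best + step, best - step)
            else:       # followers move toward the salp ahead of them
                salps[i] = 0.5 * (salps[i] + salps[i - 1])
            salps[i] = np.clip(salps[i], lb[cols], ub[cols])
        scores = np.array([fitness(s) for s in salps])
        if scores.max() > best_score:
            best, best_score = salps[scores.argmax()].copy(), scores.max()

    Ximp = X.copy()
    Ximp[miss] = best            # fill the gaps with the best candidate found
    return Ximp
```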

View Publication
Scopus (5)
Crossref (1)
Scopus Crossref
Publication Date
Fri Mar 20 2020
Journal Name
Remote Sensing
Lossy and Lossless Video Frame Compression: A Novel Approach for High-Temporal Video Data Analytics

The smart city concept has attracted high research attention in recent years within diverse application domains, such as crime suspect identification, border security, transportation, aerospace, and so on. A specific focus has been on increased automation using data-driven approaches, while leveraging remote sensing and real-time streaming of heterogeneous data from various resources, including unmanned aerial vehicles, surveillance cameras, and low-earth-orbit satellites. One of the core challenges in exploiting such high-temporal data streams, specifically videos, is the trade-off between the quality of video streaming and the limited transmission bandwidth. An optimal compromise is needed between video quality and, subsequently, rec…

View Publication
Scopus (2)
Crossref (2)
Scopus Clarivate Crossref
Publication Date
Sat Dec 02 2017
Journal Name
Al-khwarizmi Engineering Journal
Speech Signal Compression Using Wavelet And Linear Predictive Coding

A new algorithm is proposed to compress speech signals using the wavelet transform and linear predictive coding. The compression is based on selecting a small number of approximation coefficients produced by wavelet decomposition (Haar and db4) at a suitably chosen level while ignoring the detail coefficients; the approximation coefficients are then windowed by a rectangular window and fed to the linear predictor. The Levinson-Durbin algorithm is used to compute the LP coefficients, reflection coefficients, and predictor error. The compressed files contain the LP coefficients and the previous sample. These files are very small compared to the size of the original signals. The compression ratio is calculated from the size of th…
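A minimal sketch of the described pipeline, assuming PyWavelets for the decomposition and a textbook Levinson-Durbin recursion; the decomposition level, LP order, and the exact contents of the compressed file are assumptions rather than the paper's settings.

```python
# Minimal sketch, assuming PyWavelets (pywt) and NumPy: keep only the wavelet
# approximation coefficients, then model them with LP coefficients obtained via
# Levinson-Durbin. A rectangular window leaves the coefficients unchanged, so
# no explicit windowing step is applied here.
import numpy as np
import pywt

def levinson_durbin(r, order):
    """Solve the Yule-Walker equations; return LP coeffs, reflection coeffs, error."""
    a = np.zeros(order + 1)
    a[0] = 1.0
    k = np.zeros(order)                                  # reflection coefficients
    err = r[0]
    for m in range(1, order + 1):
        acc = r[m] + np.dot(a[1:m], r[m - 1:0:-1])
        k[m - 1] = -acc / err
        a[1:m + 1] += k[m - 1] * a[m - 1::-1][:m]        # update a_1..a_m
        err *= (1.0 - k[m - 1] ** 2)
    return a, k, err

def compress(signal, wavelet="db4", level=3, order=10):
    """Keep only the approximation band and describe it by its LP model."""
    approx = pywt.wavedec(signal, wavelet, level=level)[0]           # drop details
    r = np.correlate(approx, approx, mode="full")[len(approx) - 1:]  # autocorrelation
    a, k, err = levinson_durbin(r, order)
    return a, approx[0]          # LP coefficients plus the first ("previous") sample

if __name__ == "__main__":
    x = np.sin(2 * np.pi * 220 * np.arange(8000) / 8000)  # stand-in "speech" tone
    lp_coeffs, first_sample = compress(x)
    print(len(lp_coeffs), "LP coefficients summarise", len(x), "samples")
```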

View Publication Preview PDF
Publication Date
Fri Mar 31 2017
Journal Name
Al-khwarizmi Engineering Journal
Big-data Management using Map Reduce on Cloud: Case study, EEG Images' Data

A database is an organized collection of data, arranged and distributed in a way that allows the user to access the stored information easily and conveniently. However, in the era of big data, traditional data analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
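A minimal Hadoop Streaming sketch of the MapReduce idea, assuming a hypothetical "channel<TAB>sample" record format for the EEG data; the paper's actual job configuration and data layout are not shown here.

```python
# Minimal Hadoop Streaming sketch (mapper and reducer in one file), assuming a
# hypothetical record format of "channel_id<TAB>sample_value" per line. The
# reducer computes a per-channel mean as a stand-in analytics step.
import sys

def mapper():
    """Emit (channel, value) pairs so samples of one channel meet at one reducer."""
    for line in sys.stdin:
        if not line.strip():
            continue
        channel, value = line.strip().split("\t")
        print(f"{channel}\t{value}")

def reducer():
    """Average the streamed samples per channel (input arrives sorted by key)."""
    current, total, count = None, 0.0, 0
    for line in sys.stdin:
        if not line.strip():
            continue
        channel, value = line.strip().split("\t")
        if channel != current:
            if current is not None:
                print(f"{current}\t{total / count}")
            current, total, count = channel, 0.0, 0
        total += float(value)
        count += 1
    if current is not None:
        print(f"{current}\t{total / count}")

if __name__ == "__main__":
    # e.g. (hypothetical) hadoop jar hadoop-streaming.jar -files eeg_mr.py \
    #      -mapper "python3 eeg_mr.py map" -reducer "python3 eeg_mr.py reduce" \
    #      -input eeg/ -output eeg_out/
    mapper() if sys.argv[1] == "map" else reducer()
```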

View Publication Preview PDF
Crossref
Publication Date
Tue Jan 03 2023
Journal Name
College Of Islamic Sciences
Ruling on Selling Big Data (An Authentic Fiqh Study)

Abstract:

Research topic: the ruling on the sale of big data.

Objectives: to explain what big data is, its importance, its sources, and its ruling.

Methodology: inductive, comparative, and critical.

One of the most important results: big data is valuable property that may not be infringed upon, and it is permissible to sell big data as long as it does not contain users' data whose owners have not consented to its sale.

Recommendation: follow-up studies dealing with the provisions of this issue.

Subject Terms

Ruling, Sale, Big Data, Sayings of the Jurists

 

View Publication Preview PDF
Publication Date
Tue Aug 01 2023
Journal Name
Baghdad Science Journal
Digital Data Encryption Using a Proposed W-Method Based on AES and DES Algorithms

This paper proposes a new encryption method. It combines two cipher algorithms, i.e., DES and AES, to generate hybrid keys. This combination strengthens the proposed W-method by generating highly randomized keys. Two points represent the reliability of any encryption technique. The first is key generation: our approach merges 64 bits of DES with 64 bits of AES to produce 128 bits as a root key for the remaining 15 keys. This complexity increases the level of the ciphering process; moreover, the operation is shifted by only one bit to the right. The second is the nature of the encryption process: it uses two keys and mixes one round of DES with one round of AES to reduce the processing time. The W-method deals with…
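A minimal sketch of the key-schedule idea stated in the abstract, under assumptions: a 64-bit DES-derived half and a 64-bit AES-derived half are concatenated into a 128-bit root key, and the remaining 15 keys are obtained by one-bit right rotations. The exact W-method schedule is not reproduced here.

```python
# Illustrative sketch only: the W-method's exact key schedule is not given here.
# The halves below are placeholder values standing in for DES- and AES-derived
# key material; the one-bit right rotation is an assumed derivation rule.
def merge_root_key(des_half: int, aes_half: int) -> int:
    """Concatenate two 64-bit values into one 128-bit root key."""
    return ((des_half & (2**64 - 1)) << 64) | (aes_half & (2**64 - 1))

def rotate_right_128(key: int, bits: int = 1) -> int:
    """Rotate a 128-bit integer to the right by the given number of bits."""
    mask = 2**128 - 1
    return ((key >> bits) | (key << (128 - bits))) & mask

def key_schedule(des_half: int, aes_half: int, rounds: int = 15) -> list:
    """Root key plus 15 round keys, each one bit-rotation away from the last."""
    keys = [merge_root_key(des_half, aes_half)]
    for _ in range(rounds):
        keys.append(rotate_right_128(keys[-1]))
    return keys

if __name__ == "__main__":
    ks = key_schedule(0x0123456789ABCDEF, 0xFEDCBA9876543210)
    print(f"{len(ks)} keys, root = {ks[0]:032x}")
```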

View Publication Preview PDF
Scopus (6)
Crossref (2)
Scopus Crossref