Articles

Iris Data Compression Based on Hexa-Data Coding

Iris research is focused on developing techniques for identifying and locating relevant biometric features, with accurate segmentation and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics which, in turn, reduces the effectiveness of the system when used as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method is based on a number of steps, starting from converting the grayscale eye image to a bit-plane representation, selecting the most significant bit planes, and then parameterizing the iris location, resulting in an accurate segmentation of the iris from the original image. A lossless Hexadata encoding method is then applied to the data, which is based on reducing each set of six data items to a single encoded value. The tested results achieved acceptable byte-saving performance for the 21 square iris images of size 256x256 pixels, at about 22.4 KB on average with an average decompression time of 0.79 sec, and high byte-saving performance for 2 non-square iris images of sizes 640x480 and 2048x1536, which reached 76 KB/2.2 sec and 1630 KB/4.71 sec, respectively. Finally, the proposed techniques outperform the standard lossless JPEG2000 compression technique, with about 1.2 and more in KB saving, implicitly demonstrating the power and efficiency of the suggested lossless biometric techniques.
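As an illustration of the bit-plane and Hexadata steps described above, the following sketch splits a grayscale image into bit planes, keeps the most significant ones, and packs each group of six binary values into a single symbol. The six-bits-per-symbol grouping and the NumPy helpers are assumptions made for illustration, not the paper's exact encoder.

```python
import numpy as np

def bit_planes(gray_img: np.ndarray) -> list[np.ndarray]:
    """Split an 8-bit grayscale image into its 8 binary bit planes (LSB first)."""
    return [(gray_img >> b) & 1 for b in range(8)]

def pack_six(bits: np.ndarray) -> np.ndarray:
    """Illustrative 'Hexadata'-style packing: encode each group of six bits
    as a single value in 0..63 (lossless, reversible).  The grouping of six
    items per symbol is an assumption made for this sketch."""
    flat = bits.flatten()
    pad = (-len(flat)) % 6                      # pad so length is a multiple of 6
    flat = np.concatenate([flat, np.zeros(pad, dtype=flat.dtype)])
    groups = flat.reshape(-1, 6)
    weights = 1 << np.arange(6)                 # 1, 2, 4, 8, 16, 32
    return groups @ weights                     # one symbol per 6-bit group

# Example: keep only the two most significant bit planes of a synthetic image.
img = np.random.randint(0, 256, (256, 256), dtype=np.uint8)   # stand-in eye image
planes = bit_planes(img)
msb_planes = planes[6:]                         # planes 6 and 7 (most significant)
encoded = [pack_six(p.astype(np.uint8)) for p in msb_planes]
print(len(encoded[0]))                          # ceil(256*256 / 6) symbols per plane
```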

Publication Date: Mon Jan 30 2023
Journal Name: Iraqi Journal Of Science
Secure Big Data Transmission based on Modified Reverse Encryption and Genetic Algorithm

The modern systems that are based upon hash functions are more suitable than the conventional systems; however, the complicated algorithms for generating invertible functions are highly time-consuming. With the use of GAs, the key strength is enhanced, which ultimately makes the entire algorithm sufficient. Initially, key generation is performed using the results of the n-queens problem, which is solved by the genetic algorithm with a random number generator and the application of GA operations. Finally, the data is encrypted with the Modified Reverse Encryption Algorithm (MREA). It was noticed that the …
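A minimal sketch of the key-generation stage described above: a simple genetic algorithm solves the n-queens problem and the resulting board is folded into a byte key. The board size, GA operators, and the toy key-derivation step are assumptions made for illustration; the MREA cipher itself is not reproduced here.

```python
import random

N = 8            # board size (assumed)
POP, GENS = 200, 500

def conflicts(board):
    """Number of attacking queen pairs; 0 means a valid n-queens solution."""
    return sum(1 for i in range(N) for j in range(i + 1, N)
               if board[i] == board[j] or abs(board[i] - board[j]) == j - i)

def solve_nqueens():
    pop = [random.sample(range(N), N) for _ in range(POP)]
    for _ in range(GENS):
        pop.sort(key=conflicts)
        if conflicts(pop[0]) == 0:
            return pop[0]
        nxt = pop[:POP // 5]                       # elitism
        while len(nxt) < POP:
            a, b = random.sample(pop[:POP // 2], 2)
            cut = random.randrange(1, N)
            child = a[:cut] + [g for g in b if g not in a[:cut]]  # order crossover
            if random.random() < 0.3:              # swap mutation
                i, j = random.sample(range(N), 2)
                child[i], child[j] = child[j], child[i]
            nxt.append(child)
        pop = nxt
    return min(pop, key=conflicts)

solution = solve_nqueens()                          # queen column per row
key = bytes((c * 31 + r) % 256 for r, c in enumerate(solution))  # toy key schedule
print(solution, key.hex())
```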

Publication Date: Mon Sep 03 2012
Journal Name: The International Archives Of The Photogrammetry, Remote Sensing And Spatial Information Sciences
CALIBRATION OF FULL-WAVEFORM ALS DATA BASED ON ROBUST INCIDENCE ANGLE ESTIMATION

Full-waveform airborne laser scanning data has shown its potential to enhance available segmentation and classification approaches through the additional information it can provide. However, this additional information cannot directly provide a valid physical representation of surface features, due to the many variables affecting the backscattered energy during its travel between the sensor and the target. Effectively, this produces a mismatch between signals from overlapping flightlines. Therefore, direct use of this information is not recommended without the adoption of a comprehensive radiometric calibration strategy that accounts for all these effects. This paper presents a practical and reliable radiometric calibration …
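For orientation, the sketch below shows a generic incidence-angle and range correction of return intensity under a Lambertian assumption; it is a simplified stand-in, not the calibration strategy developed in the paper.

```python
import numpy as np

# Generic radiometric correction sketch (not the paper's exact calibration):
# normalise return intensity for range and local incidence angle under a
# Lambertian-reflector assumption, using a reference range r_ref.

def corrected_intensity(intensity, rng, incidence_deg, r_ref=1000.0):
    """intensity : raw amplitude per echo
       rng       : sensor-to-target range [m]
       incidence_deg : angle between laser beam and surface normal [deg]"""
    cos_i = np.cos(np.radians(incidence_deg))
    cos_i = np.clip(cos_i, 0.1, 1.0)                 # guard against grazing angles
    return intensity * (rng / r_ref) ** 2 / cos_i    # range^2 spreading + cosine term

# Example: two overlapping flightlines seeing the same flat roof
print(corrected_intensity(np.array([120., 95.]),
                          np.array([900., 1300.]),
                          np.array([10., 35.])))
```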

Publication Date: Sat Dec 02 2017
Journal Name: Al-khwarizmi Engineering Journal
Speech Signal Compression Using Wavelet And Linear Predictive Coding

A new algorithm is proposed to compress speech signals using the wavelet transform and linear predictive coding. Signal compression is based on the concept of selecting a small number of approximation coefficients, obtained from the wavelet decomposition (Haar and db4) at a suitably chosen level, while ignoring the detail coefficients; the approximation coefficients are then windowed by a rectangular window and fed to the linear predictor. The Levinson-Durbin algorithm is used to compute the LP coefficients, reflection coefficients and prediction error. The compressed files contain the LP coefficients and the previous sample. These files are very small in size compared to the original signals. The compression ratio is calculated from the size of the …
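The following sketch mirrors the described pipeline: keep only Haar approximation coefficients, then fit a linear predictor with the Levinson-Durbin recursion. The decomposition level, predictor order, and framing are illustrative choices rather than the paper's settings.

```python
import numpy as np

def haar_approx(x, levels=2):
    """Repeatedly keep the Haar approximation band (pairwise scaled averages);
    the detail bands are discarded, as described above."""
    for _ in range(levels):
        if len(x) % 2:
            x = x[:-1]
        x = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    return x

def levinson_durbin(r, order):
    """Return LP coefficients, reflection coefficients and prediction error
    from the autocorrelation sequence r[0..order]."""
    a = np.zeros(order + 1); a[0] = 1.0
    e = r[0]
    k = np.zeros(order)
    for i in range(1, order + 1):
        acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])
        k[i - 1] = -acc / e
        a[1:i + 1] += k[i - 1] * a[i - 1::-1][:i]
        e *= (1.0 - k[i - 1] ** 2)
    return a, k, e

speech = np.random.randn(16000)                    # stand-in speech frame
approx = haar_approx(speech, levels=2)
r = np.array([np.dot(approx[:len(approx) - lag], approx[lag:]) for lag in range(11)])
lpc, refl, err = levinson_durbin(r, order=10)
print(lpc.round(3), err)
```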

Publication Date: Wed Dec 30 2020
Journal Name: Iraqi Journal Of Science
DNA Encoding for Misuse Intrusion Detection System based on UNSW-NB15 Data Set

Recent research showed that DNA encoding and pattern matching can be used for intrusion detection systems (IDS), with high attack-detection rates. The evaluation of these intrusion detection systems is based on datasets that were generated decades ago. However, numerous studies have pointed out that these datasets neither inclusively reflect the network traffic nor the modern low-footprint attacks, and do not cover the current network threat environment. In this paper, a new DNA encoding for a misuse IDS based on the UNSW-NB15 dataset is proposed. The proposed system is built by constructing a DNA encoding for all values of the 49 attributes. Then attack keys (based on attack signatures) are extracted and, finally, the Raita algorithm is applied …
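A compact sketch of the two stages described above: attribute values are encoded as DNA-like strings and an attack key is searched for with the Raita algorithm. The two-bit value-to-nucleotide mapping is an assumed encoding for illustration, not the paper's exact code table.

```python
NUCLEOTIDES = "ACGT"

def dna_encode(value: int, width: int = 8) -> str:
    """Encode an integer attribute as `width` nucleotides (2 bits per symbol)."""
    return "".join(NUCLEOTIDES[(value >> (2 * i)) & 3] for i in reversed(range(width)))

def raita_search(text: str, pattern: str) -> int:
    """Raita algorithm: Horspool bad-character shifts plus last/first/middle probes.
    Returns the index of the first occurrence, or -1."""
    m, n = len(pattern), len(text)
    if m == 0 or m > n:
        return -1
    shift = {c: m - i - 1 for i, c in enumerate(pattern[:-1])}   # bad-character table
    last, first, mid = pattern[-1], pattern[0], pattern[m // 2]
    i = m - 1
    while i < n:
        if text[i] == last and text[i - m + 1] == first \
                and text[i - m + 1 + m // 2] == mid \
                and text[i - m + 1:i + 1] == pattern:
            return i - m + 1
        i += shift.get(text[i], m)
    return -1

# Example: one encoded connection record and one encoded attack key
record = "".join(dna_encode(v) for v in [443, 6, 1, 255, 17])   # toy attribute values
attack_key = dna_encode(255)
print(raita_search(record, attack_key))          # index of the key inside the record
```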

Publication Date: Mon Jan 03 2022
Journal Name: Iraqi Journal Of Science
Accuracy Assessment of 3D Model Based on Laser Scan and Photogrammetry Data: Introduction

A three-dimensional (3D) model extraction represents the best way to reflect reality in all its details. This explains the tendency of many scientific disciplines towards making measurements, calculations and monitoring in various fields using such models. Although there are many ways to produce a 3D model, such as images, integration techniques, and laser scanning, the quality of their products is not the same in terms of accuracy and detail. This article aims to assess the accuracy of the 3D point-cloud models resulting from close-range images and laser-scan data, based on Agisoft PhotoScan and CloudCompare software, to determine the compatibility of both datasets for several applications. College of Science …
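As a rough stand-in for the CloudCompare comparison mentioned above, the sketch below computes cloud-to-cloud nearest-neighbour distances between a laser-scan cloud and a photogrammetric cloud; the synthetic data and the reported statistics are illustrative only.

```python
import numpy as np
from scipy.spatial import cKDTree

def c2c_distances(reference: np.ndarray, compared: np.ndarray) -> np.ndarray:
    """Nearest-neighbour distance from each compared point to the reference cloud."""
    tree = cKDTree(reference)
    d, _ = tree.query(compared, k=1)
    return d

laser = np.random.rand(10000, 3) * 10.0                    # stand-in laser-scan cloud [m]
photo = laser + np.random.normal(0, 0.01, laser.shape)     # noisy photogrammetric copy
d = c2c_distances(laser, photo)
print(f"mean={d.mean():.4f} m  RMSE={np.sqrt((d**2).mean()):.4f} m  max={d.max():.4f} m")
```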

Publication Date: Thu Jun 01 2023
Journal Name: Bulletin Of Electrical Engineering And Informatics
A missing data imputation method based on salp swarm algorithm for diabetes disease

Most medical datasets suffer from missing data, due to the expense of some tests or human error while recording them. This issue affects the performance of machine learning models because the values of some features will be missing. Therefore, there is a need for specific methods for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used for generating and imputing the missing values in the Pima Indian diabetes disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, which are support vector machine (SVM), K-nearest neighbour (KNN), and Naïve Bayes …
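A hedged sketch of SSA-driven imputation follows: a salp swarm searches for the missing values of one record so that the completed record lies close to an existing complete record. This fitness function and all parameter choices are assumptions made for illustration; the paper's ISSA objective may differ.

```python
import numpy as np

def ssa_impute(record, missing_idx, complete_rows, lb, ub, n_salps=30, iters=100):
    """Salp swarm search over the missing entries of `record` within [lb, ub]."""
    dim = len(missing_idx)
    pop = np.random.uniform(lb, ub, (n_salps, dim))

    def fitness(vals):
        trial = record.copy()
        trial[missing_idx] = vals
        return np.min(np.linalg.norm(complete_rows - trial, axis=1))   # assumed objective

    scores = np.array([fitness(p) for p in pop])
    best = pop[scores.argmin()].copy()
    for t in range(iters):
        c1 = 2.0 * np.exp(-(4.0 * t / iters) ** 2)          # SSA exploration coefficient
        for i in range(n_salps):
            if i == 0:                                       # leader salp moves around best
                c2, c3 = np.random.rand(dim), np.random.rand(dim)
                step = c1 * ((ub - lb) * c2 + lb)
                pop[i] = np.where(c3 >= 0.5, best + step, best - step)
            else:                                            # followers average with predecessor
                pop[i] = (pop[i] + pop[i - 1]) / 2.0
            pop[i] = np.clip(pop[i], lb, ub)
        scores = np.array([fitness(p) for p in pop])
        if scores.min() < fitness(best):
            best = pop[scores.argmin()].copy()
    record = record.copy()
    record[missing_idx] = best
    return record

# Toy example with two features; the bounds stand in for PIDD attribute ranges.
complete = np.array([[120., 30.], [90., 25.], [150., 45.]])
incomplete = np.array([np.nan, 28.])
print(ssa_impute(incomplete, [0], complete, lb=50., ub=200.))
```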

Publication Date: Mon Aug 01 2022
Journal Name: Baghdad Science Journal
A Novel Technique for Secure Data Cryptosystem Based on Chaotic Key Image Generation

The advancements in Information and Communication Technology (ICT) within the previous decades have significantly changed the way people transmit or store their information over the Internet or networks. So, one of the main challenges is to keep this information safe against attacks. Many researchers and institutions have realized the importance and benefits of cryptography in achieving the efficiency and effectiveness of various aspects of secure communication. This work adopts a novel technique for a secure data cryptosystem based on chaos theory. The proposed algorithm generates a 2-dimensional key matrix having the same dimensions as the original image, consisting of random numbers obtained from the 1-dimensional logistic chaotic map for given …
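The sketch below illustrates how a key image of the same size as the plaintext can be generated from the 1-dimensional logistic map; the XOR combining step and the chosen control parameter and seed are assumptions made for illustration.

```python
import numpy as np

def logistic_key_image(shape, r=3.99, x0=0.61):
    """Iterate x_{n+1} = r*x_n*(1 - x_n) and quantise each state to a byte."""
    n = shape[0] * shape[1]
    x = x0
    key = np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1.0 - x)
        key[i] = int(x * 256) % 256
    return key.reshape(shape)

def encrypt(image: np.ndarray, r=3.99, x0=0.61) -> np.ndarray:
    key = logistic_key_image(image.shape, r, x0)
    return image ^ key                     # XOR is its own inverse, so decryption reuses it

img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)    # stand-in plaintext image
cipher = encrypt(img)
assert np.array_equal(encrypt(cipher), img)                  # decrypt by re-applying XOR
```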

Publication Date: Fri Mar 20 2020
Journal Name: Remote Sensing
Lossy and Lossless Video Frame Compression: A Novel Approach for High-Temporal Video Data Analytics

The smart city concept has attracted high research attention in recent years within diverse application domains, such as crime suspect identification, border security, transportation, aerospace, and so on. Specific focus has been on increased automation using data-driven approaches, while leveraging remote sensing and real-time streaming of heterogeneous data from various resources, including unmanned aerial vehicles, surveillance cameras, and low-earth-orbit satellites. One of the core challenges in the exploitation of such high-temporal data streams, specifically videos, is the trade-off between the quality of video streaming and the limited transmission bandwidth. An optimal compromise is needed between video quality and, subsequently, …

Publication Date: Fri Mar 31 2017
Journal Name: Al-khwarizmi Engineering Journal
Big-data Management using Map Reduce on Cloud: Case study, EEG Images' Data

A database is characterized as an arrangement of data that is organized and distributed in a way that allows users to access the stored data simply and conveniently. However, in the era of big data, traditional methods of data analytics may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. This approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement for managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG …
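As an illustration of the Map-Reduce pattern mentioned above, the following Hadoop-Streaming-style mapper and reducer compute a per-channel average over EEG records; the CSV record format, the script name, and the chosen statistic are assumed, since the article's exact job is not described here.

```python
#!/usr/bin/env python3
# Hypothetical eeg_job.py: run via Hadoop Streaming, e.g.
#   hadoop jar hadoop-streaming.jar -mapper "eeg_job.py map" -reducer "eeg_job.py reduce" ...
import sys

def mapper(lines):
    """Map: each input line is 'timestamp,v1,v2,...'; emit (channel, value) pairs."""
    for line in lines:
        fields = line.strip().split(",")
        for ch, value in enumerate(fields[1:], start=1):
            print(f"ch{ch}\t{value}")

def reducer(lines):
    """Reduce: average the values of each channel (input arrives sorted by key)."""
    current, total, count = None, 0.0, 0
    for line in lines:
        channel, value = line.strip().split("\t")
        if channel != current:
            if current is not None:
                print(f"{current}\t{total / count:.4f}")
            current, total, count = channel, 0.0, 0
        total += float(value)
        count += 1
    if current is not None:
        print(f"{current}\t{total / count:.4f}")

if __name__ == "__main__":
    (mapper if sys.argv[1] == "map" else reducer)(sys.stdin)
```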

Publication Date: Tue Jan 03 2023
Journal Name: College Of Islamic Sciences
Ruling on selling big data (Authentical Fiqh Study)

Abstract:

Research topic: The ruling on the sale of big data.

Objectives: A statement of what big data is, its importance, its source and its ruling.

Methodology: Inductive, comparative and critical.

One of the most important results: Big data is valuable property that may not be infringed upon, and its sale is permissible as long as it does not include data of users who have not consented to its sale.

Recommendation: Follow up on studies dealing with the rulings on this issue.

Subject Terms

Judgment, Sale, Data, Mega, Sayings, Jurists

 
