Articles
Iris Data Compression Based on Hexa-Data Coding

Iris research focuses on developing techniques for identifying and locating relevant biometric features, accurate segmentation, and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system when used as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method proceeds in a number of steps: converting the grayscale eye image to a bit-plane representation, selecting the most significant bit planes, and then parameterizing the iris location, resulting in an accurate segmentation of the iris from the original image. A lossless Hexadata encoding method is then applied to the data, based on reducing each set of six data items to a single encoded value. The tested results achieved acceptable byte-saving performance for the 21 square iris images of size 256x256 pixels, about 22.4 KB on average with an average decompression time of 0.79 sec, and high byte-saving performance for the 2 non-square iris images of sizes 640x480 and 2048x1536, which reached 76 KB / 2.2 sec and 1630 KB / 4.71 sec respectively. Finally, the proposed technique outperformed the standard lossless JPEG2000 compression technique, saving about 1.2 KB or more, implicitly demonstrating the power and efficiency of the suggested lossless biometric technique.
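
The two core steps named above, bit-plane decomposition of the grayscale eye image and grouping the data into sets of six items, can be sketched as follows. This is a minimal illustration rather than the paper's implementation: the actual Hexadata codebook and the choice of significant planes are not specified in the abstract, so the packing and the four-plane cut below are assumptions.

```python
import numpy as np

def bit_planes(gray):
    """Decompose an 8-bit grayscale image into its 8 bit planes.
    planes[7] is the most significant plane."""
    return [(gray >> b) & 1 for b in range(8)]

def hexa_groups(data, fill=0):
    """Group a flat byte sequence into sets of six items, as in the
    Hexadata idea of mapping each six-item set to one encoded value.
    The real encoding table is not given in the abstract; each group
    is returned for a downstream (hypothetical) encoder."""
    data = np.asarray(data).ravel()
    pad = (-len(data)) % 6
    data = np.concatenate([data, np.full(pad, fill, data.dtype)])
    return data.reshape(-1, 6)

# Usage: keep only the most significant planes before encoding.
gray = np.random.randint(0, 256, (256, 256), dtype=np.uint8)  # stand-in for an eye image
planes = bit_planes(gray)
significant = np.stack(planes[4:])       # top 4 planes, an illustrative choice
groups = hexa_groups(np.packbits(significant))
print(groups.shape)  # (n_groups, 6)
```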

Publication Date
Tue Aug 10 2021
Journal Name
Design Engineering
Lossy Image Compression Using Hybrid Deep Learning Autoencoder Based On K-means Clustering

Image compression plays an important role in reducing the size and storage of data while significantly increasing the speed of its transmission through the Internet. Image compression has been an important research topic for several decades, and recently, with the great successes achieved by deep learning in many areas of image processing, its use in image compression has been increasing gradually. Deep neural networks have also achieved great success in processing and compressing images of different sizes. In this paper, we present a structure for image compression based on a Convolutional AutoEncoder (CAE) for deep learning, inspired by the diversity of the human eye …

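The abstract names a Convolutional AutoEncoder (CAE) as the compression backbone. Below is a minimal PyTorch sketch of such an encoder/decoder pair; the layer sizes, latent width, and loss are illustrative assumptions, not the paper's architecture (the k-means clustering stage is not shown).

```python
import torch
import torch.nn as nn

class CAE(nn.Module):
    """Minimal convolutional autoencoder: the encoder downsamples to a
    compact latent map (the 'compressed' representation) and the decoder
    reconstructs the image. Layer sizes are illustrative only."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 4, 3, stride=2, padding=1),  # 4-channel latent
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(4, 16, 3, stride=2, padding=1, output_padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 3, stride=2, padding=1, output_padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

x = torch.rand(1, 1, 64, 64)                 # stand-in image batch
model = CAE()
loss = nn.functional.mse_loss(model(x), x)   # reconstruction objective
```
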
Publication Date
Sun Nov 19 2017
Journal Name
Journal Of Al-qadisiyah For Computer Science And Mathematics
Image Compression based on Fixed Predictor Multiresolution Thresholding of Linear Polynomial Nearlossless Techniques

Image compression is a serious issue in computer storage and transmission that simply makes efficient use of the redundancy embedded within an image itself; in addition, it may exploit human vision or perception limitations to reduce imperceivable information. Polynomial coding is a modern image compression technique based on a modelling concept to effectively remove the spatial redundancy embedded within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed: the first stage utilizes the lossy predictor model along with a multiresolution base and thresholding techniques, and the second stage incorporates near-lossless compression …

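Polynomial coding, as described, splits the image into a mathematical model and a residual. The sketch below illustrates the first-stage idea with a first-degree polynomial predictor fitted per block, leaving a residual for thresholding; the block size, model order, and tolerance are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def linear_predict_block(block):
    """Fit a first-degree polynomial surface a0 + a1*x + a2*y to a block
    (the 'mathematical model'); the residual is what gets thresholded
    and coded. Block size and model order are illustrative."""
    h, w = block.shape
    ys, xs = np.mgrid[0:h, 0:w]
    A = np.column_stack([np.ones(h * w), xs.ravel(), ys.ravel()])
    coeffs, *_ = np.linalg.lstsq(A, block.ravel().astype(float), rcond=None)
    model = (A @ coeffs).reshape(h, w)
    return coeffs, block - model

block = np.random.randint(0, 256, (8, 8))
coeffs, residual = linear_predict_block(block)
# Near-lossless step: quantize the residual within a tolerance, e.g. +/-2,
# so reconstruction error never exceeds tol.
tol = 2
quantized = np.round(residual / (2 * tol + 1)) * (2 * tol + 1)
```
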
Publication Date
Sun Mar 30 2014
Journal Name
Iraqi Journal Of Chemical And Petroleum Engineering
Estimation Liquid Permeability Using Air Permeability Laboratory Data

Permeability data is of major importance and should be handled carefully in all reservoir simulation studies. The importance of permeability data increases in mature oil and gas fields due to its sensitivity to the requirements of some specific improved-recovery methods. However, the industry has a huge store of air permeability measurements against a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation to convert air permeability data, conventionally measured during laboratory core analysis, into liquid permeability. This correlation offers a feasible estimate in cases of data loss and poorly consolidated formations, or in cases …

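The study's own correlation and its fitted coefficients are not reproduced in this excerpt. As a hedged illustration of what an air-to-liquid permeability conversion looks like, the sketch below uses the classical Klinkenberg gas-slippage correction with invented numbers.

```python
def klinkenberg_liquid_perm(k_air, mean_pressure, b):
    """Classical Klinkenberg relation: k_air = k_liquid * (1 + b / p_mean),
    so k_liquid = k_air / (1 + b / p_mean). The study's own correlation
    (and its fitted coefficients) is not reproduced here; b is the
    gas-slippage factor, typically fitted from core data."""
    return k_air / (1.0 + b / mean_pressure)

# Usage with illustrative numbers (md, psi):
print(klinkenberg_liquid_perm(k_air=150.0, mean_pressure=50.0, b=12.0))
```
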
Publication Date
Sat Dec 01 2012
Journal Name
Journal Of Economics And Administrative Sciences
Using panel data in structural equations with application

A non-stationary series is always a problem for statistical analysis: as some theoretical work has explained, the statistical properties of regression analysis are lost when non-stationary series are used, yielding the slope of a spurious relation under consideration. A non-stationary series can be made stationary by adding a time variable to the multivariate analysis to remove the general trend, adding dummy seasonal variables to remove the seasonal effect, converting the data to exponential or logarithmic form, or applying the difference operator repeatedly, d times, in which case the series is said to be integrated of order d. The theoretical side of the research consists of several parts; the first part covers the research methodology …

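The differencing step described above, repeatedly applying the difference operator until the series becomes stationary so that the original series is integrated of order d, can be sketched as follows; the trend-plus-noise series is synthetic.

```python
import numpy as np

def difference(series, d=1):
    """Apply the difference operator d times; if the result is stationary,
    the original series is said to be integrated of order d."""
    x = np.asarray(series, dtype=float)
    for _ in range(d):
        x = np.diff(x)
    return x

# Illustrative: a series with a linear trend becomes stationary after one difference.
t = np.arange(100)
y = 0.5 * t + np.random.normal(size=100)   # trend + noise, non-stationary
print(difference(y, d=1).mean())           # differenced series is stationary around the slope 0.5
```
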
Publication Date
Fri Apr 01 2022
Journal Name
Baghdad Science Journal
Data Mining Techniques for Iraqi Biochemical Dataset Analysis

This research aims to analyze and simulate real biochemical test data to uncover the relationships among the tests and how each of them impacts the others. The data were acquired from an Iraqi private biochemical laboratory; however, they have many dimensions, a high rate of null values, and a large number of patients. Several experiments were applied to these data, beginning with unsupervised techniques such as hierarchical clustering and k-means, but the results were not clear. A preprocessing step was then performed to make the dataset analyzable by supervised techniques such as Linear Discriminant Analysis (LDA), Classification And Regression Tree (CART), Logistic Regression (LR), K-Nearest Neighbor (K-NN), and Naïve Bayes (NB) …

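A minimal sketch of the supervised stage described above, imputation-based preprocessing followed by one of the named classifiers (LDA), is shown below with synthetic stand-in data; the imputation strategy and dataset shape are assumptions, not the paper's protocol.

```python
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.pipeline import make_pipeline
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Stand-in data: many dimensions with a high rate of null values,
# as described for the biochemical dataset (values are synthetic).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))
X[rng.random(X.shape) < 0.3] = np.nan      # ~30% nulls
y = rng.integers(0, 2, size=200)

# Preprocessing (mean imputation is an illustrative choice) + LDA,
# one of the supervised methods the abstract compares.
model = make_pipeline(SimpleImputer(strategy="mean"), LinearDiscriminantAnalysis())
print(cross_val_score(model, X, y, cv=5).mean())
```
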
Publication Date
Sat Sep 08 2018
Journal Name
Proceedings Of The 2018 International Conference On Computing And Big Data
3D Parallel Coordinates for Multidimensional Data Cube Exploration

Visual analytics is becoming an important approach for discovering patterns in big data. As visualization struggles with the high dimensionality of data, issues like the concept hierarchy on each dimension add more difficulty and make visualization a prohibitive task. A data cube offers multi-perspective aggregated views of large data sets and has important applications in business and many other areas. It has high dimensionality, concept hierarchies, a vast number of cells, and comes with special exploration operations such as roll-up, drill-down, slicing, and dicing. All these issues make data cubes very difficult to explore visually. Most existing approaches visualize a data cube in 2D space and require preprocessing steps. In this paper, we propose a visualization …

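The cube operations named above (roll-up, drill-down, slicing) can be illustrated with a small pandas sketch; the fact table and its dimensions are invented and unrelated to the paper's visualization design.

```python
import pandas as pd

# Toy fact table; dimensions and values are invented for illustration.
sales = pd.DataFrame({
    "year": [2017, 2017, 2018, 2018],
    "month": [1, 2, 1, 2],
    "city": ["Baghdad", "Basra", "Baghdad", "Basra"],
    "amount": [10, 20, 30, 40],
})

# Drill-down view: finest granularity (year, month, city).
cube = sales.groupby(["year", "month", "city"])["amount"].sum()

# Roll-up along the time hierarchy: month -> year.
rolled_up = sales.groupby(["year", "city"])["amount"].sum()

# Slice: fix one dimension to a single value.
slice_2018 = sales[sales["year"] == 2018]
print(rolled_up)
```
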
Publication Date
Thu Oct 01 2020
Journal Name
Bulletin Of Electrical Engineering And Informatics
Traffic management inside software-defined data centre networking

In recent years, data centre (DC) networks have improved their rapid-exchange abilities. Software-defined networking (SDN) is presented as an alternative to the design of conventional networks, segregating the control plane from the data plane. SDN overcomes the limitations of traditional DC networks caused by the rapidly increasing numbers of apps, websites, data storage needs, etc. Software-defined networking data centres (SDN-DC), based on the OpenFlow (OF) protocol, are used to achieve superior behaviour when executing traffic load-balancing (LB) jobs. The LB function divides the traffic-flow demands between the end devices to avoid link congestion. In short, SDN is proposed to manage more operative configurations …

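A minimal sketch of the load-balancing idea, assigning each flow to the candidate path whose busiest link is currently lightest, is given below; the topology, demands, and selection rule are illustrative assumptions, not the paper's algorithm.

```python
# Minimal least-loaded path selection: assign each flow to the candidate
# path whose most-loaded link is currently lightest (illustrative only).
paths = {  # candidate paths between two hosts, as lists of links
    "p1": ["s1-s2", "s2-s4"],
    "p2": ["s1-s3", "s3-s4"],
}
load = {link: 0.0 for p in paths.values() for link in p}

def assign(flow_mbps):
    best = min(paths, key=lambda p: max(load[l] for l in paths[p]))
    for link in paths[best]:
        load[link] += flow_mbps
    return best

for demand in [100, 80, 60]:
    print(assign(demand), load)
```
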
Publication Date
Sun Dec 01 2019
Journal Name
Journal Of Economics And Administrative Sciences
Estimating the reliability function of Kumaraswamy distribution data

The aim of this study is to estimate the parameters and the reliability function of the Kumaraswamy distribution with two positive parameters (a, b > 0), a continuous probability distribution that shares many characteristics with the beta distribution along with extra advantages.

The shape of the distribution function and its most important characteristics are explained, and the two parameters (a, b) and the reliability function are estimated using the maximum likelihood method (MLE) and Bayes methods. Simulation experiments are conducted to examine the behaviour of the estimation methods for different sample sizes, based on the mean squared error criterion; the results show that the Bayes method is bet…

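A minimal maximum-likelihood sketch for the Kumaraswamy model follows, using its density f(x) = a*b*x^(a-1)*(1-x^a)^(b-1) on (0, 1) and reliability R(t) = (1-t^a)^b; the Bayes estimator and the paper's simulation design are not reproduced, and the sample size and true parameters are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(params, x):
    """Negative log-likelihood of the Kumaraswamy(a, b) distribution,
    with pdf f(x) = a*b*x**(a-1)*(1-x**a)**(b-1) on (0, 1)."""
    a, b = params
    if a <= 0 or b <= 0:
        return np.inf
    return -np.sum(np.log(a) + np.log(b) + (a - 1) * np.log(x)
                   + (b - 1) * np.log1p(-x**a))

def reliability(t, a, b):
    """R(t) = 1 - F(t) = (1 - t**a)**b for the Kumaraswamy distribution."""
    return (1.0 - t**a) ** b

# MLE on simulated data (true a=2, b=3), sampled via the inverse CDF.
rng = np.random.default_rng(1)
u = rng.uniform(size=500)
x = (1.0 - (1.0 - u) ** (1.0 / 3.0)) ** (1.0 / 2.0)
res = minimize(neg_log_lik, x0=[1.0, 1.0], args=(x,), method="Nelder-Mead")
a_hat, b_hat = res.x
print(a_hat, b_hat, reliability(0.5, a_hat, b_hat))
```
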
Publication Date
Mon Sep 01 2008
Journal Name
Al-khwarizmi Engineering Journal
New Adaptive Data Transmission Scheme Over HF Radio

An acceptable bit error rate can be maintained by adapting some of the design parameters, such as modulation, symbol rate, constellation size, and transmit power, according to the channel state.

An estimate of HF propagation effects can be used to design an adaptive data transmission system over an HF link. The proposed system combines the well-known Automatic Link Establishment (ALE) with a variable-rate transmission system. The standard ALE is modified to suit the required goal of selecting the best carrier frequency (channel) for a given transmission. This is based on measuring SINAD (Signal plus Noise plus Distortion to Noise plus Distortion), RSL (Received Signal Level), multipath phase distortion, and BER (Bit Error Rate) for …

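A hedged sketch of the channel-selection step, scoring each candidate carrier from the measured link metrics and picking the best, is given below; the frequencies, measurements, and weights are invented, not the paper's modified-ALE procedure.

```python
# Pick the best HF channel from measured link-quality metrics, in the
# spirit of the modified ALE described above. Weights are invented.
channels = {
    # freq_kHz: (sinad_dB, rsl_dBm, ber)
    5732: (18.0, -95.0, 1e-4),
    9025: (22.0, -88.0, 5e-5),
    11175: (15.0, -100.0, 3e-4),
}

def score(sinad, rsl, ber):
    # Higher SINAD and RSL are better; lower BER is better.
    return sinad + 0.5 * (rsl + 120) - 10_000 * ber

best = max(channels, key=lambda f: score(*channels[f]))
print(f"selected carrier: {best} kHz")
```
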
Publication Date
Tue Mar 30 2021
Journal Name
Wasit Journal Of Computer And Mathematics Science
Dynamic Data Replication for Higher Availability and Security

The paradigm and domain of data security are key concerns in the current era, in which data is transmitted over multiple channels from multiple sources. Data leakage and security loopholes are enormous, and there is a need to enforce higher levels of security, privacy, and integrity. Affected sectors include e-governance, social networking, e-commerce, transportation, logistics, professional communications, and many others. Work on security and integrity is very prominent in both network-based and private environments. This manuscript presents the effective use of a security-based methodology implemented with blockchain …

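A minimal sketch combining the abstract's two themes, replication of each write to several nodes for availability and hash-chaining of writes for blockchain-style integrity, follows; node names and record formats are invented, and this is not the paper's design.

```python
import hashlib

# Replicate each record to every node (availability) and hash-chain the
# writes (integrity, in the spirit of a blockchain). Illustrative only.
nodes = {"n1": [], "n2": [], "n3": []}
prev_hash = "0" * 64

def replicate(record):
    global prev_hash
    entry_hash = hashlib.sha256((prev_hash + record).encode()).hexdigest()
    for store in nodes.values():          # write to every replica
        store.append((record, entry_hash))
    prev_hash = entry_hash                # chain the next write to this one
    return entry_hash

replicate("patient-42:update")
replicate("patient-42:read-grant")
print(all(len(s) == 2 for s in nodes.values()))  # every node holds both entries
```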