Iris Data Compression Based on Hexa-Data Coding

Iris research focuses on developing techniques for identifying and locating relevant biometric features, with accurate segmentation and efficient computation that lend themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics which, in turn, reduces the effectiveness of the system in real-time use. This paper introduces a novel parameterized technique for iris segmentation. The method proceeds in several steps: converting the grayscale eye image to a bit-plane representation, selecting the most significant bit planes, and parameterizing the iris location, resulting in an accurate segmentation of the iris from the original image. A lossless Hexadata encoding method is then applied to the data, which reduces each set of six data items to a single encoded value. The tests achieved acceptable byte-saving performance for the 21 square iris images of 256x256 pixels, about 22.4 KB on average with an average decompression time of 0.79 sec, and high byte-saving performance for the 2 non-square iris images of 640x480 and 2048x1536 pixels, reaching 76 KB/2.2 sec and 1630 KB/4.71 sec respectively. Finally, the proposed technique was compared with the standard lossless JPEG2000 compression technique, giving a further saving of about 1.2 KB or more, which implicitly demonstrates the power and efficiency of the suggested lossless biometric technique.

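The abstract names two concrete steps: bit-plane decomposition of the grayscale eye image and a lossless Hexadata step that reduces each run of six data items to one encoded value. The paper's actual Hexadata codebook is not given here, so the following is only a minimal Python sketch under that assumption, with each distinct 6-tuple assigned a dictionary index so the mapping stays lossless; the synthetic "eye" image and the choice of three most significant planes are illustrative.

```python
import numpy as np

def bit_planes(gray):
    """Split an 8-bit grayscale image into its 8 bit planes (plane 7 = MSB)."""
    return [(gray >> b) & 1 for b in range(8)]

def hexa_group_encode(data):
    """Illustrative lossless grouping: map each run of six symbols to a single
    code drawn from a dictionary of the distinct 6-tuples seen in the data."""
    data = list(data)
    pad = (-len(data)) % 6                          # pad to a multiple of six
    data += [0] * pad
    groups = [tuple(data[i:i + 6]) for i in range(0, len(data), 6)]
    book = {g: i for i, g in enumerate(dict.fromkeys(groups))}
    codes = [book[g] for g in groups]
    return codes, book, pad                         # everything needed to invert the mapping

def hexa_group_decode(codes, book, pad):
    inv = {i: g for g, i in book.items()}
    flat = [v for c in codes for v in inv[c]]
    return flat[:len(flat) - pad] if pad else flat

# usage: pack the most significant bit planes of a synthetic eye image, then group them
eye = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
packed = np.packbits(bit_planes(eye)[5:]).tolist()  # three most significant planes, as bytes
codes, book, pad = hexa_group_encode(packed)
assert hexa_group_decode(codes, book, pad) == packed
print(len(packed), "bytes grouped into", len(codes), "codes,", len(book), "dictionary entries")
```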
Publication Date: Fri Feb 01 2019
Journal Name: Journal Of Economics And Administrative Sciences
Comparison of estimation methods of the entropy function for the random coefficients of two models: the general regression and Swamy models of panel data

In this study, we focused on estimating the random coefficients of the general regression and Swamy models of panel data. This type of data gives a better chance of obtaining a better method and better indicators. Entropy methods have been used to estimate the random coefficients for the general regression and Swamy models of panel data in two ways: the first is the maximum dual entropy and the second is the general maximum entropy. A comparison between them was carried out using simulation to choose the optimal method.

The results were compared using mean squared error and mean absolute percentage error for different cases in terms of correlation value…

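The two comparison criteria named above, mean squared error and mean absolute percentage error, can be computed as in the short sketch below; the coefficient values and estimator labels are hypothetical, purely to show how the two criteria rank competing estimates.

```python
import numpy as np

def mse(true, est):
    """Mean squared error between true and estimated coefficients."""
    true, est = np.asarray(true, float), np.asarray(est, float)
    return np.mean((true - est) ** 2)

def mape(true, est):
    """Mean absolute percentage error (true values assumed non-zero)."""
    true, est = np.asarray(true, float), np.asarray(est, float)
    return 100.0 * np.mean(np.abs((true - est) / true))

# usage: compare two hypothetical estimators of the same random coefficients
beta_true = np.array([1.2, 0.8, 2.5])
beta_gme  = np.array([1.1, 0.9, 2.4])   # e.g. a general maximum entropy estimate
beta_dual = np.array([1.3, 0.7, 2.7])   # e.g. a maximum dual entropy estimate
for name, b in [("GME", beta_gme), ("dual", beta_dual)]:
    print(name, round(mse(beta_true, b), 4), round(mape(beta_true, b), 2))
```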
Publication Date: Fri Oct 19 2018
Journal Name: Journal Of Economics And Administrative Sciences
Big Data Approach to Enhance Organizational Ambidexterity: An Exploratory Study of a Sample of Managers at Asia Cell Mobile Telecommunication Company in Iraq

The research aimed at measuring the compatibility of Big Data with the organizational ambidexterity dimensions of the Asia Cell mobile telecommunications company in Iraq, in order to determine the possibility of adopting the Big Data triple as an approach to achieving organizational ambidexterity.

The study adopted the descriptive analytical approach to collect and analyze the data gathered by a questionnaire tool developed on the Likert scale. After a comprehensive review of the literature related to the two basic study dimensions, the data were subjected to many statistical treatments in accordance with res…

Publication Date: Sat May 30 2020
Journal Name: Neuroquantology Journal
The Effect of Re-Use of Lossy JPEG Compression Algorithm on the Quality of Satellite Image

In this study, an analysis of re-using the lossy JPEG algorithm on the quality of satellite imagery is presented. The standard JPEG compression algorithm is adopted and applied using the IrfanView program, with JPEG quality factors in the range 50-100. Based on the calculated variation in satellite image quality, the maximum number of re-uses of the lossy JPEG algorithm adopted in this study is 50. The image quality degradation with respect to the JPEG quality factor and the number of re-uses of the JPEG algorithm to store the satellite image is analyzed.

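A minimal sketch of the re-use experiment described above, assuming Pillow as a stand-in for IrfanView and a placeholder file name: the same image is re-encoded repeatedly at a fixed JPEG quality and the PSNR against the original is reported.

```python
import io
import numpy as np
from PIL import Image

def psnr(ref, test):
    """Peak signal-to-noise ratio in dB for 8-bit images."""
    err = np.mean((ref.astype(float) - test.astype(float)) ** 2)
    return float("inf") if err == 0 else 10 * np.log10(255.0 ** 2 / err)

def recompress(img, quality):
    """Encode the image as JPEG at the given quality and decode it again."""
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    return Image.open(buf).convert(img.mode)

# usage: re-use the JPEG codec 50 times at quality 75 and watch the PSNR drift
original = Image.open("satellite.png").convert("L")   # placeholder file name
ref = np.asarray(original)
img = original
for i in range(1, 51):
    img = recompress(img, quality=75)
    if i in (1, 10, 25, 50):
        print(i, round(psnr(ref, np.asarray(img)), 2), "dB")
```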
Publication Date: Wed Sep 17 2025
Journal Name: Journal Of Engineering
Image Compression Using 3-D Two-Level Techniques

In this paper, three techniques for image compression are implemented. The proposed techniques are a three-dimensional (3-D) two-level discrete wavelet transform (DWT), a 3-D two-level discrete multi-wavelet transform (DMWT), and a 3-D two-level hybrid (wavelet-multiwavelet) transform. Daubechies and Haar wavelets are used in the discrete wavelet transform, and critically sampled preprocessing is used in the discrete multi-wavelet transform. The aim is to increase the compression ratio (CR) as the level of the 3-D transformation increases, so the compression ratio is measured for each level. To obtain good compression, image data properties were measured, such as image entropy (He) and percent root-…

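A rough sketch of the transform mechanics named above, using PyWavelets for a two-level 3-D DWT on a small volume; the coefficient thresholding and the crude compression-ratio count are illustrative assumptions rather than the paper's measurement procedure, and the entropy helper mirrors the He property mentioned in the abstract.

```python
import numpy as np
import pywt

def entropy(volume):
    """Shannon entropy (bits/voxel) of an 8-bit volume, as a compressibility hint."""
    hist = np.bincount(volume.ravel(), minlength=256).astype(float)
    p = hist[hist > 0] / hist.sum()
    return -np.sum(p * np.log2(p))

def dwt3d_compress(volume, wavelet="haar", level=2, keep=0.05):
    """Two-level 3-D DWT; keep only the largest `keep` fraction of coefficients."""
    coeffs = pywt.wavedecn(volume.astype(float), wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    thr = np.quantile(np.abs(arr), 1.0 - keep)            # magnitude threshold
    arr_t = np.where(np.abs(arr) >= thr, arr, 0.0)
    cr = arr.size / max(np.count_nonzero(arr_t), 1)       # crude compression ratio
    rec = pywt.waverecn(
        pywt.array_to_coeffs(arr_t, slices, output_format="wavedecn"), wavelet)
    return rec, cr

# usage on a synthetic 3-D block of image frames
volume = np.random.randint(0, 256, (16, 64, 64), dtype=np.uint8)
rec, cr = dwt3d_compress(volume, wavelet="db2", level=2, keep=0.05)
print("entropy:", round(entropy(volume), 2), "bits/voxel, CR ~", round(cr, 1))
```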
Publication Date: Thu Feb 07 2019
Journal Name: Journal Of The College Of Education For Women
EFFICIENCY SPIHT IN COMPRESSION AND QUALITY OF IMAGE

Image compression is an important tool to reduce the bandwidth and storage requirements of practical image systems. To meet the increasing demand for storage space and transmission time, compression techniques are the need of the day. A discrete wavelet transform based image codec using Set Partitioning In Hierarchical Trees (SPIHT) is implemented in this paper. Mean Square Error (MSE), Peak Signal to Noise Ratio (PSNR) and Maximum Difference (MD) are used to measure the picture quality of the reconstructed image; MSE and PSNR are the most common picture quality measures. Different kinds of test images are assessed in this work with different compression ratios. The results show the high efficiency of the SPIHT algorithm…

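The three picture-quality measures named above (MSE, PSNR and MD) can be computed directly as below; the random test image and its noisy "reconstruction" are placeholders for an image and its SPIHT-decoded counterpart.

```python
import numpy as np

def quality_measures(original, reconstructed):
    """MSE, PSNR (dB) and Maximum Difference between two 8-bit grayscale images."""
    x = original.astype(float)
    y = reconstructed.astype(float)
    mse = np.mean((x - y) ** 2)
    psnr = float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)
    md = np.max(np.abs(x - y))
    return mse, psnr, md

# usage with a noisy stand-in for a reconstructed image
img = np.random.randint(0, 256, (128, 128)).astype(np.uint8)
rec = np.clip(img + np.random.randint(-5, 6, img.shape), 0, 255).astype(np.uint8)
print(quality_measures(img, rec))
```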
Publication Date: Sun Feb 24 2019
Journal Name: Iraqi Journal Of Physics
Adaptive inter frame compression using image segmented technique

The computer vision branch of the artificial intelligence field is concerned with developing algorithms for analyzing video image content. Extracting edge information is the essential process in most pictorial pattern recognition problems. A new edge detection technique for detecting boundaries is introduced in this research.

Selection of typical lossy techniques for encoding edge video images is also discussed in this research. The focus is on the Block Truncation coding technique and the Discrete Cosine Transform (DCT) coding technique. In order to reduce the volume of pictorial data which one may need to store or transmit, …

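A minimal sketch of classic Block Truncation Coding as referenced above: each block is reduced to its mean, standard deviation and a one-bit map, and reconstructed with two levels that preserve the block mean and variance. The 4x4 block size and the random test data are illustrative, not taken from the paper.

```python
import numpy as np

def btc_encode_block(block):
    """Classic Block Truncation Coding of one block: keep the mean, the standard
    deviation and a 1-bit map of which pixels lie at or above the mean."""
    m, s = block.mean(), block.std()
    bitmap = block >= m
    return m, s, bitmap

def btc_decode_block(m, s, bitmap):
    n = bitmap.size
    q = int(bitmap.sum())               # pixels reconstructed at the high level
    p = n - q
    if q == 0 or p == 0:                # flat block: everything is the mean
        return np.full(bitmap.shape, m)
    low = m - s * np.sqrt(q / p)
    high = m + s * np.sqrt(p / q)
    return np.where(bitmap, high, low)

# usage on one 4x4 block of a synthetic edge image
block = np.random.randint(0, 256, (4, 4)).astype(float)
rec = btc_decode_block(*btc_encode_block(block))
print(np.round(rec).astype(int))
```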
Publication Date: Sat Jan 01 2011
Journal Name: Trends In Network And Communications
Header Compression Scheme over Hybrid Satellite-WiMAX Network

Publication Date: Sun Jul 09 2023
Journal Name: Journal Of Engineering
Compression of an ECG Signal Using Mixed Transforms

Electrocardiogram (ECG) is an important physiological signal for cardiac disease diagnosis. Modern electrocardiogram monitoring devices are increasingly used and generate vast amounts of data requiring huge storage capacity. In order to decrease storage costs or make ECG signals suitable and ready for transmission through common communication channels, the ECG data volume must be reduced, so an effective data compression method is required. This paper presents an efficient technique for the compression of ECG signals, in which different transforms are used to compress the ECG signals. First, the 1-D ECG data was segmented and aligned into a 2-D data array; then a 2-D mixed transform was implemented to compress the…

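A small sketch of the 1-D to 2-D alignment step described above, with a single 2-D DCT (via SciPy) and coefficient thresholding standing in for the paper's mixed transform, which is not specified here; the synthetic signal, row length and keep fraction are assumptions.

```python
import numpy as np
from scipy.fft import dctn, idctn

def ecg_to_2d(signal, row_len):
    """Segment a 1-D ECG signal into fixed-length rows of a 2-D array."""
    rows = len(signal) // row_len
    return np.asarray(signal[:rows * row_len], float).reshape(rows, row_len)

def compress_2d(data, keep=0.10):
    """2-D DCT, keep the largest `keep` fraction of coefficients, then invert."""
    coeffs = dctn(data, norm="ortho")
    thr = np.quantile(np.abs(coeffs), 1.0 - keep)
    mask = np.abs(coeffs) >= thr
    rec = idctn(np.where(mask, coeffs, 0.0), norm="ortho")
    cr = coeffs.size / max(np.count_nonzero(mask), 1)
    return rec, cr

# usage on a synthetic ECG-like signal
t = np.linspace(0, 10, 5000)
ecg = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.randn(t.size)
rec, cr = compress_2d(ecg_to_2d(ecg, row_len=250), keep=0.10)
print("CR ~", round(cr, 1))
```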
Publication Date: Wed Mar 10 2021
Journal Name: Baghdad Science Journal
Improving Fractal Image Compression Scheme through Quantization Operation

We explore the transform coefficients of fractal coding and exploit a new method to improve the compression capabilities of these schemes. In most standard encoder/decoder systems, quantization and de-quantization are managed as a separate step; here we introduce a new method in which they are managed simultaneously with the coding step. Additional compression is achieved by this method with high image quality, as shown later.

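The abstract's central terms, quantization and de-quantization of transform coefficients, are shown below as a minimal uniform quantizer pair; the paper's scheme of merging this with the coding step is not reproduced, and the step sizes and random coefficients are illustrative.

```python
import numpy as np

def quantize(coeffs, step):
    """Uniform quantization of transform coefficients to integer indices."""
    return np.round(np.asarray(coeffs, float) / step).astype(np.int32)

def dequantize(indices, step):
    """Map quantization indices back to reconstructed coefficient values."""
    return indices.astype(float) * step

# usage: coarser steps give more compression (fewer distinct indices) but more error
coeffs = np.random.randn(8, 8) * 20.0      # stand-in for fractal transform coefficients
for step in (1.0, 4.0, 16.0):
    idx = quantize(coeffs, step)
    err = np.mean((coeffs - dequantize(idx, step)) ** 2)
    print(step, len(np.unique(idx)), round(err, 3))
```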
Publication Date: Mon Jun 05 2023
Journal Name: Journal Of Engineering
Image Compression Using 3-D Two-Level Technique

In this paper, three techniques for image compression are implemented. The proposed techniques are a three-dimensional (3-D) two-level discrete wavelet transform (DWT), a 3-D two-level discrete multi-wavelet transform (DMWT), and a 3-D two-level hybrid (wavelet-multiwavelet) transform. Daubechies and Haar wavelets are used in the discrete wavelet transform, and critically sampled preprocessing is used in the discrete multi-wavelet transform. The aim is to increase the compression ratio (CR) as the level of the 3-D transformation increases, so the compression ratio is measured for each level. To obtain good compression, image data properties were measured, such as image entropy (He) and percent root-…
