Iris Data Compression Based on Hexa-Data Coding

Iris research focuses on developing techniques for identifying and locating relevant biometric features, with accurate segmentation and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system in real-time use. This paper introduces a novel parameterized technique for iris segmentation. The method proceeds in several steps: converting the grayscale eye image to a bit-plane representation, selecting the most significant bit planes, and parameterizing the iris location, resulting in an accurate segmentation of the iris from the original image. A lossless Hexadata encoding method is then applied to the data, based on reducing each set of six data items to a single encoded value. The tested results achieved acceptable byte-saving performance for the 21 square iris images of size 256×256 pixels, about 22.4 KB on average with an average decompression time of 0.79 sec, and high byte-saving performance for 2 non-square iris images of sizes 640×480 and 2048×1536, which reached 76 KB/2.2 sec and 1630 KB/4.71 sec respectively. Finally, the proposed technique was compared with the standard lossless JPEG2000 compression technique, showing a reduction of about 1.2 KB or more in saved size, implicitly demonstrating the power and efficiency of the suggested lossless biometric technique.
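The bit-plane and six-item grouping steps described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the packing of each group of six 8-bit values into one 48-bit integer is an assumed reading of the "Hexadata" idea.

```python
import numpy as np

def bit_planes(gray):
    """Split an 8-bit grayscale image into its 8 bit planes (index 7 = MSB)."""
    return [(gray >> b) & 1 for b in range(8)]

def hexa_encode(values):
    """Losslessly pack groups of six 8-bit values into single 48-bit integers.
    Assumed reading of the paper's 'Hexadata' grouping; zero-padded at the end."""
    vals = list(values)
    while len(vals) % 6:
        vals.append(0)
    words = []
    for i in range(0, len(vals), 6):
        word = 0
        for v in vals[i:i + 6]:
            word = (word << 8) | int(v)
        words.append(word)
    return words

def hexa_decode(words, n):
    """Unpack 48-bit integers back into the original n values (lossless)."""
    vals = []
    for w in words:
        vals.extend((w >> (8 * k)) & 0xFF for k in range(5, -1, -1))
    return vals[:n]

img = np.array([[200, 100], [50, 25]], dtype=np.uint8)
planes = bit_planes(img)                        # planes[7] is the most significant plane
flat = img.flatten().tolist()
assert hexa_decode(hexa_encode(flat), len(flat)) == flat   # round trip is lossless
```

The round-trip assertion at the end is what makes the grouping lossless in the sense the abstract claims: no information is discarded, only repacked.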

Publication Date
Tue Oct 23 2018
Journal Name
Journal Of Economics And Administrative Sciences
Processing of missing values in survey data using Principal Component Analysis and probabilistic Principal Component Analysis methods

The idea of carrying out research on incomplete data arose from the circumstances of our dear country and the horrors of war, which resulted in the loss of much important data in all aspects of economic, natural, health, and scientific life. The reasons for the missingness vary: some are beyond the control of those concerned, while others are deliberate, planned because of cost, risk, or a lack of means of inspection. The missing data in this study were processed using Principal Component Analysis and self-organizing map methods using simulation. The variables of child health and variables affecting children's health were taken into account: breastfeed
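PCA-based imputation of the kind this abstract describes can be sketched as follows: missing entries are initialized with column means, then refined by repeatedly reconstructing the matrix from its leading principal components. This is a generic illustration of the technique, not the authors' procedure or data.

```python
import numpy as np

def pca_impute(X, n_components=1, n_iter=50):
    """Iteratively fill NaNs by low-rank PCA reconstruction.
    Generic sketch: start from column means, refine via truncated SVD."""
    X = np.asarray(X, dtype=float)
    mask = np.isnan(X)
    col_means = np.nanmean(X, axis=0)
    X_filled = np.where(mask, col_means, X)
    for _ in range(n_iter):
        mu = X_filled.mean(axis=0)
        U, s, Vt = np.linalg.svd(X_filled - mu, full_matrices=False)
        low_rank = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components] + mu
        X_filled[mask] = low_rank[mask]    # only missing cells are updated
    return X_filled

# toy demo: two perfectly correlated columns with one missing value
X = np.array([[1.0, 2.0], [2.0, 4.0], [3.0, np.nan], [4.0, 8.0]])
X_hat = pca_impute(X, n_components=1)      # the NaN converges toward 6.0
```

Only the masked cells are ever overwritten, so observed values are preserved exactly; the iteration converges quickly here because the toy data are exactly rank one.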

Publication Date
Fri Mar 01 2024
Journal Name
Baghdad Science Journal
A Comparison between Ericson's Formulae Results and Experimental Data Using New Formulae of Single Particle Level Density

The partial level density (PLD) of pre-equilibrium reactions described by Ericson's formula has been studied using different formulae for the single-particle level density. The single-particle level density parameter was taken from the equidistant spacing model (ESM) and the non-equidistant spacing model (non-ESM), and further formulae were derived from the relation between the single-particle level density and the level density parameter. The formulae used are the Roher formula, the Egidy formula, the Yukawa formula, and the Thomas-Fermi formula. The partial level density results based on the Thomas-Fermi formula show good agreement with the experimental data.
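For reference, Ericson's partial level density for p particles and h holes at excitation energy E, with single-particle level density g, is conventionally written as follows; this is the standard form from the pre-equilibrium literature, not reproduced from the paper itself:

```latex
\rho(p, h, E) = \frac{g\,(gE)^{n-1}}{p!\,h!\,(n-1)!}, \qquad n = p + h
```

The formulae compared in the abstract differ in how the single-particle level density g entering this expression is estimated.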

Publication Date
Tue Jan 01 2019
Journal Name
IEEE Access
Implementation of Univariate Paradigm for Streamflow Simulation Using Hybrid Data-Driven Model: Case Study in Tropical Region

Publication Date
Wed Apr 19 2017
Journal Name
Iraqi Dental Journal
Matching the Iris Color of Ocular Prosthesis Using an Eye Contact Lens: New Technique

Publication Date
Fri Mar 25 2022
Journal Name
Journal Of The College Of Basic Education
Semantic Image Coding in Contemporary Theatrical Performance

The fields of the image and its kinetic signs constitute a semiotic presence for sign-based communication and an extension of the dialectical bond between signifiers and their signifieds, which the directorial vision employs to produce concealed significations whose transitional essence moves through ideas as the givens of the performance. Visual coding seeks to project a duality of meaning within the multiple fields of theatrical performance, and to understand the meaning arising from these visual codes, the need emerged to investigate how these codes are formed and how they

Publication Date
Wed Mar 28 2018
Journal Name
Iraqi Journal Of Science
Linear Polynomial Coding with Midtread Adaptive Quantizer

In this paper, a hybrid image compression technique is introduced that integrates the discrete wavelet transform (DWT) and linear polynomial coding. In addition, the proposed technique improves the midtread quantizer scheme by utilizing block-based quantization and a selected factor value. The compression system's performance showed superior quality and compression ratio compared to traditional polynomial coding techniques.
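A midtread quantizer of the kind mentioned above maps each coefficient to the nearest multiple of a step size, so zero is itself a reconstruction level. A minimal sketch follows; the step size and the flat (non-block-based) handling here are illustrative, not the paper's scheme:

```python
import numpy as np

def midtread_quantize(coeffs, step):
    """Midtread (rounding) quantizer: zero is a valid output level."""
    return np.round(coeffs / step).astype(int)

def midtread_dequantize(indices, step):
    """Reconstruct coefficients as integer multiples of the step size."""
    return indices * step

c = np.array([-1.3, -0.2, 0.2, 0.9, 2.6])
q = midtread_quantize(c, step=1.0)          # small values collapse to 0
r = midtread_dequantize(q, step=1.0)        # reconstruction error is at most step/2
```

Because small coefficients quantize to exactly zero, the midtread form produces the long zero runs that entropy coders such as polynomial coding exploit.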

Publication Date
Sat Oct 30 2021
Journal Name
Iraqi Journal Of Science
Small Binary Codebook Design for Image Compression Depending on Rotating Blocks

The searching process using a binary codebook that combines the Block Truncation Coding (BTC) method and Vector Quantization (VQ), i.e. a full codebook search for each input image vector to find the best-matched code word in the codebook, requires a long time. Therefore, in this paper, after designing a small binary codebook, we adopted a new method of rotating each binary code word in this codebook from 90° to 270° in steps of 90°. Then, we classified each code word depending on its angle into four types of binary codebooks (i.e. Pour, Flat, Vertical, or Zigzag). The proposed scheme was used to decrease the time of the coding procedure, with very small distortion per block, by designing s
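The rotation step can be sketched with NumPy: each binary code block is rotated by 90°, 180° and 270°, so one stored code word stands in for four orientations during the search. This is an illustrative reading of the scheme, not the authors' code; the Hamming-distance full search is the baseline the paper aims to speed up.

```python
import numpy as np

def expand_codebook(codebook):
    """For each binary code block, generate its 0°, 90°, 180° and 270° rotations.
    Returns (codeword_index, rotation_degrees, rotated_block) entries."""
    expanded = []
    for idx, block in enumerate(codebook):
        for k in range(4):                      # k quarter-turns counterclockwise
            expanded.append((idx, k * 90, np.rot90(block, k)))
    return expanded

def best_match(vector, expanded):
    """Full search: entry with minimum Hamming distance to `vector`."""
    return min(expanded, key=lambda e: int(np.sum(e[2] != vector)))

codebook = [np.array([[1, 1], [0, 0]])]         # one toy 2x2 binary code word
entries = expand_codebook(codebook)
idx, angle, blk = best_match(np.array([[1, 0], [1, 0]]), entries)
```

Storing only one orientation per code word keeps the codebook small while the search still covers all four rotated variants.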

Publication Date
Tue Aug 31 2021
Journal Name
Iraqi Journal Of Science
Effects of Using Static Methods with Contourlet Transformation on Speech Compression

Compression of the speech signal is an essential field in signal processing. Speech compression is very important in today's world due to limited transmission bandwidth and storage capacity. This paper explores a Contourlet-transform-based methodology for compressing the speech signal. In this methodology, the speech signal is analysed using Contourlet transform coefficients with statistical methods as threshold values, such as the interquartile range (IQR), average absolute deviation (AAD), median absolute deviation (MAD) and standard deviation (STD), followed by the application of run-length encoding. These are applied to speech recorded over different durations (5, 30, and 120 seconds). A comparative study of performance
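The threshold-then-encode step can be sketched as follows: coefficients whose magnitude falls below a statistic-derived threshold (here MAD, one of the statistics listed) are zeroed, and the resulting runs are run-length encoded. An illustrative sketch, not the paper's implementation; the Contourlet transform itself is omitted.

```python
import numpy as np

def mad_threshold(coeffs):
    """Median absolute deviation of the coefficients, used as a zeroing threshold."""
    return np.median(np.abs(coeffs - np.median(coeffs)))

def run_length_encode(values):
    """Encode a sequence as [value, run_length] pairs."""
    out = []
    for v in values:
        if out and out[-1][0] == v:
            out[-1][1] += 1
        else:
            out.append([v, 1])
    return out

coeffs = np.array([0.1, -0.05, 2.0, 0.02, 0.03, -1.5, 0.0, 0.0])
t = mad_threshold(coeffs)                         # statistic-derived threshold
sparse = np.where(np.abs(coeffs) > t, coeffs, 0.0)
rle = run_length_encode(sparse.tolist())          # zero runs collapse to single pairs
```

The more coefficients the chosen statistic zeroes out, the longer the zero runs and the better the run-length encoder compresses, at the cost of reconstruction fidelity.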

Publication Date
Wed Jan 01 2020
Journal Name
Periodicals Of Engineering And Natural Sciences
Analyzing big data sets by using different penalized regression methods with application: Surveys of multidimensional poverty in Iraq

The poverty phenomenon is a very substantial topic that shapes the future of societies and governments and the way they deal with education, health, and the economy. Sometimes poverty takes multidimensional forms through education and health. This research aims at studying multidimensional poverty in Iraq using penalized regression methods to analyze big data sets from demographic surveys collected by the Central Statistical Organization in Iraq. We chose a classical penalized regression method, represented by Ridge Regression; moreover, we chose another penalized method, the Smooth Integration of Counting and Absolute Deviation (SICA), to analyze big data sets related to the different forms of poverty in Iraq. Euclidian Distanc
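Ridge regression, the classical penalized method named above, adds an L2 penalty to least squares, which stabilizes estimates when survey covariates are highly correlated. Its closed form can be sketched in a few lines; this is a generic illustration with toy data, not the paper's analysis:

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Closed-form ridge estimate: beta = (X'X + lam*I)^-1 X'y."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

# toy data: two perfectly collinear predictors, where ordinary least
# squares has no unique solution but the ridge penalty yields one
X = np.array([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
y = np.array([2.0, 4.0, 6.0])
beta = ridge_fit(X, y, lam=0.1)   # the effect is shared equally between the columns
```

With collinear columns, the penalty splits the coefficient symmetrically between them instead of failing, which is exactly why penalized methods suit wide, correlated survey data.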
