Iris Data Compression Based on Hexa-Data Coding

Iris research focuses on developing techniques for identifying and locating relevant biometric features, achieving accurate segmentation and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system when it is used as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method consists of a number of steps, starting with converting the grayscale eye image to a bit-plane representation and selecting the most significant bit planes, followed by a parameterization of the iris location, resulting in an accurate segmentation of the iris from the original image. A lossless Hexadata encoding method is then applied to the data, based on reducing each set of six data items to a single encoded value. The tested results achieved acceptable byte-saving performance for the 21 square iris images of size 256x256 pixels, about 22.4 KB on average with an average decompression time of 0.79 sec, and high byte-saving performance for the 2 non-square iris images of sizes 640x480 and 2048x1536, which reached 76 KB / 2.2 sec and 1630 KB / 4.71 sec respectively. Finally, the proposed technique was compared with the standard lossless JPEG2000 compression technique and saved about 1.2 KB or more, implicitly demonstrating the power and efficiency of the suggested lossless biometric technique.
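The abstract names two concrete steps: bit-plane decomposition of the grayscale eye image and a lossless "Hexadata" code that maps every six data items to one encoded value. The sketch below is only an illustration of that idea, assuming the six items are bits taken from a selected bit plane and packed into a single 0-63 code; the authors' actual encoder may differ.

```python
import numpy as np

def bit_planes(gray):
    """Split an 8-bit grayscale image into its 8 bit planes (MSB first)."""
    return [(gray >> b) & 1 for b in range(7, -1, -1)]

def hexa_encode(bits):
    """Group a flat 0/1 array into sixes and pack each group into one value (0-63)."""
    bits = np.asarray(bits, dtype=np.uint8).ravel()
    pad = (-len(bits)) % 6                       # pad so the length is a multiple of 6
    bits = np.concatenate([bits, np.zeros(pad, dtype=np.uint8)])
    groups = bits.reshape(-1, 6)
    weights = 1 << np.arange(5, -1, -1)          # 32, 16, 8, 4, 2, 1
    return (groups * weights).sum(axis=1).astype(np.uint8), pad

def hexa_decode(codes, pad):
    """Invert hexa_encode: expand each code back into its six bits."""
    bits = ((np.asarray(codes, dtype=np.uint8)[:, None] >> np.arange(5, -1, -1)) & 1).ravel()
    return bits[:len(bits) - pad] if pad else bits

# Example: encode the most significant bit plane of a hypothetical 256x256 eye image
img = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
msb = bit_planes(img)[0]
codes, pad = hexa_encode(msb)
assert np.array_equal(hexa_decode(codes, pad), msb.ravel())   # lossless round trip
```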

Publication Date
Thu Nov 30 2023
Journal Name
Iraqi Geological Journal
Multiple and Coherent Noise Removal from X-Profile 2D Seismic Data of Southern Iraq Using Normal Move Out-Frequency Wavenumber Technique

Multiple elimination (de-multiple) is one of the seismic processing steps used to remove the effect of multiples and delineate the correct primary reflections. Applying normal move-out to flatten the primaries is the way to eliminate multiples by transforming the data to the frequency-wavenumber domain. The flattened primaries are aligned with the zero axis of the frequency-wavenumber domain, while any other reflection types (multiples and random noise) are distributed elsewhere. A dip filter is applied to pass the aligned data and reject the others, separating primaries from multiples after the data are transformed back from the frequency-wavenumber domain to the time-distance domain. For that reason, the suggested name for this technique is the normal move-out frequency-wavenumber (NMO-FK) technique.
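As a rough, assumed illustration of the workflow described above (flatten the primaries with NMO, transform to the frequency-wavenumber domain, pass the energy aligned with zero wavenumber, and transform back), here is a toy f-k dip filter; the mask width and the synthetic gather are placeholders, not the processing parameters used in the paper.

```python
import numpy as np

def fk_dip_filter(gather, keep_k=2):
    """Keep only near-flat events in a (time x offset) gather via the f-k domain.

    After NMO correction the primaries are flat, so their energy maps onto the
    near-zero-wavenumber columns of the 2-D Fourier transform; events with
    residual moveout (multiples, noise) map to other wavenumbers and are muted.
    """
    fk = np.fft.fft2(gather)                       # time -> frequency, offset -> wavenumber
    k = np.fft.fftfreq(gather.shape[1])            # normalized wavenumber per trace
    mask = np.zeros_like(fk)
    keep = np.argsort(np.abs(k))[:keep_k * 2 + 1]  # indices of the smallest |k|
    mask[:, keep] = 1.0
    return np.real(np.fft.ifft2(fk * mask))

# Toy gather: a flat (NMO-corrected primary) event plus a dipping (multiple) event
nt, nx = 256, 64
gather = np.zeros((nt, nx))
gather[100, :] += 1.0                              # flat primary
for ix in range(nx):
    gather[120 + ix // 2, ix] += 1.0               # dipping residual multiple
primaries = fk_dip_filter(gather)
```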

Publication Date
Fri Jan 01 2021
Journal Name
AIP Conference Proceedings
Integration between hydrochemical and physical data with geographic information systems (GIS) for selecting the best locations groundwater wells in Baghdad city

Publication Date
Thu Sep 01 2011
Journal Name
Journal of Economics and Administrative Sciences
Analysis of the indicators of the educational process and scientific level using the analysis of variance of ordered data in repeated measurements

In this research we analyse some indicators, and their classifications, related to the teaching process and the scientific level of graduate studies at the university, using the analysis of variance for ranked data with repeated measurements instead of the ordinary analysis of variance. We reach many conclusions about the important classifications for each indicator that affect the teaching process.
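The rank-based analysis of variance for repeated measurements that the abstract refers to is commonly carried out with the Friedman test; the snippet below is a generic, assumed illustration with synthetic indicator scores, not the study's own data or exact procedure.

```python
import numpy as np
from scipy.stats import friedmanchisquare

# Hypothetical ratings of one teaching-quality indicator, measured repeatedly
# for the same group of programmes at three points in time (synthetic data).
rng = np.random.default_rng(0)
baseline = rng.integers(1, 6, size=20)
mid_term = np.clip(baseline + rng.integers(-1, 2, size=20), 1, 5)
end_term = np.clip(baseline + rng.integers(0, 3, size=20), 1, 5)

# Friedman's test: rank-based ANOVA for repeated measurements.
stat, p_value = friedmanchisquare(baseline, mid_term, end_term)
print(f"Friedman chi-square = {stat:.2f}, p = {p_value:.4f}")
```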

Publication Date
Wed Mar 30 2022
Journal Name
Journal of Economics and Administrative Sciences
Using Quadratic Form Ratio Multiple Test to Estimate Linear Regression Model Parameters in Big Data with Application: Child Labor in Iraq

The current paper proposes a new estimator for the linear regression model parameters under Big Data circumstances. The diversity of Big Data variables brings many challenges that are of interest to researchers who try their best to find new and novel methods to estimate the parameters of the linear regression model. The data were collected by the Central Statistical Organization of Iraq, and child labor in Iraq was chosen as the application. Child labor is a vital phenomenon from which both society and education suffer, and it affects the future of the next generation. Two methods have been selected to estimate the parameters ...

Publication Date
Fri Sep 17 2021
Journal Name
Journal of Petroleum Exploration and Production Technology
Characterization of flow units, rock and pore types for Mishrif Reservoir in West Qurna oilfield, Southern Iraq by using lithofacies data
This study was accomplished by testing three different models to determine rock types, pore throat radius, and flow units for the Mishrif Formation in the West Qurna oilfield in Southern Iraq, based on Mishrif full-diameter cores from 20 wells. The three models used in this study were the Lucia rock type classification, the Winland plot, which was utilized to determine the pore throat radius from the mercury injection test (r35), and the flow zone indicator (FZI) concept to identify flow units, which enabled us to recognize the differences between Mishrif units in these three categories. The study of pore characteristics is very significant in reservoir evaluation, as it controls the storage mechanism and reservoir fluid properties ...
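The flow zone indicator and Winland r35 relations mentioned in the abstract have widely used standard forms, reproduced below as a hedged sketch; the coefficients are the commonly published ones and may not be exactly those applied in the study.

```python
import numpy as np

def flow_zone_indicator(phi, k_md):
    """FZI (microns) from fractional porosity and permeability in mD (standard form)."""
    rqi = 0.0314 * np.sqrt(k_md / phi)   # reservoir quality index
    phi_z = phi / (1.0 - phi)            # normalized pore-to-grain volume ratio
    return rqi / phi_z

def winland_r35(phi_percent, k_md):
    """Winland pore-throat radius r35 (microns) from porosity (%) and permeability (mD)."""
    return 10 ** (0.732 + 0.588 * np.log10(k_md) - 0.864 * np.log10(phi_percent))

# Hypothetical core plugs
phi = np.array([0.12, 0.18, 0.24])       # fractional porosity
k = np.array([0.5, 15.0, 120.0])         # permeability, mD
print(flow_zone_indicator(phi, k))
print(winland_r35(phi * 100, k))
```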
Publication Date
Wed Feb 01 2017
Journal Name
Journal of Economics and Administrative Sciences
A comparison between the logistic regression model and Linear Discriminant analysis using Principal Component unemployment data for the province of Baghdad

The objective of the study is to determine which has the better predictive ability, the logistic regression model or the linear discriminant function, using the original data first and then the principal components to reduce the dimensionality of the variables. The data come from the socio-economic survey of families in the province of Baghdad in 2012 and comprise a sample of 615 observations with 13 variables, 12 of which are explanatory variables, while the dependent variable is the number of workers and unemployed.

A comparison between the two methods above was conducted, and the comparison made clear that the logistic regression model is better than the linear discriminant function ...
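A minimal sketch of the comparison described above, assuming scikit-learn, a synthetic stand-in for the 2012 Baghdad survey data (615 observations, 12 explanatory variables), and classification accuracy as the criterion; the study's actual variables and evaluation measure may differ.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Stand-in data: 615 observations, 12 explanatory variables, binary outcome
X, y = make_classification(n_samples=615, n_features=12, n_informative=6, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "linear discriminant analysis": LinearDiscriminantAnalysis(),
    "PCA + logistic regression": make_pipeline(PCA(n_components=5), LogisticRegression(max_iter=1000)),
    "PCA + linear discriminant analysis": make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis()),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: accuracy = {model.score(X_test, y_test):.3f}")
```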

Publication Date
Fri Jun 30 2023
Journal Name
Iraqi Geological Journal
Integrated Core and Log Data to Determine the Reservoir Flow Unit and Rock Facies for Mishrif Formation in South Eastern Iraq

This work studies the rock facies and flow unit classification for the Mishrif carbonate reservoir in the Buzurgan oilfield, located in south-eastern Iraq, using wireline logs, core samples, and petrophysical data (log porosity and core permeability). Hydraulic flow units were identified using the flow zone indicator approach and assessed within each rock type to reach a better understanding of the controlling role of pore types and geometry in reservoir quality variations. Additionally, the distribution of sedimentary facies and rock fabric number, along with porosity and permeability, was analyzed in three wells (BU-1, BU-2, and BU-3). The Interactive Petrophysics (IP) software is used to assess the rock fabric number, flow zone ...

Publication Date
Fri Aug 31 2012
Journal Name
Al-Khwarizmi Engineering Journal
Sub–Nyquist Frequency Efficient Audio Compression

This paper presents the application of a framework for fast and efficient compressive sampling based on the concept of random sampling of a sparse audio signal. It provides four important features: (i) it is universal across a variety of sparse signals; (ii) the number of measurements required for exact reconstruction is nearly optimal, much less than the sampling frequency and below the Nyquist rate; (iii) it has very low complexity and fast computation; (iv) it is built on a provable mathematical model from which we are able to quantify the trade-offs among streaming capability, computation/memory requirements, and quality of reconstruction of the audio signal. Compressed sensing (CS) is an attractive compression scheme due to its universality ...
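A toy sketch of the random-sampling idea above, assuming a signal that is sparse in the DCT basis and using orthogonal matching pursuit (scikit-learn) for recovery; this is a generic compressed-sensing illustration, not the paper's own reconstruction algorithm.

```python
import numpy as np
from scipy.fft import idct
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
n, m = 512, 128                                    # signal length, number of random measurements

# Synthetic audio-like signal that is exactly sparse in the DCT domain
c_true = np.zeros(n)
c_true[[5, 23, 77, 130]] = [2.0, -1.5, 1.0, 0.7]   # a few active DCT coefficients
x = idct(c_true, norm="ortho")

# Sub-Nyquist acquisition: keep m randomly chosen samples of x
keep = np.sort(rng.choice(n, size=m, replace=False))
y = x[keep]

# Sensing matrix: rows of the inverse-DCT basis at the kept positions, so y = A @ c
A = idct(np.eye(n), axis=0, norm="ortho")[keep, :]

# Sparse recovery with orthogonal matching pursuit
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=8, fit_intercept=False).fit(A, y)
x_rec = idct(omp.coef_, norm="ortho")

print("relative reconstruction error:", np.linalg.norm(x - x_rec) / np.linalg.norm(x))
```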

Publication Date
Sat Dec 01 2012
Journal Name
Iraqi Journal of Physics
Wavelet compression for remotely sensed images

Image compression is very important in reducing the costs of data storage and transmission over relatively slow channels. The wavelet transform has received significant attention because its multiresolution decomposition allows efficient image analysis. This paper attempts to give an understanding of the wavelet transform using two of the more popular wavelet transforms, the Haar and Daubechies techniques, and compares their effects on image compression.
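A short sketch of the Haar versus Daubechies comparison, assuming the PyWavelets package and a synthetic image band; thresholding the coefficients at a fixed keep fraction is only one simple way to realize the compression the abstract discusses.

```python
import numpy as np
import pywt

def wavelet_compress(image, wavelet="haar", level=3, keep=0.05):
    """Decompose, zero all but the largest `keep` fraction of coefficients, reconstruct."""
    coeffs = pywt.wavedec2(image, wavelet=wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    cutoff = np.quantile(np.abs(arr), 1.0 - keep)          # magnitude threshold
    coeffs_kept = pywt.array_to_coeffs(np.where(np.abs(arr) >= cutoff, arr, 0.0),
                                       slices, output_format="wavedec2")
    return pywt.waverec2(coeffs_kept, wavelet=wavelet)

# Compare Haar and Daubechies (db4) on a smooth synthetic "remotely sensed" band
y, x = np.mgrid[0:256, 0:256]
img = np.sin(x / 20.0) * np.cos(y / 30.0) + 0.05 * np.random.default_rng(0).random((256, 256))
for name in ("haar", "db4"):
    rec = wavelet_compress(img, wavelet=name)[:256, :256]
    print(f"{name}: MSE with 5% of coefficients kept = {np.mean((img - rec) ** 2):.6f}")
```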

Publication Date
Fri Apr 01 2016
Journal Name
IOSR Journal of Computer Engineering
Lossless and Lossy Polynomial Image Compression
