Iris Data Compression Based on Hexa-Data Coding

Iris research is focused on developing techniques for identifying and locating relevant biometric features with accurate segmentation and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system when used as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method proceeds in a number of steps: converting the grayscale eye image to a bit-plane representation, selecting the most significant bit planes, and parameterizing the iris location, resulting in an accurate segmentation of the iris from the original image. A lossless Hexadata encoding method is then applied to the data, based on reducing each set of six data items to a single encoded value. The tests achieved acceptable byte-saving performance for the 21 square iris images of size 256x256 pixels, about 22.4 KB on average with an average decompression time of 0.79 s, and high byte-saving performance for the two non-square iris images of sizes 640x480 and 2048x1536, which reached 76 KB/2.2 s and 1630 KB/4.71 s respectively. Finally, the proposed techniques were compared with the standard lossless JPEG2000 compression technique, outperforming it by about 1.2 or more in KB saved, which implicitly demonstrates the power and efficiency of the suggested lossless biometric techniques.
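The bit-plane step described in the abstract can be illustrated with a short sketch. This is a minimal Python/NumPy illustration of decomposing a grayscale image into bit planes and keeping the most significant one, not the authors' implementation; the Hexadata encoding itself is not specified here in enough detail to reproduce.

```python
import numpy as np

def bit_planes(gray):
    """Split an 8-bit grayscale image into its 8 binary bit planes (LSB first)."""
    return [(gray >> b) & 1 for b in range(8)]

# Tiny 2x2 example image; real iris images would be e.g. 256x256.
img = np.array([[200, 17],
                [129, 64]], dtype=np.uint8)
planes = bit_planes(img)
msb = planes[7]  # most significant plane, the kind retained for segmentation
```

Thresholding on the most significant planes is what makes the subsequent parameterization of the iris location cheap compared with intensity-domain modelling.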

Publication Date: June 1, 2021
Journal Name: International Journal of Nonlinear Analysis and Applications
A proposed method for cleaning data from outlier values using the robust RFCH method in structural equation modeling
Publication Date: January 1, 2020
Journal Name: Periodicals of Engineering and Natural Sciences
Analyzing big data sets by using different penalized regression methods with application: Surveys of multidimensional poverty in Iraq

The poverty phenomenon is a very substantial topic that determines the future of societies and governments and the way they deal with education, health, and the economy. Sometimes poverty takes multidimensional forms through education and health. This research aims to study multidimensional poverty in Iraq using penalized regression methods to analyze big data sets from demographic surveys collected by the Central Statistical Organization in Iraq. We chose a classical penalized regression method represented by ridge regression; moreover, we chose another penalized method, the Smooth Integration of Counting and Absolute Deviation (SICA), to analyze big data sets related to the different forms of poverty in Iraq. Euclidean distanc…
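Of the two penalties mentioned, ridge regression has a simple closed form and can be sketched as below; SICA requires an iterative solver and is omitted. The data here are synthetic and for illustration only, not from the Iraqi surveys.

```python
import numpy as np

def ridge(X, y, lam):
    """Closed-form ridge estimator: (X'X + lam*I)^(-1) X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Synthetic data with a known sparse coefficient vector.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
beta_true = np.array([1.0, 0.0, 2.0, 0.0, -1.0])
y = X @ beta_true + rng.normal(scale=0.1, size=100)

beta_hat = ridge(X, y, lam=1.0)  # shrinks all coefficients toward zero
```

Unlike ridge, SICA-type penalties can set small coefficients exactly to zero, which is why the paper pairs the two for high-dimensional survey data.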

Publication Date: March 31, 2017
Journal Name: Iraqi Journal of Biotechnology
Reliable Reference Gene for Normalization of RT-qPCR Data in Human Cancer Cell Lines Subjected to Gene Knockdown

Quantitative real-time polymerase chain reaction (RT-qPCR) has become a valuable molecular technique in biomedical research. The selection of suitable endogenous reference genes is necessary for normalization of target gene expression in RT-qPCR experiments. The aim of this study was to determine the suitability of each of 18S rRNA and ACTB as internal control genes for normalization of RT-qPCR data in some human cell lines transfected with small interfering RNA (siRNA). Four cancer cell lines, including MCF-7, T47D, MDA-MB-231 and HeLa cells, along with HEK293, representing an embryonic cell line, were depleted of E2F6 using siRNA specific for E2F6, compared to negative control cells transfected with siRNA not specific for any gene. Us…
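Reference-gene normalization of RT-qPCR data is conventionally computed with the 2^-ΔΔCt method; the abstract does not state the exact analysis used, so the sketch below, with made-up Ct values, is illustrative only.

```python
def fold_change(ct_target_kd, ct_ref_kd, ct_target_ctrl, ct_ref_ctrl):
    """2^-ddCt relative quantification: expression in the knockdown sample
    relative to the control, normalized to a reference gene
    (e.g. ACTB or 18S rRNA)."""
    ddct = (ct_target_kd - ct_ref_kd) - (ct_target_ctrl - ct_ref_ctrl)
    return 2.0 ** (-ddct)

# Hypothetical Ct values: the target Ct rises after knockdown, so fold < 1.
fold = fold_change(26.0, 18.0, 24.0, 18.0)
```

The method is only valid if the reference gene's Ct is stable across conditions, which is exactly the suitability question the study asks of 18S rRNA and ACTB.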

Publication Date: March 1, 2013
Journal Name: Journal of Economics and Administrative Sciences
Robust Two-Step Estimation and Approximation Local Polynomial Kernel For Time-Varying Coefficient Model With Balance Longitudinal Data

In this research, a nonparametric technique is presented to estimate the time-varying coefficient functions for balanced longitudinal data, which is characterized by observations obtained from (n) independent subjects, each measured repeatedly at a set of (m) specific time points. Although the measurements are independent among the different subjects, they are mostly correlated within each subject; the applied technique is the local linear kernel (LLPK) technique. To avoid the problems of dimensionality and heavy computation, the two-step method is used to estimate the coefficient functions with the former technique. Since the two-…
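A local linear kernel estimator of the kind named in the abstract can be sketched as a weighted least-squares fit at each target time point. The Gaussian kernel, bandwidth, and test function below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def local_linear(t0, t, y, h):
    """Local linear kernel estimate of m(t0) with a Gaussian kernel, bandwidth h."""
    w = np.exp(-0.5 * ((t - t0) / h) ** 2)           # kernel weights
    X = np.column_stack([np.ones_like(t), t - t0])   # local design matrix
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta[0]  # intercept = fitted value at t0

t = np.linspace(0.0, 1.0, 50)        # the (m) time points
y = np.sin(2 * np.pi * t)            # a smooth time-varying coefficient
est = local_linear(0.5, t, y, h=0.1) # true value is sin(pi) = 0
```

Fitting a local line rather than a local constant removes the first-order boundary bias, which matters when subjects are observed over a bounded time window.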

Publication Date: October 1, 2019
Journal Name: Journal of Engineering
Evaluation of the Performance of Online GPS/GNSS Data Processing Services for Monitoring the Land Deformations and Movements

In recent years, Global Navigation Satellite System (GNSS) technology has frequently been employed for monitoring Earth-crust deformation and movement. Such applications necessitate high positional accuracy, which can be achieved by processing GPS/GNSS data with scientific software such as BERNESE, GAMIT, and GIPSY-OASIS. Nevertheless, these scientific packages are sophisticated and have not been published as free open-source software. Therefore, this study was conducted to evaluate an alternative solution, GNSS online processing services, which offer this capability freely. In this study, eight years of GNSS raw data for the TEHN station, which is located in Iran, were downloaded from the UNAVCO website…

Publication Date: June 30, 2020
Journal Name: Journal of Economics and Administrative Sciences
Comparison of weighted estimated method and proposed method (BEMW) for estimation of semi-parametric model under incomplete data

Generally, statistical methods are used in various fields of science, especially in research, where statistical analysis is carried out by adopting several techniques according to the nature of the study and its objectives. One of these techniques is building statistical models, which is done through regression models. This technique is considered one of the most important statistical methods for studying the relationship between a dependent variable, also called the response variable, and other variables, called covariates. This research describes the estimation of the partial linear regression model, as well as the estimation of values that are "missing at random" (MAR). Regarding the…
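One standard way to estimate a partial linear model y = xβ + g(t) + ε is a Robinson-style two-step approach: smooth both y and x on t, then regress the residuals. The sketch below uses complete synthetic data; the paper's MAR handling would add an imputation step not shown here.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
t = rng.uniform(0.0, 1.0, n)
x = rng.normal(size=n)
y = 2.0 * x + np.sin(2 * np.pi * t) + rng.normal(scale=0.1, size=n)

def ksmooth(t0, t, v, h=0.05):
    """Nadaraya-Watson kernel smoother with a Gaussian kernel."""
    w = np.exp(-0.5 * ((t - t0) / h) ** 2)
    return np.sum(w * v) / np.sum(w)

# Step 1: remove the nonparametric trend in t from both y and x.
Ey = np.array([ksmooth(ti, t, y) for ti in t])
Ex = np.array([ksmooth(ti, t, x) for ti in t])

# Step 2: ordinary least squares on the residuals recovers beta.
beta = np.sum((x - Ex) * (y - Ey)) / np.sum((x - Ex) ** 2)
```

Partialling out the smooth component first is what lets the linear part be estimated at the usual parametric rate.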

Publication Date: April 1, 2015
Journal Name: Al–Bahith Al–A'alami
Methods of US propaganda in Iraq- A Study of Coalition Provisional Authority and US Army Data after 2003

…with an organized propaganda campaign. This military campaign was helped in formulating its discourse by many institutions, research centers, and knowledge and intelligence circles, in order to mobilize public opinion, gain supporters, and confront opponents by different means, relying on a variety of styles to achieve the required effects.

After the US occupation of Iraq, US media operatives sought to influence Iraqi public opinion and convince Iraqis of the importance of the presence of US military forces in Iraq, which necessitated finding justification for it through the use of persuasive techniques in intensive propaganda campaigns.

This research discusses the most important…

Publication Date: December 5, 2023
Journal Name: Baghdad Science Journal
An improved neurogenetic model for recognition of 3D kinetic data of human extracted from the Vicon Robot system

These days, it is crucial to discern between different types of human behavior, and artificial intelligence techniques play a big part in that. The characteristics of the feedforward artificial neural network (FANN) algorithm and the genetic algorithm have been combined to create an important working mechanism that aids in this field. The proposed system can be used for essential tasks in life, such as analysis, automation, control, recognition, and other tasks. Crossover and mutation are the two primary mechanisms used by the genetic algorithm in the proposed system to replace the backpropagation process in the ANN. While the feedforward artificial neural network technique is focused on input processing, this should be based on the proce…
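The crossover-and-mutation idea described above, evolving feedforward network weights with a genetic algorithm instead of backpropagation, can be sketched as follows. The network size, population settings, and XOR-style toy task are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)

def forward(w, X):
    """Tiny feedforward net: 2 inputs -> 2 tanh hidden units -> 1 linear output."""
    W1, b1 = w[:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8].reshape(2, 1), w[8]
    return (np.tanh(X @ W1 + b1) @ W2).ravel() + b2

def fitness(w, X, y):
    """Negative mean squared error: higher is better."""
    return -np.mean((forward(w, X) - y) ** 2)

# XOR-style toy task.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

pop = rng.normal(size=(40, 9))                 # population of weight vectors
init_score = max(fitness(w, X, y) for w in pop)
for _ in range(200):
    scores = np.array([fitness(w, X, y) for w in pop])
    parents = pop[np.argsort(scores)[-20:]]    # keep the fittest half (elitism)
    kids = []
    for _ in range(20):
        a, b = parents[rng.integers(0, 20, size=2)]
        cut = rng.integers(1, 9)               # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        child += rng.normal(scale=0.1, size=9) # Gaussian mutation
        kids.append(child)
    pop = np.vstack([parents, kids])

best = max(pop, key=lambda w: fitness(w, X, y))
final_score = fitness(best, X, y)
```

Because selection, crossover, and mutation need only fitness values, no gradients, this scheme sidesteps backpropagation entirely, at the cost of many more forward passes.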

Publication Date: January 31, 2024
Journal Name: Iraqi Geological Journal
Estimation of Rock Mechanical Properties of the Hartha Formation and their Relationship to Porosity Using Well-Log Data

The physical and elastic characteristics of rocks determine rock strength in general. Rock strength is frequently assessed using porosity well logs such as neutron and sonic logs. The essential parameters for estimating rock mechanics in petroleum engineering research are uniaxial compressive strength and elastic modulus. Indirect estimation using well-log data is necessary to measure these variables. This study attempts to create a single regression model that can accurately predict rock mechanical characteristics for the Hartha Carbonate Formation in the Fauqi oil field. According to the findings of this study, petrophysical parameters are reliable indexes for determining rock mechanical properties, having good performance p…
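A single-predictor regression of the kind the study builds, strength estimated from a porosity log, can be sketched as below. The porosity/UCS pairs are hypothetical illustration data, not measurements from the Hartha Formation.

```python
import numpy as np

# Hypothetical porosity (%) vs. uniaxial compressive strength (MPa) pairs;
# strength typically decreases as porosity increases.
phi = np.array([5.0, 10.0, 15.0, 20.0, 25.0])
ucs = np.array([80.0, 62.0, 50.0, 38.0, 25.0])

# Least-squares fit of UCS ~ a + b*phi.
slope, intercept = np.polyfit(phi, ucs, 1)

# Predict strength at a porosity value read from a well log.
pred = intercept + slope * 12.0
```

In practice such a model would be calibrated against laboratory core tests before being applied to log-derived porosity along the well.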

Publication Date: October 30, 2021
Journal Name: Iraqi Journal of Science
Small Binary Codebook Design for Image Compression Depending on Rotating Blocks

The searching process using a binary codebook that combines the Block Truncation Coding (BTC) method and Vector Quantization (VQ), i.e. a full codebook search for each input image vector to find the best-matched codeword in the codebook, requires a long time. Therefore, in this paper, after designing a small binary codebook, we adopted a new method of rotating each binary codeword in this codebook from 90° to 270° in steps of 90°. Then, we classified each codeword according to its angle into four types of binary codebooks (i.e. Pour, Flat, Vertical, or Zigzag). The proposed scheme was used to decrease the time of the coding procedure, with very small distortion per block, by designing s…
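The rotation step can be illustrated directly with NumPy: each binary codeword is rotated by 90°, 180°, and 270° and the variants indexed by angle. This is a sketch of the idea only, not the paper's codebook design or its angle-classification rules.

```python
import numpy as np

def rotations(block):
    """Rotate a binary codeword block by 90, 180, and 270 degrees.
    np.rot90 rotates counterclockwise k quarter-turns."""
    return {deg: np.rot90(block, k) for deg, k in [(90, 1), (180, 2), (270, 3)]}

# A tiny 2x2 binary codeword for illustration.
cw = np.array([[1, 0],
               [0, 0]])
rots = rotations(cw)
```

Indexing codewords by rotation class lets the encoder search only one sub-codebook per input block instead of the full codebook, which is where the speed-up comes from.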
