Iris research focuses on developing techniques for identifying and locating relevant biometric features, accurate segmentation, and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system when used in real time. This paper introduces a novel parameterized technique for iris segmentation. The method proceeds in several steps: converting the grayscale eye image to a bit-plane representation, selecting the most significant bit planes, and then parameterizing the iris location, resulting in an accurate segmentation of the iris from the original image. A lossless Hexadata encoding method is then applied to the data, which reduces each set of six data items to a single encoded value. In tests, the method achieved acceptable byte-saving performance for the 21 square iris images of size 256x256 pixels, about 22.4 KB on average with an average decompression time of 0.79 s, and high byte-saving performance for 2 non-square iris images of sizes 640x480 and 2048x1536, reaching 76 KB / 2.2 s and 1630 KB / 4.71 s respectively. Finally, the proposed techniques outperformed standard lossless JPEG2000 compression with a reduction of about 1.2 KB or more, implicitly demonstrating the power and efficiency of the suggested lossless biometric techniques.
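The bit-plane conversion step named above can be sketched as follows. This is a minimal illustration of decomposing an 8-bit grayscale image into its eight bit planes; the paper's criteria for selecting the most significant planes and its iris parameterization are not reproduced here, and the sample image values are hypothetical.

```python
import numpy as np

def bit_planes(gray):
    """Decompose an 8-bit grayscale image into its 8 bit planes.

    Returns an array of shape (8, H, W) where plane k holds bit k
    of every pixel (plane 7 is the most significant).
    """
    gray = np.asarray(gray, dtype=np.uint8)
    return np.stack([(gray >> k) & 1 for k in range(8)])

# Tiny hypothetical 2x2 grayscale patch
img = np.array([[200, 13], [255, 0]], dtype=np.uint8)
planes = bit_planes(img)
# The most significant plane marks pixels with value >= 128
print(planes[7])  # [[1 0]
                  #  [1 0]]
```

Segmentation methods that operate on the top planes exploit the fact that the most significant bits carry most of the coarse intensity structure of the eye region.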
The demand for electronic passport photo (frontal facial) images has grown rapidly. It now extends to Electronic Government (E-Gov) applications such as social benefits, driver's licenses, e-passports, and e-visas. With COVID-19 (coronavirus disease), facial (formal) images have become more widely used and are spreading quickly as a means of verifying an individual's identity. Unfortunately, such images carry insignificant detail against a constant background, which leads to huge byte consumption that affects storage space and transmission; the optimal solution is to curtail data size using compression techniques that exploit image redundancies efficiently.
Compressing speech reduces data storage requirements, which in turn reduces the time needed to transmit digitized speech over long-haul links such as the internet. To obtain the best performance in speech compression, wavelet transforms require filters that combine a number of desirable properties, such as orthogonality and symmetry. The MCT basis functions are derived from the GHM basis functions using 2D linear convolution. The fast computation algorithms introduced here add desirable features to the current transform. We further assess the performance of the MCT in a speech compression application. This paper discusses the effect of using the DWT and the MCT (one- and two-dimensional) on speech compression. DWT and MCT performances in terms of comp…
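The general wavelet-based compression idea described above (transform, discard small detail coefficients, invert) can be sketched with a single level of the orthonormal Haar transform. This is a generic illustration, not the paper's GHM/MCT construction; the signal values and the threshold of 1.5 are arbitrary choices for the sketch.

```python
import numpy as np

def haar_dwt(signal):
    """One level of the orthonormal Haar wavelet transform."""
    s = np.asarray(signal, dtype=float)
    approx = (s[0::2] + s[1::2]) / np.sqrt(2)  # low-pass (averages)
    detail = (s[0::2] - s[1::2]) / np.sqrt(2)  # high-pass (differences)
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of one Haar level."""
    out = np.empty(approx.size * 2)
    out[0::2] = (approx + detail) / np.sqrt(2)
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out

x = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
a, d = haar_dwt(x)
# Compression: zero out small detail coefficients, keep approximations
d[np.abs(d) < 1.5] = 0.0
x_hat = haar_idwt(a, d)  # lossy reconstruction from fewer coefficients
```

After thresholding, each sample pair collapses to its average, which is exactly the information carried by the surviving approximation coefficients.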
In this paper, a procedure to establish the different performance measures in terms of crisp values is proposed for queueing models with two classes of arrivals and multiple channels, where both the arrival and service rates are fuzzy numbers. The main idea is to convert the fuzzy arrival and service rates into crisp values using the graded mean integration approach, which can be represented as a median rule number. The crisp values obtained are then applied to establish the performance measures of conventional multiple-channel queueing models. This procedure has shown its effectiveness when incorporated with many types of membership functions in solving queueing problems. Two numerical illustrations are presented to determine the validity of the…
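The defuzzification step can be illustrated for triangular fuzzy numbers, for which the graded mean integration representation of (a, b, c) reduces to (a + 4b + c)/6. The queueing formula shown afterwards is the standard M/M/1 mean-number-in-system, used only to show how the crisp rates feed into conventional measures; the paper's multiple-channel models and its actual fuzzy rates are not reproduced here, and the triangular numbers below are hypothetical.

```python
def graded_mean(a, b, c):
    """Graded mean integration of a triangular fuzzy number (a, b, c):
    P = (a + 4b + c) / 6."""
    return (a + 4 * b + c) / 6

# Hypothetical triangular fuzzy arrival and service rates
lam = graded_mean(3, 4, 5)   # crisp arrival rate -> 4.0
mu = graded_mean(5, 6, 7)    # crisp service rate -> 6.0

rho = lam / mu               # server utilization
L = rho / (1 - rho)          # mean number in an M/M/1 system, here ~ 2
```

Once the rates are crisp, any conventional closed-form queueing measure (Lq, W, Wq, multiple-channel variants) can be evaluated in the same way.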
Unconfined compressive strength (UCS) of rock is the most critical geomechanical property, widely used as an input parameter for designing fractures, analyzing wellbore stability, drilling programming, and carrying out various petroleum engineering projects. UCS governs rock deformation by measuring the rock's strength and load-bearing capacity. The determination of UCS in the laboratory is a time-consuming and costly process. The current study aims to develop empirical equations to predict UCS from well-log data using regression analysis in JMP software for the Khasib Formation in the Buzurgan oil fields, southeastern Iraq. The accuracy of the proposed equations was tested using the coefficient of determination (R²) and the average absolute…
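The regression-and-R² workflow described above can be sketched as an ordinary least-squares fit of UCS against a single well-log variable. The paired sonic transit time and UCS values below are invented for illustration; the paper's actual log inputs, fitted coefficients, and JMP workflow are not reproduced here.

```python
import numpy as np

# Hypothetical paired readings: sonic transit time (us/ft) from the
# well log, and lab-measured UCS (MPa)
dt = np.array([60.0, 70.0, 80.0, 90.0, 100.0])
ucs = np.array([95.0, 78.0, 64.0, 51.0, 40.0])

b, a = np.polyfit(dt, ucs, 1)          # linear model: UCS ~ a + b * dt
pred = a + b * dt
ss_res = np.sum((ucs - pred) ** 2)     # residual sum of squares
ss_tot = np.sum((ucs - ucs.mean()) ** 2)
r2 = 1 - ss_res / ss_tot               # coefficient of determination
```

UCS typically decreases as sonic transit time increases (slower, weaker rock), so a negative slope and a high R² indicate a usable empirical correlation.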
In this paper, we estimate the parameters and the related probability functions, namely the survival function, cumulative distribution function, hazard function (failure rate), and failure (death) probability density function (pdf), for the two-parameter Birnbaum-Saunders distribution, which fits the complete data for patients with lymph gland cancer. The parameters (shape and scale) are estimated using maximum likelihood, regression quantile, and shrinkage methods, and the values of the mentioned probability functions are then computed from a sample of real data describing the survival duration of patients suffering from lymph gland cancer, based on the diagnosis of the disease or the entry of patients into a hospital for a perio…
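As a sketch of the estimation setup, the widely used modified moment estimates of the two Birnbaum-Saunders parameters (often taken as starting values for maximum likelihood) and the survival function can be written as follows. The lifetimes in the example are hypothetical, not the paper's patient data, and the paper's regression quantile and shrinkage estimators are not reproduced here.

```python
import math

def bs_moment_estimates(t):
    """Modified moment estimates for the two-parameter
    Birnbaum-Saunders distribution (shape alpha, scale beta):
        beta = sqrt(s * r),  alpha = sqrt(2 * (sqrt(s / r) - 1)),
    where s is the sample mean and r the harmonic mean of t."""
    n = len(t)
    s = sum(t) / n
    r = n / sum(1.0 / x for x in t)
    beta = math.sqrt(s * r)
    alpha = math.sqrt(2.0 * (math.sqrt(s / r) - 1.0))
    return alpha, beta

def bs_survival(t, alpha, beta):
    """Survival function S(t) = 1 - Phi((sqrt(t/beta) - sqrt(beta/t)) / alpha)."""
    z = (math.sqrt(t / beta) - math.sqrt(beta / t)) / alpha
    return 0.5 * math.erfc(z / math.sqrt(2))

lifetimes = [1.2, 0.8, 2.5, 1.7, 3.1]  # hypothetical survival times
alpha, beta = bs_moment_estimates(lifetimes)
```

A useful sanity check is that the survival probability at t = beta is exactly 0.5, since beta is the median of the Birnbaum-Saunders distribution.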
In this research, we derive the Bayesian formulas and the Bayesian expectation estimation for the product system of the Atlas Company. The units of the system were examined with the help of the technical staff at the company, which provided real data from manufacturing the system. These real data include the failed units in each drawn sample, which represents the total number of units manufactured by the company's system. We calculate the range for each estimator using the maximum likelihood estimator. We find that the expectation-Bayesian estimation is better than the Bayesian estimator for the different partial samples drawn from the product system after it was checked by the…
The development that solar energy will undergo in the coming years requires a reliable estimation of available solar energy resources. Several empirical models have been developed to calculate global solar radiation using various parameters such as extraterrestrial radiation, sunshine hours, albedo, maximum temperature, mean temperature, soil temperature, relative humidity, cloudiness, evaporation, total precipitable water, number of rainy days, altitude, and latitude. In the present work, the first part calculates solar radiation from the daily values of sunshine duration using the Angstrom model over Iraq for July 2017. The second part maps the distribution of so…
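The Angstrom (Angstrom-Prescott) model mentioned above estimates global solar radiation from sunshine duration as H = H0 (a + b n/N). The sketch below uses the commonly quoted default coefficients a = 0.25 and b = 0.50 and an invented July day; the paper's site-fitted coefficients and Iraqi station data are not reproduced here.

```python
def angstrom_global_radiation(h0, n, day_length, a=0.25, b=0.50):
    """Angstrom-Prescott model: H = H0 * (a + b * n/N), where H0 is the
    extraterrestrial radiation on a horizontal surface, n the measured
    bright sunshine hours, and N the maximum possible day length.
    a = 0.25, b = 0.50 are widely quoted defaults; site-specific values
    are normally fitted by regression against measured radiation."""
    return h0 * (a + b * n / day_length)

# Hypothetical July day: H0 = 40 MJ/m^2/day, 11 h of sunshine
# out of a 14-hour maximum day length
H = angstrom_global_radiation(40.0, 11.0, 14.0)  # MJ/m^2/day
```

Mapping the model's output over a grid of stations, as in the second part of the work, amounts to evaluating this expression at each station's H0, n, and N.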
The structure of networks, known as community detection, has received great attention in diverse fields, including the social sciences, biological studies, and politics. A large number of studies and practical approaches have been designed to solve the problem of finding the structure of a network. Defining a complex network model based on clustering is a non-deterministic polynomial-time hard (NP-hard) problem, and there are no ideal techniques for defining the clustering. Here, we present a statistical approach based on the likelihood function of a Stochastic Block Model (SBM). The objective is to define the general model and select the best model with high quality. Therefor…
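The SBM likelihood idea can be sketched for the simplest Bernoulli case: given a node partition, estimate each between-block edge probability from the observed edge counts and sum the resulting log-likelihood. This is a generic profile-likelihood sketch on a toy 4-node graph, not the paper's model-selection procedure.

```python
import math
from itertools import combinations

def sbm_log_likelihood(adj, labels):
    """Profile log-likelihood of a Bernoulli stochastic block model
    for a given partition: for each block pair (r, s), the edge
    probability is estimated as p = edges / possible_pairs, and the
    pair contributes m*log(p) + (t - m)*log(1 - p)."""
    pairs = {}  # (r, s) -> (observed edges m, possible pairs t)
    n = len(adj)
    for i, j in combinations(range(n), 2):
        key = tuple(sorted((labels[i], labels[j])))
        m, t = pairs.get(key, (0, 0))
        pairs[key] = (m + adj[i][j], t + 1)
    ll = 0.0
    for m, t in pairs.values():
        p = m / t
        if 0 < p < 1:
            ll += m * math.log(p) + (t - m) * math.log(1 - p)
        # p == 0 or p == 1 contributes exactly 0 to the log-likelihood
    return ll

# Toy graph: two clean communities with within-block edges only
adj = [[0, 1, 0, 0],
       [1, 0, 0, 0],
       [0, 0, 0, 1],
       [0, 0, 1, 0]]
print(sbm_log_likelihood(adj, [0, 0, 1, 1]))  # 0.0 (perfect fit)
```

Model selection then amounts to comparing this likelihood (suitably penalized for the number of blocks) across candidate partitions and choosing the best-scoring one.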