Accurate localization of the basic components of human faces (eyebrows, eyes, nose, mouth, etc.) in images is an important step in face-processing techniques such as face tracking, facial expression recognition, and face recognition. However, it is a challenging task due to variations in scale, orientation, pose, facial expression, partial occlusion, and lighting conditions. In this paper, a three-stage hierarchical scheme for facial component extraction is presented; it works regardless of illumination variation. Adaptive linear contrast enhancement methods, namely gamma correction and contrast stretching, are used to simulate the variation in lighting conditions among images. As testing material, a subset of 1150 images belonging to 91 different subjects was taken from the Cohn-Kanade AU-coded dataset (CK); the subjects' images hold different facial expressions. The test results show the effectiveness of the proposed automated localization scheme under different illumination conditions, with an accuracy of about 95.7%.
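To make the enhancement step concrete, here is a minimal sketch of the two enhancements the abstract names, gamma correction and linear contrast stretching, written with NumPy; the function names, parameter values, and the random stand-in image are illustrative assumptions, not the paper's code.

```python
# Illustrative sketch (not the paper's exact pipeline): gamma correction and
# linear contrast stretching on an 8-bit grayscale image held in a NumPy array.
import numpy as np

def gamma_correction(img: np.ndarray, gamma: float) -> np.ndarray:
    """Remap intensities by (v/255)^gamma; gamma < 1 brightens, gamma > 1 darkens."""
    normalized = img.astype(np.float64) / 255.0
    return np.clip(255.0 * normalized ** gamma, 0, 255).astype(np.uint8)

def contrast_stretch(img: np.ndarray) -> np.ndarray:
    """Linearly stretch the intensity range [min, max] to the full [0, 255]."""
    lo, hi = float(img.min()), float(img.max())
    if hi == lo:  # flat image: nothing to stretch
        return img.copy()
    return ((img.astype(np.float64) - lo) * 255.0 / (hi - lo)).astype(np.uint8)

# Example: simulate a dimly lit variant of a face image, then re-enhance it.
face = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)  # stand-in image
dim = gamma_correction(face, gamma=2.2)   # simulate poor lighting
restored = contrast_stretch(dim)          # stretch the compressed range back
```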
The origin of this technique lies in the analysis of François Quesnay (1694-1774), the leader of the Physiocratic school, presented in his Tableau Économique. The method was later developed by Karl Marx in his analysis of the relationships between the departments of production and the nature of these relations in his models of reproduction. The current form of this type of economic analysis is credited to the Russian-American economist Wassily Leontief. This analytical model is commonly used in developing economic plans in developing countries (p. 1, p. 86). There are several types of input-output models, such as the static model, the dynamic model, regional models, and so on. However, this research is confined to the open static model, which has found wide practical application.
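As a concrete illustration of the open Leontief model the study adopts, the sketch below solves x = (I - A)^(-1) d for a two-sector economy; the technical-coefficients matrix A and final-demand vector d are invented purely for illustration and do not come from the paper.

```python
# A minimal numerical sketch of the open (static) Leontief input-output model:
# gross output x satisfies x = Ax + d, so x = (I - A)^(-1) d.
import numpy as np

A = np.array([[0.2, 0.3],    # inputs each sector buys per unit of output
              [0.4, 0.1]])
d = np.array([100.0, 50.0])  # final (external) demand per sector

x = np.linalg.solve(np.eye(2) - A, d)  # gross output needed to meet demand d
print(x)  # output level each sector must produce
```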
Information security is a crucial factor when communicating sensitive information between two parties, and steganography is one of the techniques most used for this purpose. This paper aims to enhance the capacity and robustness of information hiding by compressing image data to a small size while maintaining high quality, so that the secret information remains invisible and only the sender and recipient can recognize the transmission. Three techniques are employed to conceal color and gray images: the Wavelet Color Process Technique (WCPT), the Wavelet Gray Process Technique (WGPT), and the Hybrid Gray Process Technique (HGPT). A comparison between the first and second techniques according to quality metrics, Root-Mean-Square Error (RMSE), Compression-…
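The paper's specific WCPT/WGPT/HGPT compressors cannot be reconstructed from the abstract; as a generic illustration of the hide-compressed-data idea, the sketch below uses zlib as a stand-in compressor and plain least-significant-bit (LSB) embedding, a common steganographic baseline rather than the paper's method.

```python
# Hedged sketch of the embedding step only: zlib stands in for the paper's
# wavelet compressor; the payload bits go into the LSBs of the cover pixels.
import zlib
import numpy as np

def embed_lsb(cover: np.ndarray, payload: bytes) -> np.ndarray:
    """Hide payload bits in the least significant bits of a uint8 cover image."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = cover.flatten()                  # copy, leaves the cover untouched
    if bits.size > flat.size:
        raise ValueError("payload too large for cover image")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(cover.shape)

secret = np.random.randint(0, 256, (16, 16), dtype=np.uint8)   # stand-in secret
cover = np.random.randint(0, 256, (128, 128), dtype=np.uint8)  # stand-in cover
stego = embed_lsb(cover, zlib.compress(secret.tobytes()))
```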
In this paper, three techniques for image compression are implemented: a three-dimensional (3-D) two-level discrete wavelet transform (DWT), a 3-D two-level discrete multi-wavelet transform (DMWT), and a 3-D two-level hybrid (wavelet-multiwavelet) transform technique. Daubechies and Haar filters are used in the discrete wavelet transform, and critically sampled preprocessing is used in the discrete multi-wavelet transform. The aim is to increase the compression ratio (CR) as the level of the 3-D transformation increases, so the compression ratio is measured at each level. To obtain good compression, image data properties were measured, such as image entropy (He) and percent root-…
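As an illustrative stand-in for the paper's transform code, the sketch below performs a two-level 3-D Haar DWT with PyWavelets, hard-thresholds the detail coefficients, and computes a naive compression ratio (coefficients in over coefficients kept); the threshold value and the way CR is counted are assumptions, not the paper's definitions.

```python
# Hedged sketch: two-level 3-D Haar DWT, thresholding, and a crude CR estimate.
import numpy as np
import pywt

volume = np.random.rand(16, 16, 16)              # stand-in 3-D image data
coeffs = pywt.wavedecn(volume, 'haar', level=2)  # two-level 3-D DWT

threshold = 0.05
kept = total = 0
for level in coeffs[1:]:                         # detail coefficients only
    for key, arr in level.items():
        total += arr.size
        mask = np.abs(arr) >= threshold
        level[key] = arr * mask                  # zero out small details
        kept += int(mask.sum())
kept += coeffs[0].size                           # approximation kept whole
total += coeffs[0].size

cr = total / kept                                # naive compression ratio
restored = pywt.waverecn(coeffs, 'haar')         # reconstruction for quality checks
```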
In this work, satellite images of Razaza Lake and the surrounding district in Karbala province are classified for the years 1990, 1999, and 2014 using two software packages (MATLAB 7.12 and ERDAS Imagine 2014). Proposed unsupervised and supervised classification methods implemented in MATLAB are used, namely the mean value and Singular Value Decomposition methods, respectively, while unsupervised (K-Means) and supervised (Maximum Likelihood Classifier) methods are utilized in ERDAS Imagine, in order to obtain the most accurate results, compare the results of each method, and calculate the changes that took place in 1999 and 2014 compared with 1990. The results from classification indicated that …
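As an illustration of one of the classification methods named above, the following sketch runs unsupervised K-Means over per-pixel feature vectors with scikit-learn; the original study used MATLAB and ERDAS Imagine, so this is a stand-in, not the study's code, and the random scene and cluster count are invented.

```python
# Illustrative unsupervised classification: K-Means over per-pixel features.
import numpy as np
from sklearn.cluster import KMeans

image = np.random.rand(64, 64, 3)            # stand-in multispectral scene
pixels = image.reshape(-1, image.shape[-1])  # one row per pixel

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(pixels)
classified = kmeans.labels_.reshape(image.shape[:2])  # per-pixel class map
```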
Evolution has become a feature of this era because of the speed with which it opens many horizons for identifying everything new in different fields. Competitive standing is likewise characterized by emotional attitudes that change with situations of victory and defeat. The use of training methods is among the most important pillars of the game of wrestling; these methods contribute to raising the wrestler's level and refining his physical and skill potential. The research problem is that over-the-chest throw exercises are very important in Greco-Roman wrestling, as the bout can be ended by the player through them. Through personal interviews with coaches, it was concluded that there is a weakness in the level of fl…
The Rayleigh distribution is one of the important distributions used for analyzing lifetime data, with applications in reliability studies and physical interpretation. This paper introduces four different methods to estimate the scale parameter and the reliability function of the generalized Rayleigh distribution: Maximum Likelihood, Bayes, modified Bayes, and the minimax estimator under a squared-error loss function. The comparison is done through a simulation procedure, t…
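Of the four estimators compared, only the Maximum Likelihood branch can be sketched from the abstract alone; for the ordinary (non-generalized) Rayleigh distribution the MLE of the scale is sigma_hat = sqrt(sum(x_i^2) / (2n)) and the reliability function is R(t) = exp(-t^2 / (2 sigma^2)). The Bayes, modified Bayes, and minimax estimators are not reproduced here, and the simulated data are invented.

```python
# Minimal sketch of the Maximum Likelihood estimator for the ordinary Rayleigh
# scale parameter, plus the reliability (survival) function it implies.
import numpy as np

rng = np.random.default_rng(0)
sigma_true = 2.0
data = rng.rayleigh(scale=sigma_true, size=500)  # simulated lifetimes

sigma_mle = np.sqrt(np.sum(data ** 2) / (2 * data.size))  # MLE of the scale

def reliability(t: np.ndarray, sigma: float) -> np.ndarray:
    """Rayleigh reliability function R(t) = exp(-t^2 / (2 sigma^2))."""
    return np.exp(-t ** 2 / (2 * sigma ** 2))

print(sigma_mle, reliability(np.array([1.0, 2.0]), sigma_mle))
```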
This study searched for the cholera bacteria serotype that caused the cholera epidemic of 2007 using a fast method, the Rapid Visual Test (Crystal V.C.), used for the first time in Iraq to diagnose cholera bacteria and compared with the traditional bacteriological method. Cholera is one of the most dangerous epidemic diseases, leading to death in 50-70% of severe cases among untreated patients. For this purpose, 100 stool samples were collected from patients in 13 hospitals in Baghdad Governorate in the period from August to the end of December. Cholera was diagnosed by two methods: the first was the fast method using nitrocellulose coated with anti-…
Compressing speech reduces data storage requirements, which in turn reduces the time needed to transmit digitized speech over long-haul links such as the internet. To obtain the best performance in speech compression, wavelet transforms require filters that combine a number of desirable properties, such as orthogonality and symmetry. The MCT basis functions are derived from the GHM basis functions using 2-D linear convolution. The fast computation algorithms introduced here add desirable features to the current transform. We further assess the performance of the MCT in a speech compression application. This paper discusses the effect of using the DWT and the MCT (one- and two-dimensional) on speech compression. DWT and MCT performance in terms of comp…
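Only the standard DWT half of the comparison can be sketched with public tooling; the MCT/GHM multiwavelet construction is specific to this line of work. Below is a minimal PyWavelets example of wavelet-based speech compression by coefficient thresholding, with a synthetic frame and an invented threshold, as a stand-in rather than the paper's method.

```python
# Hedged sketch of the DWT branch: decompose a speech frame with a Daubechies
# wavelet, hard-threshold the detail coefficients, and reconstruct.
import numpy as np
import pywt

speech = np.sin(2 * np.pi * 440 * np.linspace(0, 0.05, 400))  # stand-in frame
coeffs = pywt.wavedec(speech, 'db4', level=3)                 # 1-D DWT

threshold = 0.01
compressed = [coeffs[0]] + [pywt.threshold(c, threshold, mode='hard')
                            for c in coeffs[1:]]              # drop small details
restored = pywt.waverec(compressed, 'db4')                    # decoded speech
```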
The penalized least squares method is a popular approach for dealing with high-dimensional data, where the number of explanatory variables is larger than the sample size. Penalized least squares offers high prediction accuracy and performs estimation and variable selection at once. It yields a sparse model, that is, a model with few variables, which can be interpreted easily. However, penalized least squares is not robust: it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain a robust penalized least squares method, and hence a robust penalized estimator and …
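As a hedged illustration of the robust penalized idea, the sketch below fits scikit-learn's HuberRegressor, which combines a robust Huber loss with an L2 penalty (alpha); the paper's estimator may use a different loss or penalty, and the data, dimensions, and outliers here are simulated.

```python
# Illustrative robust penalized regression: Huber loss bounds the influence of
# outlying observations, while the alpha penalty regularizes the coefficients.
import numpy as np
from sklearn.linear_model import HuberRegressor

rng = np.random.default_rng(0)
n, p = 50, 100                        # more predictors than samples (p > n)
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 3.0                        # sparse true coefficients
y = X @ beta + rng.standard_normal(n)
y[:5] += 25.0                         # inject gross outliers

model = HuberRegressor(epsilon=1.35, alpha=1.0, max_iter=1000).fit(X, y)
print(model.coef_[:8])                # large entries track the true signal
```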