The area of character recognition has received considerable attention from researchers all over the world during the last three decades. This research explores the best sets of feature extraction techniques and studies the accuracy of well-known classifiers for Arabic numerals using two statistical methods, presenting a comparative study between them. The first method, a linear discriminant function, yields results with accuracy as high as 90% of original grouped cases correctly classified. In the second method, we propose an algorithm; the results show the efficiency of the proposed algorithms, which achieve recognition accuracies of 92.9% and 91.4%, higher than the first method.
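As a hedged illustration of the first method, the sketch below runs scikit-learn's LinearDiscriminantAnalysis on pre-extracted numeral feature vectors; the data, feature dimensionality, and train/test split are placeholders, not the paper's setup.

```python
# Minimal sketch of linear-discriminant classification for numeral features.
# Assumes features were already extracted into an (n_samples, n_features)
# array; the random data below stands in for real digit features.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 16))        # placeholder feature vectors
y = rng.integers(0, 10, size=1000)     # digit labels 0-9

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3)
clf = LinearDiscriminantAnalysis().fit(X_train, y_train)
print(f"grouped cases correctly classified: {clf.score(X_test, y_test):.1%}")
```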
This investigation proposes an offline signature identification system that utilizes rotation compensation based on features saved in a database. The proposed system contains five principal stages: (1) data acquisition, (2) signature data file loading, (3) signature preprocessing, (4) feature extraction, and (5) feature matching. Feature extraction includes determining the center point coordinates and the rotation compensation angle (θ), implementing the rotation compensation, and determining the discriminating features and statistical conditions. In this work, seven essential collections of features are utilized to acquire the characteristics: (i) density (D), (ii) average (A), (iii) s…
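A minimal sketch of the rotation-compensation step, assuming θ is estimated from second-order image moments of the binary signature; the moment-based estimator and the use of scipy.ndimage are assumptions, since the excerpt does not give the paper's exact formulas.

```python
# Hedged sketch: estimate the signature's centroid and principal-axis angle
# from image moments, then rotate the binary image back to a canonical
# orientation. Function and variable names are illustrative.
import numpy as np
from scipy import ndimage

def rotation_compensate(binary_sig: np.ndarray) -> np.ndarray:
    ys, xs = np.nonzero(binary_sig)
    cy, cx = ys.mean(), xs.mean()              # center point coordinates
    # central second-order moments give the principal-axis orientation
    mu20 = ((xs - cx) ** 2).mean()
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)  # angle θ in radians
    # rotate by -θ about the image center to undo the estimated rotation
    return ndimage.rotate(binary_sig.astype(float), np.degrees(-theta),
                          reshape=False, order=0)
```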
Many face recognition techniques compare a query face image with a set of face images stored in a database. Most of these techniques fail if the face images are exposed to high-density noise. Therefore, it is necessary to find a robust method to recognize a face image corrupted by high-density noise. In this work, a face recognition algorithm is proposed that combines a de-noising filter with PCA. Many studies have shown that PCA has the ability to handle noisy images and reduce dimensionality. However, when face images are exposed to high noise, PCA alone is ineffective at removing it, so adding a strong filter helps to im…
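A hedged sketch of the combined pipeline, assuming a median filter as the "strong filter" (the excerpt does not name the paper's filter) followed by PCA projection and nearest-neighbour matching.

```python
# Sketch: de-noise face images, project onto the PCA subspace, match by 1-NN.
# Image sizes, component count, and the matcher are illustrative choices.
import numpy as np
from scipy.ndimage import median_filter
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

def recognize(train_imgs, train_ids, query_img, n_components=50):
    # 1) suppress high-density (salt-and-pepper style) noise before PCA
    clean_train = [median_filter(im, size=3).ravel() for im in train_imgs]
    clean_query = median_filter(query_img, size=3).ravel()
    # 2) project onto the principal subspace and match by nearest neighbour
    pca = PCA(n_components=n_components).fit(clean_train)
    knn = KNeighborsClassifier(1).fit(pca.transform(clean_train), train_ids)
    return knn.predict(pca.transform([clean_query]))[0]
```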
Most of the drinking water consumed all over the world is treated at a water treatment plant (WTP), where raw water is abstracted from reservoirs and rivers. Turbidity removal efficiency is very important for supplying safe drinking water. This study focuses on the use of multiple linear regression (MLR) and artificial neural network (ANN) models to predict the turbidity removal efficiency of the Al-Wahda WTP in Baghdad city. The measured physico-chemical parameters were used to determine their effect on turbidity removal efficiency in the various processes. A suitable formulation of the ANN model is examined through many preparations, trials, and evaluation steps. The predict…
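For orientation, a minimal sketch comparing the two model families on synthetic data; the physico-chemical inputs, network size, and data are hypothetical, not the study's configuration.

```python
# Sketch: fit an MLR model and a small ANN to the same predictors and
# compare test R^2. The five synthetic columns stand in for measured
# physico-chemical parameters (e.g. raw turbidity, pH, dose, temperature).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 5))
y = X @ rng.normal(size=5) + rng.normal(scale=0.1, size=500)  # removal efficiency

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25)
mlr = LinearRegression().fit(X_tr, y_tr)
ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000).fit(X_tr, y_tr)
print(f"MLR R^2 = {mlr.score(X_te, y_te):.3f}, ANN R^2 = {ann.score(X_te, y_te):.3f}")
```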
This paper deals with a method called Statistical Energy Analysis (SEA) that can be applied to mechanical and acoustical systems such as buildings, bridges, and aircraft. As a tool, SEA can be applied to resonant systems under conditions of high frequency and/or complex structure. The SEA parameters, such as coupling loss factor, internal loss factor, modal density, and input power, are clarified in this work and illustrated for coupled plate sub-systems. The developed system is assumed to be resonant, conservative, and linear, with equipartition of energy between all the resonant modes within a given frequency band in a given sub-system. The aim of th…
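For reference, the standard SEA power-balance relations for two coupled sub-systems, written in common textbook notation (the paper's own symbols are not shown in this excerpt):

```latex
P_{1,\mathrm{in}} = \omega\left(\eta_{1}E_{1} + \eta_{12}E_{1} - \eta_{21}E_{2}\right), \qquad
P_{2,\mathrm{in}} = \omega\left(\eta_{2}E_{2} + \eta_{21}E_{2} - \eta_{12}E_{1}\right)
```

Here the $\eta_i$ are internal loss factors, the $\eta_{ij}$ coupling loss factors, and the $E_i$ sub-system energies; the modal densities $n_i$ satisfy the reciprocity relation $n_1\eta_{12} = n_2\eta_{21}$.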
To assess the status of childhood in Iraq, it was important to use statistical tools and approaches concerned with interpreting causal relationships and their directions, and to use classification methods for the important effects (variables), in order to draw a clear picture of the phenomenon under study and make it useful for investing in, updating, and improving future demographic studies. Two statistical methods from the field of multivariate analysis were used to analyze the data, namely cluster analysis and factor analysis (a minimal sketch follows the next paragraph).
The present study focuses on four fundamental axes: the nutrition axis, the health axis, the educational axis, and the social axis. The study has ca…
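The sketch below shows the two multivariate methods named above applied to a generic indicator matrix (rows = observations, columns = indicators); the data and dimensions are placeholders, not the study's survey data.

```python
# Sketch: cluster analysis (k-means) and factor analysis on standardized
# indicators. Cluster count and factor count are illustrative choices.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
X = StandardScaler().fit_transform(rng.normal(size=(200, 8)))

clusters = KMeans(n_clusters=3, n_init=10).fit_predict(X)      # cluster analysis
loadings = FactorAnalysis(n_components=2).fit(X).components_   # factor analysis
print(clusters[:10], loadings.shape)
```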
Video steganography has become a popular option for protecting secret data from hacking attempts and common attacks on the internet. However, when whole video frames are used to embed secret data, visual distortion may result. This work is an attempt to hide a sensitive secret image inside the moving objects of a video by separating each object from the frame background, then selecting and arranging the objects according to size for embedding the secret image. The XOR technique with reversed bits is applied between the secret image bits and the detected moving-object bits for embedding. The proposed method provides more security and imperceptibility because the moving objects are used for embedding, so it is difficult to notice the…
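A hedged sketch of the XOR embedding step: each secret-image bit is XORed with a carrier bit taken from pixels inside a detected moving-object mask, and the result is written into the pixel LSBs. Object detection is out of scope here; the mask is assumed given, the "reverse bits" operation is interpreted as reversing the secret bit sequence, and all names are illustrative.

```python
import numpy as np

def embed_xor(frame: np.ndarray, mask: np.ndarray, secret_bits: np.ndarray):
    """Embed secret_bits (0/1 uint8 array) into frame pixels under mask."""
    out = frame.copy()
    idx = np.flatnonzero(mask.ravel())[: secret_bits.size]   # embedding sites
    assert idx.size == secret_bits.size, "mask must offer enough pixels"
    pix = out.ravel()                                        # view into out
    carrier = (pix[idx] >> 1) & 1              # carrier bit from each pixel
    stego_bits = carrier ^ secret_bits[::-1]   # XOR with reversed secret bits
    pix[idx] = (pix[idx] & ~np.uint8(1)) | stego_bits        # write into LSB
    return out
```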
Many of the dynamic processes in different sciences are described by models of differential equations. These models explain the change in the behavior of the studied process over time by linking that behavior to its derivatives. Such models often contain constant and time-varying parameters that vary according to the nature of the process under study. In this work, we estimate the constant and time-varying parameters sequentially in several stages. In the first stage, the state variables and their derivatives are estimated by the method of penalized splines (P-splines). In the second stage, pseudo least squares is used to estimate the constant parameters. In the third stage, the rem…
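A hedged sketch of the first two stages, with scipy's smoothing spline standing in for a true P-spline basis and the toy model x' = -kx standing in for the ODE; both substitutions are assumptions for illustration only.

```python
# Stage 1: smooth the noisy state with a penalized smoother and take its
# derivative. Stage 2: least squares for the constant parameter k.
import numpy as np
from scipy.interpolate import UnivariateSpline

t = np.linspace(0, 5, 200)
k_true = 0.8
x_noisy = np.exp(-k_true * t) + np.random.default_rng(2).normal(0, 0.01, t.size)

spl = UnivariateSpline(t, x_noisy, s=0.05)   # s controls the roughness penalty
x_hat, dx_hat = spl(t), spl.derivative()(t)  # state and derivative estimates

# minimizing sum (dx + k x)^2 over k gives the closed-form estimate below
k_hat = -np.sum(dx_hat * x_hat) / np.sum(x_hat ** 2)
print(f"estimated k = {k_hat:.3f} (true {k_true})")
```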
Abstract
The problem of missing data represents a major obstacle for researchers in the process of data analysis in different fields, since this problem recurs in all fields of study, including social, medical, astronomical, and clinical experiments.
The presence of such a problem in the data under study may negatively influence the analysis and lead to misleading conclusions, since such conclusions carry a great bias caused by the missing values. Despite the efficiency of wavelet methods, they too are affected by missing data, in addition to the impact of the problem on the accuracy of estimation…
The current study aims to compare estimates of the Rasch model's parameters for missing and complete data under various ways of handling the missing data. To achieve this aim, the researcher followed these steps: preparing the Philip Carter spatial-ability test, which consists of (20) items, for a group of (250) sixth scientific stage students in the Baghdad Education Directorates of Al-Rusafa (1st, 2nd, and 3rd) for the academic year (2018-2019). The researcher then relied on a one-parameter model to analyze the data, using the BILOG-MG3 program to check the hypotheses and data and to verify their fit with the model. In addition…
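The one-parameter (Rasch) model referred to above has the standard textbook form (not a formula quoted from the study):

```latex
P(X_{ij}=1 \mid \theta_i, b_j) = \frac{e^{\,\theta_i - b_j}}{1 + e^{\,\theta_i - b_j}}
```

where $\theta_i$ is the ability of person $i$ and $b_j$ the difficulty of item $j$.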
Abstract
This research presents theoretical aspects of one of the most important statistical distributions, the Lomax distribution, which has many applications in several areas. A set of estimation methods (MLE, LSE, GWPM) was used and compared with the (RRE) estimation method. In order to find the best estimation method, a set of (36) simulation experiments with many replications was run to obtain the mean square error used for the comparison. The simulation experiments varied the estimation method, the sample size, and the values of the location and shape parameters. The results show that the estimation methods are affected by the simulation-experiment factors, and suggest the possibility of using other estimation methods such as (shrinkage, jackknif…
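For reference, the Lomax (Pareto type II) density in a common shape-scale parameterization (stated for orientation; the paper's parameter labels may differ):

```latex
f(x;\alpha,\lambda) = \frac{\alpha}{\lambda}\left(1 + \frac{x}{\lambda}\right)^{-(\alpha+1)},
\qquad x \ge 0,\ \alpha > 0,\ \lambda > 0
```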