The idea of carrying out research on incomplete data came from the circumstances of our country and the horrors of war, which caused the loss of much important data in all aspects of economic, natural, health, and scientific life. The reasons for missingness differ: some lie outside the will of those concerned, while others are deliberate, planned because of cost, risk, or the lack of means of inspection. The missing data in this study were processed using Principal Component Analysis and self-organizing map methods, evaluated through simulation. The variables considered were child health and the variables affecting children's health: breastfeeding and maternal health. The maternal health variable contained missing values and was processed in Matlab 2015a using Principal Component Analysis and probabilistic Principal Component Analysis; the two methods were then compared using the root mean square error (RMSE). The best method for processing the missing values was the PCA method.
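To illustrate the kind of procedure this abstract describes, here is a minimal sketch of iterative PCA imputation on simulated data, scored by RMSE against held-out true values. The two variables, the sample size, and the 20% missingness rate are invented stand-ins, not the study's child-health data, and the rank-1 reconstruction is a simplification of the full method.

```python
import math, random

random.seed(0)

# Simulated stand-in for two correlated health variables: x2 depends
# linearly on x1 (these are not the study's data).
n = 200
data = []
for _ in range(n):
    x1 = random.gauss(0, 1)
    data.append([x1, 2 * x1 + random.gauss(0, 0.3)])

# Delete 20% of the second column, remembering the truth for RMSE.
truth = {}
for i in random.sample(range(n), n // 5):
    truth[i] = data[i][1]
    data[i][1] = None

def rank1_pca_impute(rows, iters=50):
    """Iterative PCA imputation: fill missing cells with column means,
    then alternate fitting the first principal component (power
    iteration on the covariance) and reconstructing the missing cells."""
    p = len(rows[0])
    obs = [[v for v in (r[j] for r in rows) if v is not None] for j in range(p)]
    mu = [sum(c) / len(c) for c in obs]
    X = [[(v if v is not None else mu[j]) for j, v in enumerate(r)] for r in rows]
    miss = [(i, j) for i, r in enumerate(rows) for j, v in enumerate(r) if v is None]
    for _ in range(iters):
        mu = [sum(r[j] for r in X) / len(X) for j in range(p)]
        C = [[sum((r[a] - mu[a]) * (r[b] - mu[b]) for r in X) / len(X)
              for b in range(p)] for a in range(p)]
        v = [1.0] * p                       # power iteration, top eigenvector
        for _ in range(100):
            w = [sum(C[a][b] * v[b] for b in range(p)) for a in range(p)]
            nrm = math.sqrt(sum(t * t for t in w))
            v = [t / nrm for t in w]
        for i, j in miss:                   # project and reconstruct
            score = sum((X[i][k] - mu[k]) * v[k] for k in range(p))
            X[i][j] = mu[j] + score * v[j]
    return X

X = rank1_pca_impute(data)
rmse = math.sqrt(sum((X[i][1] - t) ** 2 for i, t in truth.items()) / len(truth))
print(round(rmse, 2))  # near the simulation noise level, ~0.3
```

Mean imputation on the same data would leave an RMSE near the full standard deviation of the column, which is why PCA-based reconstruction wins in settings like this.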
The current study aims to compare estimates of the Rasch model's parameters for missing and complete data under various methods of handling missing data. To achieve the aim of the present study, the researcher followed these steps: administering the Philip Carter test of spatial ability, which consists of (20) items, to a group of (250) sixth scientific stage students in the Baghdad Education directorates of Al-Rusafa (1st, 2nd and 3rd) for the academic year (2018-2019). The researcher then relied on the one-parameter model to analyze the data and used the Bilog-mg3 software to check the hypotheses and the fit of the data to the model. In addition
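For context, the one-parameter (Rasch) model the abstract relies on can be sketched as follows, together with a Newton-Raphson ability estimate. The item difficulties and the response pattern are illustrative assumptions, not values from the Bilog-mg3 analysis.

```python
import math

def rasch_p(theta, b):
    """One-parameter (Rasch) model: probability that a person of
    ability theta answers an item of difficulty b correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def mle_ability(responses, difficulties, iters=25):
    """Newton-Raphson maximum-likelihood estimate of ability from a
    0/1 response pattern (undefined for all-correct/all-wrong patterns)."""
    theta = 0.0
    for _ in range(iters):
        ps = [rasch_p(theta, d) for d in difficulties]
        grad = sum(x - p for x, p in zip(responses, ps))   # score function
        info = sum(p * (1.0 - p) for p in ps)              # Fisher information
        theta += grad / info
    return theta

# Hypothetical difficulties for five items and one student's answers.
difficulties = [-1.0, -0.5, 0.0, 0.5, 1.0]
theta = mle_ability([1, 1, 1, 0, 0], difficulties)
print(round(theta, 2))
```

The estimate lands a little above zero, as expected for a student who passed the three easier items and missed the two harder ones.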
This research deals with a shrinkage method for principal components similar to the one used in multiple regression, the "Least Absolute Shrinkage and Selection Operator: LASSO". The goal here is to form uncorrelated linear combinations from only a subset of the explanatory variables, which may suffer from a multicollinearity problem, instead of taking all (K) of them. The shrinkage forces some coefficients to equal zero after restricting them with a "tuning parameter" (t), which balances the amounts of bias and variance on one side and does not exceed the acceptable percentage of explained variance of these components. This was shown by the MSE criterion in the regression case and the percent explained v
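The way a tuning parameter "forces some coefficients to equal zero" can be sketched with the LASSO soft-thresholding operator; the coefficients below are made-up numbers for illustration, not values from the research.

```python
def soft_threshold(coefs, t):
    """LASSO-style shrinkage: pull every coefficient toward zero by t
    and set it exactly to zero once its magnitude falls below t."""
    out = []
    for b in coefs:
        mag = abs(b) - t
        out.append(0.0 if mag <= 0 else (mag if b > 0 else -mag))
    return out

# Made-up component coefficients; t is the tuning parameter.
print(soft_threshold([2.5, -0.4, 0.1, -1.8, 0.05], 0.5))
# -> [2.0, 0.0, 0.0, -1.3, 0.0]: two coefficients survive, three are dropped
```

Raising t trades more bias for less variance and a sparser set of components, which is exactly the balance the abstract describes.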
This research aims to study dimension-reduction methods that overcome the curse of dimensionality, which arises when traditional methods fail to provide good estimates of the parameters, so the problem must be dealt with directly. Two approaches were used to handle high-dimensional data: the non-classical sliced inverse regression (SIR) method, together with a proposed weighted standard SIR (WSIR) method, and principal component analysis (PCA), the general method used for reducing dimensions. Both SIR and PCA are based on linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear
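A minimal sketch of the SIR idea is shown below on synthetic data. It is simplified in one important way: it assumes the predictors are already whitened (mean 0, identity covariance), so the usual standardization step is skipped; the data and slice count are illustrative assumptions.

```python
import math, random

def sir_direction(X, y, n_slices=10, iters=200):
    """Sliced inverse regression, simplified by assuming the predictors
    are already whitened (mean 0, identity covariance): sort on y, form
    slice means of X, and return the top eigenvector of their weighted
    covariance via power iteration."""
    n, p = len(X), len(X[0])
    order = sorted(range(n), key=lambda i: y[i])
    size = n // n_slices
    M = [[0.0] * p for _ in range(p)]
    for s in range(n_slices):
        idx = order[s * size:(s + 1) * size]
        m = [sum(X[i][j] for i in idx) / len(idx) for j in range(p)]
        for a in range(p):
            for b in range(p):
                M[a][b] += (len(idx) / n) * m[a] * m[b]
    v = [1.0] * p
    for _ in range(iters):
        w = [sum(M[a][b] * v[b] for b in range(p)) for a in range(p)]
        nrm = math.sqrt(sum(c * c for c in w))
        v = [c / nrm for c in w]
    return v

random.seed(3)
X = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(500)]
y = [x1 + random.gauss(0, 0.3) for x1, _ in X]   # y depends on x1 only
d = sir_direction(X, y)
print([round(abs(c), 2) for c in d])  # close to [1.0, 0.0]
```

Because y varies only with x1, the slice means spread out along the first coordinate, and SIR recovers that single effective direction out of the two predictors.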
... Show MoreIn this study, we investigate the behavior of the estimated spectral density function of stationary time series in the case of missing values, which are generated by the second order Autoregressive (AR (2)) model, when the error term for the AR(2) model has many of continuous distributions. The Classical and Lomb periodograms used to study the behavior of the estimated spectral density function by using the simulation.
In this study, we compare the LASSO and SCAD methods, two penalized approaches for dealing with partial quantile regression models. The Nadaraya-Watson kernel estimator was used to estimate the nonparametric part; in addition, the rule-of-thumb method was used to estimate the smoothing bandwidth (h). The penalty methods proved efficient in estimating the regression coefficients, and the SCAD method was the best according to the mean squared error (MSE) criterion after estimating the missing data using the mean imputation method.
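The contrast between the two penalties can be sketched through their thresholding rules: SCAD (Fan and Li's rule, with the conventional a = 3.7) behaves like LASSO near zero but leaves large coefficients unshrunk, which is one reason it can win on MSE. The numbers below are illustrative, not estimates from the study.

```python
def lasso_thresh(b, lam):
    """LASSO soft-thresholding of a single coefficient."""
    mag = abs(b) - lam
    return 0.0 if mag <= 0 else (mag if b > 0 else -mag)

def scad_thresh(b, lam, a=3.7):
    """SCAD thresholding rule (Fan & Li): LASSO-like near zero, a
    linear blend in the middle, and the identity for large |b|."""
    ab = abs(b)
    if ab <= 2 * lam:
        return lasso_thresh(b, lam)
    if ab <= a * lam:
        sign = 1.0 if b > 0 else -1.0
        return ((a - 1) * b - sign * a * lam) / (a - 2)
    return b

# A large coefficient: LASSO still shrinks it, SCAD leaves it alone.
print(lasso_thresh(5.0, 0.5), scad_thresh(5.0, 0.5))  # 4.5 5.0
```

Both rules set small coefficients exactly to zero, so both select variables; they differ only in the bias they impose on the coefficients they keep.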
In this paper, we propose a method to estimate missing values of the explanatory variables in a nonparametric multiple regression model and compare it with the arithmetic-mean imputation method. The idea of the method is to employ the causal relationship between the variables to find an efficient estimate of the missing value. We rely on the kernel estimate given by the Nadaraya-Watson estimator, on least-squares cross-validation (LSCV) to estimate the bandwidth, and on a simulation study to compare the two methods.
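The following is a minimal sketch of the two ingredients named here, a Nadaraya-Watson estimator and an LSCV bandwidth search, used to fill one "missing" value from a related variable. The single-predictor setting, the data-generating curve, and the bandwidth grid are assumptions of this illustration, not the paper's design.

```python
import math, random

def nw(x0, xs, ys, h):
    """Nadaraya-Watson kernel regression with a Gaussian kernel."""
    w = [math.exp(-0.5 * ((x0 - xi) / h) ** 2) for xi in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

def lscv_bandwidth(xs, ys, hs):
    """Least-squares cross-validation: choose h minimizing the
    leave-one-out squared prediction error."""
    best_h, best_err = hs[0], float("inf")
    for h in hs:
        err = 0.0
        for i in range(len(xs)):
            pred = nw(xs[i], xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:], h)
            err += (ys[i] - pred) ** 2
        if err < best_err:
            best_h, best_err = h, err
    return best_h

random.seed(2)
xs = [random.uniform(0, 3) for _ in range(60)]
ys = [xi ** 2 + random.gauss(0, 0.1) for xi in xs]
h = lscv_bandwidth(xs, ys, [0.1 * k for k in range(1, 9)])
imputed = nw(1.5, xs, ys, h)        # "missing" value at x = 1.5
print(round(imputed, 2))            # close to the true 1.5**2 = 2.25
```

Mean imputation would return the overall average of ys regardless of x, so exploiting the relationship between the variables, as the paper proposes, gives a much sharper estimate.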
The grey system model GM(1,1) is a time-series prediction model and the basis of grey theory. This research presents four methods for estimating the parameters of the grey model GM(1,1): the accumulative method (ACC), the exponential method (EXP), the modified exponential method (Mod EXP), and the Particle Swarm Optimization method (PSO). These methods were compared using the mean square error (MSE) and the mean absolute percentage error (MAPE) as comparison criteria, and simulation was adopted to select the best of the four methods. The best method was then applied to real data. This data represents the consumption rate of two types of oils a he
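As background, the baseline GM(1,1) model can be sketched as below, with its parameters estimated by ordinary least squares on the accumulated series; the input series is an illustrative ~10% growth sequence, not the paper's oil-consumption data.

```python
import math

def gm11(x0, steps=1):
    """GM(1,1): accumulate the series (AGO), fit the grey equation
    x0(k) + a*z1(k) = b by least squares, and difference the fitted
    exponential back to obtain forecasts."""
    n = len(x0)
    x1 = [sum(x0[:k + 1]) for k in range(n)]                 # AGO series
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]     # background values
    m = n - 1
    sz, szz = sum(z), sum(v * v for v in z)
    sy = sum(x0[1:])
    szy = sum(v * y for v, y in zip(z, x0[1:]))
    det = szz * m - sz * sz
    a = (sz * sy - m * szy) / det        # development coefficient
    b = (szz * sy - sz * szy) / det      # grey input
    c = b / a
    def x1hat(k):
        return (x0[0] - c) * math.exp(-a * k) + c
    return [x1hat(k) - x1hat(k - 1) for k in range(n, n + steps)]

# Illustrative series growing ~10% per period (not the paper's data).
forecast = gm11([100.0, 110.0, 121.0, 133.1], steps=1)
print(round(forecast[0], 1))  # close to 133.1 * 1.1 = 146.4
```

The four estimation methods the abstract compares replace the least-squares step here with alternative ways of obtaining a and b; the prediction formula itself is unchanged.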
This study aims to derive a sustainable human development index for the Arab countries using principal components analysis, which helps reduce the number of variables when many are available. Such an index can be relied upon to interpret and track sustainable human development in the Arab countries, given the multiplicity of sustainable human development indicators and their huge volume of data, besides the heterogeneity of the countries in a range of characteristics associated with those indicators such as area, population, and economic activity. The study used the data available for the selected Arab countries for recent years. This study concluded that a single inde
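The construction of a single index by principal components can be sketched as follows: standardize the indicators, take scores on the first principal component, and rank by them. The five "countries" and three indicators below are made-up illustrative numbers, not the study's data.

```python
import math

def standardize(X):
    """Z-score each column (indicator) across rows (countries)."""
    n, p = len(X), len(X[0])
    Z = [[0.0] * p for _ in range(n)]
    for j in range(p):
        col = [r[j] for r in X]
        mu = sum(col) / n
        sd = math.sqrt(sum((v - mu) ** 2 for v in col) / n)
        for i in range(n):
            Z[i][j] = (X[i][j] - mu) / sd
    return Z

def first_pc_index(X, iters=200):
    """Composite index: scores on the first principal component of the
    standardized indicators (power iteration for the top eigenvector)."""
    Z = standardize(X)
    n, p = len(Z), len(Z[0])
    C = [[sum(r[a] * r[b] for r in Z) / n for b in range(p)] for a in range(p)]
    v = [1.0] * p
    for _ in range(iters):
        w = [sum(C[a][b] * v[b] for b in range(p)) for a in range(p)]
        nrm = math.sqrt(sum(t * t for t in w))
        v = [t / nrm for t in w]
    if sum(v) < 0:                      # orient so "more is better"
        v = [-t for t in v]
    return [sum(zi * vi for zi, vi in zip(r, v)) for r in Z]

# Made-up indicators (life expectancy, schooling years, log income)
# for five hypothetical countries -- illustrative numbers only.
X = [[72, 10, 9.5],
     [80, 13, 10.8],
     [65, 7, 8.6],
     [77, 12, 10.2],
     [60, 6, 8.1]]
idx = first_pc_index(X)
ranking = sorted(range(len(X)), key=lambda i: -idx[i])
print(ranking)  # country 1, best on every indicator, comes first
```

Standardizing first is what lets heterogeneous indicators (area, population, income) be combined on a common scale, which is the difficulty the abstract highlights.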
This paper deals with a method called Statistical Energy Analysis (SEA) that can be applied to mechanical and acoustical systems such as buildings, bridges, and aircraft. As a tool, SEA can be applied to resonant systems under conditions of high frequency and/or complex structure. The SEA parameters, such as the coupling loss factor, internal loss factor, modal density, and input power, are clarified in this work; coupled plate sub-systems and explanations of these parameters are presented. The developed system is assumed to be resonant, conservative, and linear, with an equipartition of energy between all the resonant modes within a given frequency band in a given sub-system. The aim of th
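The steady-state SEA power balance for two coupled sub-systems can be sketched as a small linear solve in the parameters the abstract lists (internal and coupling loss factors, input power). The loss-factor values below are illustrative assumptions, not figures from the paper.

```python
def sea_two_subsystems(omega, P1, P2, eta1, eta2, eta12, eta21):
    """Steady-state SEA power balance for two coupled sub-systems,
        P_i = omega * (eta_i * E_i + eta_ij * E_i - eta_ji * E_j),
    solved for the sub-system energies E1, E2 by 2x2 Cramer's rule."""
    a11 = omega * (eta1 + eta12)
    a12 = -omega * eta21
    a21 = -omega * eta12
    a22 = omega * (eta2 + eta21)
    det = a11 * a22 - a12 * a21
    E1 = (P1 * a22 - P2 * a12) / det
    E2 = (a11 * P2 - a21 * P1) / det
    return E1, E2

# Illustrative loss factors (not from the paper); only sub-system 1 is driven.
E1, E2 = sea_two_subsystems(omega=1000.0, P1=1.0, P2=0.0,
                            eta1=0.01, eta2=0.01, eta12=0.002, eta21=0.001)
print(E1 > E2 > 0.0)  # True: energy flows from the driven side to the receiver
```

With more sub-systems the same balance becomes an N x N linear system, but the structure, internal dissipation plus net coupling flow equals input power, is unchanged.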
The main objective of this research is to find out the effect of deviations in the aggregate gradation of asphalt mixtures from the Job Mix Formula (JMF) on overall mixture performance. Three road layers were studied (the wearing, binder, and base layers), a statistical analysis was performed on the data of completed projects in Baghdad city, and the sieve carrying the largest number of deviations in each layer was identified. The No.8 sieve (2.36 mm), No.50 sieve (0.3 mm), and 3/8'' sieve (9.5 mm) had the largest number of deviations in the wearing layer, the binder layer, and the base layer respectively. After that, a mixture called Mix 1 was made; this mixture was selected from a number of completed mixtures, and it