A simulation study is used to examine the robustness of several estimators for a multiple linear regression model suffering from both multicollinearity and non-normal errors: Ordinary Least Squares (LS), Ridge Regression, Ridge Least Absolute Value (RLAV), Weighted Ridge (WRID), the MM estimator, and a robust ridge regression estimator, denoted RMM, which modifies ridge regression by incorporating the robust MM estimator. Finally, we show that RMM is the best among these estimators.
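As a rough illustration of the family of estimators being compared, the Python sketch below (our own assumption, not the authors' code) computes the ordinary ridge estimator (X'X + kI)^(-1)X'y and a crude robust-ridge variant in which the biasing parameter k is built from a Huber M fit rather than from OLS; the paper's RMM estimator uses an MM fit, which is not reproduced here.

```python
import numpy as np
import statsmodels.api as sm

def ridge(X, y, k):
    """Ordinary ridge estimator: (X'X + kI)^{-1} X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

def robust_ridge(X, y):
    """Hypothetical robust-ridge sketch: plug a robust (Huber M) fit into the
    Hoerl-Kennard-Baldwin biasing parameter k = p*sigma^2 / (b'b).
    This only stands in for the RMM idea; the paper's construction may differ."""
    rfit = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()
    b, sigma = rfit.params, rfit.scale
    k = X.shape[1] * sigma**2 / (b @ b)
    return ridge(X, y, k)

# toy data with collinear predictors and heavy-tailed errors
rng = np.random.default_rng(0)
z = rng.normal(size=100)
X = np.column_stack([z + 0.01 * rng.normal(size=100),
                     z + 0.01 * rng.normal(size=100)])
y = X @ np.array([1.0, 2.0]) + rng.standard_t(df=2, size=100)
print(ridge(X, y, 0.0), robust_ridge(X, y))
```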
The multiple linear regression model is an important regression model that has attracted many researchers in different fields, including applied mathematics, business, medicine, and the social sciences. Linear regression models involving a large number of independent variables perform poorly because of large variance and can lead to inaccurate conclusions. One of the most important problems in regression analysis is multicollinearity, whose effects on the multiple linear regression model are well known to many researchers. In addition to multicollinearity, the presence of outliers in the data is one of the difficulties in constructing the regression model.
This research studies the linear regression model when the problem of autocorrelation of the random errors arises under a normal distribution. Linear regression analysis is used to model the relationship between variables, and through this relationship the value of one variable can be predicted from the values of the others. Four methods were compared (the least squares method, the unweighted averages method, Theil's method, and the Laplace method) using the mean square error (MSE) criterion and simulation, with four sample sizes (15, 30, 60, 100). The results showed that the least squares method is best. The four methods were then applied to data on buckwheat production and cultivated area for the provinces of Iraq for the years 2010, 2011, and 2012.
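A minimal sketch of the kind of Monte Carlo comparison described, assuming a simple regression with AR(1) errors and covering only two of the four methods (least squares and Theil's slope estimator), since the unweighted-averages and Laplace variants are not standard library routines:

```python
import numpy as np
from scipy.stats import theilslopes

rng = np.random.default_rng(1)
beta0, beta1 = 2.0, 0.5          # true parameters (illustrative choices)
rho = 0.7                        # assumed AR(1) autocorrelation of the errors

def simulate(n):
    x = rng.uniform(0, 10, n)
    e = rng.normal(size=n)
    u = np.zeros(n)
    for t in range(1, n):        # AR(1) error process u_t = rho*u_{t-1} + e_t
        u[t] = rho * u[t - 1] + e[t]
    return x, beta0 + beta1 * x + u

for n in (15, 30, 60, 100):      # the four sample sizes used in the study
    mse_ols, mse_theil, reps = 0.0, 0.0, 1000
    for _ in range(reps):
        x, y = simulate(n)
        b1_ols = np.polyfit(x, y, 1)[0]          # OLS slope
        b1_theil = theilslopes(y, x)[0]          # Theil's median-of-slopes estimator
        mse_ols += (b1_ols - beta1) ** 2 / reps
        mse_theil += (b1_theil - beta1) ** 2 / reps
    print(n, round(mse_ols, 4), round(mse_theil, 4))
```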
In this study, we compare the autoregressive approximation methods (Yule-Walker equations, Least Squares, forward-backward Least Squares, and Burg's geometric and harmonic methods) to determine the optimal approximation to time series generated from a first-order non-invertible moving average process and from a fractionally integrated noise process, with several values of d (d = 0.15, 0.25, 0.35, 0.45) and different sample sizes (small, medium, large) for the two processes. We rely on the figure of merit function proposed by Shibata in 1980 to determine the theoretical optimal order according to its minimum.
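For two of the named methods, the statsmodels routines yule_walker and burg can be applied directly to a simulated non-invertible MA(1) series; the forward-backward least squares and the geometric/harmonic Burg variants are not shown. The series length and AR order below are illustrative choices, not the paper's design.

```python
import numpy as np
from statsmodels.regression.linear_model import yule_walker, burg

# Simulate an MA(1) series x_t = e_t + theta*e_{t-1}; theta = 1 makes it non-invertible.
rng = np.random.default_rng(2)
theta, n = 1.0, 500
e = rng.normal(size=n + 1)
x = e[1:] + theta * e[:-1]

p = 5                                    # order of the AR approximation (assumed)
rho_yw, sigma_yw = yule_walker(x, order=p, method="mle")
rho_burg, sigma2_burg = burg(x, order=p, demean=True)
print("Yule-Walker AR coefficients:", np.round(rho_yw, 3))
print("Burg AR coefficients:       ", np.round(rho_burg, 3))
```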
In this paper we estimate the coefficients and the scale parameter of a linear regression model whose residuals follow the type 1 extreme value distribution for largest values. This can be regarded as an improvement on studies based on smallest values. We study two estimation methods (OLS and MLE), resorting to the Newton-Raphson (NR) and Fisher scoring methods to obtain the MLE because of the difficulty of using the usual approach. The relative efficiency criterion is considered alongside statistical inference procedures for the type 1 extreme value regression model for largest values, including confidence intervals and hypothesis tests for both the scale parameter and the regression coefficients.
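The likelihood being maximised can be sketched as follows, assuming the type 1 (Gumbel) largest-value distribution with location Xβ and scale σ; scipy's BFGS quasi-Newton optimiser stands in here for the Newton-Raphson and Fisher scoring iterations used in the paper.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
n = 200
X = np.column_stack([np.ones(n), rng.uniform(0, 5, n)])
beta_true, sigma_true = np.array([1.0, 0.8]), 1.5      # illustrative values
y = X @ beta_true + sigma_true * rng.gumbel(size=n)    # largest-value (Gumbel) errors

def negloglik(params):
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)                          # keep sigma positive
    z = (y - X @ beta) / sigma
    # Gumbel (type 1, largest values) log-density: -log(sigma) - z - exp(-z)
    return -np.sum(-np.log(sigma) - z - np.exp(-z))

start = np.append(np.linalg.lstsq(X, y, rcond=None)[0], 0.0)   # OLS start, log(sigma)=0
res = minimize(negloglik, start, method="BFGS")
beta_hat, sigma_hat = res.x[:-1], np.exp(res.x[-1])
print(beta_hat, sigma_hat)
```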
The objective of the study is to determine which has the better predictive ability, the logistic regression model or the linear discriminant function, using the original data first and then principal components to reduce the dimensionality of the variables. The data come from the 2012 socio-economic family survey of the province of Baghdad and comprise a sample of 615 observations with 13 variables, 12 of them explanatory, and the dependent variable is the number of workers and the unemployed.
A comparison of the two methods was conducted, and it became clear that the logistic regression model outperforms the linear discriminant function.
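A hedged sketch of such a comparison on synthetic data (the survey data are not reproduced), using scikit-learn's LogisticRegression and LinearDiscriminantAnalysis with and without a PCA step:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for the survey: 615 observations, 12 explanatory
# variables, binary response.
X, y = make_classification(n_samples=615, n_features=12, n_informative=6, random_state=0)

models = {
    "logistic": LogisticRegression(max_iter=1000),
    "LDA": LinearDiscriminantAnalysis(),
    "PCA + logistic": make_pipeline(PCA(n_components=5), LogisticRegression(max_iter=1000)),
    "PCA + LDA": make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis()),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()   # classification accuracy
    print(f"{name:15s} {acc:.3f}")
```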
In the analysis of multiple linear regression, the problems of multicollinearity and autocorrelation have drawn the attention of many researchers, and since these two problems can appear together and harm estimation, some researchers have developed new methods that address both at the same time. In this research, the performance of the Principal Components Two-Parameter (PCTP) estimator, the (r-k) class estimator, and the r-(k,d) class estimator is compared through a simulation study under the mean square error (MSE) criterion to find the best way to address the two problems together. The results showed that the r-(k,d) class estimator is the best estimator.
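As a sketch of the kind of estimator involved, the (r-k) class estimator can be written as T_r(Λ_r + kI)^(-1)T_r'X'y, combining principal components regression on the first r components with a ridge penalty k; the PCTP and r-(k,d) estimators add further biasing parameters that are not shown in this simplified example.

```python
import numpy as np

def rk_class(X, y, r, k):
    """(r-k) class estimator sketch: principal components regression on the
    first r components combined with a ridge penalty k,
    T_r (Lambda_r + k I)^{-1} T_r' X' y."""
    eigval, T = np.linalg.eigh(X.T @ X)
    idx = np.argsort(eigval)[::-1][:r]        # keep the r largest eigenvalues
    Tr, lam = T[:, idx], eigval[idx]
    return Tr @ ((Tr.T @ X.T @ y) / (lam + k))

# toy collinear design
rng = np.random.default_rng(4)
z = rng.normal(size=80)
X = np.column_stack([z, z + 0.05 * rng.normal(size=80), rng.normal(size=80)])
y = X @ np.array([1.0, 1.0, 0.5]) + rng.normal(size=80)
print(rk_class(X, y, r=2, k=0.1))
```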
Binary logistic regression and the linear discriminant function are among the most important statistical methods used for classification and prediction when the response data are binary (0,1), since ordinary regression cannot be used in this case. With two groups, however, when a multicollinearity problem exists among the data (that is, the data contain highly correlated variables), binary logistic regression and the linear discriminant function can no longer be used reliably, and to solve this problem we resort to partial least squares regression.
In this research, a comparison is carried out between binary logistic regression and the alternative methods described above.
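A small illustration, on assumed toy data, of fitting partial least squares regression to a binary response with strongly collinear predictors, alongside an ordinary logistic fit for comparison:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LogisticRegression

# Toy binary-response data with two nearly identical (collinear) predictors.
rng = np.random.default_rng(5)
z = rng.normal(size=300)
X = np.column_stack([z, z + 0.01 * rng.normal(size=300), rng.normal(size=300)])
p = 1 / (1 + np.exp(-(0.5 + 1.5 * z)))
y = rng.binomial(1, p)

# PLS extracts a few orthogonal components from the collinear X before regressing.
pls = PLSRegression(n_components=2).fit(X, y)
y_score = pls.predict(X).ravel()                 # continuous PLS prediction
print("PLS classification accuracy:", ((y_score > 0.5) == y).mean())

# Ordinary logistic regression on the same data, for comparison.
logit = LogisticRegression(max_iter=1000).fit(X, y)
print("Logistic accuracy:", logit.score(X, y))
```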
The logistic regression model is regarded as one of the important regression models and has been among the most interesting subjects of recent studies because of its increasingly advanced role in statistical analysis.
Ordinary estimation methods fail when dealing with data that contain outlier values, which have an undesirable effect on the results.
Nonlinear regression models are important tools, but traditional techniques can fail to reach satisfactory solutions to their parameter estimation problem. Hence, in this paper the Bat algorithm is used to estimate the parameters of nonlinear regression models. A simulation study is conducted to compare the performance of the proposed algorithm with the maximum likelihood (MLE) and least squares (LS) methods. The results, based on the mean square error, show that the Bat algorithm provides accurate estimates and is more satisfactory for parameter estimation of nonlinear regression models than the MLE and LS methods.
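A minimal, simplified sketch of the Bat algorithm (Yang, 2010) minimising the residual sum of squares of an assumed exponential regression model; the loudness and pulse-rate updates are reduced to constants, and this is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(6)

# Assumed example model: y = a * exp(b * x) + noise.
x = np.linspace(0, 4, 60)
a_true, b_true = 2.0, 0.7
y = a_true * np.exp(b_true * x) + rng.normal(scale=0.5, size=x.size)

def sse(theta):
    """Objective: residual sum of squares of the nonlinear model."""
    a, b = theta
    return np.sum((y - a * np.exp(b * x)) ** 2)

def bat_algorithm(obj, lb, ub, n_bats=30, n_iter=500):
    """Minimal Bat algorithm sketch with fixed loudness and pulse rate."""
    dim = len(lb)
    pos = rng.uniform(lb, ub, size=(n_bats, dim))
    vel = np.zeros((n_bats, dim))
    fit = np.array([obj(p) for p in pos])
    best = pos[fit.argmin()].copy()
    loudness, pulse = 0.9, 0.5
    for _ in range(n_iter):
        freq = rng.uniform(0, 2, size=(n_bats, 1))          # pulse frequencies
        vel += (pos - best) * freq
        cand = np.clip(pos + vel, lb, ub)
        # local random walk around the current best for some bats
        walk = rng.random(n_bats) > pulse
        cand[walk] = np.clip(best + 0.01 * loudness * rng.normal(size=(walk.sum(), dim)), lb, ub)
        cand_fit = np.array([obj(p) for p in cand])
        accept = (cand_fit < fit) & (rng.random(n_bats) < loudness)
        pos[accept], fit[accept] = cand[accept], cand_fit[accept]
        if fit.min() < obj(best):
            best = pos[fit.argmin()].copy()
    return best

print(bat_algorithm(sse, lb=np.array([0.0, 0.0]), ub=np.array([5.0, 2.0])))
```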
In this paper, we propose a method to estimate missing values of the explanatory variables in a non-parametric multiple regression model and compare it with the arithmetic mean imputation method. The idea of the method is to employ the causal relationship between the variables to find an efficient estimate of the missing value. We rely on the kernel estimate given by the Nadaraya-Watson estimator, with least squares cross-validation (LSCV) to estimate the bandwidth, and we use a simulation study to compare the two methods.
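A minimal sketch of the idea, assuming one explanatory variable is imputed from another observed variable: a Gaussian-kernel Nadaraya-Watson regression whose bandwidth is chosen by leave-one-out least squares cross-validation. The paper's exact simulation design may differ.

```python
import numpy as np

rng = np.random.default_rng(7)

def nw_estimate(x_train, y_train, x_query, h):
    """Nadaraya-Watson estimator with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x_query[:, None] - x_train[None, :]) / h) ** 2)
    return (w @ y_train) / w.sum(axis=1)

def lscv_bandwidth(x, y, grid):
    """Least squares cross-validation: pick h minimising leave-one-out squared error."""
    best_h, best_score = None, np.inf
    for h in grid:
        err = 0.0
        for i in range(len(x)):
            mask = np.arange(len(x)) != i
            pred = nw_estimate(x[mask], y[mask], x[i:i + 1], h)[0]
            err += (y[i] - pred) ** 2
        if err < best_score:
            best_h, best_score = h, err
    return best_h

# Toy data: x2 depends smoothly on x1; some x2 values are missing.
n = 120
x1 = rng.uniform(0, 3, n)
x2 = np.sin(2 * x1) + 0.2 * rng.normal(size=n)
missing = rng.random(n) < 0.1
h = lscv_bandwidth(x1[~missing], x2[~missing], grid=np.linspace(0.05, 1.0, 20))
x2_imputed = x2.copy()
x2_imputed[missing] = nw_estimate(x1[~missing], x2[~missing], x1[missing], h)
print("chosen bandwidth:", h)
```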