Comparison between the Ridge Regression and Liu-Type Methods for Estimating the Parameters of the Negative Binomial Regression Model under Multicollinearity, Using Simulation

Multicollinearity, i.e. strong internal correlation among the explanatory variables, is one of the most common problems in regression analysis and appears especially in economics and applied research. It has a harmful effect on the regression model: when ordinary least squares (OLS) is used, the parameter estimates become unstable and their variances are inflated. Other methods were therefore used to estimate the parameters of the negative binomial model, namely the ridge regression estimator and the Liu-type estimator. The negative binomial regression model is a nonlinear model belonging to the general exponential family; it is a basic tool of count data analysis, used as an alternative to the Poisson model under overdispersion, i.e. when the variance of the response variable (Y) exceeds its mean. A Monte Carlo study was designed to compare the ridge regression estimator with the Liu-type estimator under the mean square error (MSE) criterion. The simulation results showed that the Liu-type estimator outperforms ridge regression: its MSE is lower under the third and fourth estimation formulas.
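The variance inflation described above can be seen in a small sketch. The negative binomial fit itself is not reproduced here; as a stand-in, a plain linear model contrasts OLS with the ridge estimator (X'X + kI)^(-1)X'y. The synthetic collinear data and the choice k = 1.0 are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 3
beta = np.array([1.0, 2.0, 3.0])

# Build nearly collinear predictors: column 2 is column 1 plus tiny noise.
x1 = rng.normal(size=n)
X = np.column_stack([x1, x1 + 0.01 * rng.normal(size=n), rng.normal(size=n)])
y = X @ beta + rng.normal(size=n)

# OLS: (X'X)^-1 X'y -- unstable when X'X is near-singular.
ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge: (X'X + kI)^-1 X'y -- biased, but with much lower variance.
k = 1.0
ridge = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

print("condition number of X'X:", np.linalg.cond(X.T @ X))
print("OLS  :", ols)
print("Ridge:", ridge)
```

With the two collinear columns, X'X is badly conditioned and the first two OLS coefficients swing far from (1, 2), while the ridge solution stays moderate.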

Publication Date
Thu Feb 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of Sliced Inverse Regression with Principal Components in Reducing High-Dimensional Data, Using Simulation

This research studies dimension-reduction methods that overcome the curse of dimensionality, which arises when traditional methods fail to provide good parameter estimates, so the problem must be dealt with directly. Two approaches to high-dimensional data were used: the non-classical sliced inverse regression (SIR) method together with a proposed weighted standard SIR (WSIR) method, and principal component analysis (PCA), the standard method for reducing dimensions. Both SIR and PCA are based on linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear multicollinearity …
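Of the two reduction routes compared above, PCA is the easier to sketch; the following is a minimal SVD-based version (the SIR/WSIR side, which slices on the response, is not reproduced). The two-factor synthetic data and the 95% variance threshold are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 10

# High-dimensional X whose variation lives mostly in 2 latent directions.
Z = rng.normal(size=(n, 2))
W = rng.normal(size=(2, p))
X = Z @ W + 0.1 * rng.normal(size=(n, p))

# PCA by SVD of the centered data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)          # variance share per component

# Keep enough components to cover 95% of the variance.
k = int(np.searchsorted(np.cumsum(explained), 0.95)) + 1
scores = Xc @ Vt[:k].T                   # reduced representation, n x k
print("components kept:", k)
```

Because the data were built from two latent factors plus small noise, only a couple of components are needed to pass the threshold, illustrating the dimension reduction the abstract refers to.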

Publication Date
Sun Aug 30 2020
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of the performance of some r-(k,d) class estimators with the (PCTP) estimator in estimating the general linear regression model in the presence of autocorrelation and multicollinearity problems at the same time

In multiple linear regression analysis, the problems of multicollinearity and autocorrelation have drawn the attention of many researchers; since the two problems can appear together, with a harmful joint effect on estimation, researchers have developed methods that address both at the same time. This research compares the performance of the principal components two-parameter (PCTP) estimator, the (r-k) class estimator, and the r-(k,d) class estimator through a simulation study, using the mean square error (MSE) criterion to find the best way to handle the two problems together. The results showed that the r-(k,d) class estimator is the best estimator.

Publication Date
Tue Mar 01 2011
Journal Name
Journal Of Economics And Administrative Sciences
Estimate the Nonparametric Regression Function Using Canonical Kernel

This research reviews the importance of estimating the nonparametric regression function using the so-called canonical kernel, which rescales the smoothing parameter; the smoothing parameter has a large and important role in kernel estimation, and the canonical form gives a sound amount of smoothing.

The importance of the method is shown by applying these concepts to real data on the exchange rate of the U.S. dollar against the Japanese yen for the period from January 2007 to March 2010. The results demonstrated the preference of the Gaussian-kernel nonparametric estimator over the other nonparametric and parametric regression estimators.
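The canonical-kernel rescaling itself is not reconstructed here, but the underlying object, a kernel regression estimate whose quality hinges on the smoothing parameter h, can be sketched with a plain Nadaraya-Watson estimator and a Gaussian kernel. The bandwidth h = 0.3 and the sine test function are illustrative assumptions, not the paper's exchange-rate data.

```python
import numpy as np

def nadaraya_watson(x0, x, y, h):
    """Gaussian-kernel regression estimate of m(x0) with bandwidth h."""
    w = np.exp(-0.5 * ((x0 - x) / h) ** 2)   # kernel weights around x0
    return np.sum(w * y) / np.sum(w)

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 2 * np.pi, 200))
y = np.sin(x) + 0.2 * rng.normal(size=200)   # noisy observations of sin(x)

# Evaluate the smoother on an interior grid (away from boundary bias).
grid = np.linspace(0.5, 2 * np.pi - 0.5, 50)
fit = np.array([nadaraya_watson(g, x, y, h=0.3) for g in grid])
```

Making h too small reproduces the noise and making it too large flattens the curve; the canonical rescaling discussed in the abstract is a way of putting different kernels on a comparable bandwidth scale.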

Publication Date
Wed Feb 10 2016
Journal Name
The Fifth International Scientific Conference of Arab Statisticians / Cairo
Proposition of a Modified Genetic Algorithm to Estimate the Additive Model, Using Simulation

Phenomena often suffer from disturbances in their data and from difficulty of model formulation, especially when the response is unclear or when large essential differences plague the experimental units from which the data were taken. Hence the need arose for an estimation method with an implicit rating of these experimental units, using discrimination or by forming blocks for each group of units, in the hope of controlling their responses and making them more homogeneous. With the development of computing, and on the principle of integrating the sciences, modern algorithms from computer science have proved useful, such as the genetic algorithm and ant colony …
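The paper's modified genetic algorithm is not specified in this excerpt; as a generic illustration of the idea only, here is a toy real-coded GA (truncation selection, uniform crossover, Gaussian mutation) minimizing a simple quadratic. Every operator choice and constant below is an assumption.

```python
import numpy as np

def genetic_minimize(f, dim, pop=40, gens=100, sigma=0.3, seed=3):
    """Toy real-coded GA: keep the best half, recombine by uniform
    crossover, perturb by Gaussian mutation; return the best individual."""
    rng = np.random.default_rng(seed)
    P = rng.normal(size=(pop, dim))                  # initial population
    for _ in range(gens):
        fit = np.array([f(ind) for ind in P])
        elite = P[np.argsort(fit)[: pop // 2]]       # truncation selection
        mates = elite[rng.integers(0, len(elite), size=(pop, 2))]
        mask = rng.random((pop, dim)) < 0.5          # uniform crossover
        P = np.where(mask, mates[:, 0], mates[:, 1])
        P = P + sigma * rng.normal(size=P.shape)     # Gaussian mutation
    fit = np.array([f(ind) for ind in P])
    return P[np.argmin(fit)]

# Minimize sum((v - 1.5)^2): the population should converge near 1.5.
best = genetic_minimize(lambda v: np.sum((v - 1.5) ** 2), dim=4)
print("best individual:", best)
```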

Publication Date
Thu Dec 01 2011
Journal Name
Journal Of Economics And Administrative Sciences
The Dynamic Robust Bootstrap for LTS (DRBLTS) and Weighted Bootstrap with Probability (WBP) algorithms to estimate robust regression parameters using the bootstrap technique (a comparative study)

The bootstrap is an important resampling technique that has received much attention from researchers recently. The presence of outliers in the original data set may cause serious problems for the classical bootstrap, since a resample's percentage of outliers may be higher than the original one. Many methods have been proposed to overcome this problem, such as the Dynamic Robust Bootstrap for LTS (DRBLTS) and Weighted Bootstrap with Probability (WBP). This paper examines the accuracy of parameter estimation by comparing the results of both methods, considering the bias, MSE, and RMSE. The accuracy criterion is based on the RMSE value: the method that yields the smaller RMSE is considered the more accurate.
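DRBLTS and WBP are robust refinements that the excerpt does not detail; the baseline they refine, the classical nonparametric bootstrap, can be sketched as follows, here estimating the standard error of a sample mean. Sample sizes, seeds, and the choice of statistic are arbitrary.

```python
import numpy as np

def bootstrap_se(data, stat, B=2000, seed=4):
    """Nonparametric bootstrap standard error of a statistic:
    resample with replacement B times and take the spread of stat."""
    rng = np.random.default_rng(seed)
    n = len(data)
    reps = np.array([stat(data[rng.integers(0, n, n)]) for _ in range(B)])
    return reps.std(ddof=1)

rng = np.random.default_rng(5)
x = rng.normal(loc=10, scale=2, size=100)
se_mean = bootstrap_se(x, np.mean)
# For the mean, theory gives SE close to s / sqrt(n), a useful sanity check.
print("bootstrap SE of the mean:", se_mean)
```

The robustness issue the abstract raises shows up exactly here: each resample draws observations with replacement, so an outlier can appear more often than it did in the original sample.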

Publication Date
Fri Feb 01 2019
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of Some Methods for Estimating the Scheffé Model of the Mixture

Mixture experiments suffer from high correlation and linear multicollinearity among the explanatory variables, because of the unit-sum constraint and the interaction terms in the model, which strengthen the links between the explanatory variables; this is reflected in the variance inflation factor (VIF). The L-pseudo-component transformation is used to reduce the correlation between the components of the mixture.

To estimate the parameters of the mixture model, we used methods that introduce bias in order to reduce variance, namely the ridge regression method and the least absolute shrinkage and selection operator (LASSO) method.
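The VIF mentioned above has a simple definition, VIF_j = 1/(1 - R²_j), where R²_j comes from regressing the j-th explanatory variable on the others, and can be computed directly. The collinear test data below is an illustrative assumption, not mixture data.

```python
import numpy as np

def vif(X):
    """Variance inflation factor of each column: VIF_j = 1/(1 - R^2_j),
    with R^2_j from an intercept regression of x_j on the other columns."""
    X = np.asarray(X, dtype=float)
    out = []
    for j in range(X.shape[1]):
        y = X[:, j]
        Z = np.column_stack([np.ones(len(X)), np.delete(X, j, axis=1)])
        coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ coef
        r2 = 1 - resid.var() / y.var()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(6)
a = rng.normal(size=300)
# Columns 0 and 1 are nearly identical; column 2 is independent.
X = np.column_stack([a, a + 0.05 * rng.normal(size=300), rng.normal(size=300)])
print("VIF:", vif(X))
```

A common rule of thumb flags VIF values above 10 as evidence of serious multicollinearity; the collinear pair here lands far above that, while the independent column stays near 1.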

Publication Date
Sat Dec 31 2022
Journal Name
Journal Of Economics And Administrative Sciences
Using Some Estimation Methods for Mixed-Random Panel Data Regression Models with Serially Correlated Errors, with an Application

This research studies panel data models with mixed random parameters, which contain two types of parameters, one random and the other fixed. The random parameter arises from differences in the marginal slopes of the cross-sections, and the fixed parameter from differences in the fixed intercepts; the random errors of each cross-section exhibit heteroscedasticity in addition to first-order serial correlation. The main objective of this research is to use efficient methods suited to panel data with small samples; to achieve this goal, the feasible generalized least squares (FGLS) method was used …

Publication Date
Mon May 11 2020
Journal Name
Baghdad Science Journal
Proposing a Robust LAD-Atan Penalty for Regression Model Estimation with High-Dimensional Data

Penalized regression models have received considerable attention for variable selection, playing an essential role in dealing with high-dimensional data. The arctangent (Atan) penalty has recently been used as an efficient method for both estimation and variable selection. However, the Atan penalty is very sensitive to outliers in the response variable and to heavy-tailed error distributions, while least absolute deviation (LAD) is a good way to obtain robustness in regression estimation. The specific objective of this research is to propose a robust Atan estimator by combining these two ideas at once. Simulation experiments and real-data applications show that the proposed estimator …
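The Atan-penalized fit is beyond a short sketch, but the LAD ingredient, and why it buys robustness, can be illustrated with an unpenalized LAD regression, approximated here by iteratively reweighted least squares (IRLS). The data, the IRLS approximation, and the epsilon guard are all assumptions for illustration.

```python
import numpy as np

def lad_fit(X, y, iters=100, eps=1e-6):
    """Least absolute deviations via IRLS: reweight each observation by
    1/|residual|, so gross outliers get vanishing influence."""
    Z = np.column_stack([np.ones(len(y)), X])
    beta = np.linalg.lstsq(Z, y, rcond=None)[0]      # start from OLS
    for _ in range(iters):
        r = np.abs(y - Z @ beta)
        w = 1.0 / np.maximum(r, eps)                 # eps guards division
        beta = np.linalg.solve(Z.T @ (Z * w[:, None]), Z.T @ (w * y))
    return beta

# A perfect line y = 1 + 2x, corrupted by one gross outlier.
x = np.arange(20, dtype=float)
y = 1.0 + 2.0 * x
y[0] += 100.0
ols = np.linalg.lstsq(np.column_stack([np.ones(20), x]), y, rcond=None)[0]
lad = lad_fit(x, y)
print("OLS slope:", ols[1], " LAD slope:", lad[1])
```

The single outlier drags the OLS slope far from 2, while the LAD fit essentially recovers the true line, which is the robustness property the abstract wants to combine with the Atan penalty.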

Publication Date
Thu Sep 01 2011
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of Bayesian methods with other methods for estimating the scale parameter of the Weibull distribution, using simulation

The Weibull distribution is one of the most widely applied distributions in real life. It resembles the normal distribution in the breadth of its applications: it can be applied in many fields, such as industrial engineering (to represent replacement and manufacturing times) and weather forecasting, and it has other scientific uses in reliability studies and survival analysis in the medical and communication engineering fields.

In this paper, the scale parameter of the Weibull distribution is estimated by a Bayesian method based on the Jeffreys prior as a first method, which is then enhanced by improving the Jeffreys prior and used as a second method …
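The Bayesian estimator above can be sketched for the textbook special case where the shape parameter c is known: then t = x^c is exponentially distributed, the MLE of the scale follows from mean(t), and the Jeffreys prior 1/theta on theta = lambda^c gives posterior mean sum(t)/(n-1) under squared-error loss. The paper's improved-prior variant is not reproduced; shape, scale, and sample size below are arbitrary.

```python
import numpy as np

def weibull_scale_estimates(x, c):
    """Estimate the Weibull scale lam (shape c assumed known).
    With t_i = x_i**c ~ Exponential(mean lam**c): the MLE of lam**c is
    mean(t); under the Jeffreys prior 1/theta, the posterior mean of
    theta = lam**c is sum(t)/(n-1)."""
    t = x ** c
    mle = np.mean(t) ** (1 / c)
    bayes = (np.sum(t) / (len(t) - 1)) ** (1 / c)
    return mle, bayes

rng = np.random.default_rng(7)
c, lam = 2.0, 3.0
x = lam * rng.weibull(c, size=5000)   # numpy's weibull draws have scale 1
mle, bayes = weibull_scale_estimates(x, c)
print("MLE:", mle, " Jeffreys-prior Bayes:", bayes)
```

With a large sample the two estimators nearly coincide; the Bayesian correction matters for the small samples that simulation studies like this one typically probe.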

Publication Date
Fri Aug 01 2014
Journal Name
Journal Of Economics And Administrative Sciences
Comparison between the Local Polynomial Kernel and the Penalized Spline in Estimating the Varying Coefficient Model

Analyzing economic, financial, and other phenomena requires building an appropriate model that represents the causal relations between the factors. Building the model consists of casting the surrounding conditions and factors into a mathematical formula, and researchers aim to build that formula appropriately. Classical linear regression models are an important statistical tool but are used in a limited way, since they assume that the form of the relationship between the explanatory variables and the response variable is known. To widen the representation of the relationships among the variables of the phenomenon under study, varying coefficient models were used.
