Estimate Kernel Ridge Regression Function in Multiple Regression

In general, researchers, and statisticians in particular, usually resort to non-parametric regression models when parametric methods fail to analyze the model precisely. In such cases the parametric methods are of little use, and non-parametric methods are preferred for their ease of programming. Non-parametric methods can also be used to suggest a parametric regression model for subsequent use. A further advantage of non-parametric methods is that they can address multicollinearity among the explanatory variables combined with nonlinear data. This problem can be solved by kernel ridge regression, which depends on the so-called bandwidth (smoothing parameter). For this purpose, two different methods were used to estimate the smoothing parameter: Maximum Likelihood Cross-Validation (MLCV) and the Akaike Information Criterion (AIC). A comparison between these methods was carried out by simulation, and the Akaike Information Criterion (AIC) was found to be the best for the Gaussian kernel function.
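
As a rough illustration of the setting described above (not the paper's implementation), the sketch below fits a kernel ridge regression with a Gaussian kernel and scores a small grid of bandwidths with an AIC-type criterion for linear smoothers; the kernel form, the ridge penalty, the bandwidth grid, and the exact AIC formula are all assumptions.

```python
import numpy as np

def gaussian_kernel(X1, X2, h):
    """Gaussian (RBF) kernel matrix with bandwidth h."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * h ** 2))

def kernel_ridge_aic(X, y, h, lam=1e-3):
    """Fit Gaussian kernel ridge regression and return an AIC-type score
    for linear smoothers: n*log(RSS/n) + 2*tr(H). Illustrative form only;
    the paper's exact AIC and MLCV criteria may differ."""
    n = len(y)
    K = gaussian_kernel(X, X, h)
    H = K @ np.linalg.inv(K + lam * np.eye(n))   # smoother ("hat") matrix
    fitted = H @ y
    rss = np.sum((y - fitted) ** 2)
    return n * np.log(rss / n) + 2.0 * np.trace(H)

# Hypothetical data; pick the bandwidth with the smallest AIC on a grid.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.2, 100)
scores = {h: kernel_ridge_aic(X, y, h) for h in (0.2, 0.5, 1.0, 2.0)}
print("selected bandwidth:", min(scores, key=scores.get))
```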

 

Publication Date
Sun Dec 01 2013
Journal Name
Journal Of Economics And Administrative Sciences
CALCULATION BIASES FOR COEFFICIENTS AND SCALE PARAMETER FOR LINEAR (TYPE 1) EXTREME VALUE REGRESSION MODEL FOR LARGEST VALUES

Abstract

Ordinary Least Squares (OLS) has the advantage over Maximum Likelihood (ML) that the exact moments of its estimators are known and can be found, whereas for ML they are unknown; nevertheless, approximations to the ML biases, correct to O(n⁻¹), can be obtained by standard methods. In this research, expressions approximating the biases of the ML estimators (the regression coefficients and the scale parameter) of the linear (type 1) Extreme Value Regression Model for Largest Values are presented, using an approach that depends on finding the first, second, and third derivatives.
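
For orientation, the generic form of such an O(n⁻¹) bias approximation is recalled below; the paper's own expressions for the extreme value regression model are not reproduced here, and the Cox–Snell route is mentioned only as the standard example of this kind of derivative-based correction.

```latex
% Generic first-order bias expansion for an ML estimator \hat{\theta}
% based on a sample of size n:
\[
  \operatorname{Bias}(\hat{\theta})
  \;=\; \mathrm{E}(\hat{\theta}) - \theta
  \;=\; \frac{b(\theta)}{n} + O(n^{-2}),
\]
% where the O(n^{-1}) term b(theta)/n is obtained (e.g. by the Cox-Snell
% approach) from expectations of products of first, second and third
% derivatives of the log-likelihood. A bias-corrected estimator is then
% \tilde{\theta} = \hat{\theta} - b(\hat{\theta})/n.
```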

Publication Date
Thu Dec 01 2011
Journal Name
Journal Of Economics And Administrative Sciences
Dynamic Robust Bootstrap for LTS (DRBLTS) and Weighted Bootstrap with Probability (WBP) to estimate robust regression parameters using the Bootstrap technique (a comparative study)

The bootstrap is an important re-sampling technique that has received increasing attention from researchers recently. The presence of outliers in the original data set may cause serious problems for the classical bootstrap when the percentage of outliers in a resample is higher than in the original data. Many methods have been proposed to overcome this problem, such as the Dynamic Robust Bootstrap for LTS (DRBLTS) and the Weighted Bootstrap with Probability (WBP). This paper tries to show the accuracy of parameter estimation by comparing the results of both methods. The bias, MSE, and RMSE are considered. The criterion of accuracy is based on the RMSE value, since the method that provides the smaller RMSE value is considered the more accurate.
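
The comparison criterion itself is generic enough to sketch: given replicated estimates of a parameter from any bootstrap scheme, bias, MSE, and RMSE can be computed as below. This is an illustrative skeleton only; the DRBLTS and WBP resampling schemes are not implemented here, and the data are hypothetical.

```python
import numpy as np

def bootstrap_accuracy(estimates, true_value):
    """Bias, MSE and RMSE of replicated estimates of one parameter
    relative to its true value."""
    estimates = np.asarray(estimates, dtype=float)
    bias = estimates.mean() - true_value
    mse = np.mean((estimates - true_value) ** 2)
    return bias, mse, np.sqrt(mse)

# Hypothetical example: two sets of replicated slope estimates.
rng = np.random.default_rng(0)
true_beta = 2.0
est_a = true_beta + rng.normal(0.00, 0.15, size=1000)   # stand-in for DRBLTS output
est_b = true_beta + rng.normal(0.05, 0.20, size=1000)   # stand-in for WBP output

for name, est in [("method A", est_a), ("method B", est_b)]:
    bias, mse, rmse = bootstrap_accuracy(est, true_beta)
    print(f"{name}: bias={bias:.4f}, MSE={mse:.4f}, RMSE={rmse:.4f}")
```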

Publication Date
Thu Jun 02 2011
Journal Name
Ibn Al-haithem Journal For Pure And Applied Sciences
On modified pre-test double stage shrinkage estimators for estimating the parameters of a simple linear regression model

Publication Date
Wed Jun 30 2021
Journal Name
Journal Of Economics And Administrative Sciences
A comparison of Bennett's inequality and regression in determining the optimum sample size for estimating the Net Reclassification Index (NRI) using simulation

Researchers have shown increasing interest in recent years in determining the optimum sample size that yields sufficient accuracy of estimation and high-precision parameters, in order to evaluate a large number of tests in the field of diagnosis at the same time. In this research, two methods were used to determine the optimum sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is then estimated with the sample size produced by each method, using an artificial neural network (ANN), which gives a high-precision estimate commensurate with the data.
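
For reference, the inequality in question is stated below in one common form; how the paper converts the bound into a sample-size rule is not reproduced here, so the closing comment is only a sketch of the usual reasoning.

```latex
% Bennett's inequality for independent, zero-mean X_1,...,X_n with |X_i| <= a
% and total variance sigma^2 = sum_i Var(X_i); S_n = X_1 + ... + X_n.
\[
  \Pr\!\left(S_n \ge t\right)
  \;\le\;
  \exp\!\left( -\frac{\sigma^2}{a^2}\, h\!\left(\frac{a t}{\sigma^2}\right) \right),
  \qquad h(u) = (1+u)\log(1+u) - u .
\]
% Fixing a tolerance t = n*epsilon and a confidence level 1 - delta and solving
% the bound for n gives a (conservative) minimum sample size.
```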

Publication Date
Tue Jun 30 2020
Journal Name
Journal Of Economics And Administrative Sciences
Using The Maximum Likelihood And Bayesian Methods To Estimate The Time-Rate Function Of Earthquake Phenomenon

In this research we deal with the Non-Homogeneous Poisson process, one of the important statistical topics that plays a role in scientific development, as it relates to events that occur in reality and are modeled as Poisson processes, because the occurrence of these events is related to time, whether time changes or remains stable. Our research clarifies the Non-Homogeneous Poisson process and uses one of its models, an exponentiated Weibull model containing three parameters (α, β, σ), as a function to estimate the time rate of occurrence of earthquakes in Erbil Governorate, as the governorate is adjacent to two countries…
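
For orientation, the counting distribution of a non-homogeneous Poisson process is recalled below; the exponentiated-Weibull parameterisation of the intensity used in the paper is not reproduced here.

```latex
% Non-homogeneous Poisson process with intensity lambda(t) >= 0 and
% mean (cumulative intensity) function Lambda(t) = int_0^t lambda(s) ds:
\[
  \Pr\big(N(t) = k\big) = \frac{\Lambda(t)^{k}\, e^{-\Lambda(t)}}{k!},
  \qquad k = 0, 1, 2, \dots
\]
% The time rate of occurrence referred to in the abstract is the intensity
% lambda(t), modelled there through an exponentiated-Weibull form with
% parameters (alpha, beta, sigma).
```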

Publication Date
Tue Mar 01 2022
Journal Name
International Journal Of Nonlinear Analysis And Applications
Semi-parametric regression function estimation for environmental pollution with measurement error using artificial flower pollination algorithm

Artificial intelligence algorithms have been used in recent years in many scientific fields. We suggest employing the flower pollination algorithm in the environmental field to find the best estimate of the semi-parametric regression function with measurement errors in the explanatory variables and the dependent variable, since measurement errors, rather than exact measurements, appear frequently in fields such as chemistry, the biological sciences, medicine, and epidemiological studies. We estimate the regression function of the semi-parametric model by estimating its parametric and non-parametric parts; the parametric part is estimated using instrumental-variables methods (Wald's method, Bartlett's method, and Durbin's method)…
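
Of the instrumental-variables methods named, Wald's grouping estimator is simple enough to sketch; the version below is a generic illustration with hypothetical data, not the paper's implementation, and Bartlett's and Durbin's variants follow the same pattern with different groupings or instruments.

```python
import numpy as np

def wald_grouping_slope(x, y):
    """Wald-type grouping estimator for a simple errors-in-variables
    regression: split the observations into a lower and an upper half by
    the observed x and take the ratio of group-mean differences.
    Illustrative sketch only."""
    order = np.argsort(x)
    x, y = np.asarray(x)[order], np.asarray(y)[order]
    half = len(x) // 2
    slope = (y[half:].mean() - y[:half].mean()) / (x[half:].mean() - x[:half].mean())
    intercept = y.mean() - slope * x.mean()
    return intercept, slope

# Hypothetical data with measurement error in the regressor.
rng = np.random.default_rng(1)
x_true = rng.uniform(0, 10, 200)
x_obs = x_true + rng.normal(0, 1.0, 200)      # error-prone explanatory variable
y = 1.0 + 2.0 * x_true + rng.normal(0, 1.0, 200)
print(wald_grouping_slope(x_obs, y))
```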

Publication Date
Sun Mar 01 2009
Journal Name
Journal Of Economics And Administrative Sciences
Using the singular value decomposition method in estimating the ridge parameter

In this paper the singular value decomposition method is used to estimate the ridge parameter of the ridge regression estimator, which is an alternative to the ordinary least squares estimator when the general linear regression model suffers from near multicollinearity.
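
A minimal sketch of the idea, computing the ridge estimator from the SVD of the design matrix and, as an assumption, using a Hoerl-Kennard-style rule for the ridge parameter; the paper's own rule may differ.

```python
import numpy as np

def ridge_via_svd(X, y, k=None):
    """Ridge regression from the SVD of X. If k is not given, a
    Hoerl-Kennard-style choice k = p * sigma^2 / ||beta_ols||^2 is used
    for illustration."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    uty = U.T @ y
    beta_ols = Vt.T @ (uty / s)                     # OLS via the SVD
    if k is None:
        n, p = X.shape
        resid = y - X @ beta_ols
        sigma2 = resid @ resid / (n - p)
        k = p * sigma2 / (beta_ols @ beta_ols)      # Hoerl-Kennard style
    beta_ridge = Vt.T @ (s * uty / (s ** 2 + k))    # shrink each SVD component
    return beta_ridge, k

# Hypothetical use on simulated near-collinear data.
rng = np.random.default_rng(2)
X = rng.normal(size=(100, 4))
X[:, 3] = X[:, 0] + 0.01 * rng.normal(size=100)     # induce near multicollinearity
y = X @ np.array([1.0, 0.5, -0.5, 1.0]) + rng.normal(size=100)
print(ridge_via_svd(X, y))
```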

Publication Date
Wed Feb 01 2017
Journal Name
Journal Of Economics And Administrative Sciences
A comparison between the logistic regression model and linear discriminant analysis using principal components on unemployment data for the province of Baghdad

The objective of the study is to determine which has the better predictive ability, the logistic regression model or the linear discriminant function, using the original data first and then the principal components to reduce the dimensionality of the variables. The data come from the socio-economic household survey for the province of Baghdad in 2012 and comprise a sample of 615 observations with 13 variables, 12 of which are explanatory variables, the dependent variable being the number of workers and unemployed.

The two methods above were compared, and the comparison made clear that the logistic regression model is better than the linear discriminant function.
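
A rough sketch of this kind of comparison, using synthetic data in place of the 2012 Baghdad household survey (which is not available here); the model settings, the number of retained components, and the five-fold cross-validated accuracy criterion are assumptions.

```python
# Compare logistic regression and LDA on raw and PCA-reduced data.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Synthetic stand-in: 615 observations, 12 explanatory variables.
X, y = make_classification(n_samples=615, n_features=12, n_informative=6, random_state=0)

for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                    ("linear discriminant", LinearDiscriminantAnalysis())]:
    raw = cross_val_score(model, X, y, cv=5).mean()
    pca = cross_val_score(make_pipeline(PCA(n_components=6), model), X, y, cv=5).mean()
    print(f"{name}: accuracy raw={raw:.3f}, with PCA={pca:.3f}")
```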

Publication Date
Tue Nov 01 2016
Journal Name
Journal Of Economics And Administrative Sciences
Proposal of Using Principle of Maximizing Entropy of Generalized Gamma Distribution to Estimate the Survival probabilities of the Population in Iraq

In this research we estimated the survival function for data from the Iraq Household Socio-Economic Survey (IHSES II, 2012), which suffer from disturbance and confusion, for five-year age groups that follow the Generalized Gamma (GG) distribution. Two methods were used for estimation and fitting: the Principle of Maximizing Entropy (POME) and a bootstrap method with a non-parametric kernel smoothing function, to overcome the mathematical problems posed by the integrals contained in this distribution, in particular the incomplete gamma function, alongside the traditional Maximum Likelihood (ML) method. The comparison was made on the basis of the method of the Cen…
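
For reference, one common (Stacy) parameterisation of the Generalized Gamma distribution and its survival function are shown below; the paper's parameterisation may differ, but the incomplete gamma function mentioned in the abstract enters through the same integral.

```latex
% Stacy parameterisation of the Generalized Gamma distribution (for x > 0):
\[
  f(x; a, d, p) = \frac{p}{a^{d}\,\Gamma(d/p)}\; x^{d-1}
                  \exp\!\left[-\left(\frac{x}{a}\right)^{p}\right],
\]
\[
  S(x) = 1 - \frac{\gamma\!\big(d/p,\, (x/a)^{p}\big)}{\Gamma(d/p)},
\]
% where gamma(.,.) is the lower incomplete gamma function -- the integral
% that motivates the kernel-bootstrap smoothing mentioned in the abstract.
```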

Publication Date
Fri Dec 01 2017
Journal Name
Journal Of Economics And Administrative Sciences
A comparison between the Bayesian method and Full Maximum Likelihood for estimating the hierarchical Poisson regression model, with an application to maternal deaths in Baghdad

Abstract:

This research aims to compare the Bayesian method and Full Maximum Likelihood for estimating the hierarchical Poisson regression model.

The comparison was carried out by simulation, using different sample sizes (n = 30, 60, 120) and different numbers of replications (r = 1000, 5000) for the experiments. The Mean Square Error was adopted to compare the estimation methods and to choose the best way of estimating the model. It was concluded that the hierarchical Poisson regression model estimated by Full Maximum Likelihood with sample size n = 30 is the best at representing the maternal mortality data, based on the estimated parameter values…
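
For orientation, a typical two-level hierarchical Poisson regression specification is written out below; the paper's exact model, covariates, and priors are not reproduced here.

```latex
% A typical two-level (hierarchical) Poisson regression model:
\[
  y_{ij} \mid \mu_{ij} \sim \mathrm{Poisson}(\mu_{ij}), \qquad
  \log \mu_{ij} = \mathbf{x}_{ij}^{\top}\boldsymbol{\beta} + u_{j}, \qquad
  u_{j} \sim N(0, \sigma_{u}^{2}),
\]
% where i indexes observations within group j. Full ML integrates the random
% effects u_j out of the likelihood, while the Bayesian approach places priors
% on beta and sigma_u^2 and works with the resulting posterior.
```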
