In this research, we conducted a comprehensive investigation of kernel functions used in estimating nonparametric regression functions. In particular, we examined the Nadaraya-Watson estimator and local polynomial regression of linear (LP1), quadratic (LP2) and cubic (LP3) forms. These methods were evaluated using five kernel functions: Gaussian (G), Epanechnikov (E), uniform (U), triangular (T) and quartic (Q), collectively referred to as GEUTQ.
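For reference, a brief sketch of these estimators in their usual textbook forms, with $h$ denoting a generic bandwidth (the bandwidth selection rule and any study-specific choices are not restated here). The Nadaraya-Watson estimator is
\[
\hat{m}_{NW}(x) = \frac{\sum_{i=1}^{n} K\!\left(\tfrac{x_i - x}{h}\right) y_i}{\sum_{i=1}^{n} K\!\left(\tfrac{x_i - x}{h}\right)},
\]
and the local polynomial estimator of degree $p$ (LP1, LP2, LP3 for $p = 1, 2, 3$) takes $\hat{m}(x) = \hat{\beta}_0$, where
\[
\hat{\beta} = \arg\min_{\beta_0, \ldots, \beta_p} \sum_{i=1}^{n} \Big\{ y_i - \sum_{j=0}^{p} \beta_j (x_i - x)^j \Big\}^2 K\!\left(\tfrac{x_i - x}{h}\right).
\]
The five kernels, in their standard normalizations (all but the Gaussian supported on $|u| \le 1$), are
\[
K_G(u) = \tfrac{1}{\sqrt{2\pi}} e^{-u^2/2}, \quad
K_E(u) = \tfrac{3}{4}(1 - u^2), \quad
K_U(u) = \tfrac{1}{2}, \quad
K_T(u) = 1 - |u|, \quad
K_Q(u) = \tfrac{15}{16}(1 - u^2)^2.
\]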
The main objective of this work was to determine the best nonparametric kernel regression estimators. To achieve this, we performed a rigorous simulation-based comparison across different regression models and sample sizes. Estimator performance was evaluated using the average mean absolute percentage error (AMAPE), with the error terms assumed to follow a standard normal distribution with mean zero and variance one.
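The exact AMAPE formula is not stated here; a common formulation, assumed only as an illustration, averages the mean absolute percentage error over the $R$ simulation replications:
\[
\mathrm{MAPE}_r = \frac{100}{n} \sum_{i=1}^{n} \left| \frac{y_i - \hat{m}_r(x_i)}{y_i} \right|, \qquad
\mathrm{AMAPE} = \frac{1}{R} \sum_{r=1}^{R} \mathrm{MAPE}_r,
\]
where $\hat{m}_r$ is the fitted regression curve in replication $r$; some studies instead measure the error against the true regression function $m(x_i)$ rather than the observed $y_i$.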
Our simulation results and plots clearly show that the local quadratic estimator (LP2) with the Gaussian and Epanechnikov kernels (G, E) consistently has the lowest AMAPE across all sample sizes and both models. Similarly, the local linear estimator (LP1) with the uniform, triangular and quartic kernels (U, T, Q) has the lowest AMAPE for all sample sizes and both models. Among the kernel functions considered, the quartic kernel (Q) is identified as the most effective, yielding the lowest average AMAPE values.
Furthermore, we observe that the average AMAPE values of the Nadaraya-Watson, local linear and local quadratic estimators decrease as the sample size increases. Conversely, the LP3 estimator (local cubic regression) proves the least favorable, with relatively high AMAPE values compared to the other estimators. These results provide valuable insights into the performance of different estimators and kernel functions in nonparametric regression models and can thereby inform future research and decision-making.