Kernel Choice Matters for Local Polynomial Density Estimators at Boundaries
Shunsuke Imai, Yuta Okamoto
Published: 2023/6/13
Abstract
This paper examines kernel selection for local polynomial density (LPD) estimators at boundary points. Contrary to conventional wisdom, we demonstrate that the choice of kernel has a substantial impact on the efficiency of LPD estimators. In particular, we provide theoretical results, together with simulation and empirical evidence, showing that commonly used kernels, such as the triangular kernel, suffer from several efficiency issues: they yield a larger mean squared error than our preferred Laplace kernel does. For inference, the efficiency loss is even more pronounced: confidence intervals based on popular kernels are wide, whereas those based on the Laplace kernel are markedly tighter. Furthermore, the variance of the LPD estimator with such popular kernels explodes as the sample size decreases, reflecting the fact, formally proven here, that its finite-sample variance is infinite. This small-sample problem, however, can be avoided by employing kernels with unbounded support. Taken together, both asymptotic and finite-sample analyses justify the use of the Laplace kernel: simply changing the kernel function improves the reliability of LPD estimation and inference, and its effect is numerically substantial.
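To fix ideas, the following is a minimal sketch (not the authors' code) of an LPD-style boundary estimate: the empirical CDF is regressed on a polynomial in (X_i - x) with kernel weights, and the density at x is read off the slope coefficient. The triangular and Laplace kernels are then compared in a small Monte Carlo at the boundary of an exponential sample; the bandwidth, polynomial order, sample size, and data-generating process are illustrative assumptions, not values from the paper.

```python
import numpy as np

def lpd_estimate(x_data, x0, h, kernel, p=2):
    """Local polynomial density estimate at x0: slope of a kernel-weighted
    polynomial fit of the empirical CDF in (X - x0)."""
    n = x_data.size
    xs = np.sort(x_data)
    ecdf = np.arange(1, n + 1) / n            # empirical CDF at each (sorted) observation
    w = kernel((xs - x0) / h)
    keep = w > 0                              # drop zero-weight points (bounded-support kernels)
    R = np.vander(xs[keep] - x0, p + 1, increasing=True)   # columns: 1, (X-x0), ..., (X-x0)^p
    W = np.diag(w[keep])
    beta = np.linalg.solve(R.T @ W @ R, R.T @ W @ ecdf[keep])
    return beta[1]                            # coefficient on (X - x0) estimates the density

triangular = lambda u: np.clip(1.0 - np.abs(u), 0.0, None)   # bounded support [-1, 1]
laplace    = lambda u: 0.5 * np.exp(-np.abs(u))               # unbounded support

# Small Monte Carlo at the lower boundary x0 = 0 of an Exp(1) sample (true density = 1).
rng = np.random.default_rng(0)
reps, n, h = 2000, 50, 0.5
est_tri = [lpd_estimate(rng.exponential(size=n), 0.0, h, triangular) for _ in range(reps)]
est_lap = [lpd_estimate(rng.exponential(size=n), 0.0, h, laplace) for _ in range(reps)]
print("triangular: mean %.3f  var %.4f" % (np.mean(est_tri), np.var(est_tri)))
print("Laplace:    mean %.3f  var %.4f" % (np.mean(est_lap), np.var(est_lap)))
```

Only observations receiving positive kernel weight enter the fit, so with a bounded-support kernel the effective sample near the boundary can be small; an unbounded-support kernel such as the Laplace always uses every observation, which is the mechanism behind the finite-sample variance contrast discussed in the abstract.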