Jones, M. C.; Davies, S. J.; and Park, B. U. (1994).
Versions of kernel-type regression estimators.
Journal of the American Statistical Association, 89(427), pp. 825–832.
We explore the aims of, and relationships between, various kernel-type regression estimators. To do so, we identify two general types of (direct) kernel estimators differing in their treatment of the nuisance density function associated with regressor variable design. We look at the well-known Gasser-Müller, Nadaraya-Watson, and Priestley-Chao methods in this light. In the random design case, none of these methods is totally adequate, and we mention a novel (direct) kernel method with appropriate properties. Disadvantages of even the latter idea are remedied by kernel-weighted local linear fitting, a well-known technique that is currently enjoying renewed popularity. We see how to fit this approach into our general framework, and hence form a unified understanding of how these kernel-type smoothers interrelate. Though the mission of this article is unificatory (and even pedagogical), the desire for better understanding of superficially different approaches is motivated by the need to improve practical estimators. In the end, we concur with other authors that kernel-weighted local linear fitting deserves much further attention for applications.
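The contrast the abstract draws between direct kernel estimators and kernel-weighted local linear fitting can be sketched in code. The following is an illustrative sketch only, not the article's own implementation: it implements the Nadaraya-Watson estimator (a kernel-weighted average, i.e. a locally constant fit) alongside local linear fitting (kernel-weighted least squares whose intercept estimates the regression function), using a Gaussian kernel and a bandwidth chosen arbitrarily for illustration.

```python
import numpy as np

def gaussian_kernel(u):
    # Standard Gaussian kernel.
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def nadaraya_watson(x0, x, y, h):
    # Nadaraya-Watson estimator at x0: kernel-weighted average of the
    # responses, i.e. a locally constant fit.
    w = gaussian_kernel((x - x0) / h)
    return np.sum(w * y) / np.sum(w)

def local_linear(x0, x, y, h):
    # Kernel-weighted local linear fit at x0: weighted least squares of
    # y on (x - x0); the fitted intercept estimates m(x0).
    w = gaussian_kernel((x - x0) / h)
    sw = np.sqrt(w)
    X = np.column_stack([np.ones_like(x), x - x0])
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return beta[0]

# Illustration of the boundary behavior that motivates local linear
# fitting: on exactly linear data y = 2x over [0, 1], the local linear
# estimate at the boundary point x0 = 0 is essentially exact, whereas
# Nadaraya-Watson is biased upward there because all kernel weight
# falls on one side.
x = np.linspace(0.0, 1.0, 101)
y = 2.0 * x
print(local_linear(0.0, x, y, 0.1))      # close to the true value 0
print(nadaraya_watson(0.0, x, y, 0.1))   # noticeably above 0
```

In the interior, both estimators behave well on such data; the difference shows up at the boundary, which is one of the practical points in favor of local linear fitting that the abstract alludes to.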