Splitting criteria for regression trees

Taylor, Paul C. and Jones, M. C. (1996). Splitting criteria for regression trees. Journal of Statistical Computation and Simulation, 55(4) pp. 267–285.

DOI: https://doi.org/10.1080/00949659608811771

Abstract

Breiman, Friedman, Olshen and Stone (1984) expounded a method called Classification and Regression Trees, or CART, which can be used for nonparametric discrimination and regression. Taylor and Silverman (1993) presented a new splitting criterion for the growing of classification trees; this new criterion was called the mean posterior improvement criterion. This paper extends the mean posterior improvement criterion to the case of regression trees. The extension is made via kernel density estimation. General results on how to select the bandwidth (smoothing parameter) appropriate to estimation of the mean posterior improvement criterion are obtained. These results are adapted to allow a practical implementation of the mean posterior improvement criterion for regression trees. Examples of the behaviour of the new criterion relative to currently used splitting criteria are given.
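The abstract contrasts the mean posterior improvement criterion with "currently used splitting criteria" but does not spell out the latter. As background, the standard CART regression-tree criterion chooses the split that maximises the reduction in residual sum of squares; a minimal sketch (this illustrates the conventional baseline, not the paper's MPI criterion) is:

```python
import numpy as np

def best_split(x, y):
    """Standard CART least-squares splitting criterion for one feature:
    pick the threshold that maximises the reduction in residual sum of
    squares (SSE) between the parent node and the two child nodes.
    Note: this is the conventional baseline, NOT the paper's mean
    posterior improvement criterion."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    n = len(ys)
    total_sse = np.sum((ys - ys.mean()) ** 2)
    best_threshold, best_gain = None, -np.inf
    for i in range(1, n):
        if xs[i] == xs[i - 1]:
            continue  # no valid split between tied x values
        left, right = ys[:i], ys[i:]
        child_sse = (np.sum((left - left.mean()) ** 2)
                     + np.sum((right - right.mean()) ** 2))
        gain = total_sse - child_sse
        if gain > best_gain:
            # split midway between the adjacent observed x values
            best_threshold, best_gain = (xs[i - 1] + xs[i]) / 2, gain
    return best_threshold, best_gain
```

For example, with x = [1, 2, 3, 10, 11, 12] and y = [0, 0, 0, 5, 5, 5], the criterion places the split at 6.5 and the gain equals the full parent SSE of 37.5, since both children become pure.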
