The Open University

Splitting criteria for regression trees

Taylor, Paul C. and Jones, M. C. (1996). Splitting criteria for regression trees. Journal of Statistical Computation and Simulation, 55(4) pp. 267–285.



Breiman, Friedman, Olshen and Stone (1984) introduced Classification and Regression Trees (CART), a method for nonparametric discrimination and regression. Taylor and Silverman (1993) presented a new splitting criterion for growing classification trees, called the mean posterior improvement criterion. This paper extends the mean posterior improvement criterion to regression trees. The extension is made via kernel density estimation. General results are obtained on how to select the bandwidth (smoothing parameter) appropriate to estimating the mean posterior improvement criterion, and these results are adapted to give a practical implementation of the criterion for regression trees. Examples compare the behaviour of the new criterion with currently used splitting criteria.
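For context, the baseline splitting criterion that CART uses for regression trees, and against which alternatives such as the mean posterior improvement criterion are compared, is the reduction in within-node sum of squared errors. A minimal sketch of that standard least-squares split search (an illustration of the baseline only, not of the paper's new criterion; the function names are our own) might look like:

```python
import numpy as np

def sse(y):
    """Sum of squared errors about the node mean; 0 for an empty node."""
    return float(np.sum((y - y.mean()) ** 2)) if len(y) else 0.0

def best_split(x, y):
    """Exhaustive search over thresholds t, scoring each binary split
    x <= t by the reduction in SSE relative to the parent node."""
    parent = sse(y)
    best_t, best_gain = None, 0.0
    # Skip the largest value: splitting there leaves the right node empty.
    for t in np.unique(x)[:-1]:
        left, right = y[x <= t], y[x > t]
        gain = parent - (sse(left) + sse(right))
        if gain > best_gain:
            best_t, best_gain = t, gain
    return best_t, best_gain
```

On a piecewise-constant response the search recovers the step location: with `x = [1..6]` and `y = [0, 0, 0, 10, 10, 10]`, the best threshold is 3, with the full parent SSE recovered as gain.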

Item Type: Journal Item
Copyright Holders: 1996 Taylor and Francis
ISSN: 0094-9655
Keywords: CART; mean posterior improvement criterion; kernel density estimation; estimation of functional of mixtures of distributions; bandwidth selection
Academic Unit/School: Faculty of Science, Technology, Engineering and Mathematics (STEM) > Mathematics and Statistics
Faculty of Science, Technology, Engineering and Mathematics (STEM)
Item ID: 28178
Depositing User: Sarah Frain
Date Deposited: 29 Mar 2011 12:02
Last Modified: 07 Dec 2018 09:52
