Trees for Regression and Classification
published: Feb. 25, 2007, recorded: May 2005, views: 1560
Tree models are widely used for regression and classification problems, with interpretability and ease of implementation being among their chief attractions. Despite the widespread use of tree models, a comprehensive theoretical analysis of their performance has only begun to emerge in recent years.
This lecture provides an overview of tree modeling theory and methods, with an emphasis on risk bounds, oracle inequalities, approximation theory, and rates of convergence, in a variety of contexts. Special attention is devoted to decision trees and wavelet-based regression methods, two of the most well-known examples of tree models. The choice of loss function (squared error, absolute error, 0/1 error) plays a pivotal role in both theory and methods.
In particular, optimal tree selection rules vary dramatically depending on the loss function employed. Despite these differences, suitable tree-based models coupled with appropriate selection rules can provide fast algorithms and near-minimax optimal performance in a very broad range of regression and classification problems. Examples from image reconstruction and pattern classification will demonstrate the effectiveness of trees in practice.
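As a minimal illustration of why the loss function matters (this sketch is not from the lecture): even for the simplest decision a tree makes, the constant prediction assigned to a leaf, the optimum differs by loss. Squared error is minimized by the leaf mean, absolute error by the median, and 0/1 error by the majority label.

```python
# Illustrative sketch: the loss-minimizing constant prediction in a tree leaf.
# - squared error  -> mean of the leaf's responses
# - absolute error -> median (robust to outliers)
# - 0/1 error      -> majority (most frequent) label
from statistics import mean, median, mode

def leaf_prediction(ys, loss):
    """Return the constant c that minimizes the given loss over ys."""
    if loss == "squared":
        return mean(ys)    # argmin_c sum (y - c)^2
    if loss == "absolute":
        return median(ys)  # argmin_c sum |y - c|
    if loss == "zero_one":
        return mode(ys)    # argmin_c sum 1[y != c]
    raise ValueError(f"unknown loss: {loss}")

ys = [1, 2, 2, 10]
print(leaf_prediction(ys, "squared"))   # 3.75 — pulled toward the outlier 10
print(leaf_prediction(ys, "absolute"))  # 2.0
print(leaf_prediction(ys, "zero_one"))  # 2
```

The same divergence propagates to split selection and pruning, which is why tree selection rules tuned for one loss can behave poorly under another.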
Download slides: Decision_Trees_for_Regression_and_Classification.ppt (9.3 MB)