Decision Forests

Established: July 25, 2012

Decision Forests for Computer Vision and Medical Image Analysis

A. Criminisi and J. Shotton

Springer 2013, XIX, 368 p. 143 illus., 136 in color.

ISBN 978-1-4471-4929-3

  • This book presents a unified, efficient model of decision forests that can be used in a number of applications, such as scene recognition from photographs, object recognition in images, and automatic diagnosis from radiological scans. Such applications have traditionally been addressed by disparate supervised or unsupervised machine-learning techniques.

    However, in this book, diverse learning tasks including regression, classification and semi-supervised learning are all treated as instances of the same general decision forest model. The unified framework further extends to novel uses of forests in tasks such as density estimation and manifold learning. This unification carries both theoretical and practical advantages: for instance, the single underlying model lets us implement and optimize the general algorithm for all these tasks only once, and then adapt it to individual applications with relatively small changes.

    Part I describes the general forest model, which unifies classification, regression, density estimation, manifold learning, semi-supervised learning and active learning under the same flexible framework. The proposed model may be used in either a discriminative or a generative way, and may be applied to discrete or continuous, labelled or unlabelled data. It is based on a conventional training-testing framework, in which the training phase optimizes a well-defined energy function. Tasks such as classification or density estimation, whether supervised or unsupervised, can all be addressed by choosing a task-specific objective function and output prediction function, as the sketch below illustrates.
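    To make this concrete, here is a minimal Python sketch of the idea. It is not the book's code or its accompanying library; the names (train_node, info_gain) and the axis-aligned weak learner are illustrative assumptions. Only the objective and the leaf model change across tasks; the training loop stays the same.

    import numpy as np

    def entropy(labels):
        # Shannon entropy of a discrete label set (the classification objective).
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    def info_gain(y, left_mask):
        # Generic energy: impurity reduction achieved by a candidate split.
        n = len(y)
        l, r = y[left_mask], y[~left_mask]
        if len(l) == 0 or len(r) == 0:
            return -np.inf
        return entropy(y) - (len(l) / n) * entropy(l) - (len(r) / n) * entropy(r)

    def train_node(X, y, depth=0, max_depth=5, n_candidates=10, rng=None):
        # Generic node training: swapping entropy for variance (regression) or
        # Gaussian differential entropy (density estimation) re-targets the same
        # loop to a different task; the leaf model changes accordingly.
        rng = rng or np.random.default_rng()
        if depth == max_depth or len(np.unique(y)) == 1:
            return ('leaf', np.bincount(y))        # leaf model: class histogram
        best_gain, best_split = -np.inf, None
        for _ in range(n_candidates):              # randomized weak-learner search
            f = rng.integers(X.shape[1])           # axis-aligned split: (feature, threshold)
            t = rng.choice(X[:, f])
            gain = info_gain(y, X[:, f] < t)
            if gain > best_gain:
                best_gain, best_split = gain, (f, t)
        if best_split is None:
            return ('leaf', np.bincount(y))
        f, t = best_split
        mask = X[:, f] < t
        return ('split', f, t,
                train_node(X[mask], y[mask], depth + 1, max_depth, n_candidates, rng),
                train_node(X[~mask], y[~mask], depth + 1, max_depth, n_candidates, rng))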

    Part II is a collection of invited chapters. Here various researchers show how it is possible to build different applications on top of the general forest model. Kinect-based player segmentation, semantic segmentation of photographs and automatic diagnosis of brain lesions are amongst the many applications discussed here.

    Part III presents implementation details, documentation for the provided research software library, and some concluding remarks.

  • Decision Forests for Computer Vision and Medical Image Analysis

    A. Criminisi and J. Shotton

    Springer 2013

    Chapter 4: Classification Forests

    Exercise 4.1

    ex.4.1.a.png ex.4.1.b.png

    Using many trees and linear splits reduces artifacts.
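    A minimal sketch of the ensemble-size half of this observation, assuming scikit-learn is available (stock sklearn trees support only axis-aligned splits, so the linear-split effect is not reproduced here):

    import numpy as np
    from sklearn.datasets import make_moons
    from sklearn.ensemble import RandomForestClassifier

    X, y = make_moons(n_samples=200, noise=0.25, random_state=0)
    xx, yy = np.meshgrid(np.linspace(-2, 3, 200), np.linspace(-1.5, 2, 200))
    grid = np.c_[xx.ravel(), yy.ravel()]

    for n_trees in (1, 200):
        forest = RandomForestClassifier(n_estimators=n_trees, random_state=0).fit(X, y)
        posterior = forest.predict_proba(grid)[:, 1]
        # One tree gives a blocky, piecewise-constant posterior; averaging 200
        # trees smooths the staircase artifacts considerably.
        print(n_trees, "tree(s): distinct posterior values =",
              len(np.unique(np.round(posterior, 3))))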

    Exercise 4.2

    ex.4.2.a.png ex.4.2.b.png

    The quality of the uncertainty away from training data is affected by the type of split function (weak learner).

    Exercise 4.3

    ex.4.3.a.png ex.4.3.b.png

    Using linear splits can produce better separating surfaces.

    Exercise 4.4

    ex.4.4.a.png ex.4.4.b.png

    Reducing the tree depth may cause underfitting and lower confidence.
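    A sketch of the same effect with scikit-learn (a stand-in; the book's demos use its accompanying library): shallow forests underfit, and their class posteriors stay closer to uniform.

    from sklearn.datasets import make_moons
    from sklearn.ensemble import RandomForestClassifier

    X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
    for depth in (1, 2, 4, 8):
        forest = RandomForestClassifier(n_estimators=100, max_depth=depth,
                                        random_state=0).fit(X, y)
        conf = forest.predict_proba(X).max(axis=1).mean()  # mean winning-class posterior
        print(f"max_depth={depth}: train acc={forest.score(X, y):.2f}, "
              f"mean confidence={conf:.2f}")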

    Exercise 4.5

    ex.4.5.a.png ex.4.5.b.png

    Increasing randomness may reduce overall prediction confidence.
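    A sketch with scikit-learn (again a stand-in for the book's library): injecting more randomness into the split search, here via extremely randomized trees restricted to one candidate feature, flattens the averaged posterior over the plane.

    import numpy as np
    from sklearn.datasets import make_moons
    from sklearn.ensemble import RandomForestClassifier, ExtraTreesClassifier

    X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
    xx, yy = np.meshgrid(np.linspace(-3, 4, 100), np.linspace(-3, 4, 100))
    grid = np.c_[xx.ravel(), yy.ravel()]
    models = [("optimized splits", RandomForestClassifier(n_estimators=100, random_state=0)),
              ("extra randomness", ExtraTreesClassifier(n_estimators=100, max_features=1,
                                                        random_state=0))]
    for name, forest in models:
        forest.fit(X, y)
        conf = forest.predict_proba(grid).max(axis=1).mean()
        print(f"{name}: mean confidence over the plane = {conf:.2f}")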

    Chapter 5: Regression Forests

    Exercise 5.1

    ex.5.1.a.png ex.5.1.b.png

    Large tree depth may lead to overfitting.
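    A sketch of the depth/overfitting trade-off, assuming scikit-learn: a deep regression forest chases the training noise, so the training fit keeps improving while the held-out fit degrades.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(300, 1))
    y = np.sin(X[:, 0]) + rng.normal(scale=0.3, size=300)
    X_tr, y_tr, X_te, y_te = X[:200], y[:200], X[200:], y[200:]

    for depth in (2, 5, None):          # None = grow the trees until pure
        forest = RandomForestRegressor(n_estimators=100, max_depth=depth,
                                       random_state=0).fit(X_tr, y_tr)
        print(f"max_depth={depth}: train R^2={forest.score(X_tr, y_tr):.2f}, "
              f"test R^2={forest.score(X_te, y_te):.2f}")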

    Exercise 5.2

    ex.5.2.a.png ex.5.2.b.png

    Larger training noise yields larger prediction uncertainty (wider pink region).
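    A sketch, assuming scikit-learn: the book's regression forests store a full predictive distribution at each leaf, which sklearn does not, so the spread of the individual tree predictions is used below as a stand-in for the pink uncertainty band.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(300, 1))
    x_eval = np.linspace(-3, 3, 50).reshape(-1, 1)
    for noise in (0.1, 0.5):
        y = np.sin(X[:, 0]) + rng.normal(scale=noise, size=300)
        forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
        per_tree = np.stack([t.predict(x_eval) for t in forest.estimators_])
        print(f"noise={noise}: mean per-tree spread = {per_tree.std(axis=0).mean():.3f}")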

    Exercise 5.3

    ex.5.3.a.png ex.5.3.b.png

    ex.5.3.c.png ex.5.3.d.png

    Non-linear curve fitting in diverse examples. Note the relatively smooth interpolation and extrapolation behaviour.

    Exercise 5.4

    ex.5.4.a.png ex.5.4.b.png

    Single-function regression does not capture the inherently ambiguous central region, but it does return a correspondingly high uncertainty there.
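    A sketch of the ambiguity effect with scikit-learn (the tree-disagreement proxy for uncertainty is an assumption, as above): two branches coincide outside the centre and diverge inside it, so the forest's mean is uninformative at the centre but the spread flags the ambiguity.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    x = rng.uniform(-2, 2, 400)
    branch = rng.integers(2, size=400)
    # Outside |x| > 1 the two branches coincide at y = 0; inside they split
    # to y = +1 and y = -1, making the central region inherently ambiguous.
    y = np.where(np.abs(x) < 1, np.where(branch == 1, 1.0, -1.0), 0.0)
    y += rng.normal(scale=0.05, size=400)

    forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(x.reshape(-1, 1), y)
    x_eval = np.array([[-1.5], [0.0], [1.5]])
    per_tree = np.stack([t.predict(x_eval) for t in forest.estimators_])
    for xv, m, s in zip(x_eval.ravel(), per_tree.mean(axis=0), per_tree.std(axis=0)):
        print(f"x={xv:+.1f}: mean={m:+.2f}, spread={s:.2f}")  # spread peaks at x=0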

    Chapter 6: Density Forests

    Exercise 6.1

    ex.6.1.a.png ex.6.1.b.png

    Overly deep trees may cause overfitting.

    Exercise 6.2

    ex.6.2.a.png ex.6.2.b.png

    Overly deep trees may cause overfitting.

    Exercise 6.3

    ex.6.3.a.png ex.6.3.b.png

    Overly deep trees may cause overfitting.

    Exercise 6.4

    ex.6.5.a.png ex.6.5.b.png

    Overly deep trees may cause overfitting. Some of the visible streaky artifacts are due to the use of axis-aligned weak learners.
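    Since scikit-learn has no density forest, here is a toy 1-D density tree in the spirit of Chapter 6 (illustrative code, not the book's): nodes split to maximize the entropy gain of Gaussian fits, and leaves store a Gaussian weighted by the fraction of points they hold. Growing it too deep carves the sample into tiny leaves and the estimate becomes spiky, which is the overfitting seen in these exercises.

    import numpy as np

    def fit_density_tree(x, depth=0, max_depth=3, min_leaf=10):
        # Leaf model: (count, mean, std); the density at a point is the pdf of
        # its leaf's Gaussian scaled by n_leaf / n_total.
        n = len(x)
        if depth == max_depth or n < 2 * min_leaf:
            return ('leaf', n, x.mean(), x.std() + 1e-6)
        best_gain, best_t = -np.inf, None
        for t in np.quantile(x, np.linspace(0.1, 0.9, 9)):
            l, r = x[x < t], x[x >= t]
            if len(l) < min_leaf or len(r) < min_leaf:
                continue
            # The differential-entropy gain of 1-D Gaussian fits reduces (up to
            # constants) to this weighted log-std comparison.
            gain = (n * np.log(x.std() + 1e-6)
                    - len(l) * np.log(l.std() + 1e-6)
                    - len(r) * np.log(r.std() + 1e-6))
            if gain > best_gain:
                best_gain, best_t = gain, t
        if best_t is None:
            return ('leaf', n, x.mean(), x.std() + 1e-6)
        return ('split', best_t,
                fit_density_tree(x[x < best_t], depth + 1, max_depth, min_leaf),
                fit_density_tree(x[x >= best_t], depth + 1, max_depth, min_leaf))

    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(-2, 0.5, 200), rng.normal(2, 0.5, 200)])
    shallow = fit_density_tree(x, max_depth=2)   # smooth two-bump estimate
    deep = fit_density_tree(x, max_depth=8)      # tiny leaves: a spiky, overfit estimate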

    Chapter 8: Semi-supervised Classification Forests

    Exercise 8.1

    ex.8.1.a.png ex.8.1.b.png

    Note the larger uncertainty in the central region (left image). A single tree is always over-confident.

    Exercise 8.2

    ex.8.2.a.png ex.8.2.b.png

    Adding further supervised data in the central region helps increase the prediction confidence.

    Exercise 8.3

    ex.8.3.a.png ex.8.3.b.png

    Confidence decreases with training noise and increases with tree depth.

    Exercise 8.4

    ex.8.4.a.png ex.8.4.b.png

    Single trees are over-confident. A forest of many random trees produces smooth uncertainty in the transition regions.

    Exercise 8.5

    ex.8.5.a.png ex.8.5.b.png

    Increasing the amount of supervision in regions of low confidence increases both the prediction accuracy and the overall confidence.
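    A sketch of the semi-supervised setting, assuming scikit-learn: the book's forests use a transductive objective inside the trees, which sklearn lacks, so self-training around a forest stands in for it here. Labelling a few extra points in the low-confidence central region raises the confidence there, as in Exercises 8.2 and 8.5.

    import numpy as np
    from sklearn.datasets import make_moons
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.semi_supervised import SelfTrainingClassifier

    X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
    y_semi = np.full_like(y, -1)                 # -1 marks unlabelled points
    y_semi[:10] = y[:10]                         # only 10 labelled examples

    model = SelfTrainingClassifier(RandomForestClassifier(random_state=0))
    model.fit(X, y_semi)
    centre = X[np.abs(X[:, 0] - 0.5) < 0.3]      # points near the class boundary
    print("confidence near centre:",
          model.predict_proba(centre).max(axis=1).mean().round(2))

    # Label a few extra points in the low-confidence central region (Ex. 8.2/8.5).
    extra = np.where(np.abs(X[:, 0] - 0.5) < 0.3)[0][:10]
    y_semi[extra] = y[extra]
    model.fit(X, y_semi)
    print("after extra labels:",
          model.predict_proba(centre).max(axis=1).mean().round(2))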