
Gentle AdaBoost for CARTs

The Gentle AdaBoost algorithm [5] is used to select a set of simple CARTs that together achieve a given detection rate and error rate [13]. In the following, a correct detection is referred to as a hit and an error as a false alarm.

The learning is based on $ N$ weighted training examples $ (x_1,y_1), \ldots, (x_N,y_N)$, where the $ x_i$ are the images and $ y_i \in \{-1,1\}, i \in \{1,\ldots,N\}$ are the class labels. At the beginning of the learning phase the weights $ w_i$ are initialized with $ w_i = 1/N$. The following three steps are repeated to select CARTs until a given detection rate $ d$ is reached:

  1. Every candidate classifier, i.e., every CART, is fit to the data. In doing so, the error $ e$ of each CART is computed with respect to the weights $ w_i$.
  2. The CART $ h_t$ with the lowest weighted error is added to the classification function, and the counter $ t$ is incremented.
  3. The weights are updated with $ w_i := w_i \cdot e^{-y_i h_t(x_i)}$ and renormalized.
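The three steps above can be sketched as follows. This is a minimal illustration, not the detectors' actual training code: as a stand-in for full CARTs, the weak learners here are weighted least-squares regression stumps (depth-1 trees), and all function names are illustrative.

```python
import numpy as np

def fit_stump(X, y, w):
    """Step 1: fit a weak learner to the weighted data.
    A regression stump splits on one feature at one threshold; each leaf
    returns the weighted mean of y, so its output is a real value in [-1, 1]."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        xs = np.sort(X[:, j])
        for thr in (xs[:-1] + xs[1:]) / 2.0:      # candidate thresholds
            left = X[:, j] <= thr
            cl = np.dot(w[left], y[left]) / w[left].sum()
            cr = np.dot(w[~left], y[~left]) / w[~left].sum()
            pred = np.where(left, cl, cr)
            err = np.dot(w, (y - pred) ** 2)       # weighted squared error
            if err < best_err:                     # step 2: keep the best one
                best_err, best = err, (j, thr, cl, cr)
    return best

def stump_predict(stump, X):
    j, thr, cl, cr = stump
    return np.where(X[:, j] <= thr, cl, cr)

def gentle_adaboost(X, y, T=10):
    n = len(y)
    w = np.full(n, 1.0 / n)                        # w_i = 1/N
    stumps = []
    for _ in range(T):
        stump = fit_stump(X, y, w)
        stumps.append(stump)
        h = stump_predict(stump, X)
        w = w * np.exp(-y * h)                     # step 3: reweight ...
        w = w / w.sum()                            # ... and renormalize
    return stumps

def classify(stumps, X):
    F = sum(stump_predict(s, X) for s in stumps)   # sum of weak outputs
    return np.where(F > 0, 1, -1)                  # sign of the sum
```

Because the weak learners return real values rather than hard $ \pm 1$ labels, no separate vote weights are needed; the confidence of each CART is already encoded in its leaf values.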

The final output of the classifier is sign$ (\sum_{t=1}^T h_t(x))$, with $ h_t(x)$ the weighted return value of CART $ t$: an example is classified as positive if the sum is greater than zero. Next, a cascade based on these classifiers is built.
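As an illustration with hypothetical values: for $ T = 3$ CARTs returning $ h_1(x) = 0.4$, $ h_2(x) = -0.1$, and $ h_3(x) = 0.5$, the sum is $ 0.4 - 0.1 + 0.5 = 0.8 > 0$, so $ x$ is classified as a hit; a sum of zero or less would yield a rejection.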


