Competitive Object Learning

In addition to subsampling, the goals of competitive object learning are the minimization of the expected quantization error and the maximization of the entropy. A finite set of 3D scan points $ \cal D$ is subsampled to the set $ {\cal A} = \{\V w_1, \V w_2, \ldots, \V
w_N\}$. The error is minimized with respect to the following function:

$\displaystyle E ( {\cal D}, {\cal A} ) = \frac{1}{\vert{\cal D}\vert} \sum_{\V w_i \in {\cal A}}
\sum_{\V \xi \in {\cal R}_i} \norm {\V \xi - \V w_i},$

with the set $ {\cal A}$ of samples and the Voronoi set $ {\cal R}_c$ of unit $ c$, i.e., $ {\cal R}_c = \{ \V \xi \in {\cal D} \vert
s(\V \xi) = c \}$ with $ s(\V \xi) = \arg \min_{c \in {\cal A}}
\norm {\V \xi - \V w_c}$. Entropy maximization guarantees inherent robustness: the failure of reference vectors, i.e., missing 3D points, affects only a limited fraction of the data. Interpreting the generation of an input signal and its subsequent mapping onto the nearest sample in $ {\cal A}$ as a random experiment that assigns a value $ x \in {\cal A}$ to the random variable $ X$, maximizing the entropy
$\displaystyle H(X) = - \sum_{x \in {\cal A}} P(x) \log(P(x))$      

is equivalent to making the samples equiprobable. The following neural gas algorithm learns and subsamples 3D point clouds [7]:
i.).
Initialize the set $ \cal A$ with $ N$ vectors chosen at random from the input set $ {\cal D}$. Set $ t = 0$.
ii.).
Generate at random an input element $ \V \xi$, i.e., select a point from $ {\cal D}$.
iii.).
Order all elements of $ \cal A$ according to their distance to $ \V \xi$, i.e., find the sequence of indices $ (i_0, i_1,
\ldots, i_{N-1})$ such that $ \V w_{i_0}$ is the reference vector closest to $ \V \xi$, $ \V w_{i_1}$ is the reference vector second closest to $ \V \xi$, and, in general, $ \V w_{i_k}$, $ k = 0,
\ldots, N-1$, is the reference vector for which $ k$ vectors $ \V
w_j$ exist that are closer to $ \V \xi$. $ k_i(\V \xi, {\cal A})$ denotes the rank $ k$ associated with $ \V w_i$.
iv.).
Adapt the reference vectors according to
$\displaystyle \Delta \V w_i = \varepsilon(t) \, h_\lambda(k_i(\V \xi, {\cal A}))
\cdot (\V \xi - \V w_i),$

with the following time dependencies:

$\displaystyle \lambda ( t) = \lambda_i (\lambda_f / \lambda_i)^{t/t_\text{max}},$

$\displaystyle \varepsilon ( t) = \varepsilon_i (\varepsilon_f / \varepsilon_i)^{t/t_\text{max}},$

$\displaystyle h_\lambda (k) = \exp(-k/\lambda(t)).$

v.).
Increase the time parameter $ t$.
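
The steps above can be sketched in Python with NumPy; the function name, array shapes, and argument defaults are illustrative assumptions, not taken from the original:

```python
import numpy as np

def neural_gas(points, n_samples=250, t_max=10000,
               lambda_i=10.0, lambda_f=0.01,
               eps_i=0.5, eps_f=0.005, rng=None):
    """Subsample a 3D point cloud D with the neural gas algorithm.

    points: (M, 3) array of 3D scan points (the set D).
    Returns an (n_samples, 3) array of reference vectors (the set A).
    """
    rng = np.random.default_rng(rng)
    # i.) initialize A with N vectors drawn at random from D
    w = points[rng.choice(len(points), n_samples, replace=False)].astype(float)
    for t in range(t_max):
        # exponentially decaying schedules lambda(t) and epsilon(t)
        frac = t / t_max
        lam = lambda_i * (lambda_f / lambda_i) ** frac
        eps = eps_i * (eps_f / eps_i) ** frac
        # ii.) draw a random input element xi from D
        xi = points[rng.integers(len(points))]
        # iii.) rank the reference vectors by distance to xi:
        # k[i] = number of vectors closer to xi than w_i
        d = np.linalg.norm(w - xi, axis=1)
        k = np.argsort(np.argsort(d))
        # iv.) adapt all reference vectors toward xi, weighted by
        # h_lambda(k) = exp(-k / lambda(t)); v.) t advances with the loop
        w += eps * np.exp(-k / lam)[:, None] * (xi - w)
    return w
```

The double `argsort` turns raw distances into the ranks $ k_i(\V \xi, {\cal A})$ required by step iii.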
The neural gas algorithm is used with the following parameters: $ \lambda_i = 10.0$, $ \lambda_f = 0.01$, $ \varepsilon_i = 0.5$, $ \varepsilon_f = 0.005$, $ t_\text{max} = 10000$. Note that $ t_\text{max}$ controls the run time. Fig. [*] shows 3D models of the database (top row) and subsampled versions with 250 points (bottom row).
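
The quantization error $ E({\cal D}, {\cal A})$ and the entropy $ H(X)$ defined above can be evaluated for a given sample set; a minimal NumPy sketch, with a hypothetical helper name:

```python
import numpy as np

def quantization_error_and_entropy(points, w):
    """Evaluate E(D, A) and H(X) for reference vectors w (the set A).

    E is the mean distance from each point to its nearest reference
    vector; H is the entropy of the induced Voronoi assignment.
    """
    # s(xi) = index of the nearest reference vector for each point
    d = np.linalg.norm(points[:, None, :] - w[None, :, :], axis=2)
    s = d.argmin(axis=1)
    # E(D, A): mean of ||xi - w_{s(xi)}|| over all of D
    E = d[np.arange(len(points)), s].mean()
    # H(X): entropy of P(x), the fraction of D mapped to each unit
    p = np.bincount(s, minlength=len(w)) / len(points)
    H = -np.sum(p[p > 0] * np.log(p[p > 0]))
    return E, H
```

For equiprobable samples, $ P(x) = 1/N$ for all units, and $ H$ attains its maximum $ \log N$.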

Figure: Top: 3D models (point clouds) of the database. Bottom: subsampled models with 250 points.


root 2005-05-03