
Calculation of the rotation and translation

In every iteration the optimal transformation ($ \M R$, $ \V t$) has to be computed. Eq. ([*]) can be reduced to

$\displaystyle E(\M R, \V t)$ $\displaystyle \propto$ $\displaystyle \frac{1}{N} \sum_{i=1}^N
\norm {\V m_i - (\M R \V d_i + \V t)}^2,$ (2)

with $ N = \sum_{i=1}^{N_m}\sum_{j=1}^{N_d}w_{i,j}$, since the correspondence matrix can be represented by a vector containing the point pairs.
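
For illustration only, the following minimal numpy sketch (not taken from the original implementation; array and parameter names are assumptions) evaluates this error once the point pairs have been selected:

import numpy as np

def registration_error(M, D, R, t):
    """Mean squared alignment error of eq. (2).

    M, D : (N, 3) arrays of corresponding model and data points
           (one row per point pair selected by the matching).
    R    : (3, 3) rotation matrix, t : (3,) translation vector.
    """
    residuals = M - (D @ R.T + t)            # m_i - (R d_i + t) for every pair
    return np.mean(np.sum(residuals**2, axis=1))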

Four methods are known to minimize eq. ([*]) [17]. In earlier work [20,27] we used a quaternion-based method [7], but the following one, based on singular value decomposition (SVD), is robust and easy to implement; thus we give a brief overview of the SVD-based algorithm. It was first published by Arun, Huang and Blostein [2]. The difficulty of this minimization problem is to enforce the orthonormality of the matrix $ \M R$. The first step of the computation is to decouple the calculation of the rotation $ \M R$ from the translation $ \V t$ using the centroids of the points belonging to the matching, i.e.,

$\displaystyle \V c_m = \frac{1}{N} \sum_{i=1}^{N} \V m_{i}, \qquad \qquad \V c_d = \frac{1}{N} \sum_{i=1}^{N} \V d_{i}$     (3)

and
$\displaystyle M' = \left\{ \V m'_{i} = \V m_{i} - \V c_{m} \right\}_{1,\ldots,N}, \qquad \qquad D' = \left\{ \V d'_{i} = \V d_{i} - \V c_{d} \right\}_{1,\ldots,N}.$     (4)
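
A small numpy sketch of this centering step (illustrative only; array names are assumptions), assuming the matched points are stored row-wise in arrays M and D of equal length:

import numpy as np

def center_point_sets(M, D):
    """Compute the centroids of eq. (3) and the centered sets M', D' of eq. (4)."""
    c_m = M.mean(axis=0)          # centroid of the model points
    c_d = D.mean(axis=0)          # centroid of the data points
    M_prime = M - c_m             # m'_i = m_i - c_m
    D_prime = D - c_d             # d'_i = d_i - c_d
    return c_m, c_d, M_prime, D_prime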

After inserting ([*]), ([*]) and ([*]) into the error function ([*]), $ E(\M R, \V t)$ becomes:

$\displaystyle \begin{eqnarray} E(\M R, \V t) &\propto& \frac{1}{N} \sum_{i=1}^{N}\norm {\V m'_{i} - \M R \V d'_i - \tilde{\V t}}^2 \\ &=& \frac{1}{N} \sum_{i=1}^{N}\norm {\V m'_{i} - \M R \V d'_i}^2 \\ && - \frac{2}{N}\, \tilde{\V t} \cdot \sum_{i=1}^{N} \left( \V m'_{i} - \M R \V d'_i \right) \\ && + \frac{1}{N} \sum_{i=1}^{N}\norm {\tilde {\V t}}^2,\end{eqnarray}$     (13)

with the abbreviation $ \tilde{\V t} = \V t - \V c_m + \M R \V c_d$.

In order to minimize the sum above, all three terms have to be minimized. The second sum ([*]) is zero, since all points refer to their centroids, i.e., $ \sum_{i=1}^{N} \V m'_{i} = \sum_{i=1}^{N} \V d'_{i} = \VNull$. The third part ([*]) has its minimum for $ \tilde {\V t} = \VNull $ or
$\displaystyle \V t = \V c_m - \M R \V c_d.$     (14)

Therefore the algorithm has to minimize only the first term, and the error function is expressed in terms of the rotation only:
$\displaystyle E(\M R, \V t) \propto
\sum_{i=1}^{N}\norm {\V m'_{i}-\M R \V d'_i}^2.$     (15)

Theorem: The optimal rotation is calculated by $ \M R = \M V \M U^T$. Hereby the matrices $ \M V$ and $ \M U$ are derived by the singular value decomposition $ \M H = \M U \M \Lambda \M V^T$ of a correlation matrix $ \M H$. This $ 3 \times 3$ matrix $ \M H$ is given by

$\displaystyle \M H = \sum_{i=1}^{N} \V d'_{i} \V {m'_{i}}^T
= \left(
\begin{array}{ccc}
S_{xx} & S_{xy} & S_{xz} \\
S_{yx} & S_{yy} & S_{yz} \\
S_{zx} & S_{zy} & S_{zz} \\
\end{array}\right),$     (16)

with $ S_{xx} = \sum_{i=1}^{N} d'_{ix} m'_{ix}$, $ S_{xy} = \sum_{i=1}^{N} d'_{ix} m'_{iy}, \, \ldots$ The corresponding algorithm follows directly from this theorem.
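
Purely as an illustration (a minimal numpy sketch, not the authors' implementation), the theorem translates into a few lines of code; the reflection case $ \det(\M V \M U^T) = -1$, which is discussed by Arun et al. [2] but not treated here, is only flagged by an assertion:

import numpy as np

def best_rigid_transform(M, D):
    """SVD-based estimate of (R, t) minimizing eq. (2) for matched pairs M, D."""
    c_m, c_d = M.mean(axis=0), D.mean(axis=0)
    M_p, D_p = M - c_m, D - c_d               # centered sets M', D', eq. (4)

    H = D_p.T @ M_p                           # H = sum_i d'_i m'_i^T, eq. (16)

    U, _, Vt = np.linalg.svd(H)               # H = U * Lambda * V^T
    R = Vt.T @ U.T                            # R = V U^T
    assert np.linalg.det(R) > 0, "degenerate/reflection case, see Arun et al. [2]"

    t = c_m - R @ c_d                         # eq. (14)
    return R, t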


Proof: Since rotation is length preserving, i.e., $ \norm{\M R \V d'_i}^2 = \norm{\V d'_i}^2$, the error function ([*]) expands to

$\displaystyle E(\M R, \V t) \propto
\sum_{i=1}^{N}\norm {\V m'_i}^2
- 2 \sum_{i=1}^{N} \V m'_{i} \cdot \M R
\V d'_i
+ \sum_{i=1}^{N} \norm {\V d'_i}^2.$      

The rotation affects only the middle term; thus it is sufficient to maximize
$\displaystyle \sum_{i=1}^{N} \V m'_{i} \cdot \M R \V
d'_i$ $\displaystyle =$ $\displaystyle \sum_{i=1}^{N} \V {m'_{i}}^T \M R \V
d'_i.$ (17)

Using the trace of a matrix, ([*]) can be rewritten to obtain
$\displaystyle \trace {\sum_{i=1}^{N} \M R \V d'_i \V {m'_{i}}^T } =
\trace {\M R \M H},$      

with $ \M H$ defined as in ([*]). Now we have to find the matrix $ \M R$ that maximizes $ \trace {\M R \M H}$.
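
A quick numerical sanity check of this trace identity (illustrative only, with random centered point sets):

import numpy as np

rng = np.random.default_rng(1)
M_p = rng.normal(size=(50, 3))                # centered model points m'_i
D_p = rng.normal(size=(50, 3))                # centered data points d'_i
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
R = Q if np.linalg.det(Q) > 0 else -Q         # arbitrary proper rotation

lhs = np.sum(np.einsum('ij,jk,ik->i', M_p, R, D_p))   # sum_i m'_i^T R d'_i
H = D_p.T @ M_p                                       # H = sum_i d'_i m'_i^T
rhs = np.trace(R @ H)
assert np.isclose(lhs, rhs)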

Assume that the singular value decomposition of $ \M H$ is

$\displaystyle \M H = \M U \M \Lambda \M V^T,$      

with $ \M U$ and $ \M V$ orthonormal $ 3 \times 3$ matrices and $ \M
\Lambda$ a $ 3 \times 3$ diagonal matrix without negative elements. Suppose
$\displaystyle \M R = \M V \M U^T.$      

$ \M R$ is orthonormal and
$\displaystyle \M R \M H = \M V \M U^T \M U \M \Lambda \M V^T = \M V \M \Lambda \M V^T$

is a symmetric, positive definite matrix. Arun, Huang and Blostein provide a lemma to show that
$\displaystyle \trace {\M R \M H} \geq \trace {\M B \M R \M H}$      

for any orthonormal matrix $ \M B$. Therefore the matrix $ \M R$ is optimal. Proving the lemma is straightforward using the Cauchy-Schwarz inequality [2]. Finally, the optimal translation is calculated as (cf. eq. ([*]) and ([*]))
$\displaystyle \V t = \V c_m - \M R \V c_d.$      
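
Continuing the illustrative sketch from above, in every ICP iteration the estimated transformation would be applied to the data points before the point pairs are recomputed (best_rigid_transform is the hypothetical helper sketched after the theorem, not part of the original text):

# one registration step, assuming matched pairs (M, D) from the current iteration
R, t = best_rigid_transform(M, D)
D = D @ R.T + t          # apply d_i -> R d_i + t to every data point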

