## Least Squares Fitting of… (0)

…  Affine Transformations.

Note to myself (0).

### 1. problem

Let’s say there are 2 sets of 2D points $\{(x_i, y_i)\}$ and $\{(u_i, v_i)\}$, $i = 0, 1,...,n-1$, where $(x_i, y_i)$ corresponds to $(u_i, v_i)$ for each $i$.  What I want here is a nice affine transform that maps each $(x_i, y_i)$ to $(u_i, v_i)$, such that the sum of squared distances between the mapped $(x_i, y_i)$ and $(u_i, v_i)$ is minimised.  That is, an affine transform that minimises this: $\displaystyle E = \sum_{i=0}^{n-1} [(x'_i - u_i)^2 + (y'_i - v_i)^2]$,

where $\displaystyle \begin{bmatrix} x'_i \\ y'_i \end{bmatrix} = \begin{bmatrix} a & b \\ c & d \end{bmatrix} \begin{bmatrix} x_i \\ y_i \end{bmatrix} + \begin{bmatrix} s \\ t \end{bmatrix}$.
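As a quick sanity check, the objective $E$ can be evaluated directly; a minimal sketch in Python with NumPy (the function name and the example points are mine):

```python
import numpy as np

def affine_error(M, st, xy, uv):
    """Sum of squared distances E between mapped points and targets.

    M  : 2x2 matrix [[a, b], [c, d]]
    st : translation vector [s, t]
    xy : (n, 2) array of source points (x_i, y_i)
    uv : (n, 2) array of target points (u_i, v_i)
    """
    mapped = xy @ M.T + st          # (x'_i, y'_i) for each i
    return np.sum((mapped - uv) ** 2)

# Example: targets are the sources shifted by (1, 2), so the pure
# translation by (1, 2) is an exact fit and E vanishes.
xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
uv = xy + np.array([1.0, 2.0])
print(affine_error(np.eye(2), np.array([1.0, 2.0]), xy, uv))  # exact fit → 0.0
```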

### 2. go calculus

In order for $E$ to attain its minimum, it is necessary that $E$ is stationary, i.e. all 1st partial derivatives of $E$ with respect to $a, b, c, d, s, t$ must be $0$.  Writing down the necessary conditions, I get: $\displaystyle \dfrac{\partial E}{\partial a} = \dfrac{\partial E}{\partial b} = \dfrac{\partial E}{\partial c} = \dfrac{\partial E}{\partial d} = \dfrac{\partial E}{\partial s} = \dfrac{\partial E}{\partial t} = 0$.

Computing the partial derivatives and simplifying (a bit tedious) leads to 2 separate systems of linear equations: $\displaystyle \begin{bmatrix} \sum x_i x_i & \sum x_i y_i & \sum x_i \\ \sum x_i y_i & \sum y_i y_i & \sum y_i \\ \sum x_i & \sum y_i & n \end{bmatrix} \begin{bmatrix} a \\ b \\ s \end{bmatrix} = \begin{bmatrix} \sum x_i u_i \\ \sum y_i u_i \\ \sum u_i \end{bmatrix}$, $\displaystyle \begin{bmatrix} \sum x_i x_i & \sum x_i y_i & \sum x_i \\ \sum x_i y_i & \sum y_i y_i & \sum y_i \\ \sum x_i & \sum y_i & n \end{bmatrix} \begin{bmatrix} c \\ d \\ t \end{bmatrix} = \begin{bmatrix} \sum x_i v_i \\ \sum y_i v_i \\ \sum v_i \end{bmatrix}$,

(sums are over i = 0 to n-1).

Solving for $a, b, c, d, s, t$ gives me what I wanted, the “nice” affine transform.
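The two 3×3 systems can be assembled from the sums and handed to a linear solver; a minimal sketch in Python with NumPy (the function name is mine):

```python
import numpy as np

def fit_affine_sums(xy, uv):
    """Solve the two 3x3 normal-equation systems from section 2.

    xy, uv : (n, 2) arrays of corresponding points.
    Returns (a, b, c, d, s, t).
    """
    x, y = xy[:, 0], xy[:, 1]
    u, v = uv[:, 0], uv[:, 1]
    n = len(x)
    # Common coefficient matrix; the bottom-right entry is n (= sum of 1).
    G = np.array([
        [np.sum(x * x), np.sum(x * y), np.sum(x)],
        [np.sum(x * y), np.sum(y * y), np.sum(y)],
        [np.sum(x),     np.sum(y),     n        ],
    ])
    a, b, s = np.linalg.solve(G, [np.sum(x * u), np.sum(y * u), np.sum(u)])
    c, d, t = np.linalg.solve(G, [np.sum(x * v), np.sum(y * v), np.sum(v)])
    return a, b, c, d, s, t
```

If the target points are generated by an exact affine transform of non-collinear sources, the fit recovers its coefficients exactly (up to floating-point error).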

### 3. go linear algebra

Let’s imagine an ideal situation where the mapped points $\{(x'_i, y'_i)\}$ coincide with $\{(u_i, v_i)\}$, i.e. $\displaystyle \begin{bmatrix} ax_i + by_i + s \\ cx_i + dy_i + t \end{bmatrix} = \begin{bmatrix} u_i \\ v_i \end{bmatrix}$

for each $i$.  This can be rewritten as 2 systems of equations as follows: $\displaystyle \begin{bmatrix} x_0 & y_0 & 1 \\ x_1 & y_1 & 1 \\ ... & ... & ... \\ x_{n-1} & y_{n-1} & 1 \end{bmatrix} \begin{bmatrix} a \\ b \\ s \end{bmatrix} = \begin{bmatrix} u_0 \\ u_1 \\ ... \\ u_{n-1} \end{bmatrix}$, $\displaystyle \begin{bmatrix} x_0 & y_0 & 1 \\ x_1 & y_1 & 1 \\ ... & ... & ... \\ x_{n-1} & y_{n-1} & 1 \end{bmatrix} \begin{bmatrix} c \\ d \\ t \end{bmatrix} = \begin{bmatrix} v_0 \\ v_1 \\ ... \\ v_{n-1} \end{bmatrix}$.

Outside this ideal situation, however, the equations above do not hold exactly, and I cannot simply equate LHS with RHS.

But the systems give me a clue for looking at the problem from a different viewpoint.  Namely, the LHS can be seen as a linear combination of 3 column vectors, which span a 3-dimensional subspace of $n$-dimensional space.  The RHS is a vector that also resides in the same $n$-dimensional space, but in general it does not lie in the 3-dimensional subspace spanned by the LHS columns.

The closest point in that 3-dimensional subspace to the RHS vector is given by orthogonally projecting the RHS vector onto the subspace.  The residual of this projection must be orthogonal to the 3 column vectors of the LHS matrix.  Putting this condition into math expressions, I get: $\displaystyle \begin{bmatrix} x_0 & x_1 & ... & x_{n-1} \\ y_0 & y_1 & ... & y_{n-1} \\ 1 & 1 & ... & 1 \end{bmatrix} \left(\begin{bmatrix} x_0 & y_0 & 1 \\ x_1 & y_1 & 1 \\ ... & ... & ... \\ x_{n-1} & y_{n-1} & 1 \end{bmatrix} \begin{bmatrix} a \\ b \\ s \end{bmatrix} - \begin{bmatrix} u_0 \\ u_1 \\ ... \\ u_{n-1} \end{bmatrix} \right) = \boldsymbol{0}$, $\displaystyle \begin{bmatrix} x_0 & x_1 & ... & x_{n-1} \\ y_0 & y_1 & ... & y_{n-1} \\ 1 & 1 & ... & 1 \end{bmatrix} \left(\begin{bmatrix} x_0 & y_0 & 1 \\ x_1 & y_1 & 1 \\ ... & ... & ... \\ x_{n-1} & y_{n-1} & 1 \end{bmatrix} \begin{bmatrix} c \\ d \\ t \end{bmatrix} - \begin{bmatrix} v_0 \\ v_1 \\ ... \\ v_{n-1} \end{bmatrix} \right) = \boldsymbol{0}$.

These can be rewritten as $\displaystyle \begin{bmatrix} x_0 & x_1 & ... & x_{n-1} \\ y_0 & y_1 & ... & y_{n-1} \\ 1 & 1 & ... & 1 \end{bmatrix} \begin{bmatrix} x_0 & y_0 & 1 \\ x_1 & y_1 & 1 \\ ... & ... & ... \\ x_{n-1} & y_{n-1} & 1 \end{bmatrix} \begin{bmatrix} a \\ b \\ s \end{bmatrix} = \begin{bmatrix} x_0 & x_1 & ... & x_{n-1} \\ y_0 & y_1 & ... & y_{n-1} \\ 1 & 1 & ... & 1 \end{bmatrix} \begin{bmatrix} u_0 \\ u_1 \\ ... \\ u_{n-1} \end{bmatrix}$, $\displaystyle \begin{bmatrix} x_0 & x_1 & ... & x_{n-1} \\ y_0 & y_1 & ... & y_{n-1} \\ 1 & 1 & ... & 1 \end{bmatrix} \begin{bmatrix} x_0 & y_0 & 1 \\ x_1 & y_1 & 1 \\ ... & ... & ... \\ x_{n-1} & y_{n-1} & 1 \end{bmatrix} \begin{bmatrix} c \\ d \\ t \end{bmatrix} = \begin{bmatrix} x_0 & x_1 & ... & x_{n-1} \\ y_0 & y_1 & ... & y_{n-1} \\ 1 & 1 & ... & 1 \end{bmatrix} \begin{bmatrix} v_0 \\ v_1 \\ ... \\ v_{n-1} \end{bmatrix}$.

Computing the matrix products, these equations reduce to the ones given in section 2.
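These are the normal equations $A^{\mathsf{T}} A \, \boldsymbol{p} = A^{\mathsf{T}} \boldsymbol{b}$ for the design matrix $A$ with rows $(x_i, y_i, 1)$, so a library least-squares routine handles both systems directly; a minimal sketch in Python with NumPy (the function name is mine):

```python
import numpy as np

def fit_affine_lstsq(xy, uv):
    """Least-squares affine fit via the design matrix A = [x | y | 1].

    np.linalg.lstsq minimises ||A p - b|| for each target column of uv,
    which is exactly the orthogonal-projection condition above.
    """
    A = np.column_stack([xy, np.ones(len(xy))])   # rows (x_i, y_i, 1)
    P, *_ = np.linalg.lstsq(A, uv, rcond=None)    # P is 3x2
    (a, c), (b, d), (s, t) = P                    # columns: (a, b, s), (c, d, t)
    return a, b, c, d, s, t
```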

As a computer programmer myself, what is nice about this approach is that I can do away with some error-prone computation by hand, provided that I have basic matrix operators and solvers at hand.  Furthermore, with Householder transformations at hand, I can even dispose of the matrix multiplication, which is actually my favourite way to go.
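A sketch of that Householder route: `numpy.linalg.qr` factorises via Householder reflections, so with $A = QR$ each system becomes the small triangular solve $R\,\boldsymbol{p} = Q^{\mathsf{T}}\boldsymbol{b}$ and no explicit $A^{\mathsf{T}} A$ product is needed (the function name is mine):

```python
import numpy as np

def fit_affine_qr(xy, uv):
    """Affine fit via QR factorisation, avoiding the A^T A product.

    With the reduced QR factorisation A = Q R, the least-squares
    solution of A p = b is the triangular system R p = Q^T b.
    """
    A = np.column_stack([xy, np.ones(len(xy))])   # rows (x_i, y_i, 1)
    Q, R = np.linalg.qr(A)                        # Householder-based QR
    a, b, s = np.linalg.solve(R, Q.T @ uv[:, 0])  # first system
    c, d, t = np.linalg.solve(R, Q.T @ uv[:, 1])  # second system
    return a, b, c, d, s, t
```

This also tends to behave better numerically than forming $A^{\mathsf{T}} A$, whose condition number is the square of that of $A$.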


