Least Square Fitting of… (1)

Affine Transformation, 2D, 4-Degrees of Freedom

(a variation of the previous theme).

1. Intro

What I did last time was fitting an affine transformation with the full 6 DoF.  This time it is 4 DoF.  By 4-DoF, I mean uniform scaling, rotation and x, y translations.

Say the scale is c, the rotation angle is \theta and the translations are s, t.  Then the 4-DoF transformation from (x, y) to (u, v) can be written as:

\displaystyle \begin{bmatrix} u \\ v \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} c & 0 \\ 0 & c \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} + \begin{bmatrix} s \\ t \end{bmatrix}

\displaystyle = \begin{bmatrix} c\cos \theta & -c\sin\theta \\ c\sin\theta & c\cos\theta \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} + \begin{bmatrix} s \\ t \end{bmatrix}.

Letting a = c\cos\theta and b = c\sin\theta, the transform above can be rewritten as:

\displaystyle \begin{bmatrix} u \\ v \end{bmatrix} = \begin{bmatrix} a & -b \\ b & a \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} + \begin{bmatrix} s \\ t \end{bmatrix}.
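As a quick sanity check, the transform in this (a, b) form can be applied directly.  A minimal NumPy sketch; `apply_transform` is my own hypothetical helper, not from any library:

```python
import numpy as np

def apply_transform(points, a, b, s, t):
    # (u, v) = (a*x - b*y + s, b*x + a*y + t); points is an (N, 2) array of (x, y) rows.
    x, y = points[:, 0], points[:, 1]
    return np.column_stack((a * x - b * y + s, b * x + a * y + t))

# a = 0, b = 2 corresponds to scale c = 2 and rotation theta = 90 degrees:
# (1, 0) is scaled to (2, 0), rotated to (0, 2), then translated by (1, 1) to (1, 3).
uv = apply_transform(np.array([[1.0, 0.0]]), a=0.0, b=2.0, s=1.0, t=1.0)
```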

So again my problem is to choose nice values of a, b, s, t that minimise this error function:

\displaystyle E(a, b, s, t) = \sum_{i=0}^{N-1} [(ax_i - by_i + s - u_i)^2 + (bx_i + ay_i + t - v_i)^2].
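The error function itself is straightforward to evaluate.  A sketch (again assuming NumPy; `residual_error` is my own name for it):

```python
import numpy as np

def residual_error(points, targets, a, b, s, t):
    # E(a, b, s, t): the sum of squared residuals over all N point pairs.
    x, y = points[:, 0], points[:, 1]
    u, v = targets[:, 0], targets[:, 1]
    return np.sum((a * x - b * y + s - u) ** 2 + (b * x + a * y + t - v) ** 2)
```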

2. by Calculus

Again, in order for the error function E to have its minimum value at some a, b, s, t, all partial derivatives must be 0.  With some hand calculation (a bit tedious), this requirement leads to the following system of equations:

\displaystyle \sum x_i^2 a + \sum x_i s - \sum x_i u_i + \sum y_i^2 a + \sum y_i t - \sum y_i v_i = 0,

\displaystyle \sum y_i^2 b - \sum y_i s + \sum y_i u_i + \sum x_i^2 b + \sum x_i t - \sum x_i v_i = 0,

\displaystyle Ns + \sum x_i a - \sum y_i b - \sum u_i = 0,

\displaystyle Nt + \sum x_i b + \sum y_i a - \sum v_i = 0.

This system looks quite cumbersome, but it is solvable.  First, for a, b:

\displaystyle a = \dfrac{-\sum x_i \sum u_i + N \sum x_i u_i - \sum y_i \sum v_i + N \sum y_i v_i}{N \sum x_i^2 - (\sum x_i)^2 + N \sum y_i^2 - (\sum y_i)^2},

\displaystyle b = \dfrac{\sum y_i \sum u_i - N \sum y_i u_i - \sum x_i \sum v_i + N \sum x_i v_i}{N\sum y_i^2 - (\sum y_i)^2 + N \sum x_i^2 - (\sum x_i)^2}.

And once a, b are obtained, s, t are too:

\displaystyle s = (-\sum x_i a + \sum y_i b + \sum u_i)/N,

\displaystyle t = (-\sum x_i b - \sum y_i a + \sum v_i)/N.
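Putting the four formulas together, the closed-form solution can be sketched like this (assuming NumPy; the function name is mine):

```python
import numpy as np

def fit_similarity_closed_form(points, targets):
    # Solve for (a, b, s, t) using the closed-form sums derived above.
    x, y = points[:, 0], points[:, 1]
    u, v = targets[:, 0], targets[:, 1]
    n = len(points)
    sx, sy, su, sv = x.sum(), y.sum(), u.sum(), v.sum()
    # a and b share the same denominator.
    denom = n * (x @ x) - sx ** 2 + n * (y @ y) - sy ** 2
    a = (-sx * su + n * (x @ u) - sy * sv + n * (y @ v)) / denom
    b = (sy * su - n * (y @ u) - sx * sv + n * (x @ v)) / denom
    s = (-sx * a + sy * b + su) / n
    t = (-sx * b - sy * a + sv) / n
    return a, b, s, t
```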

sweat sweat sweat.  And this is ugly.

3. by Linear Algebra

Pretending I am already given nice a, b, s, t, I can write:

\displaystyle \begin{bmatrix} ax_0 - by_0 + s \\ bx_0 + ay_0 + t \\ ax_1 - by_1 + s \\ bx_1 + ay_1 + t \\ ... \end{bmatrix} \approx \begin{bmatrix} u_0 \\ v_0 \\ u_1 \\ v_1 \\ ... \end{bmatrix}.

Tweaking and rearranging the above non-equation to factor out a, b, s, t (a bit of a puzzle), I can rewrite it as follows:

\displaystyle \begin{bmatrix} x_0 & -y_0 & 1 & 0 \\ y_0 & x_0 & 0 & 1 \\ x_1 & -y_1 & 1 & 0 \\ y_1 & x_1 & 0 & 1 \\ ... & ... & ... & ... \end{bmatrix} \begin{bmatrix} a \\ b \\ s \\ t \end{bmatrix} \approx \begin{bmatrix} u_0 \\ v_0 \\ u_1 \\ v_1 \\ ... \end{bmatrix}.

Then I would rely on the Householder transformation again to solve for a, b, s, t.  In this way, I can have my computer do the work instead of working hard myself.  Relieved.
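The same overdetermined system can also be handed to a library solver.  A sketch with NumPy (note np.linalg.lstsq uses an SVD internally rather than Householder QR, but it minimises the same error):

```python
import numpy as np

def fit_similarity_lstsq(points, targets):
    # Build the stacked (2N x 4) design matrix from the text, then solve.
    x, y = points[:, 0], points[:, 1]
    n = len(points)
    A = np.zeros((2 * n, 4))
    A[0::2] = np.column_stack((x, -y, np.ones(n), np.zeros(n)))  # rows for u_i
    A[1::2] = np.column_stack((y, x, np.zeros(n), np.ones(n)))   # rows for v_i
    rhs = targets.reshape(-1)  # interleaved u_0, v_0, u_1, v_1, ...
    (a, b, s, t), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return a, b, s, t
```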


About azumih

Computer Programmer
This entry was posted in computer programming. Bookmark the permalink.
