Find a $3 \times 3$ transformation matrix given a set of input and output $3 \times 1$ matrices

linear algebra, matrices, matrix equations

I am trying to find out whether it is possible to come up with a $3 \times 3$ transformation matrix that transforms a set of more than three $3 \times 1$ input matrices into an equally sized set of $3 \times 1$ output matrices.

The number of inputs and outputs has to be more than three. I have already managed to do it for three inputs and outputs by turning the problem into systems of equations in three unknowns.

To give you some context, I am studying color science and we use $3 \times 3$ matrices a lot to transform colors.

One task that uses a $3 \times 3$ transformation matrix is filming a color chart containing a set of color patches with known values. We then take the resulting image and create a transformation matrix that maps the values of the patches in the image back to their original known values, in order to correct the colors of the whole image.

Now, there are a lot of programs that do that automatically, but I am trying to understand the math they use.

As I said, the color of each patch is represented by a $3 \times 1$ matrix of its RGB values.

The required transformation matrix is $3 \times 3$.

Is it possible to find a matrix that transforms all of the inputs to all of the outputs, or at least gets them close enough?

Best Answer

"Gets them close enough" is the tricky part. How do you measure how far apart they are? If the answer is "I use the usual Euclidean metric," so that the distance from $(a, b, c)$ to $(x, y, z)$ is $\sqrt{(a-x)^2 + (b-y)^2 + (c-z)^2}$, then you can minimize the sum of squared errors by using the Moore-Penrose pseudo-inverse.

Here's (roughly) the trick. You want $$ Mv_i = u_i $$ for some given list of "input" vectors $v_1, v_2, \ldots, v_k$ and "target vectors" $u_1, u_2, \ldots, u_k$, where each of these is in $\Bbb R^3$, and $M$ should be a $3 \times 3$ matrix.

If you put the $v$-vectors into a $3 \times k$ matrix (each vector becomes one column of the matrix $V$), and do the same thing for the $u$-vectors, making a matrix $U$, then you're hoping that $$ M V = U $$ where $M$ is $3 \times 3$, and $V$ and $U$ are each $3 \times k$. For $k > 3$, this problem is generally not solvable. But if you look at $$ R = MV - U $$ for a particular $M$, that's a matrix that shows the "errors" -- how far $Mv_i$ is from $u_i$. And if the sum of the squares of the entries of $R$ is small, then you've done well.
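As a concrete illustration of the bookkeeping (the patch values below are made up, and NumPy is assumed), stacking the vectors column-wise and evaluating the residual looks like this:

```python
import numpy as np

# k = 4 made-up input/target pairs in R^3 (think RGB triples).
V = np.column_stack([[0.2, 0.3, 0.7], [0.8, 0.1, 0.2],
                     [0.1, 0.9, 0.4], [0.5, 0.5, 0.5]])       # 3 x k inputs
U = np.column_stack([[0.25, 0.35, 0.65], [0.75, 0.05, 0.25],
                     [0.15, 0.85, 0.45], [0.50, 0.45, 0.55]])  # 3 x k targets

M = np.eye(3)          # any candidate 3 x 3 matrix
R = M @ V - U          # residual matrix: column i is M v_i - u_i
print((R ** 2).sum())  # sum of squared entries of R
```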

Now that is a calculus problem: over all $3 \times 3$ matrices $M$, consider the function $$ f(M) = \sum_{i,j}(MV - U)_{i,j}^2 $$ and minimize it. You take derivatives, set them to zero, and (after a good deal of algebra) you arrive at a formula for $M$.
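Sketching the algebra that gets skipped over: $f(M)$ is the squared Frobenius norm $\|MV - U\|_F^2$, whose derivative with respect to the entries of $M$ is $$ \frac{\partial f}{\partial M} = 2\,(MV - U)\,V^t, $$ and setting this to zero yields the normal equations $$ M\,(VV^t) = UV^t, $$ which is exactly the system derived below.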

Let me lead you there in a slightly different way. Let's suppose that there is such a matrix $M$ for your $U$ and $V$ data. Start from $$ MV = U $$ and multiply both sides on the right by $V^t$ to get $$ M (VV^t) = (U V^t) $$ The matrices in parentheses are now $3 \times 3$, and unless your set of $v$-vectors is particularly bad (e.g., they all lie in one plane!), the matrix $VV^t$ is invertible. So you find that $$ M = (UV^t)(VV^t)^{-1} $$

Now that's the answer for $M$ in the case that $MV = U$ does have a solution. The surprising thing (which you get by doing the calculus I mentioned) is that even if $MV = U$ doesn't have an exact solution, the matrix $$ M^{*} = (UV^t)(VV^t)^{-1} $$ is the one for which $f(M)$ is minimized, i.e., for which the transformed $v$-vectors are as close as possible (in the aggregate) to the corresponding $u$-vectors.
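Here is a sketch of that closed form in Python/NumPy (reusing the made-up data from the earlier snippet, and solving a linear system rather than forming $(VV^t)^{-1}$ explicitly, which is the numerically safer habit):

```python
import numpy as np

V = np.column_stack([[0.2, 0.3, 0.7], [0.8, 0.1, 0.2],
                     [0.1, 0.9, 0.4], [0.5, 0.5, 0.5]])
U = np.column_stack([[0.25, 0.35, 0.65], [0.75, 0.05, 0.25],
                     [0.15, 0.85, 0.45], [0.50, 0.45, 0.55]])

# M* = (U V^t)(V V^t)^{-1}: solve (V V^t) X = V U^t for X = M*^t,
# then transpose, instead of inverting V V^t directly.
M_star = np.linalg.solve(V @ V.T, V @ U.T).T

# Cross-check against NumPy's least-squares solver applied to V^t M^t = U^t.
M_lstsq = np.linalg.lstsq(V.T, U.T, rcond=None)[0].T
print(np.allclose(M_star, M_lstsq))  # True
```

In the color-correction setting from the question, the columns of $V$ would be the measured patch RGBs from the filmed image and the columns of $U$ the chart's known reference values.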
