Trouble with the Jacobian for a camera projection matrix.

derivatives, jacobian, linear algebra, partial derivative, total-variation

I am using the Gauss-Newton method to update a camera projection matrix:
$$
x = \frac{PX}{P_\text{last row} X}
$$

where the division normalizes each column of $PX$ by its third coordinate, $x$ is a known $3\times k$ matrix of measured image points, $X$ is a known $4\times k$ matrix of homogeneous world points, and
$$
P =
\begin{bmatrix}
p_1 & p_2 & p_3 & p_4\\
p_5 & p_6 & p_7 & p_8\\
p_{9} & p_{10} & p_{11} & p_{12}\\
\end{bmatrix}
$$
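For concreteness, this model can be written out in a few lines of numpy (a minimal sketch with made-up example data; the names follow the notation above):

```python
import numpy as np

# Made-up example data following the notation above.
rng = np.random.default_rng(0)
P = rng.standard_normal((3, 4))    # camera projection matrix
X = rng.standard_normal((4, 5))    # k = 5 homogeneous world points

V = P @ X                          # 3 x k
x = V / V[2]                       # each column divided by its third entry,
                                   # so the last row of x is all ones
```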

I have a loss function $L$ that I want to minimize, built from the residual
$$
R(P) = \frac{PX}{P_\text{last row} X} - x
$$

and I unroll $R(P)$ column by column into
$$
r(P) =
\begin{bmatrix}
R(P)_{1,1} \\ R(P)_{2,1} \\ \vdots \\ R(P)_{2,k} \\ R(P)_{3,k}
\end{bmatrix}
$$

then
$$
L(P) = r(P)^T \cdot r(P)
$$

I unroll P into a vector p
$$
p =
\begin{bmatrix}
p_1 \\ p_2 \\ \vdots \\ p_{11} \\ p_{12}\\
\end{bmatrix}
$$
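In code, the two unrolling conventions look as follows (a sketch; the helper names are illustrative, and note that $r$ stacks $R$ column by column while $p$ stacks $P$ row by row):

```python
import numpy as np

def residual(P, X, x_meas):
    """Unrolled residual r(P): stacks R = PX / (last row of PX) - x
    column by column, as in the definition above."""
    V = P @ X
    R = V / V[2] - x_meas
    return R.flatten(order="F")    # R_{1,1}, R_{2,1}, R_{3,1}, R_{1,2}, ...

def unroll(P):
    """p: stacks P row by row, so p_1..p_4 is the first row of P."""
    return P.flatten()             # C (row-major) order
```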

I know
$$
\frac{\partial L}{\partial p} = 2 \left( \frac{\partial r}{\partial p} \right)^{T} r(P)
$$

and then I need the Jacobian
$$
\frac{\partial r(P)}{\partial p} = J =
\begin{bmatrix}
\frac{\partial R(P)_{1,1}}{\partial p_1} & \cdots & \frac{\partial R(P)_{1,1}}{\partial p_{12}}\\
\frac{\partial R(P)_{2,1}}{\partial p_1} & \cdots & \frac{\partial R(P)_{2,1}}{\partial p_{12}}\\
\frac{\partial R(P)_{1,2}}{\partial p_1} & \cdots & \frac{\partial R(P)_{1,2}}{\partial p_{12}}\\
\vdots & \ddots & \vdots\\
\frac{\partial R(P)_{2,k}}{\partial p_1} & \cdots & \frac{\partial R(P)_{2,k}}{\partial p_{12}}\\
\end{bmatrix}
$$

I removed every third row (the rows coming from $R(P)_{3,i}$), since the third row of the normalized output is always 1 and therefore does not depend on $P$.

$$
\frac{\partial R(P)_{1,i}}{\partial P_{1, j}} = \frac{X_{j, i}}{X_{1, i}P_{3, 1} + X_{2, i}P_{3, 2} + X_{3, i}P_{3, 3} + X_{4, i}P_{3, 4}} = \frac{X_{j,i}}{c_i}\\
\frac{\partial R(P)_{1,i}}{\partial P_{2, j}} = 0 \\
\frac{\partial R(P)_{1,i}}{\partial P_{3, j}} = \frac{X_{j, i}(X_{1, i}P_{1, 1} + X_{2, i}P_{1, 2} + X_{3, i}P_{1, 3} + X_{4, i}P_{1, 4})}{(X_{1, i}P_{3, 1} + X_{2, i}P_{3, 2} + X_{3, i}P_{3, 3} + X_{4, i}P_{3, 4})^2} = \frac{X_{j, i}a_i}{c_i^2}\\
\frac{\partial R(P)_{2,i}}{\partial P_{1, j}} = 0 \\
\frac{\partial R(P)_{2,i}}{\partial P_{2, j}} = \frac{X_{j, i}}{X_{1, i}P_{3, 1} + X_{2, i}P_{3, 2} + X_{3, i}P_{3, 3} + X_{4, i}P_{3, 4}} = \frac{X_{j,i}}{c_i}\\
\frac{\partial R(P)_{2,i}}{\partial P_{3, j}} = \frac{X_{j, i}(X_{1, i}P_{2, 1} + X_{2, i}P_{2, 2} + X_{3, i}P_{2, 3} + X_{4, i}P_{2, 4})}{(X_{1, i}P_{3, 1} + X_{2, i}P_{3, 2} + X_{3, i}P_{3, 3} + X_{4, i}P_{3, 4})^2} = \frac{X_{j, i}b_i}{c_i^2} \\
\vdots
$$

Writing $a_i = (PX)_{1,i}$, $b_i = (PX)_{2,i}$ and $c_i = (PX)_{3,i}$ for the substitutions above, I get
$$
J =
\begin{bmatrix}
\frac{X_{1,1}}{c_1} & 0 & \frac{X_{1, 1}a_1}{c_1^2} & \cdots & \frac{X_{4, 1}a_1}{c_1^2}\\
0 & \frac{X_{1,1}}{c_1} & \frac{X_{1, 1}b_1}{c_1^2} & \cdots & \frac{X_{4, 1}b_1}{c_1^2}\\
\frac{X_{1,2}}{c_2} & 0 & \frac{X_{1, 2}a_2}{c_2^2} & \cdots & \frac{X_{4, 2}a_2}{c_2^2}\\
0 & \frac{X_{1,2}}{c_2} & \frac{X_{1, 2}b_2}{c_2^2} & \cdots & \frac{X_{4, 2}b_2}{c_2^2}\\
\vdots & \vdots & \vdots & \ddots & \vdots\\
0 & \frac{X_{1,k}}{c_k} & \cdots & \cdots & \frac{X_{4, k}b_k}{c_k^2}\\
\end{bmatrix}
$$
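One way to validate analytic partials like these is to compare $J$ column by column against central finite differences (a minimal numpy sketch; `residual` is the unrolled $r(P)$ from above, and the rows belonging to the constant third coordinate come out numerically zero, matching their removal):

```python
import numpy as np

def residual(P, X, x_meas):
    V = P @ X
    return (V / V[2] - x_meas).flatten(order="F")

def numeric_jacobian(P, X, x_meas, eps=1e-6):
    """Central finite differences of r with respect to the
    row-major unroll p of P."""
    p = P.flatten()
    J = np.empty((3 * X.shape[1], 12))
    for j in range(12):
        dp = np.zeros(12)
        dp[j] = eps
        r_plus = residual((p + dp).reshape(3, 4), X, x_meas)
        r_minus = residual((p - dp).reshape(3, 4), X, x_meas)
        J[:, j] = (r_plus - r_minus) / (2 * eps)
    return J   # every third row (the third coordinates) is ~0
```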

But when I plug this into the Gauss-Newton method:

$$
p_{n+1} = p_n - (J^TJ)^{-1}J^T r(P_n)
$$
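For concreteness, one way to implement this step in numpy (a sketch; `np.linalg.lstsq` computes the same update as the explicit inverse but is better conditioned):

```python
import numpy as np

def gauss_newton_step(p, J, r):
    """One Gauss-Newton update: p - (J^T J)^{-1} J^T r, computed as a
    least-squares solve of J @ step = r instead of an explicit inverse."""
    step, *_ = np.linalg.lstsq(J, r, rcond=None)
    return p - step
```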

It just gives me wildly incorrect results. Where did I go wrong?

Best Answer

For convenience, the cost function is rewritten as $\phi = \| \mathbf{R} \|_F^2$ with the $3 \times k$ residual matrix
$$ \mathbf{R} = \mathbf{U} - \mathbf{X}_\mathrm{meas} $$
where $\mathbf{X}_\mathrm{meas}$ denotes the measured $x$, and
$$ \mathbf{U} = \mathbf{V} \mathbf{D}^{-1}, \qquad \mathbf{V} = \mathbf{P}\mathbf{X}, \qquad \mathbf{D} = \mathrm{diag}(\mathbf{V}^T \mathbf{e}_3). $$

We deduce
\begin{eqnarray*} d\mathbf{R} = d\mathbf{U} &=& (d\mathbf{V}) \mathbf{D}^{-1} - \mathbf{U} (d\mathbf{D}) \mathbf{D}^{-1} \\ &=& \mathbf{I}_3 (d\mathbf{P}) \mathbf{X} \mathbf{D}^{-1} - \mathbf{U} (d\mathbf{D}) \mathbf{D}^{-1}. \end{eqnarray*}
Vectorization (column-major $\mathrm{vec}$), together with $d\mathbf{D} = \mathrm{diag}\!\left((d\mathbf{V})^T \mathbf{e}_3\right)$, yields
\begin{eqnarray*} d\mathbf{r} &=& \left[ ( \mathbf{X} \mathbf{D}^{-1} )^T \otimes \mathbf{I}_3 \right] d\mathbf{p} - (\mathbf{D}^{-1} \ast \mathbf{U}) (d\mathbf{V})^T \mathbf{e}_3 \\ &=& \left[ \mathbf{D}^{-1} \mathbf{X}^T \otimes \mathbf{I}_3 \right] d\mathbf{p} - (\mathbf{D}^{-1} \ast \mathbf{U}) \mathbf{X}^T \mathbf{I}_4 (d\mathbf{P})^T \mathbf{e}_3 \\ &=& \left[ \mathbf{D}^{-1} \mathbf{X}^T \otimes \mathbf{I}_3 \right] d\mathbf{p} - (\mathbf{D}^{-1} \ast \mathbf{U}) \mathbf{X}^T (\mathbf{e}_3^T \otimes \mathbf{I}_4)\, d\,\mathrm{vec}(\mathbf{P}^T) \end{eqnarray*}
where $\ast$ denotes the column-wise Khatri-Rao product.

Using the vec-permutation (commutation) matrix $\mathbf{K}$, defined by $\mathrm{vec}(\mathbf{P}^T) = \mathbf{K}\, \mathrm{vec}(\mathbf{P})$, the $3k \times 12$ Jacobian in matrix form is
$$ \mathbf{J} = \left[ \mathbf{D}^{-1} \mathbf{X}^T \otimes \mathbf{I}_3 \right] - (\mathbf{D}^{-1} \ast \mathbf{U}) \mathbf{X}^T (\mathbf{e}_3^T \otimes \mathbf{I}_4) \mathbf{K}. $$
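Translated into numpy, the formula reads roughly as follows (a sketch under the column-major $\mathrm{vec}$ convention; `K` is the $12 \times 12$ commutation matrix just defined):

```python
import numpy as np

def jacobian(P, X):
    """3k x 12 Jacobian of vec(R) with respect to vec(P), column-major."""
    k = X.shape[1]
    V = P @ X
    d = V[2]                                   # diagonal of D
    U = V / d

    # First term: D^{-1} X^T (Kronecker) I_3
    term1 = np.kron((X / d).T, np.eye(3))

    # Khatri-Rao product D^{-1} * U: column j is (e_j / d_j) (Kronecker) u_j
    KR = np.zeros((3 * k, k))
    for j in range(k):
        KR[3 * j:3 * j + 3, j] = U[:, j] / d[j]

    # Commutation matrix K: vec(P^T) = K vec(P)
    K = np.zeros((12, 12))
    for i in range(3):
        for j in range(4):
            K[4 * i + j, 3 * j + i] = 1.0

    e3T = np.array([[0.0, 0.0, 1.0]])          # e_3^T
    term2 = KR @ X.T @ np.kron(e3T, np.eye(4)) @ K
    return term1 - term2
```

Since the question unrolls $P$ row by row ($p = \mathrm{vec}(P^T)$), multiply this Jacobian by $\mathbf{K}^T$ on the right before comparing it against a finite-difference check taken with respect to the question's $p$.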

The second term of the Jacobian affects only columns 3, 6, 9 and 12, i.e. the entries of the third row of $\mathbf{P}$ in the column-major ordering of $\mathrm{vec}(\mathbf{P})$. Note the minus sign in front of this term: the question's partial derivatives with respect to the third row of $P$ have the wrong sign.
