Projection of vector $v$ onto a plane in $\mathbb R^k$

linear-algebra, linear-transformations, projective-space, vector-spaces

Suppose that $v$ is a vector in $\mathbb R^k$, and that $n_1,n_2,\dots,n_{k-2}$ are the normal vectors of a plane $H$. What is the projection of $v$ onto that plane?


If $k=3$ and $n_1$ is a unit normal, then the projection is $$x=v-(n_1\cdot v)n_1$$
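For example (a quick numerical check of the $k=3$ formula; the vectors here are my own and $n_1$ is assumed to be a unit normal):

```python
import numpy as np

# Check x = v - (n1 . v) n1 for k = 3.
v = np.array([1.0, 2.0, 3.0])
n1 = np.array([0.0, 0.0, 1.0])   # unit normal of the xy-plane

x = v - np.dot(n1, v) * n1
print(x)              # [1. 2. 0.]
print(np.dot(n1, x))  # 0.0 -- x lies in the plane
```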

Best Answer

Rather than just throwing out an answer and saying "this is it", I'll start from what a projection actually is: an (orthogonal) projection of a vector $v$ onto a set $S$ is the point $p$ in $S$ that minimizes the distance from $p$ to $v$. To solve your problem we need a known point in the plane, and we will assume that the plane contains the origin $0\in\mathbb R^k$. The normals to the plane are orthogonal to every point in the plane, so if $x$ is in the plane, then $(n_i,x)=0$ for all $1\leq i\leq k-2$, where $(\cdot,\cdot)$ denotes the standard inner product. We then seek to minimize the function

$$f(x)=||x-v||_2^2=(x,x)-2(x,v)+(v,v)$$

subject to the constraints

$$g_i(x)=(n_i,x)=0$$
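Before doing any calculus, this formulation can be sanity-checked with a generic constrained optimizer. A minimal sketch using scipy's SLSQP solver, with hypothetical example data of my own:

```python
import numpy as np
from scipy.optimize import minimize

# Plane through the origin in R^4 with normals e3 and e4 (example data).
v = np.array([1.0, 2.0, 3.0, 4.0])
N = np.array([[0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0]])

res = minimize(lambda x: np.dot(x - v, x - v),           # f(x) = ||x - v||^2
               x0=np.zeros_like(v),
               constraints=[{"type": "eq", "fun": lambda x: N @ x}],
               method="SLSQP")
print(res.x)   # approximately [1, 2, 0, 0]
```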

This is a classic Lagrange multipliers problem. We then seek the critical points of the function

$$L(x,\lambda)=(x,x)-2(x,v)+(v,v)-2\sum_{i=1}^{k-2}\lambda_i(n_i,x)$$

where the factor of $2$ on the multiplier term is chosen purely for later convenience.

The partial derivative with respect to $x_j$ is

$$\frac{\partial L}{\partial x_j}=2x_j-2v_j-2\sum_{i=1}^{k-2}\lambda_in_{ij}=0$$

We can create a system of equations that lets us solve for the $\lambda$ vector: dividing by $2$, multiplying by $n_{mj}$, and summing over all $j$, we get

$$(n_m,x)-(n_m,v)-\sum_{i=1}^{k-2}(n_i,n_m)\lambda_i=0$$

or as a linear system,

$$\begin{pmatrix} (n_1,n_1) & \cdots & (n_{k-2},n_1) \\ \vdots & \ddots & \vdots \\ (n_1,n_{k-2}) & \cdots & (n_{k-2},n_{k-2}) \end{pmatrix}\begin{pmatrix} \lambda_1 \\ \vdots \\ \lambda_{k-2}\end{pmatrix}=-\begin{pmatrix} (n_1,v) \\ \vdots \\ (n_{k-2},v)\end{pmatrix}$$

where we have used the orthogonality between $x$ and the normals, i.e. $(n_m,x)=0$. In general one would have to solve this Gram system directly; to obtain a closed form, assume that the given normals are orthonormal. The matrix is then simply the identity, and we have

$$\lambda_i=-(n_i,v)$$
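(As an aside, the general case is still perfectly computable: one just solves the Gram system above numerically. A minimal numpy sketch, with my own function name and no orthonormality assumption:)

```python
import numpy as np

def project_general(v, N):
    """Project v onto the plane {x : N @ x = 0}, where the rows of N
    are the normals n_i (not necessarily orthonormal).
    Solves the Gram system sum_i (n_i, n_m) lambda_i = -(n_m, v), then
    uses x_j = v_j + sum_i lambda_i n_ij from the derivative above."""
    G = N @ N.T                         # Gram matrix, entries (n_i, n_m)
    lam = np.linalg.solve(G, -(N @ v))  # the Lagrange multipliers
    return v + N.T @ lam
```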

Returning to the derivative of the Lagrangian and substituting these $\lambda_i$, we have

$$x_j=v_j-\sum_{i=1}^{k-2}(n_i,v)n_{ij}$$

and as vectors,

$$x=v-\sum_{i=1}^{k-2}(n_i,v)n_i$$

We have generalized the three-dimensional formula by simply extending the subtracted term, which I view as "removing" the normal components from the vector. Again, this required that the normal vectors be orthonormal.
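Concretely, here is a minimal numpy sketch of the final formula (the function name and example data are my own; the rows of `normals` are assumed orthonormal):

```python
import numpy as np

def project_onto_plane(v, normals):
    """Compute x = v - sum_i (n_i, v) n_i, assuming the rows of
    `normals` are orthonormal normals of a plane through the origin."""
    x = v.astype(float).copy()
    for n in normals:
        x -= np.dot(n, v) * n   # "remove" the component along each normal
    return x

# Example in R^4: plane spanned by e1 and e2, normals e3 and e4.
v = np.array([1.0, 2.0, 3.0, 4.0])
N = np.array([[0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0]])
x = project_onto_plane(v, N)
print(x)       # [1. 2. 0. 0.]
print(N @ x)   # [0. 0.] -- every constraint (n_i, x) = 0 holds
```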

Hope this helps!