[Math] How to find an orthogonal vector given two vectors

linear algebra

I'm trying to find a vector $\vec{c}$ that is orthogonal to both vectors $\vec{a}$ and $\vec{b}$.

As far as I understand, I have to show that:

$$\langle a,c\rangle=0 $$
$$\langle b,c\rangle=0 $$

So if I would like to determine a vector orthogonal to $\begin{bmatrix}-1\\1\end{bmatrix}$,
I just intuitively use $$\langle v,w\rangle=1 \cdot(-1)+1\cdot 1=0 $$ in order to arrive at $\begin{bmatrix}1\\1\end{bmatrix}$.
My problem is that I just don't know a mechanical way to solve for an orthogonal vector; it was more of an educated guess.

For example, given:
$\vec{a} = \begin{bmatrix}-1\\1\\1\end{bmatrix}$ and $\vec{b} = \begin{bmatrix}\sqrt{2}\\1\\-1\end{bmatrix}$, how do I find an orthogonal vector?

Thank you in advance.

Best Answer

Given $m$ vectors $v_1, v_2, \ldots, v_m$ in $\mathbb R^n$, a vector orthogonal to all of them is any vector $x$ that solves the matrix equation

$$\begin{pmatrix}v_1^T \\ v_2^T \\ \vdots \\ v_m^T\end{pmatrix} x = 0.$$
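To see this in the simplest case, take the single two-dimensional vector from the question, $v_1 = \begin{pmatrix}-1\\1\end{pmatrix}$; the matrix equation reads

$$\begin{pmatrix}-1 & 1\end{pmatrix}\begin{pmatrix}x_1\\x_2\end{pmatrix} = 0, \quad\text{i.e.}\quad -x_1 + x_2 = 0,$$

whose solutions $x_1 = x_2$ recover the guessed vector $\begin{bmatrix}1\\1\end{bmatrix}$ up to scaling.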

To put this a bit more concretely, suppose

$$v_1 = \begin{pmatrix}v_{11} \\ v_{12} \\ \vdots \\ v_{1n}\end{pmatrix},\quad v_2 = \begin{pmatrix}v_{21} \\ v_{22} \\ \vdots \\ v_{2n}\end{pmatrix},\ \ldots,\quad v_m = \begin{pmatrix}v_{m1} \\ v_{m2} \\ \vdots \\ v_{mn}\end{pmatrix},\ \mbox{and}\quad x = \begin{pmatrix}x_{1} \\ x_{2} \\ \vdots \\ x_{n}\end{pmatrix}$$

where the numbers $v_{ij} \in \mathbb R$ are all known and the numbers $x_i \in \mathbb R$ are all unknown. Then the matrix equation above can also be written

$$ \begin{pmatrix} v_{11} & v_{12} & \cdots & v_{1n} \\ v_{21} & v_{22} & \cdots & v_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ v_{m1} & v_{m2} & \cdots & v_{mn} \end{pmatrix} \begin{pmatrix}x_1 \\ x_2 \\ \vdots \\ x_n\end{pmatrix} = \begin{pmatrix}0 \\ 0 \\ \vdots \\ 0\end{pmatrix}.$$

This is equivalent to the system of linear equations $$ \begin{array}{ccccccccl} v_{11}x_1 &+& v_{12}x_2 &+& \cdots &+& v_{1n}x_n &=& 0, \\ v_{21}x_1 &+& v_{22}x_2 &+& \cdots &+& v_{2n}x_n &=& 0, \\ \vdots&&\vdots&&\ddots&&\vdots&&\vdots \\ v_{m1}x_1 &+& v_{m2}x_2 &+& \cdots &+& v_{mn}x_n &=& 0. \end{array}$$

That is, you need to solve a linear system of $m$ equations with $n$ unknowns. This is something you can do using row reduction.
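As a side note (not part of the hand computation that follows): if you want to check such a row reduction by machine, SymPy's nullspace(), which row-reduces the coefficient matrix, can be applied directly. A minimal sketch, using the two vectors from the question treated below:

```python
from sympy import Matrix, sqrt

# Rows of V are the given vectors v_1, ..., v_m; here the two vectors
# from the question, (-1, 1, 1) and (sqrt(2), 1, -1).
V = Matrix([[-1, 1, 1],
            [sqrt(2), 1, -1]])

# nullspace() row-reduces V and returns a basis of {x : V x = 0},
# i.e. of all vectors orthogonal to every row of V.
for x in V.nullspace():
    print(x.T)   # one basis vector here, since rank(V) = 2 and n = 3
```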

The solution is never unique: if $x$ is a solution, then so is $cx$ for any scalar constant $c$. If the given vectors span a subspace of dimension less than $n-1$, the solution is not even unique up to a scalar constant; there are vectors in genuinely different directions that are all orthogonal to the given vectors. Conversely, if $m \geq n$ there may be no nonzero solution at all, because the $m$ vectors may span $\mathbb R^n$. A nonzero solution exists exactly when the given vectors fail to span $\mathbb R^n$, that is, when fewer than $n$ of them are linearly independent; by rank-nullity, the space of solutions has dimension $n$ minus the number of linearly independent vectors among $v_1, \ldots, v_m$.

In your particular case (if you are not already aware that the cross product of two linearly independent vectors in $\mathbb R^3$ is orthogonal to both of them), you have

$$v_1 = \begin{pmatrix}v_{11}\\v_{12}\\v_{13}\end{pmatrix} = \begin{pmatrix}-1\\1\\1\end{pmatrix} \quad \mbox{and} \quad v_2 = \begin{pmatrix}v_{21}\\v_{22}\\v_{23}\end{pmatrix} = \begin{pmatrix}\sqrt{2}\\1\\-1\end{pmatrix},$$

so you could solve the system of equations

$$\begin{eqnarray} -1\cdot x_1 + 1\cdot x_2 + 1 \cdot x_3 &=& 0, \\ \sqrt{2}\cdot x_1 + 1\cdot x_2 - 1 \cdot x_3 &=& 0. \end{eqnarray}$$

Blindly applying the methods I was taught in high school, I find this is equivalent to

$$\begin{array}{ccccccl} x_1 &-& x_2 &-& x_3 &=& 0, \\ &&\left(1+\sqrt{2}\right)x_2 &+&\left(-1+\sqrt{2}\right) x_3 &=& 0. \end{array}$$

At this point we can make an arbitrary nonzero choice for $x_3$ and proceed to solve the equations as a system of two equations in the two unknowns $x_1$ and $x_2$.
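For instance, choosing $x_3 = 1+\sqrt{2}$ (any nonzero value will do), the second equation gives $x_2 = 1-\sqrt{2}$, and the first then gives $x_1 = x_2 + x_3 = 2$, so

$$x = \begin{pmatrix}2 \\ 1-\sqrt{2} \\ 1+\sqrt{2}\end{pmatrix}$$

is orthogonal to both given vectors, as you can confirm by taking the two inner products. (As a check, this is $-1$ times the cross product $v_1 \times v_2 = \begin{pmatrix}-2 \\ \sqrt{2}-1 \\ -1-\sqrt{2}\end{pmatrix}$ mentioned above.)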
