Finding the closest two points on two lines in N-dimensions.

geometry, optimization, real-analysis

Let $a,b,c,d$ be vectors in $\mathbb R^n$, where $b$ and $d$ are linearly independent.

Further define $x(s) = a+sb$ and $y(t)=c+td$ ($s,t \in \mathbb R$)

The question now is: which $(s,t)$ minimizes $||x(s)-y(t)||_2^2$?

Because the vectors defining the direction are linearly independent, the lines are not parallel.

Now, if the lines intersect at a point, then that point is the minimum.

My first question is: how can one write that intersection point in $n$ dimensions? I have not found a good way to parametrize it.

If the lines are skew, that is, they neither intersect nor are parallel, then the closest points are joined by the connecting line segment that is orthogonal to both lines.

I am not sure, however, why that is the case, why the minimum is unique, and consequently how one finds the minimizing $(s,t)$.

Best Answer

Hint.

$$ \delta^2 = ||a+s b-c-t d||^2 = ||a-c||^2+2s(a-c)\cdot b-2t(a-c)\cdot d +s^2||b||^2+t^2||d||^2-2s t b\cdot d $$

then

$$ \partial_s \delta^2= 2(a-c)\cdot b+2s||b||^2-2t b\cdot d = 0\\ \partial_t \delta^2= -2(a-c)\cdot d+2t||d||^2-2sb\cdot d=0 $$
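Dividing by $2$, these stationarity conditions say exactly that $(x(s)-y(t))\cdot b = 0$ and $(x(s)-y(t))\cdot d = 0$, i.e. the connecting segment is orthogonal to both lines. In matrix form (an arrangement not spelled out in the original hint) the system reads

$$ \begin{pmatrix} ||b||^2 & -b\cdot d \\ -b\cdot d & ||d||^2 \end{pmatrix} \begin{pmatrix} s \\ t \end{pmatrix} = \begin{pmatrix} -(a-c)\cdot b \\ (a-c)\cdot d \end{pmatrix} $$

and its determinant $||b||^2||d||^2-(b\cdot d)^2$ is strictly positive by the Cauchy–Schwarz inequality, since $b$ and $d$ are linearly independent. Hence the system has a unique solution, and because this matrix is (half) the Hessian of $\delta^2$ and is positive definite, that solution is the unique global minimum.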

Now, solving this linear system for $(s,t)$ yields the points at minimum distance.
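As a numerical sketch of the hint, assuming NumPy: the function name `closest_points` and the example lines below are my own illustration, not part of the original answer.

```python
import numpy as np

def closest_points(a, b, c, d):
    """Return the closest points x(s) = a + s*b and y(t) = c + t*d.

    Solves the 2x2 linear system from the hint:
        s*||b||^2 - t*(b.d) = -(a-c).b
       -s*(b.d)  + t*||d||^2 =  (a-c).d
    which is uniquely solvable when b and d are linearly independent.
    """
    a, b, c, d = map(np.asarray, (a, b, c, d))
    ac = a - c
    M = np.array([[b @ b, -(b @ d)],
                  [-(b @ d), d @ d]])
    rhs = np.array([-(ac @ b), ac @ d])
    s, t = np.linalg.solve(M, rhs)
    return a + s * b, c + t * d

# Hypothetical example: skew lines in R^3.
x, y = closest_points([0., 0, 0], [1., 0, 0],   # x(s) = (s, 0, 0)
                      [0., 1, 1], [0., 1, 0])   # y(t) = (0, 1+t, 1)
# x - y is orthogonal to both direction vectors, as the hint predicts.
```

The same routine also covers the intersecting case: there the minimal distance is zero and $x(s)=y(t)$ is the intersection point, which answers the parametrization question above.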