Localization based on Distance Measurements via Least Squares

estimation, least squares, linear algebra, systems of equations

Estimating a vector $x \in \mathbb{R}^2$ knowing its distance to four beacons $v_1, \dots, v_4 \in \mathbb{R}^2$ via least squares means finding the least-squares solution to $A x = y$, where $y \in \mathbb{R}^4$ is the vector of distances between $x$ and $v_i$, and $A \in \mathbb{R}^{4 \times 2}$ is the matrix with row vectors $a_i^T = \frac{v_i^T}{\|v_i\|}$.

I'm having trouble understanding intuitively where the formulation $Ax = y$ (that is, the forms of $A$ and $y$ described above, and their derivation as this problem's formulation) comes from. What projections are happening under the hood that justify this formulation? What's happening geometrically?

Or: why will the $x^*$ that solves this least-squares problem really be the best estimate of the position of the real $x$ whose distance to $v_1, \dots, v_4$ was measured?

Best Answer

I will describe the classic way to derive least-squares localization from distance measurements.
We'll work in a 2-dimensional world for simplicity.

Model

Assume we have an unknown location $ \boldsymbol{x}_{m} \in \mathbb{R}^{2} $ (namely $ \boldsymbol{x}_{m} = {\left[ {x}_{m}, {y}_{m} \right]}^{T} $).
We are given $ 3 $ other known locations $ \boldsymbol{x}_{i} \in \mathbb{R}^{2}, \; i = 1, 2, 3 $.
Assume we know the distance from $ \boldsymbol{x}_{m} $ to each of those locations: $ {r}_{i} = \left\| \boldsymbol{x}_{i} - \boldsymbol{x}_{m} \right\| $.

Without loss of generality, one could assume $ \boldsymbol{x}_{1} = {\left[ 0, 0 \right]}^{T} $.

Non Linear Least Squares Solution

So we can form $ 3 $ equations:

$$\begin{align} {r}_{1}^{2} & = {x}_{m}^{2} + {y}_{m}^{2} && \text{As $ \boldsymbol{x}_{1} $ is at the origin} \label{a}\tag{1} \\ {r}_{2}^{2} & = {\left( {x}_{2} - {x}_{m} \right)}^{2} + {\left( {y}_{2} - {y}_{m} \right)}^{2} && \text{} \label{b}\tag{2} \\ {r}_{3}^{2} & = {\left( {x}_{3} - {x}_{m} \right)}^{2} + {\left( {y}_{3} - {y}_{m} \right)}^{2} && \text{} \label{c}\tag{3} \end{align}$$

One way to solve this is to form a non-linear least squares cost function and minimize it:

$$ J \left( \boldsymbol{x}_{m} \right) = \sum_{i = 1}^{3} {\left( {r}_{i}^{2} - {\left\| \boldsymbol{x}_{m} - \boldsymbol{x}_{i} \right\|}_{2}^{2} \right)}^{2} $$
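As a sanity check, this cost can be evaluated numerically. A minimal sketch, where the beacon positions and true location are made-up values for illustration:

```python
import numpy as np

# Known beacon locations (hypothetical values), with x_1 at the origin
beacons = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]])

# True (unknown) location and the resulting squared distances r_i^2
x_true = np.array([1.0, 2.0])
r_sq = np.sum((beacons - x_true) ** 2, axis=1)

def J(x):
    """Non-linear least squares cost: sum of squared residuals in r_i^2."""
    return np.sum((r_sq - np.sum((beacons - x) ** 2, axis=1)) ** 2)

print(J(x_true))        # 0.0 at the true location
print(J(x_true + 0.5))  # strictly positive away from it
```

With noiseless distances the cost vanishes exactly at $ \boldsymbol{x}_{m} $; with noisy measurements one would minimize $ J $ numerically (e.g. with a Gauss-Newton iteration).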

Linear Least Squares Solution

The trick to convert the problem into linear form is to subtract the equations from one another, forming a linear system of equations:

$$\begin{align} {r}_{2}^{2} - {r}_{1}^{2} & = {x}_{2}^{2} - 2 {x}_{2} {x}_{m} + {y}_{2}^{2} - 2 {y}_{2} {y}_{m} && \text{Subtracting $ \ref{a} $ from $ \ref{b} $} \label{d}\tag{4} \\ {r}_{3}^{2} - {r}_{1}^{2} & = {x}_{3}^{2} - 2 {x}_{3} {x}_{m} + {y}_{3}^{2} - 2 {y}_{3} {y}_{m} && \text{Subtracting $ \ref{a} $ from $ \ref{c} $} \label{e}\tag{5} \\ \end{align}$$

Now, this can be arranged into the form of $ A \boldsymbol{x}_{m} = \boldsymbol{b} $ where:

$$ A = \begin{bmatrix} {x}_{2} & {y}_{2} \\ {x}_{3} & {y}_{3} \end{bmatrix}, \; \boldsymbol{b} = \frac{1}{2} \begin{bmatrix} {x}_{2}^{2} + {y}_{2}^{2} - {r}_{2}^{2} + {r}_{1}^{2} \\ {x}_{3}^{2} + {y}_{3}^{2} - {r}_{3}^{2} + {r}_{1}^{2} \end{bmatrix} $$

Of course, this can be generalized to $ n $ measurements and known locations:

$$ A = \begin{bmatrix} {x}_{2} & {y}_{2} \\ {x}_{3} & {y}_{3} \\ \vdots & \vdots \\ {x}_{n} & {y}_{n} \end{bmatrix}, \; \boldsymbol{b} = \frac{1}{2} \begin{bmatrix} {x}_{2}^{2} + {y}_{2}^{2} - {r}_{2}^{2} + {r}_{1}^{2} \\ {x}_{3}^{2} + {y}_{3}^{2} - {r}_{3}^{2} + {r}_{1}^{2} \\ \vdots \\ {x}_{n}^{2} + {y}_{n}^{2} - {r}_{n}^{2} + {r}_{1}^{2} \end{bmatrix} $$

The solution is given by:

$$ \hat{\boldsymbol{x}}_{m} = {\left( {A}^{T} A \right)}^{-1} {A}^{T} \boldsymbol{b} $$
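A minimal numerical sketch of the whole linear pipeline, using hypothetical beacon positions (with $ \boldsymbol{x}_{1} $ at the origin, as in the derivation):

```python
import numpy as np

# Hypothetical beacons; the first one is placed at the origin
beacons = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0], [5.0, 6.0]])
x_true = np.array([1.0, 2.0])
r = np.linalg.norm(beacons - x_true, axis=1)  # measured distances r_i

# Build A and b from beacons 2..n (index 0 is the origin beacon):
#   A rows are [x_i, y_i], b_i = (x_i^2 + y_i^2 - r_i^2 + r_1^2) / 2
A = beacons[1:]
b = 0.5 * (np.sum(beacons[1:] ** 2, axis=1) - r[1:] ** 2 + r[0] ** 2)

# Least-squares solution x_hat = (A^T A)^{-1} A^T b
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x_hat)  # recovers [1, 2] with noiseless distances
```

With exact distances the system is consistent and the solution is recovered exactly; with noisy $ {r}_{i} $ the same expression gives the least-squares estimate.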

Remark
For the tagging and labeling of the equations, have a look at MathJax: How to Number and Reference Equations?
