[Physics] How to calculate speed and distance from gyro, orientation and acceleration data

acceleration · gyroscopes · velocity

In the iOS 8 SDK, there's an API that gives you the distance covered by the user over a period of time.
I wonder how Apple is calculating that distance, as it seems to work without GPS help. And since I have the time range, I can calculate the speed as well.

Is there a way to calculate speed and distance using other CoreMotion APIs? The most likely candidate is CMDeviceMotion.

It has rotationRate – a measurement of gyroscope data whose bias has been removed – given as (x, y, z) in radians per second.

It has attitude, which is the device orientation (x, y, z) in radians relative to a reference frame. That frame is tricky to get, but I'm thinking it's not really needed, since there's also the multiplyByInverseOfAttitude API, which gives you just the difference from a previously measured attitude. So I pick measurement 0 as my reference point and work from 1 onward.

It has userAcceleration, which is expressed as a fraction of gravity (in units of g).

And of course, there's timestamp for each measurement.

Is there a formula that uses this data to calculate the speed and distance between any two measurement points in a series of measurements taken during a running workout?
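For concreteness, here's the naive computation I have in mind, sketched in Swift. The g scale factor and the direction in which attitude.rotationMatrix maps vectors are my assumptions to verify, and I realize the double integration will drift:

```swift
import CoreMotion

/// Naive dead reckoning from CMDeviceMotion samples: rotate
/// userAcceleration into the reference frame, integrate once for
/// velocity and once more for distance along the path.
final class DeadReckoner {
    private let g = 9.81                      // m/s^2 per unit of userAcceleration
    private var velocity = (x: 0.0, y: 0.0, z: 0.0)
    private(set) var distance = 0.0           // meters along the path
    private var lastTimestamp: TimeInterval?

    func ingest(_ motion: CMDeviceMotion) {
        defer { lastTimestamp = motion.timestamp }
        guard let last = lastTimestamp else { return }
        let dt = motion.timestamp - last

        // userAcceleration is in the device frame, in units of g.
        // Whether rotationMatrix or its transpose maps device -> reference
        // depends on the frame convention, so verify with a known pose.
        let r = motion.attitude.rotationMatrix
        let a = motion.userAcceleration
        let ax = (r.m11 * a.x + r.m12 * a.y + r.m13 * a.z) * g
        let ay = (r.m21 * a.x + r.m22 * a.y + r.m23 * a.z) * g
        let az = (r.m31 * a.x + r.m32 * a.y + r.m33 * a.z) * g

        // First integration: acceleration -> velocity.
        velocity.x += ax * dt
        velocity.y += ay * dt
        velocity.z += az * dt

        // Second integration: speed -> distance.
        let speed = (velocity.x * velocity.x + velocity.y * velocity.y
                     + velocity.z * velocity.z).squareRoot()
        distance += speed * dt
    }
}
```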

Best Answer

OK... once more: let's assume we are looking at a map-coordinate system where the x-axis points East, the y-axis points North and the z-axis points towards the zenith. In such a system there are three acceleration components $\{a_E, a_N, a_Z - g\}$. Since the surface of the Earth is not an inertial system, the gravity of Earth always acts on the zenith component and there is no way to get rid of that.

IF our accelerometer had its axes aligned perfectly with the North, East and Zenith directions, we would be done easily: subtract a constant $g$ from the zenith component and voilà... perfect acceleration data. THAT is exactly what a commercial inertial measurement unit does! It keeps its accelerometer inside a gimbaled mechanical unit that makes sure that all three axes are always pointing in the same directions! This is achieved with a gyroscopic stabilizer rotating at 10,000 rpm and very expensive precision electro-mechanics. Good for your Boeing 777, but not for your pocket!
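In that idealized aligned case the whole correction, written in the sign convention of the $\{a_E, a_N, a_Z - g\}$ components above, is one constant addition:

$$\vec{a}_{physical} = \vec{a}_{measured} + g\,\hat{e}_Z$$

Add the constant $g$ back onto the zenith channel, and what remains is pure motion acceleration.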

So what's in your pockets? It's a mediocre accelerometer next to a mediocre rotation sensor next to an unreliable compass which will pick up your car keys more often than it will pick up Earth's actual magnetic field. That's what you have to work with.

The rotation of the accelerometer against the $\{a_E, a_N, a_Z - g\}$ system is initially unknown. Let's assume the device rests on the table. The only acceleration acting on it is $-g$ along the zenith axis. Since the device is rotated through three Euler angles against our favorite map system, we will be seeing three acceleration readings $a_x, a_y, a_z$, which should add up vectorially to the gravitational acceleration. So we can naively hope that $a_x^2 + a_y^2 + a_z^2 = g^2$. That, however, will not be the case because, as we said, we have a crappy accelerometer. Each channel will have an offset, a gain error and, worst of all, at least a first-order crosstalk error plus a noise term. So the measured accelerations will be

$\vec{a}_{measured} = M_{gain/crosstalk} \vec{a}_{physical} + \vec{a}_{offset} + \vec{a}_{noise}$.

In the above equation $M_{gain/crosstalk}$ is almost a unity matrix, with the diagonal elements representing the channel gains and the off-diagonal elements representing the channel crosstalk. If left uncorrected, the channel offsets will lead to a slow drift of an integrated position. If left uncorrected, the channel gain errors will make us over- or underestimate the actual distance, and the crosstalk errors will point us in the wrong direction. Most importantly, the zx- and zy-crosstalk will mix part of the constant acceleration $g$ into the x and y directions, making the x and y offsets even larger.
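To make that error model concrete, here is a small simulation sketch (the gain, crosstalk and offset numbers are made-up illustrative magnitudes, not measured values for any real sensor):

```swift
import Foundation

// Sketch of the measurement model
//   a_measured = M_gain/crosstalk * a_physical + a_offset + a_noise
// with illustrative (made-up) error magnitudes.
typealias Vec3 = (x: Double, y: Double, z: Double)

// Almost-unity matrix: diagonal = channel gains, off-diagonal = crosstalk.
let m: [[Double]] = [
    [1.020, 0.004, 0.003],   // x row: 2% gain error, small crosstalk
    [0.002, 0.980, 0.005],   // y row
    [0.006, 0.001, 1.010],   // z row
]
let offset: Vec3 = (0.05, -0.03, 0.08)   // m/s^2, per-channel bias

func measured(_ a: Vec3) -> Vec3 {
    func noise() -> Double { Double.random(in: -0.02...0.02) } // crude noise term
    return (
        x: m[0][0] * a.x + m[0][1] * a.y + m[0][2] * a.z + offset.x + noise(),
        y: m[1][0] * a.x + m[1][1] * a.y + m[1][2] * a.z + offset.y + noise(),
        z: m[2][0] * a.x + m[2][1] * a.y + m[2][2] * a.z + offset.z + noise()
    )
}

// A device at rest only feels gravity, yet the zx/zy crosstalk leaks
// part of g into the x and y channels -- exactly the effect described above.
let atRest: Vec3 = (0, 0, -9.81)
print(measured(atRest))   // x and y are no longer zero
```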

But wait! We still haven't taken care of the rotation of the device! This rotation is another matrix $M_{rot}$, which rotates the physical accelerations in the North-East-Zenith system into the physical accelerations acting on our sensor. So in reality the mapping goes something like this:

$\vec{a}_{measured} = M_{gain/crosstalk} M_{rot} \vec{a}_{NEZ} + \vec{a}_{offset} + \vec{a}_{noise}$.
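Inverting that mapping shows what a correction pipeline has to reconstruct (the noise term, of course, cannot be inverted away):

$$\vec{a}_{NEZ} \approx M_{rot}^{-1}\, M_{gain/crosstalk}^{-1} \left(\vec{a}_{measured} - \vec{a}_{offset}\right)$$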

So how do we estimate $M_{rot}$? With the rotation sensor, of course. That sensor measures the angular velocities of the device's rotation, then it adds its own offset, gain and crosstalk errors, and then it gives you a signal that, after proper integration over time, can be transformed into the $M_{rot}$ matrix, assuming we knew the original orientation of the device at some point in time.
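In code, that integration is a running update of the orientation by a small rotation every timestep. A minimal sketch using quaternions (a first-order small-angle update, whose own discretization error piles on top of the sensor errors):

```swift
import simd

// Integrate gyroscope rates (rad/s, device frame) into an orientation,
// starting from a known initial attitude. Each step rotates by angle
// |w|*dt about the axis w/|w|.
struct GyroIntegrator {
    var attitude = simd_quatd(angle: 0, axis: SIMD3(0, 0, 1)) // assumed known start

    mutating func step(rate w: SIMD3<Double>, dt: Double) {
        let angle = simd_length(w) * dt
        guard angle > 0 else { return }
        attitude = simd_normalize(attitude * simd_quatd(angle: angle,
                                                        axis: simd_normalize(w)))
    }

    // The M_rot of the text (or its transpose, depending on which way
    // you define the rotation between the NEZ and device frames).
    var rotationMatrix: simd_double3x3 { simd_double3x3(attitude) }
}
```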

Do I have to say it? The million-dollar 777 inertial guidance system uses GPS and a ground calibration routine to align itself properly before the plane takes off. And that's what you have to do, too, to get started turning rotation sensor data into $M_{rot}$. So how does one calibrate the rotation sensor? Using $g_{Zenith}$ to determine the direction of the zenith and the compass to determine North and East, of course! Oh... wait... we don't know where that vector is pointing unless we have a properly calibrated (and rotated!) accelerometer signal!

And there begins your adventure of combining the signals from three unreliable sensors into one hopefully reliable signal. Part of the problem is to quantify all the error terms and to reconstruct and invert the coordinate transformation matrix.
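A classic first step in that adventure (my example, not anything the CoreMotion API exposes as such) is a complementary filter: trust the gyro at short timescales and let the accelerometer's long-term average, which should point along gravity, slowly pull the attitude estimate back. A sketch:

```swift
import simd

// Complementary filter for tilt: propagate with the gyro, then nudge
// the estimate so the rotated accelerometer vector lines up with the
// zenith. Heading still needs the compass; signs follow the answer's
// convention of gravity reading -g on the zenith channel -- verify
// against your sensor's actual convention.
func complementaryTiltUpdate(
    attitude: simd_quatd,       // current estimate, device -> reference
    gyro w: SIMD3<Double>,      // rad/s, device frame
    accel a: SIMD3<Double>,     // m/s^2, device frame
    dt: Double,
    alpha: Double = 0.02        // small = trust the gyro more
) -> simd_quatd {
    // 1. Propagate with the gyro, as in the integrator sketch above.
    var q = attitude
    let angle = simd_length(w) * dt
    if angle > 0 {
        q = simd_normalize(q * simd_quatd(angle: angle, axis: simd_normalize(w)))
    }
    // 2. Rotate the measured acceleration into the reference frame and
    //    apply a fraction alpha of the correction that would align it
    //    with the zenith direction (0, 0, -1).
    let up = simd_normalize(q.act(simd_normalize(a)))
    let correction = simd_quatd(from: up, to: SIMD3(0, 0, -1))
    let identity = simd_quatd(angle: 0, axis: SIMD3(0, 0, 1))
    return simd_normalize(simd_slerp(identity, correction, alpha) * q)
}
```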

I don't know how much of this Apple has already done. It should be documented somewhere... in the API, or maybe just in an internal document that they won't show you unless you are under an NDA with Apple's legal department.