You can't calculate the displacement like this. The application note clearly says "When implementing positioning in 3 axes, extra processing is required to null the earth's gravity effect.", and that's the understatement of the month!
Nulling gravity is an enormous problem, unless your accelerometer is perfectly perpendicular to Earth's gravity vector! The accelerations you want to measure are probably on the order of $1\,m/s^2$ or less, right? They are overlaid by $\vec{g}=9.81\,m/s^2\,\vec{z}$. Before you can even estimate the displacement-causing acceleration you are interested in, you have to determine the angles of your three accelerometer channels relative to $\vec{g}$, then subtract $\vec{g}$.

Thankfully that procedure will give you a direction towards the floor... however, you still don't know which way your $\vec{x}$ and $\vec{y}$ axes are pointing! Your accelerometer could turn around by 180 degrees while it is accelerating, and the real displacement could end up in the opposite direction of where you thought it was going. Now, a perfect acceleration sensor would not even be sensitive to this rotation; to detect it you need a different sensor, one that can detect rotations! Such a gyroscope may or may not be built into your device. Given the way most accelerometer chips are implemented, the gyroscope and the accelerometer will not even sample synchronously, which means you need a digital resampling filter in place to correlate the readings of both sensors into a reliable six-axis position/orientation vector. I don't think that the Android platform gives you enough information to do that, right now. iOS might... on newer phones and tablets which have both sensors.
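To make the gravity-subtraction step concrete, here is a minimal sketch (my own illustrative code, not any platform's API): estimate the gravity vector from a *static* reading, then subtract it from later readings. It silently assumes the device does not rotate between calibration and measurement, which is exactly the weak point described above.

```python
import numpy as np

G = 9.81  # m/s^2

def gravity_vector_from_static_reading(a_static):
    """Direction of gravity in sensor coordinates, scaled to g.
    Valid only if the device was truly at rest during the reading."""
    a = np.asarray(a_static, dtype=float)
    return G * a / np.linalg.norm(a)

def linear_acceleration(a_measured, g_sensor):
    """Subtract the previously estimated gravity vector."""
    return np.asarray(a_measured, dtype=float) - g_sensor

# Example: device tilted 30 degrees about its x-axis, at rest
theta = np.radians(30.0)
a_static = [0.0, G * np.sin(theta), G * np.cos(theta)]
g_sensor = gravity_vector_from_static_reading(a_static)

# Later reading: same tilt, plus 0.5 m/s^2 of real motion along sensor x
a_meas = [0.5, G * np.sin(theta), G * np.cos(theta)]
print(linear_acceleration(a_meas, g_sensor))  # approximately [0.5, 0, 0]
```

The moment the device rotates after calibration, `g_sensor` is stale and part of gravity leaks into the "motion" estimate, which is why this simple approach fails in practice.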
Having said that, if you want to simplify things to the level you are at right now, you can (in theory) mount your cell phone on a straight rail and carefully rotate it until the motion vector points exactly along one of the accelerometer axes, with that axis perpendicular to $\vec{g}$. Then you can apply a good numerical integration algorithm to what the sensor measures on that one axis, e.g. one of the higher-order integrators described at http://mathworld.wolfram.com/NumericalIntegration.html.
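A minimal sketch of that one-axis rail experiment, using the trapezoidal rule (a step up from naive Euler summation; all names and sample values are mine for illustration):

```python
# One-axis double integration: acceleration -> velocity -> displacement,
# using the cumulative trapezoidal rule on evenly spaced samples.

def cumtrapz(samples, dt):
    """Cumulative trapezoidal integral, starting at zero."""
    out = [0.0]
    for y0, y1 in zip(samples, samples[1:]):
        out.append(out[-1] + 0.5 * (y0 + y1) * dt)
    return out

dt = 0.01                 # assume 100 Hz sampling
a = [1.0] * 101           # constant 1 m/s^2 for one second
v = cumtrapz(a, dt)       # v(1 s) should be 1.0 m/s
x = cumtrapz(v, dt)       # x(1 s) should be 0.5 m
print(v[-1], x[-1])
```

Even with a perfect integrator, any sensor offset on that axis integrates into a quadratically growing position error, so this only works over short intervals.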
I suppose that was your original question, right? I apologize for ruining your day with the description of the real problem, and I promise not to bore you with other goodies like calibration of offsets, gain drift, crosstalk (misaligned axes) and non-linearities.
You need to integrate acceleration to get the velocity.
$$v(t) = v(0) + \int_{0}^{t} a(t')\, dt'$$
There are a number of ways of doing this numerically.
I assume that you get these readings regularly with a spacing of $\delta t$, for example $\delta t = 100 ms$ or something like that.
About the simplest way to do it is
$$v(t) = v(0) + \sum_i a(t_i)\, \delta t$$
where $v(t)$ is the velocity at time $t$.
but there are more sophisticated ways of doing it; I will not repeat them all here, but you might want to look at Simpson's rule.
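As a hedged sketch of what Simpson's rule looks like in code (composite form, my own variable names, applied to uniformly sampled acceleration over a window):

```python
def simpson(samples, dt):
    """Composite Simpson's rule; needs an odd number of evenly
    spaced samples (i.e. an even number of intervals)."""
    if len(samples) % 2 == 0:
        raise ValueError("Simpson's rule needs an odd sample count")
    s = samples[0] + samples[-1]
    s += 4 * sum(samples[1:-1:2])   # odd-index samples
    s += 2 * sum(samples[2:-1:2])   # even interior samples
    return s * dt / 3.0

dt = 0.01
a = [(i * dt) ** 2 for i in range(101)]   # a(t) = t^2 on [0, 1]
dv = simpson(a, dt)                        # exact integral is 1/3
print(dv)
```

Simpson's rule is exact for polynomials up to cubic, so the quadratic test signal above is integrated essentially exactly; real accelerometer noise, of course, limits how much the higher order actually buys you.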
The problem is complicated by velocity being three-dimensional, so you need to integrate in each of the three dimensions x, y and z separately.
It depends on how the phone gives you the information about the acceleration, but if you get $a_x$, $a_y$ and $a_z$ at regular intervals then you can do the following...
vx += ax * dt;
vy += ay * dt;
vz += az * dt;
if you get acceleration as a raw magnitude and angles then you will have to convert from, I guess, polar (spherical) coordinates to x, y, z components to be able to add them up.
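A quick sketch of that conversion, assuming the standard physics spherical convention ($\theta$ measured from the z-axis, $\phi$ in the x-y plane from the x-axis); check your API's actual convention before using it:

```python
import math

def spherical_to_cartesian(a, theta, phi):
    """Magnitude plus two angles -> x, y, z components."""
    ax = a * math.sin(theta) * math.cos(phi)
    ay = a * math.sin(theta) * math.sin(phi)
    az = a * math.cos(theta)
    return ax, ay, az

# theta = 90 degrees, phi = 0: the vector points along x
print(spherical_to_cartesian(1.0, math.pi / 2, 0.0))
```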
Total speed, $|v|$ is, of course, given by $|v|=\sqrt{v_x^2 + v_y^2 + v_z^2}$
I would, of course, try to start the integration while the phone is known to be at rest, so that $v(0)=0$.
CuriousOne raises a really interesting point about $g$ - the best way to test this is to code it and try it: shake the phone and see whether the computed velocity returns to zero when the phone is at rest again after shaking or moving it...
... can you post your results if you do this and try it out?
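You can even simulate that test before touching the phone. Here is a hedged toy simulation (invented numbers): a motion that really does end at rest, integrated once with ideal data and once with a small constant sensor offset added. The biased velocity does not return to zero; it drifts by offset times elapsed time, which is what you will see on real hardware.

```python
dt = 0.01
offset = 0.05                                 # m/s^2, a plausible bias
# true motion: accelerate for 1 s, decelerate for 1 s, then rest for 1 s
true_a = [1.0] * 100 + [-1.0] * 100 + [0.0] * 100

v_ideal, v_biased = 0.0, 0.0
for a in true_a:
    v_ideal += a * dt                         # simple Euler sum, as above
    v_biased += (a + offset) * dt

print(v_ideal)   # ~0: the true motion ends at rest
print(v_biased)  # ~0.15: offset * 3 s has accumulated
```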
Another issue is twisting the phone, and with it the accelerometer - this would require you to think about angular acceleration etc., but the basic principles outlined here would be the same once you account for the angles.
Best Answer
OK... once more: let's assume we are looking at a map coordinate system where the x-axis points East, the y-axis points North and the z-axis points towards the Zenith. In such a system there are three acceleration components $\{a_E, a_N, a_Z - g\}$. Since the surface of the Earth is not an inertial system, the gravity of Earth always acts on the Zenith component, and there is no way to get rid of that.
IF our accelerometer had its axes aligned perfectly with the North, East and Zenith directions, we would be done easily: subtract a constant $g$ from the Zenith component and voila... perfect acceleration data. THAT is exactly what a commercial inertial measurement unit does! It keeps its accelerometer inside a gimbaled mechanical unit that makes sure all three axes always point in the same directions. This is achieved with a gyroscopic stabilizer rotating at 10,000 rpm and very expensive precision electro-mechanics. Good for your Boeing 777, but not for your pocket!
So what's in your pockets? It's a mediocre accelerometer next to a mediocre rotation sensor next to an unreliable compass which will pick up your car keys more often than it will pick up Earth's actual magnetic field. That's what you have to work with.
The rotation of the accelerometer against the $\{a_E, a_N, a_Z - g\}$ system is initially unknown. Let's assume the device rests on the table. The only acceleration acting on it is $-g$ along the Zenith axis. Since the device is rotated through three Euler angles against our favorite map system, we will see three acceleration readings $a_x, a_y, a_z$, which should add up vectorially to the gravitational acceleration. So we can naively hope that $a_x^2 + a_y^2 + a_z^2 = g^2$. That, however, will not be the case, because, as we said, we have a crappy accelerometer. Each channel will have an offset, a gain error and, worst of all, at least a first-order crosstalk error, plus a noise term. So the measured accelerations will be
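That naive check is still worth coding, as long as you compare with a tolerance rather than for equality (sketch with invented readings and an invented tolerance):

```python
import numpy as np

G = 9.81

def resting_check(a_xyz, tol=0.3):
    """True if a static reading is gravity-sized to within tol (m/s^2).
    The tolerance absorbs offset, gain, crosstalk and noise errors."""
    return abs(np.linalg.norm(a_xyz) - G) < tol

print(resting_check([0.12, -0.30, 9.75]))   # plausible static reading
print(resting_check([0.12, -0.30, 7.10]))   # clearly not a resting reading
```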
$\vec{a}_{measured} = M_{gain/crosstalk} \vec{a}_{physical} + \vec{a}_{offset} + \vec{a}_{noise}$.
In the above equation $M_{gain/crosstalk}$ is almost an identity matrix, with the diagonal elements representing the channel gains and the off-diagonal elements representing the channel crosstalk. If left uncorrected, the channel offsets will lead to a slow drift of the integrated position. If left uncorrected, the channel gain errors will make us over- or underestimate the actual distance, and the crosstalk errors will point us in the wrong direction. Most importantly, the zx- and zy-crosstalk will mix part of the constant acceleration $g$ into the x and y directions, making the effective x and y offsets even larger.
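With made-up but plausible error numbers, you can see that leakage directly: a device lying flat (true acceleration $-g$ on z only) with 1% zx/zy crosstalk picks up roughly $0.1\,m/s^2$ of phantom acceleration on x and y.

```python
import numpy as np

G = 9.81
# Illustrative gain/crosstalk matrix: gains on the diagonal,
# crosstalk off the diagonal (all values invented)
M = np.array([[1.02,  0.001, 0.01],
              [0.003, 0.98,  0.01],
              [0.002, 0.004, 1.01]])
offset = np.array([0.05, -0.03, 0.08])

a_true = np.array([0.0, 0.0, -G])     # device at rest, flat on the table
a_meas = M @ a_true + offset          # noise term left out for clarity
print(a_meas)                         # x and y are no longer zero
```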
But wait! We still haven't taken care of the rotation of the device! This rotation is another matrix, $M_{rot}$, which rotates the physical accelerations in the North-East-Zenith system into the physical accelerations acting on our sensor. So in reality the mapping goes something like this:
$\vec{a}_{measured} = M_{gain/crosstalk} M_{rot} \vec{a}_{NEZ} + \vec{a}_{offset} + \vec{a}_{noise}$.
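The whole chain can be sketched numerically (all angles, matrices and offsets invented for illustration; the point is only the order of operations in the equation above):

```python
import numpy as np

def rot_z(angle):
    """Rotation about the Zenith axis (yaw), one of the three Euler factors."""
    s, c = np.sin(angle), np.cos(angle)
    return np.array([[c, -s, 0],
                     [s,  c, 0],
                     [0,  0, 1]])

G = 9.81
M_err = np.eye(3) + 0.01 * np.ones((3, 3))   # mild gain/crosstalk errors
offset = np.array([0.05, -0.03, 0.08])

a_nez = np.array([0.2, 0.0, -G])             # 0.2 m/s^2 push to the East
M_rot = rot_z(np.radians(30))                # device yawed by 30 degrees
a_meas = M_err @ (M_rot @ a_nez) + offset    # noise term omitted
print(a_meas)
```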
So how do we estimate $M_{rot}$? With the rotation sensor, of course. That sensor measures the angular velocities of the device's rotation, adds its own offset, gain and crosstalk errors, and then gives you a signal that, after a single time integration, can be transformed into the $M_{rot}$ matrix, assuming we knew the original orientation of the device at some point in time.
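A minimal sketch of that integration (my own code): accumulate small rotations $R \leftarrow R \cdot \exp([\vec{\omega}]_\times\, \delta t)$ using Rodrigues' formula, starting from a known orientation, here the identity, which is exactly the initialization problem discussed next.

```python
import numpy as np

def rotation_from_axis_angle(axis, angle):
    """Rodrigues' rotation formula: axis-angle -> rotation matrix."""
    axis = np.asarray(axis, dtype=float)
    n = np.linalg.norm(axis)
    if n < 1e-12:
        return np.eye(3)
    k = axis / n
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

dt = 0.001
omega = np.array([0.0, 0.0, np.pi / 2])      # 90 deg/s about z, constant
R = np.eye(3)                                # assumed known start orientation
for _ in range(1000):                        # integrate for one second
    R = R @ rotation_from_axis_angle(omega, np.linalg.norm(omega) * dt)

print(R @ np.array([1.0, 0.0, 0.0]))         # x-axis rotated onto ~y
```

With real gyro data, the offset and gain errors mentioned above make $R$ drift away from the true orientation, so this integration alone is never enough.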
Do I have to say it? The million-dollar 777 inertial guidance system uses GPS and a ground calibration routine to align itself properly before the plane takes off. And that's what you have to do, too, to get started turning rotation-sensor data into $M_{rot}$. So how does one calibrate the rotation sensor? By using $g_{Zenith}$ to determine the direction of the Zenith and the compass to determine North and East, of course! Oh... wait... we don't know where that vector is pointing unless we have a properly calibrated (and rotated!) accelerometer signal!
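Ignoring that chicken-and-egg problem for a moment, the initialization itself is a simple triad construction (sketch with invented readings; note that many accelerometers report the reaction to gravity, i.e. $+g$ along the up axis at rest, so check your platform's sign convention):

```python
import numpy as np

def initial_rotation(a_static, mag):
    """Build an initial M_rot from a static accelerometer reading
    (gives Zenith) and a compass reading (gives North, after
    projecting out the vertical component). Columns are the East,
    North and Zenith directions expressed in sensor coordinates."""
    up = np.asarray(a_static, dtype=float)    # assumes +g points up at rest
    up /= np.linalg.norm(up)
    east = np.cross(np.asarray(mag, dtype=float), up)
    east /= np.linalg.norm(east)
    north = np.cross(up, east)                # completes the orthonormal triad
    return np.column_stack([east, north, up])

a_static = [0.0, 0.0, 9.81]     # device flat on the table
mag = [0.0, 0.5, -0.8]          # field towards North, dipping downward
R0 = initial_rotation(a_static, mag)
print(R0)                       # here: the identity, sensor axes = NEZ axes
```

In practice the compass reading is the least trustworthy input, so real implementations average many samples and cross-check against the gyroscope before trusting this matrix.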
And there begins your adventure of combining the signals from three unreliable sensors into one hopefully reliable signal. Part of the problem is to quantify all the error terms and to reconstruct and invert the coordinate transformation matrix.
I don't know how much of this Apple has done already. That should be documented somewhere... in the API or maybe just in an internal document that they won't show to you unless you are under an NDA with the Apple legal department.