[GIS] Reduce Tracking Data Set

geoprocessing, postgis, spatial-database, web-mapping

I'm building an application that uses data from GPS units on board vehicles and displays it on a map. A lat/long is recorded every 0.5 seconds and a session can last many hours, so this can add up to a lot of data. The vehicles travel in a straight line the majority of the time, but the data needs to be very accurate when a vehicle turns a corner, hence the high recording frequency. I'm storing the lat/longs as POINTs in a PostGIS/PostgreSQL database (this isn't set in stone).

I'm trying to find a way of reducing the dataset when displaying it on the map while still maintaining accuracy.

I have come up with a few options:

  1. Group the data into 2-second blocks and average the lat/longs in each group. This is really easy to do, but it keeps too many points when travelling in straight lines and not enough when turning corners.

  2. Group points together that are within a certain distance (a vehicle's length?) and time window (4 seconds?, to ensure a point won't be grouped with points recorded when the vehicle passes over the same spot half an hour later).

  3. Fit splines to the data (I don't know very much about this option).
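For what it's worth, option 1 is simple enough to sketch directly. This is a minimal Python illustration (the `(lon, lat, t)` tuple layout and the helper name are my own assumptions, not from any library):

```python
from statistics import mean

def average_blocks(points, block_seconds=2.0):
    """Bucket (lon, lat, t) fixes into fixed-length time blocks, average each.

    Simple, but blind to geometry: straights keep more points than needed,
    while a tight corner gets smeared into a single averaged position.
    """
    blocks = {}
    for lon, lat, t in points:
        blocks.setdefault(int(t // block_seconds), []).append((lon, lat))
    return [
        (mean(p[0] for p in pts), mean(p[1] for p in pts))
        for _, pts in sorted(blocks.items())
    ]
```

Running it on five fixes half a second apart collapses the first two seconds into one averaged point, which shows exactly the corner-smearing problem described above.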

Perhaps PostGIS already provides this functionality? I'm simply looking for someone to point me in the right direction.

Best Answer

In PostGIS, you could use the measure ("M") coordinate. Besides X and Y (and optionally Z), a point can carry an extra M value, where "M" stands for "measure" and can be any value you like, e.g. a timestamp in seconds. An XYM point is called 3DM; with Z as well you get a 4D XYZM point.

Creating a line from 3DM points should preserve the measure for every single point. (I haven't tried this myself; I'm basing this on the excellent "PostGIS in Action" book.) Then you could use the Douglas-Peucker algorithm (which PostGIS implements as ST_Simplify) to reduce the number of points/vertices of the line.
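Douglas-Peucker itself is short enough to sketch in Python. This version runs the distance test on X/Y only and carries any extra coordinates (such as M) through on the surviving points; it's a rough illustration of the algorithm, not the ST_Simplify implementation:

```python
def _perp_dist(pt, a, b):
    # Perpendicular distance from pt to the line through a and b (XY only).
    (px, py), (ax, ay), (bx, by) = pt[:2], a[:2], b[:2]
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    return abs(dy * px - dx * py + bx * ay - by * ax) / (dx * dx + dy * dy) ** 0.5

def douglas_peucker(points, tolerance):
    """Reduce a polyline; extra tuple fields (e.g. M) ride along untouched."""
    if len(points) < 3:
        return list(points)
    # Find the vertex farthest from the chord joining the endpoints.
    idx, dmax = 0, 0.0
    for i in range(1, len(points) - 1):
        d = _perp_dist(points[i], points[0], points[-1])
        if d > dmax:
            idx, dmax = i, d
    if dmax <= tolerance:
        # Everything between the endpoints is within tolerance: drop it all.
        return [points[0], points[-1]]
    # Otherwise keep the farthest vertex and recurse on both halves.
    left = douglas_peucker(points[: idx + 1], tolerance)
    right = douglas_peucker(points[idx:], tolerance)
    return left[:-1] + right
```

On a track that runs straight and then turns, the straight run collapses to its endpoints while the corner vertex survives, which is exactly the adaptive behaviour the question is after.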