[GIS] If the GPS navigation message takes 12.5 minutes to cycle, how can receivers update your position every second?

Tags: accuracy, gps

According to Wikipedia's article on GPS,

Each GPS satellite continuously broadcasts a navigation message… Each complete message takes 750 seconds (12 1/2 minutes) to complete.

My understanding of how GPS works is that the receiver receives the broadcast message and calculates location based on that. So how can a GPS system start delivering reasonably accurate locations within a few seconds?

Wikipedia mentions that the message is divided into frames and sub-frames:

The message structure has a basic format of a 1500-bit-long frame made up of five subframes, each subframe being 300 bits (6 seconds) long… [A] complete data message requires the transmission of 25 full frames…. [T]his gives 750 seconds to transmit an entire almanac message (GPS). Each 30-second frame begins precisely on the minute or half-minute as indicated by the atomic clock on each satellite.

But the iPhone's GPS, for example, delivers about one location point per second. Even if a single frame or subframe is enough, that's 6-30 seconds. How can the hardware update its position with reasonable (reported) accuracy every single second? Is the iPhone lying to me?

Best Answer

First of all, the GPS almanac consists of information about the GPS constellation: the satellites' health and their coarse orbits, which makes it easier (indeed possible) for your receiver to find them in the sky. (Most of the time you do not need to download it, because you already did so earlier, but that is a discussion for another Q/A.) The point is that the almanac is not relevant to the receiver's position-update rate; the receiver simply has to have it before it can start working.

(GPS.About.com, GPSWorld)

The GPS receiver calculates its position from its distances to the satellites; this technique is called trilateration. The important question is how it measures those distances. It does so by comparing the code it receives from each satellite with a replica of the same code that it generates itself. Each GPS satellite transmits a signal with a specific pattern; the receiver knows the patterns, so it tests the received code against its own replicas to identify which satellite the signal came from.
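As a rough illustration of the trilateration step (in 2-D, with made-up "satellite" positions and distances that are assumed to be already measured — real GPS works in 3-D with pseudoranges and a clock-bias term), a minimal sketch might look like:

```python
# Hypothetical 2-D trilateration sketch. Subtracting pairs of circle
# equations (x - xi)^2 + (y - yi)^2 = di^2 cancels the quadratic terms
# and leaves a 2x2 linear system in (x, y).
def trilaterate(p1, p2, p3, d1, d2, d3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Coefficients of the two linearized equations
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = d2**2 - d3**2 - x2**2 + x3**2 - y2**2 + y3**2
    # Solve by Cramer's rule
    det = a1 * b2 - a2 * b1
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

# A receiver at (3, 4) with "satellites" at (0,0), (10,0), (0,10):
print(trilaterate((0, 0), (10, 0), (0, 10), 5.0, 65 ** 0.5, 45 ** 0.5))
```

This is only the geometry; the answer's point is that the hard part is obtaining the distances in the first place.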

Each satellite transmits a pattern of 0s and 1s (for example: 00111010111010101001...). The civil signal has a code rate of 1.023 Mcps (mega-chips per second; a chip is a bit carrying no information, so roughly one chip every microsecond). The receiver generates its own replica pattern that runs in step with it.
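A quick back-of-envelope calculation based on these numbers (the 1.023 Mcps rate is from the text; the 1023-chip C/A code length and the speed of light are standard values I am adding) shows why one chip lasts roughly a microsecond, and how far the signal travels during one chip:

```python
# Chip timing for the civil C/A code (assumed values, see lead-in).
chip_rate_hz = 1.023e6                         # 1.023 Mcps from the text
chip_duration_us = 1e6 / chip_rate_hz          # duration of one chip, ~0.98 us
code_length_chips = 1023                       # C/A code length (standard value)
code_period_ms = code_length_chips / chip_rate_hz * 1e3  # code repeats every 1 ms
light_speed_m_s = 299_792_458
chip_length_m = light_speed_m_s / chip_rate_hz  # one chip spans roughly 293 m

print(chip_duration_us, code_period_ms, chip_length_m)
```

The ~293 m chip length is why code-phase measurement alone is coarse; receivers resolve the delay to a small fraction of a chip by correlation.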

We now assume the ideal situation in which your clock is synchronized with the satellite's clock. In that case, the satellite transmitted the first '0' at time t0, and the code generator inside your receiver also produced the first '0' at t0.

The code from the satellite arrives with a delay, td, so you receive the satellite's first '0' at time td. The receiver finds this delay by shifting its replica until the replica and the received signal 'start at the same time' (this is a very rough explanation; the correct term is that the receiver correlates the received signal with the replica, and the time delay is found where the correlation function reaches its maximum).
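A toy sketch of this correlation idea, with a hypothetical 64-chip ±1 pattern standing in for a real PRN code (real receivers correlate much longer, noisy signals, and search Doppler as well as delay):

```python
import random

def correlate_delay(received, replica):
    """Find the shift of `replica` that best matches `received` by
    maximizing the circular correlation (toy version of code tracking)."""
    n = len(replica)
    best_shift, best_score = 0, float("-inf")
    for shift in range(n):
        # Correlate the received signal against the replica delayed by `shift`
        score = sum(received[i] * replica[(i - shift) % n] for i in range(n))
        if score > best_score:
            best_shift, best_score = shift, score
    return best_shift

# Hypothetical 64-chip +/-1 pattern standing in for a PRN code
random.seed(1)
replica = [random.choice([-1, 1]) for _ in range(64)]
true_delay = 17
received = [replica[(i - true_delay) % 64] for i in range(64)]
print(correlate_delay(received, replica))  # estimated delay; should match true_delay
```

Multiplying the estimated delay by the speed of light gives the (pseudo)range to that satellite.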

I hope this picture from Kaplan's Understanding GPS helps demonstrate the principle: the signal generated by the satellite, the received signal, the replicated signal, the shifted signal, and below them the simplified shape of the autocorrelation function (Tc is the duration of one chip, in our case about 1 microsecond).

So, regarding the position update: once the satellite code is being tracked, estimating a position from the code delay every second is not a problem. The receiver only has to make some shifts to estimate the delay of the signal it continuously receives and replicates. Recently, we have been testing some of our applications with 10 Hz (10 updates per second) and 20 Hz (20 updates per second) receivers.
