[Physics] How often does a GPS receiver receive microwave signals from satellites

gps

I'm just finishing up a report on the mathematics of GPS, but I'm not sure how to determine how often a GPS receiver actually receives data from the satellites.

If a satellite sends out its microwave signals at a frequency of 1575.42 MHz, does that mean it sends out a signal every $1/(1575.42\times10^6)\approx6.35\times10^{-10}$ seconds? Or am I just working out the period of the wave here? If so, how would I determine how often a signal is received?
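(As a quick numerical check of that arithmetic, a minimal sketch in Python; the frequency is the public L1 carrier frequency, everything else is just illustration:)

```python
# Sanity check: this is the period of one L1 carrier cycle, not a data rate.
f_L1 = 1575.42e6          # L1 carrier frequency in Hz
T = 1.0 / f_L1            # period of one carrier oscillation in seconds
print(f"Carrier period: {T:.3e} s")   # ~6.35e-10 s, i.e. about 0.635 ns
```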

Best Answer

The satellites broadcast continuously, so there is no "signal reception interval" per se. As others have written, the 1.57542 GHz oscillation is just a carrier wave for the modulations layered on top of it.

To answer the question directly: a receiver can choose to sample the satellite signal for only 8 milliseconds, for example, or for longer, say 500 ms, or it can receive it continuously (this is usually called "tracking mode" and happens after the signal has first been acquired). The longer you listen, the better your chance of "seeing" the satellites and getting a good fix.
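To make the "listen longer" point concrete, here is a minimal sketch in Python. The 8 ms and 500 ms values are just the example durations above, and the 10·log10 figure is the standard rule of thumb for coherent integration gain, not something specific to any particular receiver:

```python
import math

f_L1 = 1575.42e6        # L1 carrier frequency in Hz
short_dwell = 8e-3      # 8 ms sample window from the example above
long_dwell = 500e-3     # 500 ms sample window

for dwell in (short_dwell, long_dwell):
    cycles = f_L1 * dwell   # carrier cycles captured in the window
    print(f"{dwell*1e3:6.1f} ms window: {cycles:.3e} carrier cycles")

# Rule-of-thumb processing gain from listening longer (coherent integration):
gain_db = 10 * math.log10(long_dwell / short_dwell)
print(f"Longer dwell gives roughly {gain_db:.1f} dB more gain")
```

In practice a receiver cannot coherently integrate straight across navigation data bit edges (which occur every 20 ms) without extra tricks, so long dwells are usually split into shorter coherent blocks that are then combined non-coherently, but the basic idea stands: more listening time, better detection.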

The more complicated answer is that GPS is not a typical communications medium where you send abstract pieces of information from A to B, as in a normal radio transmission. In GPS there are layers of bit encodings, sub-bit/phase decodings of the modulation layers themselves, Doppler-shift analysis of the modulations because the satellites move, and so on. That is a bit too complicated to describe fully in a reply here, and I don't think it's what you're looking for either (but if you want, just ask in a comment and I'll explain).
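If it helps for the report, the nominal timescales of those layers in the L1 C/A signal stack up roughly like this (a small summary in Python; the numbers are the published signal-structure values, the script just prints them):

```python
# Nominal timescales in the GPS L1 C/A signal, fastest to slowest.
layers = [
    ("L1 carrier cycle",             1 / 1575.42e6),  # ~0.635 ns
    ("C/A code chip",                1 / 1.023e6),    # ~0.977 microseconds
    ("C/A code epoch (1023 chips)",  1e-3),           # the code repeats every 1 ms
    ("navigation data bit (50 bps)", 20e-3),          # 20 code repetitions per bit
    ("subframe (300 bits)",          6.0),            # clock/ephemeris data chunk
    ("full frame (5 subframes)",     30.0),           # complete navigation frame
]

for name, seconds in layers:
    print(f"{name:32s} {seconds:.3e} s")
```

So while the carrier oscillates every ~0.6 ns, actual data bits only arrive every 20 ms, and a complete set of ephemeris data takes on the order of 30 s to come down.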
