[Physics] What’s the difference between proper acceleration and coordinate acceleration?

acceleration, gravity, inertial-frames, sensor

I'm not a physics student, but I need to understand these concepts in order to explain what exactly an accelerometer does and does not measure, and why. Please do not explain to me what an accelerometer does: that isn't the direct purpose of this question. I'm not looking for full mathematical derivations, but for intuition; at the same time, I'm not looking for oversimplifications.


If I understood correctly, proper acceleration is often expressed in multiples of $g = 9.81\ \mathrm{m/s^2}$. It's not the same thing as coordinate acceleration, which depends on the choice of coordinate system (according to this Wikipedia article).

  1. Why does proper acceleration not depend on a coordinate system, while coordinate acceleration does?

    I understood that proper acceleration is measured from an inertial frame of reference, i.e. a frame of reference which is not accelerating. Is this the reason why proper acceleration does not depend on a coordinate system, where coordinate system here actually means a frame of reference?

    If this is the case, then I suppose coordinate acceleration is the acceleration measured from any random coordinate system (frame of reference).

    1. Would coordinate acceleration and proper acceleration be the same if coordinate acceleration were measured in an inertial frame of reference?

Apparently, gravity does not cause proper acceleration since an accelerometer would measure $0g$ if in free-fall, i.e. where the only force acting upon the accelerometer would be gravity, in case we consider gravity a force.

I think there's a misunderstanding here between what an accelerometer does and what proper acceleration actually is. I believe that the accelerometer would detect $0g$ because it subtracts the acceleration due to gravity from its calculations…

Furthermore, an accelerometer would detect $1g$ (upwards) on the surface of the Earth, apparently, because there's the surface of the Earth pushing the accelerometer upwards.

  1. Why exactly does an accelerometer measure $1g$ on the surface of the Earth?

  2. Coordinate acceleration, I've seen, seems also to be defined as simply the rate of change of velocity. Why isn't proper acceleration also defined as a rate of change of velocity?

  3. What's the fundamental difference between coordinate acceleration and proper acceleration (maybe without referring to an accelerometer to avoid a circular definition, which would only cause confusion again)?

Best Answer

Why does proper acceleration not depend on a coordinate system, while coordinate acceleration does?

Essentially, this defines the difference. Coordinate acceleration depends on the coordinate system chosen, proper acceleration does not.

As an analogy, consider the time read by a 'wristwatch' worn by an astronaut in some thought experiment. All observers agree on the time read by that wristwatch at some event, which is to say that the wristwatch (proper) time is observer (coordinate) independent.

Similarly, all observers agree on the reading of an accelerometer worn by the astronaut at some event, which is to say that the accelerometer (proper) acceleration is observer (coordinate) independent.

However, and assuming that the astronaut has non-zero proper acceleration, two inertial observers that are in uniform relative motion will, in general, measure (using their own inertial coordinate systems) different accelerations for the astronaut.

The usual example is an astronaut with constant proper acceleration. Observers for which the astronaut has near light speed will measure vanishingly small (coordinate) acceleration while observers for which the astronaut has non-relativistic speed will measure nearly the same (coordinate) acceleration as the astronaut's accelerometer reads.
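This can be made concrete with the standard formulas for hyperbolic motion (constant proper acceleration $\alpha$ along a straight line, starting from rest):

$$v(t) = \frac{\alpha t}{\sqrt{1 + (\alpha t/c)^2}}, \qquad \frac{dv}{dt} = \frac{\alpha}{\left(1 + (\alpha t/c)^2\right)^{3/2}} = \frac{\alpha}{\gamma^3}.$$

As $t \to \infty$, the speed approaches $c$ while the coordinate acceleration $dv/dt$ goes to zero, even though the accelerometer reads a constant $\alpha$ the whole time.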

You specifically asked for an intuition, so here it is: this result essentially must be the case, since no material object can reach speed $c$. If the coordinate acceleration did not fall off as the astronaut's speed approached $c$, the astronaut's coordinate speed would eventually exceed light speed, which is impossible.
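For one-dimensional motion, the coordinate acceleration $a$ measured by an inertial observer relates to the constant proper acceleration $\alpha$ via $a = \alpha/\gamma^3$. A quick numeric sketch of that relation (the speeds chosen are just illustrative):

```python
import math

C = 299_792_458.0  # speed of light, m/s
G = 9.81           # proper acceleration of 1 g, m/s^2

def coordinate_acceleration(alpha, v):
    """Coordinate acceleration seen by an inertial observer for an object
    moving at speed v along its direction of motion, given proper
    acceleration alpha (1D hyperbolic motion: a = alpha / gamma**3)."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return alpha / gamma ** 3

# At non-relativistic speed the two accelerations nearly agree...
print(coordinate_acceleration(G, 1_000.0))    # very close to 9.81 m/s^2
# ...but near light speed the coordinate acceleration is tiny,
# even though the astronaut's accelerometer still reads 1 g.
print(coordinate_acceleration(G, 0.999 * C))  # well under 0.001 m/s^2
```

The $1/\gamma^3$ factor is exactly what keeps the coordinate speed from ever reaching $c$.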