[Physics] Why is the reciprocal of the Hubble constant equal to the age of the universe?

cosmology, space-expansion, time

I understand that the Hubble constant is the gradient of the line of best fit when we plot Redshift against distance. I understand why the reciprocal of the gradient would give a value for time. But why do we know (or assume) that this value of time is equal to the age of the universe? How do we know it isn't equal to something else, or it isn't just an arbitrary value?

Best Answer

Hubble's law shows that the redshift velocity of an object is proportional to the distance to the object:

$$v=H \cdot D $$

Here the redshift velocity is the velocity that would produce the observed redshift; at low velocities, the redshift is proportional to this velocity.
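As a quick worked example (using a round value of $H \approx 70\ \mathrm{km\,s^{-1}\,Mpc^{-1}}$, which is close to measured values), a galaxy at a distance of $D = 100\ \mathrm{Mpc}$ has a redshift velocity of

$$v = H \cdot D \approx 70\ \mathrm{km\,s^{-1}\,Mpc^{-1}} \times 100\ \mathrm{Mpc} = 7000\ \mathrm{km/s}.$$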

If you assume that the universe expands linearly, i.e. that the distance between comoving objects grows linearly in time, then the apparent velocity at which a given object recedes due to the expansion of space remains constant over time. Assuming the object has receded at this same velocity ever since the Big Bang, you can calculate how long ago the distance to it was zero. This time is given by:

$$t=\frac{D}{v}=\frac{1}{H}$$
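A minimal sketch of this estimate in Python, assuming a round value of $H \approx 70\ \mathrm{km\,s^{-1}\,Mpc^{-1}}$ (the constants and the value of $H$ here are illustrative, not part of the answer above):

```python
# Convert the Hubble constant into a Hubble time 1/H, expressed in years.
# Assumes a round value H0 = 70 km/s/Mpc; measured values are close to this.

KM_PER_MPC = 3.0857e19       # kilometres in one megaparsec
SECONDS_PER_YEAR = 3.156e7   # seconds in one Julian year (approx.)

H0 = 70.0                    # km/s per Mpc (assumed round number)

# H in units of 1/s: (km/s per Mpc) divided by (km per Mpc) gives 1/s
H0_per_second = H0 / KM_PER_MPC

hubble_time_seconds = 1.0 / H0_per_second
hubble_time_gyr = hubble_time_seconds / SECONDS_PER_YEAR / 1e9

print(f"Hubble time 1/H ~ {hubble_time_gyr:.1f} Gyr")
# Prints roughly 14.0 Gyr
```

The result, about 14 Gyr, is close to the currently accepted age of the universe of roughly 13.8 Gyr, even though the real expansion has not been exactly linear.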

Note that a consequence of this assumption is that Hubble's constant is actually not a constant, but changes over time: $H=\frac{1}{t}$.
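To spell that step out: under the linear-expansion assumption the distance to a comoving object grows as $D(t) = v\,t$ with $v$ constant, so

$$H(t) = \frac{v}{D(t)} = \frac{v}{v\,t} = \frac{1}{t}.$$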

Note also that it is far from trivial that this assumption is legitimate: whether the expansion rate is constant depends on the amount of matter and energy in the universe.
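As a standard illustration (a textbook result, not taken from the answer above): in a flat, matter-dominated universe the expansion decelerates, the scale factor grows as $a \propto t^{2/3}$, and the age works out to

$$t = \frac{2}{3H},$$

noticeably shorter than the naive $1/H$ estimate.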

For more information, see Hubble's law.
