Space Expansion – Why is the Hubble Time Used to Approximate the Age of the Universe?

Tags: space-expansion, universe

As per Hubble's Law, the recessional velocity of a galaxy is proportional to its distance from us. In a universe that is expanding at a constant rate, this would imply that as the distance to a galaxy increases over time due to the expansion, its recessional velocity would also increase.

However, I’ve read that the Hubble time, the reciprocal of the Hubble constant, can be used to approximate the age of the universe in this scenario of constant expansion: it would equal the time taken for galaxies to reach their current observed distances from us, assuming they started at a single point. But this approximation seems to assume that galaxies have been moving away from us at constant velocities, which appears to contradict the increasing velocities suggested by Hubble's Law.

Can someone help clarify where I’ve gone wrong here? Why is the Hubble time used as an approximation for the age of the universe in a scenario of constant expansion, when the recessional velocities of galaxies would not be constant?

Best Answer

As per Hubble's Law, the recessional velocity of a galaxy is proportional to its distance from us.

Correct.

In a universe that is expanding at a constant rate, this would imply that as the distance to a galaxy increases over time due to the expansion, its recessional velocity would also increase.

Incorrect, but I can understand why you might interpret the law that way.

Let's go back to the basic definitions. When we say the universe is expanding, we essentially mean that its "scale" is increasing. So the distance $D$ between any two objects increases with time. The expansion rate is then naturally defined as being the rate of change in this separation, $\dot{D}$. A constant expansion rate therefore means any two objects recede from each other at a fixed velocity at all times. In this case, Hubble's law, $\dot{D}=H_0D$, can only (strictly) hold at a single instant.

But this is only because we're taking $H_0$ to be a constant. We can instead define the Hubble parameter by $$H(t)=\frac{\dot{D}(t)}{D(t)},$$ which will be time-dependent with a form determined by the evolution of $D(t)$. In the case of constant expansion, $\dot{D}(t)={\rm const.}$, implying $D(t)={\rm const.}\times t$ and therefore $H(t)=1/t$. So we find that $t=1/H(t)$.
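As a quick numerical sketch (not part of the original answer; the recession velocity `v` is an arbitrary illustrative constant), we can check that for a constant expansion rate, $D(t)=vt$, the Hubble parameter $\dot{D}/D$ comes out as exactly $1/t$, so the Hubble time recovers the age:

```python
# Sketch: verify H(t) = D'(t)/D(t) = 1/t when the expansion rate is constant.
# v is an arbitrary assumed recession velocity; it cancels out of H.

def hubble_parameter(t, v=100.0):
    """Hubble parameter for D(t) = v * t (constant expansion rate)."""
    D = v * t          # separation grows linearly with time
    D_dot = v          # expansion rate is constant
    return D_dot / D   # = 1/t, independent of v

for t in [1.0, 5.0, 13.8]:
    H = hubble_parameter(t)
    print(t, 1.0 / H)  # the Hubble time 1/H recovers the age t
```

Note that `v` drops out entirely: every pair of objects, whatever their separation, yields the same $H(t)=1/t$.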

Of course, the expansion hasn't remained constant throughout the universe's history, but has evolved under the influence of its matter and energy content. However, during the periods of radiation domination and matter domination that characterized most of its history, the scale factor evolved as a power law, $D(t)\propto t^\alpha$, which gives $H(t)=\alpha/t$ and hence an age $t=\alpha/H(t)$, within an order-unity factor of the Hubble time. So the estimate is only approximate, and the currently close agreement between $1/H_0$ and the age of the universe is more of a coincidence than anything. This is illustrated by the following plots, which show the slope of the scale factor $a(t)$ at the present time (marked red) happening to intersect close to $(0,0)$ (left), and the history of the ratio of $1/H(t)$ to $t$ (right).
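More precisely, a power-law scale factor $D(t)\propto t^\alpha$ gives $H(t)=\alpha/t$, so the Hubble time $1/H$ differs from the age $t$ by the factor $1/\alpha$. A small sketch (the standard FRW exponents $\alpha=2/3$ for matter domination and $\alpha=1/2$ for radiation domination are textbook values, not taken from the answer above):

```python
# Sketch: for D(t) = t**alpha, the Hubble time 1/H(t) equals t/alpha.

def hubble_time_over_age(alpha, t=1.0):
    """Ratio (1/H)/t for a power-law scale factor D(t) = t**alpha."""
    H = alpha / t             # H = d/dt ln(t**alpha) = alpha/t
    hubble_time = 1.0 / H
    return hubble_time / t    # = 1/alpha, independent of t

print(hubble_time_over_age(2 / 3))  # matter domination: 1/H is 1.5x the age
print(hubble_time_over_age(1 / 2))  # radiation domination: 1/H is 2x the age
```

Either way the ratio is of order unity, which is why $1/H_0$ still works as a rough age estimate even though it is not exact.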

(Own work)

Funnily enough, we are currently entering the epoch of dark energy domination as matter becomes more and more diluted. This means the scale factor will evolve exponentially, $D(t)\propto e^{Ht}$, so $H(t)$ will become a constant. So the picture you imagined, with recessional velocities growing alongside distances under a fixed Hubble parameter, will become very accurate in a few billion years :)
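To close the loop, a last sketch (the value of `H0` is arbitrary, chosen only for illustration) checks numerically that an exponential scale factor really does give a time-independent Hubble parameter, unlike the power-law cases above:

```python
import math

# Sketch: for D(t) = exp(H0 * t), the Hubble parameter D'/D is constant.
# H0 = 0.07 is an arbitrary illustrative value (roughly per-Gyr units).

H0 = 0.07

def hubble_parameter(t, dt=1e-6):
    """Estimate H(t) = D'(t)/D(t) via a central finite difference."""
    D = math.exp(H0 * t)
    D_dot = (math.exp(H0 * (t + dt)) - math.exp(H0 * (t - dt))) / (2 * dt)
    return D_dot / D

for t in [1.0, 10.0, 100.0]:
    print(t, hubble_parameter(t))  # stays close to H0 at every epoch
```

With constant $H$, the recession velocity $\dot{D}=HD$ grows as $D$ grows, which is exactly the behaviour described in the question.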
