
I am watching a series of lectures by the Nobel Prize laureate Brian Schmidt and Paul Francis, and in this episode (at 4:20) they make the simple assumption that a galaxy receding from us due to the Hubble law must fulfil

$D=v \cdot t$

where $t$ is the time since the Big Bang (or since we as observers overlapped in space with the galaxy), $v$ is the galaxy's recession velocity from us, and $D$ is the proper distance of the galaxy from us (or at least I guess it is the *proper* distance, because they also use it in $H_0=cz/D$). When they rearrange for $t$ they get the age of the universe.
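
For concreteness, here is a minimal sketch of that rearrangement, $t = D/v = D/(H_0 D) = 1/H_0$, evaluated numerically; the fiducial value $H_0 = 70\ \mathrm{km/s/Mpc}$ is my own assumption, not a number from the lecture:

```python
# Naive age estimate from D = v*t rearranged to t = D/v = 1/H0 (the Hubble time).
# H0 = 70 km/s/Mpc is an assumed fiducial value, not taken from the lecture.

KM_PER_MPC = 3.0857e19      # kilometres in one megaparsec
SECONDS_PER_GYR = 3.156e16  # seconds in one gigayear

H0 = 70.0                   # km/s/Mpc
H0_per_s = H0 / KM_PER_MPC  # convert H0 to units of 1/s
hubble_time_gyr = 1.0 / H0_per_s / SECONDS_PER_GYR

print(f"Hubble time 1/H0 ≈ {hubble_time_gyr:.1f} Gyr")  # ≈ 14.0 Gyr
```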

What I don't understand is why the equation $D=v \cdot t$ is a valid assumption in this example. Doesn't this equation only hold if the object moves at a constant speed? But according to the Hubble law, every galaxy's recession velocity increases with time, as the sketch below illustrates. What am I getting wrong?
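
To make my worry concrete, here is a minimal numerical sketch assuming a *constant* expansion rate $H$, so that $\dot{D} = H D$ (my simplifying assumption, not the lecture's model); the solution $D(t) = D_0 e^{Ht}$ gives a recession velocity $v = HD$ that grows with time, so $D = v \cdot t$ cannot hold exactly:

```python
# Illustrates the concern: if dD/dt = H*D with constant H, then v = H*D
# grows with time, so the galaxy is not moving at a constant speed.
# H and D0 below are illustrative values of my own choosing.

import math

H = 0.07    # expansion rate in 1/Gyr (roughly H0 = 70 km/s/Mpc)
D0 = 100.0  # initial proper distance in Mpc (arbitrary)

for t in (0.0, 5.0, 10.0):        # times in Gyr
    D = D0 * math.exp(H * t)      # solution of dD/dt = H*D
    v = H * D                     # recession velocity in Mpc/Gyr
    print(f"t = {t:4.1f} Gyr   D = {D:7.1f} Mpc   v = {v:6.2f} Mpc/Gyr")
```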

  • It is just an approximation. See this question, answers and links therein: https://physics.stackexchange.com/questions/328409/why-is-hubbles-constant-exactly-the-inverse-of-the-age-of-the-universe?rq=1 – Koschi Apr 19 '22 at 13:14
