In a Robertson-Walker universe, light is emitted from a star with spatial coordinates $(r_s,\theta_s,\phi_s)$. It travels radially inwards and is received by an observer situated at the origin $(r=0)$. Show that the ratio of the observed wavelength $\lambda$ to the proper wavelength $\lambda_0$ is given by $$\lambda/\lambda_0 = R(t_2)/R(t_1),$$ where $t_1$ is the time of emission, and $t_2$ is the time of reception.
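A sketch of the standard argument, assuming the metric is written in the common form $ds^2 = -c^2\,dt^2 + R(t)^2\left[\frac{dr^2}{1-kr^2} + r^2\,d\Omega^2\right]$ (the question does not fix the coordinate representation, so this choice is an assumption):

```latex
% Radial null geodesic: ds^2 = 0 with d\theta = d\phi = 0 gives
% c\,dt/R(t) = -dr/\sqrt{1-kr^2} for inward propagation, so
\begin{align}
  \int_{t_1}^{t_2} \frac{c\,dt}{R(t)}
    &= \int_0^{r_s} \frac{dr}{\sqrt{1-kr^2}}.
  \intertext{The right-hand side depends only on $r_s$, so a crest
  emitted at $t_1+\delta t_1$ and received at $t_2+\delta t_2$
  satisfies the same equation; subtracting the two gives}
  \frac{\delta t_1}{R(t_1)} &= \frac{\delta t_2}{R(t_2)}.
  \intertext{With $\lambda_0 = c\,\delta t_1$ and
  $\lambda = c\,\delta t_2$, this yields}
  \frac{\lambda}{\lambda_0}
    &= \frac{\delta t_2}{\delta t_1} = \frac{R(t_2)}{R(t_1)}.
\end{align}
```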
- Do you mean spatial coordinates? And what is $R(t)$? – draks ... Sep 27 '12 at 15:03
- This might be better placed at http://physics.stackexchange.com. – joriki Sep 27 '12 at 15:43
- @draks From the context it looks like he means spherical coordinates. – rschwieb Sep 27 '12 at 16:23
- I think that in order to be able to answer the question, we should know exactly which coordinate representation of the metric he is talking about. – Raskolnikov Oct 02 '12 at 15:31