The following question has been taken from David J. Griffiths's Introduction to Quantum Mechanics. This is not my homework! :D
Suppose I drop a rock off a cliff of height $h$. As it falls, I snap a million photographs, at random intervals. On each picture I measure the distance the rock has fallen. Question: What is the average of all these distances? That is to say, what is the time average of the distance traveled?
Now the book gives the solution, but I do not understand the question itself. The solution goes on to develop a probability density, though I do not see why. The probability density $\rho(x)$ is shown below. I am aware of the mechanics of a freely falling body; what I fail to understand is the question. What is the idea behind it? The distance the rock travels has to be $h$, in reality, so how does taking photos change the average distance? The total time of travel is $T = \sqrt{2h/g}$. If we are to take a million snaps within the time $T$, then surely a snap has to be taken after every fixed interval of time in order to accommodate a million snaps. Kindly help me see the physical reality behind this problem. Thank you!
$$\rho(x) = \frac{1}{2\sqrt{hx}}, \qquad 0 \leq x \leq h$$
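For reference, here is how I reconstruct where this density comes from, assuming the snapshots are spread uniformly over the time of flight (my own sketch, not necessarily the book's exact argument). The fraction of photographs landing in a time window $dt$ is

$$\rho(x)\,dx = \frac{dt}{T}, \qquad T = \sqrt{\frac{2h}{g}},$$

and free fall from rest gives

$$x = \tfrac{1}{2}gt^2 \;\Longrightarrow\; dx = gt\,dt = \sqrt{2gx}\,dt,$$

so eliminating $dt$,

$$\rho(x) = \frac{1}{T}\frac{dt}{dx} = \frac{1}{\sqrt{2h/g}\,\sqrt{2gx}} = \frac{1}{2\sqrt{hx}},$$

which would make the average of the photographed distances

$$\langle x\rangle = \int_0^h x\,\rho(x)\,dx = \frac{1}{2\sqrt{h}}\int_0^h \sqrt{x}\,dx = \frac{h}{3}.$$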
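As a sanity check, here is a quick numerical experiment (purely illustrative; the parameter values and variable names are my own assumptions, not from the book): draw a million random instants in $[0, T]$ and average the distance fallen at each.

```python
import numpy as np

# Illustrative parameters (my own choice, not from the book)
g = 9.81   # gravitational acceleration, m/s^2
h = 100.0  # cliff height, m

T = np.sqrt(2 * h / g)  # total time of fall

# A million "photographs" taken at random instants during the fall
rng = np.random.default_rng(0)
t = rng.uniform(0.0, T, size=1_000_000)

# Distance fallen by the time of each snapshot
x = 0.5 * g * t**2

print(x.mean())  # close to h/3, not h/2 or h
print(h / 3)
```

The average comes out near $h/3$ rather than $h/2$ or $h$, which at least agrees with the density quoted above, even though I do not yet see the physical picture.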