I'm making an app that times how long it takes a pebble to fall, then calculates the distance it fell.
I noticed that the simple $\frac{1}{2}gt^2$ model was becoming increasingly inaccurate for longer drops, so I'm curious whether there's a standard formula that accounts for terminal velocity.
What I've come up with so far
The Wikipedia page on terminal velocity lists the formula for terminal velocity ($V_t$) as:
$$ V_t = \sqrt{\frac{2mg}{\rho AC_d}} $$
I'm approximating that the pebble has mass $m = 5\,\mathrm{g}$ and projected area $A = 4\,\mathrm{cm^2}$ ($0.0004\,\mathrm{m^2}$). $C_d = 0.47$ for a sphere, and $\rho = 1.204\,\mathrm{kg/m^3}$ for air at $20^{\circ}\mathrm{C}$, so that gives us:
$$ V_t = \sqrt{\frac{2\cdot0.005\cdot9.81}{1.204\cdot 0.0004\cdot 0.47}} \approx 20.82\ \mathrm{m/s} $$
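As a quick sanity check on that arithmetic, the same numbers can be plugged in directly (a Python sketch; the variable names are mine):

```python
import math

# Parameters from above, in SI units
m = 0.005    # mass: 5 g
g = 9.81     # gravitational acceleration, m/s^2
rho = 1.204  # air density at 20 deg C, kg/m^3
A = 0.0004   # projected area, m^2
c_d = 0.47   # drag coefficient for a sphere

# V_t = sqrt(2 m g / (rho A C_d))
v_t = math.sqrt(2 * m * g / (rho * A * c_d))
print(f"{v_t:.2f} m/s")  # ~20.82 m/s
```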
Seems pretty above board so far. But now I need a velocity curve that starts out like the no-drag velocity $v(t) = gt$ and "grows asymptotically" to $V_t = 20.82\ \mathrm{m/s}$. I don't know how to derive that, but playing around with my computer's graphing program got me this:
$$ f(t) = \frac{(V_t-\frac{1}{5})\,t^2+1}{t^2+5}-\frac{1}{5} $$
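Evaluating that ad-hoc curve numerically (a quick sketch, `f` is just the formula above with $V_t = 20.82$) shows it starts at $0$ and levels off near, but not exactly at, $V_t$ — its actual limit works out to $V_t - \frac{2}{5}$:

```python
def f(t, v_t=20.82):
    """The ad-hoc curve above: starts at 0, levels off near v_t."""
    return ((v_t - 0.2) * t**2 + 1) / (t**2 + 5) - 0.2

print(f(0))     # 0.0
print(f(10))    # already most of the way to v_t
print(f(1000))  # the limit is v_t - 0.4, slightly below v_t
```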
($gt$ is green, $V_t$ is dashed, my made-up formula is blue)
My approximated formula seems... close? I could take some measurements and validate this new formula experimentally, but I can't imagine I'm the first person to need to approximate distance given time for a falling object.
Also: this graph made it pretty clear that after ~2 seconds of the pebble falling, $gt$ (the green line) starts getting grossly inaccurate.
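That ~2 second mark lines up with where the no-drag velocity $gt$ would cross $V_t$, at $t = V_t / g$ (a quick check, using the numbers from above):

```python
v_t = 20.82  # terminal velocity from above, m/s
g = 9.81     # gravitational acceleration, m/s^2

# Time at which the no-drag velocity g*t would reach v_t;
# beyond this, ignoring drag clearly overestimates the speed.
t_cross = v_t / g
print(f"{t_cross:.2f} s")  # ~2.12 s
```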
tl;dr: what's the formula for the velocity of a falling object with respect to time, given $m$, $g$, $\rho$, $A$, and $C_d$?


