
Let $X_1\sim\operatorname{Exp}(\lambda_1)$ and $X_2\sim\operatorname{Exp}(\lambda_2)$ be independent random variables.

Show that $P(X_1<X_2)=\dfrac{\lambda_1}{\lambda_1+\lambda_2}$.
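
As a quick numerical sanity check of the claimed identity, here is a minimal Monte Carlo sketch in Python (the rates $\lambda_1=1.5$, $\lambda_2=0.5$, the seed, and the sample size are arbitrary illustrative choices, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
lam1, lam2 = 1.5, 0.5        # arbitrary example rates
n = 1_000_000                # number of simulated pairs

# NumPy parametrizes the exponential by the scale 1/lambda, not the rate
x1 = rng.exponential(scale=1 / lam1, size=n)
x2 = rng.exponential(scale=1 / lam2, size=n)

print((x1 < x2).mean())        # empirical estimate of P(X1 < X2)
print(lam1 / (lam1 + lam2))    # claimed closed form: 0.75
```

The empirical frequency agrees with $\lambda_1/(\lambda_1+\lambda_2)=0.75$ up to Monte Carlo error.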

1 Answer


Hint: Just calculate

$$P(X_1<X_2)=\int_0^{\infty} \int_0^{x_2} \lambda_1\cdot e^{-\lambda_1\cdot x_1}\cdot \lambda_2\cdot e^{-\lambda_2\cdot x_2} dx_1 dx_2$$

$$=\lambda_1\cdot \lambda_2\cdot\int_0^{\infty} e^{-\lambda_2\cdot x_2}\cdot \left( \int_0^{x_2} e^{-\lambda_1\cdot x_1} dx_1 \right) dx_2$$
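
For completeness, the inner integral evaluates to $\frac{1}{\lambda_1}\left(1-e^{-\lambda_1 x_2}\right)$, so the expression reduces to

$$\lambda_2\int_0^{\infty} \left(e^{-\lambda_2\cdot x_2}-e^{-(\lambda_1+\lambda_2)\cdot x_2}\right) dx_2=\lambda_2\cdot\left(\frac{1}{\lambda_2}-\frac{1}{\lambda_1+\lambda_2}\right)=\frac{\lambda_1}{\lambda_1+\lambda_2}.$$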
