
Let $X_1$ and $X_2$ be real-valued random variables, such that $X:=(X_1,X_2)$ has the density $$f_X(x_1,x_2)=\begin{cases}e^{-(x_1+x_2)}&\text{, if }x_1,x_2\ge 0\\0&\text{, otherwise}\end{cases}\tag{1}$$ Then

  1. $X_1+X_2$ and $X_1/X_2$ are independent
  2. $X_1+X_2$ and $X_1/(X_1+X_2)$ are independent

I know the following fact: If $X_i$ has a continuous density $f_{X_i}$, then $X_1$ and $X_2$ are independent iff $$f_X(x_1,x_2)=f_{X_1}(x_1)f_{X_2}(x_2)\tag{2}$$ Since $(1)$ is obviously continuous, this fact seems to be useful here. However, I'm unsure what exactly I really know about $X_1$, $X_2$, $X_1+X_2$, $X_1/X_2$ and $X_1/(X_1+X_2)$ if the only thing given is the density of $X$.
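For completeness, here is a small sympy sketch (assuming sympy is available) that computes the marginals from $(1)$ and checks the factorization $(2)$ for $X_1,X_2$ themselves; it does not address the two claims above, only the setup.

```python
# Minimal sympy sketch: recover the marginals of X1 and X2 from (1) and
# check that the joint density factors as in (2).
import sympy as sp

x1, x2 = sp.symbols('x1 x2', positive=True)
f = sp.exp(-(x1 + x2))                 # joint density on x1, x2 >= 0

f1 = sp.integrate(f, (x2, 0, sp.oo))   # marginal of X1
f2 = sp.integrate(f, (x1, 0, sp.oo))   # marginal of X2

print(f1, f2)                          # exp(-x1) exp(-x2)
print(sp.simplify(f - f1 * f2))        # 0, i.e. f_X = f_{X1} * f_{X2}
```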

0xbadf00d
  • Try the approach there. – Did Nov 17 '14 at 14:50
  • @Did Let's try to first determine the density of $X_1+X_2$. Please notice: If $\psi\in C^1(\mathbb{R}^2\to\mathbb{R}^2)$ is such that $\psi^{-1}\in C^1$, we've got $$f_X(x)=f_Y\left(\psi(x)\right)\left|\det\frac{d\psi(x)}{dx}\right|$$ Now let $$\psi(X):=\begin{pmatrix}X_1+X_2\\X_2\end{pmatrix},$$ i.e. $$\psi^{-1}(Y)=\begin{pmatrix}Y_1-Y_2\\Y_2\end{pmatrix}$$ and thereby $$A:=\frac{d\psi(x)}{dx}=\begin{pmatrix}1&1\\0&1\end{pmatrix}$$ with $\det A=1$. So, $$f_Y(y_1,y_2)=f_X(y_1-y_2,y_2)$$ We can obtain the density of $Y_1=X_1+X_2$ by – 0xbadf00d Nov 17 '14 at 16:40
  • $$f_{Y_1}(y_1)=\int_{-\infty}^\infty f_Y(y_1,y_2)dy_2=\int_{-\infty}^\infty \underbrace{e^{-(y_1-y_2+y_2)}}_{=e^{-y_1}}dy_2$$ But wait: this integral doesn't converge. What am I doing wrong? – 0xbadf00d Nov 17 '14 at 16:41
  • 1
    Why this should help to deduce the density of $(X_1+X_2,X_1/X_2)$ is beyond me, but anyway, $$f_{X_1+X_2}(y)=\int_{-\infty}^\infty f_{X_1,X_2}(x_1,y-x_1)\mathrm dx_1.$$ – Did Nov 17 '14 at 17:13
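As a side note on the divergent integral in the comments: $f_X(y_1-y_2,y_2)$ vanishes unless $0\le y_2\le y_1$, so the inner integral only runs over $[0,y_1]$ and gives $f_{X_1+X_2}(y_1)=y_1e^{-y_1}$ for $y_1\ge 0$. A short sympy sketch (again assuming sympy) confirming this:

```python
# The indicators in (1) restrict the inner integral to 0 <= y2 <= y1,
# which is what makes it converge.
import sympy as sp

y1, y2 = sp.symbols('y1 y2', positive=True)
f_Y = sp.exp(-((y1 - y2) + y2))        # f_X(y1 - y2, y2) on 0 <= y2 <= y1

f_Y1 = sp.integrate(f_Y, (y2, 0, y1))  # density of Y1 = X1 + X2
print(f_Y1)                            # y1*exp(-y1), i.e. Gamma(2, 1)
```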

1 Answer


Let $U=X_1+X_2$ and $V=\frac{X_1}{X_2}$. First, observe that $U,V \in (0,+\infty)$ and that $X_1, X_2$ are independent since $$f_{X_1,X_2}(x_1,x_2)=e^{-x_1}1_{\{x_1\ge0\}}\cdot e^{-x_2}1_{\{x_2\ge0\}}=f_{X_1}(x_1)f_{X_2}(x_2)$$ Second, $$\begin{cases}U=X_1+X_2\\V=\frac{X_1}{X_2}\end{cases}\implies\begin{cases}X_1=\dfrac{UV}{V+1}\\X_2=\dfrac{U}{V+1}\end{cases}$$ which gives you $$\det(J)=\begin{vmatrix}\frac{\partial X_1}{\partial U}&\frac{\partial X_1}{\partial V}\\\frac{\partial X_2}{\partial U}&\frac{\partial X_2}{\partial V}\end{vmatrix}=\begin{vmatrix}\frac{V}{V+1}&\frac{U}{(V+1)^2}\\\frac{1}{V+1}&-\frac{U}{(V+1)^2}\end{vmatrix}=-\frac{U}{(V+1)^2}$$ and therefore $$\begin{align*}f_{U,V}(u,v)&=f_{X_1,X_2}\left(\frac{uv}{v+1},\frac{u}{v+1}\right)\cdot|\det(J)|=\exp\left(-\frac{uv}{v+1}-\frac{u}{v+1}\right)\cdot\frac{u}{(v+1)^2}\\&=ue^{-u}1_{\{u\ge0\}}\cdot\frac{1}{(v+1)^2}1_{\{v\ge0\}}=g(u)\cdot h(v)\end{align*}$$ which shows that $U,V$ are independent, since $f_{U,V}$ factors into a function of $u$ alone times a function of $v$ alone. Note, however, that in general you would multiply (and divide) $g(u), h(v)$ by appropriate constants to obtain the marginal densities of $U,V$; here both already integrate to $1$, so $g$ and $h$ are in fact the marginal densities.
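A quick symbolic check of the inversion, the Jacobian and the factorization above (a sketch, assuming sympy is available):

```python
# Verify the inverse map, the Jacobian determinant and the factorization
# of f_{U,V} obtained above.
import sympy as sp

u, v = sp.symbols('u v', positive=True)
x1, x2 = u * v / (v + 1), u / (v + 1)      # inverse of (u, v) = (x1 + x2, x1/x2)

J = sp.Matrix([x1, x2]).jacobian([u, v])
detJ = sp.simplify(J.det())
print(detJ)                                # -u/(v + 1)**2

f_UV = sp.simplify(sp.exp(-(x1 + x2)) * sp.Abs(detJ))
print(f_UV)                                # u*exp(-u)/(v + 1)**2, separates in u and v
```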


Work similarly for the second case, using the result of the first.
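Not a proof, but a quick Monte Carlo sanity check of both claims (a sketch, assuming numpy is available). Since $X_1/(X_1+X_2)=V/(V+1)$ is a function of $V$ alone, the second case indeed follows from the first.

```python
# Draw X1, X2 as independent Exp(1) samples and check that, for a median
# split, P(A and B) is close to P(A) * P(B) = 0.25 for both claims.
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.exponential(size=1_000_000)
x2 = rng.exponential(size=1_000_000)
u = x1 + x2

for name, w in [("X1/X2", x1 / x2), ("X1/(X1+X2)", x1 / u)]:
    a = u > np.median(u)
    b = w > np.median(w)
    print(name, (a & b).mean())            # both should be close to 0.25
```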

Jimmy R.