
Suppose I have two (independent) continuous random variables $X$ and $Y$ with pdfs $f(x)$ and $g(x)$ respectively. It is well-known that $f(x)g(x)$ is not the pdf of $XY$; in fact, $f(x)g(x)$ may not be a pdf at all (see Appendix).

On the other hand, (assuming $X$ and $Y$ have common support — h/t Thomas Andrews) it's easy enough to make $f(x)g(x)$ a pdf: just rescale by $\left(\int{f(x)g(x)\,dx}\right)^{-1}$. Then we have the following interesting facts:

  • If $f(x)=g(x)=1[0\leq x\leq 1]$ (uniform distribution), then $f(x)g(x)$ is also the pdf of the uniform distribution.
  • If $f(x)=\lambda_fe^{-\lambda_fx}\cdot1[0\leq x]$, $g(x)=\lambda_ge^{-\lambda_gx}\cdot1[0\leq x]$ (exponential distribution), then $f(x)g(x)\propto(\lambda_f+\lambda_g)e^{-(\lambda_f+\lambda_g)x}\cdot1[0\leq x]$, also the pdf of the exponential distribution.
  • Same for two normal distributions.
  • Multiplying a normal (mean $\mu$, standard deviation $\sigma$) and an exponential with rate $\lambda$ gives a normal truncated to $[0,\infty)$: the product is proportional to a normal pdf with the same $\sigma$ and mean shifted down to $\mu-\lambda\sigma^2$.
  • If $X$ is as in the appendix (pdf $2x\cdot1[0\leq x\leq 1]$) and $Y$ is supported on $[0,1]$ with mean $\mu$ and standard deviation $\sigma$, then $f(x)g(x)$ after rescaling has mean $\mathbb{E}[Y^2]/\mathbb{E}[Y]=\mu+\sigma^2/\mu$, i.e. the mean of $Y$ shifted up by $\sigma^2/\mu$ (which is $2\sigma^2$ when $\mu=\tfrac12$). A quick numerical check of these facts is sketched after this list.
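To make the above concrete, here is a minimal numerical sketch (the choices $\lambda_f=1.3$, $\lambda_g=2.1$ and a Beta$(2,2)$ pdf for $Y$ are arbitrary illustrations, and the grid-based integration is only approximate). It checks that rescaling the product of two exponential pdfs recovers the $\mathrm{Exp}(\lambda_f+\lambda_g)$ pdf, and that tilting a pdf on $[0,1]$ by $2x$ moves the mean to $\mathbb{E}[Y^2]/\mathbb{E}[Y]$:

```python
import numpy as np

# Grid-based sanity check of the rescaled product h(x) = f(x)g(x) / ∫ f g dx.
# This is a rough numerical sketch, not a proof.

x = np.linspace(1e-6, 20, 200_001)

def rescaled_product(f, g):
    h = f(x) * g(x)
    return h / np.trapz(h, x)

# Exponential times exponential: should match Exp(lam_f + lam_g).
lam_f, lam_g = 1.3, 2.1
f = lambda t: lam_f * np.exp(-lam_f * t)
g = lambda t: lam_g * np.exp(-lam_g * t)
h = rescaled_product(f, g)
target = (lam_f + lam_g) * np.exp(-(lam_f + lam_g) * x)
print(np.max(np.abs(h - target)))            # ~0 up to discretization error

# f(x) = 2x on [0,1] times an arbitrary pdf g on [0,1]:
# the rescaled product should have mean E[Y^2]/E[Y].
y = np.linspace(0, 1, 100_001)
g_beta = 6 * y * (1 - y)                     # Beta(2,2) pdf: E[Y]=1/2, E[Y^2]=3/10
h2 = 2 * y * g_beta
h2 /= np.trapz(h2, y)
print(np.trapz(y * h2, y), 0.3 / 0.5)        # both ≈ 0.6
```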

So clearly there's something going on here. Is there a probabilistic interpretation for $f(x)g(x)$?


Appendix

For example, let $$f(x)=g(x)=2x\cdot 1[0\leq x\leq1]$$ (where $1[A]$ is the indicator function of $A$). Then $$\int_{\mathbb{R}}{f(x)g(x)\,dx}=\int_0^1{4x^2\,dx}=\frac{4}{3}\neq1$$ Thus $f(x)g(x)$ isn't even a pdf.

For completeness, the law of $XY$ is as follows: \begin{align*} \mathbb{P}[XY\leq x]&=\int_{\mathbb{R}}{f(s)\mathbb{P}\left[Y\leq\frac{x}{s}\right]\,ds} \\ &=\int_0^1{2s\min{(1,(x/s)^2)}\,ds} \\ &=\int_0^x{2s\,ds}+\int_x^1{\frac{2x^2}{s}\,ds} \\ &=x^2-2x^2\ln{(x)} \end{align*} To get the pdf, differentiate; the result is precisely $-4x\ln{(x)}\cdot1[0\leq x\leq1]$.
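As a rough check of both appendix computations, here is a small simulation sketch (it assumes NumPy, and samples $X$ by inverse-CDF sampling: since $F(x)=x^2$ on $[0,1]$, $\sqrt{U}$ has pdf $2x$ for uniform $U$):

```python
import numpy as np

rng = np.random.default_rng(0)

# X, Y iid with pdf 2x on [0,1]; inverse-CDF sampling gives X = sqrt(U).
n = 1_000_000
X = np.sqrt(rng.random(n))
Y = np.sqrt(rng.random(n))
Z = X * Y

# Compare the empirical histogram of XY with the claimed density -4 z ln z.
edges = np.linspace(0.01, 0.99, 50)
hist, edges = np.histogram(Z, bins=edges, density=True)
mids = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(hist - (-4 * mids * np.log(mids)))))   # small (Monte Carlo / binning noise)

# The integral from the first paragraph of the appendix:
x = np.linspace(0, 1, 100_001)
print(np.trapz(4 * x**2, x))                                # ≈ 4/3
```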

  • You can only rescale if that integral is non-zero. If $X$ is strictly negative, and $Y$ is strictly positive, then $\int f(x)g(x)\,dx=0.$ – Thomas Andrews Jun 27 '21 at 23:57
  • @ThomasAndrews: Ooh, good point! – Jacob Manaker Jun 27 '21 at 23:58
  • If you consider the simpler discrete probabilities, the product is $$P(X=n\mid X=Y)=\frac{P(X=n\land Y=n)}{P(X=Y)}$$ This assumes that $P(X=Y)\neq 0.$ – Thomas Andrews Jun 28 '21 at 00:33
  • @ThomasAndrews: That sounds like the start of an answer to me! – Jacob Manaker Jun 28 '21 at 00:37
  • Yeah, I’m still thinking about how that plays out in the continuous case, since in most continuous cases, $P(X=Y)=0.$ – Thomas Andrews Jun 28 '21 at 00:43
  • The natural interpretation is that this is the joint distribution of independent $X$ and $Y$, conditioned on the event that $X=Y$. I’m no probability expert but I understand there are some ways to interpret conditional probability that make sense even for certain zero-probability events. See for instance this answer: https://math.stackexchange.com/a/450004/30402 – Erick Wong Jun 28 '21 at 01:30
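To see the discrete observation from the comments in action, here is a tiny check with two arbitrary pmfs (the specific numbers are only an illustration): normalizing the pointwise product of the pmfs reproduces $P(X=n\mid X=Y)$ computed directly from the joint distribution.

```python
import numpy as np

# Discrete analogue from the comments: for independent X, Y,
# normalizing p_X(n) * p_Y(n) gives P(X = n | X = Y).
p_X = np.array([0.2, 0.5, 0.3])          # arbitrary pmf on {0,1,2}
p_Y = np.array([0.6, 0.1, 0.3])          # another pmf on {0,1,2}

prod = p_X * p_Y                          # P(X = n and Y = n)
print(prod / prod.sum())                  # normalized product of pmfs

# Brute-force the conditional directly from the joint distribution.
joint = np.outer(p_X, p_Y)                # joint pmf of (X, Y)
diag = np.diag(joint)                     # mass on the event {X = Y}
print(diag / diag.sum())                  # P(X = n | X = Y) -- same vector
```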

1 Answer


Here's one thing I've noticed: multiplying the pdf multiplies the moment-generating/characteristic function. (Sort of. See below.)

First, recall what the moment generating function (MGF) is: given a random variable $R$, the MGF of $R$ is $$M_R(t)=\mathbb{E}[e^{tR}]$$ The name arises from the fact that $M_R$ has Maclaurin series $$M_R(t)=1+\mathbb{E}[R]t+\mathbb{E}[R^2]\frac{t^2}{2}+\dots\qquad t\ll1$$ $M_R$ doesn't always exist (what if $R$ has infinite moments?), but an analogous function, the characteristic function, always does; everything I say can be translated to that mutatis mutandis. In any case, when the moment generating function exists on a neighborhood of $0$, it uniquely determines the distribution.
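For a concrete instance of the series claim, here is a short symbolic check (using SymPy; the choice $R\sim\mathrm{Exp}(3/2)$ is an arbitrary example) that the Maclaurin coefficients of $M_R$ are indeed $\mathbb{E}[R^k]/k!$:

```python
import sympy as sp

# Check the Maclaurin-series claim for R ~ Exp(lam), whose MGF is lam/(lam - t) for t < lam.
x, t = sp.symbols('x t', real=True)
lam = sp.Rational(3, 2)
pdf = lam * sp.exp(-lam * x)

M = lam / (lam - t)                          # closed form of E[e^{tR}]
series = sp.series(M, t, 0, 4).removeO()     # Maclaurin polynomial up to t^3

for k in range(4):
    moment = sp.integrate(x**k * pdf, (x, 0, sp.oo))      # E[R^k] = k!/lam^k
    print(series.coeff(t, k) * sp.factorial(k), moment)   # the pairs agree
```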

Now, let $X$ have pdf $f(x)$ and let $Z$ have pdf $cf(z)g(z)$, where $c=\left(\int{f(x)g(x)\,dx}\right)^{-1}$ is the rescaling constant from the question. Then we can write down a simple formula for the moment-generating function of $Z$: $$\mathbb{E}[e^{tZ}]=\int_{\mathbb{R}}{e^{tz}\cdot cf(z)g(z)\,dz}=\int_{\mathbb{R}}{(cg(z)e^{tz})f(z)\,dz}=\mathbb{E}[cg(X)e^{tX}]$$
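As a quick sanity check of this identity (my own example, assuming NumPy): take $f$ to be the $\mathrm{Exp}(1)$ pdf and $g$ the $\mathrm{Exp}(2)$ pdf, so $c=3/2$ and $Z\sim\mathrm{Exp}(3)$. Then both expectations should agree with the closed-form $\mathrm{Exp}(3)$ MGF, up to Monte Carlo error:

```python
import numpy as np

rng = np.random.default_rng(1)

# Concrete check of E[e^{tZ}] = E[c g(X) e^{tX}] with f = Exp(1) pdf, g = Exp(2) pdf.
# Then c = 1 / ∫ f g = 3/2 and the rescaled product c*f*g is the Exp(3) pdf.
lam_f, lam_g = 1.0, 2.0
c = (lam_f + lam_g) / (lam_f * lam_g)          # 1 / ∫ f(x) g(x) dx = 3/2 here
g = lambda x: lam_g * np.exp(-lam_g * x)

n = 2_000_000
X = rng.exponential(1 / lam_f, n)              # X has pdf f
Z = rng.exponential(1 / (lam_f + lam_g), n)    # Z has pdf c*f*g, i.e. Exp(3)

for t in (-1.0, 0.5, 1.0):
    lhs = np.mean(np.exp(t * Z))                       # E[e^{tZ}]
    rhs = np.mean(c * g(X) * np.exp(t * X))            # E[c g(X) e^{tX}]
    exact = (lam_f + lam_g) / (lam_f + lam_g - t)      # MGF of Exp(3) at t
    print(t, lhs, rhs, exact)                          # all three ≈ equal
```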