How to calculate the probability density function of $Z = X_1/X_2$

Let's first assume that $X_1$ and $X_2$ are independent continuous random variables.

The cumulative distribution function for $Z$ is $$ \begin{eqnarray} F_Z(z) &=& \mathbb{P}( X_1/X_2 \le z) = \mathbb{P}( X_1 \le X_2 z ; X_2 >0) + \mathbb{P}( X_1 \ge X_2 z ; X_2< 0) \\ &=& \mathbb{E}( F_{X_1}( z X_2) ; X_2 > 0) + \mathbb{E}( 1 - F_{X_1}( z X_2) ; X_2 < 0) \end{eqnarray} $$
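As a quick sanity check of this decomposition, here is a small Monte Carlo sketch (mine, not part of the derivation), assuming for concreteness that $X_1$ and $X_2$ are independent standard normals; it compares the empirical $\mathbb{P}(X_1/X_2 \le z)$ with the two-term expectation on the right-hand side:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
x1 = rng.standard_normal(1_000_000)
x2 = rng.standard_normal(1_000_000)
z = 1.3  # arbitrary evaluation point

# Left-hand side: empirical P(X1/X2 <= z)
lhs = np.mean(x1 / x2 <= z)

# Right-hand side: E(F_{X1}(z X2); X2 > 0) + E(1 - F_{X1}(z X2); X2 < 0)
rhs = np.mean(np.where(x2 > 0, norm.cdf(z * x2), 1.0 - norm.cdf(z * x2)))

print(lhs, rhs)  # both approach F_Z(z) = 1/2 + arctan(1.3)/pi ≈ 0.791
```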

The cumulative distribution function $F_Z(z)$ is continuous, so the probability density of $Z$ is found by differentiation, $f_Z(z) = F_Z^\prime(z)$: $$ f_Z(z) = \mathbb{E}( f_{X_1}( z X_2) X_2 ; X_2 > 0) + \mathbb{E}( -f_{X_1}( z X_2) X_2 ; X_2 < 0) = \mathbb{E}\left( \vert X_2\vert \cdot f_{X_1}(z X_2)\right) $$
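To see the formula in action with a case other than the normal one below, here is a small quadrature sketch of mine, assuming $X_1$ and $X_2$ are independent $\mathrm{Exp}(1)$ variables; then $\mathbb{E}(\vert X_2\vert f_{X_1}(zX_2)) = \int_0^\infty x\,\mathrm{e}^{-zx}\mathrm{e}^{-x}\,\mathrm{d}x = \frac{1}{(1+z)^2}$ for $z > 0$, which the numerical integral reproduces:

```python
import numpy as np
from scipy.integrate import quad

def f_exp(x):
    """Density of Exp(1), supported on x >= 0."""
    return np.exp(-x) * (x >= 0)

def f_Z(z):
    """f_Z(z) = E(|X2| f_{X1}(z X2)) for independent Exp(1) variables."""
    integrand = lambda x: abs(x) * f_exp(z * x) * f_exp(x)
    value, _ = quad(integrand, 0, np.inf)
    return value

for z in [0.5, 1.0, 2.0]:
    print(z, f_Z(z), 1.0 / (1.0 + z) ** 2)  # numerical vs closed form
```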


Example: Let's work out an example. Let $X_1$ and $X_2$ be independent random variables, each following the standard normal distribution. Then $$ \begin{eqnarray} f_Z(z) &=& \mathbb{E}\left( \vert X_2\vert \frac{1}{\sqrt{2\pi}} \mathrm{e}^{-\frac{1}{2} z^2 X_2^2} \right) \\ &=& \int_{-\infty}^\infty \vert x\vert \frac{1}{\sqrt{2\pi}} \mathrm{e}^{-\frac{1}{2} z^2 x^2} \cdot \frac{1}{\sqrt{2\pi}} \mathrm{e}^{-\frac{1}{2} x^2} \mathrm{d} x \\ &\stackrel{\text{use parity}}{=}& \frac{1}{2 \pi} 2 \int_0^\infty x \exp\left( -\frac{x^2}{2} \left(1 + z^2 \right) \right) \mathrm{d} x \\ &\stackrel{t = x^2/2}{=}& \frac{1}{\pi} \int_0^\infty \exp\left(-t \left(1+z^2 \right)\right) \mathrm{d} t \\ &=& \frac{1}{\pi} \frac{1}{1+z^2} \end{eqnarray} $$

And we obtain the expected density of the standard Cauchy distribution.
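A quick simulation (my own sketch, not part of the original answer) confirms the result: the empirical CDF of a ratio of two independent standard normals matches the standard Cauchy CDF $\frac{1}{2} + \frac{1}{\pi}\arctan(z)$.

```python
import numpy as np

rng = np.random.default_rng(1)
ratio = rng.standard_normal(1_000_000) / rng.standard_normal(1_000_000)

for z in [-2.0, 0.0, 1.0]:
    empirical = np.mean(ratio <= z)
    cauchy_cdf = 0.5 + np.arctan(z) / np.pi  # standard Cauchy CDF
    print(z, empirical, cauchy_cdf)
```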


Dependent variables

The above derivation carries through almost unchanged for dependent variables whose joint measure is absolutely continuous, that is, when $\mathrm{d}F_{X_1,X_2}(x_1,x_2) = f_{X_1,X_2}(x_1,x_2)\, \mathrm{d}x_1\, \mathrm{d}x_2$. Dilip worked it out to be $$ f_Z(z) = \int_{-\infty}^\infty \vert x_2\vert f_{X_1,X_2}(z x_2, x_2)\, \mathrm{d}x_2 = \mathbb{E}\left( \vert X_2\vert \cdot f_{X_1|X_2}(z X_2 \mid X_2)\right) $$
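As a numerical illustration of this formula for genuinely dependent variables (my own sketch, not part of the original answer), take $(X_1, X_2)$ zero-mean bivariate normal with unit variances and correlation $\rho$; the ratio is then known to be Cauchy with location $\rho$ and scale $\sqrt{1-\rho^2}$, and the integral above reproduces that density:

```python
import numpy as np
from scipy.integrate import quad

rho = 0.6  # correlation of the zero-mean, unit-variance bivariate normal

def f_joint(x1, x2):
    """Bivariate normal density: zero means, unit variances, correlation rho."""
    det = 1.0 - rho ** 2
    q = (x1 ** 2 - 2.0 * rho * x1 * x2 + x2 ** 2) / det
    return np.exp(-q / 2.0) / (2.0 * np.pi * np.sqrt(det))

def f_Z(z):
    """f_Z(z) = integral of |x2| f_{X1,X2}(z x2, x2) over x2."""
    integrand = lambda x2: abs(x2) * f_joint(z * x2, x2)
    # The integrand is even in x2 (the density is invariant under
    # (x1, x2) -> (-x1, -x2)), so integrate over (0, inf) and double.
    value, _ = quad(integrand, 0, np.inf)
    return 2.0 * value

scale = np.sqrt(1.0 - rho ** 2)
for z in [-1.0, 0.6, 2.0]:
    cauchy = scale / (np.pi * ((z - rho) ** 2 + scale ** 2))  # known closed form
    print(z, f_Z(z), cauchy)
```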

Absolute continuity of the joint measure is a sufficient, but not necessary, condition for $Z$ to be a continuous random variable, i.e. for the law of $Z$ to be absolutely continuous as well.

It is instructive to work out a few examples to see what happens when the joint measure is not absolutely continuous. Borrowing an example from this post, consider $(X_1,X_2) = (\sin(U), \cos(U))$, where $U$ follows the uniform distribution on the interval $\left(-\frac{\pi}{2}, \frac{\pi}{2}\right)$. Then, although the joint distribution is not absolutely continuous, $Z = X_1/X_2 = \tan(U)$ is a continuous random variable and follows the standard Cauchy distribution.
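A short simulation of this example (again my own sketch) shows the empirical CDF of $\tan(U)$ agreeing with the standard Cauchy CDF even though $(X_1, X_2)$ lives on the unit circle and has no joint density:

```python
import numpy as np

rng = np.random.default_rng(2)
u = rng.uniform(-np.pi / 2, np.pi / 2, size=1_000_000)
ratio = np.sin(u) / np.cos(u)  # = tan(U); (X1, X2) sits on the unit circle

for z in [-1.0, 0.0, 2.0]:
    print(z, np.mean(ratio <= z), 0.5 + np.arctan(z) / np.pi)
```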

However, things can go awry. Consider this example from Didier Piau: let $(X_1,X_2) = (Y,Y)$, where $Y$ follows, say, the standard normal distribution. In this case $Z = X_1/X_2 = 1$ almost surely, and $F_Z(z) = \mathbf{1}_{z \ge 1}$, so the law of $Z$ is not absolutely continuous.


As an alternative to Sasha's answer, with $Z = Y/X$, $F_Z(z)= P\{Y/X \leq z\}$ is the total probability mass in the region of the $x$-$y$ plane where $y/x \leq z$. Although the OP did not state so, I assume that he meant that $X$ and $Y$ are jointly continuous, in which case this probability can be obtained by integrating the joint density function $f_{X,Y}(x,y)$ over this region. Sketching the region in question (for ease in setting up the integrals), we have $$\begin{align*} F_Z(z) &= \int_{x=0}^{\infty}\int_{y=-\infty}^{zx} f_{X,Y}(x,y) \mathrm dy\ \mathrm dx + \int_{x=-\infty}^{0}\int_{y=zx}^{\infty} f_{X,Y}(x,y) \mathrm dy\ \mathrm dx,\\ f_Z(z) = \frac{\mathrm d}{\mathrm dz}F_Z(z) &= \int_{0}^{\infty} x\cdot f_{X,Y}(x,zx) \mathrm dx - \int_{-\infty}^{0} x\cdot f_{X,Y}(x,zx) \mathrm dx, \end{align*} $$ via the formula for "differentiating under the integral sign".
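Here is a small quadrature sketch of these two formulas (my own addition, assuming independent standard normal $X$ and $Y$, so the result should be the standard Cauchy density):

```python
import numpy as np
from scipy.integrate import quad

def f_XY(x, y):
    """Joint density of independent standard normals X and Y."""
    return np.exp(-(x ** 2 + y ** 2) / 2.0) / (2.0 * np.pi)

def f_Z(z):
    """f_Z(z) = int_0^inf x f(x, zx) dx - int_{-inf}^0 x f(x, zx) dx."""
    pos, _ = quad(lambda x: x * f_XY(x, z * x), 0, np.inf)
    neg, _ = quad(lambda x: x * f_XY(x, z * x), -np.inf, 0)
    return pos - neg

for z in [-1.0, 0.0, 3.0]:
    print(z, f_Z(z), 1.0 / (np.pi * (1.0 + z ** 2)))  # standard Cauchy density
```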

When the joint density of $X$ and $Y$ has circular symmetry about the origin, $X$ and $Y$ are, in general, dependent random variables, except in the special case when the marginal densities of $X$ and $Y$ are zero-mean normal densities with identical variance. For circularly symmetric joint densities, the integrals above can be evaluated readily by switching to polar coordinates, and even more simply by noting that the volume under the density surface over the region of integration is a linear function of the angle $\theta$ between the line $y = zx$ and the $x$ axis, with the volume being $0$ at angle $\theta = -\pi/2$ and $1$ at $\theta = \pi/2$. Since $\theta = \arctan(z)$, we get $$ F_Z(z) = \frac{1}{2} + \frac{1}{\pi}\arctan(z) $$ and thus $Z = Y/X$ has a Cauchy density if $X$ and $Y$ have a joint density that is circularly symmetric about the origin.
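A simulation sketch (mine, with a deliberately non-Gaussian circularly symmetric pair: uniform angle and, say, an exponentially distributed radius) illustrates that the ratio is standard Cauchy regardless of the radial profile:

```python
import numpy as np

rng = np.random.default_rng(3)
theta = rng.uniform(0.0, 2.0 * np.pi, size=1_000_000)
r = rng.exponential(scale=1.0, size=1_000_000)   # any radial law will do
x, y = r * np.cos(theta), r * np.sin(theta)      # circularly symmetric (X, Y)

for z in [-0.5, 1.0, 4.0]:
    print(z, np.mean(y / x <= z), 0.5 + np.arctan(z) / np.pi)
```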

The more detailed calculation is as follows. If $f_{X,Y}(x,y) = g(r)$, where $r = \sqrt{x^2+y^2}$, then $$\int_{x=-\infty}^{\infty}\int_{y=-\infty}^{\infty} f_{X,Y}(x,y) \mathrm dy\ \mathrm dx = \int_0^{\infty}\int_0^{2\pi} g(r)\cdot r\ \mathrm d\theta\ \mathrm dr = 1,$$ so that $\int_0^{\infty} r\cdot g(r)\ \mathrm dr = \frac{1}{2\pi}$, and we have $$\begin{align*} F_Z(z) &= \int_{x=0}^{\infty}\int_{y=-\infty}^{zx} f_{X,Y}(x,y) \mathrm dy\ \mathrm dx + \int_{x=-\infty}^{0}\int_{y=zx}^{\infty} f_{X,Y}(x,y) \mathrm dy\ \mathrm dx\\ &= 2\int_{\theta=-\pi/2}^{\arctan(z)}\int_{r=0}^{\infty} r \cdot g(r)\ \mathrm dr \ \mathrm d\theta\\ &= \frac{2(\arctan(z) + \pi/2)}{2\pi} = \frac{1}{2} + \frac{1}{\pi}\arctan(z) \end{align*} $$ and the Cauchy density is obtained upon differentiating.
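For a concrete check of this calculation (my own sketch), take the uniform density on the unit disk, $g(r) = \frac{1}{\pi}$ for $r \le 1$ and $0$ otherwise; then $\int_0^\infty r\, g(r)\,\mathrm dr = \frac{1}{2\pi}$ and the polar formula reproduces the Cauchy CDF:

```python
import numpy as np
from scipy.integrate import quad

g = lambda r: (1.0 / np.pi) * (r <= 1.0)  # uniform density on the unit disk

# g vanishes beyond r = 1, so it is enough to integrate over (0, 1).
radial, _ = quad(lambda r: r * g(r), 0, 1)
print(radial, 1.0 / (2.0 * np.pi))        # both ~ 0.159155

for z in [-1.0, 0.0, 2.0]:
    F = 2.0 * (np.arctan(z) + np.pi / 2.0) * radial  # polar formula for F_Z(z)
    print(z, F, 0.5 + np.arctan(z) / np.pi)
```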


@Sasha Could someone elaborate on why $$ \begin{eqnarray} F_Z(z) &=& \mathbb{P}(X_1/X_2 \le z) = \mathbb{P}(X_1 \le X_2 z; X_2 > 0) + \mathbb{P}(X_1 \ge X_2 z; X_2 < 0) \\ &=& \mathbb{E}(F_{X_1}(zX_2); X_2 > 0) + \mathbb{E}(1 - F_{X_1}(zX_2); X_2 < 0)? \end{eqnarray} $$ In the first line the R.H.S. has both $(X_1\leq X_2 z)$ and $(X_1\geq X_2 z)$. I get that we consider the cases $X_2 \gt 0$ and $X_2 \lt 0$ separately, but shouldn't the $X_1$ inequality remain $\leq$?
And in the second line, why does $\mathbb{P}(X_1 \le X_2 z; X_2 > 0) = \mathbb{E}(F_{X_1}(zX_2); X_2 > 0)$ hold?

Thanks in advance.

Tags:

Probability