Negative binomial distribution - sum of two random variables

An $NB(r,p)$ random variable can be written as a sum of independent geometric random variables.

Let $X_1, X_2, \ldots$ be i.i.d. with $X_i\sim \mathrm{Geometric}(p)$.

Then $X\sim NB(r,p)$ satisfies $X = X_1 + \cdots +X_r$,

and $Y\sim NB(s,p)$ satisfies $Y= X_{r+1} + \cdots + X_{r+s}.$

Since the two sums use disjoint sets of the $X_i$, the variables $X$ and $Y$ are independent, and $X+Y = X_1 + \cdots + X_{r+s}.$

This yields $X+Y \sim NB(r+s, p)$.
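
As a sanity check, here is a minimal simulation sketch of this decomposition (the values of $r$, $s$, $p$ are arbitrary illustrative choices, and each geometric variable counts failures before the first success, matching the PMF used in the hint further below):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
r, s, p = 3, 5, 0.4      # illustrative values; here p is the probability of the counted outcome
n = 200_000

# Each geometric variable counts failures before the first success (success prob. 1 - p),
# so numpy's geometric sampler (which counts trials) is shifted down by 1.
geoms = rng.geometric(1 - p, size=(n, r + s)) - 1

X = geoms[:, :r].sum(axis=1)   # X ~ NB(r, p)
Y = geoms[:, r:].sum(axis=1)   # Y ~ NB(s, p), independent of X (disjoint summands)
Z = X + Y                      # claim: Z ~ NB(r + s, p)

ks = np.arange(30)
empirical = np.array([(Z == k).mean() for k in ks])
theoretical = stats.nbinom.pmf(ks, r + s, 1 - p)   # scipy's second parameter is the success prob.
print(np.abs(empirical - theoretical).max())       # small, of the order of Monte Carlo error
```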


Since $X$ and $Y$ are independent, the moment generating function (MGF) of $X+Y$ is the product of the MGFs of $X$ and $Y$. The MGF of $X$ is $\displaystyle M_X(t)=\left(\frac{1-p}{1-pe^t}\right)^r$ and the MGF of $Y$ is $\displaystyle M_Y(t)=\left(\frac{1-p}{1-pe^t}\right)^s$ (both valid for $t<-\ln p$). Therefore $$\begin{align} M_{X+Y}(t)&=M_X(t)\,M_Y(t)\\ &=\left(\frac{1-p}{1-pe^t}\right)^r\left(\frac{1-p}{1-pe^t}\right)^s\\ &=\left(\frac{1-p}{1-pe^t}\right)^{r+s}. \end{align}$$ This is the MGF of an $NB$ distribution with parameters $r+s$ and $p$, and since the MGF determines the distribution, $X+Y\sim NB(r+s,p)$.
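
A quick numerical check of this MGF identity at a few points in the region of convergence (again with arbitrary illustrative values of $r$, $s$, $p$), just as a sketch:

```python
import numpy as np

r, s, p = 3, 5, 0.4                                    # illustrative values
M = lambda n, t: ((1 - p) / (1 - p * np.exp(t)))**n    # MGF of NB(n, p), defined for t < -log(p)

ts = np.linspace(-1.0, 0.5, 9)                         # points inside the region of convergence
print(np.allclose(M(r, ts) * M(s, ts), M(r + s, ts)))  # True: product of MGFs equals the NB(r+s, p) MGF
```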


Hint:

If $X$ and $Y$ are independent with $\Pr(X=k)={k+r-1 \choose k}\cdot (1-p)^r p^k$ and $\Pr(Y=k)={k+s-1 \choose k}\cdot (1-p)^s p^k$, then

$$\Pr(X+Y=k)=\sum_{j=0}^k {j+r-1 \choose j}\cdot (1-p)^r p^j \cdot {k-j +s-1 \choose k-j}\cdot (1-p)^s p^{k-j}$$

$$=\sum_{j=0}^k {j+r-1 \choose j}\cdot {k-j +s-1 \choose k-j}\cdot (1-p)^{r+s} p^k$$

and you need to show

$$\Pr(X+Y=k)= {k+r+s-1 \choose k}\cdot (1-p)^{r+s} p^k$$

so it is just a matter of showing $\displaystyle \sum_{j=0}^k {j+r-1 \choose j}\cdot {k-j +s-1 \choose k-j}={k+r+s-1 \choose k}.$ This is a form of the Vandermonde convolution identity; it follows from comparing the coefficients of $x^k$ on both sides of $(1-x)^{-r}(1-x)^{-s}=(1-x)^{-(r+s)}$.
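
For instance, a small numerical spot-check of this binomial-coefficient identity over a grid of parameter values (a sketch, not a proof):

```python
from math import comb

def lhs(k, r, s):
    # sum_{j=0}^{k} C(j+r-1, j) * C(k-j+s-1, k-j)
    return sum(comb(j + r - 1, j) * comb(k - j + s - 1, k - j) for j in range(k + 1))

def rhs(k, r, s):
    return comb(k + r + s - 1, k)

# spot-check the identity over a small grid of parameter values
print(all(lhs(k, r, s) == rhs(k, r, s)
          for k in range(10) for r in range(1, 6) for s in range(1, 6)))   # True
```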