Sum of independent Gamma distributions with the same rate parameter is a Gamma distribution
Now that the homework deadline is presumably long past, here is a proof for the case of $b=1$, adapted from an answer of mine on stats.SE, which fleshes out the details of what I said in a comment on the question.
If $X$ and $Y$ are independent continuous random variables, then the probability density function of $Z=X+Y$ is given by the convolution of the probability density functions $f_X(x)$ and $f_Y(y)$ of $X$ and $Y$ respectively. Thus, $$f_{X+Y}(z) = \int_{-\infty}^{\infty} f_X(x)f_Y(z-x)\,\mathrm dx. $$ But when $X$ and $Y$ are nonnegative random variables, $f_X(x) = 0$ when $x < 0$, and for a positive number $z$, $f_Y(z-x) = 0$ when $x > z$. Consequently, for $z > 0$, the above integral simplifies to $$\begin{align} f_{X+Y}(z) &= \int_0^z f_X(x)f_Y(z-x)\,\mathrm dx\\ &=\int_0^z \frac{x^{a_1-1}e^{-x}}{\Gamma(a_1)}\frac{(z-x)^{a_2-1}e^{-(z-x)}}{\Gamma(a_2)}\,\mathrm dx\\ &= e^{-z}\int_0^z \frac{x^{a_1-1}(z-x)^{a_2-1}}{\Gamma(a_1)\Gamma(a_2)}\,\mathrm dx &\scriptstyle{\text{now substitute}}~ x = zt~ \text{and think}\\ &= e^{-z}z^{a_1+a_2-1}\int_0^1 \frac{t^{a_1-1}(1-t)^{a_2-1}}{\Gamma(a_1)\Gamma(a_2)}\,\mathrm dt & \scriptstyle{\text{of Beta}}(a_1,a_2)~\text{random variables}\\ &= \frac{e^{-z}z^{a_1+a_2-1}}{\Gamma(a_1+a_2)}, \end{align}$$ where the last step uses the Beta integral $\int_0^1 t^{a_1-1}(1-t)^{a_2-1}\,\mathrm dt = B(a_1,a_2) = \frac{\Gamma(a_1)\Gamma(a_2)}{\Gamma(a_1+a_2)}$. The result is exactly the density of a $\text{Gamma}(a_1+a_2, 1)$ random variable, as claimed.
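As a quick sanity check of this result (not part of the proof), here is a small Monte Carlo sketch in Python, assuming NumPy and SciPy are available; the shape values `a1`, `a2` and the sample size are arbitrary choices of mine:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a1, a2, n = 2.5, 1.7, 100_000   # arbitrary shapes; b = 1 as in the proof above

x = rng.gamma(a1, 1.0, n)       # X ~ Gamma(a1, 1)
y = rng.gamma(a2, 1.0, n)       # Y ~ Gamma(a2, 1)
z = x + y

# Kolmogorov-Smirnov test of Z against Gamma(a1 + a2, 1):
# a large p-value is consistent with Z ~ Gamma(a1 + a2, 1).
print(stats.kstest(z, "gamma", args=(a1 + a2,)))
```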
You may use an easier method: consider the moment generating function. Since $X$ and $Y$ are independent, $$E\left(e^{(X+Y)t}\right) = E\left(e^{Xt}e^{Yt}\right) = E\left(e^{Xt}\right)E\left(e^{Yt}\right),$$ and the product turns out to be the moment generating function of a Gamma distribution. You can then also read off the mean and variance from the moment generating function.
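For that last remark, here is a minimal SymPy sketch of recovering the mean and variance from the Gamma MGF by differentiating at $t = 0$; the symbol names are my own, and the MGF formula is the one derived in the answer below:

```python
import sympy as sp

t, alpha, beta = sp.symbols("t alpha beta", positive=True)
M = (1 - t / beta) ** (-alpha)         # Gamma(alpha, beta) MGF, valid for t < beta

mean = sp.diff(M, t).subs(t, 0)        # E[X]   = M'(0)
second = sp.diff(M, t, 2).subs(t, 0)   # E[X^2] = M''(0)
var = sp.simplify(second - mean**2)

print(mean)  # alpha/beta
print(var)   # alpha/beta**2
```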
It's easier to use moment generating functions to prove this. For $t < \beta$, $$ M(t;\alpha,\beta ) = Ee^{tX} = \int_{0}^{+\infty} e^{tx} f(x;\alpha,\beta)\,dx = \int_{0}^{+\infty} e^{tx} \frac{\beta^\alpha}{\Gamma(\alpha)} x^{\alpha-1}e^{-\beta x}\,dx \\ = \frac{\beta^\alpha}{\Gamma(\alpha)} \int_{0}^{+\infty} x^{\alpha-1}e^{-(\beta - t) x}\,dx = \frac{\beta^\alpha}{\Gamma(\alpha)} \cdot \frac{\Gamma(\alpha)}{(\beta - t)^\alpha} = \frac{1}{\left(1- \frac{t}{\beta}\right)^\alpha}. $$ By the multiplicative property of MGFs of independent random variables, $$M_{X + Y}(t) = M_{X}(t)M_{Y}(t),$$ so if $X \sim \text{Gamma}(\alpha_1,\beta)$ and $Y \sim \text{Gamma}(\alpha_2,\beta)$, then $$M_{X + Y}(t) = \frac{1}{\left(1- \frac{t}{\beta}\right)^{\alpha_1}} \cdot \frac{1}{\left(1- \frac{t}{\beta}\right)^{\alpha_2}} = \frac{1}{\left(1- \frac{t}{\beta}\right)^{\alpha_1 + \alpha_2}}.$$ The MGF of the sum is still of the Gamma form, so by the uniqueness of moment generating functions, $X + Y \sim \text{Gamma}(\alpha_1 + \alpha_2, \beta)$.
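As a numerical illustration of this MGF identity (my own check, with arbitrary parameter values), one can compare a Monte Carlo estimate of $E\left(e^{tS}\right)$ for $S = X + Y$ against the closed form above; note that NumPy's gamma sampler is parameterized by the scale $1/\beta$ rather than the rate $\beta$:

```python
import numpy as np

rng = np.random.default_rng(1)
a1, a2, beta, t, n = 2.0, 3.5, 1.5, 0.4, 1_000_000  # arbitrary; requires t < beta

# NumPy uses gamma(shape, scale), so scale = 1/beta gives rate beta.
s = rng.gamma(a1, 1 / beta, n) + rng.gamma(a2, 1 / beta, n)

empirical = np.exp(t * s).mean()                # Monte Carlo estimate of E[e^{tS}]
closed_form = (1 - t / beta) ** -(a1 + a2)      # Gamma(a1 + a2, beta) MGF at t

print(empirical, closed_form)   # should agree to a few decimal places
```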