Prove that these functions are linearly independent
You can use the Vandermonde determinant.
Let $$\beta_1e^{-\alpha_1x} + \cdots + \beta_ne^{-\alpha_nx} = 0$$
Plugging in $x = 0, 1, \ldots, n-1$ yields:
$$\beta_1 + \cdots + \beta_n = 0$$ $$\beta_1e^{-\alpha_1} + \cdots + \beta_ne^{-\alpha_n} = 0$$ $$\beta_1e^{-2\alpha_1} + \cdots + \beta_ne^{-2\alpha_n} = 0$$ $$\vdots$$ $$\beta_1e^{-(n-1)\alpha_1} + \cdots + \beta_ne^{-(n-1)\alpha_n} = 0$$ The determinant of this linear system is the Vandermonde determinant:
$$\begin{vmatrix} 1 & 1 & \cdots & 1\\ e^{-\alpha_1} & e^{-\alpha_2} & \cdots & e^{-\alpha_n}\\ e^{-2\alpha_1} & e^{-2\alpha_2} & \cdots & e^{-2\alpha_n}\\ \vdots & \vdots & \ddots & \vdots\\ e^{-(n-1)\alpha_1} & e^{-(n-1)\alpha_2} & \cdots & e^{-(n-1)\alpha_n}\\ \end{vmatrix} = \prod_{1 \le i < j \le n} (e^{-\alpha_j} - e^{-\alpha_i}) \ne 0$$
because $e^{-\alpha_i} \ne e^{-\alpha_j}$ for all $i \ne j$. Hence, the system has the unique solution $\beta_1= \beta_2 = \cdots = \beta_n = 0$, which implies linear independence.
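As a quick numerical sanity check of the argument above, one can build the Vandermonde matrix with nodes $t_i = e^{-\alpha_i}$ and verify that its determinant matches the product formula and is nonzero. A minimal Python sketch (the particular values of $\alpha_i$ are arbitrary, chosen only for illustration):

```python
import math
from itertools import combinations

def det(m):
    """Determinant by cofactor expansion along the first row (fine for small n)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(n))

# Hypothetical distinct exponents alpha_1 > alpha_2 > alpha_3 >= 0
alphas = [2.0, 1.0, 0.5]
nodes = [math.exp(-a) for a in alphas]   # t_i = e^{-alpha_i}, all distinct
n = len(nodes)

# Vandermonde matrix: row k holds the values t_i^k, k = 0, ..., n-1
V = [[t ** k for t in nodes] for k in range(n)]

# The determinant equals the product over pairs (t_j - t_i), i < j
product = math.prod(nodes[j] - nodes[i] for i, j in combinations(range(n), 2))
assert abs(det(V) - product) < 1e-12
assert product != 0   # distinct alphas give distinct nodes, hence a nonzero determinant
```

Since all nodes are distinct, every factor $t_j - t_i$ is nonzero, so the product (and hence the determinant) cannot vanish.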
Another solution using the Vandermonde determinant:
Assume
$$\beta_1e^{-\alpha_1x} + \cdots + \beta_ne^{-\alpha_nx} = 0$$
Differentiating $n-1$ times gives, together with the original equation:
$$\beta_1e^{-\alpha_1x} + \cdots + \beta_ne^{-\alpha_nx} = 0$$ $$(-\alpha_1)\beta_1e^{-\alpha_1x} + \cdots + (-\alpha_n)\beta_ne^{-\alpha_nx} = 0$$ $$(-\alpha_1)^2\beta_1e^{-\alpha_1x} + \cdots + (-\alpha_n)^2\beta_ne^{-\alpha_nx} = 0$$ $$\vdots$$ $$(-\alpha_1)^{n-1}\beta_1e^{-\alpha_1x} + \cdots + (-\alpha_n)^{n-1}\beta_ne^{-\alpha_nx} = 0$$
The determinant of this linear system is again the Vandermonde determinant:
$$\begin{vmatrix} e^{-\alpha_1x} & e^{-\alpha_2x} & \cdots & e^{-\alpha_nx}\\ (-\alpha_1)e^{-\alpha_1x} & (-\alpha_2)e^{-\alpha_2x} & \cdots & (-\alpha_n)e^{-\alpha_nx}\\ (-\alpha_1)^2e^{-\alpha_1x} & (-\alpha_2)^2e^{-\alpha_2x} & \cdots & (-\alpha_n)^2e^{-\alpha_nx}\\ \vdots & \vdots & \ddots & \vdots\\ (-\alpha_1)^{n-1}e^{-\alpha_1x} & (-\alpha_2)^{n-1}e^{-\alpha_2x} & \cdots & (-\alpha_n)^{n-1}e^{-\alpha_nx}\\ \end{vmatrix}$$ $$ = e^{-\alpha_1x}\cdots e^{-\alpha_nx} \begin{vmatrix} 1 & 1 & \cdots & 1\\ -\alpha_1 & -\alpha_2 & \cdots & -\alpha_n\\ (-\alpha_1)^2 & (-\alpha_2)^2 & \cdots & (-\alpha_n)^2\\ \vdots & \vdots & \ddots & \vdots\\ (-\alpha_1)^{n-1} & (-\alpha_2)^{n-1} & \cdots & (-\alpha_n)^{n-1}\\ \end{vmatrix} =e^{-\alpha_1x}\cdots e^{-\alpha_nx} \prod_{1 \le i < j \le n} (\alpha_i - \alpha_j) \ne 0$$
because $\alpha_i \ne \alpha_j$ for all $i \ne j$. Hence, the system has the unique solution $\beta_1= \beta_2 = \cdots = \beta_n = 0$, which implies linear independence.
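The factorization of this Wronskian-style determinant can also be checked numerically at any fixed point $x$. A small Python sketch (the exponents and the evaluation point are arbitrary illustrative values):

```python
import math
from itertools import combinations

def det(m):
    """Determinant by cofactor expansion along the first row (fine for small n)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(n))

alphas = [2.0, 1.0, 0.5]   # hypothetical distinct exponents
x = 0.7                    # any fixed evaluation point works
n = len(alphas)

# Row k holds the k-th derivatives (-alpha)^k * e^{-alpha x}, k = 0, ..., n-1
W = [[(-a) ** k * math.exp(-a * x) for a in alphas] for k in range(n)]

# Factor e^{-alpha_i x} out of each column, leaving a Vandermonde in the -alpha_i
factored = (math.prod(math.exp(-a * x) for a in alphas)
            * math.prod(alphas[i] - alphas[j] for i, j in combinations(range(n), 2)))
assert abs(det(W) - factored) < 1e-12
```

Because the exponential prefactor is always positive and the $\alpha_i$ are distinct, the factored form makes it evident that the determinant is nonzero for every $x$.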
Hint. If they are linearly dependent then there exist $c_1,\dots,c_n$, not all zero, such that for all $x\in \mathbb{R}$, $$c_1e^{-\alpha_1 x}+\dots +c_ne^{-\alpha_n x}=0.$$ Let $i$ be the largest index with $c_i\neq 0$, so that $c_{i+1}=\dots =c_n=0$; then $$c_1e^{-\alpha_1 x}+\dots+c_{i-1}e^{-\alpha_{i-1} x}+c_ie^{-\alpha_{i} x}=0,$$ and after multiplying both sides by $e^{\alpha_{i} x}$ we get $$c_1e^{(\alpha_i-\alpha_1) x}+\dots+c_{i-1}e^{(\alpha_i-\alpha_{i-1}) x}+c_i=0.$$ Now note that $\alpha_i-\alpha_k<0$ for $k=1,\dots, i-1$ (recall that $\alpha_1>\alpha_2>\dots>\alpha_n$), and take the limit as $x\to +\infty$. What may we conclude about $c_i$?
P.S. Note that the hint works also without the condition $\alpha_n\geq 0$.
As usual, we consider the case $n=2$ first: $$\beta_1 e^{-\alpha_1 x}+\beta_2 e^{-\alpha_2 x}=0$$
Dividing by $e^{-\alpha_2 x}$ gives $$\beta_1 e^{-(\alpha_1-\alpha_2)x}+\beta_2=0$$ Letting $x\to\infty$ we see that $e^{-(\alpha_1-\alpha_2)x}\to 0$ since $\alpha_1>\alpha_2\geqslant 0$. Therefore $\beta_2=0$. Since $e^{-\alpha_1x}>0$ for all $x$, we also have $\beta_1=0$. Therefore $\{e^{-\alpha_1 x}, e^{-\alpha_2 x}\}$ is linearly independent.
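The decay that drives this limit argument is easy to see numerically. A minimal Python sketch, with hypothetical values $\alpha_1 > \alpha_2 \geq 0$ chosen only for illustration:

```python
import math

# Hypothetical values with alpha_1 > alpha_2 >= 0
alpha1, alpha2 = 3.0, 1.0
beta1, beta2 = 5.0, -2.0

# After dividing beta_1 e^{-alpha_1 x} + beta_2 e^{-alpha_2 x} by e^{-alpha_2 x},
# the only x-dependent term is beta_1 e^{-(alpha_1 - alpha_2) x}, which decays to 0:
for x in (1, 10, 50):
    term = beta1 * math.exp(-(alpha1 - alpha2) * x)
    print(x, term)

# So beta_1 e^{-(alpha_1 - alpha_2) x} + beta_2 tends to beta_2 as x -> infinity;
# if the combination were identically zero, beta_2 would have to be 0.
assert abs(beta1 * math.exp(-(alpha1 - alpha2) * 50)) < 1e-40
```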
I am sure that you can prove it now for the more general case.