Legitimacy of a Bivariate Distribution Function
Suppose, for the sake of contradiction, that $F(x,y) = 1 - e^{-x-y}$ for $x, y \geq 0$ (with $F(x,y) = 0$ outside the first quadrant) really were the joint cumulative distribution function of random variables $X$ and $Y$, so that $P(X \leq x, Y \leq y) = F(x,y)$.
Fix $x = 0$ and consider $$\lim_{y\rightarrow \infty} F(0,y) = \lim_{y\rightarrow \infty}\left(1 - e^{-y}\right) = 1$$
For $x_{0} < 0$, on the other hand,
$$\lim_{y\rightarrow \infty}F(x_{0}, y) = 0$$
since $F(x,y)$ is zero outside the first quadrant.
It follows that $P(X = 0) = 1$.
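Indeed, the first limit computes the marginal probability $P(X \leq 0) = \lim_{y\rightarrow\infty} F(0,y) = 1$, while the second gives $P(X \leq x_{0}) = \lim_{y\rightarrow\infty} F(x_{0},y) = 0$ for every $x_{0} < 0$, so $P(X < 0) = \lim_{x_{0} \uparrow 0} P(X \leq x_{0}) = 0$. Hence
$$P(X = 0) = P(X \leq 0) - P(X < 0) = 1 - 0 = 1$$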
A similar argument shows that $P(Y = 0) = 1$.
But then the random vector $(X,Y)$ would take the value $(0,0)$ with probability $1$, which is inconsistent with the proposed form of $F$. Hence $F$ cannot really be a cumulative distribution function after all.
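For instance, a point mass at the origin would force $F(x,y) = P(X \leq x, Y \leq y) = 1$ for every $x, y \geq 0$, whereas the proposed function gives
$$F(1,1) = 1 - e^{-2} \approx 0.865 \neq 1$$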
Another way to see this is that every bivariate joint cumulative distribution function must satisfy the following property: for all $x_1 < x_2$ and $y_1 < y_2$ in $\mathbb{R}$,
$$F(x_2,y_2) - F(x_2,y_1) - F(x_1, y_2) + F(x_1,y_1) = P(x_1 < X \leq x_2,\ y_1 < Y \leq y_2) \geq 0$$
To see why the left-hand side gives the probability of this rectangle, first consider the region $\left\{(x,y) : x \leq x_2,\ y \leq y_2\right\}$; $F(x_2, y_2)$ is the probability that $(X,Y)$ lies in this region. Likewise $F(x_1, y_2) = P(X \leq x_1, Y \leq y_2)$ and $F(x_2, y_1) = P(X \leq x_2, Y \leq y_1)$. Subtracting $F(x_1, y_2)$ and $F(x_2, y_1)$ from $F(x_2, y_2)$ almost yields $P(x_1 < X \leq x_2,\ y_1 < Y \leq y_2)$, except that the probability of the corner region $\left\{(x,y) : x \leq x_1,\ y \leq y_1\right\}$ has been subtracted twice, once with each term. The probability that $(X,Y)$ lies in this last region is $F(x_1, y_1)$, so adding $F(x_1, y_1)$ back in to correct the double subtraction gives
$$P(x_1 < X \leq x_2, y_1 < Y \leq y_2) = F(x_2, y_2) - F(x_1, y_2) - F(x_2, y_1) + F(x_1, y_1)$$
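For contrast, a legitimate bivariate cumulative distribution function does satisfy this. Take, for example, $G(x,y) = (1-e^{-x})(1-e^{-y})$ for $x, y \geq 0$ (and $G = 0$ otherwise), the joint cumulative distribution function of two independent $\mathrm{Exp}(1)$ random variables: for $0 \leq x_1 < x_2$ and $0 \leq y_1 < y_2$,
$$G(x_2, y_2) - G(x_1, y_2) - G(x_2, y_1) + G(x_1, y_1) = \left(e^{-x_1} - e^{-x_2}\right)\left(e^{-y_1} - e^{-y_2}\right) \geq 0$$
as required.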
It is easy to check that $F(x,y) = 1 - e^{-x-y}$ fails this property: choose $x_1 = y_1 = 0$ and let $x_2 = y_2 \rightarrow \infty$ (in fact, any finite $x_2 = y_2 > 0$ already exhibits the failure, as the computation below shows).
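With $x_1 = y_1 = 0$ and $x_2 = y_2 = t$ for any finite $t > 0$,
$$F(t,t) - F(0,t) - F(t,0) + F(0,0) = \left(1 - e^{-2t}\right) - 2\left(1 - e^{-t}\right) + 0 = -\left(1 - e^{-t}\right)^2 < 0$$
and letting $t \rightarrow \infty$ gives the value $1 - 1 - 1 + 0 = -1$ corresponding to the choice $x_2 = y_2 = \infty$. Since the left-hand side would have to be a probability, it cannot be negative, which again shows that $F$ is not a legitimate cumulative distribution function.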