Necessary and sufficient condition for convergence of series
I would argue that your approach does not prove convergence to a constant. I assume you used Kolmogorov's 0-1 law; however, that law cannot establish convergence to a constant in your case, only that $S$ converges almost surely. The reason is that the event $S_k \rightarrow a$ is not a tail event: it very much depends on the value of $X_1$. (It might be clearer to write the series as $c\, e^{X_1} + c^2 e^{X_2} + \dots$; the value of $X_1$ clearly influences the value of the limit.)
What Kolmogorov's 0-1 law does show here is that the sequence converges almost surely, since whether it converges at all depends only on the behavior of the tail.
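To make the point concrete, here is a small simulation (the choices $c = 1/2$ and $X_k \sim \mathcal{N}(0,1)$ i.i.d. are my assumptions for illustration, not given in the question): each realization of the sequence converges, but different realizations yield visibly different limits, so the limit is a genuinely random variable.

```python
import math
import random

def sample_limit(c=0.5, n_terms=200, rng=None):
    """Approximate S = sum_{k>=1} c^k * exp(X_k), with X_k i.i.d. N(0,1).

    The distribution of X_k and the value of c are illustrative
    assumptions.  For 0 < c < 1 the terms c^k e^{X_k} die off
    geometrically almost surely, so a long partial sum is an
    excellent proxy for the limit S.
    """
    rng = rng or random.Random()
    return sum(c**k * math.exp(rng.gauss(0.0, 1.0))
               for k in range(1, n_terms + 1))
```

Running `sample_limit` with different seeds produces different values of $S$, while increasing `n_terms` for a fixed seed barely changes the result: almost sure convergence, but to a random limit.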
We have to distinguish the following two lemmata, both of which are consequences of Kolmogorov's 0-1 law:
Lemma 1
Let $(X_{k})_{k \in \mathbb{N}}$ be a sequence of independent random variables and let $S_{n}:=\sum_{k=1}^{n}X_{k}$.
Then $\mathbb{P}(S_{n} \text{ converges}) \in \{0,1\}$.
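A sketch of why Lemma 1 follows from the 0-1 law: convergence of $S_n$ can be phrased via the Cauchy criterion applied only to terms beyond any fixed index $m$, so the event lies in every $\sigma(X_{m+1}, X_{m+2}, \dots)$.

```latex
% For every fixed m, convergence of S_n is equivalent to the Cauchy
% criterion for the partial sums over indices beyond m:
\{S_n \text{ converges}\}
  = \bigcap_{\varepsilon \in \mathbb{Q}^{+}}
    \bigcup_{N \ge m} \bigcap_{n \ge p \ge N}
    \Big\{ \Big| \textstyle\sum_{k=p+1}^{n} X_k \Big| < \varepsilon \Big\}
  \in \sigma(X_{m+1}, X_{m+2}, \dots).
% Since this holds for every m, the event lies in the tail sigma-field
% \bigcap_{m} \sigma(X_{m+1}, X_{m+2}, \dots), and Kolmogorov's 0-1 law
% forces its probability to be 0 or 1.
```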
Lemma 2
Any random variable $Y$ that is measurable with respect to the tail $\sigma$-field of such a sequence of independent random variables is almost surely constant.
To prove almost sure convergence, we could apply Kolmogorov's Three-Series Theorem, but that is itself a consequence of the Borel–Cantelli lemmas, so there is no shortcut here.
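For concreteness, here is how the three series can be checked for this particular series (the distributional choice $X_k \sim \mathcal{N}(0,1)$ i.i.d. and the restriction $0 < c < 1$ are my assumptions). Set $Y_k := c^k e^{X_k}$ and truncate at level $1$:

```latex
% 1. Tail probabilities: Y_k > 1 iff X_k > k \ln(1/c), a Gaussian tail,
%    which is summable in k:
\sum_{k} \mathbb{P}(Y_k > 1)
  = \sum_{k} \mathbb{P}\big(X_k > k \ln(1/c)\big) < \infty.
% 2. Truncated means, using \mathbb{E}[e^{X_k}] = e^{1/2}:
\sum_{k} \mathbb{E}\big[Y_k \mathbf{1}_{\{Y_k \le 1\}}\big]
  \le \sum_{k} c^{k} e^{1/2} < \infty.
% 3. Truncated variances, using \mathbb{E}[e^{2X_k}] = e^{2}:
\sum_{k} \operatorname{Var}\big(Y_k \mathbf{1}_{\{Y_k \le 1\}}\big)
  \le \sum_{k} \mathbb{E}[Y_k^2] = \sum_{k} c^{2k} e^{2} < \infty.
% All three series converge, so \sum_k Y_k converges almost surely.
```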
Finally, Kolmogorov's 0-1 law does not allow us to conclude that the limit $S=\lim S_{n}$ is constant, if it indeed exists, since $S$ is not measurable with respect to the tail $\sigma$-field.