For symmetric stable distributions, why is $\alpha \le 2$?
If $\phi$ is a characteristic function, then, for all real values of $s$ and $t$, $K(t,s)\geqslant0$, where $K(t,s)$ is the determinant
$$
K(t,s)=\det\begin{pmatrix}\phi(0) & \phi(t) & \phi(t+s) \\ \phi(-t) & \phi(0) & \phi(s) \\ \phi(-t-s) & \phi(-s) & \phi(0)\end{pmatrix}.
$$
Using $\phi_\alpha(t)=\mathrm e^{-c|t|^\alpha}$ for every $t$, one gets, for every fixed $x>0$, $K_\alpha(t,xt)=c^2|t|^{2\alpha}k_\alpha(x)+o(|t|^{2\alpha})$ when $t\to0$, with
$$
k_\alpha(x)=2x^\alpha(1+x)^\alpha+2x^\alpha+2(1+x)^\alpha-x^{2\alpha}-(1+x)^{2\alpha}-1.
$$
If $\alpha>2$, then $k_\alpha(x)=-\alpha^2x^2+o(x^2)$ when $x\to0$, hence $k_\alpha(x)<0$ for some small values of $x$, and $K_\alpha(t,xt)<0$ for some (small) values of $t$ and $x$. This proves that $\phi_\alpha$ is not a characteristic function when $\alpha>2$.
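If one wants a quick numerical sanity check of this expansion (not a substitute for the argument), the determinant $K_\alpha(t,s)$ can be evaluated directly; the values $c=1$, $\alpha\in\{2,2.5\}$ and the small grid of $t$ and $x$ below are arbitrary choices made for illustration.

```python
import numpy as np

def K(phi, t, s):
    """Determinant of the 3x3 matrix displayed above, evaluated at (t, s)."""
    M = np.array([[phi(0.0),    phi(t),   phi(t + s)],
                  [phi(-t),     phi(0.0), phi(s)],
                  [phi(-t - s), phi(-s),  phi(0.0)]])
    return np.linalg.det(M)

def phi_alpha(alpha, c=1.0):
    """Candidate characteristic function exp(-c |u|^alpha)."""
    return lambda u: np.exp(-c * np.abs(u) ** alpha)

# For alpha = 2 (Gaussian) the determinant stays nonnegative on this grid,
# while for alpha = 2.5 it dips below zero at small t and s = x*t.
for alpha in (2.0, 2.5):
    vals = [K(phi_alpha(alpha), t, x * t) for t in (0.05, 0.1, 0.2) for x in (0.05, 0.1, 0.2)]
    print(f"alpha = {alpha}: min K on the grid = {min(vals):.3e}")
```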
First edit: To prove that the nonnegativity of $K$ is necessary for $\phi$ to be a characteristic function, consider more generally the matrix $M=(M_{k,\ell})$ where $M_{k,\ell}=\mathrm E(\mathrm e^{\mathrm i(t_k-t_\ell)X})$ for some given real numbers $(t_k)$. Then, for every complex-valued vector $v=(v_k)$,
$$
v^*Mv=\sum\limits_{k,\ell}M_{k,\ell}v_k\bar v_\ell=\mathrm E\left(\sum\limits_{k,\ell}Z_k\bar Z_\ell v_k\bar v_\ell\right)=\mathrm E\left(\left|\sum\limits_{k}Z_kv_k\right|^2\right),
$$
with $Z_k=\mathrm e^{\mathrm it_kX}$, hence $v^*Mv\geqslant0$ for every $v$. This means that $M$ represents a nonnegative form, and in particular $\det M\geqslant0$. The determinant $K(t,s)$ above corresponds to the choice $(t_1,t_2,t_3)=(t+s,s,0)$.
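This nonnegativity can also be illustrated numerically by building $M_{k,\ell}=\phi(t_k-t_\ell)$ for a few sample points and looking at its smallest eigenvalue; the sample points and the two test functions below (the standard normal characteristic function and the candidate with $\alpha=2.5$, $c=1$) are arbitrary choices.

```python
import numpy as np

def cf_matrix(phi, ts):
    """Matrix with entries M[k, l] = phi(t_k - t_l)."""
    ts = np.asarray(ts, dtype=float)
    return phi(ts[:, None] - ts[None, :])

phi_gauss = lambda u: np.exp(-0.5 * u ** 2)       # standard normal characteristic function
phi_bad   = lambda u: np.exp(-np.abs(u) ** 2.5)   # candidate with alpha = 2.5, c = 1

ts = [0.24, 0.04, 0.0]  # the points (t + s, s, 0) with t = 0.2 and s = 0.04
for name, phi in [("Gaussian", phi_gauss), ("alpha = 2.5", phi_bad)]:
    # A genuine characteristic function must give a nonnegative smallest eigenvalue;
    # the alpha = 2.5 candidate produces a (small but clearly) negative one.
    print(name, "smallest eigenvalue:", np.linalg.eigvalsh(cf_matrix(phi, ts)).min())
```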
Second edit: Here is an alternative proof. The second moment of $X$ with characteristic function $\phi$, be it finite or not, is
$$
\mathrm E(X^2)=\lim\limits_{t\to0}\ t^{-2}\left(2-\mathrm E(\mathrm e^{\mathrm itX})-\mathrm E(\mathrm e^{-\mathrm itX})\right)
$$
(write the ratio as $\mathrm E\left(2(1-\cos(tX))/t^2\right)$ and use Fatou's lemma, or dominated convergence when the moment is finite). Assuming $\phi_\alpha$ is the characteristic function of $X_\alpha$ and using $\phi_\alpha(t)=1-c|t|^\alpha+o(|t|^\alpha)$ when $t\to0$, one gets $\mathrm E(X_\alpha^2)=0$ when $\alpha>2$. This is absurd, since it would force $X_\alpha=0$ almost surely, that is, $\phi_\alpha\equiv1$.
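A quick numerical look at this limit (the three test functions and the specific values of $t$ below are arbitrary illustrations): for the standard normal one recovers $\mathrm E(X^2)=1$, for the Cauchy distribution the ratio blows up, and for the candidate with $\alpha=2.5$ it goes to $0$.

```python
import numpy as np

# t^{-2} (2 - phi(t) - phi(-t)) for decreasing t:
#   standard normal e^{-t^2/2}   -> 1         (its second moment),
#   Cauchy          e^{-|t|}     -> +infinity (infinite second moment),
#   candidate       e^{-|t|^2.5} -> 0,        which is the contradiction above.
cases = {
    "normal":      lambda t: np.exp(-0.5 * t ** 2),
    "Cauchy":      lambda t: np.exp(-np.abs(t)),
    "alpha = 2.5": lambda t: np.exp(-np.abs(t) ** 2.5),
}
for t in (1e-1, 1e-2, 1e-3):
    ratios = {name: (2 - phi(t) - phi(-t)) / t ** 2 for name, phi in cases.items()}
    print(f"t = {t:g}: " + ", ".join(f"{name}: {value:.4g}" for name, value in ratios.items()))
```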