Analytic Vectors (Nelson's Theorem)
Let $A$ be a symmetric operator in a Hilbert space $\mathcal H$. For $f \in \mathcal C^\infty(A) := \bigcap_{n\ge 0}\mathcal D(A^n)$ let $\mathcal L_f$ be the vector space spanned by $\{A^k f \mid k=0, 1, \dots \}$ and $\mathcal H_f := \overline{\mathcal L_f} $.
Nussbaum's "Vector of Uniqueness" Lemma: If for non-real $\lambda$, $\mathcal D(A)$ contains a dense set of vectors $u$ such that $\mathcal R(A | \mathcal L_u-\lambda)$ is dense in $\mathcal H_u$ then $\mathcal R(A-\lambda)$ is dense in $\mathcal H$ ($A$ is essentially self-adjoint).
Proof: Since $u \in \mathcal H_u$ and $\mathcal R\big(A | \mathcal L_u-\lambda\big)$ is dense in $\mathcal H_u$, there are vectors in $\mathcal R\big(A | \mathcal L_u-\lambda\big)$ arbitrarily close to $u$, and $\mathcal R\big(A | \mathcal L_u-\lambda\big) \subseteq \mathcal R(A-\lambda)$. Since the vectors $u$ form a dense set in $\mathcal H$, $\mathcal R(A-\lambda)$ is dense in $\mathcal H$. $\blacksquare$
$f$ is called an analytic vector for $A$ if $ \sum_{n=0}^\infty \frac{\Vert A^nf \Vert s^n}{ n!} \lt \infty $ for some $s\gt 0$ (depending on $f$); in particular, this holds whenever $\Vert A^nf \Vert = O(n!)$.
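For example (a standard illustration, not needed for the argument below): every eigenvector of $A$ is an analytic vector, since $Af = \mu f$ gives $\Vert A^n f\Vert = |\mu|^n \Vert f\Vert$ and hence, for every $s \gt 0$,
$$\sum_{n=0}^\infty \frac{\Vert A^nf \Vert s^n}{ n!} = \Vert f \Vert \sum_{n=0}^\infty \frac{(|\mu| s)^n}{n!} = \Vert f \Vert\, e^{|\mu| s} \lt \infty.$$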
Nelson's Theorem: If $\mathcal D(A)$ contains a dense set of analytic vectors then $A$ is essentially self-adjoint.
Proof:
We prove a stronger theorem of Nussbaum, which only requires a dense set of "quasi-analytic" vectors, i.e. vectors $f$ with
$$\sum_{n=1}^\infty \frac{1}{\Vert A^nf \Vert ^{\frac{1}{n}}} = \infty$$
(in particular, any $f$ with $\Vert A^nf \Vert = O( n^n)$). W.l.o.g. $\Vert f \Vert=1$.
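(As a check that Nelson's theorem really follows, using nothing beyond the definitions above: if $f$ is analytic, the terms of the convergent series are bounded, so $\Vert A^nf \Vert \le C\, n!/s^n$ for some constant $C \gt 0$, and since $(n!)^{1/n} \le n$,
$$\sum_{n=1}^\infty \frac{1}{\Vert A^nf \Vert ^{\frac{1}{n}}} \ge \sum_{n=1}^\infty \frac{s}{C^{1/n}(n!)^{1/n}} \ge \sum_{n=1}^\infty \frac{s}{C^{1/n}\, n} = \infty,$$
because $C^{1/n} \to 1$ and the harmonic series diverges; so every analytic vector is quasi-analytic.)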
We may assume $\mathcal L_f$ is infinite-dimensional (otherwise $A | \mathcal L_f$ is a symmetric operator defined on all of the finite-dimensional space $\mathcal H_f$, hence self-adjoint). In the orthonormal basis $\{e_n\}$ obtained from $\{A^nf \}$ by the Gram-Schmidt process, $A | \mathcal L_f$ is a Jacobi matrix operator
$$ \begin{bmatrix} a_0 & b_0 & 0 & 0 & \cdots \\ b_0 & a_1 & b_1 & 0 & \cdots\\ 0 & b_1 & a_2 & b_2 & \cdots \\ 0 & 0 & b_2 & a_3 & \cdots \\ \vdots & \vdots & \vdots & \vdots & \ddots \\ \end{bmatrix}. $$ ($Ae_m \in \text{span}(e_0,\dots,e_{m+1})$, so $(e_n, Ae_m) = 0$ for $n\gt m+1$; since $A$ is symmetric, the matrix must be tridiagonal. The off-diagonal elements can be made positive real by the choice of phase of the $e_n$.)
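As a concrete illustration (a minimal numerical sketch under arbitrary assumptions — a random real symmetric matrix `A`, a random vector `f`, dimension 6 — not part of the proof): Gram-Schmidt applied to $\{f, Af, A^2f, \dots\}$ is exactly the Lanczos three-term recurrence, and the resulting matrix is visibly tridiagonal.

```python
import numpy as np

# Minimal sketch (hypothetical example): Gram-Schmidt on {f, A f, A^2 f, ...}
# for a finite symmetric matrix A is the Lanczos recurrence; the matrix of A
# in the resulting orthonormal basis {e_m} is a Jacobi (tridiagonal) matrix.
rng = np.random.default_rng(0)
N = 6
A = rng.standard_normal((N, N))
A = (A + A.T) / 2                     # a symmetric "operator" (here real, finite)
f = rng.standard_normal(N)
f /= np.linalg.norm(f)                # w.l.o.g. ||f|| = 1, so e_0 = f

e, a, b = [f], [], []
for m in range(N - 1):
    w = A @ e[m]
    a.append(e[m] @ w)                # a_m = (e_m, A e_m)
    w = w - a[m] * e[m]
    if m > 0:
        w = w - b[m - 1] * e[m - 1]   # A e_m has no component below e_{m-1}
    b.append(np.linalg.norm(w))       # b_m > 0 in the generic case
    e.append(w / b[m])
a.append(e[-1] @ (A @ e[-1]))

E = np.column_stack(e)                # columns: the orthonormal basis {e_m}
J = E.T @ A @ E                       # matrix of A in that basis
print(np.round(J, 8))                 # tridiagonal: a_m on the diagonal, b_m beside it

# also checks numerically the identity (e_n, A^n f) = b_0 b_1 ... b_{n-1} used below
n = 4
print(e[n] @ np.linalg.matrix_power(A, n) @ f, np.prod(b[:n]))
```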
Then $b_0b_1\cdots b_{n-1}=(e_n, A^n f)\le \Vert A^nf \Vert$ (by induction, $A^n f = b_0\cdots b_{n-1}\, e_n\, +$ a linear combination of $e_0,\dots,e_{n-1}$, since $e_0 = f$). Using Carleman's inequality $\sum_{n=1}^\infty (c_1 c_2\cdots c_n)^{1/n} \le e\sum_{n=1}^\infty c_n$ (for $c_n \ge 0$), applied with $c_n = 1/b_{n-1}$: $$\infty = \sum_{n=1}^\infty\frac{1}{\Vert A^nf \Vert ^{\frac{1}{n}}} \le \sum_{n=1}^\infty \left(\frac{1}{b_0 b_1\cdots b_{n-1}}\right)^{\frac{1}{n}} \le e\sum_{n=0}^\infty \frac{1}{b_n}.$$ By Carleman's test $A | \mathcal L_f$ is essentially self-adjoint and by Nussbaum's lemma so is $A$. $\blacksquare$
Carleman's test: A Jacobi matrix operator $J$ is essentially self-adjoint if $\sum_{n=0}^\infty \frac{1}{b_n} =\infty$.
Proof:
Suppose a vector $\{p_k\} \in \ell^2$ is orthogonal to the range of $J-\bar\lambda$ for some non-real $\lambda$ (as $\lambda$ runs over the non-real numbers so does $\bar\lambda$, so it suffices to show that every such vector vanishes). Writing out the conditions $\big(\{p_k\}, (J-\bar\lambda)e_n\big)=0$ and taking complex conjugates:
\begin{align}
\lambda p_0 = a_0 p_0 +b_0 p_1 \\
\lambda p_n = b_{n-1}p_{n-1} + a_np_n +b_n p_{n+1} && n=1, 2, ...
\end{align}
Multiplying the $n$-th equation by $\bar p_n$ and its complex conjugate by $p_n$, subtracting, and dividing by $\lambda - \bar \lambda$ leads to:
\begin{align}
\frac{b_0(p_1 \bar p_0 -\bar p_1 p_0)} {\lambda - \bar \lambda} =
|p_0|^2 \\
\frac{b_n(p_{n+1} \bar p_n -\bar p_{n+1} p_n )}{\lambda - \bar \lambda} =
\frac{b_{n-1}(p_n \bar p_{n-1} -\bar p_n p_{n-1})}{\lambda - \bar \lambda}+ |p_n|^2 && n=1, 2, ...
\end{align}
Note that, by induction, the left-hand sides are real and $\ge |p_0|^2$ for every $n$. Dividing the $n$-th of these bounds by $b_n$, summing, and using $\frac{p_{n+1} \bar p_n -\bar p_{n+1} p_n}{\lambda - \bar \lambda} = \frac{\text{Im}(p_{n+1}\bar p_n)}{\text{Im}(\lambda)} \le \frac{|p_{n+1}||p_n|}{|\text{Im}(\lambda)|}$ together with the Cauchy-Schwarz inequality:
$$|p_0|^2\sum_{n=0}^\infty \frac{1}{b_n} \le \sum_{n=0}^\infty \frac{p_{n+1} \bar p_n -\bar p_{n+1} p_n}{\lambda - \bar\lambda} \le \frac{1}{|\text{Im}(\lambda)|}\sum_{n=0}^\infty |p_n|^2 \lt \infty.$$
Since $\sum_{n=0}^\infty \frac{1}{b_n} = \infty$, this forces $p_0 = 0$; the recursion then gives $p_n = 0$ for all $n$ (each $b_n \gt 0$). Hence the range of $J-\bar\lambda$ is dense for every non-real $\lambda$, i.e. $J$ is essentially self-adjoint. $\blacksquare$
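For instance (an immediate consequence of the test, not used elsewhere in this note): any Jacobi matrix with $b_n = O(n)$, and in particular one with bounded off-diagonal entries, satisfies $\sum_{n=0}^\infty \frac{1}{b_n} = \infty$ and is therefore essentially self-adjoint, regardless of its (real) diagonal entries $a_n$.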
Analytic vectors can also be used to construct a continuous unitary group representation on a dense subspace; this extends to the entire space, and its generator is the self-adjoint closure of the original operator (see pp. 200-202 in Reed and Simon, Vol. II).