The limit of a convergent sequence of Gaussian random variables is still a Gaussian random variable

  • First, we note that the sequences $\{\sigma_n\}$ and $\{\mu_n\}$ have to be bounded. This is a consequence of what was done in this thread, since we have in particular convergence in law. What we use is the following:

If $(X_n)_n$ is a sequence of random variables converging in distribution to $X$, then for each $\varepsilon\gt 0$ there is an $R\gt 0$ such that $\mathbb P(|X_n|\geqslant R)\lt \varepsilon$ for every $n$ (tightness).

To see this, we may assume that $X_n$ and $X$ are non-negative (otherwise, consider their absolute values). Let $F_n$ and $F$ be the cumulative distribution functions of $X_n$ and $X$. Take $t$ such that $F(t)\gt 1-\varepsilon$ and $t$ is a continuity point of $F$; then $F_n(t)\gt 1-\varepsilon$ for all $n\geqslant N$, for some $N$. Since a finite collection of random variables is always tight, enlarging $t$ if necessary also takes care of $X_1,\dots,X_{N-1}$.
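To make the first bullet explicit, here is one way to pass from this tightness to the boundedness of the parameters (a sketch reconstructing the argument alluded to above; the choice $\varepsilon=\tfrac14$ and the resulting bound $2R$ are one convenient option, not taken from the linked thread). Since $X_n\sim\mathcal N(\mu_n,\sigma_n^2)$, the event $\{X_n\geqslant\mu_n\}$ (or $\{X_n\leqslant\mu_n\}$ when $\mu_n\lt 0$) has probability $\tfrac12$ and forces $|X_n|\geqslant|\mu_n|$, so

$$\mathbb P\bigl(|X_n|\geqslant|\mu_n|\bigr)\geqslant\tfrac12\qquad\text{for every }n.$$

Choosing $R$ from the tightness statement with $\varepsilon=\tfrac14$ therefore gives $|\mu_n|\lt R$ for all $n$. Likewise, writing $X_n=\mu_n+\sigma_n Z$ with $Z\sim\mathcal N(0,1)$ and using $|\mu_n|\lt R$,

$$\mathbb P\bigl(|X_n|\geqslant\sigma_n-R\bigr)\geqslant\mathbb P\bigl(|X_n-\mu_n|\geqslant\sigma_n\bigr)\geqslant\mathbb P(|Z|\geqslant 1)\approx 0.317\gt\tfrac14,$$

so the same $R$ forces $\sigma_n-R\lt R$ (otherwise $\mathbb P(|X_n|\geqslant R)\gt\tfrac14$), i.e. $\sigma_n\lt 2R$ for all $n$.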

  • Now, fix an arbitrary strictly increasing sequence of indices $\{n_k\}$. By the boundedness established above, we can extract further sub-sequences of $\{\sigma_{n_k}\}$ and $\{\mu_{n_k}\}$ which converge, say to $\sigma$ and $\mu$ respectively. Along this sub-sequence the characteristic functions $\varphi_{X_{n_k}}$ converge pointwise to $\varphi_X$; taking the modulus at $t=1$, we see that $e^{-\sigma^2/2}=|\varphi_X(1)|$, so $\sigma$ is uniquely determined (it does not depend on the chosen sub-sequence).
  • We then have $e^{it\mu}=\varphi_X(t)e^{t^2\sigma^2/2}$ for all $t\in\Bbb R$, so $\mu$ is also completely determined. Since the limits $\sigma$ and $\mu$ do not depend on the chosen sub-sequence, the full sequences converge, $\sigma_n\to\sigma$ and $\mu_n\to\mu$, and $\varphi_X(t)=e^{it\mu-t^2\sigma^2/2}$, so $X\sim\mathcal N(\mu,\sigma^2)$ (a point mass at $\mu$ in the degenerate case $\sigma=0$). The characteristic-function computation is spelled out below.
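For completeness, here is the characteristic-function computation behind the last two bullets (a sketch; it only uses the standard formula $\varphi_{\mathcal N(\mu,\sigma^2)}(t)=e^{it\mu-t^2\sigma^2/2}$ and the fact that $x\mapsto e^{itx}$ is bounded and continuous). Along the sub-sequence chosen above,

$$\varphi_{X_{n_k}}(t)=\exp\Bigl(it\mu_{n_k}-\tfrac{t^2\sigma_{n_k}^2}{2}\Bigr)\xrightarrow[k\to\infty]{}\exp\Bigl(it\mu-\tfrac{t^2\sigma^2}{2}\Bigr),$$

while convergence in distribution gives $\varphi_{X_{n_k}}(t)\to\varphi_X(t)$ for every fixed $t$, hence

$$\varphi_X(t)=\exp\Bigl(it\mu-\tfrac{t^2\sigma^2}{2}\Bigr),\qquad t\in\Bbb R.$$

Taking moduli yields $|\varphi_X(t)|=e^{-t^2\sigma^2/2}$ (so $t=1$ determines $\sigma$), and dividing by $e^{-t^2\sigma^2/2}$ yields $e^{it\mu}=\varphi_X(t)e^{t^2\sigma^2/2}$, which determines $\mu$ since $t\mapsto e^{it\mu_1}$ and $t\mapsto e^{it\mu_2}$ can agree for all $t$ only when $\mu_1=\mu_2$.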