Applying the Central Limit Theorem to show that $E\left(\frac{|S_n|}{\sqrt{n}}\right) \to \sqrt{\frac{2}{\pi}}\,\sigma$

We assume $\sigma=1$; the general case follows by applying the result to $X_i/\sigma$.

  • Show that for all $K$ and all $n$, $$P\left(\frac{|S_n|}{\sqrt n}\geqslant K\right)\leqslant\frac 1{K^2}.$$

  • Combined with the Cauchy–Schwarz inequality, this implies $$\int \frac{|S_n|}{\sqrt n}\,dP\leqslant \int_{|S_n|<K\sqrt n} \frac{|S_n|}{\sqrt n}\,dP+\frac 1{K}.$$

  • Use the definition of weak convergence together with $x\mapsto \min\{|x|,K\}$, a continuous bounded map.
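As a quick numerical sanity check (not part of the argument), here is a small simulation sketch; the choice of Rademacher increments ($\pm 1$ with probability $1/2$ each) is an arbitrary example of centered, unit-variance variables.

```python
import numpy as np

# Monte Carlo sanity check (not a proof): with sigma = 1 we expect
# E|S_n|/sqrt(n) -> E|Z| = sqrt(2/pi) ~ 0.798 as n grows.
# Rademacher increments are an arbitrary choice of centered, unit-variance
# variables; for them S_n = 2*Binomial(n, 1/2) - n, which is cheap to sample.
rng = np.random.default_rng(0)
n_samples = 200_000

for n in (10, 100, 1_000, 10_000):
    s_n = 2 * rng.binomial(n, 0.5, size=n_samples) - n   # samples of S_n
    estimate = np.abs(s_n).mean() / np.sqrt(n)            # estimates E|S_n|/sqrt(n)
    print(f"n = {n:6d}: {estimate:.4f}   (sqrt(2/pi) = {np.sqrt(2/np.pi):.4f})")
```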

Actually, there is a deeper result (not needed here) called the invariance principle (see Billingsley's book Convergence of Probability Measures), which states the following. Take $\{X_n\}$ a sequence of i.i.d. centered random variables with $EX_n^2=1$, and define, for each $n$, a random function $f_n(\omega,\cdot)$ on $[0,1]$ in the following way:

  • $f_n(\omega,kn^{-1})=\frac 1{\sqrt n}\sum_{j=1}^kX_j(\omega)$ for $0\leqslant k\leqslant n$.
  • $f_n(\omega,\cdot)$ is piecewise linear.

The laws of these random functions, viewed as probability measures on $C[0,1]$, converge weakly to the Wiener measure, that is, to the law of a Brownian motion.

Now, to get the result, take the continuous functional $F\colon C[0,1]\to \Bbb R$ given by $F(f)=|f(1)|$ (it is continuous but not bounded, so to pass from convergence in distribution of $F(f_n)=|S_n|/\sqrt n$ to convergence of its expectation one still needs the uniform-integrability argument below). The first version of the invariance principle, due to Erdős and Kac, concerned the functional $F(f):=\sup_{0\leqslant x\leqslant 1}f(x)$; Donsker later generalized it.
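Purely as an illustration of this construction (again with Rademacher increments as an arbitrary choice of centered, unit-variance $X_j$), one can simulate the piecewise linear paths $f_n$ and evaluate both functionals mentioned above:

```python
import numpy as np

# Illustration of Donsker's construction: piecewise linear paths f_n built
# from Rademacher increments (an arbitrary choice of centered, unit-variance X_j).
rng = np.random.default_rng(1)
n, n_paths = 500, 20_000

x = rng.choice([-1.0, 1.0], size=(n_paths, n))
partial = np.cumsum(x, axis=1) / np.sqrt(n)            # f_n(k/n) = S_k / sqrt(n)
paths = np.hstack([np.zeros((n_paths, 1)), partial])   # prepend f_n(0) = 0

# Functional F(f) = |f(1)|: the path value at t = 1 is S_n / sqrt(n),
# so its sample mean should be close to E|B_1| = sqrt(2/pi).
print("mean |f_n(1)|:", np.abs(paths[:, -1]).mean(), "vs", np.sqrt(2 / np.pi))

# Erdos-Kac functional F(f) = sup_{0<=t<=1} f(t); a piecewise linear path
# attains its supremum at a grid point k/n.  (By the reflection principle,
# sup_{t<=1} B_t has the same law as |B_1|, so this mean is close as well.)
print("mean sup f_n :", paths.max(axis=1).mean())
```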


In a nutshell, you consider some random variables $(Y_n)_n$ and $Y$ defined on a probability space $(\Omega,\mathcal F,\mathbb P)$, such that $Y_n\to Y$ in distribution, and you wonder whether $\mathbb E(Y_n)\to \mathbb E(Y)$ (in your case, $Y_n=|S_n|/\sqrt{n}$ and $Y=|Z|$). Obviously, this is not true in general, and Fatou's lemma only guarantees that $\liminf \mathbb E(Y_n)\geqslant \mathbb E(Y)$. But here, you also know that $(Y_n)_n$ is bounded in $L^2(\mathbb P)$, since $\mathbb E(Y_n^2)=\sigma^2$ for every $n$. Hence $(Y_n)_n$ is uniformly integrable (indeed $\mathbb E(Y_n\mathbf 1_{Y_n>K})\leqslant\mathbb E(Y_n^2)/K=\sigma^2/K$), and, together with the convergence in distribution, this guarantees that $\mathbb E(Y_n)\to\mathbb E(Y)$.
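For completeness, the value of this limit: if $Z\sim\mathcal N(0,\sigma^2)$, then $$\mathbb E|Z|=\frac{2}{\sigma\sqrt{2\pi}}\int_0^{+\infty}x\,e^{-x^2/(2\sigma^2)}\,\mathrm dx=\frac{2}{\sigma\sqrt{2\pi}}\Big[-\sigma^2e^{-x^2/(2\sigma^2)}\Big]_0^{+\infty}=\sqrt{\frac2\pi}\,\sigma,$$ which is exactly the constant in the statement.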

One can prove the assertion that uniform integrability together with convergence in distribution yields $\mathbb E(Y_n)\to\mathbb E(Y)$ as follows. First, by the Skorokhod representation theorem, there exist random variables $(\bar Y_n)_n$ and $\bar Y$, possibly defined on another probability space $(\bar\Omega,\bar{\mathcal F},\bar{\mathbb P})$, such that each $\bar Y_n$ is distributed like $Y_n$, $\bar Y$ is distributed like $Y$, and $\bar Y_n\to\bar Y$ almost surely. In particular, (1) $\bar Y_n\to\bar Y$ in probability and (2) $(\bar Y_n)_n$ is uniformly integrable (uniform integrability depends only on the distributions). For an integrable $\bar Y$, properties (1) and (2) together are equivalent to the convergence $\bar Y_n\to\bar Y$ in $L^1(\bar{\mathbb P})$, that is, to the fact that $\bar{\mathbb E}(|\bar Y_n-\bar Y|)\to0$. As a consequence, $\mathbb E(Y_n)=\bar{\mathbb E}(\bar Y_n)\to \bar{\mathbb E}(\bar Y)=\mathbb E(Y)$.
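For the direction of that equivalence used here, the standard truncation estimate suffices: for $\varepsilon>0$, $K>0$ and $A_n:=\{|\bar Y_n-\bar Y|>\varepsilon\}$, $$\bar{\mathbb E}\big(|\bar Y_n-\bar Y|\big)\leqslant\varepsilon+\bar{\mathbb E}\big(|\bar Y_n|\mathbf 1_{|\bar Y_n|>K}\big)+\bar{\mathbb E}\big(|\bar Y|\mathbf 1_{|\bar Y|>K}\big)+2K\,\bar{\mathbb P}(A_n).$$ The two middle terms are small, uniformly in $n$, once $K$ is large (by (2) and the integrability of $\bar Y$), and the last term tends to $0$ as $n\to\infty$ by (1); hence $\limsup_n\bar{\mathbb E}(|\bar Y_n-\bar Y|)\leqslant\varepsilon$ for every $\varepsilon>0$.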


There are two parts to this problem:

(1) $\limsup_{n \rightarrow \infty} E\big[\frac{|S_n|}{\sqrt{n}}\big] \leq E[|Z|]$, where $Z \sim \mathcal N(0,\sigma^2)$.

Proof: (Credits to DavideGiraudo)

$$E\Big[\frac{|S_n|}{\sqrt{n}}\Big] = E\Big[\frac{|S_n|}{\sqrt{n}}\mathbf 1_{\frac{|S_n|}{\sqrt{n}} \leq K}\Big] + E\Big[\frac{|S_n|}{\sqrt{n}}\mathbf 1_{\frac{|S_n|}{\sqrt{n}} > K}\Big]$$ $$\leq E\Big[\frac{|S_n|}{\sqrt{n}}\mathbf 1_{\frac{|S_n|}{\sqrt{n}} \leq K}\Big] + \sqrt{E\Big[\frac{S_n^2}{n}\Big]}\cdot\sqrt{P\Big(\frac{|S_n|}{\sqrt{n}} > K\Big)}$$ (Cauchy–Schwarz) $$\leq E\Big[\frac{|S_n|}{\sqrt{n}}\mathbf 1_{\frac{|S_n|}{\sqrt{n}} \leq K}\Big] + \frac{\sigma^2}{K}$$ (Chebyshev's inequality: $P\big(\frac{|S_n|}{\sqrt{n}} > K\big) \leq \sigma^2/K^2$)

Take limsup on both sides

$$\limsup_{n \rightarrow \infty} E\Big[\frac{|S_n|}{\sqrt{n}}\Big] \leq \limsup_{n \rightarrow \infty} E\Big[\frac{|S_n|}{\sqrt{n}}\mathbf 1_{\frac{|S_n|}{\sqrt{n}} \leq K}\Big] + \frac{\sigma^2}{K} \leq \limsup_{n \rightarrow \infty}\left( E\Big[\frac{|S_n|}{\sqrt{n}}\mathbf 1_{\frac{|S_n|}{\sqrt{n}} \leq K}\Big] + K\,P\Big(\frac{|S_n|}{\sqrt{n}} > K\Big)\right) + \frac{\sigma^2}{K} \quad (\times)$$ I have added the extra (nonnegative) term for a reason that will become clear below.

Now use the following: (a) $f(x)=\min(|x|,K)$ is continuous and bounded

(b) $E[f(X_n)] \rightarrow E[f(X)]$ if $X_n \rightarrow^{w} X$ and $f$ is as in (a); here $\frac{|S_n|}{\sqrt{n}} \rightarrow^{w} |Z|$ by the CLT and the continuous mapping theorem.

Note that $E\big[f\big(\frac{|S_n|}{\sqrt{n}}\big)\big] = E\Big[\frac{|S_n|}{\sqrt{n}}\mathbf 1_{\frac{|S_n|}{\sqrt{n}} \leq K}\Big] + K\,P\Big(\frac{|S_n|}{\sqrt{n}} > K\Big)$, i.e. exactly the quantity inside the $\limsup$ in $(\times)$.

Now apply (a) and (b) to $(\times)$ to get

$$\limsup_{n \rightarrow \infty} E\Big[\frac{|S_n|}{\sqrt{n}}\Big] \leq E\big[|Z|\mathbf 1_{|Z| \leq K}\big] + K\,P(|Z| > K) + \frac{\sigma^2}{K}$$ The LHS does not depend on $K$, so let $K \rightarrow \infty$. Note that $K\,P(|Z| > K) \leq E\big[|Z|\mathbf 1_{|Z| > K}\big] \rightarrow 0$ (because $E|Z| < \infty$), and the MCT applied to the first term yields

$$\limsup_{n \rightarrow \infty} E[\frac{|S_n|}{\sqrt{n}}] \leq E[|Z|] $$

(2) $\liminf_{n \rightarrow \infty} E\big[\frac{|S_n|}{\sqrt{n}}\big] \geq E[|Z|]$

Proof:

$$E\Big[\frac{|S_n|}{\sqrt{n}}\Big] = E\Big[\frac{|S_n|}{\sqrt{n}}\mathbf 1_{\frac{|S_n|}{\sqrt{n}} \leq K}\Big] + E\Big[\frac{|S_n|}{\sqrt{n}}\mathbf 1_{\frac{|S_n|}{\sqrt{n}} > K}\Big] \geq E\Big[\frac{|S_n|}{\sqrt{n}}\mathbf 1_{\frac{|S_n|}{\sqrt{n}} \leq K}\Big] + K\,P\Big(\frac{|S_n|}{\sqrt{n}} > K\Big) = E\Big[f\Big(\frac{|S_n|}{\sqrt{n}}\Big)\Big]$$ Take $\liminf_{n \rightarrow \infty}$ to get

$$\liminf_{n \rightarrow \infty} E\Big[\frac{|S_n|}{\sqrt{n}}\Big] \geq E\big[|Z|\mathbf 1_{|Z| \leq K}\big] + K\,P(|Z| > K) \geq E\big[|Z|\mathbf 1_{|Z| \leq K}\big]$$ (using (b)). Take $K \rightarrow \infty$ and apply the MCT to the right-hand side to get

$$\liminf_{n \rightarrow \infty} E\Big[\frac{|S_n|}{\sqrt{n}}\Big] \geq E[|Z|].$$

Combining (1) and (2), we get $\lim_{n \rightarrow \infty} E\big[\frac{|S_n|}{\sqrt{n}}\big] = E[|Z|] = \sqrt{\frac{2}{\pi}}\,\sigma$. QED