Difference between the largest and second largest observations from a sample of iid normal variables
We can explicitly derive the joint density of a general pair of order statistics $X_{(j)}, X_{(k)}$, $1 \le j < k \le n$, from an iid sample drawn from a continuous distribution with CDF $F(x)$ and density $f(x)$. The idea is to visualize $$X_{(1)} \le \ldots \le X_{(j-1)} \le X_{(j)} \le X_{(j+1)} \le \ldots \le X_{(k-1)} \le X_{(k)} \le X_{(k+1)} \le \ldots \le X_{(n)}.$$ For fixed values $x_j \le x_k$ of $X_{(j)}, X_{(k)}$, there are $$\frac{n!}{(j-1)! \, ((k-1)-(j+1)+1)! \, (n-(k+1)+1)!} = \frac{n!}{(j-1)! \, (k-j-1)! \, (n-k)!}$$ ways to assign the $n$ observations: one to the value $x_j$, one to $x_k$, $j-1$ to the interval below $x_j$, $k-j-1$ to the interval between $x_j$ and $x_k$, and $n-k$ to the interval above $x_k$. For any one such assignment, the probability that the $j-1$ observations fall below $x_j$, the $k-j-1$ observations fall between $x_j$ and $x_k$, and the $n-k$ observations exceed $x_k$ is $$F(x_j)^{j-1} (F(x_k) - F(x_j))^{k-j-1} (1 - F(x_k))^{n-k}.$$ Multiplying by the density contributions $f(x_j) f(x_k)$ of the two pinned order statistics, we find that the joint density is $$f_{X_{(j)}, X_{(k)}}(x_j, x_k) = \frac{n! \, F(x_j)^{j-1} (F(x_k) - F(x_j))^{k-j-1} (1 - F(x_k))^{n-k} f(x_j) f(x_k)}{(j-1)! \, (k-j-1)! \, (n-k)!}, \qquad x_j \le x_k.$$
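As a sanity check on this formula, one can verify numerically that it integrates to 1 over the region $x_j \le x_k$. A minimal Python sketch (the function name `joint_order_density` and the choices $n = 5$, $j = 2$, $k = 4$ are illustrative; the underlying distribution here is the standard normal):

```python
import math

import numpy as np
from scipy import integrate
from scipy.stats import norm

def joint_order_density(xj, xk, n, j, k, dist=norm):
    """Joint density of (X_(j), X_(k)) for an iid sample of size n from `dist`."""
    if xj > xk:
        return 0.0
    const = math.factorial(n) / (
        math.factorial(j - 1) * math.factorial(k - j - 1) * math.factorial(n - k)
    )
    Fj, Fk = dist.cdf(xj), dist.cdf(xk)
    return (const * Fj ** (j - 1) * (Fk - Fj) ** (k - j - 1)
            * (1 - Fk) ** (n - k) * dist.pdf(xj) * dist.pdf(xk))

# The density should integrate to 1 over the region xj <= xk.
n, j, k = 5, 2, 4
total, _ = integrate.dblquad(
    lambda xk, xj: joint_order_density(xj, xk, n, j, k),  # dblquad expects f(y, x)
    -np.inf, np.inf,        # outer variable: xj over the whole real line
    lambda xj: xj, np.inf,  # inner variable: xk from xj to infinity
)
print(total)  # should be close to 1
```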
For the special case $j = n-1$, $k = n$, this reduces to $$f_{X_{(n-1)}, X_{(n)}}(x_{n-1}, x_n) = n(n-1) F(x_{n-1})^{n-2} f(x_{n-1}) f(x_n), \qquad x_{n-1} \le x_n.$$ For a sample of standard normal random variables, this takes a reasonably nice form: $$f(u,v) = \begin{cases} \frac{n(n-1)}{2\pi} e^{-(u^2+v^2)/2} \Phi(u)^{n-2}, & u \le v, \\ 0, & \text{otherwise}.\end{cases}$$ Now defining $W = X_{(n)} - X_{(n-1)}$, we get $$f_W(w) = \int_{u=-\infty}^\infty f(u,w+u) \, du, \qquad w \ge 0.$$ To my knowledge, this integral doesn't have an elementary closed form. I numerically integrated it for the cases $n = 2^a$, $a = 1, 2, \ldots, 16$, and plotted the resulting densities below:
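A minimal sketch of such a numerical integration in Python, using `scipy.integrate.quad` for the inner integral (the function names `joint_density` and `f_W` are illustrative, and the plotting step is omitted):

```python
import numpy as np
from scipy import integrate
from scipy.stats import norm

def joint_density(u, v, n):
    """f_{X_(n-1), X_(n)}(u, v) for a standard normal sample of size n (u <= v)."""
    if u > v:
        return 0.0
    return n * (n - 1) * norm.cdf(u) ** (n - 2) * norm.pdf(u) * norm.pdf(v)

def f_W(w, n):
    """Density of W = X_(n) - X_(n-1): integrate the joint density along v = u + w."""
    val, _ = integrate.quad(lambda u: joint_density(u, u + w, n), -np.inf, np.inf)
    return val

# Evaluate the density at a few points for several sample sizes; in practice one
# would evaluate on a fine grid of w >= 0 and plot the curves.
for a in (1, 2, 4):                      # n = 2, 4, 16
    n = 2 ** a
    print(n, [round(f_W(w, n), 4) for w in (0.5, 1.0, 2.0)])
```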
This suggests that the expectation and variance of $W$ decrease with increasing $n$.
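This can be quantified by computing the first two moments via $E[W^m] = \int_0^\infty w^m f_W(w)\,dw$ with nested quadrature. A rough self-contained sketch (the helper `moment` is illustrative); for $n = 2$ the mean can be checked against the closed form $E[W] = E|X_1 - X_2| = 2/\sqrt{\pi} \approx 1.128$, since $X_1 - X_2 \sim N(0,2)$:

```python
import numpy as np
from scipy import integrate
from scipy.stats import norm

def f_W(w, n):
    """Density of W = X_(n) - X_(n-1) for a standard normal sample of size n."""
    integrand = lambda u: n * (n - 1) * norm.cdf(u) ** (n - 2) * norm.pdf(u) * norm.pdf(u + w)
    val, _ = integrate.quad(integrand, -np.inf, np.inf)
    return val

def moment(n, m):
    """E[W^m] = integral of w^m f_W(w) over w >= 0, via nested quadrature."""
    val, _ = integrate.quad(lambda w: w ** m * f_W(w, n), 0.0, np.inf)
    return val

# Nested quadrature gets slow for large n; a few small cases illustrate the trend.
for a in range(1, 6):                    # n = 2, 4, 8, 16, 32
    n = 2 ** a
    mean = moment(n, 1)
    var = moment(n, 2) - mean ** 2
    print(n, round(mean, 4), round(var, 4))
# For n = 2 the mean should be close to 2/sqrt(pi) ≈ 1.1284.
```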