Proof of Pearson's chi-squared test

If we write $Z_i = \frac{O_i-np_i}{\sqrt{np_i}}$ as in the lecture notes, the idea is that the vector $Z=(Z_1,\dots,Z_k)$ (where $k$ is the number of categories) converges in distribution to $\mathcal N(0,\Sigma)$, a multivariate normal distribution, as $n\to\infty$, where

$$\Sigma=\text{Cov}(Z)=\begin{bmatrix} 1-p_1 & -\sqrt{p_1 p_2} & \cdots \\ -\sqrt{p_1 p_2} & 1-p_2 & \cdots \\ \vdots & \vdots & \ddots \end{bmatrix}.$$
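For completeness, these entries come straight from the covariances of the multinomial counts, $\text{Var}(O_i)=np_i(1-p_i)$ and $\text{Cov}(O_i,O_j)=-np_ip_j$ for $i\neq j$:

$$\text{Var}(Z_i)=\frac{np_i(1-p_i)}{np_i}=1-p_i,\qquad \text{Cov}(Z_i,Z_j)=\frac{-np_ip_j}{\sqrt{np_i}\sqrt{np_j}}=-\sqrt{p_ip_j}.$$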

If we compute $\det(\Sigma-\lambda I)=-\lambda(1-\lambda)^{k-1}$ we get that $\Sigma$ has $k-1$ eigenvalues equal to 1 and one equal to 0. (The computation is made easy by the fact that $\Sigma=I-qq^T$ for $q=(\sqrt{p_1},\sqrt{p_2},\dots,\sqrt{p_k})$, which satisfies $\|q\|^2=\sum_i p_i=1$, together with Sylvester's determinant theorem.)
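Spelled out (for $\lambda\neq 1$, extending to all $\lambda$ by continuity), Sylvester's determinant theorem / the matrix determinant lemma gives

$$\det(\Sigma-\lambda I)=\det\big((1-\lambda)I-qq^T\big)=(1-\lambda)^{k}\Big(1-\frac{q^Tq}{1-\lambda}\Big)=(1-\lambda)^{k-1}\big((1-\lambda)-1\big)=-\lambda(1-\lambda)^{k-1}.$$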

This means the distribution is really $(k-1)$-dimensional embedded in $k$ dimensions, and there is an orthogonal matrix $A$ (whose rows are the eigenvectors of $\Sigma$) such that

$$A\Sigma A^T=\begin{bmatrix} 0 & 0 & 0 & \cdots \\ 0 & 1 & 0 & \cdots \\ 0 & 0 & 1 & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{bmatrix}.$$
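If you want to see this numerically, here is a minimal sanity check (assuming NumPy; the probability vector below is just an arbitrary example):

```python
import numpy as np

# Arbitrary example probability vector (any p summing to 1 works).
p = np.array([0.1, 0.2, 0.3, 0.4])
q = np.sqrt(p)

# Sigma = I - q q^T, the limiting covariance of Z.
Sigma = np.eye(len(p)) - np.outer(q, q)

# Sigma is symmetric, so eigh gives real eigenvalues and orthonormal eigenvectors (columns of V).
eigvals, V = np.linalg.eigh(Sigma)
print(np.round(eigvals, 10))          # approximately [0, 1, 1, 1]

# Taking A = V^T (an orthogonal matrix) diagonalises Sigma: A Sigma A^T = diag(0, 1, ..., 1).
A = V.T
print(np.round(A @ Sigma @ A.T, 10))
```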

Now let $X = AZ \sim \mathcal N(0,A\Sigma A^T)$ (in the limit). Then $X$ is a vector $(0, X_1, X_2, \dots, X_{k-1})$ whose first coordinate is degenerate at 0 and whose remaining $k-1$ coordinates are i.i.d. $\mathcal N(0,1)$ Gaussians. The function $f(Z) = Z_1^2 + Z_2^2 + \dots + Z_k^2$ is the squared norm $\|Z\|_2^2$, and hence it doesn't change when we apply an orthogonal matrix to its argument, since $\|AZ\|_2^2 = Z^TA^TAZ = \|Z\|_2^2$. This means Pearson's statistic $\sum_i \frac{(O_i-np_i)^2}{np_i} = f(Z) = f(AZ) = f(X) = 0^2 + X_1^2 + \dots + X_{k-1}^2$, which is chi-squared distributed with $k-1$ degrees of freedom!
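Here is a quick Monte Carlo sketch of the conclusion (assuming NumPy and SciPy; the values of $n$, $p$ and the number of replications are arbitrary choices): the simulated Pearson statistic should match the $\chi^2_{k-1}$ quantiles.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Example setup: k = 4 categories, n = 500 trials, arbitrary p.
p = np.array([0.1, 0.2, 0.3, 0.4])
n, reps = 500, 20_000

# Draw multinomial counts and compute Pearson's statistic for each replication.
O = rng.multinomial(n, p, size=reps)              # shape (reps, k)
T = ((O - n * p) ** 2 / (n * p)).sum(axis=1)

# Compare against the chi-squared distribution with k - 1 = 3 degrees of freedom.
df = len(p) - 1
print("simulated 95% quantile:", np.quantile(T, 0.95))
print("chi2(k-1) 95% quantile:", stats.chi2.ppf(0.95, df))
```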

Pretty cool stuff. Thank you for pointing me towards this result :-)