Proof of the independence of the sample mean and sample variance

I guess this is probably a little late, but this result is immediate from Basu's Theorem, provided that you are willing to accept that the family of normal distributions with known variance is complete. To apply Basu, fix $\sigma^2$ and consider the family of $N(\mu, \sigma^2)$ distributions for $\mu \in \mathbb R$. The distribution of $S^2$ does not depend on $\mu$ (indeed $\frac{(n - 1)S^2}{\sigma^2} \sim \chi^2_{n - 1}$, which is free of $\mu$), so $S^2$ is ancillary, while $\bar X$ is complete sufficient for $\mu$; hence they are independent for every $\mu$ and our fixed $\sigma^2$. Since $\sigma^2$ was arbitrary, this completes the proof.
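Not a substitute for the proof, but the conclusion is easy to sanity-check by simulation; here is a minimal sketch assuming NumPy, with illustrative variable names:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, reps = 3.0, 2.0, 10, 100_000

# Draw `reps` independent samples of size n from N(mu, sigma^2).
samples = rng.normal(mu, sigma, size=(reps, n))
xbar = samples.mean(axis=1)          # sample means
s2 = samples.var(axis=1, ddof=1)     # unbiased sample variances

# For normal data xbar and s2 are independent, so their correlation is ~0 ...
print("corr(xbar, s2) =", np.corrcoef(xbar, s2)[0, 1])

# ... and the distribution of s2 should not change when we condition on xbar.
low = xbar < np.median(xbar)
print("mean of s2 | xbar low :", s2[low].mean())
print("mean of s2 | xbar high:", s2[~low].mean())
```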

This can also be shown directly without too much hassle. Writing $A = (X_2 - \bar X, \ldots, X_n - \bar X)$ for the vector of deviations, one can find the joint pdf of $(A, \bar X)$ directly by applying a suitable transformation to the joint pdf of $(X_1, \ldots, X_n)$. The joint pdf of $(A, \bar X)$ factors as required, which gives independence. To see this quickly, without actually doing the transformation, and skipping some algebra (spelled out below), we may write

$$f(x_1, \ldots, x_n) = (2\pi \sigma^2)^{-n/2} \exp\left\{-\frac{\sum_{i=1}^n (x_i - \bar x)^2}{2\sigma^2}\right\} \exp\left\{-\frac{n(\bar x - \mu)^2}{2\sigma^2}\right\}$$

and we can see that everything except for the last term depends only on $(x_2 - \bar x, x_3 - \bar x, \ldots, x_n - \bar x)$ (note that we may retrieve $x_1 - \bar x$ from the other $n - 1$ deviations, since the deviations sum to zero), while the last term depends only on $\bar x$. The transformation is linear, so the Jacobian is a constant and won't spoil this factorization when we actually pass to the joint pdf of $(A, \bar X)$.
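For completeness, the algebra being skipped is the standard sum-of-squares decomposition; the cross term vanishes because the deviations sum to zero:

$$\sum_{i=1}^n (x_i - \mu)^2 = \sum_{i=1}^n \bigl((x_i - \bar x) + (\bar x - \mu)\bigr)^2 = \sum_{i=1}^n (x_i - \bar x)^2 + 2(\bar x - \mu)\sum_{i=1}^n (x_i - \bar x) + n(\bar x - \mu)^2 = \sum_{i=1}^n (x_i - \bar x)^2 + n(\bar x - \mu)^2.$$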


Hint:

  1. Show that $A' = (\bar X, A)$ can be written as $A' = CX$ where $C$ is some square matrix. Deduce that $A'$ is jointly Gaussian.

  2. Recall that, for jointly Gaussian variables, independence is equivalent to zero correlation. Show that $E(A_i \bar X) = 0$ (a worked computation follows this list). Conclude that $A$ and $\bar X$ are independent.
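For step 2, one way the computation can go (note $E(A_i) = 0$, so $E(A_i \bar X)$ is exactly the covariance): since $E(X_i \bar X) = \frac{1}{n}\bigl(E(X_i^2) + \sum_{j \neq i} E(X_i X_j)\bigr) = \frac{1}{n}\bigl(\mu^2 + \sigma^2 + (n-1)\mu^2\bigr) = \mu^2 + \frac{\sigma^2}{n}$ and $E(\bar X^2) = \operatorname{Var}(\bar X) + \mu^2 = \frac{\sigma^2}{n} + \mu^2$, we get

$$E(A_i \bar X) = E(X_i \bar X) - E(\bar X^2) = \left(\mu^2 + \frac{\sigma^2}{n}\right) - \left(\mu^2 + \frac{\sigma^2}{n}\right) = 0.$$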


Here is an alternative matrix approach. By replacing each $X_i$ by $(X_i - \mu)/\sigma$, we may assume without loss of generality that the $X_i$ are independent standard normal variables (this affine change rescales $\bar X$ and $S^2$ but does not affect whether they are independent).

Pack $X_1,\ldots,X_n$ into an $n\times 1$ column vector $X$. The idea is to express $\sum X_i$ and $(X_1-\bar X,\ldots,X_n-\bar X)^T$ as matrix transformations of $X$. This is achieved by taking $U=(1,\ldots,1)$, a $1\times n$ row vector of ones (so that $UX=\sum X_i$), and defining the $n\times n$ matrix $$C:=I_n - \frac1n U^TU$$ (so that $CX$ has $i$th entry $X_i-\bar X$). Check that $UX$ and $CX$ each have zero mean. Their covariance is $$E[(UX)(CX)^T]=E(UXX^TC^T)=UC^T=UC=U-\frac1nUU^TU,$$ using $E(XX^T)=I_n$ and the symmetry of $C$. But $UU^T=n$, so the covariance matrix is zero. Since $UX$ and $CX$ are jointly normal, this implies that they are independent, hence so are $\bar X$ and $S^2$, which are functions of $UX$ and $CX$ respectively.
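Not a proof, but the matrix identities above are easy to sanity-check numerically; here is a minimal sketch assuming NumPy, with illustrative variable names:

```python
import numpy as np

n = 5
U = np.ones((1, n))              # 1 x n row of ones: U @ x = sum of x
C = np.eye(n) - U.T @ U / n      # centering matrix: C @ x has entries x_i - xbar

# The covariance computed above, U C^T, should be the zero matrix.
print(np.allclose(U @ C.T, 0))   # True

# Empirical check: the sample sum is uncorrelated with each deviation.
rng = np.random.default_rng(0)
X = rng.standard_normal((100_000, n))    # each row is one sample of size n
sums = (X @ U.T).ravel()                 # UX for each sample
devs = X @ C.T                           # CX for each sample
print(np.corrcoef(sums, devs[:, 0])[0, 1])   # close to 0
```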


Why are $UX$ and $CX$ jointly normal? The product of a partitioned matrix with $X$: $$\begin{pmatrix}U\\\hdashline C\end{pmatrix}X $$ yields an $(n+1)\times 1$ multivariate normal vector; $UX$ is its first entry, and $CX$ is the remaining subvector.
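This is the standard fact that any fixed linear map of a Gaussian vector is Gaussian, applied here with $\mu = 0$, $\Sigma = I_n$, and $M$ the stacked $(n+1)\times n$ matrix:

$$X \sim N_n(\mu, \Sigma) \implies MX \sim N(M\mu,\; M\Sigma M^T) \quad \text{for any constant matrix } M.$$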
