Principal angles between subspaces

We prove the statement by induction on $k=\dim A=\dim B$. It is trivially true for $k=0$, which will be our base case.

Let $S$ be the unit sphere $\{u: \|u\|=1\}$ in $V$. $S$ is compact, so its intersections with the closed subspaces $A$ and $B$ are compact. The distance function $d(u,v)=\|u-v\|$ on the Cartesian product $(A\cap S)\times (B\cap S)$ therefore achieves its minimum; let $(x,y)$ be a pair of vectors in $A\cap S$ and $B\cap S$ respectively that achieves this minimum distance $d$.

Next, let $A'=x^\perp \cap A$ and $B'=y^\perp\cap B$, the orthogonal complements of $x$ and $y$ in $A$ and $B$ respectively. They each have dimension $k-1$. We claim that $A'\perp y$ and $B'\perp x$ as well. Why? Given a unit vector $x'\in A'$ and a small angle $\theta$, the vector $x\cos\theta+x'\sin\theta$ lies in $A\cap S$, leading to \begin{align*}d^2 &\le \left\|x\cos\theta+x'\sin\theta-y\right\|^2 = 2-2\langle x,y\rangle\cos\theta-2\langle x',y\rangle\sin\theta\\ &= d^2\cos\theta+2(1-\cos\theta)-2\langle x',y\rangle\sin\theta\\ &= d^2-2\langle x',y\rangle\theta+O(\theta^2),\end{align*} using $d^2=\|x-y\|^2=2-2\langle x,y\rangle$ in the second line. If $\langle x',y\rangle\neq 0$, then a small $\theta$ of the same sign produces a distance of less than $d$, contradicting minimality. We must have $\langle x',y\rangle=0$, and this holds for every unit vector $x'\in A'$, hence for all of $A'$. Showing that $B'\perp x$ is exactly the same, using the pair $(x,\,y\cos\theta+y'\sin\theta)$ with $y'$ a unit vector in $B'$.
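Here is a quick numerical sanity check of this orthogonality claim, as a sketch in numpy (variable names are mine); it finds the minimizing pair using the standard fact that maximizing $\langle x,y\rangle$ over unit vectors of $A$ and $B$ is a largest-singular-value problem, the same fact that powers the SVD construction further down.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 6, 3
P, _ = np.linalg.qr(rng.standard_normal((n, k)))   # orthonormal basis of A
Q, _ = np.linalg.qr(rng.standard_normal((n, k)))   # orthonormal basis of B

# Minimizing ||x - y|| over unit x in A, y in B is the same as maximizing
# <x, y>, and that maximum is the largest singular value of P^T Q,
# attained at the corresponding pair of singular vectors.
U, s, Vt = np.linalg.svd(P.T @ Q)
x, y = P @ U[:, 0], Q @ Vt[0]                      # a minimizing pair

A_perp = P - np.outer(x, x @ P)                    # columns span A' = A ∩ x⊥
B_perp = Q - np.outer(y, y @ Q)                    # columns span B' = B ∩ y⊥
print(np.max(np.abs(A_perp.T @ y)))                # ~1e-16: A' ⊥ y
print(np.max(np.abs(B_perp.T @ x)))                # ~1e-16: B' ⊥ x
```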

Now we apply the induction hypothesis to $A'$ and $B'$. Since both lie in the subspace $\{x,y\}^\perp$ (by construction and by the claim just proved), we apply it inside that subspace, which guarantees that every vector produced below is orthogonal to $x$ and $y$. So let $x_2,x_3,\dots,x_k,x_{k+2},\dots,x_{2k}$ be an orthonormal system and $\phi_2,\phi_3,\dots,\phi_k$ be angles in $[0,\frac{\pi}{2}]$ such that $\{x_2,x_3,\dots,x_k\}$ is a basis for $A'$ and $\{x_2\cos\phi_2+x_{k+2}\sin\phi_2,x_3\cos\phi_3+x_{k+3}\sin\phi_3,\dots,x_k\cos\phi_k+x_{2k}\sin\phi_k\}$ is a basis for $B'$. We wish to find $x_1$, $x_{k+1}$, and $\phi_1$ extending this system to cover $A$ and $B$. There are two cases for this extension:

Case 1: $d=0$. Then $x=y$. Let $x_1=x$ and let $x_{k+1}$ be a unit vector orthogonal to all $2k-1$ vectors $x_1,x_2,\dots,x_k,x_{k+2},\dots,x_{2k}$. The angle $\phi_1$ is zero, as $1\cdot x_1+0\cdot x_{k+1}=x=y$.

Case 2: $d>0$. Let $x_1=x$ and let $x_{k+1}$ be the result of one-step Gram-Schmidt $$x_{k+1}=\frac{y-\langle x,y\rangle x}{\|y-\langle x,y\rangle x\|}=\frac{y-\langle x,y\rangle x}{\sqrt{1-\langle x,y\rangle^2}}$$ The angle $\phi_1$ is $\arccos\langle x,y\rangle$. Now, we have two things to verify.
First, this is indeed an orthonormal system: $x_{k+1}$ is a unit vector orthogonal to $x_1=x$ by construction, and since all of the other $x_i$ lie in $\{x,y\}^\perp$, both $x_1$ and $x_{k+1}$ (a linear combination of $x$ and $y$) are orthogonal to them.
Second, that $\langle x,y\rangle \ge 0$. The distance between $x$ and $y$ is $\sqrt{2-2\langle x,y\rangle}$; if $\langle x,y\rangle$ were negative, we could replace $y$ with $-y$ (still a unit vector in $B$) to get a pair with smaller distance. As such, the minimum-distance property gives us $\langle x,y\rangle\ge 0$, and since $d>0$ forces $\langle x,y\rangle<1$, we get $0< \phi_1\le \frac{\pi}{2}$.
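The Case 2 extension is easy to check numerically. A minimal sketch, assuming only numpy and taking an arbitrary pair of unit vectors in place of $x$ and $y$:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(5)
y = rng.standard_normal(5)
x /= np.linalg.norm(x)                              # unit vector standing in for x
y /= np.linalg.norm(y)                              # unit vector standing in for y
if x @ y < 0:
    y = -y                                          # enforce <x, y> >= 0, as in the proof

c = x @ y                                           # cos(phi_1)
phi1 = np.arccos(c)
x_k1 = (y - c * x) / np.sqrt(1 - c**2)              # one-step Gram-Schmidt

print(np.linalg.norm(x_k1), x @ x_k1)               # ~1 and ~0: unit, orthogonal to x
print(np.linalg.norm(np.cos(phi1) * x + np.sin(phi1) * x_k1 - y))   # ~0: recovers y
```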

Done. That extremal trick of looking for the minimum distance is really the key to it all.
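In fact the proof above is a recursive algorithm. Below is a numpy sketch that mirrors it, under the simplifying assumption $A\cap B=\{0\}$ so that only Case 2 ever occurs (note that $A'\cap B'\subseteq A\cap B$, so the assumption persists down the recursion); the function name and interface are my own, not anything canonical.

```python
import numpy as np

def principal_system(P, Q, tol=1e-10):
    """Mirror of the inductive construction above (Case 2 only).

    P, Q: (n, k) arrays with orthonormal columns spanning A and B,
    assumed to satisfy A ∩ B = {0}.
    Returns (X, phi): X has 2k orthonormal columns, phi has k angles,
    X[:, :k] spans A, and the vectors
    cos(phi[j]) X[:, j] + sin(phi[j]) X[:, k + j], j = 0, ..., k-1, span B.
    """
    n, k = P.shape
    if k == 0:                                        # base case of the induction
        return np.zeros((n, 0)), np.zeros(0)

    # The minimum-distance pair: maximize <x, y>, i.e. take the top singular
    # pair of P^T Q.  Then <x, y> = s[0] >= 0 automatically.
    U, s, Vt = np.linalg.svd(P.T @ Q)
    x, y = P @ U[:, 0], Q @ Vt[0]

    def perp_basis(M, v):
        """Orthonormal basis of (col M) ∩ v⊥, for a unit vector v in col M."""
        W = M - np.outer(v, v @ M)                    # project the basis onto v⊥
        B_, sv, _ = np.linalg.svd(W, full_matrices=False)
        return B_[:, sv > tol]

    # Induction hypothesis, applied to A' = A ∩ x⊥ and B' = B ∩ y⊥.
    Xs, phis = principal_system(perp_basis(P, x), perp_basis(Q, y), tol)

    c = x @ y                                         # in [0, 1) since A ∩ B = {0}
    phi1 = np.arccos(c)
    x_k1 = (y - c * x) / np.sqrt(1 - c * c)           # one-step Gram-Schmidt (Case 2)

    m = phis.size                                     # m = k - 1
    X = np.column_stack([x, Xs[:, :m], x_k1, Xs[:, m:]])
    return X, np.concatenate([[phi1], phis])

# Small demo: random subspaces of R^7 (generically A ∩ B = {0}).
rng = np.random.default_rng(2)
n, k = 7, 3
P, _ = np.linalg.qr(rng.standard_normal((n, k)))
Q, _ = np.linalg.qr(rng.standard_normal((n, k)))
X, phi = principal_system(P, Q)
print(np.allclose(X.T @ X, np.eye(2 * k)))            # orthonormal system
C = X[:, :k] * np.cos(phi) + X[:, k:] * np.sin(phi)   # candidate basis of B
print(np.allclose(Q @ (Q.T @ C), C))                  # its columns lie in B
```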


Alternatively, here is a construction via the singular value decomposition. Start with arbitrary orthonormal bases $\vec{p}_1$, ..., $\vec{p}_k$ and $\vec{q}_1$, ..., $\vec{q}_k$ of $A$ and $B$ respectively. Let $P$ and $Q$ be the matrices with columns $\vec{p}_i$ and $\vec{q}_j$, so $A$ and $B$ are the images of $P$ and $Q$ and $P^T P = Q^T Q = \mathrm{Id}_k$.

Let $G = P^T Q$, so $G_{ij} = \vec{p}_i \cdot \vec{q}_j$. Compute the singular value decomposition $G = U^T \mathrm{diag}(\sigma_1, \sigma_2,\ldots, \sigma_k) V$, where $U$ and $V$ are $k \times k$ orthogonal matrices. Replace $P$ by $P' := PU^T$ and $Q$ by $Q' := QV^T$, with columns $\vec{p}'_i$ and $\vec{q}'_j$. Then the $\vec{p}'_i$ and $\vec{q}'_j$ are likewise orthonormal bases of $A$ and $B$, and now we have $$\vec{p}'_i \cdot \vec{q}'_j = \begin{cases} \sigma_j & i =j \\ 0 & i \neq j \\ \end{cases}.$$
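A quick numpy check of this step (numpy's `svd` returns $G = u\,\mathrm{diag}(\sigma)\,v_h$, so in the notation above $U^T = u$ and $V = v_h$):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 6, 3
P, _ = np.linalg.qr(rng.standard_normal((n, k)))   # orthonormal basis of A
Q, _ = np.linalg.qr(rng.standard_normal((n, k)))   # orthonormal basis of B

G = P.T @ Q                                        # G_ij = p_i . q_j
u, sigma, vh = np.linalg.svd(G)                    # G = u @ diag(sigma) @ vh

Pp = P @ u                                         # P' = P U^T  (U^T = u)
Qp = Q @ vh.T                                      # Q' = Q V^T  (V = vh)

print(np.allclose(Pp.T @ Pp, np.eye(k)), np.allclose(Qp.T @ Qp, np.eye(k)))   # still orthonormal bases
print(np.allclose(Pp.T @ Qp, np.diag(sigma)))      # p'_i . q'_j = sigma_j if i = j, else 0
```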

Since singular values are nonnegative by definition and $\sigma_j$ is the dot product of two unit vectors, we have $0 \leq \sigma_j \leq 1$. Assuming $A \cap B=(0)$, which I believe the OP intended, we have $\sigma_j < 1$: if some $\sigma_j$ equaled $1$, then $\vec{p}'_j = \vec{q}'_j$ would be a nonzero vector in $A \cap B$.

Let $\theta_j = \cos^{-1} (\sigma_j)$, so $0 < \theta_j \leq \tfrac{\pi}{2}$; this $\theta_j$ is the angle between $\vec{p}'_j$ and $\vec{q}'_j$. Put $$\vec{q}'_j = (\cos \theta_j) \vec{p}'_j + (\sin \theta_j) \vec{r}_j, $$ which determines the vector $\vec{r}_j$, since $\sin \theta_j > 0$.

Then $\vec{p}'_1$, ..., $\vec{p}'_k$, $\vec{r}_1$, ..., $\vec{r}_k$ form an orthonormal system (a short computation using $\vec{p}'_i \cdot \vec{q}'_j = \delta_{ij}\sigma_j$ checks this), and $\vec{q}'_j$ is a linear combination of $\vec{p}'_j$ and $\vec{r}_j$, as desired.
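Putting the whole construction together as a self-contained numpy sketch (repeating the setup from the previous snippet; random subspaces intersect trivially with probability one, so the $A \cap B = (0)$ assumption holds here):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 6, 3
P, _ = np.linalg.qr(rng.standard_normal((n, k)))       # orthonormal basis of A
Q, _ = np.linalg.qr(rng.standard_normal((n, k)))       # orthonormal basis of B

u, sigma, vh = np.linalg.svd(P.T @ Q)
Pp, Qp = P @ u, Q @ vh.T                               # P', Q' as above

theta = np.arccos(np.clip(sigma, 0.0, 1.0))            # principal angles, in (0, pi/2]
R = (Qp - Pp * np.cos(theta)) / np.sin(theta)          # r_j = (q'_j - cos(theta_j) p'_j) / sin(theta_j)

PR = np.column_stack([Pp, R])
print(np.allclose(PR.T @ PR, np.eye(2 * k)))           # p'_1,...,p'_k, r_1,...,r_k orthonormal
print(np.allclose(Pp * np.cos(theta) + R * np.sin(theta), Qp))   # recovers the q'_j
```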