Prob. 8, Sec. 3.5 in Erwin Kreyszig's Introductory Functional Analysis with Applications
Hint: note that, with the norm defined via the inner product, we have $$ \left\| \sum_{k=1}^N \langle x,e_k \rangle e_k \right\|^2 = \sum_{k=1}^N |\langle x,e_k \rangle|^2 $$ because the vectors $e_k$ are orthonormal. Also, note that for all $N$, $$ \|x\|^2 \geq \left\| \sum_{k=1}^N \langle x, e_k \rangle e_k \right\|^2 $$ We now know that the sum $\sum_{k=1}^\infty |\langle x,e_k \rangle|^2$ converges. Since $H$ is complete, this forces the partial sums $s_N = \sum_{k=1}^N \langle x,e_k \rangle e_k$ to be Cauchy (because $\|s_N - s_M\|^2 = \sum_{k=M+1}^N |\langle x,e_k \rangle|^2$ for $N > M$), so the series $\sum_{k=1}^\infty \langle x,e_k \rangle e_k$ converges in $H$. However, we must now show that its limit is $x$. To do this, it suffices to show that $x - \sum_{k=1}^\infty \langle x,e_k \rangle e_k$ is orthogonal to each $e_k$.
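The orthogonality check at the end can be written out explicitly; the only ingredient beyond orthonormality is the continuity of the inner product, which lets the limit pass outside. For each fixed $k$, $$ \left\langle x - \sum_{j=1}^\infty \langle x,e_j \rangle e_j,\; e_k \right\rangle = \langle x,e_k \rangle - \sum_{j=1}^\infty \langle x,e_j \rangle \langle e_j,e_k \rangle = \langle x,e_k \rangle - \langle x,e_k \rangle = 0, $$ since $\langle e_j,e_k \rangle = \delta_{jk}$.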
The orthogonal projection $P_{N}x$ of $x$ onto the subspace $M_{N}$ spanned by $\{ e_1,e_2,\ldots,e_N\}$ is given by $P_{N}x=\sum_{n=1}^{N}(x,e_n)e_n$. The orthogonal projection onto $M_{N}$ is the same as the closest-point projection onto $M_{N}$ (just like in the good ol' days of your Calculus class.) Therefore $$ \|x-P_{N}x\| \le \|x-(\alpha_1 e_1 + \cdots +\alpha_N e_N)\| $$ holds for all choices of scalars $\{\alpha_n\}_{n=1}^{N}$. The orthogonal (equivalently, closest-point) projection onto a larger subspace is at least as close. Hence, $$ \|x-P_{N'}x\| \le \|x-P_{N}x\| \le \|x-(\alpha_1 e_1 + \cdots +\alpha_N e_N)\|,\;\;\; N' \ge N. $$ Therefore, if you can approximate $x$ to within a distance of $\epsilon$ by some finite linear combination $m \in M = \operatorname{span}\{e_n : n \ge 1\}$, then the orthogonal series is within $\epsilon$ of $x$ for large enough $N$.
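Spelled out, the final step chains the two inequalities above: if $m = \alpha_1 e_1 + \cdots + \alpha_N e_N$ satisfies $\|x-m\| < \epsilon$, then for every $N' \ge N$ $$ \|x-P_{N'}x\| \le \|x-P_{N}x\| \le \|x-m\| < \epsilon, $$ so $\|x - P_{N'}x\| < \epsilon$ for all $N' \ge N$, i.e. $P_{N}x \to x$ whenever $x$ lies in the closure of $M$.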