Show that for an $n$-times continuously differentiable function there exists a polynomial whose $k$-th derivatives ($0\leq k\leq n$) uniformly approximate the corresponding derivatives of the function
We may assume $[a,b]=[-1,1]$; furthermore we assume that $f\in C^n\bigl([-1,1]\bigr)$, where $n\geq0$. We argue by induction on $n$.
When $n=0$ (i.e., $f$ is continuous on $[-1,1]$) and an $\epsilon>0$ is given, the Stone–Weierstrass theorem provides a polynomial $p$ such that $$|f(x)-p(x)|<\epsilon\qquad(-1\leq x\leq 1)\ .$$ (Such a polynomial has nothing to do with any Taylor expansion the function $f$ might have.)
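As a numerical illustration of the base case (not part of the proof), one concrete way to produce such a polynomial is a least-squares Chebyshev fit; here the continuous but non-smooth function $f(x)=|x|$ serves as an assumed example, and the sup-norm error shrinks as the degree grows:

```python
import numpy as np

# Illustration: approximate the continuous function f(x) = |x| on [-1, 1]
# by Chebyshev polynomial fits of increasing degree.  Stone-Weierstrass
# guarantees that for any epsilon some polynomial gets within epsilon;
# a least-squares Chebyshev fit is one concrete (not optimal) choice.
f = np.abs
x = np.linspace(-1.0, 1.0, 2001)

def sup_error(degree):
    p = np.polynomial.Chebyshev.fit(x, f(x), degree)
    return np.max(np.abs(f(x) - p(x)))

errors = [sup_error(d) for d in (4, 16, 64)]
print(errors)  # the error decreases as the degree grows
```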
Assume now that the statement is true for $n-1\geq0$ and that $f\in C^n\bigl([-1,1]\bigr)$. Applying it to $f'$ we obtain a polynomial $p$ whose derivatives up to order $n-1$ approximate the corresponding derivatives of $f'$: $$|(f')^{(k)}(x)- p^{(k)}(x)|<\epsilon \qquad(0\leq k\leq n-1)\ .\tag{1}$$ We now define the polynomial $P$ by $$P(x):=f(0)+\int_0^x p(t)\ dt\qquad(-1\leq x\leq 1)\ .$$ Then $$|f(x)-P(x)|\leq\left|\int_0^x \bigl(f'(t)-p(t)\bigr)\ dt\right|\leq |x|\sup_{-1\leq t\leq1}|f'(t)-p(t)|<\epsilon\ ,$$ and since $P^{(k+1)}=p^{(k)}$ while $(f')^{(k)}=f^{(k+1)}$, the estimates $(1)$ show that $P$ satisfies the stated requirements for $f$.
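The inductive step can be sketched numerically for $n=1$ (a hypothetical example, with $f=\exp$ assumed for concreteness): fit a polynomial $p$ to $f'$, then take $P(x)=f(0)+\int_0^x p(t)\,dt$, so that $P(0)=f(0)$ and $P'=p$:

```python
import numpy as np

# Sketch of the inductive step for n = 1 with f(x) = exp(x), an assumed
# example (so f' = exp as well): approximate f' by a polynomial p, then
# define P(x) = f(0) + integral_0^x p(t) dt.  Then P' = p approximates
# f', and by the integral estimate P approximates f at least as well.
x = np.linspace(-1.0, 1.0, 2001)
f = fprime = np.exp

p = np.polynomial.Chebyshev.fit(x, fprime(x), 10)  # p ~ f'
P = p.integ(lbnd=0.0) + f(0.0)                     # P(0) = f(0), P' = p

err_f  = np.max(np.abs(f(x) - P(x)))
err_fp = np.max(np.abs(fprime(x) - p(x)))
print(err_f, err_fp)  # both errors are small
```

The integration is done on the polynomial itself (`integ` with lower bound $0$), mirroring the definition of $P$ in the proof.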
Without the assumption that $f^{(n)}$ is continuous the stated claim is false: a sequence of polynomials $(q_k)_{k\geq1}$ converging uniformly to some $g$ on the interval $[-1,1]$ forces $g$ to be continuous to begin with. As an example consider the function $$f(x):=\cases{x^2\sin{1\over x}\quad&$(0<|x|\leq 1)$ \cr 0&$(x=0)$ .\cr}$$ This function is differentiable on all of $[-1,1]$, but as $$f'(x)=2x\sin{1\over x}-\cos{1\over x}\qquad(x\ne0)$$ the derivative $f'$ oscillates between values arbitrarily close to $+1$ and $-1$ in every neighborhood of $0$, while any polynomial is continuous at $0$; hence no polynomial can approximate $f'$ with an error $<1$ in the $\sup$-norm.
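This obstruction is visible numerically (an illustration, not a proof): sampling $f'$ at the points $x_k=1/(k\pi)$, where $f'(x_k)=-(-1)^k$ alternates in sign, the sup-norm error of polynomial fits of growing degree never drops below $1$:

```python
import numpy as np

# Counterexample illustration: f'(x) = 2x sin(1/x) - cos(1/x) takes
# values near +1 and -1 at the alternating points x_k = 1/(k*pi)
# arbitrarily close to 0, while a polynomial is continuous at 0, so the
# sup-norm fitting error stays (approximately) at least 1.
def fprime(x):
    return 2.0 * x * np.sin(1.0 / x) - np.cos(1.0 / x)

k = np.arange(1, 400)
xs = 1.0 / (k * np.pi)                     # f'(x_k) = -(-1)^k
grid = np.concatenate([-xs, xs, np.linspace(0.05, 1.0, 500)])

for degree in (5, 20, 50):
    p = np.polynomial.Chebyshev.fit(grid, fprime(grid), degree)
    err = np.max(np.abs(fprime(grid) - p(grid)))
    print(degree, err)  # err does not drop below ~1
```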