How to prove that $|f'(x)|\le\frac{A}{2}$ on $[0,1]$ when $f(0)=f(1)=0$ and $|f''|\le A$? (Real analysis)
Fix $x\in(0,1)$ and write two Taylor expansions about $x$ with the Lagrange form of the remainder:
\begin{eqnarray}
0&=&f(0)=f(x)+f'(x)(0-x)+\frac{f''(a)}{2}x^2,\\
0&=&f(1)=f(x)+f'(x)(1-x)+\frac{f''(b)}{2}(1-x)^2,
\end{eqnarray}
where $a\in(0,x)$ and $b\in(x,1)$; in particular $a,b\in(0,1)$, hence $|f''(a)|\le A$ and $|f''(b)|\le A$. Subtracting the first equation from the second, the $f(x)$ terms cancel and the coefficient of $f'(x)$ becomes $(1-x)+x=1$, so
$$
f'(x)=\frac{f''(a)}{2}x^2-\frac{f''(b)}{2}(1-x)^2.
$$
Now estimate
$$
|f'(x)|\le\frac{A}{2}\bigl(\underbrace{x^2+(1-x)^2}_{=\,1-2x(1-x)\,\le\,1}\bigr)\le\frac{A}{2}.
$$
The function $f'$ is continuous, so the estimate extends from $(0,1)$ to the closed interval $[0,1]$.
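As a quick sanity check (my own illustration, not part of the argument above), the constant $\frac{A}{2}$ cannot be improved. Take the test function $f(x)=x(1-x)$:
$$
f(0)=f(1)=0,\qquad f''(x)=-2\ \Rightarrow\ A=2,\qquad f'(x)=1-2x,\qquad |f'(0)|=|f'(1)|=1=\frac{A}{2}.
$$
So the bound $\frac{A}{2}$ is attained at the endpoints, i.e. the estimate is sharp.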