Taylor series for $\sqrt{x}$?

Short answer: The Taylor series of $\sqrt x$ at $x_0 = 0$ does not exist because $\sqrt x$ is not differentiable at $0$. For any $x_0 > 0$, the Taylor series of $\sqrt x$ at $x_0$ can be computed using the Taylor series of $\sqrt{1 + u}$ at $u_0 = 0$.


Long answer: The Taylor series of a function $f$ that is infinitely differentiable at a point $x_0$ is defined as

$$ \sum_{n=0}^\infty \frac{f^{(n)}(x_0)}{n!}(x-x_0)^n = f(x_0) + \frac{f'(x_0)}{1!}(x-x_0) + \frac{f''(x_0)}{2!}(x-x_0)^2 + \ldots \quad . $$ Therefore:

  • Asking for "the Taylor series of $f$" only makes sense if you specify the point $x_0$. (Often this point is implicitly taken to be $x_0 = 0$; in that case the series is also called the Maclaurin series of $f$.)
  • The Taylor series of $f$ at $x_0$ is only defined if $f$ is infinitely differentiable at $x_0$. (But the Taylor series need not converge for any $x \ne x_0$, and even if it converges in a neighborhood of $x_0$, its sum can differ from the given function $f$; see the example below.)
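A classical example of the last caveat: the function $$ f(x) = \begin{cases} e^{-1/x^2} & x \ne 0 \\ 0 & x = 0 \end{cases} $$ is infinitely differentiable on all of $\mathbb{R}$ with $f^{(n)}(0) = 0$ for every $n$, so its Taylor series at $x_0 = 0$ is the zero series. That series converges everywhere, but it agrees with $f$ only at $x = 0$.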

Each Taylor series is a power series $\sum_{n=0}^\infty a_n (x-x_0)^n$, and the connection between the two notions is roughly the following: If there exists a power series such that $$ f(x) = \sum_{n=0}^\infty a_n (x-x_0)^n \text{ in a neighborhood of $x_0$,} $$ then

  • $f$ is infinitely differentiable at $x_0$, and
  • $a_n = \frac{f^{(n)}(x_0)}{n!}$ for all $n$, i.e. the power series is exactly the Taylor series (a concrete example follows this list).
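For a concrete instance of this correspondence, take the geometric series: $$ \frac{1}{1-x} = \sum_{n=0}^\infty x^n \quad \text{for } |x| < 1, $$ so $a_n = 1$ for all $n$, and indeed $f(x) = \frac{1}{1-x}$ satisfies $f^{(n)}(x) = \frac{n!}{(1-x)^{n+1}}$, hence $\frac{f^{(n)}(0)}{n!} = 1$.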

Now applying that to your question: You are asking for the Taylor series of $f(x) = \sqrt{x}$. If you meant the Taylor series at $x_0 = 0$: It is not defined, because $\sqrt{x}$ is not differentiable at $x_0 = 0$. For the same reason, there is no power series which converges to $f$ in a neighborhood of $0$: a convergent power series is infinitely differentiable on the interior of its interval of convergence, and $\sqrt{x}$ is not even defined for $x < 0$ over the reals.

But $f(x) = \sqrt{x}$ can be developed into a Taylor series at any $x_0 > 0$. The general formula is given in Mhenni Benghorbal's answer. The reason that books often give only the Taylor series for $\sqrt{1 + x}$ is that – for the square-root function – the general case can easily be reduced to this special case: $$ \sqrt {\mathstrut x} = \sqrt {\mathstrut x_0 + x - x_0} = \sqrt {\mathstrut x_0}\sqrt {1 + \frac {\mathstrut x-x_0}{x_0}} $$ and now you can use the Taylor series of $\sqrt{1+u}$ at $u_0 = 0$ with $u = \frac{x - x_0}{x_0}$.
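For instance, at $x_0 = 1$ (so $u = x - 1$) the first few terms work out to $$ \sqrt{x} = 1 + \frac{x-1}{2} - \frac{(x-1)^2}{8} + \frac{(x-1)^3}{16} - \frac{5(x-1)^4}{128} \pm \dots, $$ valid for $|x - 1| < 1$.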

The same "trick" works for the more general power functions $g(x) = x^\alpha$, because $g(x) = g(x_0) \cdot g\left(1 + \frac{x - x_0}{x_0}\right)$ for $x_0 > 0$; the Taylor series of $g(1+u) = (1+u)^\alpha$ at $u_0 = 0$ is the binomial series given below.
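For the record, the binomial series reads $$ (1+u)^\alpha = \sum_{n=0}^\infty \binom{\alpha}{n} u^n \quad \text{for } |u| < 1, \qquad \text{where } \binom{\alpha}{n} = \frac{\alpha(\alpha-1)\cdots(\alpha-n+1)}{n!}; $$ taking $\alpha = \frac12$ gives the coefficients of the expansion of $\sqrt{1+u}$ used above.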


I assume you are talking about the Taylor series at $0$ for $\sqrt{x}$. Let's try to compute the Taylor series at $0$: $$ f(x)=f(0)+f'(0)(x-0)+f''(0)\frac{(x-0)^2}2+\dots $$ $f(0)=0$, but $f'(x)=\frac1{2\sqrt{x}}$ blows up at $x=0$. Since $\sqrt{x}$ doesn't have a first derivative at $0$, it doesn't have a Taylor series there.
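Indeed, even the (one-sided) difference quotient at $0$ diverges: $$ \lim_{x \to 0^+} \frac{\sqrt{x} - \sqrt{0}}{x - 0} = \lim_{x \to 0^+} \frac{1}{\sqrt{x}} = +\infty. $$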


Note: Strictly speaking, what is proved below is that $\sqrt{x}$ cannot have an asymptotic expansion of the form $a_0 + a_1 x + o(x)$ as $x \to 0$.

There is no Taylor series for it at $0$. If there were, it would be $$\sqrt{x} = a_0 + a_1 x + a_2 x^2 + \dots.$$

Obviously, $a_0$ would have to be $0$ (let $x \to 0^+$ in the expansion), but $\sqrt{x}$ is much larger as $x \to 0$ than any expansion starting with $a_1 x$. For example, substituting the expansion for the $\sqrt{x}$ in the numerator, we'd have $$\frac{1}{\sqrt{x}} = \frac{\sqrt{x}}{x} = a_1 + a_2 x + \dots \rightarrow a_1$$ as $x \to 0^+$, but $\frac{1}{\sqrt{x}}$ doesn't have a finite limit as $x \to 0^+$.

On the other hand, it's easy to obtain the Taylor expansion for $\sqrt{x}$ at $a > 0$ from the one for $\sqrt{1 + x}$ at $0$. Setting $h = x - a$, you have $$\sqrt{x} = \sqrt{a + h} = \sqrt{a}\sqrt{1 + h/a},$$ and then you expand $\sqrt{1 + h/a}$ in powers of $h/a$.
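Written out, the first few terms of that expansion are $$ \sqrt{a + h} = \sqrt{a}\left(1 + \frac{h}{2a} - \frac{h^2}{8a^2} + \frac{h^3}{16a^3} - \dots\right), \quad |h| < a, $$ which is just the binomial series specialized to the exponent $\frac12$.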