The asymptotic expansion of an integral of an exponential function

The minimum of $1-u^2$ on $[0,1]$ occurs at $u=1$, and near there we have $1-u^2 \approx 2(1-u)$, which is linear in the quantity $1-u$. This suggests the change of variables $v = 1-u^2$, which makes the exponent exactly linear in $v$; since $u = \sqrt{1-v}$ and $du = -\tfrac{1}{2}(1-v)^{-1/2}\,dv$, this gives

$$ \int_0^1 e^{-x(1-u^2)}\,du = \frac{1}{2} \int_{0}^{1} e^{-xv} (1-v)^{-1/2}\,dv. $$
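As a quick numerical sanity check of the substitution (a sketch, not part of the argument — `midpoint` is an illustrative helper): a composite midpoint rule is used because the substituted integrand has an integrable singularity at $v=1$, which an endpoint-based rule would evaluate directly.

```python
import math

def midpoint(f, a, b, n):
    """Composite midpoint rule; never evaluates f at the endpoints."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

x = 5.0

# Original form: integral of exp(-x(1-u^2)) over u in [0, 1].
lhs = midpoint(lambda u: math.exp(-x * (1.0 - u * u)), 0.0, 1.0, 200_000)

# After v = 1 - u^2: (1/2) * integral of exp(-x v) (1-v)^(-1/2) over v in [0, 1].
rhs = 0.5 * midpoint(lambda v: math.exp(-x * v) / math.sqrt(1.0 - v),
                     0.0, 1.0, 200_000)

print(lhs, rhs)  # the two values agree to several decimal places
```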

Then, following Watson's lemma, we can get the asymptotic expansion by expanding the remaining factor, $(1-v)^{-1/2}$, in powers of $v$ around $v=0$, extending the upper limit of integration to $\infty$, and integrating term by term:

$$ \begin{align} \int_0^1 e^{-x(1-u^2)}\,du &= \frac{1}{2} \int_{0}^{1} e^{-xv} (1-v)^{-1/2}\,dv \\ &\approx \frac{1}{2} \sum_{k=0}^{\infty} \binom{-1/2}{k} (-1)^k \int_0^\infty e^{-xv} v^k\,dv \\ &= \frac{1}{2} \sum_{k=0}^{\infty} \binom{-1/2}{k} \frac{(-1)^k k!}{x^{k+1}} \end{align} $$

as $x \to \infty$, which matches the series given in the other answer.
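To illustrate numerically (a sketch with illustrative values $x=10$ and four terms): the truncated series already matches the integral closely. The coefficients use the identity $\binom{-1/2}{k}(-1)^k = \binom{2k}{k}/4^k$, and `simpson` is an illustrative helper.

```python
import math

def simpson(f, a, b, n):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4.0 * sum(f(a + (2 * i + 1) * h) for i in range(n // 2))
    s += 2.0 * sum(f(a + 2 * i * h) for i in range(1, n // 2))
    return s * h / 3.0

x = 10.0
integral = simpson(lambda u: math.exp(-x * (1.0 - u * u)), 0.0, 1.0, 2000)

# Partial sum (1/2) * sum_k binom(-1/2, k) (-1)^k k! / x^(k+1),
# using binom(-1/2, k) (-1)^k = binom(2k, k) / 4^k.
N = 3
series = 0.5 * sum(math.comb(2 * k, k) / 4**k * math.factorial(k) / x**(k + 1)
                   for k in range(N + 1))

print(integral, series)
```

The difference is of the size of the first omitted term, as the proof below of Watson's lemma predicts.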


Let's prove the special case of Watson's lemma we use in this answer. We'll assume that $g(v)$ is analytic at $v=0$ and that $\int_0^a \lvert g(v) \rvert\,dv$ exists (these are certainly true for $g(v) = (1-v)^{-1/2}$), and show that

$$ I(x) = \int_0^a e^{-xv}g(v)\,dv \approx \sum_{k=0}^{\infty} \frac{g^{(k)}(0)}{x^{k+1}} $$

as $x \to \infty$. In other words, we will show that the asymptotic expansion for the integral can be obtained by expanding $g(v)$ in a Taylor series around $v=0$ and integrating term by term.

Since $g(v)$ is analytic at $v=0$, there is a $\delta \in (0,a]$ such that $g$ is analytic on $[0,\delta]$; in particular, every derivative $g^{(k)}$ exists and is continuous there. Further, Taylor's theorem tells us that for any positive integer $N$ and any $v \in [0,\delta]$, there is a $v^* \in [0,v]$ for which

$$ g(v) = \sum_{k=0}^{N} \frac{g^{(k)}(0)}{k!} v^k + \frac{g^{(N+1)}(v^*)}{(N+1)!} v^{N+1}. \tag{1} $$

We will split the integral $I(x)$ at this $\delta$, and estimate each piece separately. To this end, we write

$$ I(x) = \int_0^\delta e^{-xv}g(v)\,dv + \int_\delta^a e^{-xv}g(v)\,dv = I_1(x) + I_2(x). $$

We only need a rough estimate on $I_2(x)$:

$$ \lvert I_2(x) \rvert \leq \int_\delta^a e^{-xv} \lvert g(v) \rvert \,dv \leq e^{-\delta x} \int_0^a \lvert g(v) \rvert\,dv, $$

where the last integral on the right is finite by assumption. Thus

$$ I(x) = I_1(x) + O(e^{-\delta x}) $$

as $x \to \infty$.
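As a numerical sketch of this exponential-smallness estimate, with the concrete data $g(v) = (1-v)^{-1/2}$, $a=1$, $\delta = \tfrac12$, and $x=20$ (a composite midpoint rule is used because of the integrable singularity at $v=1$; `midpoint` is an illustrative helper):

```python
import math

def midpoint(f, a, b, n):
    """Composite midpoint rule; never evaluates f at the endpoints."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

x, delta = 20.0, 0.5
g = lambda v: 1.0 / math.sqrt(1.0 - v)

# The tail piece I_2 = integral of e^{-xv} g(v) from delta to 1 ...
i2 = midpoint(lambda v: math.exp(-x * v) * g(v), delta, 1.0, 100_000)

# ... is bounded by e^{-delta x} * integral of |g| from 0 to 1,
# and here the integral of (1-v)^(-1/2) over [0, 1] is exactly 2.
bound = 2.0 * math.exp(-delta * x)

print(i2, bound)  # i2 is well below the bound
```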

Now, from $(1)$ we have

$$ I_1(x) = \sum_{k=0}^{N} \frac{g^{(k)}(0)}{k!} \int_0^\delta e^{-xv} v^k \,dv + \frac{1}{(N+1)!} \int_0^\delta e^{-xv} g^{(N+1)}(v^*) v^{N+1}\,dv. $$

The last integral can be bounded by

$$ \left\lvert \int_0^\delta e^{-xv} g^{(N+1)}(v^*) v^{N+1}\,dv \right\rvert \leq \left( \sup_{0 < v < \delta} \left\lvert g^{(N+1)}(v) \right\rvert \right) \int_0^\infty e^{-xv} v^{N+1}\,dv = \frac{\text{const.}}{x^{N+2}}. $$

Thus

$$ I_1(x) = \sum_{k=0}^{N} \frac{g^{(k)}(0)}{k!} \int_0^\delta e^{-xv} v^k \,dv + O\!\left(x^{-N-2}\right) $$

as $x \to \infty$. Finally we reattach the tails to the integrals in the sum,

$$ \begin{align} \int_0^\delta e^{-xv} v^k \,dv &= \int_0^\infty e^{-xv} v^k\,dv - \int_\delta^\infty e^{-xv} v^k\,dv \\ &= \frac{k!}{x^{k+1}} + O\!\left(e^{-\delta x}\right), \end{align} $$

and substitute these into the sum to get

$$ I_1(x) = \sum_{k=0}^{N} \frac{g^{(k)}(0)}{x^{k+1}} + O\!\left(x^{-N-2}\right) $$

and hence

$$ I(x) = \sum_{k=0}^{N} \frac{g^{(k)}(0)}{x^{k+1}} + O\!\left(x^{-N-2}\right) $$

as $x \to \infty$. Since $N$ was arbitrary, this is precisely the statement that $I(x)$ has the asymptotic expansion

$$ I(x) \approx \sum_{k=0}^{\infty} \frac{g^{(k)}(0)}{x^{k+1}} $$

as $x \to \infty$.
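As a numerical sketch of the conclusion: taking $g(v) = (1-v)^{-1/2}$, so that $g^{(k)}(0) = k!\binom{2k}{k}/4^k$, and evaluating $I$ through the smooth $u$-form from the first section, $I = 2\int_0^1 e^{-x(1-u^2)}\,du$ (which avoids the endpoint singularity), the error of the $N$-term truncation should shrink by roughly $2^{N+2}$ when $x$ doubles. `simpson` is an illustrative helper.

```python
import math

def simpson(f, a, b, n):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4.0 * sum(f(a + (2 * i + 1) * h) for i in range(n // 2))
    s += 2.0 * sum(f(a + 2 * i * h) for i in range(1, n // 2))
    return s * h / 3.0

def I(x, n=20_000):
    # I = integral of e^{-xv} (1-v)^{-1/2} over [0,1]
    #   = 2 * integral of e^{-x(1-u^2)} over [0,1]  (via v = 1 - u^2).
    return 2.0 * simpson(lambda u: math.exp(-x * (1.0 - u * u)), 0.0, 1.0, n)

def partial_sum(x, N):
    # sum over k <= N of g^(k)(0) / x^(k+1), with g^(k)(0) = k! C(2k,k) / 4^k.
    return sum(math.factorial(k) * math.comb(2 * k, k) / 4**k / x**(k + 1)
               for k in range(N + 1))

N = 3
r20 = abs(I(20.0) - partial_sum(20.0, N))
r40 = abs(I(40.0) - partial_sum(40.0, N))
print(r20, r40, r20 / r40)  # ratio roughly of order 2**(N+2) = 32
```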


The integral can also be expressed in closed form in terms of the Dawson function $F(y) = e^{-y^2}\int_0^y e^{t^2}\,dt$. Writing $f(x) = \int_0^1 e^{-x(1-u^2)}\,du$, we have

$$ f(x) = \frac{F(\sqrt{x})}{\sqrt{x}} \sim \frac{1}{2x}+\frac{1}{4x^2}+\frac{3}{8x^3} + \cdots $$

as $x \to \infty$. (Mathematica was used to perform the asymptotic expansion.)
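As a numerical sketch of the closed form, using the standard definition $F(y) = e^{-y^2}\int_0^y e^{t^2}\,dt$ and an illustrative value $x=4$ (`simpson` and `dawson` are illustrative helpers):

```python
import math

def simpson(f, a, b, n):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4.0 * sum(f(a + (2 * i + 1) * h) for i in range(n // 2))
    s += 2.0 * sum(f(a + 2 * i * h) for i in range(1, n // 2))
    return s * h / 3.0

def dawson(y, n=2000):
    """Dawson function F(y) = exp(-y^2) * integral of exp(t^2) over [0, y]."""
    return math.exp(-y * y) * simpson(lambda t: math.exp(t * t), 0.0, y, n)

x = 4.0
f_direct = simpson(lambda u: math.exp(-x * (1.0 - u * u)), 0.0, 1.0, 2000)
f_closed = dawson(math.sqrt(x)) / math.sqrt(x)
print(f_direct, f_closed)  # the two values agree to high accuracy
```

The identity follows by writing $f(x) = e^{-x}\int_0^1 e^{xu^2}\,du$ and substituting $t = u\sqrt{x}$.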