Links between difference and differential equations?

First order example

Consider the difference equation $$\begin{equation*} x_{n+1} = a x_n,\tag{1} \end{equation*}$$ where $x_0$ is given, with solution $$\begin{equation*} x_n = a^n x_0. \tag{2} \end{equation*}$$

Let $x_n = x(t_n)$, where $t_{n+1} = t_n + h$ and $t_0 = 0$, so $t_n = n h$. Rewrite the difference equation (1) as $$\frac{x(t_{n}+h)-x(t_n)}{h} = \frac{(a-1)}{h} x(t_n).$$ If $h$ is small, this can be approximated by the differential equation $$\begin{equation*} x'(t) = \frac{a-1}{h}x(t),\tag{3} \end{equation*}$$ with solution $$\begin{equation*} x(t) = x(0) \exp \left(\frac{a-1}{h} t\right).\tag{4} \end{equation*}$$ Notice that $\exp\left(\frac{a-1}{h} t_n\right) = a^{n}\left(1 + O(n(a-1)^2)\right)$, so for $n\ll 1/(a-1)^2$ we find good agreement with (2).

Going in the reverse direction, from the differential equation (3) to the difference equation (1), is called Euler's method.
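For concreteness, here is a minimal Python sketch of the comparison above; the values of $a$, $h$, $x_0$, and $N$ are illustrative choices, not anything fixed by the discussion.

```python
import math

# Compare the iterated difference equation (1) with the ODE solution (4)
# evaluated at t_n = n*h, i.e. x0 * exp((a-1)*n).  The constants below
# are made up for the demo.
a, h, x0, N = 1.01, 0.1, 1.0, 50

x = x0
for n in range(N + 1):
    if n % 10 == 0:
        ode = x0 * math.exp((a - 1) / h * (n * h))   # solution (4) at t_n
        print(f"n={n:2d}: difference eq {x:.6f}, ODE {ode:.6f}")
    x = a * x                                        # one step of (1)
```

With $a-1 = 0.01$ the two columns agree to a few parts in $10^4$ out to $n = 50$, consistent with the $O(n(a-1)^2)$ estimate.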

There is a subtlety when approximating a difference equation by a differential equation. For this problem we require that $|a-1| \ll 1$ since we need $x_{n+1}- x_n = O(h) \ll 1$. Otherwise, we should expect (and will find) the approximation of (1) by (3) to be poor. Sometimes it is necessary to reformulate the difference equation so the difference between successive terms is small.${}^\dagger$

${}^\dagger$ For example, suppose $a$ in (1) were large and positive. Let $y_0=x_0$ and let $y_{n+1} = a^{1/p} y_n$, where $p$ is some large integer so $|a^{1/p} - 1|\ll 1$. Note that $y_{n p} = x_n$. The solution to the corresponding differential equation for $y$ will be a good approximation to $y_n$, and this in turn can be used to approximate $x_n$.
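A quick numerical illustration of this footnote (the values of $a$, $n$, and $p$ below are made up for the demo): the naive exponential badly overshoots $a^n$ for large $a$, while the substepped version converges to it as $p$ grows, since $p\,(a^{1/p}-1)\to\ln a$.

```python
import math

# For large a, exp((a-1)*n) is a poor stand-in for a^n, but substepping
# with ratio a**(1/p) repairs it: exp((a**(1/p)-1)*n*p) -> a^n as p grows.
a, n = 10.0, 5
print("target a^n         =", a**n)
print("naive exp((a-1)n)  =", math.exp((a - 1) * n))
for p in (1, 10, 100, 1000):
    print(f"p={p:4d}: exp((a^(1/p)-1)*n*p) =", math.exp((a**(1 / p) - 1) * n * p))
```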

Second order example

Consider the differential equation for the harmonic oscillator $$\begin{equation*} x''+x = 0, \hspace{5ex} x(0)=1, \hspace{5ex}x'(0)=0, \tag{5} \end{equation*}$$ whose solution is $x(t) = \cos t$. The simplest related difference equation is $$\begin{equation*} \frac{1}{h^2}(x_{n+2}-2x_{n+1}+x_n) + x_n = 0, \tag{6} \end{equation*}$$ with $x_0 = x_1 = 1$. (We show how to get (6) from (5) below.) There are standard techniques for solving simple recurrence relations such as (6) in closed form. We find $$\begin{equation*} x_n = \frac{1}{2} \left((1+i h)^{n}+(1 - i h)^{n}\right). \end{equation*}$$ Note that $x_n$ is real since $x_n^* = x_n$. This is the closed form of the solution a computer would find by iterating (6). Recall that $n=t_n/h$. For small $h$, $x_n = \cos t_n + \frac{1}{2} h t_n \cos t_n + O(h^2)$, so the numerical solution to (6) approximates $\cos t_n$ well for $h t_n \ll 1$. In the limit $h\to 0$, $x_n \to \cos t_n,$ as expected.
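To watch the drift term $\frac{1}{2} h t_n \cos t_n$ emerge numerically, here is a small sketch that iterates (6) directly; the step size and time horizon are arbitrary choices.

```python
import math

# Iterate the recurrence (6), i.e. x_{n+2} = 2 x_{n+1} - (1 + h^2) x_n,
# and compare the result at t = T with cos(T) and the predicted drift.
h, T = 0.01, 5.0
N = round(T / h)
x_prev, x_curr = 1.0, 1.0              # x_0 = x_1 = 1
for n in range(1, N):
    x_prev, x_curr = x_curr, 2 * x_curr - (1 + h * h) * x_prev
t = N * h
print(f"t = {t}: recurrence {x_curr:.6f}, cos(t) = {math.cos(t):.6f}, "
      f"predicted drift ~ {0.5 * h * t * math.cos(t):.6f}")
```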

Derivatives and finite differences

Morally, a difference equation is a discrete version of a differential equation, and a differential equation is a continuous version of a difference equation. Numerical integration of ODEs essentially rewrites a differential equation as a difference equation, which a computer then solves iteratively. This is a big subject with many subtleties. The method also applies to nonlinear and partial differential equations. Below we give a brief dictionary between finite difference and differential operators.

Define the shift operator $E$ such that $E x_n = x_{n+1}$. The difference operator $\Delta = E-1$ then gives $\Delta x_n = x_{n+1}-x_n$. These operators are connected to differentiation by Taylor series. Let $D x_n = x_n' = d x(t)/d t|_{t=t_n}$. Then $$x_{n+1} = E x_n = \sum_{k=0}^\infty \frac{h^k}{k!} D^k x_n = e^{h D}x_n.$$ Thus, as an operator, $$E = e^{h D},$$ and so $$D = \frac{1}{h} \ln E = \frac{1}{h} \ln(1+\Delta) = \frac{1}{h}\sum_{k=1}^\infty (-1)^{k+1} \frac{\Delta^k}{k}.$$ (Think of these operators as acting on polynomials, possibly of very high order.) This formalism gives us a way to convert any ODE into a difference equation and vice-versa. Notice that higher order derivatives can be approximated by $D^k\approx (\Delta/h)^k$. Thus, for example, $$x'' = D^2 x \rightarrow \frac{1}{h^2}\Delta^2x_n = \frac{1}{h^2}\Delta(x_{n+1}-x_n) = \frac{1}{h^2} (x_{n+2}-2 x_{n+1} + x_n).$$
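The operator identity can be checked numerically. The sketch below applies the truncated series for $D$ to samples of a polynomial; since $\Delta^k$ annihilates polynomials of degree less than $k$, the truncation becomes exact once $k$ exceeds the degree. The test polynomial, evaluation point, and step are illustrative choices.

```python
from math import comb

# Apply D = (1/h) * sum_{k>=1} (-1)^(k+1) Delta^k / k to a sampled
# polynomial.  The partial sums become exact at k = degree.
h, t0 = 0.5, 1.0
p  = lambda t: t**3 - 2*t + 1          # degree-3 test polynomial
dp = lambda t: 3*t**2 - 2              # its exact derivative

def delta_k(f, t, k):
    """k-th forward difference: Delta^k f(t) = sum_j (-1)^(k-j) C(k,j) f(t + j*h)."""
    return sum((-1)**(k - j) * comb(k, j) * f(t + j * h) for j in range(k + 1))

acc = 0.0
for k in range(1, 6):
    acc += (-1)**(k + 1) * delta_k(p, t0, k) / k
    print(f"terms through k={k}: {acc / h:.6f}   (exact derivative: {dp(t0)})")
```

Note that the one-term truncation ($D \approx \Delta/h$) is far off for this $h$, while the series is exact from $k=3$ on.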

When using Euler's method we let $D \approx \Delta/h$. We could just as well keep higher order terms in $\Delta$ to get recursion relations with three or more terms. It is a sign of the nontrivial nature of the subject that this simple change can lead to numerical instabilities. There are many named algorithms that improve upon and generalize Euler's method, building on the ideas sketched above. (See, for example, the Runge-Kutta methods, a family of robust algorithms used to numerically solve linear and nonlinear differential equations.)
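For comparison with the Euler iteration, here is a minimal sketch of the classical fourth-order Runge-Kutta method applied to (5), rewritten as the first-order system $(x,v)' = (v,-x)$; the step size and horizon are arbitrary.

```python
import math

# Classical RK4 for x'' + x = 0 as the system (x, v)' = (v, -x).
def f(x, v):
    return v, -x                       # autonomous, so no explicit t needed

h, T = 0.1, 5.0
x, v = 1.0, 0.0                        # x(0) = 1, x'(0) = 0
for _ in range(round(T / h)):
    k1x, k1v = f(x, v)
    k2x, k2v = f(x + h/2 * k1x, v + h/2 * k1v)
    k3x, k3v = f(x + h/2 * k2x, v + h/2 * k2v)
    k4x, k4v = f(x + h * k3x, v + h * k3v)
    x += h/6 * (k1x + 2*k2x + 2*k3x + k4x)
    v += h/6 * (k1v + 2*k2v + 2*k3v + k4v)
print(f"RK4: x({T}) = {x:.8f}, cos({T}) = {math.cos(T):.8f}")
```

Even with this modest step size the RK4 error is orders of magnitude smaller than that of the two-term recurrence (6).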

Addendum

This is in response to Drew N's question about seeing directly that $$D = \frac{1}{h}\sum_{k=1}^\infty (-1)^{k+1} \frac{\Delta^k}{k}$$ is the differential operator. Here we show that $D t^n = n t^{n-1}$ for $n=0,1,\ldots$, which shows that $D$ is the differential operator on polynomials of arbitrarily high degree. (It is straightforward to extend this proof to $n\in\mathbb{C}$.) We have \begin{align*} D t^n &= \frac{1}{h} \sum_{k=1}^\infty (-1)^{k+1}\frac{1}{k}\Delta^k t^n \\ &= \frac{1}{h} \sum_{k=1}^\infty (-1)^{k+1}\frac{1}{k}(E-1)^k t^n \\ &= \frac{1}{h} \sum_{k=1}^\infty (-1)^{k+1}\frac{1}{k} \sum_{j=0}^k{k\choose j}(-1)^{k-j}E^j t^n & \textrm{binomial theorem} \\ &= \frac{1}{h} \sum_{k,j} (-1)^{j+1}\frac{1}{k}{k\choose j}(t+j h)^n \\ &= \frac{1}{h} \sum_{k,j} (-1)^{j+1}\frac{1}{k}{k\choose j} \sum_{l=0}^n {n\choose l}j^l h^l t^{n-l} & \textrm{binomial theorem} \\ &= \left.\frac{1}{h} \sum_{k,j,l} (-1)^{j+1}\frac{1}{k} {k\choose j}{n\choose l} (x D)^l x^j h^l t^{n-l}\right|_{x=1} & D = d/dx, (x D)^l = \underbrace{(x D)\cdots (x D)}_{l} \\ &= \left.-\frac{1}{h} \sum_{k,l} \frac{1}{k} {n\choose l} h^l t^{n-l} (x D)^l \sum_{j=0}^k {k\choose j}(-x)^j \right|_{x=1} \\ &= \left.-\frac{1}{h} \sum_{k,l} \frac{1}{k} {n\choose l} h^l t^{n-l} (x D)^l (1-x)^k \right|_{x=1} & \textrm{binomial theorem} \\ &= \left.-\frac{1}{h} \sum_{l} {n\choose l} h^l t^{n-l} (x D)^l \sum_{k=1}^\infty \frac{1}{k} (1-x)^k \right|_{x=1} \\ &= \left.\frac{1}{h} \sum_{l} {n\choose l} h^l t^{n-l} (x D)^l \log x\right|_{x=1} & \textrm{series for natural logarithm} \\ &= \frac{1}{h} \sum_{l=0}^n {n\choose l} h^l t^{n-l} \delta_{l1} & \textrm{see below} \\ &= \frac{1}{h} {n\choose 1} h t^{n-1} \\ &= n t^{n-1}. \end{align*} Note that $(x D)x^j = j x^j$ so $$(x D)^l x^j = j^l x^j.$$ Also
\begin{align*} (x D)^0 \log x|_{x=1} &= \log 1 = 0 \\ (x D)^1 \log x &= x \frac{1}{x} = 1 \\ (x D)^2 \log x &= (x D)1 = 0 \\ (x D)^3 \log x &= (x D)^2 0 = 0. \end{align*} Thus, $$(x D)^l \log x|_{x=1} = \delta_{l1},$$ where $\delta_{ij}$ is the Kronecker delta.
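If you'd like to verify the key fact $(x D)^l \log x|_{x=1} = \delta_{l1}$ mechanically, a few lines of sympy suffice (assuming sympy is available; the range of $l$ checked is an arbitrary choice):

```python
import sympy as sp

# Repeatedly apply the operator (x D) to log x and evaluate at x = 1;
# only l = 1 gives a nonzero result.
x = sp.symbols('x')
expr = sp.log(x)
for l in range(5):
    print(f"l={l}: (xD)^{l} log x at x=1 ->", expr.subs(x, 1))
    expr = x * sp.diff(expr, x)        # apply (x D) once more
```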


Yes, obviously there is some correspondence (numerical solutions of ordinary differential equations are discrete dynamical systems, many proofs in bifurcation theory use continuous-time dynamical systems to analyze discrete original problems, etc.).

However, the most profound attempt to build a theory that unites difference and differential equations is the time scale calculus.


(Adding to user26872's answer, as this was not obvious to me; it might help someone else going through his derivation.)

The identity

$$x_{n+1} = Ex_n = \sum\limits_{k=0}^\infty \frac{h^k}{k!}D^k x_n=e^{hD}x_n$$

is true if we consider the following. Let $x_n = x(t_n)$, and assume that $x(t)$ is analytic around the point $t_n$, so that it equals its Taylor series there:

$$x(t) = \sum\limits_{k=0}^\infty \frac{x^{(k)}(t_n)}{k!}(t-t_n)^k.$$

If we now evaluate this expansion at $t = t_n+h$, we have

$$x(t_n+h) = \sum\limits_{k=0}^\infty \frac{x^{(k)}(t_n)}{k!}h^k = \sum\limits_{k=0}^\infty \frac{h^k D^k}{k!}x(t_n) = e^{hD} x(t_n)$$

and so

$$x_{n+1} = Ex_n \leftrightarrow x(t_n + h) = E x(t_n)$$

thus giving

$$E = e^{hD}.$$
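As a quick numerical sanity check of $E = e^{hD}$: the partial sums of $\sum_k \frac{h^k}{k!}x^{(k)}(t_n)$ for $x(t)=\sin t$ converge to the shifted value $\sin(t_n+h)$. The point $t_n$, step $h$, and truncation order below are arbitrary choices.

```python
import math

# Partial sums of e^{hD} x(t_n) for x(t) = sin t, whose derivatives
# cycle through (sin, cos, -sin, -cos), versus the shifted value.
t_n, h = 0.3, 0.5
derivs = [math.sin, math.cos, lambda t: -math.sin(t), lambda t: -math.cos(t)]
partial = 0.0
for k in range(8):
    partial += h**k / math.factorial(k) * derivs[k % 4](t_n)
    print(f"terms through k={k}: {partial:.8f}")
print(f"target sin(t_n + h) = {math.sin(t_n + h):.8f}")
```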