Difference between continuity and uniform continuity

First of all, continuity is defined at a point $c$, whereas uniform continuity is defined on a set $A$. That makes a big difference. But your interpretation is essentially correct: the point $c$ is part of the data and is kept fixed, just like $f$ itself. Roughly speaking, uniform continuity requires the existence of a single $\delta>0$ that works for the whole set $A$, and not just near the single point $c$.


The difference is in the ordering of the quantifiers.

  • Continuity:

For all $x$, for all $\varepsilon>0$, there exists a $\delta>0$ such that for all $y$: $|x-y|<\delta$ implies $|f(x)-f(y)|<\varepsilon$.

  • Uniform continuity:

For all $\varepsilon>0$, there exists a $\delta>0$ such that for all $x$ and all $y$: $|x-y|<\delta$ implies $|f(x)-f(y)|<\varepsilon$.

For something to be continuous, you can check "one $x$ at a time": for each $x$, you pick an $\varepsilon$ and then find some $\delta$ that depends on both $x$ and $\varepsilon$ so that $|f(x)-f(y)|<\varepsilon$ whenever $|x-y|<\delta$. As you can see if you try it on $f(x)=1/x$ on $(0,1)$, you can find such a $\delta$ for every $x$ and $\varepsilon$. However, if you fix $\varepsilon$, the values of $\delta$ that you need become arbitrarily small as $x$ approaches $0$.
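
To see this dependence concretely, note that for $f(x)=1/x$ the worst case inside $|y-c|<\delta$ is $y=c-\delta$, and solving $\frac{\delta}{c(c-\delta)}=\varepsilon$ shows that the largest workable $\delta$ at a point $c$ is $\frac{\varepsilon c^2}{1+\varepsilon c}$. Here is a minimal Python sketch of that computation (my own illustration, not part of the original argument):

```python
# For f(x) = 1/x on (0, 1), the largest delta that works at the point c
# with tolerance eps: the worst case within |y - c| < delta is y = c - delta,
# and solving delta / (c * (c - delta)) = eps gives delta = eps*c^2 / (1 + eps*c).
def largest_delta(c: float, eps: float) -> float:
    return eps * c**2 / (1 + eps * c)

eps = 0.5
for c in [0.5, 0.1, 0.01, 0.001]:
    print(f"c = {c:>5}: largest workable delta = {largest_delta(c, eps):.3e}")

# The printed deltas shrink roughly like c^2 as c -> 0, which is exactly
# why no single delta can serve every point of (0, 1).
```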

If you want uniform continuity, you need to pick an $\varepsilon$ first and then find a $\delta$ which is good for ALL the $x$ values you might have. As you can see, for $f(x)=1/x$ on $(0,1)$, such a $\delta$ does not exist.
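
A quick numerical check of this failure (again just a sketch of my own): the points $x_n=1/(n+1)$ and $y_n=1/n$ get arbitrarily close to each other, yet $|f(y_n)-f(x_n)|=1$ for every $n$, so no single $\delta$ can work once $\varepsilon<1$.

```python
# No single delta works for f(x) = 1/x on (0, 1): the points 1/n and 1/(n+1)
# are eventually closer than any delta, yet their images always differ by 1.
def f(x):
    return 1 / x

for n in [10, 100, 1000]:
    y, x = 1 / n, 1 / (n + 1)
    print(f"n = {n:>4}: |y - x| = {abs(y - x):.2e}, |f(y) - f(x)| = {abs(f(y) - f(x)):.1f}")
```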


The subtle difference between these two definitions became clearer to me when I read their equivalent sequential definitions. First take the definition of a continuous function.

Definition A function $f: D\to\mathbb{R}$ is said to be continuous at the point $x_0$ in $D$ provided that for every sequence $\{x_n\}$ in $D$ that converges to $x_0$, the image sequence $\{f(x_n)\}$ converges to $f(x_0)$.
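
As a small illustration of this criterion (my own sketch, not part of the answer), take $f(x)=x^2$ at $x_0=1$ with the sequence $x_n=1+\tfrac1n$:

```python
# Sequential criterion for continuity of f(x) = x^2 at x0 = 1:
# x_n = 1 + 1/n converges to 1, and f(x_n) converges to f(1) = 1.
def f(x):
    return x * x

for n in [1, 10, 100, 1000]:
    xn = 1 + 1 / n
    print(f"n = {n:>4}: x_n = {xn:.4f}, f(x_n) = {f(xn):.6f}")
```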

Now compare this to a uniformly continuous function.

Definition A function $f: D\to\mathbb{R}$ is said to be uniformly continuous provided that for any two sequences $\{y_n\}$ and $\{x_n\}$ in $D$ with the property $$\lim_{n\to\infty}(y_n-x_n)=0,$$ it follows that $$\lim_{n\to\infty}(f(y_n)-f(x_n))=0.$$

Notice how the second definition mentions no convergence to a point: it only asks that the terms of the two sequences become arbitrarily close to each other, in which case their images must do the same. Each sequence on its own may even diverge, so long as their difference tends to $0$.

The classic example is $f:\mathbb{R}\to\mathbb{R}$, $f(x)=x^2$, which is continuous but not uniformly continuous. Take the two sequences $\{y_n\}=\{\sqrt{n^2+1}\}$ and $\{x_n\}=\{n\}$. (Note that both sequences diverge.) To evaluate $\lim_{n\to\infty}(y_n-x_n)$, multiply numerator and denominator by the conjugate: $$\lim_{n\to\infty}(\sqrt{n^2+1}-n)=\lim_{n\to\infty}\frac{n^2+1-n^2}{\sqrt{n^2+1}+n}=\lim_{n\to\infty}\frac{1}{\sqrt{n^2+1}+n}=0.$$ Now, looking at $\lim_{n\to\infty}(f(y_n)-f(x_n))$, we get $$\lim_{n\to\infty}\bigl((\sqrt{n^2+1})^2-n^2\bigr)=\lim_{n\to\infty}(n^2+1-n^2)=1.$$ This violates the definition of uniform continuity: the difference of the function values would also have to go to $0$.
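
As a numerical sanity check of this computation (my own sketch), evaluating a few terms of both differences in Python shows $y_n-x_n$ shrinking while $f(y_n)-f(x_n)$ stays pinned at $1$:

```python
import math

# y_n = sqrt(n^2 + 1) and x_n = n: the inputs get arbitrarily close,
# but the images under f(x) = x^2 always differ by exactly 1.
def f(x):
    return x * x

for n in [1, 10, 100, 1000]:
    y, x = math.sqrt(n * n + 1), n
    print(f"n = {n:>4}: y - x = {y - x:.3e}, f(y) - f(x) = {f(y) - f(x):.6f}")
```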