Is uniform continuity related to the rate of change of the function?

Measuring rates of change without differentiability is indeed a problem. What you can measure, in a sense, are average rates of change. You can still define the average rate of change of $f$ over $[a, b]$ to be $$\frac{f(b) - f(a)}{b - a}.$$ Note that no differentiability is required here: only the values of $f$ at the two endpoints.
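As a quick sanity check, here is a minimal sketch of that quotient in Python (the helper name `avg_rate` is my own, just for illustration). It applies even to non-differentiable functions such as $f(x) = |x|$:

```python
def avg_rate(f, a, b):
    # Average rate of change of f over [a, b]: only the endpoint
    # values of f are needed, no derivative anywhere.
    return (f(b) - f(a)) / (b - a)

# abs is not differentiable at 0, but the average rate over [-1, 2]
# is perfectly well defined: (|2| - |-1|) / (2 - (-1)) = 1/3.
print(avg_rate(abs, -1.0, 2.0))
```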

When $f$ is Lipschitz-continuous (a condition stronger than uniform continuity), this quotient is always bounded, regardless of the choice of $a$ and $b$. In fact, boundedness of these quotients is precisely the definition of Lipschitz continuity: there is a constant $K$ such that $|f(b) - f(a)| \le K\,|b - a|$ for all $a$ and $b$.

When $f$ is uniformly continuous, the situation is a little more subtle. It's not helpful to consider the single numerical value $\frac{f(b) - f(a)}{b - a}$; instead, view the largest possible value of $|f(b) - f(a)|$ as a function of the length of the interval, $b - a$. For a Lipschitz-continuous function, halving the allowed length $b - a$ halves the largest possible value of $|f(b) - f(a)|$, but this isn't the case for a general uniformly continuous function. Instead, we only know that $f(b) - f(a)$ tends to $0$, at some (possibly non-linear) rate, as $b - a$ tends to $0$, regardless of the values that $a$ and $b$ actually take.
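For reference, this is exactly the $\varepsilon$-$\delta$ statement of uniform continuity, the key point being that $\delta$ depends only on $\varepsilon$, not on where the interval sits:

$$\forall \varepsilon > 0 \ \exists \delta > 0 \ \forall a, b : \quad |b - a| < \delta \implies |f(b) - f(a)| < \varepsilon.$$

For ordinary (pointwise) continuity, $\delta$ would be allowed to depend on $a$ as well; uniform continuity removes that dependence.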

Let's take an example: $f(x) = \sqrt{x}$ on $[0, \infty)$. Note that $f$ is not Lipschitz-continuous, as $$\frac{f(b) - f(a)}{b - a} = \frac{\sqrt{b} - \sqrt{a}}{b - a} = \frac{1}{\sqrt{b} + \sqrt{a}},$$ and taking $b$ and $a$ arbitrarily small makes this value tend to $\infty$. However, if we fix $\varepsilon > 0$, we can force $|f(b) - f(a)| < \varepsilon$ by making $b - a$ small, regardless of the values of $a$ and $b$. That is, $f$ is uniformly continuous. To see this, suppose $b - a < \varepsilon^2$ with $a \le b$. Then $$b < \varepsilon^2 + a \le \varepsilon^2 + 2\sqrt{a}\,\varepsilon + a = (\varepsilon + \sqrt{a})^2,$$ hence $\sqrt{b} - \sqrt{a} < \varepsilon$. So the required interval length is $\varepsilon^2$, not a linear function of $\varepsilon$, but by forcing $b - a < \varepsilon^2$, we are forcing $f(b) - f(a)$ to be smaller than $\varepsilon$.
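Both halves of this claim can be checked numerically. A minimal sketch (the helper `avg_rate` is my own naming): the difference quotient of $\sqrt{x}$ blows up near $0$, yet whenever $b - a < \varepsilon^2$ we get $|\sqrt{b} - \sqrt{a}| < \varepsilon$, no matter where the interval $[a, b]$ sits.

```python
import math

def avg_rate(f, a, b):
    # Average rate of change of f over [a, b].
    return (f(b) - f(a)) / (b - a)

# Not Lipschitz: the quotient 1/(sqrt(b) + sqrt(a)) is unbounded near 0.
quotients = [avg_rate(math.sqrt, 0.0, h) for h in (1e-2, 1e-4, 1e-8)]
print(quotients)  # roughly [10, 100, 10000]: unbounded as h -> 0

# Uniformly continuous: with delta = eps**2, the bound |sqrt(b) - sqrt(a)| < eps
# holds for intervals near 0, near 1, and out at 10**6 alike.
eps = 1e-3
delta = eps**2
for a in (0.0, 1.0, 1e6):
    b = a + 0.99 * delta          # any interval shorter than delta
    assert abs(math.sqrt(b) - math.sqrt(a)) < eps
```

Note that the same $\delta = \varepsilon^2$ works at every location $a$; that independence from $a$ is what makes the continuity uniform.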

By taking smaller and smaller intervals near $0$, we can still make $\frac{f(b) - f(a)}{b - a}$ larger and larger; but by keeping the length $b - a$ below a suitable threshold, we can make $f(b) - f(a)$ as small as we like.