Can we do better than Azuma-Hoeffding when the variance is small?
Exponential inequalities for sums of independent random variables (r.v.'s) can be extended to martingales in a standard and completely general manner; see Theorem 8.5 or Theorem 8.1 for real-valued martingales, and Theorem 3.1 or Theorem 3.2 for martingales with values in 2-smooth Banach spaces in this paper.
In particular, Theorem 8.7 in the same paper implies the following martingale version of the Bennett (8b)--Hoeffding (2.9) inequality given for sums of independent r.v.'s:
If $(X_j)_{j=0}^n$ is a real-valued martingale with respect to a filtration $(F_j)_{j=0}^n$ of $\sigma$-algebras such that, with $d_j:=X_j-X_{j-1}$, we have $|d_j|\le a$ for all $j$ and $\sum_{j=1}^n E(d_j^2|F_{j-1})\le b^2$ for some real $a,b>0$, then \begin{equation} P(X_n-X_0\ge r)\le\exp\Big\{\frac{b^2}{a^2}\,\psi\Big(\frac{ra}{b^2}\Big)\Big\} \end{equation} for $r\ge0$, where $\psi(u):=u-(1+u)\ln(1+u)$.
According to Theorem 3, this is the best possible exponential bound on $P(X_n-X_0\ge r)$ in terms of $a,b^2,r$.
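To see why a variance-sensitive bound helps, here is a quick numerical sketch (the function names are my own) comparing the Bennett--Hoeffding bound above with the plain Azuma--Hoeffding bound $\exp\{-r^2/(2na^2)\}$, which uses only $|d_j|\le a$. When the total conditional variance $b^2$ is much smaller than $na^2$, the variance-based bound is dramatically smaller:

```python
import math

def bennett_hoeffding_bound(a, b2, r):
    """Martingale Bennett--Hoeffding bound exp{(b^2/a^2) * psi(r*a/b^2)},
    psi(u) = u - (1+u)*ln(1+u), valid when |d_j| <= a and
    sum_j E(d_j^2 | F_{j-1}) <= b^2."""
    u = r * a / b2
    psi = u - (1 + u) * math.log1p(u)
    return math.exp((b2 / a ** 2) * psi)

def azuma_hoeffding_bound(a, n, r):
    """Azuma--Hoeffding bound exp{-r^2 / (2 n a^2)}, using only |d_j| <= a."""
    return math.exp(-r ** 2 / (2 * n * a ** 2))

# n = 100 steps, increments bounded by a = 1, but total variance only b^2 = 1:
a, n, b2, r = 1.0, 100, 1.0, 5.0
print(bennett_hoeffding_bound(a, b2, r))  # ~3e-3
print(azuma_hoeffding_bound(a, n, r))     # ~0.88 -- nearly vacuous here
```

With these parameters the Azuma--Hoeffding exponent $r^2/(2na^2)$ is tiny because it is diluted by $n$, while the Bennett--Hoeffding exponent depends on $b^2$ and stays strong.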
Adding to Iosif Pinelis' answer, there are two points here. First, as he says, having a martingale rather than i.i.d. variables doesn't change much, since the proofs generally extend. The second question, then, is what bounds are available in the i.i.d. case.
Others have mentioned keywords including the Bennett, Bernstein, and Freedman inequalities. Another approach, not yet mentioned, is based on subgaussian variables. For example, Normal$(\mu,\sigma^2)$ variables are $\sigma^2$-subgaussian (even without restricting them to a bounded range!), and a sum $X_N$ of $N$ independent copies is $N\sigma^2$-subgaussian, so it satisfies
$$ \Pr\left[ |X_N - \mathbb{E} X_N| \geq \epsilon N \right] \leq 2 \exp\left(\frac{- N \epsilon^2}{2 \sigma^2} \right) $$
More details: a variable $Y$ is $\sigma^2$-subgaussian if $\mathbb{E} e^{\lambda (Y - \mathbb{E} Y)} \leq e^{\frac{\lambda^2 \sigma^2}{2}}$ for all $\lambda \in \mathbb{R}$. A sum of independent $\sigma_1^2$- and $\sigma_2^2$-subgaussian variables is $(\sigma_1^2 + \sigma_2^2)$-subgaussian (and this extends immediately to martingales). A $\sigma^2$-subgaussian variable satisfies the tail bound $\Pr[Y - \mathbb{E} Y \geq t] \leq e^{\frac{-t^2}{2\sigma^2}}$, and the same bound holds for the lower tail. I wouldn't be surprised if something close to Iosif Pinelis' example result can be derived from a fact about subgaussian parameters of bounded variables, but I don't know that fact/proof myself offhand.
You can find more in the book "Concentration Inequalities" by Boucheron, Lugosi, and Massart, for example.