Large deviations for discrete uniform distribution

The answer to your question is contained in the following local limit theorem for large deviations, due to V. Petrov (his Theorem 6):

Suppose that $X,X_1,X_2,\dots$ are iid random variables such that $X$ takes values only in the lattice $L:=\{a+kH\colon k\in\mathbb Z\}$ for some real $a$ and some real $H>0$, with the step $H$ maximal with this property. Let $S_n:=X_1+\dots+X_n$. Assume that $R(h):=Ee^{hX}<\infty$ for all real $h>0$, and let $m(h):=(\ln R)'(h)$, $\sigma(h):=\sqrt{m'(h)}>0$, and $A_0:=\lim_{h\to\infty} m(h)=\sup_{h>0} m(h)$. Then $$P(S_n=nx)=\frac H{\sigma(h_x)\sqrt{2\pi n}}\,\exp\{n\ln R(h_x)-nh_x x\}\,(1+O(1/n)), $$ where $x$ varies arbitrarily in any compact subinterval of the interval $(EX,A_0)$ subject to $nx\in L$, and $h_x$ is the unique root $h$ of the equation $m(h)=x$.

In your case, $a=0$, $H=1$, $$R(h)=\frac{e^{(r+1)h}-1}{(r+1)(e^h-1)}, $$ $EX=r/2$, and $A_0=r$.
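(A quick numerical sanity check, not part of the original answer.) The closed form of $R(h)$ can be compared against its definition $Ee^{hX}$, and the limits $m(0+)=EX=r/2$ and $m(h)\to A_0=r$ checked by finite differences; here for an illustrative choice $r=4$:

```python
# Numerical check of the formulas for X uniform on {0, 1, ..., r}:
# the MGF R(h), the mean EX = r/2, and A_0 = lim_{h->inf} m(h) = r,
# where m(h) = (ln R)'(h). The value r = 4 is an arbitrary example.
import math

r = 4

def R(h):
    # closed form R(h) = (e^{(r+1)h} - 1) / ((r+1)(e^h - 1)), h != 0
    return (math.exp((r + 1) * h) - 1) / ((r + 1) * (math.exp(h) - 1))

def R_direct(h):
    # definition R(h) = E e^{hX} = (1/(r+1)) * sum_{k=0}^{r} e^{hk}
    return sum(math.exp(h * k) for k in range(r + 1)) / (r + 1)

def m(h, eps=1e-6):
    # m(h) = (ln R)'(h), approximated by a central difference quotient
    return (math.log(R(h + eps)) - math.log(R(h - eps))) / (2 * eps)

print(abs(R(0.7) - R_direct(0.7)))  # ~0: closed form matches definition
print(m(1e-4))                      # ~ r/2 = 2.0, since m(0+) = EX
print(m(50.0))                      # ~ r = 4.0, since m(h) -> A_0
```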

In the particular case when $r=1$, we have $R(h)=(e^h+1)/2$, $m(h)=1/(1+e^{-h})$, $\sigma^2(h)=e^{-h}/(1+e^{-h})^2$, $h_x=\ln\frac x{1-x}$, and hence $$P(S_n=nx)=\frac1{\sqrt{2\pi nx(1-x)}}\,J(x)^n(1+O(1/n)), $$ where $x$ varies arbitrarily in any compact subinterval of the interval $(1/2,1)$ so that $nx$ is an integer, and $$J(x):=\tfrac12\,x^{-x}(1-x)^{x-1}. $$ Here is the graph of $J$:

(Figure: graph of $J$.)
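For $r=1$ the approximation can be tested directly, since $S_n$ is then binomial with parameters $n$ and $1/2$, so $P(S_n=k)=\binom nk/2^n$ is exact. The following check (my own code, not from the answer; $n=1000$, $k=700$ are arbitrary illustrative values) compares the exact probability with the asymptotic formula:

```python
# Compare the exact binomial probability P(S_n = k) = C(n, k) / 2^n
# with the asymptotic formula J(x)^n / sqrt(2 pi n x (1-x)),
# where J(x) = (1/2) x^{-x} (1-x)^{x-1} and x = k/n in (1/2, 1).
import math

def exact(n, k):
    return math.comb(n, k) / 2 ** n

def approx(n, k):
    x = k / n
    J = 0.5 * x ** (-x) * (1 - x) ** (x - 1)
    return J ** n / math.sqrt(2 * math.pi * n * x * (1 - x))

n, k = 1000, 700  # x = 0.7, inside (1/2, 1)
print(exact(n, k), approx(n, k))
print(approx(n, k) / exact(n, k))  # ratio is 1 + O(1/n)
```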


As Iosif Pinelis mentioned, this is quite standard in large deviations theory, so let me explain a bit the idea of the theorem he quotes.

Let $Y$ be a random variable defined by $\mathbb{P}(Y=y)=\frac{1}{Z(\alpha)}e^{-\alpha y}$ for $y=0,\dots,r$, with $Z(\alpha) = \sum_y e^{-\alpha y}$, and choose $\alpha$ such that $\mathbb{E}(Y)=\frac{k}{n}$. Let $(Y_i)_{i\leq n}$ be $n$ independent copies of $Y$. Then $$\mathbb{P}\Big[\sum_i Y_i =k\Big]$$ can be estimated with the CLT. Moreover, we have $$\mathbb{P}[X_1=y_1,X_2=y_2,\dots,X_n=y_n]=\mathbb{P}[Y_1=y_1,Y_2=y_2,\dots,Y_n=y_n]\,\frac{Z(\alpha)^n e^{\alpha \sum_i y_i}}{(r+1)^n},$$ and then $$\mathbb{P}\Big[\sum_i X_i =k\Big] = \Big(\frac{Z(\alpha) }{r+1}\Big)^n e^{\alpha k}\, \mathbb{P}\Big[\sum_i Y_i =k\Big], $$ which solves your problem.
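The change-of-measure identity above holds for every real $\alpha$, not only the one matching $\mathbb{E}(Y)=k/n$, and it can be verified exactly for small parameters. The sketch below (my own code; $r$, $n$, $k$, $\alpha$ are arbitrarily chosen) computes both sides of the identity via convolution:

```python
# Exact check of the tilting identity
#   P(sum X_i = k) = (Z(a)/(r+1))^n * e^{a k} * P(sum Y_i = k),
# with X uniform on {0,...,r} and Y the exponentially tilted variable.
import math

r, n, k = 3, 8, 18
alpha = -0.5  # arbitrary; the identity holds for every real alpha

Z = sum(math.exp(-alpha * y) for y in range(r + 1))
pY = [math.exp(-alpha * y) / Z for y in range(r + 1)]  # tilted pmf
pX = [1.0 / (r + 1)] * (r + 1)                         # uniform pmf

def pmf_sum(p, n):
    # pmf of the sum of n iid copies, by repeated convolution
    s = [1.0]
    for _ in range(n):
        t = [0.0] * (len(s) + len(p) - 1)
        for i, a in enumerate(s):
            for j, b in enumerate(p):
                t[i + j] += a * b
        s = t
    return s

lhs = pmf_sum(pX, n)[k]
rhs = (Z / (r + 1)) ** n * math.exp(alpha * k) * pmf_sum(pY, n)[k]
print(lhs, rhs)  # equal up to floating-point rounding
```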

In order to find $\alpha$, we use that $$\mathbb{E}[Y]=\frac{1}{Z(\alpha)}\sum_{y\leq r}ye^{-\alpha y}=-\frac{d}{d\alpha} \log(Z(\alpha)),$$ which gives an easy equation in $\alpha$. Similarly, the variance of $Y$ is obtained from the second derivative of $\log Z(\alpha)$.
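Since $\mathbb{E}[Y]=-\frac{d}{d\alpha}\log Z(\alpha)$ is strictly decreasing in $\alpha$ (its derivative is $-\mathrm{Var}(Y)<0$), the equation $\mathbb{E}[Y]=k/n$ can be solved numerically by bisection. A possible sketch (my own code; the values of $r$, $k$, $n$ are illustrative):

```python
# Solve E[Y] = k/n for alpha by bisection, where
# P(Y = y) = e^{-alpha y} / Z(alpha) on {0, ..., r}.
import math

def mean_Y(alpha, r):
    Z = sum(math.exp(-alpha * y) for y in range(r + 1))
    return sum(y * math.exp(-alpha * y) for y in range(r + 1)) / Z

def solve_alpha(target, r, lo=-50.0, hi=50.0, iters=200):
    # mean_Y is decreasing in alpha: ~r at lo = -50, ~0 at hi = 50
    for _ in range(iters):
        mid = (lo + hi) / 2
        if mean_Y(mid, r) > target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

r, k, n = 4, 13, 5            # want E[Y] = 13/5 = 2.6
a = solve_alpha(k / n, r)
print(a, mean_Y(a, r))        # mean_Y(a, r) ~ 2.6
```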

[To translate into the notation of Iosif Pinelis's answer: $\alpha=-h$, $Z(\alpha)=(r+1)\mathbb{E}[e^{-\alpha X}]=(r+1)R(-\alpha)$.]