Maximal distance between nearby iid uniform random variables
Let us adapt the reasoning at stats-SE. Let $d_0:=X_{(1)}$ and $d_n:=1-X_{(n)}$, so that $d_0,d_1,\dots,d_n$ are the $n+1$ spacings generated by the sample in $[0,1]$. Let $\Delta_j:=d_{j-1}$ for $j\in[n+1]:=\{1,\dots,n+1\}$, so that
\begin{equation}
\max_{1\le k\le n-1}d_k=\max_{2\le j\le n}\Delta_j.
\end{equation}
The first key observation at stats-SE is that the $\Delta_j$'s are exchangeable, since $(\Delta_1,\dots,\Delta_{n+1})$ equals $(E_1,\dots,E_{n+1})/S$ in distribution, where $E_1,\dots,E_{n+1}$ are iid standard exponential r.v.'s and $S:=E_1+\dots+E_{n+1}$ (see e.g. formula (2.2.1)). The second key observation at stats-SE is that for any $J$ in the set $\binom{[n+1]}r$ of all subsets of cardinality $r$ of $[n+1]$ and any $x\ge0$
\begin{equation*}
P(\Delta_j>x\ \forall j\in J)=P(\Delta_j>x\ \forall j\in[r])=(1-rx)_+^n,
\end{equation*}
where $u_+^n:=\max(0,u)^n$ and the first equality holds by exchangeability.
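For readers who want to sanity-check the second observation numerically, here is a small Monte Carlo sketch (not part of the argument; the NumPy-based code and all variable names are mine): it estimates $P(\Delta_1>x,\dots,\Delta_r>x)$ from simulated uniform samples and compares it with $(1-rx)_+^n$.

```python
# Monte Carlo check of the joint tail of r of the n+1 spacings of n iid
# Uniform(0,1) points: P(Delta_1 > x, ..., Delta_r > x) = (1 - r*x)_+^n.
import numpy as np

rng = np.random.default_rng(0)
n, r, x, reps = 10, 3, 0.05, 200_000

X = np.sort(rng.random((reps, n)), axis=1)
# all n+1 spacings, including Delta_1 = X_(1) and Delta_{n+1} = 1 - X_(n)
spacings = np.diff(X, axis=1, prepend=0.0, append=1.0)

empirical = np.mean(np.all(spacings[:, :r] > x, axis=1))
theoretical = max(0.0, 1 - r * x) ** n
print(empirical, theoretical)  # the two numbers should be close
```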
So, by the inclusion-exclusion formula, we obtain the cdf of $\max_{1\le k\le n-1}d_k$:
\begin{align*}
P(\max_{1\le k\le n-1}d_k\le x)&=1-P(\max_{2\le j\le n}\Delta_j> x) \\
&=1-P(\max_{1\le j\le n-1}\Delta_j> x) \\
&=\sum_{r=0}^{n-1}(-1)^r \sum_{J\in\binom{[n-1]}r}P(\Delta_j>x\ \forall j\in J) \\
&=\sum_{r=0}^{n-1}(-1)^r \binom{n-1}r (1-rx)_+^n
\end{align*}
for all $x\ge0$; the second equality holds by exchangeability, and the third is inclusion-exclusion.
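As a hedged numerical check of this cdf (again not part of the proof; the function and variable names are my own), one can compare the inclusion-exclusion sum with a Monte Carlo estimate of the distribution of the largest interior gap:

```python
# Compare the inclusion-exclusion cdf with a simulated distribution of
# M_n = max of the n-1 interior gaps d_1, ..., d_{n-1}.
import numpy as np
from scipy.special import comb

def cdf_max_gap(x, n):
    # sum_{r=0}^{n-1} (-1)^r * C(n-1, r) * (1 - r*x)_+^n
    r = np.arange(n)
    return np.sum((-1.0) ** r * comb(n - 1, r) * np.clip(1 - r * x, 0, None) ** n)

rng = np.random.default_rng(1)
n, x, reps = 10, 0.25, 200_000
X = np.sort(rng.random((reps, n)), axis=1)
M = np.max(np.diff(X, axis=1), axis=1)       # max of d_1, ..., d_{n-1}
print(np.mean(M <= x), cdf_max_gap(x, n))    # should nearly agree
```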
Addition in response to a comment by the OP: Let
\begin{equation*}
M_n:=\max_{1\le k\le n-1}d_k.
\end{equation*}
Then, using the above expression for the cdf of $M_n$, we have
\begin{multline*}
E M_n^2=\int_0^\infty 2x\,P(M_n>x)dx=\sum_{r=1}^{n-1}(-1)^{r-1} \binom{n-1}r \int_0^{1/r} 2x\,(1-rx)^n\,dx \\
= \sum_{r=1}^{n-1}(-1)^{r-1} \binom{n-1}r\,\frac{2}{r^2(n+2)(n+1)}
=\frac{\psi(n)^2+2 \gamma \psi(n)- \psi'(n)+\pi ^2/6+\gamma ^2}{(n+2)(n+1)},
\end{multline*}
where $\gamma$ is the Euler constant and $\psi:=\Gamma'/\Gamma$ is the digamma function; the latter expression for $E M_n^2$ was obtained by using a computer algebra package. We have $\psi(n)\sim\ln n$ and $\psi'(n)\to0$ as $n\to\infty$ (see the Wikipedia article on the polygamma function), whence
\begin{equation*}
E M_n^2\sim\frac{\ln^2 n}{n^2}.
\end{equation*}
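A quick numerical sketch of these last two displays (the code, which relies on SciPy's `digamma` and `polygamma`, and all names in it are mine) compares the closed-form expression with a Monte Carlo estimate of $E M_n^2$ and with the asymptotic $\ln^2 n/n^2$:

```python
# Check the digamma-based closed form for E M_n^2 against simulation,
# and look at its ratio to the asymptotic (ln n)^2 / n^2.
import numpy as np
from scipy.special import digamma, polygamma
from numpy import euler_gamma as g, pi

def EM2_closed(n):
    p = digamma(n)
    return (p**2 + 2*g*p - polygamma(1, n) + pi**2/6 + g**2) / ((n + 2) * (n + 1))

rng = np.random.default_rng(2)
n, reps = 50, 200_000
X = np.sort(rng.random((reps, n)), axis=1)
M = np.max(np.diff(X, axis=1), axis=1)
print(np.mean(M**2), EM2_closed(n))             # should nearly agree
print(EM2_closed(n) / (np.log(n)**2 / n**2))    # tends to 1, but only logarithmically slowly
```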
Alternatively, without using a computer algebra package, we can obtain exact and asymptotic expressions for $E M_n^2$ as follows: using the formula $\frac1{r^2}=\int_0^\infty xe^{-rx}\,dx$ for $r>0$, then the binomial formula, then the substitution $u=1-e^{-x}$, then the expansion $\frac{1-u^{n-1}}{1-u}=\sum_{j=0}^{n-2}u^j$, and finally an integration by parts, we have
\begin{multline*}
\frac{(n+2)(n+1)}{2}\,E M_n^2 = \sum_{r=1}^{n-1}(-1)^{r-1} \binom{n-1}r\,\frac1{r^2} = -\int_0^\infty x\,dx\sum_{r=1}^{n-1} \binom{n-1}r\,(-e^{-x})^r \\
= \int_0^\infty x\,dx\,(1-(1-e^{-x})^{n-1}) = -\int_0^1\ln(1-u)\,\frac{1-u^{n-1}}{1-u}\,du \\
= -\sum_{j=0}^{n-2}\int_0^1\ln(1-u)\,u^j\,du = \sum_{j=0}^{n-2}\frac1{j+1}\int_0^1\frac{1-u^{j+1}}{1-u}\,du = \sum_{j=0}^{n-2}\frac1{j+1}\sum_{k=0}^j\int_0^1 u^k\,du \\
= \sum_{j=0}^{n-2}\frac{H_{j+1}}{j+1} = \sum_{k=1}^{n-1}\frac{H_k}k,
\end{multline*}
so that
\begin{equation}
E M_n^2=\frac2{(n+2)(n+1)}\,\sum_{k=1}^{n-1}\frac{H_k}k,
\end{equation}
where $H_k$ is the $k$th harmonic number. Since $H_k\sim\ln k$ as $k\to\infty$, comparing the sum with an integral we get
\begin{equation}
E M_n^2\sim\frac2{n^2}\,\int_1^n\frac{\ln x}x\,dx=\frac{\ln^2 n}{n^2}
\end{equation}
as $n\to\infty$, the same result as before.
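Finally, as a cross-check of the two exact expressions (a minimal sketch under the same assumptions as above; the helper names are mine), the harmonic-number formula should agree with the digamma-based closed form to floating-point accuracy:

```python
# The harmonic-number formula and the digamma-based closed form for E M_n^2
# are the same quantity; numerically the pairs below should coincide.
import numpy as np
from scipy.special import digamma, polygamma
from numpy import euler_gamma as g, pi

def EM2_harmonic(n):
    k = np.arange(1, n)            # k = 1, ..., n-1
    H = np.cumsum(1.0 / k)         # H_1, ..., H_{n-1}
    return 2.0 * np.sum(H / k) / ((n + 2) * (n + 1))

def EM2_digamma(n):
    p = digamma(n)
    return (p**2 + 2*g*p - polygamma(1, n) + pi**2/6 + g**2) / ((n + 2) * (n + 1))

for n in (5, 50, 500):
    print(n, EM2_harmonic(n), EM2_digamma(n))   # pairs should coincide
```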