Sum of random variables at least $\log n$

Let $\ell(n)=\lfloor \ln(n)\rfloor+1$, and take the $X_i$ to be independent $0/1$ variables with $E[X_i]=\frac{1}{\ell(n)}$ for $i\leq \ell(n)$ and $E[X_i]=0$ otherwise (so the expected sum is $\sum_i E[X_i]=1$). Essentially, you are concentrating all the "mass" in the first $\ell(n)\approx\ln(n)$ variables (the minimum number of $0/1$ variables whose sum can exceed $\ln(n)$), dividing it evenly among them.
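For concreteness, here is a minimal Monte Carlo sketch of the construction (Python and the sample value $n=100$ are my choices, just for illustration; for a small $n$ the event is still frequent enough to observe empirically):

```python
import math
import random

def ell(n: int) -> int:
    """The number of 'active' variables: floor(ln n) + 1."""
    return math.floor(math.log(n)) + 1

def sample_sum(n: int) -> int:
    """One draw of sum X_i: only the first ell(n) variables can be 1,
    each independently with probability 1/ell(n); the rest are 0."""
    L = ell(n)
    return sum(random.random() < 1 / L for _ in range(L))

n, trials = 100, 10**6
hits = sum(sample_sum(n) >= math.log(n) for _ in range(trials))
print(hits / trials)            # empirical Pr[sum >= ln n], about 3.2e-4
print((1 / ell(n)) ** ell(n))   # exact (1/ell)^ell = (1/5)^5 = 3.2e-4
```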

The probability that $X_i=1$ for all $i\leq \ell(n)$ (which is the only way the sum can exceed $\ln(n)$) is $\left(\frac{1}{\ell(n)}\right)^{\ell(n)}\approx\left(\frac{1}{\ln(n)}\right)^{\ln(n)}=n^{-\ln\ln(n)}\approx n^{-\ln\ln(n)+1}$, where the last term is the Chernoff bound you obtained.
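As a quick sanity check, here is what the three quantities look like numerically (a small Python sketch; the sample values of $n$ are arbitrary):

```python
import math

for n in (10**2, 10**4, 10**8):
    L = math.floor(math.log(n)) + 1               # ell(n)
    exact = (1 / L) ** L                          # (1/ell(n))^ell(n)
    approx = n ** (-math.log(math.log(n)))        # n^(-ln ln n)
    bound = n ** (-math.log(math.log(n)) + 1)     # n^(-ln ln n + 1)
    print(f"n=10^{round(math.log10(n))}: "
          f"exact={exact:.2e}  approx={approx:.2e}  bound={bound:.2e}")
```

The ratio between the second and first columns is the error hidden in the first $\approx$, and the ratio between the third and second is exactly $n$; these are the two errors discussed in the next paragraph.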

Note that the (multiplicative) error "hidden" in the first of the two $\approx$ signs (due to having to use $\lfloor \ln(n)\rfloor+1$ in place of $\ln(n)$, because the $X_i$ are discrete) is of the order of $\ln(n)$, and thus much smaller than the one "hidden" in the second $\approx$, which is $n$. The latter (which is, after all, just a $+1$ added to a $-\ln\ln(n)$ exponent) is mostly a consequence of the approximations required to derive a manageable formula like the $\frac{e^\delta}{(1+\delta)^{1+\delta}}$ one you used from the "full-power" Chernoff-Hoeffding bound, written in terms of relative entropy, for $n$ independent $X_i$ with values in $[0,1]$ and expected sum $\mu$:

$\Pr\left[\sum_{i=1}^n X_i \geq \mu+\lambda\right]\leq e^{-nH_{\mu/n}(\mu/n+\lambda/n)}$, where $H_p(x)=x\ln\left(\frac{x}{p}\right)+(1-x)\ln\left(\frac{1-x}{1-p}\right)$ is the relative entropy of $x$ with respect to $p$.
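To see the difference concretely, here is a minimal Python check (a sketch under the assumptions above: since the variables beyond the first $\ell(n)$ are identically $0$, the bound can be applied with $n$ replaced by $\ell(n)$, $\mu=1$, and $\lambda=\ell(n)-1$). Applied this way, the relative-entropy bound recovers the exact $\left(\frac{1}{\ell(n)}\right)^{\ell(n)}$, while the simpler formula loses a factor of roughly $e^{\ell(n)-1}\approx n/e$:

```python
import math

def H(p: float, x: float) -> float:
    """Relative entropy of x with respect to p; the (1-x) term vanishes as x -> 1."""
    left = x * math.log(x / p) if x > 0 else 0.0
    right = (1 - x) * math.log((1 - x) / (1 - p)) if x < 1 else 0.0
    return left + right

n = 10**6
L = math.floor(math.log(n)) + 1                   # ell(n) = 14 active variables
mu, lam = 1.0, L - 1.0                            # threshold mu + lam = ell(n)
entropy_bound = math.exp(-L * H(mu / L, (mu + lam) / L))
delta = lam / mu                                  # so that 1 + delta = ell(n)
simple_bound = (math.exp(delta) / (1 + delta) ** (1 + delta)) ** mu
exact = (1 / L) ** L
print(exact, entropy_bound, simple_bound)
# exact and entropy_bound coincide up to rounding (~9.0e-17), while
# simple_bound (~4.0e-11) is larger by a factor e^(ell(n)-1) ~ n/e,
# i.e. the "+1" in the exponent of the bound you obtained
```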