Asymptotics of functional of i.i.d. Rademacher random variables
EDITED: As pointed out by Anthony and John, my 2am solution was anything but. In summary, the conjecture is true for $C$ smaller than approximately $0.68880137$ and false for larger $C$.
The exact value is $$ S(C,n) = \sum_{k=0}^n 2^{-n}\binom{n}{k} \exp\bigl(Cn^{-3}(n-2k)^4\bigr),$$ since the product of the first two factors is the probability that $\sum_i X_i=n-2k$.
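For anyone who wants to experiment, here is a small Python sketch (just a numerical sanity check, not part of the argument; the helper names are arbitrary) comparing the binomial sum with a brute-force average over all $2^n$ sign patterns for a small $n$:

```python
import itertools
import math

def S_binomial(C, n):
    # sum_k 2^{-n} C(n,k) exp(C n^{-3} (n-2k)^4), computed with logs for stability
    total = 0.0
    for k in range(n + 1):
        log_prob = (math.lgamma(n + 1) - math.lgamma(k + 1)
                    - math.lgamma(n - k + 1) - n * math.log(2))
        total += math.exp(log_prob + C * (n - 2 * k) ** 4 / n ** 3)
    return total

def S_bruteforce(C, n):
    # E exp(C (sum_i X_i)^4 / n^3), averaged over all 2^n equally likely sign vectors
    total = sum(math.exp(C * sum(x) ** 4 / n ** 3)
                for x in itertools.product((-1, 1), repeat=n))
    return total / 2 ** n

print(S_binomial(0.5, 12), S_bruteforce(0.5, 12))  # the two values should agree
```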
Consider the term with $k=(\tfrac12+\beta)n$, where $-\frac12\le\beta\le\frac12$. Stirling's approximation, for constant $\beta\ne\pm\frac12$, gives $$2^{-n}\binom nk \approx \bigl(2\pi(\tfrac12-\beta)(\tfrac12+\beta)n\bigr)^{-1/2} \bigl(2 (\tfrac12-\beta)^{\tfrac12-\beta} (\tfrac12+\beta)^{\tfrac12+\beta}\bigr)^{-n}.$$
Therefore the value of the term is close to $$\bigl(2\pi(\tfrac12-\beta)(\tfrac12+\beta)n\bigr)^{-1/2} \exp\bigl(f(\beta)n\bigr),$$ where $$ f(\beta) = -(\tfrac12-\beta)\ln(\tfrac12-\beta) - (\tfrac12+\beta)\ln(\tfrac12+\beta) - \ln 2 + 16C\beta^4. $$ (Note that there is an order $1/n$ term from Stirling's formula missing. Since it doesn't depend on $C$ and the value of the sum is 1 for $C=0$, the effect of the missing term is easily corrected at the end.)
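A quick numerical sanity check of this approximation (again a Python sketch of my own, not part of the argument): compare an individual exact summand with $\bigl(2\pi(\tfrac12-\beta)(\tfrac12+\beta)n\bigr)^{-1/2} e^{f(\beta)n}$ for some fixed $\beta$.

```python
import math

def exact_term(C, n, k):
    # the k-th summand 2^{-n} C(n,k) exp(C n^{-3} (n-2k)^4), via lgamma
    log_prob = (math.lgamma(n + 1) - math.lgamma(k + 1)
                - math.lgamma(n - k + 1) - n * math.log(2))
    return math.exp(log_prob + C * (n - 2 * k) ** 4 / n ** 3)

def stirling_term(C, n, beta):
    # (2 pi (1/2-beta)(1/2+beta) n)^{-1/2} * exp(f(beta) n)
    f = (-(0.5 - beta) * math.log(0.5 - beta)
         - (0.5 + beta) * math.log(0.5 + beta)
         - math.log(2) + 16 * C * beta ** 4)
    return math.exp(f * n) / math.sqrt(2 * math.pi * (0.5 - beta) * (0.5 + beta) * n)

C, n, beta = 0.5, 1000, 0.3
k = round((0.5 + beta) * n)
print(exact_term(C, n, k), stirling_term(C, n, beta))  # ratio should be close to 1
```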
The function $f(\beta)$ has a local maximum of value 0 at $\beta=0$ for every $C$. There is a constant $C_0\approx 0.68880137$ (note: less than $\ln 2$) such that for $C\lt C_0$ we have $f(\beta)\lt 0$ for all $\beta\ne 0$. For $C\gt C_0$ there are places where $f(\beta)\gt 0$, and then the sum grows exponentially in $n$.
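One can confirm the threshold numerically with a rough sketch like the following (bisection on $C$ for the sign change of $\max_\beta f(\beta)$, with an arbitrary grid in $\beta$; accuracy is limited by that grid):

```python
import math

def f(beta, C):
    return (-(0.5 - beta) * math.log(0.5 - beta)
            - (0.5 + beta) * math.log(0.5 + beta)
            - math.log(2) + 16 * C * beta ** 4)

def max_f(C, grid=20000):
    # maximum of f over a fine grid of beta in (0, 1/2)
    return max(f(j / (2 * grid), C) for j in range(1, grid))

lo, hi = 0.5, 1.0          # bracket: max_f(0.5) < 0 < max_f(1.0)
for _ in range(50):        # bisect on C for the sign change of max_f
    mid = (lo + hi) / 2
    if max_f(mid) > 0:
        hi = mid
    else:
        lo = mid
print((lo + hi) / 2)       # about 0.6888, limited by the grid resolution
```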
Therefore, when $C\lt C_0$ the sum is dominated by terms near $\beta=0$. Around there, $$f(\beta) = -2\beta^2 + (16C-\tfrac43)\beta^4 + O(\beta^6),$$ so the term is close to $$(\pi n/2)^{-1/2} e^{-2\beta^2 n}\bigl(1+(16C-\tfrac43)\beta^4n\bigr),$$ and only the terms with $\beta=O(n^{-1/2})$ matter. Approximating the sum by an integral, I get $$ S(C,n) = 1 + 3C/n + O(n^{-2})$$ for $C\lt C_0$.
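A numerical check of this asymptotic (my sketch; for $C$ below the threshold, $n\,(S(C,n)-1)$ should approach $3C$):

```python
import math

def S(C, n):
    # the exact sum S(C, n) from above, via lgamma to avoid overflow
    total = 0.0
    for k in range(n + 1):
        log_prob = (math.lgamma(n + 1) - math.lgamma(k + 1)
                    - math.lgamma(n - k + 1) - n * math.log(2))
        total += math.exp(log_prob + C * (n - 2 * k) ** 4 / n ** 3)
    return total

C = 0.3
for n in (100, 400, 1600, 6400):
    print(n, n * (S(C, n) - 1))   # should approach 3*C = 0.9 as n grows
```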
ADDED: The exact value of $C_0$ is $$\frac{1}{64\beta_0^3}\ln\Bigl(\frac{1+2\beta_0}{1-2\beta_0}\Bigr),$$ where $\beta_0$ is the zero near 0.4953 of $$\bigl(-\tfrac12+\tfrac34\beta\bigr)\ln(1-2\beta) + \bigl(-\tfrac12-\tfrac34\beta\bigr)\ln(1+2\beta).$$ To 50 digits: $C_0 = 0.68880137394879099153980106986892429419403651844842$.
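The value can be reproduced with a few lines of mpmath, assuming one trusts the root-finder; this is only a sketch:

```python
from mpmath import mp, mpf, log, findroot

mp.dps = 60                        # work to 60 significant digits

def g(b):
    # the function whose zero near 0.4953 defines beta_0
    return ((-mpf(1)/2 + mpf(3)/4 * b) * log(1 - 2 * b)
            + (-mpf(1)/2 - mpf(3)/4 * b) * log(1 + 2 * b))

beta0 = findroot(g, mpf('0.4953'))
C0 = log((1 + 2 * beta0) / (1 - 2 * beta0)) / (64 * beta0 ** 3)
print(beta0)   # beta_0, close to 0.4953
print(C0)      # should reproduce the 50-digit value quoted above
```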
I believe the conjecture is true for sufficiently small $C$. Previously I was trying to disprove it using large deviation theory, but I missed a sign at the last step. The same argument can be turned around.
In compact form, one gets $$ \lim_{n\to\infty} \frac{1}{n} \log P\Bigl(\sum_i X_i > \alpha n\Bigr) = -I(\alpha),$$ where $I(\alpha) = \sup_\theta (\alpha \theta - \log \mathbb{E} e^{\theta X_1})$. Since $\log \mathbb{E} e^{\theta X_1} = \log\cosh\theta$ and the supremum is attained at $\tanh\theta = \alpha$, one can compute $$I(\alpha) = \tfrac{1+\alpha}{2}\log (1 + \alpha) + \tfrac{1-\alpha}{2}\log(1 - \alpha).$$
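A quick sketch (mine, purely illustrative) that checks this closed form against a crude numerical maximisation over $\theta$:

```python
import math

def I_closed(a):
    # closed form of the rate function for +-1 Rademacher variables
    return 0.5 * (1 + a) * math.log(1 + a) + 0.5 * (1 - a) * math.log(1 - a)

def I_sup(a, steps=20000):
    # crude grid maximisation of alpha*theta - log cosh(theta) over theta in [0, 20]
    return max(a * (20 * j / steps) - math.log(math.cosh(20 * j / steps))
               for j in range(steps + 1))

for a in (0.1, 0.5, 0.9):
    print(a, I_closed(a), I_sup(a))   # the two columns should agree closely
```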
More precisely, the lower bound (Cramér's theorem) states that for any $\epsilon > 0$ and all sufficiently large $n$, $P(\sum_i X_i \ge \alpha n) \ge e^{-n (I(\alpha) + \epsilon)}$; see these lecture notes. The upper bound holds without the $\epsilon$ and can be proved easily using the exponential Chebyshev inequality. We only need the upper bound here.
Now write the original expectation as $\int_0^1 \bigl(1 + P(|\sum_i X_i | > \alpha n) \frac{d}{d\alpha} e^{C \alpha^4 n}\bigr)\, d\alpha$, by the so-called integration by parts formula in probability; see page 66, Lemma 1.4.30 here. Applying the upper bound to the tail probability (and dropping an immaterial factor of $2$ for the two tails), the latter is bounded by $$ \int_0^1 1\, d\alpha + \int_0^1 4 \alpha^3 C n\, e^{(C \alpha^4 - I(\alpha))n}\, d\alpha.$$
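For small $n$ the identity can be checked directly, using exact binomial tail probabilities and a midpoint rule for the integral (a sketch of mine; the variable names are arbitrary):

```python
import math

C, n = 0.3, 20
probs = [math.comb(n, k) / 2 ** n for k in range(n + 1)]   # P(sum_i X_i = n - 2k)

# left-hand side: the expectation E exp(C (sum_i X_i)^4 / n^3), computed exactly
lhs = sum(p * math.exp(C * (n - 2 * k) ** 4 / n ** 3) for k, p in enumerate(probs))

def tail(a):
    # P(|sum_i X_i| > a * n)
    return sum(p for k, p in enumerate(probs) if abs(n - 2 * k) > a * n)

# right-hand side: midpoint rule for int_0^1 (1 + tail(a) * d/da exp(C a^4 n)) da
m = 50000
rhs = sum(1 + tail(a) * 4 * C * n * a ** 3 * math.exp(C * a ** 4 * n)
          for a in ((j + 0.5) / m for j in range(m))) / m
print(lhs, rhs)   # the two values should agree to several decimal places
```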
Comparing the integrand near $\alpha = 0$ with the third moment of a Gaussian of variance $n^{-1}$, one sees that the part of the integral near $0$ is of order $C/n$, and the rest of the integral is exponentially small once $C$ is small enough that $C \alpha^4 - I(\alpha) < 0$ away from $0$. So the expectation is $1 + O(C/n)$.
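A sketch confirming the $O(C/n)$ behaviour of the bound numerically (again mine, with an arbitrary choice of $C$ and of the quadrature grid):

```python
import math

def I(a):
    # rate function for +-1 Rademacher variables, as above
    return 0.5 * (1 + a) * math.log(1 + a) + 0.5 * (1 - a) * math.log(1 - a)

def bound_integral(C, n, m=100000):
    # midpoint rule for int_0^1 4 C n a^3 exp((C a^4 - I(a)) n) da
    total = 0.0
    for j in range(m):
        a = (j + 0.5) / m
        total += 4 * C * n * a ** 3 * math.exp((C * a ** 4 - I(a)) * n)
    return total / m

C = 0.3
for n in (100, 400, 1600):
    print(n, n * bound_integral(C, n))   # stays bounded, consistent with O(C/n)
```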
Note this argument doesn't work for Gaussian $X_i$, because in that case the upper limit of integration is $\infty$ and $C\alpha^4 - I(\alpha) = C\alpha^4 - \alpha^2/2$ is eventually positive. This is a good check of my often reckless arithmetic.