What is $\sum_{k=0}^n {n \choose k} p^k (1-p)^{n-k} e^{-s(k - np)^2}$?

I don't think there is a closed form. As $n\rightarrow\infty$, however, the limit appears to be $0$ for every $s>0$. Here is a sketch of the proof, hoping there are no embarrassing typos.

Write $S_{n,p,s}=\mathbb{E}\Big[\exp\big(-s(X_n-\mathbb{E}[X_n])^2\big)\Big]$, where the expectation is taken with respect to a binomial distribution $\mathrm{Bin}(n,p)$, so that $\mathbb{E}[X_n]=np$.

For fixed $n$, $X_n$ can be thought of as the sum of $n$ i.i.d. Bernoulli$(p)$ random variables, say $X_n\stackrel{\text{law}}{=}B_1+\ldots + B_n$.

Then the expression inside the expectation becomes $\exp\Big(-sp(1-p)n\big(\frac{X_n-np}{\sqrt{p(1-p)n}}\big)^2\Big)$.

By the central limit theorem, $\Big(\frac{X_n-np}{\sqrt{p(1-p)n}}\Big)^2$ converges in law to a $\chi^2_1$ distribution. By Skorokhod's representation theorem there is a coupling in which the convergence is pointwise a.s.; since the factor $sp(1-p)n\rightarrow\infty$ while the squared term stays finite a.s., the integrand tends to $0$ a.s. As the integrand is bounded by $1$, dominated convergence gives $S_{n,p,s}\rightarrow 0$ for $s>0$.
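
To quantify the rate (a heuristic, not part of the proof above): replacing $X_n-np$ by its Gaussian limit $Y\sim\mathcal{N}(0,\,np(1-p))$ and computing the Gaussian integral gives $\mathbb{E}[e^{-sY^2}]=\big(1+2snp(1-p)\big)^{-1/2}$, which suggests $S_{n,p,s}$ decays like $n^{-1/2}$. A quick numerical sketch in R (exact evaluation via `dbinom`; the function names are mine):

```r
# S(n, p, s) = E[exp(-s (X - n p)^2)] for X ~ Binomial(n, p), computed exactly
S <- function(n, p, s) {
  k <- 0:n
  sum(dbinom(k, n, p) * exp(-s * (k - n * p)^2))
}
# Normal-limit heuristic: E[exp(-s Y^2)] = 1 / sqrt(1 + 2 s n p (1 - p))
normal.approx <- function(n, p, s) 1 / sqrt(1 + 2 * s * n * p * (1 - p))

# Both columns shrink together, roughly like n^(-1/2)
for (n in c(10, 100, 1000, 10000)) {
  cat(n, S(n, 0.5, 1), normal.approx(n, 0.5, 1), "\n")
}
```

For $n=10^4$, $p=0.5$, $s=1$ both values are already below $0.015$.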


Below are plotted estimates of $h_n(t)=\mathbb{E}\big[\exp(-t(X_n-np)^2)\big]$, obtained by sampling Bernoulli random variables with $p=0.5$. They provide empirical evidence that indeed $h_n(t)\xrightarrow{n\rightarrow\infty} 0$ for $t>0$. An R script is given below.

Graph of h_n

    library(latex2exp)
    ### Function of interest
    myfunction <- function(s,x){
        exp(-s*x*x)
    }
    ### Generate point estimates at time t
    myfunction.t <- function(bin.central,t){
        mean(myfunction(t,bin.central))
    }
    Vmyfunction.t <- Vectorize(myfunction.t,vectorize.args = "t")

    ### Generate samples of X - np, where X ~ Binomial(n, p), as column sums of centered Bernoullis
    npop <- 10000 
    nsample <- 10000
    p<- .5
    ber.sample <- matrix(rbinom(npop*nsample,1,p),npop,nsample) - p
    ber.tot.sample1 <- apply(ber.sample[1:1000,],2,sum)
    ber.tot.sample2 <- apply(ber.sample[1:2500,],2,sum)
    ber.tot.sample3 <- apply(ber.sample[1:5000,],2,sum)
    ber.tot.sample4 <- apply(ber.sample,2,sum)
    t <- seq(0,2,by = .05)
    ### Generate plot of point estimates of E[exp(-t*(X-np)^2)] at time t.

    plot(t,Vmyfunction.t(ber.tot.sample1,t),type = "l",col = "magenta",
             xlab="t", ylab="mean")
    title(main=TeX("Expected value of $\\exp(-t(X-np)^2)$"),
                sub=TeX("$X\\sim binom(n,0.5)$"))
    lines(t,Vmyfunction.t(ber.tot.sample2,t),type = "l",col = "red")
    lines(t,Vmyfunction.t(ber.tot.sample3,t),type = "l",col = "green")
    lines(t,Vmyfunction.t(ber.tot.sample4,t),type = "l",col = "darkgreen")
    legend("topright", cex=0.6,
                 c("1000","2500","5000","10000"),
                 title="n",
                 fill=c("magenta","red","green","darkgreen"),
                 horiz=TRUE)
    ############ END ##########
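
As a sanity check on the sampling scheme, the Monte Carlo estimate at a single $t$ can be compared with the exact value computed from the binomial pmf (a sketch; the particular $n$, $p$, $t$ are arbitrary choices of mine):

```r
# Sanity check: Monte Carlo estimate vs exact value of E[exp(-t (X - n p)^2)]
set.seed(1)
n <- 1000; p <- 0.5; t <- 0.5
# Monte Carlo, sampling X directly instead of summing Bernoullis
x.centered <- rbinom(10000, n, p) - n * p
mc <- mean(exp(-t * x.centered^2))
# Exact, summing over the binomial pmf
k <- 0:n
exact <- sum(dbinom(k, n, p) * exp(-t * (k - n * p)^2))
c(mc = mc, exact = exact)
```

The two values agree to within Monte Carlo error for 10000 samples.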

Just an attempt at simplifying the sum (with no claim of reaching a closed form):

$$S_{n, p, s} = \sum_{k=0}^n {n \choose k} p^k (1-p)^{n-k} e^{-s(k - np)^2} = \sum_{k=0}^n {n \choose k} p^k (1-p)^{n-k} \sum_{m=0}^\infty \frac{(-s)^m (k - np)^{2m}}{m!}$$

Since the sum over $k$ is finite, the two summations may be interchanged:

$$S_{n, p, s} = \sum_{m=0}^\infty \frac{(-s)^m}{m!}\,\mu_{2m}, \qquad \mu_{2m} = \mathbb{E}\big[(X_n - np)^{2m}\big],$$

the even central moments of the $\mathrm{Bin}(n,p)$ distribution. Expanding $(k-np)^{2m}$ by the binomial theorem gives $\sum_{r=0}^{2m} {2m \choose r} k^r (-np)^{2m-r}$; the coefficients ${2m \choose r}$ prevent the inner sum from collapsing into a geometric series, so this route does not seem to yield a closed form either.

Perhaps this helps toward a closed form under suitable large-$n$ approximations, or under conditions on $np$.
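
Interchanging the (finite) sum over $k$ with the series in $m$ writes $S_{n,p,s}=\sum_{m\ge 0}\frac{(-s)^m}{m!}\mu_{2m}$, where $\mu_{2m}=\mathbb{E}[(X_n-np)^{2m}]$ are the even central moments of the binomial. A quick numerical check of this interchange in R (the parameter values are arbitrary):

```r
# Check S_{n,p,s} = sum_{m>=0} (-s)^m mu_{2m} / m!, truncated at m = 60
n <- 10; p <- 0.5; s <- 0.1
k <- 0:n
w <- dbinom(k, n, p)
direct <- sum(w * exp(-s * (k - n * p)^2))
mu <- function(m) sum(w * (k - n * p)^(2 * m))  # even central moments
series <- sum(sapply(0:60, function(m) (-s)^m * mu(m) / factorial(m)))
all.equal(direct, series)
```

The terms decay factorially in $m$ here, so the truncation at $m=60$ is far more than enough.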

Tags:

Summation