How to estimate a specific infinite matrix sum
Let $s = k-1$, and write $x^T M x = \|x\|^2 + s (e \cdot x)^2$ where $e = (1,\ldots,1)$. Let $P_j = \{x \in \mathbb Z^n: e \cdot x = j\}$. Each $P_j$ is a translate of $P_0$, which is a subgroup of $\mathbb Z^n$. Then
$$S_{M,k} = \sum_{j\in \mathbb Z} e^{-s j^2} \sum_{x \in P_j} e^{-\|x\|^2} = \sum_{j \in \mathbb Z} e^{-s j^2} T(j)$$
where $T(j) = \sum_{x \in P_j} e^{-\|x\|^2}$. Now $P_{j+n} = e + P_j$, and if $x \in P_j$ we have $\|e + x\|^2 = n + 2 j + \|x\|^2$. Thus $$T(n+j) = e^{-n-2j} T(j),$$ so that $$T(mn + j) = e^{-m^2 n - 2 mj} T(j).$$ We just have to compute (or approximate) $T(0), \ldots, T(n-1)$, and then $$\eqalign{S_{M,k} &= \sum_{j=0}^{n-1} \sum_{m \in \mathbb Z} e^{-s(j+mn)^2} e^{-m^2n-2mj} T(j) = \sum_{j=0}^{n-1} e^{-sj^2} T(j) \sum_{m \in \mathbb Z} e^{-(n^2s+n) m(m+2j/n)}\cr &= \sum_{j=0}^{n-1} e^{-sj^2} T(j)\, \theta_3\bigl(i(ns+1)j, \exp(-n^2 s - n)\bigr),} $$ where $\theta_3(z,q) = \sum_{m \in \mathbb Z} q^{m^2} e^{2imz}$ is a Jacobi theta function.
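To make the reduction concrete, here is a minimal numerical sketch (mine, not part of the answer above) that compares a brute-force truncation of $S_{M,k}$ with the reassembly from $T(0),\ldots,T(n-1)$. The function names, the truncation radius `R`, and the range `mmax` of $m$ are ad hoc choices; the inner sum over $m$ plays the role of the theta factor.

```python
import itertools
import math

def S_bruteforce(n, k, R=8):
    """Truncated sum of exp(-x^T M x) over the box {-R,...,R}^n,
    using x^T M x = ||x||^2 + s (e.x)^2 with s = k - 1."""
    s = k - 1
    total = 0.0
    for x in itertools.product(range(-R, R + 1), repeat=n):
        j = sum(x)
        total += math.exp(-(sum(xi * xi for xi in x) + s * j * j))
    return total

def T(j, n, R=8):
    """Truncated T(j) = sum of exp(-||x||^2) over x in Z^n with e.x = j."""
    return sum(
        math.exp(-sum(xi * xi for xi in x))
        for x in itertools.product(range(-R, R + 1), repeat=n)
        if sum(x) == j
    )

def S_theta(n, k, R=8, mmax=6):
    """Reassemble S_{M,k} = sum_j e^{-s j^2} T(j) * (theta factor),
    using only T(0), ..., T(n-1)."""
    s = k - 1
    total = 0.0
    for j in range(n):
        # Truncation of theta_3(i(ns+1)j, exp(-n^2 s - n)) to |m| <= mmax.
        theta = sum(
            math.exp(-(n * n * s + n) * m * (m + 2 * j / n))
            for m in range(-mmax, mmax + 1)
        )
        total += math.exp(-s * j * j) * T(j, n, R) * theta
    return total

for n, k in [(2, 2), (3, 2), (2, 3)]:
    print(n, k, S_bruteforce(n, k), S_theta(n, k))
```

For small $n$ and $k \ge 2$ the two values should agree to many digits, which is a quick sanity check on the recurrence for $T$ and on the theta factor.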
This is an optimistic approach, and since it is long I write it as an answer. First we set $$H(M,x)=\sum_{x\in \mathbb{Z}^n}{e^{-x^TMx}},$$ which is the summation you introduced. Let $J$ be the all-ones matrix of size $n$ and let $I$ be the identity matrix. Since $$M= kI+(k-1)(J-I),$$ the eigenvalues of the matrix $M$ are $$\lambda_1=\cdots=\lambda_{n-1}=1,\qquad \lambda_n=1+n(k-1).$$ Let $D={\rm diag}(\lambda_1,\lambda_2,\ldots,\lambda_n)$, a diagonal matrix of size $n$, and let $u_1,u_2,\ldots,u_n$ be normalized eigenvectors corresponding to these eigenvalues. Note that these eigenvectors can easily be chosen so that in the matrix $U=[u_1,u_2,\ldots,u_n]$ any two columns are orthogonal, i.e. $U$ is orthogonal. By the spectral decomposition we have $$M=U^TDU.$$ Now define the summation $$H(M,X)=\sum_{x_{ij}\in \mathbb{Z}}{e^{-X^TMX}},$$ where $X$ is an arbitrary integer matrix of size $n$. If we can estimate this summation, we can obtain an estimate for your summation. We have $$H(M,X)=\sum_{x_{ij}\in \mathbb{Z}}{e^{-(UX)^TD\,UX}}=(UX)^Te^{-D}UX=e^{-\lambda_1}p_1p_1^T+\cdots+e^{-\lambda_n}p_np_n^T=\frac{1}{e}\bigl({p_1p_1^T+p_2p_2^T+\cdots+p_{n-1}p_{n-1}^T}\bigr)+\frac{1}{e^{1+n(k-1)}}p_np_n^T,$$ where in the last summation I assumed that $UX$ is invertible, which holds in almost all cases. Note that $p_i=y_i^Tu_i^T$, where $X=[y_1,\ldots,y_n]^T$. Now, since you know the eigenvectors of $M$ completely, by some matrix inequalities and suitable choices of the matrix $X$ you can find a good upper bound for this summation. I think it can be viewed as an integer linear optimization problem.
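As a quick check on the spectral setup (my own sketch, not part of the answer), the following verifies the eigenvalues of $M = kI+(k-1)(J-I)$ and the orthogonal diagonalization numerically. Note that `numpy.linalg.eigh` returns eigenvector columns, so the factorization appears as $M = UDU^T$, which is the same decomposition up to transposing $U$.

```python
import numpy as np

n, k = 4, 3
J = np.ones((n, n))
M = k * np.eye(n) + (k - 1) * (J - np.eye(n))   # equals I + (k-1)J

evals, U = np.linalg.eigh(M)       # columns of U are orthonormal eigenvectors
print(np.round(evals, 10))         # expect 1 with multiplicity n-1, and 1 + n(k-1)
D = np.diag(evals)
print(np.allclose(M, U @ D @ U.T))            # spectral decomposition holds
print(np.allclose(U.T @ U, np.eye(n)))        # U is orthogonal
```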
I think this approach can be optimized further. Also, there may be a better way using the Fourier transform.