Investigating the maxima of a Gaussian integral over the sphere.

Let us write $\tilde{f}(x) = \frac{1}{|\mathbf{S}^{n-1}|}\int_{\mathbf{S}^{n-1}}e^{-\alpha|x-\omega|^2}dS(\omega)$. Since $\tilde{f}$ is radially symmetric, we may assume $x = re$ where $r = |x|$ and $e = (1,0,\ldots,0)$. Then we may write

$$\tilde{f}(re) = \frac{1}{|\mathbf{S}^{n-1}|}\int_{\mathbf{S}^{n-1}}e^{-\alpha(r^2-2r\langle e,\omega\rangle+1)}dS(\omega) = \frac{e^{-\alpha(r^2+1)} }{|\mathbf{S}^{n-1}|}\int_{\mathbf{S}^{n-1}}e^{2\alpha r\langle e,\omega\rangle}dS(\omega).$$
Disregarding the constant factor $e^{-\alpha}$, define $f(r) = \frac{e^{-\alpha r^2} }{|\mathbf{S}^{n-1}|}\int_{\mathbf{S}^{n-1}}e^{2\alpha r\langle e,\omega\rangle}dS(\omega)$. Observe that $X: \omega \mapsto \langle e,\omega\rangle\in [-1,1]$, with $\omega$ uniform on $\mathbf{S}^{n-1}$, is a random variable with probability density function $p_n(x) = c_n (1-x^2)^{\frac{n-3}{2}}$ on $[-1,1]$. (Here $c_n$ is the positive constant such that $\int_{-1}^{1}p_n(x)dx =1$.)

Thus we have $f(r) = e^{-\alpha r^2}\int_{-1}^{1} e^{2\alpha r x}p_n(x)dx\;(*)$, and note that $M_X(t) = \int_{-1}^{1} e^{t x}p_n(x)dx$ is the moment-generating function (mgf) of $X$, and $C_X(t) = \log M_X(t)$ is the cumulant-generating function of $X$.
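As a quick numerical sanity check of this reduction (not needed for the argument), the following Python sketch compares a Monte Carlo average over the sphere with the one-dimensional integral in $(*)$; the dimension $n=5$ and the values of $\alpha$ and $r$ are arbitrary illustrative choices.

```python
import numpy as np
from scipy.integrate import quad

rng = np.random.default_rng(0)
n, alpha, r = 5, 3.0, 0.7          # illustrative choices

# Monte Carlo estimate of the spherical average of exp(-alpha*|x - w|^2), with x = r*e
w = rng.standard_normal((200_000, n))
w /= np.linalg.norm(w, axis=1, keepdims=True)      # uniform samples on S^{n-1}
x = np.zeros(n); x[0] = r
mc = np.exp(-alpha * ((x - w) ** 2).sum(axis=1)).mean()

# One-dimensional reduction:  e^{-alpha(r^2+1)} * int_{-1}^1 e^{2 alpha r t} p_n(t) dt
c_n = 1.0 / quad(lambda t: (1 - t * t) ** ((n - 3) / 2), -1, 1)[0]
one_d = np.exp(-alpha * (r ** 2 + 1)) * quad(
    lambda t: np.exp(2 * alpha * r * t) * c_n * (1 - t * t) ** ((n - 3) / 2), -1, 1
)[0]

print(mc, one_d)    # the two values agree up to Monte Carlo error
```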

Taking the logarithm of both sides of $(*)$, we see that maximizing $f$ is equivalent to maximizing the function $$-\alpha r^2 + C_X(2\alpha r).$$

The first-order condition for this is:

$$r = C'_X(2\alpha r) = \frac{M'_X(2\alpha r)}{M_X(2\alpha r)}\quad \cdots (**).$$ Since $r=0$ is a trivial solution of $(**)$, we assume $r>0$.


What follows next is a probabilistic interpretation of this equation. Firstly, $$C'_X(t) = \frac{M'_X(t)}{M_X(t)} = E\left[X \frac{e^{tX}}{M_X(t)}\right] = \int x\cdot \frac{e^{tx}p_n(x)}{\int e^{ty}p_n(y)dy}dx$$ is the expected value of the exponentially tilted random variable $X_t$ with density $\frac{e^{tx}p_n(x)}{\int e^{ty}p_n(y)dy}$. Since $X_t$ takes values in $[-1,1]$ and is not concentrated at $1$, any positive solution $r = C'_X(2\alpha r)$ must be less than $1$.


Secondly, $C''_X(t) = \frac{M''_X(t)}{M_X(t)}- \left(\frac{M'_X(t)}{M_X(t)}\right)^2$ can, in the same way, be seen as the variance of $X_t$, and is therefore strictly positive (not zero, since the distribution of $X_t$ is not degenerate). So $C'_X(t)$ is a strictly increasing function. Moreover, $C'_X(t) \uparrow 1$ as $t\to \infty$, since the distribution of $X_t$ concentrates at the point $x=1$.
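Both properties are easy to see numerically. Here is a minimal sketch computing $C'_X(t)$ as the mean of the tilted variable by quadrature (the dimension $n=5$ and the sample values of $t$ are illustrative choices):

```python
import numpy as np
from scipy.integrate import quad

n = 5
c_n = 1.0 / quad(lambda x: (1 - x * x) ** ((n - 3) / 2), -1, 1)[0]
p_n = lambda x: c_n * (1 - x * x) ** ((n - 3) / 2)

def C_prime(t):
    """C'_X(t) = E[X e^{tX}] / E[e^{tX}], the mean of the tilted variable X_t."""
    num = quad(lambda x: x * np.exp(t * x) * p_n(x), -1, 1)[0]
    den = quad(lambda x: np.exp(t * x) * p_n(x), -1, 1)[0]
    return num / den

for t in [0.0, 0.5, 2.0, 10.0, 50.0]:
    print(t, C_prime(t))    # strictly increasing in t, stays below 1, approaches 1
```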

From the above argument, we see that for each $0<r<1$ there is a unique $$\alpha= A(r) := \frac{(C'_X)^{-1}(r)}{2r}$$ satisfying $(**)$. The uniqueness of the $r$ corresponding to each $\alpha$ follows once we show that $A(r)$ is strictly increasing on $(0,1)$. Since $C'_X$ is an increasing bijection from $(0,\infty)$ to $(0,1)$, we may substitute $r = C'_X(u)$, and it suffices to show that $1/A(r) = \frac{2C'_X(u)}{u}$ is decreasing in $u\in(0,\infty)$.
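Before proving this, one can already tabulate $A(r)$ numerically by inverting $C'_X$ with a root finder; a minimal sketch, again with the illustrative choice $n=5$:

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

n = 5
c_n = 1.0 / quad(lambda x: (1 - x * x) ** ((n - 3) / 2), -1, 1)[0]
p_n = lambda x: c_n * (1 - x * x) ** ((n - 3) / 2)

def C_prime(t):
    num = quad(lambda x: x * np.exp(t * x) * p_n(x), -1, 1)[0]
    den = quad(lambda x: np.exp(t * x) * p_n(x), -1, 1)[0]
    return num / den

def A(r):
    """A(r) = (C'_X)^{-1}(r) / (2r), with the inverse found by bisection."""
    hi = 1.0
    while C_prime(hi) < r:          # expand the bracket until C'_X(hi) >= r
        hi *= 2.0
    u = brentq(lambda t: C_prime(t) - r, 1e-12, hi)
    return u / (2.0 * r)

for r in [0.05, 0.2, 0.4, 0.6, 0.8]:
    print(r, A(r))    # increasing in r, and A(r) -> n/2 = 2.5 as r -> 0+
```

The printed values increase with $r$ and start near $n/2$, consistent with the monotonicity proved below and with the limit computed at the end.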


Our (almost) final claim is that $r \mapsto C'_X(r)/r$ is a decreasing function. By the change of variable $u = (x+1)/2$, we can write $$M_X(r) = e^{-r}\int_0^1 e^{2ru} \frac{u^{\frac{n-3}{2}}(1-u)^{\frac{n-3}{2}}}{B((n-1)/2,(n-1)/2)}du =: e^{-r}\Phi(2r).$$ (Here, $B(x,y)$ is the beta function and $\Phi$ is the mgf of the $\mathrm{Beta}\left(\frac{n-1}{2},\frac{n-1}{2}\right)$ distribution.)
We can calculate $\Phi(r)$ explicitly; omitting the computation, we have

$$\Phi(r) =\sum_{k=0}^{\infty} \frac{a_k}{k!}r^k,\quad a_k := \prod_{j=0}^{k-1} \frac{\frac{n-1}{2}+j}{n-1+j}$$ (see https://en.wikipedia.org/wiki/Beta_distribution). From this, using $M_X(t) = e^{-t}\Phi(2t)$, we have $$C'_X(r/2) = 2\left(\frac{\Phi'(r)}{\Phi(r)}-\frac{1}{2}\right). $$
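As a numerical cross-check of the series for $\Phi$ and of the identity $C'_X(r/2) = 2\left(\frac{\Phi'(r)}{\Phi(r)}-\frac12\right)$, here is a small sketch; the dimension $n=5$, the evaluation point $r=3$, and the truncation order $K$ are arbitrary choices.

```python
import numpy as np
from math import factorial
from scipy.integrate import quad
from scipy.special import beta

n, K = 5, 80                  # dimension and series truncation order (illustrative)

# a_k = prod_{j=0}^{k-1} ((n-1)/2 + j) / (n-1+j)
a = np.ones(K + 2)
for k in range(K + 1):
    a[k + 1] = a[k] * ((n - 1) / 2 + k) / (n - 1 + k)

Phi  = lambda r: sum(a[k]     * r ** k / factorial(k) for k in range(K + 1))   # series for Phi
dPhi = lambda r: sum(a[k + 1] * r ** k / factorial(k) for k in range(K + 1))   # term-wise derivative

# mgf of Beta((n-1)/2,(n-1)/2) by direct quadrature, for comparison
B = beta((n - 1) / 2, (n - 1) / 2)
Phi_num = lambda r: quad(lambda u: np.exp(r * u) * u ** ((n - 3) / 2) * (1 - u) ** ((n - 3) / 2), 0, 1)[0] / B

# C'_X(t) computed directly as the tilted mean of X
c_n = 1.0 / quad(lambda x: (1 - x * x) ** ((n - 3) / 2), -1, 1)[0]
def C_prime(t):
    num = quad(lambda x: x * np.exp(t * x) * c_n * (1 - x * x) ** ((n - 3) / 2), -1, 1)[0]
    den = quad(lambda x: np.exp(t * x) * c_n * (1 - x * x) ** ((n - 3) / 2), -1, 1)[0]
    return num / den

r = 3.0
print(Phi(r), Phi_num(r))                               # the series matches the integral
print(C_prime(r / 2), 2 * (dPhi(r) / Phi(r) - 0.5))     # and the identity holds
```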

Some more labor yields: $$\Phi'(r) - \frac{1}{2}\Phi(r) = \sum_{k\geq 0} (a_{k+1}-a_k/2)\frac{r^k}{k!} = r \sum_{k=0}^{\infty}(a_{k+2}-a_{k+1}/2)\frac{r^k}{(k+1)!}$$ (the constant term drops out because $a_1 = a_0/2 = \tfrac12$), and hence $$\frac{1}{4}\frac{C'_X(r/2)}{r/2} = \frac{\sum_{k=0}^{\infty}(a_{k+2}-a_{k+1}/2)\frac{r^k}{(k+1)!}}{\sum_{k=0}^{\infty} a_k\frac{r^k}{k!}}.$$


Using $\frac{a_{k+1}}{a_k} = \frac{1}{2}\left(1 + \frac{k}{n+k-1}\right)$, we have $$\sum_{k=0}^{\infty}(a_{k+2}-a_{k+1}/2)\frac{r^k}{(k+1)!} = \frac{1}{2}\sum_{k=0}^{\infty}\frac{a_{k+1}}{n+k}\frac{r^k}{k!}.$$ Finally, $$\frac{d}{dr}\left(\frac{1}{2}\frac{C'_X(r/2)}{r/2}\right) = \frac{\sum_{k=0}^{\infty}\frac{a_{k+2}}{n+k+1}\frac{r^k}{k!}\sum_{k=0}^{\infty} a_k\frac{r^k}{k!} -\sum_{k=0}^{\infty}\frac{a_{k+1}}{n+k}\frac{r^k}{k!}\sum_{k=0}^{\infty} a_{k+1}\frac{r^k}{k!} }{\left(\sum_{k=0}^{\infty} a_k\frac{r^k}{k!}\right)^2}.$$

Now the numerator is itself a power series whose $s$-th coefficient $c_s$ is given by $$c_s = \sum_{j=0}^s \left(\frac{a_{j+2} a_{s-j}}{n+j+1}-\frac{a_{j+1} a_{s-j+1}}{n+j}\right)\frac{1}{j!(s-j)!}.$$ Using the recursion formula for $a_k$ and then symmetrizing in $j \leftrightarrow s-j$, $$\begin{multline} c_s= -\sum_{j=0}^s \frac{(s-2j)(n-1)+2(s-j)}{(n+j)(n+j+1)(n+2s-2j-1)}\frac{a_{j+1} a_{s-j+1}}{j!(s-j)!}\\ = -\frac{1}{2}\sum_{j=0}^s \left [\frac{(s-2j)(n-1)+2(s-j)}{(n+j)(n+j+1)(n+2s-2j-1)} +\frac{(2j-s)(n-1)+2j}{(n+2j-1)(n+s-j)(n+s-j+1)} \right]\frac{a_{j+1} a_{s-j+1}}{j!(s-j)!}. \end{multline}$$

And, $$\begin{multline}\frac{(s-2j)(n-1)+2(s-j)}{(n+j)(n+j+1)(n+2s-2j-1)} +\frac{(2j-s)(n-1)+2j}{(n+2j-1)(n+s-j)(n+s-j+1)} \\ =(n-1)(s-2j)\left[ \frac{1}{(n+j)(n+j+1)(n+2s-2j-1)} - \frac{1}{(n+2j-1)(n+s-j)(n+s-j+1)}\right] + \text{positive terms}\\ >\frac{(n-1)(s-2j)^2\{2j(s-j) +(n-1)(s-3) -4\}}{(n+j)(n+j+1)(n+2j-1)(n+s-j)(n+s-j+1)(n+2s-2j-1)} \geq 0, \end{multline}$$ if $s\geq 3$ and $1\leq j\leq s-1$.

In the case $j=0$ or $j=s$, the bracket reduces to $$\frac{s}{n(n+2s-1)} - \frac{s}{(n+s)(n+s+1)},$$ which is easily checked to be $\geq 0$. So the negativity of $c_s$ for $s\geq 3$ is established. Some more labor yields: $$c_0 = 0, \quad c_1 = c_2 = -\frac{1}{4n^2(n+2)}.$$
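These coefficient computations are easy to double-check with exact rational arithmetic. The following is a minimal Python sketch (the dimension $n=4$ and the range of $s$ are arbitrary illustrative choices); it verifies the term-wise identity behind the series manipulation above, the values of $c_0, c_1, c_2$, and the sign of the first several $c_s$.

```python
from fractions import Fraction
from math import factorial

n, S = 4, 10          # dimension and highest coefficient index (illustrative choices)

# a_k = prod_{j=0}^{k-1} ((n-1)/2 + j)/(n-1+j), kept as exact rationals
a = [Fraction(1)]
for k in range(S + 2):
    a.append(a[-1] * Fraction(n - 1 + 2 * k, 2 * (n - 1 + k)))

# term-wise check of  (a_{k+2} - a_{k+1}/2) / (k+1)!  =  (1/2) a_{k+1} / ((n+k) k!)
assert all(
    (a[k + 2] - a[k + 1] / 2) / factorial(k + 1) == a[k + 1] / (2 * (n + k) * factorial(k))
    for k in range(S)
)

# c_s from the convolution formula
def c(s):
    return sum(
        (a[j + 2] * a[s - j] / (n + j + 1) - a[j + 1] * a[s - j + 1] / (n + j))
        * Fraction(1, factorial(j) * factorial(s - j))
        for j in range(s + 1)
    )

print(c(0), c(1), c(2), Fraction(-1, 4 * n ** 2 * (n + 2)))   # c_0 = 0, c_1 = c_2 = -1/(4 n^2 (n+2))
print(all(c(s) <= 0 for s in range(S + 1)))                    # every c_s is <= 0
```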

Since $c_0 = 0$ and all the other $c_s$ are negative, the numerator is negative for $r>0$, and the claim $\frac{d}{dr}\left(\frac{C'_X(r)}{r}\right)<0$ is proved.
Finally, observe that $\lim_{r\to 0^+}A(r) = \lim_{u\to 0^+}\frac{u}{2C'_X(u)} = \frac{1}{2C''_X(0)} = \frac{n}{2}.$ This is because $C''_X(0)$ is the variance of $X=\langle e,\omega\rangle=\omega_1$, and by symmetry among the coordinates this value is equal to $$\frac{1}{|\mathbf{S}^{n-1}|}\int_{\mathbf{S}^{n-1}}\omega_1^2 dS(\omega)=\frac{1}{|\mathbf{S}^{n-1}|}\int_{\mathbf{S}^{n-1}}\frac{\omega_1^2 +\cdots+\omega_n^2}{n} dS(\omega) = \frac{1}{n}.$$ Moreover, $A(r)\to\infty$ as $r\to 1^-$, since $(C'_X)^{-1}(r)\to\infty$. Hence, $A((0,1)) = (\frac{n}{2}, \infty)$ and we conclude that:
(1) if $0<\alpha \leq \frac{n}{2}$, then the unique global maximizer is $r=0$.
(2) if $\alpha >\frac{n}{2}$, then the unique global maximizer is $r = A^{-1}(\alpha)>0$. Here, $0$ is not a maximizer even though it satisfies the FOC, since the second-order condition is not satisfied: $$(-\alpha r^2 + C_X(2\alpha r))''\Big|_{r=0}=2\alpha(\frac{2\alpha}{n}-1) >0. $$
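As a final sanity check of the threshold $\alpha = n/2$ (and of $C''_X(0)=1/n$), one can maximize $-\alpha r^2 + C_X(2\alpha r)$ numerically for a few values of $\alpha$; a minimal sketch with the illustrative choice $n=3$:

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar

n = 3                       # illustrative dimension; the threshold is alpha = n/2 = 1.5
c_n = 1.0 / quad(lambda x: (1 - x * x) ** ((n - 3) / 2), -1, 1)[0]
p_n = lambda x: c_n * (1 - x * x) ** ((n - 3) / 2)

def C(t):
    """Cumulant-generating function C_X(t) = log E[e^{tX}]."""
    return np.log(quad(lambda x: np.exp(t * x) * p_n(x), -1, 1)[0])

def maximizer(alpha):
    # maximize g(r) = -alpha r^2 + C_X(2 alpha r) over r in [0, 1)
    res = minimize_scalar(lambda r: alpha * r ** 2 - C(2 * alpha * r),
                          bounds=(0.0, 0.999), method="bounded")
    return res.x

# variance check: C''_X(0) = Var(omega_1) = 1/n, via a central finite difference
h = 1e-3
print((C(h) - 2 * C(0.0) + C(-h)) / h ** 2, 1.0 / n)

for alpha in [0.5, 1.5, 1.6, 5.0]:
    print(alpha, maximizer(alpha))   # maximizer ~ 0 for alpha <= n/2, strictly positive above
```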