Prove that $|k(x)|\le C|x|^{-n}$ under suitable hypotheses on $k\in\mathcal{C}^1(\Bbb R^n\setminus\{0\})$

I'm only writing a small addition to Eric's nice proof as an answer because it's too big for a comment. I think the following may be a little simpler: Suppose $|x|=r,$ and write $rS$ for the sphere of radius $r$ centered at the origin. Because $rS$ is connected, $k$ is continuous on $rS,$ and the integral of $k$ over $rS$ is $0,$ the intermediate value theorem gives a point $x_0\in rS$ with $k(x_0) = 0.$ Any two points of $rS$ can be joined by a smooth path in $rS$ (an arc of a great circle) of length $\le \pi r.$ So let $\gamma :[0,1]\to rS$ be such a path connecting $x_0$ with $x,$ with $\gamma (0) = x_0, \gamma (1)=x.$ Define $g(t) = k(\gamma(t)).$ Then

$$k(x) = g(1)-g(0) = \int_0^1 g'(t) \, dt = \int_0^1 \nabla k(\gamma(t)) \cdot \gamma'(t) \, dt.$$

Take absolute values and use the gradient hypothesis $|\nabla k(z)| \le |z|^{-n-1}$ (so $|\nabla k(\gamma(t))| \le r^{-n-1},$ since $|\gamma(t)| = r$) to see

$$|k(x)| \le \int_0^1 |\nabla k(\gamma(t))|\,|\gamma'(t)| \, dt \le r^{-n-1} \int_0^1 |\gamma'(t)|\,dt \le r^{-n-1}\, \pi r = \pi r^{-n}.$$
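As a concrete sanity check (not part of the original argument, which leaves the hypotheses implicit), the odd kernel $k(x) = x_1/|x|^{n+1}$ satisfies both hypotheses up to a constant: it integrates to $0$ over every sphere $rS$ by oddness, and its gradient is homogeneous of degree $-n-1,$ so $|\nabla k(z)| \le C|z|^{-n-1}.$ For this kernel the conclusion can be verified directly:

$$|k(x)| = \frac{|x_1|}{|x|^{n+1}} \le \frac{|x|}{|x|^{n+1}} = |x|^{-n}.$$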


Assume the gradient condition. For any function $f : \mathbb{R}^n \to \mathbb{R}$, define $f_\varepsilon$ by $f_\varepsilon(x) = \varepsilon^{-n} f(x/\varepsilon)$. Changing variables on the sphere, $$ \int_{S^{n-1}} k_\varepsilon(x) \, d\sigma_{n-1}(x) = \varepsilon^{-1} \int_{[\lvert x \rvert = \varepsilon^{-1}]} k(x) \, d\sigma(x) = 0 \quad \forall \varepsilon > 0, $$ and by the gradient hypothesis $\lvert (\nabla k)(y) \rvert \leq \lvert y \rvert^{-n-1}$, $$ \lvert (\nabla k_\varepsilon)(x) \rvert = \varepsilon^{-n-1} \lvert (\nabla k)(x/\varepsilon) \rvert \leq \varepsilon^{-n-1} (\varepsilon^{-1})^{-n-1} = 1 \quad \forall x \in S^{n-1}. $$

Now for any $x,y \in S^{n-1}$, we can find a path along a great circle of $S^{n-1}$ connecting $x$ and $y$ of length $\leq \pi$. So by the fundamental theorem of calculus, $$ \lvert k_\varepsilon(x) - k_\varepsilon(y) \rvert \leq \pi \sup_{z \in S^{n-1}} \lvert (\nabla k_\varepsilon)(z) \rvert \leq \pi \quad \forall x,y \in S^{n-1}. $$

But if $\lvert k_\varepsilon(x) \rvert > 2 \pi$ for some $x \in S^{n-1}$, then $k_\varepsilon$ has constant sign on $S^{n-1}$, and so cannot have mean value $0$. Thus $\lvert k_\varepsilon(x) \rvert \leq 2 \pi$ for all $x \in S^{n-1}$. We have $$ \lvert k(x) \rvert = \lvert x \rvert^{-n} \lvert k_{\lvert x \rvert^{-1}}(x/\lvert x \rvert)\rvert \leq 2\pi \lvert x \rvert^{-n} $$ as desired.
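The last identity is just the definition of $k_\varepsilon$ unwound with $\varepsilon = \lvert x \rvert^{-1}$:

$$ k_{\lvert x \rvert^{-1}}\!\left(\frac{x}{\lvert x \rvert}\right) = \left(\lvert x \rvert^{-1}\right)^{-n} k\!\left(\frac{x/\lvert x \rvert}{\lvert x \rvert^{-1}}\right) = \lvert x \rvert^{n}\, k(x). $$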

Now assume the $\alpha$ condition. Observe how the $\alpha$ bound scales under $k \mapsto k_\varepsilon$: $$ \lvert k_\varepsilon(x + h) - k_\varepsilon(x) \rvert \leq {\lvert h \rvert^\alpha} \quad \forall x \in S^{n-1}, \forall \lvert h \rvert \leq 2^{-1}. $$ Now for any two points $x,y \in S^{n-1}$, connect them by $N$ additions of increments $h_1, \ldots, h_N$ with $\lvert h_j \rvert < 1/2$, chosen so that $x + \sum_{i=1}^j h_i \in S^{n-1}$ for all $1 \leq j \leq N$ and $x + \sum_{i=1}^N h_i = y$, where $N$ does not depend on $x$ and $y$. Splitting up with the triangle inequality gives $\lvert k_\varepsilon(x) - k_\varepsilon(y) \rvert \leq \sum_{j=1}^N \lvert h_j \rvert^\alpha \leq N$, a bound independent of $x, y \in S^{n-1}$. The rest of the proof is the same as in the gradient case.
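Assuming the $\alpha$ condition takes its standard form $\lvert k(x+h) - k(x) \rvert \leq \lvert h \rvert^\alpha \lvert x \rvert^{-n-\alpha}$ for $\lvert h \rvert \leq \lvert x \rvert / 2$ (the question leaves the exact hypothesis implicit), the scaled bound above comes from the same substitution as before: for $\lvert x \rvert = 1$ and $\lvert h \rvert \leq 2^{-1}$,

$$ \lvert k_\varepsilon(x+h) - k_\varepsilon(x) \rvert = \varepsilon^{-n} \left\lvert k\!\left(\frac{x+h}{\varepsilon}\right) - k\!\left(\frac{x}{\varepsilon}\right) \right\rvert \leq \varepsilon^{-n} \left(\frac{\lvert h \rvert}{\varepsilon}\right)^{\alpha} \left(\frac{1}{\varepsilon}\right)^{-n-\alpha} = \lvert h \rvert^\alpha. $$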