Is Standard Deviation the same as Entropy?

They are not the same. If you have a bimodal distribution with two peaks and let the spacing between them vary, the standard deviation increases as the distance between the peaks grows. However, the entropy $$H(f) = -\int f(x) \log f(x)\, dx$$ does not care about where the peaks are (as long as they do not overlap), so the entropy stays the same.
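
To make this concrete, here is a small numerical sketch of my own (not part of the original argument; the grid sizes and peak width are arbitrary choices): a 50/50 mixture of two narrow Gaussians whose centers are pulled apart. The standard deviation grows with the separation, while the differential entropy stays essentially constant once the peaks no longer overlap.

```python
import numpy as np

def mixture_pdf(x, d, s=0.1):
    """Equal-weight mixture of N(-d/2, s^2) and N(+d/2, s^2)."""
    g = lambda m: np.exp(-(x - m) ** 2 / (2 * s ** 2)) / (s * np.sqrt(2 * np.pi))
    return 0.5 * g(-d / 2) + 0.5 * g(d / 2)

x = np.linspace(-30, 30, 600_001)                    # fine grid over a wide support
dx = x[1] - x[0]
for d in [2, 5, 10, 20]:                             # distance between the two peaks
    f = mixture_pdf(x, d)
    sd = np.sqrt(np.sum(x ** 2 * f) * dx)            # mean is 0 by symmetry
    mask = f > 0
    h = -np.sum(f[mask] * np.log(f[mask])) * dx      # differential entropy in nats
    print(f"separation {d:2d}:  SD = {sd:6.2f},  H = {h:6.3f} nats")
```

The SD column grows roughly like half the separation, while $H$ stays at the value of a single narrow Gaussian plus $\ln 2$.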


More counterexamples:

  1. Let $X$ be a discrete random variable taking the two values $-a$ and $a$ with equal probability. Then the variance $\sigma_X^2=a^2$ increases with $a$, but the entropy is constant: $H(X)=1$ bit.

  2. Let $X$ be a discrete random variable taking values in $\{1, \dots, N\}$ with some arbitrary non-uniform distribution $p(X)$. If we permute the values of $p(X)$, the variance will change (it decreases if we move the larger probabilities towards the center), but the entropy is constant.

  3. Let $X$ be a continuous random variable with uniform density $p_X(x)=1/2$ on the interval $[-1,1]$. Now modify it so that the probability mass (on the same support) is pushed towards the extremes, say $p_Y(y)=|y|$. Then $\sigma^2_Y > \sigma_X^2$ but $H(Y)< H(X)$, since the uniform distribution maximizes the entropy for a fixed compact support; see the numerical check below.
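
Counterexample 3 can also be checked numerically; here is a minimal sketch of my own (not part of the original answer). On the support $[-1,1]$, the edge-heavy density $|y|$ should come out with variance $1/2$ versus $1/3$ for the uniform, but entropy $0.5$ nats versus $\ln 2 \approx 0.693$ nats.

```python
import numpy as np

y = np.linspace(-1, 1, 2_000_001)
dy = y[1] - y[0]

for name, p in [("uniform 1/2", np.full_like(y, 0.5)), ("|y|", np.abs(y))]:
    var = np.sum(y ** 2 * p) * dy                  # both densities have mean 0
    mask = p > 0
    h = -np.sum(p[mask] * np.log(p[mask])) * dy    # differential entropy in nats
    print(f"{name:12s}: variance = {var:.3f}, entropy = {h:.3f} nats")
```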


Entropy and Standard Deviation are certainly not the same, but Entropy in most cases (if not all) depends on the Standard Deviation of the distribution. Two examples:

For the Exponential distribution, with density function $$\lambda e^{-\lambda x},\;\; x\ge 0, \qquad SD=1/\lambda,$$ we have

$$H(X) = 1-\ln\lambda = 1+\ln SD$$

So as the SD increases, so does the (here differential) Entropy.
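
As a sanity check (my own sketch, not part of the answer), numerically integrating $-\int f \ln f$ for a few values of $\lambda$ reproduces the closed form $1-\ln\lambda = 1+\ln SD$ and shows the entropy rising with the SD:

```python
import numpy as np

for lam in [4.0, 1.0, 0.25]:
    sd = 1.0 / lam
    x = np.linspace(0.0, 50.0 * sd, 1_000_001)     # truncate far out in the tail
    dx = x[1] - x[0]
    f = lam * np.exp(-lam * x)
    h_numeric = -np.sum(f * np.log(f)) * dx        # numerical -∫ f ln f
    h_closed = 1.0 - np.log(lam)                   # = 1 + ln(SD)
    print(f"lambda = {lam:5.2f} (SD = {sd:5.2f}):  H ≈ {h_numeric:.4f},  closed form {h_closed:.4f}")
```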

For the Normal distribution, with density function $$\frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x - \mu)^2}{2 \sigma^2}}, \;\; SD = \sigma$$ we have

$$H(X) = \frac12 \ln(2 \pi e \, \sigma^2) = \frac12 \ln(2 \pi e) +\ln SD $$ so again the differential Entropy increases with the SD.

(Note that differential Entropy can be negative).
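
The same kind of check for the Normal case (again my own sketch, with arbitrary grid choices): the numerically integrated entropy matches $\frac12\ln(2\pi e\,\sigma^2)$, increases with the SD, and indeed turns negative once $\sigma < 1/\sqrt{2\pi e} \approx 0.242$:

```python
import numpy as np

for sigma in [0.1, 0.242, 1.0, 5.0]:
    x = np.linspace(-12.0 * sigma, 12.0 * sigma, 1_000_001)
    dx = x[1] - x[0]
    f = np.exp(-x ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2.0 * np.pi))
    h_numeric = -np.sum(f * np.log(f)) * dx                   # numerical -∫ f ln f
    h_closed = 0.5 * np.log(2.0 * np.pi * np.e * sigma ** 2)  # = 0.5*ln(2πe) + ln(SD)
    print(f"sigma = {sigma:5.3f}:  H ≈ {h_numeric:7.4f},  closed form {h_closed:7.4f} nats")
```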