Theorem stating how much of a population is within $n$ standard deviations from the mean

Yes, part of it is true, and it comes from Chebyshev's inequality. It says that $$P(|X-\mu|\le n\sigma)\ge 1-\frac1{n^2}.$$ This gives the mentioned $0.75$ for $n=2$, but the $0.5$ actually appears if you take $n=\sqrt2=1.41\ldots$; for $n=1$ it says nothing.
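As a quick illustration (my own sketch, not part of the inequality itself), evaluating the lower bound $1-\frac1{n^2}$ at these values of $n$ reproduces the numbers above:

```python
# Evaluate Chebyshev's lower bound 1 - 1/n^2 at the values of n
# discussed above; n = 1 gives the vacuous bound 0.
for n in (1.0, 2**0.5, 2.0, 3.0):
    bound = 1 - 1 / n**2
    print(f"n = {n:.3f}: P(|X - mu| <= n*sigma) >= {bound:.3f}")
# n = 1.000 -> 0.000,  n = 1.414 -> 0.500,  n = 2.000 -> 0.750,  n = 3.000 -> 0.889
```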

This is valid for any distribution, provided it has a well-defined finite mean and a finite variance (equivalently, a finite standard deviation). For specific distributions the exact probabilities can be computed, and they are necessarily at least as large as the bound from this theorem (often much larger, for the most common distributions).
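For instance, a minimal sketch comparing the bound with the exact two-sided probability of a standard normal distribution (assuming `scipy` is available; the script is illustrative only):

```python
# Compare Chebyshev's lower bound with the exact probability
# P(|X - mu| <= n*sigma) for X ~ N(0, 1).
from scipy.stats import norm

for n in (2**0.5, 2.0, 3.0):
    chebyshev = max(0.0, 1 - 1 / n**2)
    exact = norm.cdf(n) - norm.cdf(-n)  # exact two-sided normal probability
    print(f"n = {n:.3f}: bound {chebyshev:.3f}  vs  normal {exact:.3f}")
```

For $n=2$ the normal distribution gives about $0.954$, well above the guaranteed $0.75$.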


You're thinking of Chebyshev's inequality.

For any $k>0$, in any distribution with a finite mean and standard deviation, the probability of being more than $k$ standard deviations away from the mean is no more than $\frac1{k^2}$.

Most distributions, of course, are much tighter than this; the theorem describes the worst case.
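To see that the bound cannot be improved in general, here is a simulation sketch (my own illustration) of the extremal distribution that attains equality: it puts mass $\frac1{2k^2}$ at each of $\pm k$ and the remaining mass at $0$, so $\mu=0$, $\sigma=1$, and $P(|X-\mu|\ge k\sigma)=\frac1{k^2}$ exactly.

```python
# Simulate the distribution attaining Chebyshev's bound with equality:
# P(X = -k) = P(X = k) = 1/(2k^2), P(X = 0) = 1 - 1/k^2.
import random

k = 2.0
p_tail = 1 / (2 * k**2)  # probability of each tail point

def sample():
    u = random.random()
    if u < p_tail:
        return -k
    if u < 2 * p_tail:
        return k
    return 0.0

draws = [sample() for _ in range(100_000)]
var = sum(x**2 for x in draws) / len(draws)       # mean is 0 by construction
tail = sum(abs(x) >= k for x in draws) / len(draws)
print(f"variance ~ {var:.3f}, P(|X| >= k*sigma) ~ {tail:.3f}")
# Expected: variance ~ 1.000 and tail ~ 1/k^2 = 0.250, matching the bound.
```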