local behavior of a finite Borel measure

If I am not mistaken, the answer is no (even for $\sigma$-finite measures). We may assume we are working with a finite measure $\mu$ on $[0,1]^d$.

Consider the sets $A_k = \{ x \ : \ \liminf_{r\to 0} r\,|\log \mu(B(x,r))| > 1/k\}$. Since the set where this liminf is positive is $\bigcup_k A_k$, it is enough to show that each $A_k$ has zero measure.

Now, $A_k \subseteq \bigcap_N \bigcup_{n>N} B_n$, where $B_n = \{x \ : \ \mu(B(x,1/n))< e^{-n/k} \}$.

To see this, notice that for every $x\in A_k$ there exists $n_0$ such that for every $n>n_0$ we have $\frac{1}{n}\,|\log \mu(B(x,1/n))| > 1/k$, that is, $\mu(B(x,1/n)) < e^{-n/k}$; hence $x\in B_n$ for all $n>n_0$, and in particular $x\in\bigcup_{n>N} B_n$ for every $N$.

So, since $\mu(A_k) \leq \mu\big(\bigcup_{n>N} B_n\big) \leq \sum_{n>N} \mu(B_n)$ for every $N$, it is enough to show that $\sum_{n>N} \mu(B_n)\to 0$ as $N\to \infty$.

For every $n>0$ we can divide $[0,1]^d$ into $C n^d$ cubes which cover $[0,1]^d$ and each of which has diameter smaller than, say, $1/(10n)$.
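For concreteness, one possible choice: take the grid of closed cubes of side length $s = 1/\lceil 10\sqrt{d}\,n\rceil$. Each such cube has diameter $s\sqrt{d}\le 1/(10n)$, and the number of cubes meeting $[0,1]^d$ is $$ (1/s)^d = \big\lceil 10\sqrt{d}\,n\big\rceil^d \le \big(10\sqrt{d}+1\big)^d n^d =: C n^d. $$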

Now, $\mu(B_n) \leq \sum_{i\in I} \mu(Q_i)$, where the $Q_i$, $i\in I$, are the cubes that intersect $B_n$. For each $i\in I$, pick a point $x\in Q_i\cap B_n$; since $Q_i$ has diameter smaller than $1/(10n)$, it is contained in $B(x,1/n)$, and therefore $$\mu(Q_i) \leq \mu(B(x,1/n)) < e^{-n/k}.$$

So, $\mu(B_n) \leq C n^d e^{-n/k}$, which is summable in $n$, and therefore $\sum_{n>N} \mu(B_n) \to 0$ as $N\to \infty$.
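As a quick sanity check of the statement (not needed for the proof), here is a small numerical illustration with the standard Gaussian measure on $\mathbb{R}$, for which $\mu(B(x,r)) = \Phi(x+r)-\Phi(x-r)$ can be evaluated exactly; the quantity $r\,|\log\mu(B(x,r))|$ visibly tends to $0$ as $r\to 0$:

```python
# Numerical sanity check (illustration only): for the standard Gaussian
# measure on R, mu(B(x,r)) = Phi(x+r) - Phi(x-r), and r*|log mu(B(x,r))|
# tends to 0 as r -> 0, consistent with the claim above.
import math

def Phi(t):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

def ball_mass(x, r):
    return Phi(x + r) - Phi(x - r)

for x in (0.0, 1.0, 2.5):
    for r in (1e-1, 1e-2, 1e-4, 1e-6, 1e-8):
        m = ball_mass(x, r)
        print(f"x={x:4.1f}  r={r:.0e}  r*|log mu(B(x,r))| = {r * abs(math.log(m)):.3e}")
```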


I think in order to prove the sharp result you need to use a covering theorem. The best option seems to be Vitali's covering theorem for Radon measures, which says the following: let $\mu$ be a Radon measure on $\mathbb{R}^d$, let $A\subset\mathbb{R}^d$ be a set, and let $\mathcal{B}$ be a family of balls such that each point of $A$ is the center of balls in $\mathcal{B}$ of arbitrarily small radius. Then there is a countable pairwise disjoint collection $\{ B_i\}\subset\mathcal{B}$ such that $\mu(A\setminus \bigcup_i B_i)=0$. For a proof, see e.g. Mattila's book, Theorem 2.8.

With this theorem it is easy to prove the following: if $\mu$ is a Radon measure on $\mathbb{R}^d$, then $$ \liminf_{r\to 0}\frac{\mu(B(x,r))}{r^d} > 0 \quad \text{for } \mu\text{-a.e. } x. $$

To prove this, suppose $\mu$ is a measure such that the claim fails. We may assume without loss of generality that $\mu$ has bounded support (otherwise, let $\mu_N$ be the restriction of $\mu$ to $B(0,N)$; the theorem holds for all $\mu_N$ so it also holds for $\mu$).

Fix a small $\varepsilon>0$. Let $\mathcal{B}$ be the family of all balls $B$ of radius $r\le 1$ such that $\mu(B) < \varepsilon r^d$. Moreover, let $A$ be the set of all $x$ such that $\mu(B(x,r)) < \varepsilon r^d$ for arbitrarily small $r>0$. Then $A$ satisfies the hypotheses of Vitali's covering theorem for the family $\mathcal{B}$. Also note that $A\supset A'$, where $$ A' = \{x\in\mathbb{R}^d: \liminf_{r\to 0}\frac{\mu(B(x,r))}{r^d} = 0\}. $$ By assumption, $\mu(A')>0$.

Now let $\{ B_i\}$ be the disjoint collection of balls provided by Vitali's covering theorem. On the one hand, we have $$ \mu(\bigcup_i B_i) \ge \mu(A) \ge \mu(A') >0. $$ On the other hand, by the definition of the family $\mathcal{B}$ and the fact that the $B_i$ are pairwise disjoint, $$ \mu(\bigcup_i B_i) = \sum_i \mu(B_i) \le \varepsilon \sum_i r_i^d \le C \varepsilon, $$ where $r_i$ is the radius of $B_i$, the sums may be restricted to the balls that meet the support of $\mu$ (the other balls contribute nothing), and $C$ is a constant that depends only on the size of the support of $\mu$. By taking $\varepsilon$ small enough we obtain a contradiction.
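Spelling out the volume comparison behind the constant $C$: if $\operatorname{spt}\mu\subset B(0,R)$, then each $B_i$ that meets the support has radius at most $1$ and is therefore contained in $B(0,R+2)$, so that, with $\omega_d$ the volume of the unit ball and $|\cdot|$ Lebesgue measure, $$ \omega_d \sum_i r_i^d = \sum_i |B_i| = \Big|\bigcup_i B_i\Big| \le |B(0,R+2)|, $$ i.e. $\sum_i r_i^d \le (R+2)^d$, the sum running over the balls meeting the support.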

As a corollary we find that, indeed, it is impossible to have $\mu(B(x,r))\sim r^\alpha$ as $r\to 0$ with $\alpha>d$ on a set of positive $\mu$-measure, since at such points $\mu(B(x,r))/r^d \to 0$.
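A numerical illustration of the corollary (assuming SciPy is available; the measure and test points are arbitrary choices): for the standard Gaussian measure $\mu$ on $\mathbb{R}^2$, $\mu(B(x,r))$ is the CDF at $r^2$ of a noncentral $\chi^2$ distribution with $2$ degrees of freedom and noncentrality $|x|^2$, and the ratio $\mu(B(x,r))/r^2$ stabilizes at a positive value ($\pi$ times the Gaussian density at $x$), so a decay of order $r^\alpha$ with $\alpha>2$ is impossible:

```python
# Illustration (assumes SciPy): for the standard Gaussian measure mu on R^2,
# mu(B(x,r)) = P(|Z - x| <= r) with Z ~ N(0, I_2), i.e. the CDF at r^2 of a
# noncentral chi-square with 2 degrees of freedom and noncentrality |x|^2.
# The ratio mu(B(x,r))/r^2 converges to pi * (Gaussian density at x) > 0,
# so mu(B(x,r)) cannot decay like r^alpha with alpha > 2.
import math
from scipy.stats import ncx2

def ball_mass(x, r):
    nc = x[0] ** 2 + x[1] ** 2           # noncentrality parameter |x|^2
    return ncx2.cdf(r ** 2, df=2, nc=nc)

for x in ((0.5, 0.0), (1.0, 2.0)):
    density = math.exp(-(x[0] ** 2 + x[1] ** 2) / 2) / (2 * math.pi)
    print(f"x = {x}, pi * density(x) = {math.pi * density:.5f}")
    for r in (1e-1, 1e-2, 1e-3):
        print(f"  r = {r:.0e}   mu(B(x,r)) / r^2 = {ball_mass(x, r) / r ** 2:.5f}")
```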


If you are willing to replace $\mathbb{R}^n$ by a Banach space $B$, then it is perfectly possible. For example, the Wiener measure $\mu$ on the space of continuous functions from $[0,1]$ to $\mathbb{R}$ satisfies that, for $\mu$-almost every $x$, $\mu(B(x,r))$ behaves like $\exp(-c/r^2)$ for small $r$. (The keyword for this is "small ball estimates".) If you are willing to forgo separability of $B$, then you can even construct probability measures with the property that, for every $x \in B$, $\mu(B(x,1)) = 0$!
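To get a feel for the Wiener small ball rate numerically, here is a quick check at the single point $x=0$ only (the constant $\pi^2/8$ is specific to the centered ball in the sup norm, and this does not illustrate the a.e. statement), based on the classical series for the distribution of $\sup_{t\le 1}|W_t|$:

```python
# Illustration: small ball probabilities for Wiener measure at x = 0.
# Using the classical series for the sup norm of Brownian motion on [0,1],
#   P( sup_{t<=1} |W_t| < r ) = (4/pi) * sum_{k>=0} (-1)^k/(2k+1) * exp(-(2k+1)^2 pi^2 / (8 r^2)),
# one sees that -r^2 * log mu(B(0,r)) approaches pi^2/8, i.e. mu(B(0,r)) decays
# like exp(-c/r^2).  (This only checks the centered ball, not mu-a.e. x.)
import math

def small_ball_prob(r, terms=200):
    total = 0.0
    for k in range(terms):
        total += (-1) ** k / (2 * k + 1) * math.exp(-(2 * k + 1) ** 2 * math.pi ** 2 / (8 * r ** 2))
    return 4.0 / math.pi * total

print("pi^2/8 =", math.pi ** 2 / 8)
for r in (0.5, 0.3, 0.2, 0.1):
    p = small_ball_prob(r)
    print(f"r={r:4.2f}  -r^2 * log P = {-r ** 2 * math.log(p):.4f}")
```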

The canonical example of this non-separable phenomenon is to take for $\mu$ the law of a sequence of i.i.d. standard normal random variables and for $B$ the space of sequences $x = (x_n)$ such that $$ \|x\| := \sup_{n\ge 1} \Big(2^{1-n}\sum_{k=2^{n-1}}^{2^n-1} x_k^2\Big)^{1/2} < \infty\;. $$ The claim then follows from the strong law of large numbers: for any fixed $x\in B$ and $\mu$-almost every $y$, the block averages $2^{1-n}\sum_{k=2^{n-1}}^{2^n-1} y_k^2$ tend to $1$ while the cross terms $2^{1-n}\sum_{k=2^{n-1}}^{2^n-1} x_k y_k$ tend to $0$, so $\|y-x\|\ge 1$ and hence $\mu(B(x,1))=0$.
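A quick simulation of this example (illustration only; the seed and block count are arbitrary choices): draw two independent samples $x$, $y$ from $\mu$ and look at the block quantities entering the norm. They settle near $1$ for $y$ (so $y$ lies in $B$) and near $\sqrt{2}>1$ for $y-x$ (so $\|y-x\|\ge 1$, i.e. an independent sample falls outside $B(x,1)$):

```python
# Simulation (illustration only; seed and block count are arbitrary choices).
# x and y are two independent samples from mu.  The block quantities
#   q_n(z) = ( 2^{1-n} * sum_{k=2^{n-1}}^{2^n - 1} z_k^2 )^{1/2}
# settle near 1 for z = y (so y lies in B) and near sqrt(2) > 1 for z = y - x
# (so ||y - x|| >= 1, i.e. y falls outside the ball B(x, 1)).
import numpy as np

rng = np.random.default_rng(0)
N = 22                                   # number of blocks
x = rng.standard_normal(2 ** N)          # x[k] plays the role of x_k (index 0 unused)
y = rng.standard_normal(2 ** N)

def block(z, n):
    """(2^{1-n} * sum_{k=2^{n-1}}^{2^n - 1} z_k^2)^{1/2}"""
    return float(np.sqrt(2.0 ** (1 - n) * np.sum(z[2 ** (n - 1): 2 ** n] ** 2)))

for n in (5, 10, 15, 20, 22):
    print(f"n={n:2d}   q_n(y) = {block(y, n):.3f}   q_n(y - x) = {block(y - x, n):.3f}")
```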