Numerical phenomenon. Who can explain?

EDIT: I saw that you solved it yourself. Congrats! I'm posting this anyway because I was most of the way through typing it when your answer hit.

Infinite products are hard, in general; infinite sums are better, because we have lots of tools at our disposal for handling them. Fortunately, we can always turn a product into a sum via a logarithm.

Let $X_i \sim \operatorname{Uniform}(0, r)$, and let $Y_n = \prod_{i=1}^{n} X_i$. Note that $\log(Y_n) = \sum_{i=1}^n \log(X_i)$. The eventual emergence of $e$ is already hinted at, even though we haven't really done anything yet.

The more useful formulation here is that $\frac{\log(Y_n)}{n} = \frac 1 n \sum_{i=1}^n \log(X_i)$, because we know from the Strong Law of Large Numbers that the right side converges almost surely to $\mathbb E[\log(X_i)]$. We have $$\mathbb E \log(X_i) = \int_0^r \log(x) \cdot \frac 1 r \, \textrm d x = \frac 1 r [x \log(x) - x] \bigg|_0^r = \log(r) - 1.$$
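As a quick sanity check, here is a minimal Python sketch (the function name and sample size are my own, not from the post) that averages $\log(X_i)$ over many draws and compares against $\log(r) - 1$:

```python
import math
import random

def mean_log_uniform(r, n, seed=0):
    """Empirical mean of log(X_i) for i.i.d. X_i ~ Uniform(0, r)."""
    rng = random.Random(seed)
    return sum(math.log(rng.uniform(0, r)) for _ in range(n)) / n

# By the SLLN, each empirical mean should be close to log(r) - 1.
# Note the middle case: at r = e, the limit is exactly 0.
for r in (2.0, math.e, 4.0):
    print(r, mean_log_uniform(r, 200_000), math.log(r) - 1)
```

With a couple hundred thousand draws the empirical mean lands within a few thousandths of the theoretical value.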

If $r < e$, then $\log(Y_n) / n \to c < 0$, which implies that $\log(Y_n) \to -\infty$, hence $Y_n \to 0$. Similarly, if $r > e$, then $\log(Y_n) / n \to c > 0$, whence $Y_n \to \infty$. The fun case is: what happens when $r = e$?
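To see the dichotomy numerically (an illustrative sketch; the helper and its parameters are mine, not from the post), one can multiply the draws directly and watch the product collapse or explode:

```python
import random

def product_yn(r, n, seed=2):
    """Y_n = X_1 * X_2 * ... * X_n for i.i.d. X_i ~ Uniform(0, r)."""
    rng = random.Random(seed)
    y = 1.0
    for _ in range(n):
        y *= rng.uniform(0, r)
    return y

# r = 2 < e: Y_n collapses toward 0 (and may underflow to exactly 0.0).
# r = 3 > e: Y_n blows up (and may overflow to inf for very large n).
print(product_yn(2.0, 5000))
print(product_yn(3.0, 5000))
```

This also shows why the logarithm is the right tool computationally as well as analytically: the direct product hits the limits of floating point long before the asymptotics become visible.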


I found the answer! One starts with the uniform distribution on $ [0,R] $. The natural logarithm pushes this distribution forward to a distribution on $ (-\infty, \ln(R) ] $ with density $ p(y) = e^y / R $ for $ y \in (-\infty, \ln(R)] $. The expected value of this distribution is $$ \int_{-\infty}^{\ln(R)}\frac{y e^y}{R} \,\mathrm dy = \ln(R) - 1 .$$ Setting this equal to zero gives $R = e$: the answer to the riddle! Love it!
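For completeness, the stated value of the integral follows from integration by parts (a verification step I'm adding, not part of the original post): $$ \int_{-\infty}^{\ln(R)} \frac{y e^y}{R} \,\mathrm dy = \frac 1 R \Big[ (y - 1) e^y \Big]_{-\infty}^{\ln(R)} = \frac 1 R \big( \ln(R) - 1 \big) R = \ln(R) - 1, $$ since $\frac{\mathrm d}{\mathrm dy}\big[(y-1)e^y\big] = y e^y$ and $(y-1)e^y \to 0$ as $y \to -\infty$.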