Product distribution of two uniform distributions: what about 3 or more?

We can at least work out the distribution of the product of two IID ${\rm Uniform}(0,1)$ variables $X_1, X_2$: let $Z_2 = X_1 X_2$. Then the CDF is $$\begin{align*} F_{Z_2}(z) &= \Pr[Z_2 \le z] = \int_{x=0}^1 \Pr[X_2 \le z/x] f_{X_1}(x) \, dx \\ &= \int_{x=0}^z \, dx + \int_{x=z}^1 \frac{z}{x} \, dx \\ &= z - z \log z, \end{align*}$$ where the split uses the fact that $\Pr[X_2 \le z/x] = 1$ when $x \le z$. Thus the density of $Z_2$ is $$f_{Z_2}(z) = -\log z, \quad 0 < z \le 1.$$ For a third variable, we would write $$\begin{align*} F_{Z_3}(z) &= \Pr[Z_3 \le z] = \int_{x=0}^1 \Pr[X_3 \le z/x] f_{Z_2}(x) \, dx \\ &= -\int_{x=0}^z \log x \, dx - \int_{x=z}^1 \frac{z}{x} \log x \, dx. \end{align*}$$ Taking the derivative then gives $$f_{Z_3}(z) = \frac{1}{2} \left( \log z \right)^2, \quad 0 < z \le 1.$$ In general, we can conjecture that $$f_{Z_n}(z) = \begin{cases} \frac{(- \log z)^{n-1}}{(n-1)!}, & 0 < z \le 1, \\ 0, & {\rm otherwise},\end{cases}$$ which we can prove by induction on $n$. I leave this as an exercise.
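As a quick numerical sanity check of the conjecture, here is a minimal Monte Carlo sketch in Python (the choices of $n$, the sample size, and the bin count are arbitrary):

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(0)
n, samples = 3, 10**6   # arbitrary choices; any n >= 1 works

# product of n IID Uniform(0,1) draws
z = rng.uniform(size=(samples, n)).prod(axis=1)

# compare an empirical histogram against the conjectured density
hist, edges = np.histogram(z, bins=50, range=(0.0, 1.0), density=True)
mids = 0.5 * (edges[:-1] + edges[1:])
conjectured = (-np.log(mids))**(n - 1) / factorial(n - 1)

print(np.abs(hist - conjectured).max())  # small, except in the first bin,
# where the density blows up and a midpoint comparison is crude
```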


If $X_1$ is uniform, then $-\log X_1 \sim \textrm{Exp}(1)$. Therefore, $$- \log (X_1 \cdots X_n) = -\log X_1 - \dots - \log X_n$$ is a sum of $n$ independent exponential random variables and hence has a Gamma distribution with parameters $(n,1)$ and density $g(y) = \frac{1}{(n-1)!} y^{n-1}e^{-y}$ for $y\geq 0$. Let $f$ be the density of the product $X_1 \cdots X_n$. Then the change-of-variables formula yields $$ f( h^{-1}(y) ) \, | (h^{-1})'(y) | = g(y), $$ with $h(x) = -\log x$ and $h^{-1}(y) = \exp(-y)$. The substitution $y=h(x)$ in the above equation gives $$ f(x) = \frac{1}{(n-1)!}(-\log x)^{n-1} \, 1_{(0,1]}(x).$$
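Here is a minimal sketch checking the intermediate Gamma claim, assuming NumPy and SciPy are available (the values of $n$ and the sample size are arbitrary):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, samples = 4, 10**5   # arbitrary choices

# -log(X_1 * ... * X_n) as a sum of n Exp(1) variables
y = -np.log(rng.uniform(size=(samples, n))).sum(axis=1)

# Kolmogorov-Smirnov test against Gamma(n, 1); expect a large p-value
print(stats.kstest(y, stats.gamma(a=n).cdf))
```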




PDF of a Function of a Random Variable

If $P(X\le x)=F(x)$ is the CDF of $X$ and $P(Y\le y)=G(y)$ is the CDF of $Y$ where $Y=f(X)$ for an increasing function $f$, then $$ F(x)=P(X\le x)=P(Y\le f(x))=G(f(x))\tag1 $$ Taking the derivative of $(1)$, we get $$ F'(x)=G'(f(x))\,f'(x)\tag2 $$ where $F'$ is the PDF of $X$ and $G'$ is the PDF of $Y$.
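As a quick sanity check of $(2)$: take $X$ uniform on $[0,1]$ and $f(x)=\log(x)$, so that $G(y)=e^y$ for $y\le0$. Then $(2)$ reads $$ F'(x)=G'(\log(x))\cdot\frac1x=e^{\log(x)}\cdot\frac1x=1, $$ recovering the uniform density, consistent with the next section.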


PDF of the Product of Independent Uniform Random Variables

If $[0\le x\le1]$ is the PDF for $X$ and $Y=\log(X)$, then by $(2)$ the PDF of $Y$ is $e^y[y\le0]$. The PDF for the sum of $n$ samples of $Y$ is the $n$-fold convolution of $e^y[y\le0]$ with itself. The Fourier Transform of this $n$-fold convolution is the $n^\text{th}$ power of the Fourier Transform of $e^y[y\le0]$, which is $$ \int_{-\infty}^0 e^{-2\pi iyt}e^y\,\mathrm{d}y=\frac1{1-2\pi it}\tag3 $$ Thus, the PDF for the sum of $n$ samples of $Y$ is $$ \begin{align} \sigma_n(y) &=\int_{-\infty}^\infty\frac{e^{2\pi iyt}}{(1-2\pi it)^n}\,\mathrm{d}t\tag{4a}\\ &=\frac{e^y}{2\pi i}\int_{1-i\infty}^{1+i\infty}\frac{e^{-yz}}{z^n}\,\mathrm{d}z\tag{4b}\\ &=e^y\frac{(-y)^{n-1}}{(n-1)!}\,[y\le0]\tag{4c} \end{align} $$ Explanation:
$\text{(4a)}$: take the inverse Fourier Transform
$\text{(4b)}$: substitute $t=\frac{1-z}{2\pi i}$
$\text{(4c)}$: if $y\gt0$, close the contour on the right half-plane, missing the singularity at $z=0$
$\phantom{\text{(4c):}}$ if $y\le0$, close the contour on the left half-plane, enclosing the singularity at $z=0$
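To check $\text{(4c)}$ numerically, here is a small sketch that approximates the $n$-fold convolution of $e^y[y\le0]$ on a grid, assuming NumPy (the grid parameters are arbitrary choices):

```python
import numpy as np
from math import factorial

n, L, m = 3, 30.0, 6000          # arbitrary grid choices
y = np.linspace(-L, 0.0, m)
dy = y[1] - y[0]
g = np.exp(y)                    # density of Y = log(X), supported on y <= 0

# n-fold convolution of g with itself, keeping the part on [-L, 0]
s = g.copy()
for _ in range(n - 1):
    s = np.convolve(s, g)[-m:] * dy

expected = np.exp(y) * (-y)**(n - 1) / factorial(n - 1)   # formula (4c)
print(np.abs(s - expected).max())  # should shrink as the grid is refined
```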

We can get the PDF for the product of $n$ samples of $X$ by applying $(2)$ to $\text{(4c)}$, i.e. $\pi_n(x)=\sigma_n(\log(x))\,\frac1x$: $$ \bbox[5px,border:2px solid #C0A000]{\pi_n(x)=\frac{(-\log(x))^{n-1}}{(n-1)!}\,[0\le x\le1]}\tag5 $$
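As a final consistency check of $(5)$: by independence, $\mathrm{E}[X_1\cdots X_n]=2^{-n}$, and integrating against the boxed density should reproduce this. A minimal sketch, assuming SciPy:

```python
from math import factorial
import numpy as np
from scipy.integrate import quad

def pi_n(x, n):
    """The boxed density (5) for the product of n Uniform(0,1) samples."""
    return (-np.log(x))**(n - 1) / factorial(n - 1)

# by independence, E[X_1 ... X_n] = 2^{-n}; the density should agree
for n in range(1, 6):
    mean, _ = quad(lambda x: x * pi_n(x, n), 0, 1)
    print(n, round(mean, 10), 0.5**n)
```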