Existence of iid random variables

Saz, your approach is correct. In fact, the result you need is known as Kolmogorov's existence theorem. One place to find the proof is page 482 of *Probability and Measure* by Patrick Billingsley. In that book you can see how to construct independent processes and random variables from their finite-dimensional distributions. These independent processes and random variables are projections on the product space, and they do indeed have the prescribed distributions.


I cannot comment yet, so I'm posting this as an answer.$\def\ci{\perp\!\!\!\perp}$

This is probably not what you were asking, but I think it's interesting and relevant enough to post.

It is well known that one can construct arbitrary distributions from uniform random variables. Furthermore, given a single $\mathcal U[0,1]$ variable, it is possible to produce from it an i.i.d. sequence of such variables, which can then be used to obtain more general distributions. We can always extend a space to obtain such a variable by setting $$\hat{\Omega}=\Omega\times[0,1]\text{, }\quad\hat{\mathscr{A}}=\mathscr{A}\otimes\mathscr{B}\text{, }\quad\hat{P}=P\otimes\lambda, $$ in which case $\vartheta(\omega,t):= t$ is $\mathcal{U}[0,1]$-distributed and $\vartheta\ci \mathscr{A}$.
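A minimal numerical sketch of the two constructions mentioned above (function names are my own, and the digit-splitting is truncated to finite precision, so it only illustrates the measure-theoretic argument): the binary digits of a single uniform are i.i.d. Bernoulli(1/2), so distributing them round-robin yields several (approximately) independent uniforms, and applying an inverse CDF to a uniform yields any prescribed distribution.

```python
import math

def bits(u, n):
    """First n binary digits of u in [0, 1)."""
    out = []
    for _ in range(n):
        u *= 2
        d = int(u)
        out.append(d)
        u -= d
    return out

def split_uniform(u, k, n_bits=32):
    """Split one uniform into k uniforms by distributing its binary
    digits round-robin: stream i gets digits i, i+k, i+2k, ...
    (finite-precision sketch of the classical digit-splitting trick)."""
    b = bits(u, k * n_bits)
    streams = [b[i::k] for i in range(k)]
    return [sum(d * 2.0 ** -(j + 1) for j, d in enumerate(s)) for s in streams]

def exp_quantile(u, lam=1.0):
    """Inverse CDF of Exp(lam): F^{-1}(u) = -ln(1 - u) / lam, so that
    exp_quantile(U) ~ Exp(lam) when U ~ U[0, 1]."""
    return -math.log(1.0 - u) / lam

# Example: one uniform becomes three, each then mapped to an exponential.
us = split_uniform(0.6180339887, 3)
xs = [exp_quantile(u, lam=2.0) for u in us]
```

Note that true independence of the split streams holds for the full infinite digit expansions; any finite-precision implementation is only an approximation of that argument.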

For more details see Kallenberg, *Foundations of Modern Probability* (2002), in particular the discussion before Theorem 6.10 (the transfer theorem).


To give this a simple answer:

Yes, the approach described in the question works fine.