Interpretation of sigma algebra

Gambling is a good starting point for probability. We can treat a $\sigma$-field as the structure we impose on events, just as addition and multiplication give structure to numbers: the completeness of the real numbers is what makes calculations with them work, and the closure properties of a $\sigma$-field play the same role for events.

I hope the following gambling example helps you understand filtrations and conditional expectation.

Suppose two people, say player A and player B, bet on the results of two coin tosses (H: heads, T: tails).

At time $0$, A and B do not know anything about the result except that one of the events in $\Omega=\{HH,HT,TH,TT\}$ will happen. Hence the information at time $0$ that they both know is $\mathcal{F}_0=\{\emptyset,\Omega\}$.

At time $1$, the coin has been tossed once, and they can decide, for every event in the $\sigma$-field $\mathcal{F}_1=\{\emptyset, \Omega, \{HH,HT\},\{TH,TT\}\}\supset \mathcal{F}_0$, whether it has occurred.

At time $2$, the coin has been tossed twice, and they can decide, for every event in the $\sigma$-field $\mathcal{F}_2=\{\emptyset, \Omega,\{HH,HT\},\{TH,TT\},\{HH\},\{HT\},\{TH\},\{TT\}\}\supset \mathcal{F}_1$, whether it has occurred; that is, they know everything about the gambling results.

Please notice the evolution of information characterized by the filtration $\mathcal{F}_0\subset\mathcal{F}_1\subset\mathcal{F}_2$. As time passes, the unknown world $\Omega$ is partitioned more and more finely, like water flowing through ever-narrower pipes.
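As a quick sanity check, this refinement of information can be sketched in Python by representing each outcome as a string and each block of information as a `frozenset` of outcomes (a small illustration of my own, not part of the original answer):

```python
# Represent outcomes as strings and events as frozensets of outcomes.
OMEGA = frozenset({"HH", "HT", "TH", "TT"})

# Information at each time, as the partition of OMEGA the players can resolve.
partition_0 = [OMEGA]                                             # no tosses seen
partition_1 = [frozenset({"HH", "HT"}), frozenset({"TH", "TT"})]  # first toss seen
partition_2 = [frozenset({w}) for w in OMEGA]                     # both tosses seen

def refines(finer, coarser):
    """Each block of the finer partition sits inside one block of the coarser."""
    return all(any(b <= c for c in coarser) for b in finer)

print(refines(partition_1, partition_0))  # True
print(refines(partition_2, partition_1))  # True
```

Each later partition refines the earlier one, which is exactly the "finer and finer division of $\Omega$" described above.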

Suppose the coin is fair and they bet on the following payoff, where, e.g., $\omega=HT$ means the first toss is H and the second is T: $$X(\omega)=\left\{ \begin{array}{ll} 2, & \omega=HH\\ 1, & \omega=HT\\ 1, & \omega=TH \\ 0, & \omega=TT\\ \end{array} \right.$$

Then, we have

$$E[X|\mathcal{F}_0](\omega)=E[X]=1\qquad\text{for every}\ \omega $$ $$E[X|\mathcal{F}_2](\omega)=X(\omega)\qquad\text{for every}\ \omega $$ $$E[X|\{HH,HT\}]=2P(HH|\{HH,HT\})+1P(HT|\{HH,HT\})+1P(TH|\{HH,HT\})+0P(TT|\{HH,HT\})=\frac{3}{2}$$ $$E[X|\{TH,TT\}]=2P(HH|\{TH,TT\})+1P(HT|\{TH,TT\})+1P(TH|\{TH,TT\})+0P(TT|\{TH,TT\})=\frac{1}{2} $$

$$E[X|\mathcal{F}_1](\omega)=\left\{ \begin{array}{ll} \frac{3}{2}, & \omega\in \{HH,HT\}\\ \frac{1}{2}, & \omega \in \{TH,TT\} \end{array} \right. $$
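These numbers can be checked mechanically: conditioning on a finite $\sigma$-field amounts to averaging $X$ over each block of the generating partition. A small Python sketch, using `Fraction` to keep the values exact (the function name and layout are mine, not from the answer):

```python
from fractions import Fraction

# Fair coin: each outcome has probability 1/4.
P = {w: Fraction(1, 4) for w in ("HH", "HT", "TH", "TT")}
X = {"HH": 2, "HT": 1, "TH": 1, "TT": 0}

def cond_exp(X, P, partition):
    """E[X | sigma(partition)] as a map omega -> average of X over omega's block."""
    out = {}
    for block in partition:
        mass = sum(P[w] for w in block)
        avg = sum(P[w] * X[w] for w in block) / mass
        out.update({w: avg for w in block})
    return out

F0 = [("HH", "HT", "TH", "TT")]
F1 = [("HH", "HT"), ("TH", "TT")]
F2 = [("HH",), ("HT",), ("TH",), ("TT",)]

print(cond_exp(X, P, F0)["HT"])  # 1
print(cond_exp(X, P, F1)["HH"])  # 3/2
print(cond_exp(X, P, F1)["TT"])  # 1/2
print(cond_exp(X, P, F2) == X)   # True: full information recovers X itself
```

Averaging $E[X|\mathcal{F}_1]$ over $\Omega$ gives $\frac14\cdot\frac32\cdot 2+\frac14\cdot\frac12\cdot 2=1=E[X]$, the tower property.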

I hope this helps.


As pointed out in the comments on the previous answer, the collection $\mathcal{F}_2$ given there is not a $\sigma$-algebra, because it is not closed under unions and intersections (for example, $\{HH\}\cup\{TT\}=\{HH,TT\}$ is missing).

The correct argument is the following:

At time 0, A and B do not know anything about the result except that one of the events in $\Omega:=\{HH,HT,TH,TT\}$ will happen. Hence the information at time 0 that they both can talk about is the $\sigma$-algebra generated by the single set $\Omega$, namely $\mathcal{F}_0=\{\emptyset,\Omega\}$.

At time 1, the coin has been tossed once, and what they know is which event in the collection $\{\{HH,HT\},\{TH,TT\}\}$ has occurred. Hence the information at time 1 that they both can talk about is the $\sigma$-algebra generated by this collection, say $\mathcal{F}_1$.

At time 2, the coin has been tossed twice, and they know which event in the collection $\{\{HH\},\{HT\},\{TH\},\{TT\}\}$ has occurred, which means they know everything about the gambling results. Thus the information at time 2 that they both can talk about is the $\sigma$-algebra generated by this collection, say $\mathcal{F}_2$; here $\mathcal{F}_2$ is the whole power set of $\Omega$, with $2^4=16$ events.

Since at each time $t>0$ the generating collection is obtained by partitioning the sets of the previous collection, one clearly has $\mathcal{F}_0\subset\mathcal{F}_1\subset\mathcal{F}_2$.
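On a four-point $\Omega$ the generated $\sigma$-algebras can be computed by brute force: start from the generating collection and keep closing under complements and pairwise unions until nothing new appears. A sketch of my own (not from the answer):

```python
OMEGA = frozenset({"HH", "HT", "TH", "TT"})

def generated_sigma_algebra(generators):
    """Close a collection of events under complement and pairwise union."""
    sigma = {frozenset(), OMEGA} | {frozenset(g) for g in generators}
    while True:
        new = {OMEGA - a for a in sigma} | {a | b for a in sigma for b in sigma}
        if new <= sigma:          # nothing new: sigma is closed
            return sigma
        sigma |= new

F0 = generated_sigma_algebra([OMEGA])
F1 = generated_sigma_algebra([{"HH", "HT"}, {"TH", "TT"}])
F2 = generated_sigma_algebra([{"HH"}, {"HT"}, {"TH"}, {"TT"}])

print(F0 <= F1 <= F2)             # True: a filtration
print(len(F0), len(F1), len(F2))  # 2 4 16
```

On a finite $\Omega$ closure under complements and pairwise unions already gives closure under countable unions and intersections, so the loop really does produce the generated $\sigma$-algebra.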

Notice that at each time $t$ the generating collection is the finest $\mathcal{F}_t$-measurable partition of $\Omega$; its blocks are the atoms of $\mathcal{F}_t$.
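That finest partition can be recovered from the $\sigma$-algebra itself: the atom containing $\omega$ is the intersection of all events in $\mathcal{F}_t$ that contain $\omega$. A brute-force sketch (again my own illustration):

```python
OMEGA = frozenset({"HH", "HT", "TH", "TT"})

def atoms(sigma):
    """Atom of omega = intersection of all events in sigma containing omega."""
    result = set()
    for w in OMEGA:
        atom = OMEGA
        for event in sigma:
            if w in event:
                atom = atom & event
        result.add(atom)
    return result

F1 = {frozenset(), OMEGA, frozenset({"HH", "HT"}), frozenset({"TH", "TT"})}
print(atoms(F1))  # recovers the two blocks {"HH","HT"} and {"TH","TT"}
```

Applied to $\mathcal{F}_1$ this returns exactly the generating partition $\{\{HH,HT\},\{TH,TT\}\}$, matching the remark above.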