Moment generating function of the stochastic integral $\int_0^t \alpha_s \, dW_s$

If we apply Itô's formula to the function

$$f(x) := \exp(\lambda x)$$

and the Itô process $Y_t := \int_0^t \alpha(s) \, dW_s$, $t \geq 0$, for which $dY_s = \alpha(s) \, dW_s$ and $d\langle Y \rangle_s = \alpha(s)^2 \, ds$, then we find

$$e^{\lambda Y_t}-1 = \lambda \int_0^t e^{\lambda Y_s} \alpha(s) \, dW_s + \frac{\lambda^2}{2} \int_0^t e^{\lambda Y_s} \alpha(s)^2 \, ds.$$

Since the first term on the right-hand side is a martingale (starting at $0$), taking expectations yields, for $\phi_{\lambda}(t):= \mathbb{E}e^{\lambda Y_t}$,

$$\phi_{\lambda}(t) -1 = \frac{\lambda^2}{2} \int_0^t \phi_{\lambda}(s) \alpha^2(s) \, ds.$$

Differentiating gives $\phi_{\lambda}'(t) = \frac{\lambda^2}{2} \alpha(t)^2 \, \phi_{\lambda}(t)$ with $\phi_{\lambda}(0) = 1$. This ordinary differential equation can be solved explicitly,

$$\mathbb{E}e^{\lambda Y_t} = \phi_{\lambda}(t) = \exp \left( \frac{\lambda^2}{2} \int_0^t \alpha(s)^2 \, ds \right).$$

This is the moment generating function of a normal distribution with mean $0$ and variance $\int_0^t \alpha(s)^2 \, ds$; since the moment generating function (here finite for all $\lambda$) determines the distribution, this proves that $Y_t$ is normal with mean $0$ and variance $\int_0^t \alpha(s)^2 \, ds$.
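For instance, in the special case $\alpha \equiv 1$ (chosen purely for illustration) we have $Y_t = W_t$, and the formula reduces to the familiar

$$\mathbb{E}e^{\lambda W_t} = \exp \left( \frac{\lambda^2 t}{2} \right),$$

i.e. the moment generating function of the $N(0,t)$ distribution of $W_t$.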

Edit: As @NateEldredge pointed out, we have to ensure that $(e^{\lambda Y_s})_{s \geq 0}$ is suitably integrable for the martingale argument above; for a proof that this is indeed the case see e.g. René L. Schilling/Lothar Partzsch: Brownian Motion - An Introduction to Stochastic Processes, Chapter 18 (2nd edition), or my other answer (below).


It follows easily from Itô's formula that

$$M_t := \exp \left( \lambda Y_t - \frac{\lambda^2}{2} \int_0^t \alpha(s)^2 \, ds \right)$$

satisfies

$$M_t -1 = \lambda \int_0^t M_s dY_s, \tag{1}$$
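Indeed, writing $A_t := \int_0^t \alpha(s)^2 \, ds$ and applying Itô's formula to $f(y,a) := \exp \left( \lambda y - \frac{\lambda^2}{2} a \right)$, using $d\langle Y \rangle_t = \alpha(t)^2 \, dt = dA_t$, we get

$$dM_t = \lambda M_t \, dY_t - \frac{\lambda^2}{2} M_t \, dA_t + \frac{\lambda^2}{2} M_t \, d\langle Y \rangle_t = \lambda M_t \, dY_t,$$

because the last two terms cancel.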

Equation $(1)$ shows that $(M_t)_{t \geq 0}$ is a non-negative local martingale with continuous sample paths. This implies that $(M_t)_{t \geq 0}$ is a supermartingale (see e.g. this question); in particular, $\mathbb{E}M_t \leq 1$. As $\alpha$ is a deterministic function, we get

$$\mathbb{E}e^{\lambda Y_t} \leq \exp \left( \frac{\lambda^2}{2} \int_0^t \alpha(s)^2 \, ds \right)<\infty.$$

This means that the moment generating function of $Y_t$ is well-defined. Moreover, applying this estimate with $2\lambda$ in place of $\lambda$ shows that $\mathbb{E}\int_0^t M_s^2 \alpha(s)^2 \, ds < \infty$, so the stochastic integral in $(1)$ is a genuine martingale; hence $(M_t)_{t \geq 0}$ is a true martingale and $\mathbb{E}M_t = 1$. By definition of $M_t$, this is equivalent to

$$\mathbb{E}\exp(\lambda Y_t) = \exp \left( \frac{\lambda^2}{2} \int_0^t \alpha^2(s) \, ds \right).$$
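
As a quick numerical sanity check (a sketch only, not part of the proof), one can approximate $Y_T$ by a Riemann sum over a fine grid of Brownian increments and compare the empirical variance and moment generating function with the formulas above. The integrand $\alpha(s) = 1+s$ and all parameters below are hypothetical choices made only for illustration.

```python
import numpy as np

# Monte Carlo sanity check: Y_T = \int_0^T alpha(s) dW_s should be
# N(0, \int_0^T alpha(s)^2 ds), so E exp(lam*Y_T) = exp(lam^2/2 * \int alpha^2).
# The integrand alpha(s) = 1 + s is an arbitrary (illustrative) choice.
rng = np.random.default_rng(0)

T, n_steps, n_paths, lam = 1.0, 400, 50_000, 0.7
dt = T / n_steps
t = np.linspace(0.0, T, n_steps, endpoint=False)   # left grid points
alpha = 1.0 + t                                     # deterministic integrand

# Riemann-sum approximation of the stochastic integral on each path:
# Y_T ~ sum_i alpha(t_i) * (W_{t_{i+1}} - W_{t_i})
dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
Y_T = dW @ alpha

var_theory = np.sum(alpha**2) * dt                  # Riemann sum of \int_0^T alpha^2
mgf_theory = np.exp(0.5 * lam**2 * var_theory)

print("variance:", Y_T.var(), "vs", var_theory)
print("mgf     :", np.exp(lam * Y_T).mean(), "vs", mgf_theory)
```

Both printed pairs should agree up to Monte Carlo and discretization error.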