Does convergence in probability imply convergence of the mean?
Let $(\Omega,\mathcal{F},P)=([0,1],\mathcal{B}([0,1]),\lambda)$. Then the sequence $X_n=n\mathbf{1}_{(0,1/n)}$ converges almost surely, and hence also in probability, to $X=0$. However, ${\rm E}[X_n]=n\,\lambda\big((0,1/n)\big)=1$ for every $n$, so $\lim_{n\to\infty}{\rm E}[X_n]=1\neq 0={\rm E}[X]$.
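As a quick numerical illustration of this counterexample (a Monte Carlo sketch, not part of the argument itself), one can sample a uniform point $U$ of $\Omega=[0,1]$ and evaluate $X_n=n\mathbf{1}_{(0,1/n)}(U)$: the probability that $X_n$ is nonzero shrinks like $1/n$, while the sample mean stays near $1$.

```python
import numpy as np

# Draw U ~ Uniform(0,1), i.e. points of Omega = [0,1] under Lebesgue measure,
# and set X_n = n * 1_{(0,1/n)}(U) as in the counterexample above.
rng = np.random.default_rng(0)
U = rng.uniform(size=1_000_000)

for n in (10, 100, 1000):
    X_n = n * (U < 1.0 / n)       # scaled indicator of (0, 1/n)
    p_nonzero = (X_n > 0).mean()  # ~ 1/n, so X_n -> 0 in probability
    mean = X_n.mean()             # ~ 1 for every n, so E[X_n] does not -> E[X] = 0
    print(f"n={n:4d}  P(X_n>0)~{p_nonzero:.4f}  E[X_n]~{mean:.3f}")
```

The empirical frequency of $\{X_n>0\}$ tends to $0$, confirming convergence in probability, while the empirical mean hovers around $1$.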
However, if we additionally require that the family $\{X_n\mid n\geq 1\}$ is uniformly integrable, then convergence in probability does imply ${\rm E}[X_n]\to{\rm E}[X]$. In fact, uniform integrability together with the weaker assumption that $X_n$ converges to $X$ in distribution already suffices. See this answer for hints on the proof.
Hint: try a sequence of random variables that are $0$ with high probability but large with low probability.