Expectation of CDF of continuous random variable $X$, evaluated at $X$
You are correct that for a continuous random variable, $X$, with probability density function $f_X$, the expected value is: $$\mathbb{E}[X]=\int_{-\infty}^{+\infty} x\cdot f_X(x)\operatorname{d}x$$ This can be extended. Let $g$ be a function of the continuous random variable $X$. The expected value of this function is: $$\mathbb{E}[g(X)]=\int_{-\infty}^{+\infty} g(x)\cdot f_X(x)\operatorname{d}x$$ So, for example, $\mathbb{E}[X^2]=\int\limits_{-\infty}^{+\infty} x^2\cdot f_X(x)\operatorname{d}x$.
Thus the expected value of the cumulative distribution function, $F_X$, is: $$\mathbb{E}[F_X(X)]=\int_{-\infty}^{+\infty} F_X(x)\cdot f_X(x)\operatorname{d}x$$ Now, by definition the probability density function is the derivative of the cumulative distribution function, $f_X(x) =\frac{\operatorname{d} F_X(x)}{\operatorname{d}x }$, so we can substitute $\operatorname{d}F_X(x)=f_X(x)\operatorname{d}x$ and change the limits of integration accordingly (as $x$ runs from $-\infty$ to $+\infty$, $F_X(x)$ runs from $0$ to $1$): $$\begin{align}\therefore \mathbb{E}[F_X(X)] & =\int_{0}^{1} F_X(x)\operatorname{d}F_X(x) \\ & = \left[\tfrac 1 2F_X(x)^2\right]_{F_X(x)=0}^{F_X(x)=1} \\ & = \tfrac 12 \end{align}$$
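As a quick sanity check (my own illustrative example, not part of the argument above), take $X$ exponential with rate $1$, so $f_X(x)=e^{-x}$ and $F_X(x)=1-e^{-x}$ for $x>0$. Then indeed $$\mathbb{E}[F_X(X)]=\int_{0}^{+\infty}\left(1-e^{-x}\right)e^{-x}\operatorname{d}x = 1-\tfrac12=\tfrac12.$$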
Letting $Y=F_{X}(X)$, the key is to find the distribution of $Y$; it turns out that $Y\sim \mathrm{Unif}(0,1)$. To give a sketch of the proof, I will consider the case where $F_{X}$ is invertible (which is the case for many continuous distributions). Since distributions are characterized by their CDFs, we have
$$\begin{align}F_{Y}(y)=P(Y\leq y) &=P(F_{X}(X)\leq y) \\ &=P\left(F^{-1}_{X}(F_{X}(X))\leq F^{-1}_{X}(y)\right) \\ &=P\left(X\leq F^{-1}_{X}(y)\right) \\ &=F_X\left(F^{-1}_{X}(y)\right)=y \quad \text{for } 0<y<1,\end{align}$$
which is the CDF of the $\mathrm{Unif}(0,1)$ distribution. (Applying $F^{-1}_{X}$ inside the probability preserves the inequality because $F^{-1}_{X}$ is increasing.)
From the above we see that $\mathbb{E}[Y]=\mathbb{E}[F_{X}(X)]=\tfrac{1}{2}$, since a $\mathrm{Unif}(0,1)$ random variable has mean $\tfrac{1}{2}$, which agrees with the direct calculation above.
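As a numerical sanity check (not part of the original argument), here is a minimal Monte Carlo sketch in Python, assuming NumPy and SciPy are available and using a standard normal purely as an illustrative choice of distribution:

```python
# Monte Carlo check that Y = F_X(X) is Unif(0,1) and E[F_X(X)] = 1/2,
# illustrated with X ~ N(0, 1); any continuous distribution with an
# invertible CDF would work the same way.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = stats.norm.rvs(size=100_000, random_state=rng)  # draws of X
y = stats.norm.cdf(x)                                # Y = F_X(X)

print(y.mean())                            # should be close to 1/2
print(np.quantile(y, [0.25, 0.5, 0.75]))   # should be close to (0.25, 0.5, 0.75)
```

The printed mean should land very close to $0.5$, and the sample quartiles close to $(0.25, 0.5, 0.75)$, consistent with $Y\sim\mathrm{Unif}(0,1)$.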