Show $\mathbb{E}[f(X)g(X)] \geq \mathbb{E}[f(X)]\mathbb{E}[g(X)]$ for $f,g$ bounded, nondecreasing

Hint: Let $X_1,X_2$ be independent copies of $X$, and note that $$(g(X_1)-g(X_2))(f(X_1)-f(X_2))\geq 0.$$
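For completeness, the hint closes the argument in one line: taking expectations and expanding the product, the cross terms factor by independence, $\mathbb{E}[f(X_1)g(X_2)]=\mathbb{E}[f(X_1)]\,\mathbb{E}[g(X_2)]=\mathbb{E}[f(X)]\,\mathbb{E}[g(X)]$, so $$0\le \mathbb{E}\big[(g(X_1)-g(X_2))(f(X_1)-f(X_2))\big]=2\,\mathbb{E}[f(X)g(X)]-2\,\mathbb{E}[f(X)]\,\mathbb{E}[g(X)],$$ which rearranges to the desired inequality (boundedness guarantees all expectations are finite).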


The fact itself is pretty intuitive: two nondecreasing transformations of the same random variable are positively correlated.

First note that $X$ may be assumed to have a uniform distribution on $[0,1]$ (via the quantile transformation).
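In more detail: write $F$ for the distribution function of $X$ and $F^{-1}$ for its generalized inverse (the quantile function). If $U$ is uniform on $[0,1]$, then $F^{-1}(U)$ has the same distribution as $X$, and $f\circ F^{-1}$, $g\circ F^{-1}$ are again bounded and non-decreasing, so $$ E[f(X)g(X)]=E\big[(f\circ F^{-1})(U)\,(g\circ F^{-1})(U)\big],\qquad E[f(X)]=E\big[(f\circ F^{-1})(U)\big], $$ and similarly for $g$. Hence it suffices to prove the inequality with $U$ in place of $X$.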

Now let $f(x) = \mathbf{1}_{[x_0,1]}(x)$ for some $x_0\in[0,1)$. Since $$ h(x) = \frac{x-x_0}{1-x_0}\le x,\quad x\in[x_0,1], $$ and $g$ is non-decreasing, we get $$\begin{aligned} E[f(X) g(X)] = E[\mathbf{1}_{X\ge x_0}\, g(X)] &\ge E[\mathbf{1}_{X\ge x_0}\, g(h(X))] = \int_{x_0}^1 g(h(x))\,dx \\ &= (1-x_0)\int_0^1 g(z)\, dz = P(X\ge x_0)\,E[g(X)] = E[f(X)]\, E[g(X)]. \end{aligned}$$
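To spell out the change of variables in the second line: with $z=h(x)$ one has $x=x_0+(1-x_0)z$ and $dx=(1-x_0)\,dz$, and $[x_0,1]$ is mapped onto $[0,1]$, so $$\int_{x_0}^1 g(h(x))\,dx=\int_0^1 g(z)\,(1-x_0)\,dz=(1-x_0)\int_0^1 g(z)\,dz.$$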

Since both sides of the inequality are linear in $f$, and every non-negative non-decreasing step function is a non-negative combination of indicators $\mathbf{1}_{[x_k,1]}$, the desired inequality holds for all non-negative non-decreasing step functions $f$. For a constant $f$ both sides are equal, so adding a constant changes nothing and the inequality holds for non-decreasing step functions $f$ of any sign. The proof is completed by approximating an arbitrary bounded non-decreasing $f$ pointwise by non-decreasing step functions and passing to the limit (bounded convergence).
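As a quick numerical sanity check of the inequality (a minimal sketch; the normal distribution and the particular bounded non-decreasing $f$, $g$ below are arbitrary choices for illustration, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustration: X standard normal, f and g bounded and non-decreasing.
x = rng.standard_normal(1_000_000)
f = np.tanh(x)              # values in (-1, 1), non-decreasing in x
g = np.clip(x, -2.0, 2.0)   # values in [-2, 2], non-decreasing in x

lhs = np.mean(f * g)            # Monte Carlo estimate of E[f(X) g(X)]
rhs = np.mean(f) * np.mean(g)   # Monte Carlo estimate of E[f(X)] E[g(X)]
print(lhs, rhs, lhs >= rhs)     # expect lhs >= rhs up to Monte Carlo error
```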