How to prove $ E(|X-Y|) \le E(|X+Y|)$ when $X,Y$ are i.i.d. random variables?

After a little inspection (treating the cases $XY \geq 0$ and $XY < 0$ separately), we see that $$ E(|X+Y|-|X-Y|) = 2E[Z(1_{XY\geq 0}-1_{XY<0})] $$ where $Z = \min(|X|,|Y|)$: when the signs of $X$ and $Y$ agree, $|X+Y|-|X-Y| = 2Z$, and when they differ it equals $-2Z$.
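As a sanity check, the pointwise identity behind this expectation formula, $|x+y|-|x-y| = 2\min(|x|,|y|)\,(1_{xy\geq 0}-1_{xy<0})$, can be verified numerically (a quick Python sketch):

```python
import random

def lhs(x, y):
    return abs(x + y) - abs(x - y)

def rhs(x, y):
    # 2 * min(|x|, |y|) * (1_{xy >= 0} - 1_{xy < 0})
    z = min(abs(x), abs(y))
    return 2 * z * (1 if x * y >= 0 else -1)

random.seed(0)
for _ in range(10_000):
    x, y = random.uniform(-5, 5), random.uniform(-5, 5)
    assert abs(lhs(x, y) - rhs(x, y)) < 1e-9
```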

Remember that for any non-negative random variable $T$, $$ E(T) = \int_0^\infty P(T>t)\,dt. $$
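For instance, with $T$ uniform on $(0,1)$ we have $P(T>t)=1-t$ on $[0,1]$, so the formula gives $\int_0^1 (1-t)\,dt = \frac{1}{2} = E(T)$. A quick numerical check via a Riemann sum:

```python
# Layer-cake formula for T ~ Uniform(0, 1): P(T > t) = 1 - t on [0, 1].
N = 10_000
integral = sum(1 - i / N for i in range(N)) / N  # Riemann sum of P(T > t)
assert abs(integral - 0.5) < 1e-3  # matches E(T) = 1/2
```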

We apply this with $T=Z\,1_{X \geq 0, Y\geq 0}$, $T=Z\,1_{X < 0, Y< 0}$ and $T=Z\,1_{X \geq 0, Y< 0}$. Since $\{Z > t\} = \{|X| > t\}\cap\{|Y| > t\}$, we obtain

$$ E(Z \,1_{X \geq 0,Y\geq 0}) = \int_0^\infty P(X > t)P(Y > t)\,dt = \int_0^\infty P(X > t)^2\,dt $$

$$ E(Z\, 1_{X < 0, Y < 0}) = \int_0^\infty P(X < -t)P(Y < - t)\,dt = \int_0^\infty P(X < -t)^2\,dt $$

$$ E(Z\,1_{X \geq 0, Y< 0}) = E(Z\,1_{X < 0, Y \geq 0}) = \int_0^\infty P(X > t)P(X < -t)\,dt $$

So finally, $$ E(|X+Y|-|X-Y|) = 2\int_0^\infty (P(X>t)-P(X<-t))^2\,dt \geq 0 $$
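As a numerical illustration (Python, with $X,Y$ i.i.d. $\mathrm{Exp}(1)$ as an arbitrary asymmetric example): here $P(X>t)=e^{-t}$ and $P(X<-t)=0$, so the integral formula predicts $E(|X+Y|-|X-Y|) = 2\int_0^\infty e^{-2t}\,dt = 1$.

```python
import random

random.seed(1)
n = 200_000
pairs = [(random.expovariate(1.0), random.expovariate(1.0)) for _ in range(n)]

diff = sum(abs(x + y) - abs(x - y) for x, y in pairs) / n
# The integral formula predicts exactly 1 for Exp(1); the Monte Carlo
# estimate should be close to 1 (standard error about 0.002 here).
print(diff)
```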


Remark 1. The inequality is an equality if and only if the distribution of $X$ is symmetric, that is, $P(X > t) = P(X < -t)$ for every $t \geq 0$.


Remark 2. When $|X|=1$ a.s. the inequality is nothing but the semi-trivial fact that if $X$ and $Y$ are independent with same distribution, then $P(XY \geq 0) \geq \dfrac{1}{2}$.
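In that two-point case the claim reduces to algebra: with $p = P(X=1)$, independence gives $P(XY \geq 0) = p^2+(1-p)^2 = 1-2p(1-p) \geq \frac{1}{2}$, with equality iff $p=\frac{1}{2}$. A quick check over a grid of $p$:

```python
# P(XY >= 0) = p^2 + (1 - p)^2 when P(X = 1) = p and |X| = 1 a.s.
for i in range(101):
    p = i / 100
    q = p**2 + (1 - p)**2
    assert q >= 0.5  # minimized (= 1/2) exactly at p = 1/2
```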


Remark 3. It is worthwhile to mention a nice corollary: $E(|X+Y|) \geq E(|X|)$. The function $x \mapsto |x|$ is convex, hence $|X| = \left|\frac{1}{2}(X+Y)+\frac{1}{2}(X-Y)\right| \leq \frac{1}{2}(|X+Y|+|X-Y|)$. Taking expectations we find $$ \Bbb E(|X+Y|-|X|) \geq \frac{1}{2}\Bbb E(|X+Y|-|X-Y|) \geq 0. $$ As for equality: it forces $X$ to be symmetric (by Remark 1) and $|X| = \frac{1}{2}(|X+Y|+|X-Y|) = \max(|X|,|Y|)$ a.s., and under independence the latter makes $|X|$ a.s. constant. So equality holds if and only if $X$ is symmetric with $|X|$ a.s. constant: this includes $X=0$ a.s., but also $X = \pm c$ with probability $\frac{1}{2}$ each.
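A quick exact check of the two-point case $X = \pm 1$ (each with probability $\frac{1}{2}$), which shows the corollary $E(|X+Y|) \geq E(|X|)$ can hold with equality even though $X \neq 0$:

```python
import itertools

# X, Y i.i.d. uniform on {-1, 1}: the four sign pairs are equally likely.
pairs = list(itertools.product([-1, 1], repeat=2))
e_abs_sum = sum(abs(x + y) for x, y in pairs) / 4  # E|X + Y|
e_abs_x = sum(abs(x) for x, _ in pairs) / 4        # E|X|
assert e_abs_sum == e_abs_x == 1.0                 # equality, yet X != 0
```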


Edit: the question has since changed; the answer below addresses the version without absolute values.

By the linearity of expectation, the inequality $E(X-Y)\le E(X+Y)$ is equivalent to $-E(Y)\le E(Y)$, which in general is false. It is true precisely if $E(Y)\ge 0$.
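A concrete counterexample (hypothetical values): take $X$ and $Y$ both equal to $-1$, so $E(Y) = -1 < 0$:

```python
# X = Y = -1 deterministically: E(X - Y) = 0 but E(X + Y) = -2,
# so E(X - Y) <= E(X + Y) fails, exactly as E(Y) < 0 predicts.
x, y = -1.0, -1.0
assert (x - y) > (x + y)
```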

Independence is not needed for the argument. Neither is the hypothesis that the random variables have the same distribution.


Below is a set of remarks that’s too long to be put in a comment.

Conjecture. The inequality becomes an equality iff $-X$ has the same distribution as $X$.

Remark 1. The “if” part of the conjecture is easy: if $X$ and $-X$ have the same distribution, then by the independence hypothesis $(X,Y)$ and $(X,-Y)$ have the same joint distribution; therefore $|X+Y|$ and $|X-Y| = |X+(-Y)|$ have the same distribution, and in particular the same expectation.

Remark 2. Let $\phi_n(t)=t$ if $|t| \leq n$ and $\phi_n(t)=0$ otherwise. If the inequality holds for $(\phi_n(X),\phi_n(Y))$ for every $n$, then it holds for $(X,Y)$ as well, by a dominated convergence argument. So we may assume without loss of generality that the support of $X$ is bounded.
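The truncation in Remark 2 is easy to experiment with numerically; a Python sketch (the Gaussian choice of distribution is arbitrary), illustrating that $E|\phi_n(X)+\phi_n(Y)|$ approaches $E|X+Y|$ as $n$ grows:

```python
import random

def phi(t, n):
    # phi_n(t) = t if |t| <= n, else 0
    return t if abs(t) <= n else 0.0

random.seed(2)
pairs = [(random.gauss(0, 3), random.gauss(0, 3)) for _ in range(50_000)]
full = sum(abs(x + y) for x, y in pairs) / len(pairs)
for n in (1, 3, 10):
    trunc = sum(abs(phi(x, n) + phi(y, n)) for x, y in pairs) / len(pairs)
    print(n, round(trunc, 3), "vs", round(full, 3))
```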