Convergence in probability induced by a metric

Throughout, let $d(X,Y)={\bf E}\left(\frac{|X-Y|}{1+|X-Y|}\right)$. If $d(X,Y)=0$, then $Z=\frac{|X-Y|}{1+|X-Y|}$ is a nonnegative random variable with zero expectation, so $Z=0$ a.e.; hence $|X-Y|=0$ a.e., i.e. $X=Y$ a.e.
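As a quick numerical aside (not part of the proof), here is a minimal Monte Carlo sketch of this metric in Python. The helper `d_hat` and the sampling setup are illustrative assumptions, with `x` and `y` representing paired draws of $X$ and $Y$ on a common probability space.

```python
import numpy as np

def d_hat(x, y):
    """Monte Carlo estimate of d(X, Y) = E[|X - Y| / (1 + |X - Y|)],
    given paired samples x, y drawn on a common probability space."""
    diff = np.abs(x - y)
    return np.mean(diff / (1.0 + diff))

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)                         # samples of X
print(d_hat(x, x))                                   # identical variables: exactly 0
print(d_hat(x, x + 0.01 * rng.normal(size=x.size)))  # nearby variables: small d
```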

Suppose that $(X_n) \to X$ in probability. Let $\varepsilon >0$. Then for large enough $n$ we have $P(|X_n-X| > \varepsilon ) \leq \varepsilon$. For those $n$, using $\frac{t}{1+t} \leq \min(t,1)$ for $t \geq 0$, we have

$$ \begin{array}{lcl} d(X_n,X) &=& {\bf E}\left(\frac{|X_n-X|}{1+|X_n-X|}{\bf 1}_{|X_n-X| \leq \varepsilon}\right) + {\bf E}\left(\frac{|X_n-X|}{1+|X_n-X|}{\bf 1}_{|X_n-X| > \varepsilon}\right) \\ &\leq& {\bf E}\left(|X_n-X|\,{\bf 1}_{|X_n-X| \leq \varepsilon}\right) + {\bf E}\left({\bf 1}_{|X_n-X| > \varepsilon}\right) \\ &\leq& \varepsilon\,P(|X_n-X| \leq \varepsilon) + P(|X_n-X| > \varepsilon) \leq \varepsilon + \varepsilon = 2\varepsilon \end{array} $$

Since $\varepsilon > 0$ was arbitrary, this shows $d(X_n,X) \to 0$.
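To see this direction concretely, here is a hedged numerical sketch; the choice of $X$ standard normal and $X_n = X + Z_n/n$ with $Z_n$ standard normal is an illustrative assumption (it gives $X_n \to X$ in probability). The exceedance probability and the estimated metric shrink together, as the bound above predicts.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000
x = rng.normal(size=N)                  # samples of X
eps = 0.05

for n in (1, 10, 100, 1000):
    x_n = x + rng.normal(size=N) / n    # X_n = X + Z_n/n -> X in probability
    diff = np.abs(x_n - x)
    p_exceed = np.mean(diff > eps)      # estimate of P(|X_n - X| > eps)
    d_n = np.mean(diff / (1 + diff))    # estimate of d(X_n, X)
    print(f"n={n:5d}  P(|X_n-X|>eps) ~ {p_exceed:.4f}  d(X_n,X) ~ {d_n:.5f}")
```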

Conversely, suppose that $d(X_n,X) \to 0$. Let $\varepsilon >0$. Then for large enough $n$ we have $d(X_n,X) \leq \frac{\varepsilon^2}{1+\varepsilon}$. Since $t \mapsto \frac{t}{1+t}$ is increasing on $[0,\infty)$, for those $n$ we have

$$ \frac{\varepsilon^2}{1+\varepsilon} \geq {\bf E}\left(\frac{|X_n-X|}{1+|X_n-X|}{\bf 1}_{|X_n-X| > \varepsilon}\right) \geq {\bf E}\left(\frac{\varepsilon}{1+\varepsilon}{\bf 1}_{|X_n-X| > \varepsilon}\right) = \frac{\varepsilon}{1+\varepsilon}\, P(|X_n-X| > \varepsilon ) $$

So $P(|X_n-X| > \varepsilon) \leq \varepsilon$ for all large $n$. Since $\varepsilon > 0$ was arbitrary, this shows that $(X_n) \to X$ in probability.
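Rearranged, the display above says $P(|X_n-X| > \varepsilon) \leq \frac{1+\varepsilon}{\varepsilon}\, d(X_n,X)$ for every $n$ and every $\varepsilon > 0$. The following sketch checks this rearranged bound numerically (same illustrative setup as above); it even holds for the empirical averages, since ${\bf 1}_{t > \varepsilon} \leq \frac{1+\varepsilon}{\varepsilon}\cdot\frac{t}{1+t}$ pointwise for $t \geq 0$.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100_000
eps = 0.1

for n in (1, 10, 100):
    diff = np.abs(rng.normal(size=N)) / n                 # |X_n - X| for X_n = X + Z_n/n
    p_exceed = np.mean(diff > eps)                        # P(|X_n - X| > eps)
    bound = (1 + eps) / eps * np.mean(diff / (1 + diff))  # (1+eps)/eps * d(X_n, X)
    assert p_exceed <= bound + 1e-12                      # the bound from the converse
    print(f"n={n:4d}  P ~ {p_exceed:.4f}  bound ~ {bound:.4f}")
```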