Conditional expectation of $\max(X,Y)$ and $\min(X,Y)$ when $X,Y$ are iid and exponentially distributed

As indicated in the comments, a useful idea when maxima and minima are involved is to consider well-adapted events. Here, introducing $Z=\min\{X,Y\}$ and $W=\max\{X,Y\}$, one sees that $[z\leqslant Z,W\leqslant w]$ is $[z\leqslant X\leqslant w]\cap[z\leqslant Y\leqslant w]$ for every nonnegative $z$ and $w$ such that $z\leqslant w$.

Here is a computation: since the probability that a standard exponential random variable is $\geqslant x$ is $\mathrm e^{-x}$ for every nonnegative $x$, the events $[z\leqslant X\leqslant w]$ and $[z\leqslant Y\leqslant w]$ both have probability $\mathrm e^{-z}-\mathrm e^{-w}$. Hence,
$$
\mathrm P(z\leqslant Z,W\leqslant w)=(\mathrm e^{-z}-\mathrm e^{-w})^2.
$$
Differentiating this with respect to $z$ and $w$ yields the density of $(Z,W)$ as
$$
2\mathrm e^{-z-w}\cdot[0\leqslant z\leqslant w].
$$
This formula is all right but, because of the indicator function in it, I am afraid of making mistakes when using it, so I try to simplify it. Let $V=W-Z$; then $Z\geqslant0$, $V\geqslant 0$, and using $v=w-z$, the density becomes
$$
2\mathrm e^{-z-(v+z)}\cdot[0\leqslant z\leqslant v+z]=2\mathrm e^{-2z}\cdot[z\geqslant 0]\cdot\mathrm e^{-v}\cdot[v\geqslant0].
$$
This proves that $Z$ and $V$ are independent, with $Z$ exponential of parameter $2$ and $V$ exponential of parameter $1$, and yields at last the answer to the initial question:
$$
\mathrm E(W\mid Z)=\mathrm E(V+Z\mid Z)=\mathrm E(V)+Z=1+Z.
$$

The same technique yields that the order statistic $(X^{(k)})_{1\leqslant k\leqslant n}$ of an i.i.d. sample $(X_k)_{1\leqslant k\leqslant n}$ of standard exponential random variables, defined by the conditions that $\{X^{(1)},X^{(2)},\ldots,X^{(n)}\}=\{X_1,X_2,\ldots,X_n\}$ and that $X^{(1)}<X^{(2)}<\cdots <X^{(n)}$, is distributed like $(Z_1,Z_1+Z_2,\ldots,Z_1+Z_2+\cdots+Z_n)$ for independent exponential random variables $(Z_k)_{1\leqslant k\leqslant n}$ such that the distribution of $Z_k$ is exponential with parameter $n-k+1$. A consequence is that, for every $1\leqslant k\leqslant\ell\leqslant n$,
$$
\mathrm E(X^{(\ell)}\mid X^{(k)})=X^{(k)}+\sum\limits_{i=n-\ell+1}^{n-k}\frac1i.
$$
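
For readers who want a quick numerical sanity check of both claims, here is a small Monte Carlo sketch (assuming NumPy is available; the sample sizes and names such as `rng` are arbitrary choices of mine, not part of the argument above):

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 1_000_000

# Claim 1: with X, Y iid Exp(1), Z = min and V = max - min are independent,
# Z ~ Exp(2) and V ~ Exp(1), hence E(max | min) = 1 + min.
x = rng.exponential(1.0, n_samples)
y = rng.exponential(1.0, n_samples)
z = np.minimum(x, y)
v = np.maximum(x, y) - z
print(z.mean())                 # ~0.5, the mean of Exp(2)
print(v.mean())                 # ~1.0, the mean of Exp(1)
print(np.corrcoef(z, v)[0, 1])  # ~0, consistent with independence

# Claim 2 (taking expectations on both sides of the last identity):
# E(X^(l) - X^(k)) should equal sum_{i=n-l+1}^{n-k} 1/i.
n, k, l = 5, 2, 4
sample = np.sort(rng.exponential(1.0, (n_samples, n)), axis=1)
gap = sample[:, l - 1] - sample[:, k - 1]
print(gap.mean(), sum(1.0 / i for i in range(n - l + 1, n - k + 1)))  # both ~0.833
```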


For two independent exponentially distributed variables $(X,Y)$, the joint distribution is
$$
\mathbb{P}(\mathrm{d}x,\mathrm{d}y) = \mathrm{e}^{-x-y} \mathbf{1}_{x >0 } \mathbf{1}_{y >0 } \, \mathrm{d} x \, \mathrm{d} y .
$$
Since $x+y = \min(x,y) + \max(x,y)$ and $\min(x,y) \le \max(x,y)$, the joint distribution of $(U,V) = (\min(X,Y), \max(X,Y))$ is
$$
\mathbb{P}(\mathrm{d}u,\mathrm{d}v) = \mathcal{N} \mathrm{e}^{-u-v} \mathbf{1}_{v \ge u >0 } \, \mathrm{d} u \, \mathrm{d} v .
$$
The normalization constant is easy to find:
$$
\int_0^\infty \mathrm{d} v \int_0^v \mathrm{d} u \,\, \mathrm{e}^{-u-v} = \int_0^\infty \mathrm{d} v \,\, \mathrm{e}^{-v} ( 1 - \mathrm{e}^{-v} ) = 1 - \frac{1}{2} = \frac{1}{2} = \frac{1}{\mathcal{N}} ,
$$
so $\mathcal{N} = 2$. Thus the conditional expectation we seek is, for $u>0$,
$$
\mathbb{E}(\max(X,Y) \mid \min(X,Y) = u) = \frac{\int_u^\infty v \, \mathrm{d} P(u,v)}{\int_u^\infty \mathrm{d} P(u,v)} = \frac{\int_u^\infty \mathcal{N} v \, \mathrm{e}^{-u-v} \, \mathrm{d} v}{\int_u^\infty \mathcal{N} \mathrm{e}^{-u-v} \, \mathrm{d} v} = 1 + u .
$$
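
The two integrals above can also be checked symbolically; here is a short sketch using SymPy (the symbol names are mine, and this is only a verification of the computation, not part of the derivation):

```python
import sympy as sp

u, v = sp.symbols('u v', positive=True)

# Normalization: the integral of e^{-u-v} over 0 < u < v should be 1/2, so N = 2.
half = sp.integrate(sp.exp(-u - v), (u, 0, v), (v, 0, sp.oo))
print(half)  # 1/2

# Conditional expectation E(max | min = u): ratio of the two integrals over v > u
# (the constant N cancels, so it is omitted here).
num = sp.integrate(v * sp.exp(-u - v), (v, u, sp.oo))
den = sp.integrate(sp.exp(-u - v), (v, u, sp.oo))
print(sp.simplify(num / den))  # u + 1
```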


If $Z = \min(X,Y)$ and $W = \max(X,Y)$, then for $w > z$,
$$\begin{align*} F_{Z,W}(z,w) &= P\{Z \leq z, W \leq w\}\\ &= P\left[\{X \leq z, Y \leq w\} \cup \{X \leq w, Y \leq z\}\right]\\ &= P\{X \leq z, Y \leq w\} + P\{X \leq w, Y \leq z\} - P\{X \leq z, Y \leq z\}\\ &= F_{X,Y}(z, w) + F_{X, Y}(w,z) - F_{X,Y}(z,z), \end{align*}$$
while for $w < z$ (since $Z \leq W$ always, the event $W \leq w$ already forces $Z \leq w \leq z$),
$$\begin{align*} F_{Z,W}(z,w) &= P\{Z \leq z, W \leq w\} = P\{Z \leq w, W \leq w\}\\ &= P\{X \leq w, Y \leq w\}\\ &= F_{X,Y}(w,w). \end{align*}$$
Consequently, if $X$ and $Y$ are jointly continuous random variables, then
$$f_{Z,W}(z,w) = \frac{\partial^2}{\partial z \partial w}F_{Z,W}(z,w) = \begin{cases} f_{X,Y}(z,w) + f_{X,Y}(w,z), & \text{if}~w > z,\\ \\ 0, & \text{if}~w < z. \end{cases}$$
The conditional density of $W$ given $Z = z$ is
$$ f_{W \mid Z}(w \mid z) = \frac{f_{Z,W}(z,w)}{f_Z(z)} = \begin{cases} \dfrac{f_{X,Y}(z,w) + f_{X,Y}(w,z)}{\int_z^{\infty} [f_{X,Y}(z,w) + f_{X,Y}(w,z)]\ \mathrm dw}, & w > z,\\ 0, & w < z, \end{cases} $$
and so, with $f_{X,Y}(x,y) = e^{-x-y}$ for $x, y \geq 0$,
$$ \begin{align*}E[W \mid Z = z] &= \frac{\int_z^\infty w\cdot[f_{X,Y}(z,w) + f_{X,Y}(w,z)]\ \mathrm dw}{ \int_z^\infty [f_{X,Y}(z,w) + f_{X,Y}(w,z)]\ \mathrm dw}\\ &= \frac{\int_z^\infty (w\cdot e^{-w-z} + w\cdot e^{-w-z})\ \mathrm dw}{ \int_z^\infty (e^{-w-z} + e^{-w-z})\ \mathrm dw}\\ &= \frac{2e^{-z}\int_z^\infty w\cdot e^{-w}\ \mathrm dw}{ 2e^{-2z}} = \frac{2e^{-z}\left[\left . (-we^{-w})\right\vert_z^{\infty} + \int_z^{\infty}e^{-w}\ \mathrm dw\right]}{2e^{-2z}}\\ &= \frac{2e^{-z}\left(ze^{-z} + e^{-z}\right)}{2e^{-2z}} = 1 + z. \end{align*} $$
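
The final identity $E[W \mid Z = z] = 1 + z$ can also be checked empirically by conditioning on the minimum falling in a narrow bin around a few values of $z$. Below is a simulation sketch (NumPy assumed; the bin width and sample size are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(1.0, 2_000_000)
y = rng.exponential(1.0, 2_000_000)
z_samples = np.minimum(x, y)
w_samples = np.maximum(x, y)

# Compare the sample mean of the max, restricted to min ~ z0, with 1 + z0.
for z0 in (0.2, 0.5, 1.0):
    mask = np.abs(z_samples - z0) < 0.02
    print(z0, w_samples[mask].mean(), 1 + z0)
```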