Variance of the maximum
Your solution looks good. One can get the same bound using the Efron-Stein inequality. Using the notation in your answer, \begin{align} \operatorname{Var}(\max_i X_i) &\le \frac{1}{2}\sum_{k=1}^n E[(\max_i X_i - \max\{X_1,\ldots, X_{k-1}, Y_k, X_{k+1}, \ldots, X_n\})^2] \\ &\le \frac{1}{2} \sum_{k=1}^n E[(X_k - Y_k)^2]. \end{align} To justify the last inequality, note that
- if $\max_i X_i > \max\{X_1,\ldots, X_{k-1}, Y_k, X_{k+1}, \ldots, X_n\}$, then the first maximum must be attained at index $k$ (otherwise the maximizer would also appear in the second maximum, which would then be at least $\max_i X_i$), so $X_k = \max_i X_i > \max\{X_1,\ldots, X_{k-1}, Y_k, X_{k+1}, \ldots, X_n\} \ge Y_k$, and the difference of the two maxima is at most $X_k - Y_k$;
- if $\max_i X_i < \max\{X_1,\ldots, X_{k-1}, Y_k, X_{k+1}, \ldots, X_n\}$, then the second maximum must be attained at $Y_k$ (every other entry is at most $\max_i X_i$), so $Y_k = \max\{X_1,\ldots, X_{k-1}, Y_k, X_{k+1}, \ldots, X_n\} > \max_i X_i \ge X_k$, and the difference of the two maxima is at most $Y_k - X_k$.

In either case $|\max_i X_i - \max\{X_1,\ldots, X_{k-1}, Y_k, X_{k+1}, \ldots, X_n\}| \le |X_k - Y_k|$, which gives the second inequality. Finally, since $Y_k$ is an independent copy of $X_k$, we have $E[(X_k - Y_k)^2] = 2\operatorname{Var}(X_k)$, so the right-hand side equals $\sum_{k=1}^n \operatorname{Var}(X_k)$, as in your answer.
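Not part of the argument above, but a quick way to see the chain of inequalities in action: the sketch below estimates each quantity by Monte Carlo, assuming independent standard exponential $X_i$ (an arbitrary illustrative choice) and using NumPy. Up to simulation error it should show $\operatorname{Var}(\max_i X_i) \le \frac{1}{2}\sum_k E[(\max_i X_i - \max\{X_1,\ldots,Y_k,\ldots,X_n\})^2] \le \frac{1}{2}\sum_k E[(X_k - Y_k)^2] = \sum_k \operatorname{Var}(X_k)$.

```python
# Monte Carlo sanity check of the Efron-Stein chain for the maximum of
# independent random variables (standard exponentials here, an arbitrary choice).
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 200_000                      # n variables, m Monte Carlo replications

X = rng.exponential(size=(m, n))       # each row is one draw of (X_1, ..., X_n)
Y = rng.exponential(size=(m, n))       # independent copy (Y_1, ..., Y_n)

max_X = X.max(axis=1)

# Middle term of the chain: resample one coordinate at a time.
efron_stein = 0.0
for k in range(n):
    Xk = X.copy()
    Xk[:, k] = Y[:, k]                 # replace X_k by its independent copy Y_k
    efron_stein += 0.5 * np.mean((max_X - Xk.max(axis=1)) ** 2)

print(f"Var(max X_i)               ~ {max_X.var():.4f}")
print(f"(1/2) sum E[(max - max_k)^2] ~ {efron_stein:.4f}")
print(f"(1/2) sum E[(X_k - Y_k)^2]   ~ {0.5 * np.mean((X - Y) ** 2, axis=0).sum():.4f}")
print(f"sum Var(X_k)               ~ {X.var(axis=0).sum():.4f}")
```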
Here is the trick. Let $(Y_1,\dots,Y_n)$ be an independent copy of $(X_1,\dots,X_n)$. Since $E(U-V)^2 = 2\operatorname{Var}(U)$ when $U$ and $V$ are i.i.d., \begin{align*} 2\operatorname{Var}(\max_i X_i) = E(\max_i X_i - \max_i Y_i)^2 = \int_0^\infty P((\max_i X_i - \max_i Y_i)^2>t)\,dt. \end{align*}

If $X_{i^*} = \max_i X_i$ and $Y_{j^*} = \max_j Y_j$, then since $Y_{j^*} \geq Y_{i^*}$ and $X_{i^*} \geq X_{j^*}$, the event $|X_{i^*}-Y_{j^*}| > \sqrt{t}$ implies either \begin{align*} X_{i^*}-Y_{i^*}>\sqrt{t} \hspace{1em}\text{ or } \hspace{1em}Y_{j^*}-X_{j^*} > \sqrt{t}. \end{align*}

Hence \begin{align*} \int_0^\infty P((\max_i X_i - \max_i Y_i)^2 > t)\, dt &\leq \int_0^\infty P\left(\bigcup_{i=1}^n \{(X_i - Y_i)^2 > t\}\right) dt \\ &\leq \sum_{i=1}^n \int_0^\infty P((X_i-Y_i)^2>t)\,dt \\ &= \sum_{i=1}^n E(X_i-Y_i)^2 \\ &= 2\sum_{i=1}^n \operatorname{Var}(X_i), \end{align*} and dividing by $2$ gives $\operatorname{Var}(\max_i X_i) \le \sum_{i=1}^n \operatorname{Var}(X_i)$.
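As a complement (not part of the answer above), here is a minimal numerical sketch of the same symmetrization argument, assuming independent standard normal $X_i$ purely for illustration and using NumPy. It checks the pointwise inequality $|\max_i X_i - \max_i Y_i| \le \max_i |X_i - Y_i|$ that drives the union bound, and compares $2\operatorname{Var}(\max_i X_i)$, $E(\max_i X_i - \max_i Y_i)^2$, and $2\sum_i \operatorname{Var}(X_i)$.

```python
# Numerical check of the symmetrization argument with an independent copy (Y_i).
import numpy as np

rng = np.random.default_rng(1)
n, m = 5, 200_000

X = rng.standard_normal((m, n))        # rows are draws of (X_1, ..., X_n)
Y = rng.standard_normal((m, n))        # independent copy (Y_1, ..., Y_n)

max_X, max_Y = X.max(axis=1), Y.max(axis=1)

# Pointwise inequality behind the union bound: it holds on every sample.
assert np.all(np.abs(max_X - max_Y) <= np.abs(X - Y).max(axis=1))

print(f"2 Var(max X_i)       ~ {2 * max_X.var():.4f}")
print(f"E(max X - max Y)^2   ~ {np.mean((max_X - max_Y) ** 2):.4f}")
print(f"sum E(X_i - Y_i)^2   ~ {np.mean((X - Y) ** 2, axis=0).sum():.4f}")
print(f"2 sum Var(X_i)       ~ {2 * X.var(axis=0).sum():.4f}")
```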