Variance of $T_n = \min_i \{ X_i \} + \max_i \{ X_i \}$
The distribution of either $\max$ or $\min$ is easy; the distribution of $\max+\min$ is more involved.
$$ \max\{X_1,\ldots,X_n\} \le x \text{ if and only if } (X_1\le x\ \&\ \cdots\ \&\ X_n\le x), $$ and by independence the probability of that is the $n$th power of $\Pr(X_1\le x).$ For an i.i.d. sample from the uniform distribution on $(0,\theta)$ this is $(x/\theta)^n$ for $0\le x\le\theta,$ and the density of $\max$ is the derivative of that with respect to $x.$ The density of $\min$ is found similarly. But $\max$ and $\min$ are positively correlated, so the distribution of their sum takes more work.
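A quick Monte Carlo sanity check of that closed form (the sample size $n=5,$ $\theta=2,$ and the evaluation point $x=1.5$ here are arbitrary choices for illustration):

```python
import random

random.seed(0)
n, theta = 5, 2.0
reps = 200_000
x = 1.5

# Empirical P(max <= x) versus the closed form (x/theta)^n
hits = sum(max(random.uniform(0, theta) for _ in range(n)) <= x
           for _ in range(reps))
empirical = hits / reps
exact = (x / theta) ** n
print(empirical, exact)  # the two should agree to a couple of decimals
```

With $200{,}000$ replications the standard error is about $0.001,$ so the empirical value should land within roughly $\pm 0.003$ of $(1.5/2)^5 \approx 0.237.$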
We have $f_{\min}(x) = \dfrac n {\theta^n} (\theta-x)^{n-1}$ for $0\le x\le\theta.$
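The "similar" computation, written out: the minimum exceeds $x$ precisely when every observation does, so
$$ \Pr(\min\{X_1,\ldots,X_n\} > x) = \Pr(X_1>x)^n = \left( \frac{\theta-x}{\theta} \right)^n, \qquad 0\le x\le\theta, $$
and differentiating $F_{\min}(x) = 1 - \left( \dfrac{\theta-x}{\theta} \right)^n$ gives the density above.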
Let $I$ be the index $i\in\{1,\ldots,n\}$ for which $X_i= \min.$ Then \begin{align} \Pr(\max+\min\le x) & = \operatorname E(\Pr(\max+\min\le x \mid \min, I)) \\[10pt] & = \operatorname E(\Pr(\max\le x-\min\mid \min,I)) \\[10pt] & = \operatorname E\left( \left( \frac{(x - \min)-\min}{\theta - \min}\right)^{n-1} \right) \\[10pt] & = \int_0^\theta \left( \frac{x-2u}{\theta-u} \right)^{n-1} \frac n {\theta^n} (\theta-u)^{n-1} \, du \\[10pt] & = \frac n {\theta^n} \int_0^\theta (x-2u)^{n-1} \, du \end{align} (The third equality uses the fact that, given $\min=u$ and $I=i,$ the remaining $n-1$ observations are i.i.d. uniform on $(u,\theta).$) The integrand must be truncated to those $u$ satisfying $0 \le x-2u \le \theta-u$ — the conditional probability is $0$ below that range and $1$ above it — and the result is a function of $x$ and $\theta.$ Differentiating with respect to $x$ gives the density of $\max+\min.$
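Carrying the integral through: for $0\le x\le\theta$ the conditional probability vanishes once $u > x/2,$ so
$$ \Pr(\max+\min\le x) = \frac n {\theta^n} \int_0^{x/2} (x-2u)^{n-1} \, du = \frac{x^n}{2\theta^n}, $$
and for $\theta\le x\le 2\theta$ the same kind of bookkeeping gives $1 - \dfrac{(2\theta-x)^n}{2\theta^n}.$ The density of $T_n = \min+\max$ is therefore symmetric about $\theta,$ so $\operatorname E(T_n) = \theta,$ and
$$ \operatorname{Var}(T_n) = 2\int_0^\theta (\theta-x)^2 \, \frac{n x^{n-1}}{2\theta^n} \, dx = n\theta^2 \, \frac{\Gamma(n)\Gamma(3)}{\Gamma(n+3)} = \frac{2\theta^2}{(n+1)(n+2)}. $$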
The best unbiased estimator of $\theta$ in these circumstances is $\dfrac{n+1} n \max\{X_1,\ldots,X_n\}.$ Since the conditional distribution of $\min$ given $\max$ does not depend on $\theta,$ bringing $\min$ into the estimation process after $\max$ is already there just adds noise and makes the variance bigger.
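A simulation makes the comparison concrete. Since $\operatorname E(\min) = \theta/(n+1)$ and $\operatorname E(\max) = n\theta/(n+1),$ the sum $\min+\max$ is also unbiased for $\theta,$ but its variance $\dfrac{2\theta^2}{(n+1)(n+2)}$ exceeds the variance $\dfrac{\theta^2}{n(n+2)}$ of $\dfrac{n+1}{n}\max$ for every $n\ge 2.$ A sketch checking both against Monte Carlo estimates (parameters chosen arbitrarily):

```python
import random

random.seed(1)
n, theta, reps = 5, 1.0, 200_000

est_max, est_mid = [], []
for _ in range(reps):
    xs = [random.uniform(0, theta) for _ in range(n)]
    est_max.append((n + 1) / n * max(xs))  # rescaled maximum, unbiased
    est_mid.append(min(xs) + max(xs))      # min + max, also unbiased

def var(v):
    m = sum(v) / len(v)
    return sum((t - m) ** 2 for t in v) / len(v)

# Empirical variances versus the two closed forms above
print(var(est_max), theta**2 / (n * (n + 2)))            # both near 1/35
print(var(est_mid), 2 * theta**2 / ((n + 1) * (n + 2)))  # both near 1/21
```

With $n=5$ the max-based estimator's variance is $1/35 \approx 0.029$ against $1/21 \approx 0.048$ for $\min+\max,$ illustrating the "added noise" point.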