Maximum Likelihood Estimator for $\theta$ when $X_1,\dots, X_n \sim U(-\theta,\theta)$
I don't follow your solution, so here is a derivation from scratch.
Assume $\theta > 0$. Setting $y_i = |x_i|$ for $i = 1, \dots, n$, we have
$$\begin{align} L(\theta)=\prod_{i=1}^{n}f_{X_i}(x_i)&=\prod_{i=1}^{n}\left(\dfrac{1}{2\theta}\right)\mathbb{I}_{[-\theta, \theta]}(x_i) \\ &=\left(\dfrac{1}{2\theta}\right)^n\prod_{i=1}^{n}\mathbb{I}_{[-\theta, \theta]}(x_i) \\ &= \left(\dfrac{1}{2\theta}\right)^n\prod_{i=1}^{n}\mathbb{I}_{[0, \theta]}(|x_i|) \\ &= \left(\dfrac{1}{2\theta}\right)^n\prod_{i=1}^{n}\mathbb{I}_{[0, \theta]}(y_i)\text{.} \end{align}$$ Assume that $y_i \in [0, \theta]$ for all $i = 1, \dots, n$ (otherwise $L(\theta) = 0$ because $\mathbb{I}_{[0, \theta]}(y_j) = 0$ for at least one $j$, which obviously does not yield the maximum value of $L$). Then I claim the following:
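As a quick numerical sanity check of the step above (the data, seed, and true $\theta$ are made up for illustration), the product of indicators over the $x_i$ collapses to a single condition on $\max_i |x_i|$:

```python
import numpy as np

rng = np.random.default_rng(0)          # illustrative seed
theta_true = 2.0                        # assumed true parameter, arbitrary
x = rng.uniform(-theta_true, theta_true, size=5)
y = np.abs(x)                           # y_i = |x_i|

def likelihood(theta, x):
    """L(theta) = (1/(2*theta))^n * prod_i 1{|x_i| <= theta}."""
    n = len(x)
    return (1.0 / (2.0 * theta)) ** n * np.all(np.abs(x) <= theta)

# Product of n indicators vs. the single collapsed indicator on max |x_i|:
theta = 2.5
direct = (1.0 / (2.0 * theta)) ** len(x) * np.prod(np.abs(x) <= theta)
collapsed = (1.0 / (2.0 * theta)) ** len(x) * (np.max(y) <= theta)
assert np.isclose(direct, collapsed)
```

The `likelihood` helper is a hypothetical name for this sketch; it returns $0$ whenever any $|x_i|$ exceeds $\theta$, matching the indicator argument in the derivation.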
Claim. $y_1, \dots, y_n \in [0, \theta]$ if and only if $\max_{1 \leq i \leq n}y_i = y_{(n)} \leq \theta$ and $\min_{1 \leq i \leq n}y_i = y_{(1)}\geq 0$.
I leave the proof to you. From the claim above, and observing that $y_{(1)} \leq y_{(n)}$, we have $$L(\theta) = \left(\dfrac{1}{2\theta}\right)^n\prod_{i=1}^{n}\mathbb{I}_{[0, \theta]}(y_i) = \left(\dfrac{1}{2\theta}\right)^n\mathbb{I}_{[0, y_{(n)}]}(y_{(1)})\mathbb{I}_{[y_{(1)}, \theta]}(y_{(n)}) \text{.}$$ Viewing $L$ as a function of $\theta > 0$, note that $\left(\dfrac{1}{2\theta}\right)^n$ is strictly decreasing in $\theta$, so $\theta$ should be as small as possible to maximize $L$. Furthermore, the product of indicators $$\mathbb{I}_{[0, y_{(n)}]}(y_{(1)})\mathbb{I}_{[y_{(1)}, \theta]}(y_{(n)})$$ is non-zero if and only if $\theta \geq y_{(n)}$: the first factor always equals $1$, since $0 \leq y_{(1)} \leq y_{(n)}$. Hence $y_{(n)}$ is the smallest value of $\theta$ for which $L(\theta) > 0$, and $$\hat{\theta}_{\text{MLE}} = y_{(n)} = \max_{1 \leq i \leq n} y_i = \max_{1 \leq i \leq n }|x_i|\text{,}$$ as desired.
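To sanity-check the final answer numerically, a grid search over $\theta$ should place the maximizer of $L$ right at $\max_i |x_i|$. This is an illustrative sketch with simulated data; the true $\theta = 3$, the sample size, and the grid are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)          # illustrative seed
theta_true = 3.0                        # assumed true parameter, arbitrary
x = rng.uniform(-theta_true, theta_true, size=50)

def likelihood(theta, x):
    """L(theta) = (1/(2*theta))^n * 1{max |x_i| <= theta}."""
    n = len(x)
    return (1.0 / (2.0 * theta)) ** n * np.all(np.abs(x) <= theta)

grid = np.linspace(0.01, 6.0, 100_000)
L_vals = np.array([likelihood(t, x) for t in grid])
theta_grid_max = grid[np.argmax(L_vals)]
theta_mle = np.max(np.abs(x))           # the closed-form MLE derived above

# The grid maximizer sits at max |x_i|, up to the grid spacing:
assert abs(theta_grid_max - theta_mle) < 1e-3
```

The likelihood is zero on the grid points below $\max_i |x_i|$ and strictly decreasing above it, so the grid search lands on the first grid point at or above the MLE, consistent with the derivation.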