Independence of sum and difference for random vectors
$(X,Y)$ is a vector of dimension $2N$ whose entries are independent $N(0,1)$ random variables and so its covariance matrix is the $(2N)\times (2N)$ identity matrix $I_{2N}$. The $2N$ variables enjoy a multivariate normal distribution, that is, they are jointly Gaussian random variables, and $(X,Y)$ is called a Gaussian vector.
Any linear transformation of a Gaussian random vector results in a Gaussian vector (this is sometimes taken as the definition of a Gaussian vector), and so $$(X+Y,X-Y) = (X,Y)\left[\begin{matrix}1&1\\1&-1\end{matrix}\right] = (X_1,\ldots,X_N,Y_1,\ldots, Y_N)\left[\begin{matrix}I_N&I_N\\I_N&-I_N\end{matrix}\right]$$ is also a Gaussian vector with covariance matrix $$\left[\begin{matrix}I_N&I_N\\I_N&-I_N\end{matrix}\right]^TI_{2N}\left[\begin{matrix}I_N&I_N\\I_N&-I_N\end{matrix}\right] = \left[\begin{matrix}2I_N&0\\0&2I_N\end{matrix}\right],$$ which shows independence of $X+Y$ and $X-Y$ and also reveals that the $X_i\pm Y_i$ are $N(0,2)$ random variables.
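A quick numerical check of this covariance calculation (a sketch added here, not part of the original argument; the dimension $N$ and sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
N, n_samples = 3, 200_000

# X and Y have independent N(0,1) entries, so (X, Y) has covariance I_{2N}.
X = rng.standard_normal((n_samples, N))
Y = rng.standard_normal((n_samples, N))

S, D = X + Y, X - Y

# Empirical covariance of the 2N-vector (S, D); expect approximately
# [[2I, 0], [0, 2I]]: S and D are uncorrelated and each entry has variance 2.
emp_cov = np.cov(np.hstack([S, D]), rowvar=False)
print(np.round(emp_cov, 2))
```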
Now check whether all of the above still works when the zero-mean Gaussian vectors $X$ and $Y$ are independent of each other but $(X_1,X_2,\ldots, X_N)$ has the same covariance matrix $\Sigma$ as $(Y_1,Y_2,\ldots, Y_N)$, with $\Sigma$ not necessarily restricted to being the identity matrix or even a diagonal matrix; that is, the $X_i$ (and similarly the $Y_i$) need not be independent of each other.
Write $S = X+Y$ and $D = X-Y$. Since $(S,D)$ is a linear transformation of the jointly Gaussian vector $(X,Y)$, it is itself jointly Gaussian. So all you have to show is that $S$ and $D$ are uncorrelated. To do this, we calculate some expectations.
$E[SD^T] = E[(X+Y)(X-Y)^T] = E[XX^T]-E[XY^T]+E[YX^T]-E[YY^T] = \Sigma-0+0-\Sigma = 0$ (the cross terms vanish because $X$ and $Y$ are independent and zero-mean)
and
$E[S]E[D]^T = (E[X]+E[Y])(E[X]-E[Y])^T = (0+0)(0-0)^T = 0$.
Since the cross-covariance $E[SD^T] - E[S]E[D]^T$ is the zero matrix, $S$ and $D$ are uncorrelated; and because they are jointly Gaussian, uncorrelated implies independent.
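A simulation sketch of this general-$\Sigma$ case (the particular $\Sigma$ below is just an arbitrary positive-definite example, not anything from the question):

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples = 200_000

# An arbitrary (non-diagonal) covariance matrix shared by X and Y.
Sigma = np.array([[2.0, 0.8, 0.3],
                  [0.8, 1.5, 0.5],
                  [0.3, 0.5, 1.0]])

X = rng.multivariate_normal(np.zeros(3), Sigma, size=n_samples)
Y = rng.multivariate_normal(np.zeros(3), Sigma, size=n_samples)

S, D = X + Y, X - Y

# Empirical cross-covariance E[S D^T] - E[S] E[D]^T; expect roughly the zero matrix.
cross_cov = (S - S.mean(0)).T @ (D - D.mean(0)) / n_samples
print(np.round(cross_cov, 3))
```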
Off the top of my head, this is not going to work once the covariances differ: if $E[XX^T]\neq E[YY^T]$ but $X$ and $Y$ still have mean $0$, the cross-covariance above no longer vanishes and the whole argument falls apart. For non-Gaussian distributions, uncorrelated does not even imply independent, but since uncorrelatedness is still a necessary condition for independence, the same counterexample (unequal covariances) rules out independence there as well.
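To see that failure concretely, here is a small sketch with $X$ and $Y$ independent zero-mean normals of unequal variance (a hypothetical one-dimensional example added here, not taken from the answer above):

```python
import numpy as np

rng = np.random.default_rng(2)
n_samples = 200_000

# Independent zero-mean normals with *different* variances.
X = rng.normal(0.0, 1.0, n_samples)   # Var(X) = 1
Y = rng.normal(0.0, 2.0, n_samples)   # Var(Y) = 4

S, D = X + Y, X - Y

# Cov(S, D) = Var(X) - Var(Y) = -3, so S and D are correlated,
# hence not independent.
print(np.cov(S, D)[0, 1])
```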
The joint pdf of two iid $N(0,1)$ variables is a function of the distance from the origin alone. (You may have seen this as part of a derivation of the constant factor in the pdf of $\mathcal N(0,1).$)
This means you can rotate the pdf as much as you want around the origin and it will still be the joint pdf of two iid normal variables. You can also reflect the pdf around an axis without changing it, for example by substituting $-X$ for $X$.
Now consider what transformation is required to map $(X,Y)$ to $(S,D)$. (That may be a bit of overkill for the problem as asked, but it's a useful mental image to acquire.)
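Spelled out (this display is an added note, not part of the original answer): the map sending $(X,Y)$ to $(S,D)=(X+Y,X-Y)$ factors as a scaling by $\sqrt2$ composed with an orthogonal map (a reflection), since $$\left[\begin{matrix}1&1\\1&-1\end{matrix}\right] = \sqrt2\,\left[\begin{matrix}\tfrac1{\sqrt2}&\tfrac1{\sqrt2}\\\tfrac1{\sqrt2}&-\tfrac1{\sqrt2}\end{matrix}\right],\qquad \left[\begin{matrix}\tfrac1{\sqrt2}&\tfrac1{\sqrt2}\\\tfrac1{\sqrt2}&-\tfrac1{\sqrt2}\end{matrix}\right]^T\left[\begin{matrix}\tfrac1{\sqrt2}&\tfrac1{\sqrt2}\\\tfrac1{\sqrt2}&-\tfrac1{\sqrt2}\end{matrix}\right] = I_2.$$ The rotation/reflection invariance of the joint pdf then says that $S/\sqrt2$ and $D/\sqrt2$ are again iid $N(0,1)$, so $S$ and $D$ are independent $N(0,2)$ variables.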
This particular trick does not work for variables that are not independent. Nor does it necessarily work for iid variables that are not normally distributed.
It is not hard to come up with an example of two non-independent random variables $X$ and $Y$ such that $X+Y$ and $X-Y$ are also not independent.
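For instance (these constructions are added illustrations, not from the original answer): if $X$ and $Y$ are iid Bernoulli$(1/2)$, then $X+Y=2$ forces $X-Y=0$, while $P(X+Y=2)\,P(X-Y=1)>0$, so the sum and difference are dependent even though $X$ and $Y$ are iid (just not normal). And if $X\sim N(0,1)$ and $Y=\varepsilon X$ with $\varepsilon=\pm1$ a fair coin flip independent of $X$, then $X$ and $Y$ are each $N(0,1)$ but not independent, and exactly one of $X+Y$ and $X-Y$ is zero (almost surely), so they are clearly not independent either.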