Symmetric matrices with trace zero
OK, here is a complete answer: every traceless complex symmetric matrix $A$ can be written as a commutator of the form $[R,R^T]$ for some matrix $R$. Since comments have been left on my old answer, instead of editing it, I give a new one here. An idea similar to the ones employed in my answers to other related problems (see q95537 and q251678) applies. The key observation is the following:
Lemma. Suppose $A$ is an $n\times n$ traceless complex symmetric matrix.
(a) When $n$ is odd, $A$ is complex orthogonally similar to a matrix whose last diagonal entry is zero.
(b) When $n$ is even, $A$ is complex orthogonally similar to a matrix whose trailing principal $2\times2$ submatrix and leading principal $(n-2)\times(n-2)$ submatrix are both traceless.
Proof of the lemma. Suppose $n$ is odd. If some diagonal entry of $A$ is zero, we are done after a permutation similarity, so assume that none of the diagonal entries of $A$ is zero. As $A$ is also traceless, it must have two diagonal entries $x$ and $z$ such that $x\ne\pm z$: otherwise every diagonal entry would equal $\pm x$ for one fixed nonzero $x$, say $p$ entries equal to $x$ and $n-p$ equal to $-x$, and the trace $(2p-n)x$ would be nonzero because $2p-n$ is odd. If $s$ and $c$ are two complex numbers such that $s^2+c^2=1$, then, working inside the $2\times2$ principal submatrix determined by the two chosen diagonal positions, $$ \pmatrix{c&s\\ -s&c}\pmatrix{x&y\\ y&z}\pmatrix{c&-s\\ s&c} = \pmatrix{\ast&\ast\\ \ast&xs^2-2ysc+zc^2}. $$ So, to prove the lemma for the odd case, it suffices to show that $xs^2-2ysc+zc^2=0$ is solvable in $(s,c)$. Using $s^2=1-c^2$, rewrite the equation as $x+(z-x)c^2=2ysc$. Squaring both sides, we get $x^2 + 2x(z-x)c^2 + (z-x)^2c^4 = 4y^2(c^2-c^4)$, i.e. $$ [(z-x)^2+4y^2]c^4 + [2x(z-x) - 4y^2]c^2 + x^2 = 0.\tag{$\ast$} $$ Since the constant term $x^2$ is nonzero, $(\ast)$ could fail to be solvable only if the coefficients of $c^4$ and $c^2$ both vanished. However, this cannot occur: otherwise we would have $(z-x)^2 = -4y^2 = -2x(z-x)$ and in turn $(z+x)(z-x)=0$, contradicting our assumption that $x\ne\pm z$. Therefore $(\ast)$ has a root $c\in\mathbb C$, and we take $s$ to be a square root of $1-c^2$, with the sign chosen so that $x+(z-x)c^2=2ysc$ (squaring may have introduced a sign discrepancy, which flipping the sign of $s$ repairs). That is, $A$ is orthogonally similar (via a "complex Givens rotation") to a matrix with a zero diagonal entry. Performing a similarity transform via a permutation matrix then moves this zero entry to the last diagonal position, and the result follows.
Now suppose $n$ is even. If $A$ has a pair of diagonal entries $(x,-x)$, the result follows by permutation similarity (move them to the last two diagonal positions). Otherwise, not all diagonal entries are equal (all equal would force them all to be zero by tracelessness, giving the pair $(0,0)$), so $A$ has two diagonal entries $x$ and $z$ such that $x\ne\pm z$, and the above argument shows that $A$ is orthogonally similar to a matrix whose last diagonal entry is zero. But then the leading principal $(n-1)\times(n-1)$ submatrix of this matrix is traceless, symmetric and of odd order. Applying (a) to this submatrix makes the $(n-1)$-th diagonal entry zero as well, so the trailing $2\times2$ principal submatrix is traceless, and hence so is the leading $(n-2)\times(n-2)$ one. $\square$
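Readers who want to see the rotation step in action can run the following NumPy sketch (my own illustration, not part of the proof): it solves $(\ast)$ for $c^2$, recovers $s$, and checks that the $(2,2)$ entry of the rotated $2\times2$ block vanishes. The test values of $x,y,z$ are arbitrary subject to $x\ne\pm z$.

```python
import numpy as np

# Numeric illustration of the lemma's rotation step: solve (*) for c^2,
# recover s with s^2 + c^2 = 1, and verify that the (2,2) entry vanishes.
x, y, z = 3.0, 2.0, 1.0   # arbitrary test values with x != +-z

# coefficients of (*), read as a quadratic in t = c^2
t = np.roots([(z - x)**2 + 4*y**2, 2*x*(z - x) - 4*y**2, x**2])[0]
c = np.sqrt(complex(t))
s = np.sqrt(1 - c**2)
if not np.isclose(x + (z - x)*c**2, 2*y*s*c):   # undo the sign lost in squaring
    s = -s

G = np.array([[c, s], [-s, c]])                 # "complex Givens rotation"
M = np.array([[x, y], [y, z]])
print(np.round((G @ M @ G.T)[1, 1], 12))        # 0 (up to roundoff)
```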
Proof of the main statement. We now prove that every traceless complex symmetric matrix $A$ can be written as a commutator of the form $[R,R^T]$. It turns out that the construction of $R$ is fairly simple. Split $R$ into the sum of its symmetric part $H=\frac12(R+R^T)$ and skew-symmetric part $K=\frac12(R-R^T)$. A straightforward calculation shows that $[R,R^T]=2[K,H]$. So the problem boils down to solving $\frac A2=[K,H]$. Note also that if $Q$ is complex orthogonal and $QAQ^T=[R,R^T]$, then $A=[Q^TRQ,(Q^TRQ)^T]$, so we are free to replace $A$ by any orthogonally similar matrix. We will deal with even $n$ and odd $n$ separately.
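Before diving into the two cases, here is a throwaway NumPy check of the identity $[R,R^T]=2[K,H]$ (my own sanity check, on arbitrary random data):

```python
import numpy as np

# Verify [R, R^T] = 2[K, H] for H = (R + R^T)/2 (symmetric part) and
# K = (R - R^T)/2 (skew-symmetric part) of a random complex R.
rng = np.random.default_rng(0)
R = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
H, K = (R + R.T) / 2, (R - R.T) / 2
comm = lambda X, Y: X @ Y - Y @ X
print(np.allclose(comm(R, R.T), 2 * comm(K, H)))   # True
```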
Case 1: $n$ is even. We will prove by mathematical induction that $\frac A2=[K,H]$ is solvable with a nonsingular $K$. The base case $n=2$ is easy: $$ \underbrace{\pmatrix{x&y\\ y&-x}}_{A/2} = \underbrace{\pmatrix{0&-1\\ 1&0}}_K \underbrace{\pmatrix{y&-\tfrac x2\\ -\tfrac x2&0}}_H - \pmatrix{y&-\tfrac x2\\ -\tfrac x2&0}\pmatrix{0&-1\\ 1&0}. $$ Now suppose the statement holds for matrices of order $n-2$, and consider a traceless symmetric matrix $A$ of order $n$. Partition $$ \frac A2=\left[\begin{array}{c|c} A_{n-2}&\begin{array}{cc}\mathbf a_1&\mathbf a_2\end{array}\\ \hline \begin{array}{c}\mathbf a_1^T\\ \mathbf a_2^T\end{array}&A_2 \end{array}\right], \quad H=\left[\begin{array}{c|c} H_{n-2}&\begin{array}{cc}\mathbf h_1&\mathbf h_2\end{array}\\ \hline \begin{array}{c}\mathbf h_1^T\\ \mathbf h_2^T\end{array}&H_2 \end{array}\right], \quad K=\left[\begin{array}{c|c} K_{n-2}&0\\ \hline0&K_2\end{array}\right], $$ where $A_{n-2},\,H_{n-2},\,K_{n-2}$ are $(n-2)\times(n-2)$ and $A_2,\,H_2,\,K_2$ are $2\times2$. By part (b) of our lemma, we may assume that both $A_{n-2}$ and $A_2$ are traceless. As $K_2$ is skew-symmetric, $K_2=\alpha Q$ for some complex scalar $\alpha$, where $Q=\pmatrix{0&-1\\ 1&0}$. So, the equation $\frac A2=KH-HK$ is equivalent to the following system of equations: \begin{align} K_{n-2}H_{n-2}-H_{n-2}K_{n-2}&=A_{n-2},\tag{1}\\ K_2H_2-H_2K_2&=A_2,\tag{2}\\ \pmatrix{K_{n-2}&-\alpha I_{n-2}\\ \alpha I_{n-2}&K_{n-2}}\pmatrix{\mathbf h_1\\ \mathbf h_2} &=\pmatrix{\mathbf a_1\\ \mathbf a_2}.\tag{3} \end{align} By the induction hypothesis, $(1)$ is solvable with a nonsingular $K_{n-2}$; by the base case, $(2)$ is solvable with a nonsingular $K_2$. For $(3)$, since $K_{n-2}$ commutes with $\alpha I_{n-2}$, the determinant of the square matrix on the left side of $(3)$ equals $\det(K_{n-2}^2+\alpha^2 I_{n-2})$. As $K_2=\alpha Q$, replacing $(\alpha,H_2)$ by $(t\alpha,H_2/t)$ leaves $(2)$ intact, so $\alpha$ can be rescaled to any nonzero value; choose it to avoid the finitely many values for which $\det(K_{n-2}^2+\alpha^2 I_{n-2})=0$. Therefore $(3)$ is also solvable and $K$ is nonsingular.
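Two quick numeric sanity checks for this case (again my own illustrations, with arbitrary test data): the $2\times2$ base-case identity, and the determinant identity behind the solvability of $(3)$:

```python
import numpy as np

# Check 1: the 2x2 base case. For arbitrary x, y, the displayed K and H
# satisfy KH - HK = A/2 with A/2 = [[x, y], [y, -x]].
x, y = 1.7, -0.6
Q = np.array([[0.0, -1.0], [1.0, 0.0]])
H2 = np.array([[y, -x/2], [-x/2, 0.0]])
print(np.allclose(Q @ H2 - H2 @ Q, [[x, y], [y, -x]]))   # True

# Check 2: since alpha*I commutes with K, the block matrix on the left of
# (3) has determinant det(K^2 + alpha^2 I).
rng = np.random.default_rng(0)
K = rng.standard_normal((4, 4))
K = K - K.T                        # random skew-symmetric stand-in for K_{n-2}
alpha = 0.8
B = np.block([[K, -alpha * np.eye(4)], [alpha * np.eye(4), K]])
print(np.isclose(np.linalg.det(B),
                 np.linalg.det(K @ K + alpha**2 * np.eye(4))))   # True
```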
Case 2: $n$ is odd. The case $n=1$ is trivial. Now suppose $n\ge3$. By part (a) of our lemma, we may assume that the last diagonal entry of $A$ is zero. Let $$ \frac A2=\left[\begin{array}{c|c}A_{n-1}&\mathbf a\\ \hline\mathbf a^T&0\end{array}\right], \quad H=\left[\begin{array}{c|c}H_{n-1}&\mathbf h\\ \hline\mathbf h^T&0\end{array}\right], \quad K=\left[\begin{array}{c|c}K_{n-1}&0\\ \hline0&0\end{array}\right], $$ where $A_{n-1},H_{n-1},K_{n-1}$ are $(n-1)\times(n-1)$; note that $A_{n-1}$ is traceless because $A$ is traceless and its last diagonal entry vanishes. The equation $\frac A2=KH-HK$ is then equivalent to the system of equations \begin{align} K_{n-1}H_{n-1}-H_{n-1}K_{n-1}&=A_{n-1},\tag{4}\\ K_{n-1}\mathbf h&=\mathbf a\tag{5}. \end{align} By the result for the even case, $(4)$ is solvable with a nonsingular skew-symmetric matrix $K_{n-1}$. Since $K_{n-1}$ is nonsingular, $(5)$ is also solvable. $\square$
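Concretely, once $(4)$ has produced a nonsingular $K_{n-1}$, step $(5)$ is a single linear solve, as in this small sketch (the random skew-symmetric matrix below merely stands in for an actual solution of $(4)$; a random skew-symmetric matrix of even order is generically nonsingular):

```python
import numpy as np

# Step (5): with a nonsingular skew-symmetric K_{n-1} in hand, h is
# recovered from a by one linear solve.
rng = np.random.default_rng(1)
n = 5                                  # odd, so n - 1 is even
K = rng.standard_normal((n - 1, n - 1))
K = K - K.T                            # skew-symmetric stand-in for K_{n-1}
a = rng.standard_normal(n - 1)
h = np.linalg.solve(K, a)              # equation (5): K_{n-1} h = a
print(np.allclose(K @ h, a))           # True
```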
Here you can find an alternative proof for the real symmetric case. I am writing to answer Clin's question. Notice that the approach used by user1551 in his post of May 13 gives a different solution.
Let $S$ be a real symmetric matrix of order $k$ with trace zero, and let $S=\sum_{i=1}^n\lambda_iv_iv_i^t$ be a spectral decomposition of $S$, where $\lambda_1\geq\lambda_2\geq\ldots\geq\lambda_n$ are the nonzero eigenvalues of $S$ and $v_1,\dots,v_n$ are orthonormal.
Define $\beta_i=\lambda_1+\dots+\lambda_i$. Notice that $\beta_n=\operatorname{tr}(S)=0$.
Moreover, $\beta_i>0$ for $i<n$: if $\beta_i\le0$ for some $i<n$, take the smallest such $i$; then $\lambda_i<0$ (for $i=1$ because $\lambda_1=\beta_1\le0$ and $\lambda_1\ne0$, and for $i>1$ because $\lambda_i=\beta_i-\beta_{i-1}<0$), so $\lambda_{i+1},\dots,\lambda_n\le\lambda_i<0$ and $\beta_n=\beta_i+\lambda_{i+1}+\dots+\lambda_n<0$, contradicting $\beta_n=0$.
Define the $k\times(n-1)$ matrices $V$ and $W$ using the vectors $v_i$ as columns: $V=(v_1,\dots,v_{n-1})$ and $W=(v_2,\dots,v_{n})$. By orthonormality, $V^tV=W^tW=I_{n-1}$.
Define the $(n-1)\times(n-1)$ diagonal matrix $D$ with $\beta_1,\dots,\beta_{n-1}$ on the diagonal.
Notice that $S=\sum_{i=1}^n\lambda_iv_iv_i^t=\sum_{i=1}^{n-1}\beta_iv_iv_i^t-\sum_{i=1}^{n-1}\beta_iv_{i+1}v_{i+1}^t$. Therefore, $S=VDV^t-WDW^t$.
Finally, since $\beta_i>0$ for $i<n$, the entrywise square root $D^{\frac{1}{2}}$ is real, and $S=VD^{\frac{1}{2}}W^tWD^{\frac{1}{2}}V^t-WD^{\frac{1}{2}}V^tVD^{\frac{1}{2}}W^t=RR^t-R^tR$, where $R=VD^{\frac{1}{2}}W^t$.
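The construction is short enough to transcribe into NumPy. The sketch below (my own transcription; the names $V$, $W$, $D$, $R$ follow the text, and the random traceless symmetric $S$ is just test data) verifies $S=RR^t-R^tR$:

```python
import numpy as np

# Build R = V D^{1/2} W^t from the spectral decomposition of S and check
# that S = R R^t - R^t R.
rng = np.random.default_rng(2)
M = rng.standard_normal((6, 6))
S = M + M.T
S -= (np.trace(S) / 6) * np.eye(6)       # random traceless symmetric test matrix

lam, U = np.linalg.eigh(S)               # eigh lists eigenvalues in ascending order
lam, U = lam[::-1], U[:, ::-1]           # reorder so lambda_1 >= ... >= lambda_n
keep = np.abs(lam) > 1e-12               # keep only the nonzero eigenvalues
lam, U = lam[keep], U[:, keep]

beta = np.cumsum(lam)                    # beta_i = lambda_1 + ... + lambda_i
V, W = U[:, :-1], U[:, 1:]               # V = (v_1..v_{n-1}), W = (v_2..v_n)
D_half = np.diag(np.sqrt(beta[:-1]))     # real, since beta_i > 0 for i < n
R = V @ D_half @ W.T
print(np.allclose(R @ R.T - R.T @ R, S)) # True
```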