Very slow convergence of the Picard method for solving a nonlinear system of equations
Here's a simple idea you could try with very little extra effort: your GIF shows the iteration oscillating back and forth, a phenomenon that can also occur in classical gradient descent when the problem is ill-conditioned. A very popular and powerful method to alleviate this kind of problem is called momentum, which basically consists of averaging over previous iterates.
So instead of throwing away all the previous iterates, you can do something like
$$ x_{k+1} = (1-\beta)g(x_{k}) + \beta x_k$$
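In code, this is a one-line change to a plain Picard loop. Here's a minimal scalar sketch in Python (the function name, tolerance, and iteration cap are my own choices):

```python
import math

def fixed_point_momentum(g, x0, beta=0.0, tol=1e-12, max_iter=100):
    """Damped fixed-point iteration: x_{k+1} = (1 - beta)*g(x_k) + beta*x_k."""
    x = x0
    for _ in range(max_iter):
        x_new = (1 - beta) * g(x) + beta * x
        if abs(x_new - x) < tol:   # stop once successive iterates agree
            break
        x = x_new
    return x_new

print(fixed_point_momentum(math.cos, 6.0, beta=0.4))  # ~0.7390851332
```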
Note that when $\beta=0$, we recover the standard fixed-point iteration. Consider a simple fixed-point problem like $x=\cos(x)$, which exhibits the same oscillatory behaviour. Then, starting from the same seed, here are the errors $|x_*-x_k|$ for different values of $\beta$:
$$\small
\begin{array}{lllllll}
k & \beta=0 & \beta=0.1 & \beta=0.2 & \beta=0.3 & \beta=0.4 & \beta=0.5 \\\hline
0 & 5.45787 & 5.45787 & 5.45787 & 5.45787 & 5.45787 & 5.45787 \\
1 & 0.2572 & 0.777267 & 1.29733 & 1.8174 & 2.33747 & 2.85754 \\
2 & 0.19566 & 0.538475 & 0.690985 & 0.555697 & 0.107195 & 0.610102 \\
3 & 0.116858 & 0.162927 & 0.0696096 & 0.00419339 & 0.00218156 & 0.0454083 \\
4 & 0.0835784 & 0.0908543 & 0.0249916 & 0.000723828 & 8.0351e-06 & 0.0070347 \\
5 & 0.053654 & 0.0431759 & 0.00828335 & 0.000124022 & 3.34983e-08 & 0.0011389 \\
6 & 0.0371882 & 0.0224696 & 0.00282738 & 2.12772e-05 & 1.39595e-10 & 0.000185622 \\
7 & 0.0245336 & 0.0112062 & 0.000955803 & 3.64953e-06 & 5.81757e-13 & 3.02859e-05 \\
8 & 0.0167469 & 0.00571477 & 0.000324182 & 6.26001e-07 & 2.44249e-15 & 4.94232e-06 \\
9 & 0.0111768 & 0.00288222 & 0.000109831 & 1.07377e-07 & 1.11022e-16 & 8.06552e-07
\end{array}
$$
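If you want to reproduce the table, a short script along these lines does it (the seed isn't stated above, so I infer $x_0 = x_* + 5.45787$ from the $k=0$ row; that's an assumption):

```python
import math

x_star = 0.7390851332151607    # fixed point of cos(x), to machine precision

def errors(beta, x0, n=10):
    """Errors |x_* - x_k| of the momentum iteration for x = cos(x)."""
    x, out = x0, []
    for _ in range(n):
        out.append(abs(x_star - x))
        x = (1 - beta) * math.cos(x) + beta * x
    return out

x0 = x_star + 5.45787          # seed inferred from the k = 0 error (assumed)
for beta in (0.0, 0.1, 0.2, 0.3, 0.4, 0.5):
    print(f"beta={beta}:", ["%.6g" % e for e in errors(beta, x0)])
```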
A well-chosen momentum can speed up convergence tremendously! A variant of this idea specific to fixed-point iterations appears to be known as Anderson acceleration.
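For completeness, here's a minimal scalar sketch of Anderson acceleration with window $m=1$ (in one dimension it reduces to a secant-like update; the function name and the division guard are my own):

```python
def anderson_m1(g, x0, n=20):
    """Scalar Anderson acceleration, window m = 1 (sketch only).

    With residual f(x) = g(x) - x, the weight alpha is chosen so the
    combination alpha*f_prev + (1 - alpha)*f vanishes, and the new
    iterate mixes g(x_prev) and g(x) with the same weights.
    """
    x_prev, x = x0, g(x0)              # start with one plain Picard step
    for _ in range(n):
        f_prev, f = g(x_prev) - x_prev, g(x) - x
        if f == f_prev:                # guard against division by zero
            break
        alpha = f / (f - f_prev)
        x_prev, x = x, alpha * g(x_prev) + (1 - alpha) * g(x)
    return x
```

For instance, `anderson_m1(math.cos, 6.0)` homes in on $0.7390851\ldots$ in a handful of iterations, without any hand-tuned $\beta$.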