Is Gauss-Seidel guaranteed to converge on positive *semi*definite matrices?

This is an interesting question, to which the answer is positive.

Here is the proof. Of course, for the method to make sense, we must assume that the diagonal $D>0$. The notation below is borrowed from Section 12.3.2 of my book *Matrices* (2nd edition, Springer-Verlag, GTM 216). Let $G=(D-E)^{-1}E^T$ be the iteration matrix. One checks easily that $\ker A$ is the eigenspace of $G$ associated with the eigenvalue $\lambda=1$. The corresponding eigenspace for $G^T$ is $(D-E)^T\ker A$, whose orthogonal complement $(D-E)^{-1}R(A)$ is therefore an invariant subspace of $G$. This can also be verified directly with the help of the formula $G(D-E)^{-1}A=(D-E)^{-1}AG$. It turns out that $1$ is a semi-simple eigenvalue: if $Gw=w+v$ with $Gv=v$, a short computation gives $v^T(D+A)v=0$, hence $v=0$ since $D>0$ and $A\ge0$. Therefore $${\mathbb R}^n=\ker A\oplus (D-E)^{-1}R(A)$$ is a decomposition into $G$-invariant subspaces.
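For readers who like to experiment, here is a quick numerical sanity check of these two facts. This is only a NumPy sketch on a randomly generated singular positive semidefinite matrix; the construction of $A$ and all variable names are my own, not from the book.

```python
import numpy as np

rng = np.random.default_rng(0)

# A random symmetric positive *semi*definite matrix with a nontrivial
# kernel: A = B^T B with B of deficient rank, so rank(A) = 3 < n = 5.
n, r = 5, 3
B = rng.standard_normal((r, n))
A = B.T @ B

D = np.diag(np.diag(A))            # diagonal part, D > 0 here
E = -np.tril(A, k=-1)              # A = D - E - E^T, E strictly lower
G = np.linalg.solve(D - E, E.T)    # Gauss-Seidel iteration matrix

# ker A is the eigenspace of G for the eigenvalue 1:
K = np.linalg.svd(A)[2][r:].T      # columns span ker A
print(np.allclose(G @ K, K))       # True

# The intertwining formula G (D-E)^{-1} A = (D-E)^{-1} A G:
L = np.linalg.solve(D - E, A)
print(np.allclose(G @ L, L @ G))   # True
```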

It remains to prove that for every vector $x^0$, the sequence $x^m:=G^mx^0$ is convergent. Let us decompose $x^m=y^m+z^m$ according to the invariant subspaces above. We have $y^m=y^0$, so this part is obviously convergent. It remains to show that the spectral radius of the restriction $g$ of $G$ to $(D-E)^{-1}R(A)$ is smaller than $1$. This is proved exactly the same way as in Lemma 20 of the reference: we have to show that if $x\in(D-E)^{-1}R(A)$, then $(Gx)^TA(Gx)\le x^TAx$, with equality only when $x=0$. With $y=(D-E)^{-1}Ax$, we have $$(Gx)^TA(Gx)-x^TAx=-y^TDy\le0.$$ If the right-hand side vanishes, then $y=0$, which means $x\in\ker A$, hence $x=0$. Q.E.D.
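One can also check the energy identity and the convergence of the powers numerically. The snippet below repeats the setup of the sketch above so that it runs on its own; again, the random example is mine, not from the reference.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 5, 3
B = rng.standard_normal((r, n))
A = B.T @ B                        # singular PSD, as before
D = np.diag(np.diag(A))
E = -np.tril(A, k=-1)
G = np.linalg.solve(D - E, E.T)

# Energy identity: (Gx)^T A (Gx) - x^T A x = -y^T D y, y = (D-E)^{-1} A x.
x0 = rng.standard_normal(n)
y = np.linalg.solve(D - E, A @ x0)
drop = (G @ x0) @ A @ (G @ x0) - x0 @ A @ x0
print(np.allclose(drop, -(y @ D @ y)))   # True

# Convergence of x^m = G^m x^0: the limit is the ker A component y^0.
x = x0.copy()
for _ in range(2000):
    x = G @ x
print(np.allclose(A @ x, 0))             # True: the limit lies in ker A
```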


Here are some general comments (drawn from the nice book *Accuracy and Stability of Numerical Algorithms* by N. J. Higham) that might prove helpful.

Suppose $A$ is singular. Split $A=M-N$, where $M$ is nonsingular. Iterate

$$Mx_{k+1}=Nx_k+b.$$

Define the "action" matrix $H=M^{-1}N$, in terms of which the above iteration becomes $$x_{k+1} = Hx_k + M^{-1}b.$$

Now, unrolling this recurrence from the starting vector $x_1$, we get

$$x_{k+1} = H^kx_1 + \sum_{j=0}^{k-1} H^jM^{-1}b.$$

For the sequence $(x_k)$ to converge for every starting vector, it is necessary that the sequence of powers $(H^k)$ converges: the difference of two iterations started at $x_1$ and $x_1+w$ is exactly $H^kw$. The powers converge precisely when every eigenvalue of $H$ other than $1$ lies strictly inside the unit disc and the eigenvalue $1$, if present, is semi-simple; the latter is captured by the rank condition $\operatorname{rank}(I-H)=\operatorname{rank}\big((I-H)^2\big)$. Conversely, if the system is consistent, say $b=Ax^*$, then $M^{-1}b=(I-H)x^*$ and the sum above telescopes, so that $x_{k+1}=x^*+H^k(x_1-x^*)$; convergence of $(H^k)$ then implies convergence of $(x_k)$. Even so, the limit depends on where we started, i.e., on $x_1$.
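To make this concrete, here is a small self-contained sketch (NumPy; the singular test matrix and the Gauss-Seidel choice of $M$ are just one possible illustration) that checks the rank condition and shows the limit moving with the starting point:

```python
import numpy as np

rng = np.random.default_rng(1)

# A singular but consistent system A x = b (b is chosen in R(A)).
B = rng.standard_normal((3, 5))
A = B.T @ B                          # symmetric PSD, rank 3 => singular
b = A @ rng.standard_normal(5)       # consistent right-hand side

# Splitting A = M - N with M nonsingular; here the Gauss-Seidel choice,
# M = lower triangle of A (diagonal included).
M = np.tril(A)
N = M - A
H = np.linalg.solve(M, N)            # "action" matrix H = M^{-1} N

# Semi-simplicity of the eigenvalue 1 via the rank test:
I = np.eye(5)
print(np.linalg.matrix_rank(I - H) ==
      np.linalg.matrix_rank((I - H) @ (I - H)))   # True

# Two starting points, two limits; both are solutions of A x = b.
c = np.linalg.solve(M, b)            # M^{-1} b
def iterate(x, k=2000):
    for _ in range(k):
        x = H @ x + c
    return x

u = iterate(np.zeros(5))
v = iterate(rng.standard_normal(5))
print(np.allclose(A @ u, b), np.allclose(A @ v, b))   # True True
print(np.allclose(u, v))             # False: the limit depends on x_1
```

The last two printed lines are the point of the remark: both limits solve $Ax=b$, yet they differ by an element of $\ker A$ determined by the starting vectors.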