Solving "Lyapunov-like" equation AX+X'B=C
After some mathematics I found a pair of methods that can utilize LyapunovSolve.
g = a + b\[Transpose];
ig = Inverse[g];
Print["Cond Num = ", Norm[g] Norm[ig]];
h = (c + c\[Transpose])/2;
u = LyapunovSolve[a.ig, -ig\[Transpose].b,
c - a.ig.h - h.ig\[Transpose].b];
u = (u - u\[Transpose])/2; (* re-antisymmetrize u, the secret ingredient *)
x = ig.(h + u);
Norm[a.x + Transpose[x].b - c] (* verify *)
Or:
d = a - b\[Transpose];
id = Inverse[d];
Print["Cond Num = ", Norm[d] Norm[id]];
q = (c - c\[Transpose])/2;
v = LyapunovSolve[a.id, id\[Transpose].b,
c - a.id.q + q.id\[Transpose].b];
v = (v + v\[Transpose])/2; (* re-symmetrize v, the secret ingredient *)
x = id.(q + v);
Norm[a.x + Transpose[x].b - c] (* verify *)
Just pick whichever one has the lower condition number.
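For convenience, the two scripts above could be wrapped into a single solver that computes both condition numbers and uses the better-conditioned reduction. A minimal sketch (the name solveTSylvester and the wrapper structure are mine, not part of the original code; it assumes numerical matrices):
(* Sketch: solve a.x + Transpose[x].b == c via whichever reduction,
   g = a + Transpose[b] or d = a - Transpose[b], has the smaller condition number;
   the two branches are the two scripts above *)
solveTSylvester[a_, b_, c_] := Module[{g, d, ig, id, h, q, u, v},
  g = a + Transpose[b]; d = a - Transpose[b];
  ig = Inverse[g]; id = Inverse[d];
  If[Norm[g] Norm[ig] <= Norm[d] Norm[id],
   h = (c + Transpose[c])/2;
   u = LyapunovSolve[a.ig, -Transpose[ig].b, c - a.ig.h - h.Transpose[ig].b];
   u = (u - Transpose[u])/2;
   ig.(h + u),
   q = (c - Transpose[c])/2;
   v = LyapunovSolve[a.id, Transpose[id].b, c - a.id.q + q.Transpose[id].b];
   v = (v + Transpose[v])/2;
   id.(q + v)]]

x = solveTSylvester[a, b, c];
Norm[a.x + Transpose[x].b - c] (* verify *)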
The mathematics behind it
From $$ a x + x^T b = c $$ and its transpose, $b^T x + x^T a^T = c^T$, adding the two gives $$ (a+b^T) x + x^T (b+a^T) = c + c^T. $$ Rewrite this as ($g = a + b^T$, $2h = c + c^T$) $$ g x + (g x)^T = 2h. $$
Define $y$ and $u$ by $$ y = g x = h + u, $$ where $h = h^T$ and $u = -u^T$. The $g$ equation above already fixes the symmetric part of $y$ to be $h = (c + c^T)/2$, so only the antisymmetric part $u$ is unknown. Substituting $x = g^{-1}(h + u)$ into the original equation gives an equation for $u$ alone (the symmetry of $h$ and antisymmetry of $u$ are what eliminate the "transpose"): $$ a g^{-1} u - u (g^{-1})^T b = c - a g^{-1} h - h (g^{-1})^T b. $$ This is an ordinary Sylvester equation, so LyapunovSolve applies; once $u$ is known, $x = g^{-1}(h + u)$.
The other code follows the same steps with the symmetric and antisymmetric parts swapped; explicitly:
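With $d = a - b^T$ and $2q = c - c^T$, subtracting the transposed equation from the original gives $$ d x - (d x)^T = 2q, $$ so $d x = q + v$ with $q = -q^T$ and $v = v^T$. Substituting $x = d^{-1}(q + v)$ into the original equation yields $$ a d^{-1} v + v (d^{-1})^T b = c - a d^{-1} q + q (d^{-1})^T b, $$ which is exactly the equation solved in the second script.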
Edit: Added error statistics.
The error (Norm[a.x + Transpose[x].b - c]) for random matrices $a,b,c$ of different sizes is shown in the plot: the blue line uses the algorithm here, the red line uses the method in yarchik's answer. Somehow the method here is more accurate.
(Old answer that does not target the question)
Try the built-in function LyapunovSolve.
e.g.
n = 1000;
a = RandomReal[{-3, 3}, {n, n}];
b = RandomReal[{-3, 3}, {n, n}];
c = RandomReal[{-3, 3}, {n, n}];
Timing[x = LyapunovSolve[a, b, c];]
(* Out: {10.964, Null} *)
Norm[a.x + x.b - c]
(* Out: 4.98744*10^-8 *)
For well-studied mathematical problems, always look for a built-in function first.
General matrices
For the desired matrix sizes I doubt that a numerical solution is feasible. Here is simplified code that vectorizes the equation with sparse matrices and solves the resulting linear system.
tmSylvester[n_]:=Module[{a,b,c,sA,sB,sC,sAB},
a=RandomReal[{-3,3},{n,n}];
b=RandomReal[{-3,3},{n,n}];
c=RandomReal[{-3,3},{n,n}];
sA=SparseArray[Table[{(i-1)n+l,(k-1)n+l}->a[[i,k]],{i,n},{k,n},{l,n}]//Flatten];
sB=SparseArray[Table[{(l-1)n+j,(k-1)n+l}->b[[k,j]],{k,n},{j,n},{l,n}]//Flatten];
sAB=sA+sB;
sC=SparseArray[Table[{(i-1)n+j}->c[[i,j]],{i,n},{j,n}]//Flatten];
First[Timing[LinearSolve[sAB,sC];]]]
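The function above only returns the timing. To recover the solution itself, the solution vector of the linear system can be reshaped back into an $n\times n$ matrix using the same index convention (vec position $(k-1)n+l$ corresponds to $x_{kl}$). A minimal sketch along the same lines, with my own variable names:
Module[{n = 50, a, b, c, sA, sB, sC, x},
 a = RandomReal[{-3, 3}, {n, n}];
 b = RandomReal[{-3, 3}, {n, n}];
 c = RandomReal[{-3, 3}, {n, n}];
 (* same sparse operators as in tmSylvester *)
 sA = SparseArray[Table[{(i - 1) n + l, (k - 1) n + l} -> a[[i, k]], {i, n}, {k, n}, {l, n}] // Flatten];
 sB = SparseArray[Table[{(l - 1) n + j, (k - 1) n + l} -> b[[k, j]], {k, n}, {j, n}, {l, n}] // Flatten];
 sC = SparseArray[Table[{(i - 1) n + j} -> c[[i, j]], {i, n}, {j, n}] // Flatten];
 x = Partition[Normal[LinearSolve[sA + sB, sC]], n]; (* position (k-1)n+l -> x[[k,l]] *)
 Norm[a.x + Transpose[x].b - c]] (* residual; should be small *)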
Now, let us plot the timing
ListLogPlot[Table[{n,tmSylvester[n]},{n,10,120,10}],Joined->True,PlotTheme->{"Frame","Monochrome"}, FrameLabel->{"Matrix Size","Time(s)"}]
Even with a very optimistic extrapolation it is unlikely that the n=1000 calculation would be routinely possible. There are, however, experts here who might be able to tune the linear solver further.
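To make the extrapolation concrete, one could fit the measured timings to a power law in log-log coordinates and evaluate the fit at n=1000. A rough sketch (the fitting approach and variable names are mine):
timings = Table[{n, tmSylvester[n]}, {n, 10, 120, 10}];
(* fit Log[t] ~ c0 + c1 Log[n] on the strictly positive timings *)
logfit = Fit[Log /@ Select[timings, Last[#] > 0 &], {1, s}, s];
Exp[logfit /. s -> Log[1000.]] (* crude estimate of the n = 1000 solve time, in seconds *)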
Nonsingular matrices
According to F. M. Dopico, J. González, D. Kressner, and V. Simoncini, Projection methods for large-scale T-Sylvester equations, Mathematics of Computation (2015), under the usual existence conditions the following two equations have the same unique solution:
$$B^{-T} A X - X A^{-T} B = B^{-T} C - B^{-T} C^{T} A^{-T} B;$$ $$AX + X^T B = C, $$ where $A^{-T}\equiv(A^{-1})^T$.
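To see the equivalence: transposing $AX + X^T B = C$ gives $X^T = (C^T - B^T X)A^{-T}$; left-multiplying the original equation by $B^{-T}$ and substituting this expression for $X^T$ yields $$ B^{-T} A X + B^{-T} C^T A^{-T} B - X A^{-T} B = B^{-T} C, $$ which rearranges to the first equation above.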
Therefore, we can use the Lyapunov solver
tmDopico[n_]:=Module[{a,b,c},
a=RandomReal[{-3,3},{n,n}];
b=RandomReal[{-3,3},{n,n}];
c=RandomReal[{-3,3},{n,n}];
First[Timing[
  LyapunovSolve[Transpose[Inverse[b]].a, -Transpose[Inverse[a]].b,
   Transpose[Inverse[b]].c - Transpose[Inverse[b]].Transpose[c].Transpose[Inverse[a]].b];]]]
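As a sanity check (not part of the original timing code), one can return the solution instead of the timing and verify that it satisfies the original equation $AX + X^T B = C$; a minimal sketch:
Module[{n = 200, a, b, c, ia, ib, x},
 a = RandomReal[{-3, 3}, {n, n}];
 b = RandomReal[{-3, 3}, {n, n}];
 c = RandomReal[{-3, 3}, {n, n}];
 ia = Inverse[a]; ib = Inverse[b]; (* compute each inverse once *)
 x = LyapunovSolve[Transpose[ib].a, -Transpose[ia].b,
   Transpose[ib].c - Transpose[ib].Transpose[c].Transpose[ia].b];
 Norm[a.x + Transpose[x].b - c]] (* residual of the original equation; should be small *)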
Let us check the timing:
ListLogPlot[Table[{n,tmDopico[n]},{n,50,1000,50}],Joined->True,PlotTheme->{"Frame","Monochrome"}, FrameLabel->{"Matrix size","Time(s)"}]
The method should therefore have $\mathcal{O}(n^3)$ scaling under favorable conditions.