Correcting a correlation matrix to be positive semidefinite

Update: the original method below keeps iterating until all eigenvalues are at or above the threshold; it never checks whether the current matrix is already positive definite and stops. The following version tests for positive definiteness with PositiveDefiniteMatrixQ at every iteration:

ClearAll[covBending2];
covBending2[mat_, tol_: 1/10000] := If[PositiveDefiniteMatrixQ[mat], mat,
   NestWhile[
    With[{es = Eigensystem[#]},
      (* clip the eigenvalues at tol, then rebuild with the original eigenvectors *)
      Transpose[es[[2]]].DiagonalMatrix[Max[#, tol] & /@ es[[1]]].es[[2]]] &,
    N@mat, ! PositiveDefiniteMatrixQ[#] &]]
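
A quick illustrative call (using the same 3 × 3 indefinite matrix that appears as badMat further down this page):

test = {{1, 0.05, 0.9}, {0.05, 1, 0.8}, {0.9, 0.8, 1}};  (* smallest eigenvalue is negative *)
PositiveDefiniteMatrixQ[test]                 (* False *)
PositiveDefiniteMatrixQ[covBending2[test]]    (* should be True after bending *)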

Original answer:

Here is a simple (unweighted) Mma version of the MATLAB implementation of covariance bending.

ClearAll[covBending];
covBending[mat_, tol_: 1/10000] := If[PositiveDefiniteMatrixQ[mat], mat,
   NestWhile[
    With[{es = Eigensystem[#]},
      Transpose[es[[2]]].DiagonalMatrix[Max[#, tol] & /@ es[[1]]].es[[2]]] &,
    N@mat, Min[Eigenvalues[#]] < tol &]]

Examples:

 matrices = ((1/2) (# + Transpose[#]) & /@  RandomReal[{0, 1}, {10, 4, 4}]);
{MatrixForm[#], PositiveDefiniteMatrixQ[#], MatrixForm[covBending[#]],
 PositiveDefiniteMatrixQ[covBending[#]]} & /@  matrices // TableForm

(Output: a table showing each random matrix, whether it is positive definite, the bent matrix, and confirmation that the bent matrix is positive definite.)
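
As a rough sanity check, one can also look at how much bending moves each matrix: the change is exactly zero for the matrices that were already positive definite, and it grows with how negative the clipped eigenvalues were:

Max[Abs[covBending[#] - #]] & /@ matrices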


Generally, the reason matrices that are supposed to be positive semi-definite turn out not to be is that working in a finite-precision world often introduces a wee bit of perturbation in the smallest eigenvalues, making them slightly negative or complex. These errors are generally of the order of machine precision, but that is enough to return False if you use standard algorithms to check for positive semi-definiteness. This can be easily fixed with Chop. For example, something like the following:

psdMat = #1.Chop[#2].ConjugateTranspose[#3] & @@ SingularValueDecomposition[mat];
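
To see the idea on a small made-up example (the vector v and the noise level are arbitrary choices for this sketch): a rank-one PSD matrix perturbed at machine-precision level typically picks up a slightly negative eigenvalue, and the SVD/Chop reconstruction removes it:

v = {1., 2., 3.};
mat = Outer[Times, v, v] + 1.*^-16 RandomReal[{-1, 1}, {3, 3}];
mat = (mat + Transpose[mat])/2;    (* symmetric, rank one up to machine noise *)
Min[Eigenvalues[mat]]              (* of order 10^-15, often slightly negative *)
psdMat = #1.Chop[#2].ConjugateTranspose[#3] & @@ SingularValueDecomposition[mat];
Chop[Eigenvalues[psdMat]]          (* ≈ {14., 0, 0}: the noise modes are gone *)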

Now, if this isn't sufficient to make your matrix positive semi-definite, you should go back and take a closer look at your problem to see whether there are other reasons not to expect it to be positive semi-definite.


I like the approach in this paper by Rebonato and Jäckel: The most general methodology to create a valid correlation matrix for risk management and option pricing purposes. Below is a starting point for an implementation.

MakeGoodCorrelationMatrix[badCorrMat_, flag_] :=
 Which[
  (* flag == 1: hypersphere (angle) parameterization; any choice of angles gives a
     matrix B with unit-norm rows, so B.Transpose[B] is a valid correlation matrix.
     The angles are then fitted to the target matrix by least squares. *)
  flag == 1,
   Module[{LocalN = Length[badCorrMat], LocalTheta, LocalB, LocalOutput, LocalSolution},
    LocalTheta = Outer[Unique[] &, Range[LocalN], Range[LocalN - 1], 1, 1];
    LocalB = Outer[
       If[#2 < LocalN,
         Cos[LocalTheta[[#1, #2]]] (Times @@ Sin[LocalTheta[[#1, 1 ;; #2 - 1]]]),
         Times @@ Sin[LocalTheta[[#1, 1 ;; #2 - 1]]]] &,
       Range[LocalN], Range[LocalN], 1, 1] // Chop;
    LocalOutput = Dot[LocalB, Transpose[LocalB]] // Chop;
    LocalSolution = Minimize[Total[Flatten[(LocalOutput - badCorrMat)^2]], Flatten[LocalTheta]] // Chop;
    LocalOutput /. LocalSolution[[2]]],
  (* otherwise: spectral method; floor the eigenvalues and rescale the rows of B
     so that the diagonal of the output is restored to 1 *)
  True,
   Module[{LocalLambdap = Max[0.000001, #] & /@ Eigenvalues[badCorrMat],
     LocalS = Eigenvectors[badCorrMat] // Transpose, LocalT, LocalB, LocalOutput},
    LocalT = DiagonalMatrix[Dot[#^2, LocalLambdap]^(-1) & /@ LocalS];
    LocalB = Dot[Sqrt[LocalT], LocalS, Sqrt[DiagonalMatrix[LocalLambdap]]];
    LocalOutput = Dot[LocalB, Transpose[LocalB]];
    (* symmetrize to remove any numerical asymmetry *)
    LocalOutput = 1/2 (LocalOutput + Transpose[LocalOutput])]
  ]

Example:

badMat = {{1, 0.05, 0.9}, {0.05, 1, 0.8}, {0.9, 0.8, 1}};
Eigenvalues[badMat]
(* {2.22926, 0.950346, -0.179603} *)

goodMat1 = MakeGoodCorrelationMatrix[badMat, 1];
Eigenvalues[goodMat1]
(* {2.11798, 0.882022, 8.32667*10^-17} *)

goodMat2 = MakeGoodCorrelationMatrix[badMat, 2];
Eigenvalues[goodMat2]
(* {2.08911, 0.910888, 9.35756*10^-7} *)
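
A quick sanity check is to confirm that both outputs are still correlation matrices, i.e. have unit diagonals, and to compare how far each one moved from badMat; diagOK and the 10^-6 tolerance are just ad-hoc choices for this sketch:

diagOK = Max[Abs[Diagonal[#] - 1]] < 10^-6 &;   (* ad-hoc unit-diagonal test *)
{diagOK[goodMat1], diagOK[goodMat2]}            (* both should be True *)
{Norm[goodMat1 - badMat, "Frobenius"], Norm[goodMat2 - badMat, "Frobenius"]}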