Is it possible to use the Levenberg-Marquardt algorithm to fit a black-box residual function?
Here is the solution based on Faysal's code:
steps = 0;
FindMinimum[Null,
 {optimVariables, initialGuess}\[Transpose],
 Method -> {"LevenbergMarquardt",
   "Residual" -> Sqrt[2] residualVector[optimVariables],
   "Jacobian" -> {Sqrt[2] jacobianMatrix[optimVariables],
     EvaluationMonitor :> ++steps},
   "StepControl" -> {"TrustRegion",
     "StartingScaledStepSize" -> 1/1000,
     "MaxScaledStepSize" -> 1/10,
     "AcceptableStepRatio" -> 1/3}}]
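The call above assumes that residualVector, jacobianMatrix, optimVariables, and initialGuess are already defined. Here is a minimal sketch with a made-up residual (the function, variable names, and starting point are illustrative, not part of the original question):

optimVariables = {xx, yy};
initialGuess = {3., 3.};
(* hypothetical residual, vanishing at xx == 1, yy == 2 *)
residualVector[{u_, v_}] := {u - 1, 2 (v - 2)};
(* symbolic Jacobian, precomputed once via Set (=) *)
jacobianMatrix[_] = D[residualVector[optimVariables], {optimVariables}];

With these definitions the FindMinimum call above should converge to values near {1, 2}, where the residual vanishes.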
Note that it is recommended to use exact numbers for the parameters of the "TrustRegion" method, because these parameters are used inside the algorithm without any check for consistency with WorkingPrecision. Note also that the actual residual vector and Jacobian must be multiplied by Sqrt[2] so that FindMinimum returns a minimum equal to
residualVector[optimVariables].residualVector[optimVariables]
and not to
residualVector[optimVariables].residualVector[optimVariables]/2
as it does by default.
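To see the factor of 2 concretely, here is a sketch with a hypothetical residual that cannot be driven to zero (the constant second component remains at the optimum):

(* residual {x - 1, 2}: at the optimum x == 1 it equals {0, 2}, so r.r == 4 *)
FindMinimum[Null, {{x, 3.}},
 Method -> {"LevenbergMarquardt", "Residual" -> {x - 1, 2}}]

Without the Sqrt[2] rescaling the reported minimum should be r.r/2 == 2 rather than 4.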
The Jacobian can be computed automatically with the following code:
jacobianMatrix[_] = D[residualVector[optimVariables], {optimVariables}]
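For instance, with a made-up residual (all names here are placeholders):

optimVariables = {xx, yy};
residualVector[{u_, v_}] := {u - 1, 2 (v - 2)^2};
(* the symbolic Jacobian is computed once, at definition time *)
jacobianMatrix[_] = D[residualVector[optimVariables], {optimVariables}]
(* {{1, 0}, {0, 4 (-2 + yy)}} *)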
One can restrict the Jacobian so that it is evaluated only for numeric arguments by defining it as:
jacobianMatrix[_List?(VectorQ[#, NumberQ] &)] =
D[residualVector[optimVariables], {optimVariables}]
I had a similar situation once. Here is a prototype, close to your problem, that I once posted on the Mathematica forum on LinkedIn; you can adapt it to your needs. The gradient is obtained through the "Residual" entry of the Method option.
targets = {1, 2};
f[{x_?NumericQ, y_?NumericQ}] := {(x - 3)^5, 2 (y - 5)^3};
objectiveFunction[{params___?NumericQ}] := Norm[f[{params}] - targets, 2];
R[{params___?NumericQ}] := f[{params}] - targets;
OptimVariables = {Symbol["xx"], Symbol["yy"]};
initialGuess = {1, -1};
FindMinimum[
 objectiveFunction[OptimVariables],
 {OptimVariables, initialGuess}\[Transpose],
 Method -> {"LevenbergMarquardt", "Residual" -> R[OptimVariables]}]
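For this prototype the result can be checked against the closed-form solution, since the residual vanishes exactly where f equals targets:

(* real roots of (x - 3)^5 == 1 and 2 (y - 5)^3 == 2 *)
Solve[{(x - 3)^5 == 1, 2 (y - 5)^3 == 2}, {x, y}, Reals]
(* {{x -> 4, y -> 6}} *)

so FindMinimum should return xx and yy close to 4 and 6.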