Scaling of the objective function in optimization problems.

When you study an optimization problem theoretically, scaling does not matter. However, one sometimes adds a scaling factor so that the derivative becomes a little simpler, e.g., minimizing $\frac12 x^2$ instead of $x^2$.
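For instance, differentiating the scaled one-dimensional quadratic shows why the factor is convenient:
$$\frac{d}{dx}\left(\tfrac12 x^2\right) = x, \qquad \frac{d}{dx}\left(x^2\right) = 2x,$$
so the factor $\tfrac12$ cancels the $2$ produced by differentiation, while the minimizer $x = 0$ is unchanged.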

However, scaling does matter for the numerical solution. If your optimization method is not scaling invariant, then you get a different sequence of iterates (not only due to rounding errors). Newton's method, for example, is affine invariant, i.e., if you apply an affine transformation to your optimization problem, you get (up to the transformation) the same sequence of iterates.
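Here is a minimal sketch of this effect, assuming fixed-step gradient descent as the non-invariant method and minimizing $f_c(x) = \frac{c}{2} x^2$ for two scalings $c$ (the function names `gradient_descent` and `newton` are illustrative, not from any particular library):

```python
# Sketch: fixed-step gradient descent is not scaling invariant,
# while Newton's method is. We minimize f_c(x) = (c/2) * x^2 for
# two values of c; the minimizer is x = 0 in both cases.

def gradient_descent(grad, x0, step=0.1, n_iters=5):
    """Fixed-step gradient descent; the iterates depend on the scaling c."""
    x = x0
    iterates = [x]
    for _ in range(n_iters):
        x = x - step * grad(x)
        iterates.append(x)
    return iterates

def newton(grad, hess, x0, n_iters=5):
    """Newton's method; the scaling c cancels in grad(x) / hess(x)."""
    x = x0
    iterates = [x]
    for _ in range(n_iters):
        x = x - grad(x) / hess(x)
        iterates.append(x)
    return iterates

for c in (1.0, 100.0):
    grad = lambda x, c=c: c * x   # derivative of (c/2) * x^2
    hess = lambda x, c=c: c       # second derivative
    print(f"c = {c}:")
    print("  gradient descent:", gradient_descent(grad, x0=1.0))
    print("  Newton:          ", newton(grad, hess, x0=1.0))
```

With $c = 1$ the gradient steps shrink $x$ by a factor $0.9$ per iteration, while with $c = 100$ the same fixed step makes gradient descent diverge. Newton's method produces the identical iterate sequence for both scalings (reaching the minimizer in one step, since the objective is quadratic), because $c$ cancels in $f'(x)/f''(x)$.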

Tags:

Optimization