Reduced Storage, Quasi-Newton Trust Region Approaches to Function Optimization

29 November 1999


In this paper, we consider several algorithms for reducing storage when a quasi-Newton method is used in a dogleg trust region setting to minimize functions of many variables. Secant methods require O(n^2) storage for an approximate Hessian and O(n^2) operations per iteration when minimizing a function of n variables; this storage requirement becomes prohibitive when n is large. Our algorithms use a BFGS update and require kn storage and 4kn + O(k^2) operations per iteration, but may require more iterations than the standard trust region techniques. Typically k is between 10 and 100. Our dogleg trust region strategies involve expressions containing matrix products with both the inverse of this Hessian and with the Hessian itself. Our techniques for updating expressions for the Hessian and its inverse can also be used to improve the performance of line search, limited-memory algorithms.
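To illustrate how a limited-memory BFGS method achieves kn storage and O(kn) work per iteration, the sketch below implements the standard two-loop recursion, which computes an inverse-Hessian-times-gradient product directly from the last k correction pairs (s_i, y_i) without ever forming an n-by-n matrix. This is a generic sketch of the limited-memory idea, not the paper's specific algorithm; the function name and interface are assumptions, and the paper's dogleg strategy additionally needs products with the Hessian itself, which are not shown here.

```python
import numpy as np

def lbfgs_two_loop(grad, s_list, y_list, gamma):
    """Approximate inverse-Hessian-vector product H*grad from the last
    k BFGS correction pairs (s_i, y_i), oldest first in the lists.

    Storage is k pairs of n-vectors (kn); cost is O(kn) per call.
    gamma scales the initial approximation H0 = gamma * I.
    (Illustrative sketch; names and interface are assumptions.)
    """
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: walk the pairs from newest to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * np.dot(s, q)
        alphas.append(a)
        q -= a * y
    r = gamma * q  # apply the initial approximation H0 = gamma * I
    # Second loop: walk the pairs from oldest back to newest.
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * np.dot(y, r)
        r += (a - b) * s
    return r
```

A quick sanity check: with a single stored pair, the resulting operator satisfies the secant equation, so applying it to y returns s regardless of the choice of gamma.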