Abstract

In this paper we discuss several recent conjugate-gradient type methods for solving large-scale nonlinear optimization problems. We demonstrate how the performance of these methods can be significantly improved by careful implementation. We suggest a method based on iterative preconditioning that performs reasonably efficiently on a wide variety of significant test problems. Our results indicate that nonlinear conjugate-gradient methods behave similarly to conjugate-gradient methods for the solution of systems of linear equations: they work best on problems whose Hessian matrices have sets of clustered eigenvalues. On more general problems, however, even the best method may require a prohibitively large number of iterations. We present numerical evidence indicating that using theoretical analysis to predict the performance of algorithms on general problems is not straightforward.
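To make the abstract's setting concrete, here is a minimal sketch of a nonlinear conjugate-gradient iteration of the kind discussed (Fletcher–Reeves update with a backtracking line search). This is an illustration assuming NumPy, not the authors' implementation; the function names and the restart safeguard are choices made here for clarity.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, max_iter=200, tol=1e-8):
    """Fletcher-Reeves nonlinear conjugate gradient (illustrative sketch)."""
    x = x0.astype(float)
    g = grad(x)
    d = -g  # initial search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:  # safeguard: restart if d is not a descent direction
            d = -g
        # Backtracking (Armijo) line search
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)  # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

On a quadratic whose Hessian has only a few clustered eigenvalues, this iteration converges quickly, consistent with the abstract's observation about eigenvalue clustering.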

Keywords

Conjugate gradient method, Nonlinear conjugate gradient method, Scale (ratio), Nonlinear system, Computer science, Mathematical optimization, Mathematics, Algorithm, Artificial intelligence, Physics, Geography, Gradient descent, Artificial neural network, Cartography

Related Publications

Preconditioning of Truncated-Newton Methods

In this paper we discuss the use of truncated-Newton methods, a flexible class of iterative methods, in the solution of large-scale unconstrained minimization problems. At each ...

1985 SIAM Journal on Scientific and Statis... 175 citations

Publication Info

Year
1979
Type
report
Citations
69
Access
Closed

Citation Metrics

OpenAlex
69
Influential
8
CrossRef
40

Cite This

Philip E. Gill, Walter Murray (1979). Conjugate-Gradient Methods for Large-Scale Nonlinear Optimization. https://doi.org/10.21236/ada078713

Identifiers

DOI
10.21236/ada078713
