This up-to-date book focuses on algorithms for large-scale unconstrained and bound-constrained optimization. Optimization techniques are presented from the perspective of conjugate gradient algorithms.
A large part of the book is devoted to preconditioned conjugate gradient algorithms. In particular, memoryless and limited-memory quasi-Newton algorithms are presented and numerically compared to standard conjugate gradient algorithms.
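For orientation, the following is a minimal illustrative sketch of a nonlinear conjugate gradient iteration with the Fletcher-Reeves update and a simple backtracking line search. It is not taken from the book; the function names, the safeguarding rule, and the quadratic test problem are assumptions chosen only to make the sketch self-contained and runnable.

```python
import numpy as np

def fletcher_reeves_cg(f, grad, x0, tol=1e-6, max_iter=2000):
    """Bare-bones nonlinear conjugate gradient method (Fletcher-Reeves update)
    with a backtracking (Armijo) line search. Illustrative sketch only."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # start with the steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                      # safeguard: restart if d is not a descent direction
            d = -g
        # Backtracking line search along d until the Armijo condition holds
        alpha, fx, slope = 1.0, f(x), g @ d
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)    # Fletcher-Reeves coefficient
        d = -g_new + beta * d               # new search direction
        x, g = x_new, g_new
    return x

# Example (assumed test problem): minimize the strictly convex quadratic 0.5*x'Ax - b'x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
print(fletcher_reeves_cg(f, grad, np.zeros(2)))   # approaches np.linalg.solve(A, b) = [0.2, 0.4]
```

Preconditioned and quasi-Newton variants of the kind compared in the book replace the simple Fletcher-Reeves direction update with directions built from curvature information gathered along the iterations.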
Special attention is paid to the methods of shortest residuals developed by the author. Several effective optimization techniques based on these methods are presented.
Because of its emphasis on practical methods, as well as its rigorous mathematical treatment of their convergence analysis, the book is aimed at a wide audience. It can be used by researchers in optimization and by graduate students in operations research, engineering, mathematics, and computer science. Practitioners can benefit from the numerous numerical comparisons of professional optimization codes discussed in the book.