Gradient Type Methods for Large Scale Optimization: Monotone Diagonal Updating Schemes for Unconstrained Optimization - Leong Wah June - Books - LAP LAMBERT Academic Publishing - 9783844319682 - May 5, 2011



The focus of this book is on finding the unconstrained minimizer of a function. Specifically, we focus on the Barzilai and Borwein (BB) method, a well-known two-point stepsize gradient method. Owing to its simplicity, low storage requirements, and numerical efficiency, the BB method has received a good deal of attention in the optimization community. Despite these advances, the BB stepsize is computed from a crude approximation of the Hessian as a scalar multiple of the identity; moreover, the BB method is non-monotone and is not easy to generalize to general nonlinear functions. To address these deficiencies, we introduce new gradient-type methods within the BB framework, including a new gradient method based on the weak secant equation, an improved Hessian approximation, and a scaled diagonal update. Our proposed methods approximate the Hessian by a diagonal matrix. Combined with monotone strategies, the resulting algorithms belong to the class of monotone gradient methods with global convergence. Numerical results suggest that, for non-quadratic minimization problems, the new methods clearly outperform the BB method.
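To make the two ingredients mentioned above concrete, here is a minimal sketch in Python: the classical BB two-point stepsize (the ratio s^T s / s^T y built from successive iterate and gradient differences), and a weak-secant diagonal Hessian update (the diagonal matrix closest to the current one in the Frobenius norm subject to s^T B s = s^T y). This is an illustrative reconstruction from the standard literature, not the book's exact algorithms; the function names, tolerances, and the bootstrap stepsize are assumptions.

```python
import numpy as np

def bb_gradient(grad, x0, iters=50, tol=1e-10):
    """Minimize a smooth function with the Barzilai-Borwein (BB) two-point
    stepsize gradient method. `grad` returns the gradient at a point.
    The first iteration uses a small fixed stepsize to bootstrap the scheme."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1e-3                      # assumed bootstrap stepsize
    for _ in range(iters):
        x_new = x - alpha * g
        g_new = grad(x_new)
        if np.linalg.norm(g_new) < tol:
            return x_new
        s = x_new - x                 # s_{k-1} = x_k - x_{k-1}
        y = g_new - g                 # y_{k-1} = g_k - g_{k-1}
        sy = s @ y
        # BB stepsize: alpha_k = s^T s / s^T y, i.e. the Hessian is
        # approximated by the scalar matrix (1/alpha_k) * I.
        alpha = (s @ s) / sy if sy > 1e-12 else 1e-3
        x, g = x_new, g_new
    return x

def diagonal_update(d, s, y):
    """Weak-secant diagonal update (sketch): given the current diagonal
    Hessian approximation diag(d), return the diagonal closest to it in
    the Frobenius norm satisfying the weak secant equation s^T B s = s^T y.
    The closed form is d_i + lam * s_i^2 with lam = (s^T y - s^T diag(d) s) / sum(s_i^4)."""
    denom = np.sum(s ** 4)
    if denom < 1e-12:
        return d                      # s too small to carry curvature information
    lam = (s @ y - (s ** 2) @ d) / denom
    return d + lam * s ** 2

# Example: minimize f(x) = 0.5 x^T A x - b^T x for a symmetric positive
# definite A; the minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = bb_gradient(lambda x: A @ x - b, np.zeros(2))
```

On strictly convex quadratics the BB stepsize is an inverse Rayleigh quotient of the Hessian and the iteration converges despite being non-monotone; the diagonal update retains per-coordinate curvature information that a scalar multiple of the identity discards, which is the motivation for the diagonal schemes studied in the book.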

Media: Paperback book (soft cover, glued binding)
Released: May 5, 2011
ISBN-13: 9783844319682
Publisher: LAP LAMBERT Academic Publishing
Pages: 104
Dimensions: 150 × 6 × 226 mm · 173 g
Language: German