Algorithms for Minimization Without Derivatives

by Richard P. Brent

Paperback

$14.95 

Overview


This outstanding text for graduate students and researchers proposes improvements to existing algorithms, extends their related mathematical theories, and offers details on new algorithms for approximating local and global minima. None of the algorithms requires an evaluation of derivatives; all depend entirely on sequential function evaluation, a practical necessity when derivatives are difficult or impossible to evaluate.
Topics include the use of successive interpolation for finding simple zeros of a function and its derivatives; an algorithm with guaranteed convergence for finding a minimum of a function of one variable; global minimization given an upper bound on the second derivative; and a new algorithm for minimizing a function of several variables without calculating derivatives. Many numerical examples augment the text, along with a complete analysis of the rate of convergence for most algorithms and error bounds that allow for the effect of rounding errors.
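To illustrate the flavor of derivative-free minimization that the book treats rigorously, here is a minimal golden-section search sketch in Python. This is not Brent's algorithm itself (which combines golden-section steps with parabolic interpolation for faster convergence); it is only a simple example of minimizing a unimodal function of one variable using sequential function evaluations alone. The function and interval are illustrative choices, not taken from the book.

```python
import math

def golden_section_min(f, a, b, tol=1e-8):
    """Minimize a unimodal function f on [a, b] by golden-section search.
    Uses only function evaluations -- no derivatives required."""
    invphi = (math.sqrt(5) - 1) / 2  # 1/phi, about 0.618
    # Two interior probe points that divide [a, b] in the golden ratio.
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    fc, fd = f(c), f(d)
    while b - a > tol:
        if fc < fd:
            # Minimum lies in [a, d]; reuse the evaluation at c.
            b, d, fd = d, c, fc
            c = b - invphi * (b - a)
            fc = f(c)
        else:
            # Minimum lies in [c, b]; reuse the evaluation at d.
            a, c, fc = c, d, fd
            d = a + invphi * (b - a)
            fd = f(d)
    return (a + b) / 2

# Example: (x - 2)^2 + 1 is unimodal on [0, 5] with minimum at x = 2.
x_min = golden_section_min(lambda x: (x - 2) ** 2 + 1, 0.0, 5.0)
```

Each iteration shrinks the bracketing interval by a fixed factor of about 0.618 at the cost of one new function evaluation, which is the linear-convergence baseline that Brent's hybrid method improves on for smooth functions.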

Product Details

ISBN-13: 9780486419985
Publisher: Dover Publications
Publication date: 04/17/2013
Series: Dover Books on Mathematics Series
Pages: 206
Product dimensions: 5.30(w) x 8.40(h) x 0.60(d)

Table of Contents

PREFACE TO DOVER EDITION
PREFACE
1 INTRODUCTION AND SUMMARY
1.1 Introduction
1.2 Summary
2 SOME USEFUL RESULTS ON TAYLOR SERIES, DIVIDED DIFFERENCES, AND LAGRANGE INTERPOLATION
2.1 Introduction
2.2 Notation and definitions
2.3 Truncated Taylor series
2.4 Lagrange interpolation
2.5 Divided differences
2.6 Differentiating the error
3 THE USE OF SUCCESSIVE INTERPOLATION FOR FINDING SIMPLE ZEROS OF A FUNCTION AND ITS DERIVATIVES
3.1 Introduction
3.2 The definition of order
3.3 Convergence to a zero
3.4 Superlinear convergence
3.5 Strict superlinear convergence
3.6 The exact order of convergence
3.7 Stronger results for q = 1 and 2
3.8 Accelerating convergence
3.9 Some numerical examples
3.10 Summary
4 AN ALGORITHM WITH GUARANTEED CONVERGENCE FOR FINDING A ZERO OF A FUNCTION
4.1 Introduction
4.2 The algorithm
4.3 Convergence properties
4.4 Practical tests
4.5 Conclusion
4.6 ALGOL 60 procedures
5 AN ALGORITHM WITH GUARANTEED CONVERGENCE FOR FINDING A MINIMUM OF A FUNCTION OF ONE VARIABLE
5.1 Introduction
5.2 Fundamental limitations because of rounding errors
5.3 Unimodality and d-unimodality
5.4 An algorithm analogous to Dekker's algorithm
6 GLOBAL MINIMIZATION GIVEN AN UPPER BOUND ON THE SECOND DERIVATIVE
6.1 Introduction
6.2 The basic theorems
6.3 An algorithm for global minimization
6.4 The rate of convergence in some special cases
6.5 A lower bound on the number of function evaluations required
6.6 Practical tests
6.7 Some extensions and generalizations
6.8 An algorithm for global minimization of a function of several variables
6.9 Summary and conclusions
6.10 ALGOL 60 procedures
7 A NEW ALGORITHM FOR MINIMIZING A FUNCTION OF SEVERAL VARIABLES WITHOUT CALCULATING DERIVATIVES
7.1 Introduction and survey of the literature
7.2 The effect of rounding errors
7.3 Powell's algorithm
7.4 The main modification
7.5 The resolution ridge problem
7.6 Some further details
7.7 Numerical results and comparison with other methods
7.8 Conclusion
7.9 An ALGOL W procedure and test program
BIBLIOGRAPHY
APPENDIX: FORTRAN subroutines
INDEX