Tuesday, May 11

Derivative-Free Optimization

3:30 PM-5:30 PM
Room: Atlanta 1 and 2

Minimization problems without gradient information are not rare. Examples include functions whose evaluation requires experiments, either physical or computational, table lookups, or black-box internal computations. The speakers in this minisymposium will consider a variety of methods for coping with this difficulty: direct search methods, methods that build models from the history of the optimization, and methods that use finite differences to extract whatever gradient information the function offers. The talks will address some of the possibilities for minimization in the absence of explicit gradient information.
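As a minimal illustration of the direct search idea (a sketch for orientation only, not any speaker's algorithm), a compass search probes each coordinate direction with a fixed step and halves the step when no probe improves the objective:

```python
def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=10000):
    """Compass (direct) search: probe +/- step along each coordinate;
    accept the first improving point; halve the step when no probe
    improves. Hypothetical sketch, not a production implementation."""
    x = list(x0)
    fx = f(x)
    it = 0
    while step > tol and it < max_iter:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = x[:]           # probe point along coordinate i
                trial[i] += d
                ft = f(trial)
                if ft < fx:            # simple decrease: accept and move on
                    x, fx = trial, ft
                    improved = True
                    break
        if not improved:
            step *= 0.5                # shrink the stencil
        it += 1
    return x, fx

# Example: minimize a smooth quadratic with no gradient calls.
xmin, fmin = compass_search(lambda v: (v[0] - 1)**2 + (v[1] + 2)**2, [0.0, 0.0])
```

Only function values are used, so the same loop applies unchanged when `f` is a table lookup or a black-box simulation.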

Organizer: C. T. Kelley
North Carolina State University

3:30-3:55 Derivative Free Optimization Algorithms for Constrained Problems
Andrew R. Conn and Katya Scheinberg, IBM T. J. Watson Research Center; and Ph. Toint, FUNDP, Brussels, Belgium
4:00-4:25 Quasi-Newton Methods Without Gradients
Tony D. Choi, North Carolina State University; and C. T. Kelley, Organizer
4:30-4:55 A Derivative Free Method for Bound Constrained Optimization
Stefano Lucidi, Università di Roma "La Sapienza", Italy; and Marco Sciandrone, Consiglio Nazionale delle Ricerche, Rome, Italy
5:00-5:25 Direct Search Methods for Stochastic Optimization
Michael W. Trosset, College of William & Mary

MMD, 12/21/98