Direct Search Methods
Tuesday, May 21, 10:30 AM / Salon A/B

This talk will survey the broad subject of derivative-free optimization algorithms. Direct search methods are nonlinear optimization methods that neither require nor explicitly approximate derivatives of the objective function. Instead, at each iteration a set of trial points is generated and their function values are compared with the best value obtained so far; this information is then used to determine the next set of trial points. This general description encompasses a wide variety of techniques, including the provably convergent pattern search methods that we have studied, as well as other approaches such as random search methods and genetic algorithms.
Standard questions about direct search methods are: Under what conditions do they converge to a solution? What advantages, if any, do these methods enjoy over classical methods based on higher-order information? When should they be employed?
The speaker will survey the field of direct search methods, compare them with standard nonlinear programming techniques such as quasi-Newton methods, discuss their successful application to problems in science and engineering, and suggest guidelines for their use.
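The iteration the abstract describes (generate trial points, compare their function values against the best point found so far, and use the outcome to choose the next trial set) can be illustrated by compass search, one of the simplest pattern search methods. The sketch below is illustrative only; the function name and parameters are assumptions, not material from the talk.

```python
def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=10000):
    """Minimize f using compass search, a simple pattern search method.

    Only function values are used: no derivatives are required or
    approximated. Trial points are generated a distance `step` along
    each coordinate direction; if none improves on the incumbent,
    the step size is halved.
    """
    x = list(x0)          # incumbent (best point found so far)
    fx = f(x)             # its function value
    n = len(x)
    for _ in range(max_iter):
        if step <= tol:   # pattern has contracted enough; stop
            break
        improved = False
        # Trial points: +/- step along each coordinate axis.
        for i in range(n):
            for d in (step, -step):
                trial = x[:]
                trial[i] += d
                ft = f(trial)
                if ft < fx:          # accept the first improving point
                    x, fx = trial, ft
                    improved = True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5              # no trial point improved: contract
    return x, fx
```

For example, minimizing the smooth quadratic `f(x, y) = (x - 1)**2 + (y + 2)**2` from the origin drives the incumbent toward the minimizer (1, -2) using only function comparisons, which is the behavior the convergence theory for pattern search methods guarantees on such problems.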
Department of Computer Science
College of William and Mary