To introduce students to significant methods for linear and non-linear optimization. The algorithms are developed heuristically, with rigorous justification where suitable. Descriptions of the methods are accompanied by algorithms in a form suitable for computer implementation.
| Topic | Hours |
|-------|-------|
| Optimization of one-dimensional functions: conditions for a local minimum, golden-section search, Powell's method, Newton-Raphson method. | 8 |
| Multidimensional unconstrained optimization: direct methods (simplex, Hooke and Jeeves, conjugate directions), gradient methods (steepest descent, modified Newton-Raphson); linear programming. | 8 |
| Nonlinear constrained optimization: method of Lagrange, Kuhn-Tucker conditions, penalty-function techniques. | 8 |
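The golden-section search from the one-dimensional unit can be sketched as follows. This is a minimal illustration, not the course's reference implementation; the test interval and tolerance are illustrative choices.

```python
import math

def golden_section_search(f, a, b, tol=1e-8):
    """Minimize a unimodal function f on [a, b] by golden-section search."""
    invphi = (math.sqrt(5) - 1) / 2  # 1/phi, approximately 0.618
    c = b - invphi * (b - a)         # interior points dividing [a, b]
    d = a + invphi * (b - a)         # in the golden ratio
    while b - a > tol:
        if f(c) < f(d):
            # minimum lies in [a, d]: reuse c as the new d
            b, d = d, c
            c = b - invphi * (b - a)
        else:
            # minimum lies in [c, b]: reuse d as the new c
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2

# e.g. minimize (x - 2)^2 on [0, 5]
x_min = golden_section_search(lambda x: (x - 2) ** 2, 0.0, 5.0)
```

Each iteration shrinks the bracket by the constant factor 0.618 and reuses one of the two previous interior points, so only one new function evaluation is needed per step.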
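Among the gradient methods, steepest descent is the simplest to sketch. The version below uses a backtracking (Armijo) line search; the halving factor and the Armijo constant 0.5 are illustrative assumptions, not prescribed by the syllabus.

```python
def steepest_descent(f, grad, x0, tol=1e-6, max_iter=1000):
    """Minimize f by moving along -grad(f) with a backtracking line search."""
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        if sum(gi * gi for gi in g) ** 0.5 < tol:
            break  # gradient small enough: stationary point reached
        fx = f(x)
        gg = sum(gi * gi for gi in g)
        t = 1.0
        # shrink the step until the Armijo sufficient-decrease test holds
        while f([xi - t * gi for xi, gi in zip(x, g)]) > fx - 0.5 * t * gg:
            t *= 0.5
        x = [xi - t * gi for xi, gi in zip(x, g)]
    return x

# e.g. minimize (x - 1)^2 + (y + 2)^2 starting from the origin
sol = steepest_descent(
    lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2,
    lambda p: [2 * (p[0] - 1), 2 * (p[1] + 2)],
    [0.0, 0.0],
)
```

On a well-conditioned quadratic like this the method converges quickly; in general, steepest descent zigzags on elongated level sets, which motivates the conjugate-direction and modified Newton-Raphson methods listed alongside it.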
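The penalty-function technique from the constrained unit can be illustrated on a toy one-dimensional problem. The quadratic exterior penalty, the growth factor of 10, and the golden-section inner solver below are all illustrative assumptions; the idea is only that each outer iteration solves an unconstrained subproblem with a stiffer penalty.

```python
import math

def minimize_1d(F, a=-10.0, b=10.0, tol=1e-10):
    """Golden-section search, used here as the inner unconstrained solver."""
    invphi = (math.sqrt(5) - 1) / 2
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while b - a > tol:
        if F(c) < F(d):
            b, d = d, c
            c = b - invphi * (b - a)
        else:
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2

def penalty_method(f, g, mu0=1.0, growth=10.0, iters=8):
    """Minimize f(x) subject to g(x) <= 0 via a quadratic exterior penalty."""
    mu, x = mu0, 0.0
    for _ in range(iters):
        # penalized objective: infeasible points pay mu * g(x)^2
        F = lambda x, mu=mu: f(x) + mu * max(0.0, g(x)) ** 2
        x = minimize_1d(F)
        mu *= growth  # tighten the penalty each outer iteration
    return x

# toy problem: minimize x^2 subject to x >= 1, i.e. g(x) = 1 - x <= 0
x_star = penalty_method(lambda x: x * x, lambda x: 1.0 - x)
```

For this problem each subproblem has the closed-form solution x = mu / (1 + mu), so the iterates approach the constrained minimizer x = 1 from the infeasible side as mu grows, which is characteristic of exterior penalty methods.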