Optimization and minimization techniques. Optimization methods, global convergence, speed of convergence.
Minimization of a functional, descent techniques, the nonlinear conjugate gradient method, quasi-Newton methods,
trust-region methods. Least-squares problems, the Gauss-Newton method. Theory of constrained optimization:
Lagrange multipliers, convex optimization, penalty and barrier methods, projection and dual methods. The course
is suitable for students focusing on industrial mathematics and numerical analysis.
Last update: T_KNM (07.04.2015)
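As an illustration of the descent techniques and inexact line search mentioned above, the following is a minimal sketch in Python with NumPy (the course itself uses Matlab; all function names and the test problem here are illustrative, not course material) of steepest descent with Armijo backtracking:

```python
import numpy as np

def armijo_backtracking(f, grad_f, x, d, c1=1e-4, beta=0.5, alpha0=1.0):
    """Shrink the step until the Armijo sufficient-decrease condition holds:
    f(x + alpha*d) <= f(x) + c1 * alpha * grad_f(x)^T d."""
    alpha = alpha0
    fx, gTd = f(x), grad_f(x) @ d  # gTd < 0 for a descent direction
    while f(x + alpha * d) > fx + c1 * alpha * gTd:
        alpha *= beta
    return alpha

def steepest_descent(f, grad_f, x0, tol=1e-8, max_iter=10_000):
    """Minimize f by stepping along -grad f with an Armijo line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g
        x = x + armijo_backtracking(f, grad_f, x, d) * d
    return x

# Hypothetical convex quadratic test problem with minimizer (1, -2).
f = lambda x: (x[0] - 1.0)**2 + 2.0 * (x[1] + 2.0)**2
grad_f = lambda x: np.array([2.0 * (x[0] - 1.0), 4.0 * (x[1] + 2.0)])
x_star = steepest_descent(f, grad_f, [0.0, 0.0])
```

On this well-conditioned quadratic, steepest descent converges linearly; the Armijo condition only guarantees sufficient decrease, which is why the course also covers the stronger Goldstein and Wolfe conditions.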
It is not necessary to obtain the course credit before taking the exam.
The course credit is granted for a presentation of the results of numerical experiments.
Two additional attempts to obtain the course credit are allowed.
Last update: Tichý Petr, doc. RNDr., Ph.D. (23.04.2020)
J. E. Dennis and R. B. Schnabel, Numerical Methods for Unconstrained Optimization and Nonlinear Equations, SIAM, 1996 (originally published 1983).
R. Fletcher, Practical Methods of Optimization, 2nd edition, Wiley, 1987 (republished 2000).
D. G. Luenberger and Y. Ye, Linear and Nonlinear Programming, 3rd edition, Springer, New York, 2008.
J. Nocedal and S. Wright, Numerical Optimization, 2nd edition, Springer, 2006.
W. Sun and Y.-X. Yuan, Optimization Theory and Methods: Nonlinear Programming, Springer Optimization and Its Applications 1, Springer, New York, 2006.
Last update: Tichý Petr, doc. RNDr., Ph.D. (23.04.2020)
The exam is oral. The requirements correspond to the syllabus of the course as presented in the lectures.
Last update: Tichý Petr, doc. RNDr., Ph.D. (16.02.2018)
Theory of unconstrained optimization (necessary and sufficient conditions, the role of convexity, classification of convergence).
Minimization along a given direction (golden-section search, curve fitting, Newton's method).
Inexact line search (the Goldstein, Armijo, and Wolfe conditions).
Basic descent methods (the method of steepest descent and Newton's method).
Conjugate direction methods (the nonlinear conjugate gradient method).
Quasi-Newton methods (the quasi-Newton condition, rank-one updates, DFP, BFGS, the Broyden family).
Trust-region methods.
Least-squares problems (the Gauss-Newton and Levenberg-Marquardt methods).
Theory of constrained optimization (Lagrange multipliers, necessary and sufficient conditions).
Last update: Tichý Petr, doc. RNDr., Ph.D. (23.04.2020)
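For the least-squares topic above: the Gauss-Newton method minimizes 0.5*||r(x)||^2 by solving the normal equations J^T J p = -J^T r for each step, where J is the Jacobian of the residual r. A minimal Python/NumPy sketch on a hypothetical exponential-fitting problem (the model, data, and names are illustrative, not taken from the course):

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, tol=1e-10, max_iter=50):
    """Minimize 0.5*||r(x)||^2: at each iterate solve J^T J p = -J^T r
    and take the full step p (no line search, for simplicity)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r, J = residual(x), jacobian(x)
        p = np.linalg.solve(J.T @ J, -J.T @ r)
        x = x + p
        if np.linalg.norm(p) < tol:
            break
    return x

# Hypothetical zero-residual fit: model y = b * exp(a * t), exact data
# generated with a = 0.5, b = 2.
t = np.linspace(0.0, 1.0, 8)
y = 2.0 * np.exp(0.5 * t)
residual = lambda x: x[1] * np.exp(x[0] * t) - y

def jacobian(x):
    e = np.exp(x[0] * t)
    return np.column_stack([x[1] * t * e, e])  # d r / d a, d r / d b

x_fit = gauss_newton(residual, jacobian, [0.4, 1.8])
```

On a zero-residual problem like this one, Gauss-Newton behaves like Newton's method and converges quadratically near the solution; the Levenberg-Marquardt method from the syllabus adds a damping term to J^T J to handle starting points where the plain step is unreliable.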
Fundamentals of multivariable calculus and numerical linear algebra. Basic knowledge of the Matlab programming language.
Last update: Tichý Petr, doc. RNDr., Ph.D. (02.05.2018)