Course detail
Mathematical Methods Of Optimal Control
FSI-9MOR
Acad. year: 2021/2022
The course familiarises students with the basic methods of modern control theory. This theory is presented as a remarkable example of the interaction between practical needs and mathematical theories. The following topics are covered:
Optimal control. Bellman's principle of optimality. Pontryagin's maximum principle. Time-optimal control of linear problems. Problems with state constraints. Applications.
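To indicate the kind of problem the course treats, a minimal statement of a time-optimal problem and the corresponding maximum-principle condition is sketched below; the notation is illustrative and not taken from the course materials.

```latex
% Time-optimal control problem (illustrative notation):
\[
  \min\, T \quad \text{subject to} \quad
  \dot x(t) = f\bigl(x(t),u(t)\bigr),\quad x(0)=x_0,\quad x(T)=x_1,\quad u(t)\in U .
\]
% Pontryagin's maximum principle (necessary condition): along an optimal pair
% (x^*, u^*) there exists a nonzero adjoint \psi satisfying the adjoint equation
% and maximizing the Hamiltonian H(x,u,\psi) = \psi^\top f(x,u) pointwise in t:
\[
  \dot\psi(t) = -\frac{\partial H}{\partial x}\bigl(x^*(t),u^*(t),\psi(t)\bigr),
  \qquad
  H\bigl(x^*(t),u^*(t),\psi(t)\bigr) = \max_{u\in U} H\bigl(x^*(t),u,\psi(t)\bigr).
\]
```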
Language of instruction
Mode of study
Guarantor
Department
Learning outcomes of the course unit
Prerequisites
Co-requisites
Planned learning activities and teaching methods
Assessment methods and criteria linked to learning outcomes
The grading scheme is as follows: excellent (90-100 points), very good (80-89 points), good (70-79 points), satisfactory (60-69 points), sufficient (50-59 points), failed (0-49 points). The grading in points may be modified provided that the ratios given above remain unchanged.
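Read as a lookup, the scale above amounts to the following mapping from points to grades; this is only an illustrative sketch (the function name and language are not part of the course materials):

```python
def grade(points: int) -> str:
    """Map a 0-100 point score to the grade scale quoted above."""
    if points >= 90:
        return "excellent"
    if points >= 80:
        return "very good"
    if points >= 70:
        return "good"
    if points >= 60:
        return "satisfactory"
    if points >= 50:
        return "sufficient"
    return "failed"
```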
Course curriculum
Work placements
Aims
Specification of controlled education, way of implementation and compensation for absences
Recommended optional programme components
Prerequisites and corequisites
Basic literature
Lee, E. B. - Markus, L.: Foundations of optimal control theory, New York, 1967.
Pontrjagin, L. S. - Boltjanskij, V. G. - Gamkrelidze, R. V. - Miščenko, E. F.: Matematičeskaja teorija optimalnych procesov [The Mathematical Theory of Optimal Processes], Moskva, 1961.
Recommended reading
Čermák, J.: Matematické základy optimálního řízení [Mathematical Foundations of Optimal Control], Brno, 1998.
Víteček, A. - Vítečková, M.: Optimální systémy řízení [Optimal Control Systems], Ostrava, 1999.
Classification of course in study plans
- Programme D-KPI-P Doctoral, 1st year of study, summer semester, recommended course
- Programme D-APM-P Doctoral, 1st year of study, summer semester, recommended course
- Programme D-APM-K Doctoral, 1st year of study, summer semester, recommended course
- Programme D-KPI-K Doctoral, 1st year of study, summer semester, recommended course
Type of course unit
Lecture
Teacher / Lecturer
Syllabus
2. Dynamic programming. Bellman's principle of optimality.
3. Maximum principle.
4. Time-optimal control of a uniform motion (see the sketch after this syllabus).
5. Time-optimal control of a simple harmonic motion.
6. Basic properties of optimal controls.
7. Optimal control of systems with a variable mass.
8. Variational problems of flight dynamics.
9. Energy-optimal control problems.
10. Variational problems with state constraints.
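To give a concrete flavour of syllabus items 4-6, the sketch below simulates the classical time-optimal (bang-bang) feedback for the double-integrator model of uniform motion, x'' = u with |u| <= 1. The switching-curve feedback u = -sign(x + v|v|/2) is the standard textbook solution; the script itself (function names, step size, tolerances) is an assumed illustration, not course material.

```python
import math

def bang_bang_control(x: float, v: float) -> float:
    """Time-optimal feedback for the double integrator x'' = u, |u| <= 1.

    Switching curve: x = -v*|v|/2. Above the curve apply u = -1, below it
    u = +1; on the curve, drive along it towards the origin.
    """
    s = x + 0.5 * v * abs(v)
    if s > 0.0:
        return -1.0
    if s < 0.0:
        return 1.0
    return -math.copysign(1.0, v) if v != 0.0 else 0.0

def simulate(x0: float, v0: float, dt: float = 1e-3, t_max: float = 20.0):
    """Integrate the closed loop with forward Euler until the state is near the origin."""
    x, v, t = x0, v0, 0.0
    while t < t_max and (abs(x) > 1e-3 or abs(v) > 1e-3):
        u = bang_bang_control(x, v)
        x, v = x + dt * v, v + dt * u
        t += dt
    return t, x, v

if __name__ == "__main__":
    t, x, v = simulate(1.0, 0.0)
    # From rest at x0 = 1 the analytic minimum time is 2*sqrt(x0) = 2.0.
    print(f"reached ({x:.4f}, {v:.4f}) at t = {t:.3f}  (analytic optimum: 2.000)")
```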