Optimal Control
The Technical Doctoral School of IT and Design at Aalborg University
Welcome to Optimal Control
Description:
Optimal control is the problem of finding control strategies for a dynamic system such that a certain performance function is minimized (or maximized). The subject stems from the calculus of variations and was developed into an independent discipline during the early 1950s, mainly due to two discoveries: the maximum principle by L.S. Pontryagin and dynamic programming by R. Bellman. Optimal control finds application in a variety of areas including engineering, economics, biology and logistics.
The course will be conducted as a traditional lecture series with physical attendance. Course evaluation will be based on attendance and homework assignments. The course has two main parts, which in headlines are: (1) Foundations of optimal control, and (2) Special topics, including a discussion of numerical implementation.
In the first part of the course, we will concentrate on the foundations of optimal control. We will discuss necessary and sufficient conditions for optimality, various types of constraints, and the question of existence of optimal strategies. We will cover two main results in optimal control theory: the Hamilton-Jacobi-Bellman (HJB) equation and the Pontryagin maximum principle. We will show how the dynamic programming principle works by using the HJB equation to solve linear quadratic control problems, and we will also apply the maximum principle to these problems. We end this part by introducing the notion of a viscosity solution of the HJB equation.
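For orientation, the two results named above can be sketched in standard textbook notation (the symbols below are generic and are not taken from the course notes). For dynamics \dot{x} = f(x,u), running cost L and terminal cost \varphi, the HJB equation for the value function V(t,x) reads

    -\partial V/\partial t (t,x) = \min_u \{ L(x,u) + \nabla_x V(t,x)^T f(x,u) \},    V(T,x) = \varphi(x).

For the linear quadratic problem \dot{x} = Ax + Bu with running cost x^T Q x + u^T R u, the ansatz V(t,x) = x^T P(t) x reduces the HJB equation to the Riccati differential equation

    -\dot{P} = A^T P + P A - P B R^{-1} B^T P + Q,

with optimal feedback u^*(t) = -R^{-1} B^T P(t) x(t).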
In the second part of the course, we will give an introduction to two areas of optimal control: singular optimal control, where higher-order conditions such as the generalized Legendre–Clebsch condition are used to obtain sufficient conditions for local optimality, and optimal control of Markov processes, where the state variables are not known with certainty (they are the outcome of stochastic differential equations). Finally, we will discuss software solutions for optimal control problems.
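As a small illustration of the numerical side, the sketch below computes a continuous-time linear quadratic regulator with SciPy; the double-integrator system and the weight matrices are illustrative assumptions, not examples taken from the course material.

# Minimal LQR sketch (illustrative assumptions, not course software).
import numpy as np
from scipy.linalg import solve_continuous_are

# Assumed double-integrator dynamics xdot = A x + B u.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

# Weights for the quadratic cost: integral of x^T Q x + u^T R u.
Q = np.eye(2)
R = np.array([[1.0]])

# Solve the algebraic Riccati equation A^T P + P A - P B R^{-1} B^T P + Q = 0.
P = solve_continuous_are(A, B, Q, R)

# Optimal state feedback u = -K x with K = R^{-1} B^T P.
K = np.linalg.solve(R, B.T @ P)
print("Riccati solution P:\n", P)
print("Feedback gain K:\n", K)

The same Riccati-based computation appears, in one form or another, in most software packages for linear quadratic optimal control.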
Prerequisites:
A basic knowledge of mathematics as obtained through undergraduate engineering studies.
Learning objectives:
Existence of optimal strategies
Hamilton-Jacobi-Bellman equation
Pontryagin's maximum principle
Linear quadratic optimal control problems
Viscosity solution
Singular optimal control
Generalized Legendre–Clebsch condition
Optimal control of Markov processes
Numerical solutions to optimal control problems
Organizer:
John Leth
Lecturers:
John Leth
ECTS: 2
Time: August - TBA
Place: Aalborg University
Zip code: 9220
City: Aalborg
Maximum number of participants: 30
Deadline: TBA
Important information concerning PhD courses:
There is a no-show fee of DKK 3,000 for each course where the student does not show up. Cancellations are accepted no later than 2 weeks before the start of the course. Registered illness is, of course, an acceptable reason for not showing up on those days. Furthermore, all courses open for registration approximately four months before the start of the course.
We cannot guarantee any seats before the enrolment deadline; all participants will be informed after the deadline, approximately 3 weeks before the start of the course.
To attend courses at the Doctoral School in Medicine, Biomedical Science and Technology, you must be enrolled as a PhD student.
For inquiries regarding registration, cancellation or the waiting list, please contact the PhD administration at aauphd@adm.aau.dk. When contacting us, please state the course title and course period. Thank you.