Introduction:
The study of dynamical systems, points that evolve within a finite- or infinite-dimensional space, has a long and broad history. In the closing decades of the last century, in particular, the theory grew enormously thanks to the introduction of new techniques from different branches of mathematics: differential geometry, topology, functional analysis, probability theory and, with the advent of computation, numerical analysis. Today there is a pressing need to find real applications of the theory and to develop it further in order to overcome the problems that arise along the way. A common and natural answer to this need is control theory and optimization, which help shape systems, to some extent, according to requirements of different kinds, such as physical, economic or environmental ones. We therefore readily find in the literature applications of the geometric integration of Lagrangian and Hamiltonian mechanics to the space industry, of game theory to economics or biology, of the Hamilton-Jacobi-Bellman equation to image processing and medicine, and so on. In this session we aim to provide a space to discuss recent theoretical and applied results on dynamical systems, with special attention to optimal control.