
Automatic Control

Dynamic systems
Dynamic systems - General definition

A dynamic system can be (roughly) defined as a set of interacting objects which evolve over time.

Examples:
– vehicles
– mechanical systems
– electrical circuits
– aircraft
– stock market
– animal population
– atmosphere
– planet systems
– and so on...
Dynamic systems – Automotive examples

Vehicle lateral dynamics


Engine control

Vehicle vertical dynamics


Vehicle longitudinal dynamics
Dynamic systems – Aerospace examples

Aircraft vertical and pitch dynamics


Aircraft roll dynamics

Dynamic systems – Aerospace examples

Satellite roll, pitch and yaw dynamics

Dynamic systems – Biomedical examples

Insulin-glucose response in type 1 diabetic patients.

Cancer immunotherapy model.
Dynamic systems – Example: a chaotic system

Lorenz attractor

• Lorenz system. Simplified atmosphere model:

𝑥̇ = 𝜎(𝑦 − 𝑥)
𝑦̇ = 𝑥(𝜌 − 𝑧) − 𝑦
𝑧̇ = 𝑥𝑦 − 𝛽𝑧

𝑥, 𝑦, 𝑧: atmospheric variables related to pressure, temperature and air speed.

• Typical example of a chaotic system:
– Low predictability
– High sensitivity to initial conditions
– Unstable solutions
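
A minimal simulation sketch, assuming the classical parameter values 𝜎 = 10, 𝜌 = 28, 𝛽 = 8/3 (not stated in this slide): it integrates the Lorenz equations with SciPy and compares two trajectories whose initial conditions differ by 10⁻⁹, making the sensitivity to initial conditions visible.

```python
# Sketch: Lorenz system and its sensitivity to initial conditions.
# Parameter values are the classical ones from Lorenz (1963) - an
# assumption here, not values given in the slides.
import numpy as np
from scipy.integrate import solve_ivp

sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0

def lorenz(t, s):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

t_span = (0.0, 40.0)
t_eval = np.linspace(*t_span, 4000)

# Two trajectories whose initial conditions differ by 1e-9 in x.
sol_a = solve_ivp(lorenz, t_span, [1.0, 1.0, 1.0], t_eval=t_eval, rtol=1e-9)
sol_b = solve_ivp(lorenz, t_span, [1.0 + 1e-9, 1.0, 1.0], t_eval=t_eval, rtol=1e-9)

# The separation grows roughly exponentially: low predictability.
separation = np.linalg.norm(sol_a.y - sol_b.y, axis=0)
print(f"initial separation: {separation[0]:.1e}, final: {separation[-1]:.1e}")
```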
Dynamic systems – Example: stock market
Dynamic systems - A classical example: pendulum

• Variables:
– 𝜃(𝑡) : angular position
– 𝜔(𝑡) = 𝜃̇(𝑡) : angular velocity
– 𝑇(𝑡) : applied torque.

• Parameters:
– 𝑚 : mass
– 𝑙 : length
– 𝑔 : gravity acceleration.

• Input: 𝑢(𝑡) = 𝑇(𝑡).
• Output: we can choose e.g. 𝑦(𝑡) = 𝜃(𝑡).
Dynamic systems - A classical example: pendulum

• According to the 2nd principle of dynamics (Newton’s law):

𝐽𝜃̈(𝑡) = −𝐾 sin[𝜃(𝑡)] − 𝛽𝜃̇(𝑡) + 𝑇(𝑡)

where 𝐽 = 𝑚𝑙² : moment of inertia
𝐾 = 𝑚𝑔𝑙 : gravity torque coefficient (acting as an elastic constant)
𝛽 : friction coefficient.

• The time evolution of the system is described by differential equations.


• The behavior of the system (more precisely, of its variables) can be
predicted by integration of the differential equations.
• Integration can be performed
– analytically (when possible) or
– numerically (always possible).
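
A minimal numerical-integration sketch of the pendulum equation above, using SciPy; the parameter values (𝑚 = 1, 𝑙 = 1, 𝛽 = 0.5) are illustrative assumptions, not values given in the slides.

```python
# Sketch: numerical integration of the pendulum equation
#   J*theta''(t) = -K*sin(theta(t)) - beta*theta'(t) + T(t)
# with SciPy. Parameter values are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

m, l, g = 1.0, 1.0, 9.81        # mass [kg], length [m], gravity [m/s^2]
beta = 0.5                      # friction coefficient (damped pendulum)
J, K = m * l**2, m * g * l      # moment of inertia, gravity torque coefficient

def pendulum(t, s, torque):
    theta, omega = s            # state: angular position and velocity
    return [omega, (-K * np.sin(theta) - beta * omega + torque(t)) / J]

# Free evolution (T(t) = 0) from non-null initial condition theta(0) = pi/4.
sol = solve_ivp(pendulum, (0.0, 10.0), [np.pi / 4, 0.0],
                args=(lambda t: 0.0,), t_eval=np.linspace(0, 10, 500))
print(f"theta(10) = {sol.y[0, -1]:.4f} rad")  # decays toward 0 since beta > 0
```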
Dynamic systems - A classical example: pendulum

Damped pendulum (𝛽 > 0), non-null initial conditions, free evolution (𝑇 𝑡 = 0).

Non-damped pendulum (𝛽 = 0), non-null initial conditions, free evolution (𝑇 𝑡 = 0).


Dynamic systems - A classical example: pendulum

Damped pendulum (𝛽 > 0), null initial conditions, impulse response.

Damped pendulum (𝛽 > 0), null initial conditions, step response.
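
The step response can be reproduced numerically; a short sketch under the same assumed parameter values as before (𝑚 = 𝑙 = 1, 𝛽 = 0.5, so 𝐽 = 1, 𝐾 = 9.81):

```python
# Step response sketch: unit step torque T(t) = 1 for t >= 0, null initial
# conditions, damped pendulum. Parameter values are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

J, K, beta = 1.0, 9.81, 0.5    # m = l = 1 assumed, so J = m*l^2, K = m*g*l

def pendulum_step(t, s):
    theta, omega = s
    return [omega, (-K * np.sin(theta) - beta * omega + 1.0) / J]  # T(t) = 1

sol = solve_ivp(pendulum_step, (0.0, 20.0), [0.0, 0.0],
                t_eval=np.linspace(0, 20, 1000))
# For beta > 0 the output settles where K*sin(theta) = 1, i.e. theta ~ 0.102 rad.
print(f"theta(20) = {sol.y[0, -1]:.4f} rad")
# (The impulse response corresponds to T(t) = 0 with omega(0) = 1/J instead.)
```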


Dynamic systems – The problem of control

• Controlling a system means modifying it in order to obtain a desired behavior.

• More precisely, controlling a dynamic system consists in providing a command
input 𝑢(𝑡) such that the output 𝑦(𝑡) tracks a desired reference signal 𝑟(𝑡).
• The controlled system should be as insensitive as possible to disturbances.

                             d(t)
                              │
                              ▼
r(t) ──► [ ? ] ──u(t)──► [ system ] ──► y(t)

Problem: find a system, called controller, to use in place of the "?" block,
allowing us to satisfy these requirements.
Dynamic systems – The problem of control

Feed-forward control:

                                      d(t)
                                       │
                                       ▼
r(t) ──► [ controller ] ──u(t)──► [ system ] ──► y(t)

Feedback control:

                                      d(t)
                                       │
                                       ▼
r(t) ──► [ controller ] ──u(t)──► [ system ] ──┬──► y(t)
              ▲                                │
              └────────────────────────────────┘
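
As a sketch of what the feedback scheme can achieve, the code below closes the loop on the pendulum model with a simple proportional-derivative (PD) law; the gains, reference value and plant parameters are illustrative assumptions, and a PD law is just one possible choice of controller.

```python
# Feedback-control sketch: a PD law u(t) = Kp*(r - theta) - Kd*omega
# closes the loop on the pendulum model. Gains, reference and plant
# parameters are illustrative assumptions, not values from the slides.
import numpy as np
from scipy.integrate import solve_ivp

J, K, beta = 1.0, 9.81, 0.5    # pendulum parameters (m = l = 1 assumed)
Kp, Kd = 100.0, 20.0           # controller gains (assumed)
r = np.pi / 3                  # constant reference for theta [rad]

def closed_loop(t, s):
    theta, omega = s
    u = Kp * (r - theta) - Kd * omega   # controller output = command torque
    return [omega, (-K * np.sin(theta) - beta * omega + u) / J]

sol = solve_ivp(closed_loop, (0.0, 10.0), [0.0, 0.0],
                t_eval=np.linspace(0, 10, 500))
# A pure PD law leaves a small steady-state error here, since the gravity
# torque acts like a constant load near the equilibrium.
print(f"theta(10) = {sol.y[0, -1]:.4f} rad, reference = {r:.4f} rad")
```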
