
Introduction to Control Systems
Text Books
1. Modern Control Systems, 11th Edition, by Richard C. Dorf (University of California, Davis) and Robert H. Bishop (The University of Texas at Austin).
2. Control Systems Engineering, 6th Edition, by Norman S. Nise (California State Polytechnic University).
3. Modern Control Engineering, 5th Edition, by Katsuhiko Ogata (University of Minnesota).
What is a system?

• Elements in interaction producing a desired response.

• A system tends to be goal-seeking.


Dynamical system
• Has an internal state and some dynamics (i.e., a rule specifying how that state evolves in time).

• Example: Model of a car in motion. The evolution of the speed is given by

  dv/dt = a(t)

  where v(t) is the state of the system, a(t) is the input to the system, and this equation specifies the dynamics.
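As a minimal sketch (not from the text), the car model above can be simulated numerically; the forward-Euler step, the 0.01 s step size, and the acceleration profile below are illustrative assumptions.

```python
# Forward-Euler simulation of the car model dv/dt = a(t).
# The step size and acceleration profile are illustrative assumptions.

def simulate_speed(a, v0=0.0, dt=0.01, t_end=10.0):
    """Integrate dv/dt = a(t) from t = 0 to t_end, starting at v(0) = v0."""
    v, t = v0, 0.0
    while t < t_end:
        v += a(t) * dt     # state update: v(t + dt) ≈ v(t) + a(t) * dt
        t += dt
    return v

# Constant acceleration of 2 m/s^2 for 5 s, then coasting (a = 0).
accel = lambda t: 2.0 if t < 5.0 else 0.0
print("final speed ≈", round(simulate_speed(accel), 2), "m/s")   # ≈ 10 m/s
```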
What is a control system?
• A control system is an integration of components
forming a configuration that will provide a desired
response.
• A system is called a plant or a process in the control literature.

[Block diagram: Input (stimulus; desired response) → Process → Output (response; actual response)]

• There is a cause-and-effect relationship between the input and output of the process.


System
• Purpose of “control”: to ensure that the system’s output quantity resembles the quantity desired by the user, despite the system’s dynamics and disturbances from noise. Usually, control requires feedback.

[Block diagram: Input r(t) (stimulus; desired response) → Process → Output c(t) (response; actual response)]

Input r(t):
• May be scalar or vector
• Can be defined by the user
• Is subject to noise (or errors)

Output c(t):
• May be scalar or vector
• The user specifies its desired form
• Can be measured, but not in all systems
Control
• Control by definition: Control is the process of gathering information, processing it to make a decision, and acting on a system so that the system responds as desired.
Definition: Signals
• A signal is a set of data or information.

Example: telephone or television signal.


Such signals are functions of the independent variable time. However, there are other signals that do not depend on time but on other variables.
Process
• Process - device, plant, or system under control.
• The input and output relationship represents the cause-and-
effect relationship of the process.

[Block diagram: Input (set point or reference) → Controller → manipulated variable → Process → Output (controlled variable); other variables also act on the process]
• Controlled variable - the quantity or condition that is measured and controlled. Normally, the controlled variable is the output of the control system.

• Manipulated variable - the quantity or condition that is varied by the controller so as to affect the value of the controlled variable.
Disturbances
• Disturbance - A disturbance is a signal that tends to adversely affect the output of the system. It is an unwanted input to the system.
Types of control system: Open-loop control system
• When a system’s output is not referenced against the input, the system is called an open-loop control system (OLCS).
• The control action of an OLCS depends only on the input signal.
• The accuracy of the system depends on calibration.
• An OLCS is not capable of filtering out disturbances or noise.
• The performance of an open-loop system is severely affected by the presence of disturbances or by variations in operating/environmental conditions.
Open-loop control system
• An OLCS utilizes a controller or control actuator to obtain
the desired response.
• The output has no effect on the control action; that is, the output is neither measured nor fed back.
[Block diagram: Input → Controller → Process → Output]

• Examples: washing machine, toaster, electric fan, rice cooker, photocopy machine.
Types of control system: Closed-loop control system
• When the output quantity of a control system is referenced against the input, the system is called a closed-loop control system (CLCS).
• The output quantity is fed back to influence the control action and improve the overall system performance.
[Block diagram: Input → Comparator → Controller → Process → Output, with a feedback element from the output back to the comparator]
Types of control system: Closed-loop control system

• Examples:
• Temperature control systems (air-conditioner),
• Position control systems (robot arms),
• Velocity control systems (vehicle cruise control).
Open and closed loop control systems
• In practice, a combination of OLCS and CLCS is normally used.
• A simple example is the washing machine:
• the process of filling up the tank with water is a CLCS
operation,
• while the process of washing and rinsing is an OLCS
operation.
Open and closed loop control systems
OLCS:
• The structure is simple and cheaper to build.
• Suitable when the input signal needed for satisfactory system performance can be estimated or approximated and does not change.

CLCS:
• Able to compensate for / filter out disturbances and noise.
• Not sensitive to noise or to changes in system parameters or environment.
• Easier to design a controller to achieve the desired transient and steady-state response.
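The comparison above can be illustrated with a small simulation (a sketch, not from the text): both schemes drive an assumed first-order plant toward a set point, a constant disturbance enters halfway through, and only the closed-loop (proportional feedback) scheme reacts to it. The plant model, gain, and disturbance value are arbitrary choices.

```python
# Open-loop vs. closed-loop control of a simple first-order plant
# x[k+1] = x[k] + DT * (-x[k] + u[k] + d[k]); all values are illustrative.

DT, STEPS, SETPOINT = 0.01, 2000, 1.0

def run(closed_loop, kp=5.0, disturbance=0.5):
    x = 0.0
    for k in range(STEPS):
        d = disturbance if k > STEPS // 2 else 0.0   # disturbance enters halfway through
        if closed_loop:
            u = kp * (SETPOINT - x)    # feedback: control uses the measured output
        else:
            u = SETPOINT               # open loop: fixed, precomputed input only
        x += DT * (-x + u + d)         # plant dynamics
    return x

print("open-loop final output:  ", round(run(False), 3))   # settles at 1.5, away from the set point
print("closed-loop final output:", round(run(True), 3))    # stays near the set point (≈ 0.92)
```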
Control loop
Open Vs. Closed loop control system
Types of control system: Multivariable control system

[Block diagram: Comparator → Controller → Process → outputs (temperature, humidity, pressure); the outputs are measured and fed back to the comparator]
Feedback control system
•A system that maintains a prescribed relationship between
the output and some reference input by comparing them
and using the difference (i.e. error) as a means of control is
called a feedback control system.
•Feedback can be positive or negative.
•Closed-loop control system is a feedback system.
Feedback control system

[Block diagram: Input → summing junction (+/−) → error → Controller → Process → Output, with the output fed back to the negative input of the summing junction]
Feedforward vs. feedback
• Feedforward: acts without taking the output into account.
(OLCS)
• Example: dishwasher does not measure the cleanliness of
plates during its operation
• Feedback: the control action is determined by taking the output into account. (CLCS)
• Example: room temperature control system measures the
room temperature during its operation.
Types of control system: Servo system
• A Servo System (or servomechanism) is a feedback control
system in which the output is some mechanical position,
velocity or acceleration.

Example: Antenna positioning system.


Types of control system: Linear control system
• A control system in which the output varies linearly with the input is called a linear control system (e.g., a resistor following Ohm’s law).
y(t) = 3u(t) + 5

[Plot: y(t) versus u(t) for u(t) = 0 to 10, a straight line from y = 5 to y = 35; block diagram: u(t) → Process → y(t)]
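The straight line in the original slide's plot follows directly from the equation; a trivial sketch to reproduce those points:

```python
# Evaluate the linear relation y(t) = 3*u(t) + 5 for u = 0..10,
# reproducing the straight line shown in the original slide's plot.

points = [(u, 3 * u + 5) for u in range(11)]
for u, y in points:
    print(f"u = {u:2d}  ->  y = {y}")
```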
Types of control system: nonlinear control system
• When the input and output have a nonlinear relationship, the system is said to be nonlinear.
Types of control system: nonlinear control system
• Linear control systems do not exist in practice.

• Linear control systems are idealized models adopted by the analyst purely for simplicity of analysis and design.

• When the magnitude of the signals in a control system is limited to a range in which the system components exhibit linear characteristics, the system is essentially linear.
Time invariant and Time variant
• When the characteristics of the system do not depend on time itself, the system is said to be a time-invariant control system.

  y(t) = 2u(t) + 1

• A time-varying control system is a system in which one or more parameters vary with time.

  y(t) = 2u(t) + 3t
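A quick numerical check (a sketch using the two example relations above): delaying the input of the time-invariant system just delays its output, while the explicit dependence on t in the time-varying system breaks this property.

```python
# Time-invariance check for the two example relations above:
#   y1(t) = 2*u(t) + 1    (time invariant)
#   y2(t) = 2*u(t) + 3*t  (time varying: depends explicitly on t)

def y1(u, t):
    return 2 * u(t) + 1

def y2(u, t):
    return 2 * u(t) + 3 * t

u = lambda t: t ** 2    # an arbitrary test input
tau, t0 = 1.5, 2.0      # time shift and evaluation instant

# Response to a delayed input vs. delayed response to the original input:
print(y1(lambda t: u(t - tau), t0) == y1(u, t0 - tau))   # True  -> time invariant
print(y2(lambda t: u(t - tau), t0) == y2(u, t0 - tau))   # False -> time varying
```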
Lumped parameter and Distributed parameter
•Control systems that can be described by ordinary
differential equations are lumped-parameter control systems.
  M d²x/dt² + C dx/dt + kx = 0

• Distributed-parameter control systems, in contrast, are described by partial differential equations, e.g.

  f1 ∂x/∂y + f2 ∂x/∂z = g ∂²x/∂z²
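As an illustration (a sketch; the values of M, C, k, the zero forcing, and the initial state are assumed), the lumped-parameter ODE above can be simulated by tracking two state variables, position and velocity:

```python
# Lumped-parameter example: mass-spring-damper  M*x'' + C*x' + k*x = 0,
# integrated with forward Euler. M, C, k and the initial state are illustrative.

M, C, K = 1.0, 0.5, 4.0
DT = 0.001

def simulate(x0=1.0, v0=0.0, t_end=10.0):
    x, v = x0, v0
    for _ in range(int(t_end / DT)):
        a = (-C * v - K * x) / M   # acceleration from the ODE
        x += DT * v                # position update
        v += DT * a                # velocity update
    return x

print("x(10) ≈", round(simulate(), 4))   # a damped oscillation, decaying toward 0
```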
Continuous data system
• In a continuous-data control system, all system variables are functions of a continuous time t.

[Plot: a continuous-time signal x(t) versus time t]
Discrete data system
•A discrete time control system involves one or more
variables that are known only at discrete time intervals.

[Plot: a discrete-time sequence x[n] versus sample index n]
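A discrete-data sequence is commonly produced by sampling a continuous signal; a minimal sketch (the sampling period and the signal are arbitrary choices):

```python
import math

# Sampling the continuous signal x(t) = sin(2*pi*t) every T seconds gives the
# discrete sequence x[n] = x(n*T). The period and the signal are arbitrary choices.

T = 0.1                                    # sampling period in seconds
x = lambda t: math.sin(2 * math.pi * t)    # continuous-time signal
samples = [x(n * T) for n in range(11)]    # x[0], x[1], ..., x[10]
print([round(s, 3) for s in samples])
```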
Deterministic control system
•A control system is deterministic if the response to input is
predictable and repeatable.

[Plots: a deterministic input x(t) and its repeatable response y(t) versus time t]
Stochastic control system

• A control system is stochastic if its response to an input is not predictable or repeatable.

[Plot: a random signal z(t) versus time t]
Static and Dynamic control system
• Static: the output at any time depends only on the input at that time.
• Dynamic: the present output also depends on past inputs.
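A small sketch of the distinction (the example systems are chosen for illustration, not taken from the text):

```python
# Static system:  y[n] = 2*u[n]          -> output depends only on the current input
# Dynamic system: y[n] = y[n-1] + u[n]   -> output depends on past inputs via its state

def static_system(u):
    return [2 * x for x in u]

def dynamic_system(u):
    y, out = 0.0, []
    for x in u:
        y += x            # the state y accumulates past inputs
        out.append(y)
    return out

u = [1, 0, 0, 1]
print(static_system(u))    # [2, 0, 0, 2]
print(dynamic_system(u))   # [1.0, 1.0, 1.0, 2.0]
```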
Adaptive Control System
• The dynamic characteristics of most control systems are
not constant for several reasons.
• The effect of small changes on the system parameters is
attenuated in a feedback control system.
• An adaptive control system is required when the changes
in the system parameters are significant.
Classification of control systems
• Control systems: natural or man-made.
• Man-made: manual or automatic.
• Automatic: open-loop or closed-loop.
• Open-loop and closed-loop: non-linear or linear.
• Non-linear and linear: time variant or time invariant.
History of automatic control system

James Watt’s flyball governor (1769), used to control the speed of a steam engine.


Examples of control systems
• Water-level float regulator
• Fluid flow control
Types of control
• Regulator: (e.g. flyball governor)
• Maintains constant output despite disturbances
• Compensator (e.g. elevator)
• Drives system from an initial to a final state according to specifications on
the transient response.
• Tracking (e.g. space robot)
• Match the output to a non-stationary input despite disturbances.
• Optimum control (e.g. hard disk drive)
• Drive system from an initial state to a final state while optimizing a merit
function (e.g. minimum time to target or minimum energy consumption)
• Combination of above
Kinetic (tracking) control systems
• Control variables: displacement/position, speed,
acceleration.
• Fast system response with small delay time.
• Input may be fixed or changing.
• Normally involves electrical manipulators.
Process (regulating) control systems
• Control variables: temperature, flow, level, pressure.
• Slow system response with large delay time.
• System responds to a fixed set point.
• Example: heating elements.
Real Physical System
[Block diagram: a dynamic system maps a time-varying, frequency-dependent input to a time-varying, frequency-dependent output]
Modeling
• Idealized representation of the real physical system.
• Mathematical equations: state variables and linear ordinary differential equations (LODEs).

Linear System Theory
• Laplace transforms and transfer function representation.
• Pole-zero analysis and frequency response analysis.
• Standard inputs and responses; prediction of transient response.

Control System Design
• System design modifications and use of feedback.
• Steady-state errors, input tracking, and response shaping.
• Goal: the desired input-output response.
Source: MIT lecture series, System Dynamics and Control, Spring 2013.
