RIT Dubai
Electrical Engineering Program
EEEE 661
Modern Control
Theory
Dr. Abdulla Ismail
Professor of Electrical Engineering,
axicad@rit.edu
04-3712055
Stability of Feedback Control Systems 1
Modern Control Theory
Outline
• Overview of Feedback Control
• State-Space Analysis of Linear Control
Systems
• Realization of Feedback Control Systems
• Controllability & Observability of Linear
Control Systems
• Stability of Feedback Control Systems
• State Feedback Design
• Output Feedback Design
• Observers (Estimator) Design
• Compensator Design
Modern Control Theory
References
• R. Dorf, Modern Control Systems, 12th Edition,
Prentice Hall, 2011.
• N. Nise, Control Systems Engineering, 5th Edition,
Wiley, 2010.
• K. Ogata, Modern Control Engineering, 4th Edition,
Prentice-Hall, Upper Saddle River, NJ, 2008.
• W. L. Brogan, Modern Control Theory, McGraw Hill,
NY, 2005.
• B. Friedland, Control System Design: An Introduction to
State-Space Methods, McGraw-Hill, NY.
Stability of Feedback Control Systems
• Introduction
• Definitions of Stability
• Stability in the Time Domain
• Types of Stability
• Stability of Linear Time Invariant Systems
Definitions of Stability
Stability of control systems can be defined in several ways:
1. Impulse Response
• Asymptotically stable system: the impulse response g(t) → 0 as t → ∞.
• Marginally stable system: the impulse response remains bounded (tends to a constant or a sustained oscillation) as t → ∞.
• Unstable system: the impulse response grows without bound as t → ∞.
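These three cases can be checked numerically. A minimal sketch (the transfer functions below are assumed examples, not from the slides):

```python
# Classify stability by inspecting the impulse response numerically.
import numpy as np
from scipy import signal

def impulse_tail(num, den, t_end=20.0):
    """Return the magnitude of the impulse response near t = t_end."""
    system = signal.TransferFunction(num, den)
    t, y = signal.impulse(system, T=np.linspace(0.0, t_end, 2000))
    return abs(y[-1])

# Asymptotically stable: G(s) = 1/(s + 1), impulse response e^{-t} -> 0
print(impulse_tail([1], [1, 1]))    # tail is essentially zero
# Unstable: G(s) = 1/(s - 1), impulse response e^{t} grows without bound
print(impulse_tail([1], [1, -1]))   # tail is very large
```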
Definitions of Stability
2. System Poles
• Asymptotically stable system: all system poles lie in the left half plane (all have negative real parts).
• Marginally stable system: one or more system poles lie on the imaginary axis (real part equal to zero) and all such poles are distinct; in addition, no poles lie in the right half plane.
• Unstable system:
- The system has at least one pole in the right half plane (real part greater than zero), or
- The system has repeated (multiple) poles on the imaginary axis.
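The three cases above can be sketched as a small classifier. A minimal sketch (the helper and example pole sets are assumed, not from the slides):

```python
# Classify an LTI system from its pole locations.
import numpy as np

def classify_by_poles(poles, tol=1e-9):
    poles = np.asarray(poles, dtype=complex)
    re = poles.real
    if np.any(re > tol):
        return "unstable"                      # pole in the right half plane
    on_axis = poles[np.abs(re) <= tol]
    if on_axis.size == 0:
        return "asymptotically stable"         # all poles strictly in LHP
    # Repeated imaginary-axis poles make the system unstable.
    vals, counts = np.unique(np.round(on_axis.imag, 6), return_counts=True)
    return "unstable" if np.any(counts > 1) else "marginally stable"

print(classify_by_poles([-1, -2 + 1j, -2 - 1j]))  # asymptotically stable
print(classify_by_poles([0, -1]))                 # marginally stable
print(classify_by_poles([0, 0, -1]))              # unstable (repeated pole at origin)
```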
Types of Stability
Here, we consider two types of stability.
• The first type is an internal notion of stability and involves the
qualitative behavior of the zero-input state response, i.e., the
response of a related homogeneous state equation that
depends solely on the initial state. This is known as Internal
Stability.
• The second type focuses on external, or input-output,
behavior. In particular, we characterize state equations for which the
zero-state output response is a bounded signal for every bounded
input signal. This is known as bounded-input, bounded-output
(BIBO) stability.
INTERNAL STABILITY
Internal stability is a concept clearly understood in the following case:
• A linear control system is given with state variable model S and
transfer function model T
• Suppose the system is third order with two stable eigenvalues e1
and e2 and one unstable eigenvalue e3. Furthermore, the
system has one zero z1.
• Suppose the system is not completely controllable/observable,
which results in a pole/zero cancellation, i.e., the unstable pole
at e3 cancelling the zero z1.
• Thus we are left with a second order transfer function Tr with
stable poles (e1 and e2).
• Thus the system state variable model S is unstable but its
transfer function Tr is stable.
• In this case, we say that the system is internally unstable but
externally (BIBO) stable.
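A small numeric illustration of this scenario (the matrices below are assumed for illustration, not the slides' example): the unstable eigenvalue's mode receives no input, so the unstable pole appears as a matching zero and cancels out of the reduced transfer function.

```python
# Internally unstable but externally (BIBO) stable state model.
import numpy as np
from scipy import signal

A = np.diag([-1.0, -2.0, 1.0])        # eigenvalues e1 = -1, e2 = -2, e3 = +1
B = np.array([[1.0], [1.0], [0.0]])   # the e3 mode is uncontrollable (no input path)
C = np.array([[1.0, 1.0, 1.0]])
D = np.array([[0.0]])

print("eigenvalues of A:", np.linalg.eigvals(A))  # includes +1: internally unstable

num, den = signal.ss2tf(A, B, C, D)   # note: ss2tf does not cancel common factors
zeros = np.roots(num[0])
poles = np.roots(den)
print("zeros:", np.sort_complex(zeros))  # includes +1
print("poles:", np.sort_complex(poles))  # also includes +1 -> cancels with the zero
# After cancelling the common factor (s - 1), the reduced transfer function
# has only the stable poles -1 and -2: BIBO stable, internally unstable.
```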
BOUNDED INPUT, BOUNDED-OUTPUT STABILITY
• A system is bounded-input, bounded-output (BIBO) stable if the zero-state output response is bounded for every bounded input signal.
• For a linear time-invariant system with impulse response h(t), BIBO stability holds if and only if h(t) is absolutely integrable, i.e., the integral of |h(t)| over [0, ∞) is finite.
• Equivalently, the system is BIBO stable if and only if every pole of its transfer function has a negative real part.
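As a quick numerical check (the transfer functions below are assumed examples, not from the slides), BIBO stability of an LTI transfer function can be tested from its pole locations:

```python
# BIBO-stability test from transfer-function poles.
import numpy as np

def is_bibo_stable(num, den):
    """BIBO stable iff all transfer-function poles have negative real parts
    (assuming any common pole/zero factors are already cancelled)."""
    poles = np.roots(den)
    return bool(np.all(poles.real < 0))

print(is_bibo_stable([1], [1, 3, 2]))  # G(s) = 1/((s+1)(s+2)) -> True
print(is_bibo_stable([1], [1, 0, 1]))  # G(s) = 1/(s^2+1) -> False (poles on j-axis)
```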
Stability of Linear Time Invariant Systems
Given the linear time-invariant system described by ẋ(t) = A x(t):
(a) The system is asymptotically stable if and only if
every eigenvalue of the system matrix A has a negative
real part.
(b) The system is marginally stable if and only if every
eigenvalue of A has a nonpositive real part, at least one
eigenvalue has a zero real part, and all eigenvalues with
zero real part are distinct.
(c) The system is unstable if and only if at least one
eigenvalue of A has a positive real part, or
there are repeated eigenvalues with a zero real part.
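As an illustrative check (the matrices below are assumed examples, not from the slides), the eigenvalues of A deciding cases (a)-(c) can be computed numerically:

```python
# Eigenvalues of A decide stability of x' = Ax.
import numpy as np

A_stable   = np.array([[0.0, 1.0], [-2.0, -3.0]])  # eigenvalues -1, -2
A_marginal = np.array([[0.0, 1.0], [-4.0,  0.0]])  # eigenvalues +-2j, distinct
A_unstable = np.array([[0.0, 1.0], [ 0.0,  0.0]])  # repeated eigenvalue at 0

for name, A in [("stable", A_stable), ("marginal", A_marginal),
                ("unstable", A_unstable)]:
    print(name, np.linalg.eigvals(A))
```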
STABILITY BY THE DIRECT METHOD OF LYAPUNOV
• The Russian mathematician, mechanician, and physicist
A. M. Lyapunov investigated the problem of stability of
dynamical systems in his doctoral thesis of 1892.
• He presented two methods, of which the second method,
or direct method, has found extensive application
in the study of the stability of automatic control systems.
• We shall present here the second method of Lyapunov
for estimating the region of stability of dynamical
systems.
STABILITY BY THE DIRECT METHOD OF LYAPUNOV
Definiteness and Closeness of a Function
STABILITY BY THE DIRECT METHOD OF LYAPUNOV
• For example, V(x) = x1² + x2² is positive definite if the system under consideration is second order, but it is only semidefinite if the system is third order, since, for x1 = x2 = 0, V is 0 for arbitrary x3.
• When the function is in quadratic form, it can be expressed as V(x) = xᵀP x.
• Now if P is a symmetric square matrix with constant coefficients, Sylvester's theorem is used for determining the definiteness of the function.
Sylvester's Theorem: In order that the quadratic form V(x) = xᵀP x be positive definite, it is necessary and sufficient that all the leading principal minors of P, that is, the magnitudes
p11, det[p11 p12; p21 p22], ..., det P
be positive.
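Sylvester's criterion can be applied directly. A minimal sketch (the helper and test matrices are assumed, not from the slides):

```python
# Positive-definiteness test by Sylvester's criterion:
# all leading principal minors of a symmetric matrix must be positive.
import numpy as np

def sylvester_positive_definite(P):
    P = np.asarray(P, dtype=float)
    return all(np.linalg.det(P[:k, :k]) > 0
               for k in range(1, P.shape[0] + 1))

P_good = np.array([[2.0, 1.0], [1.0, 2.0]])  # minors 2, 3 -> positive definite
P_bad  = np.array([[1.0, 2.0], [2.0, 1.0]])  # minors 1, -3 -> not
print(sylvester_positive_definite(P_good))   # True
print(sylvester_positive_definite(P_bad))    # False
```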
STABILITY BY THE DIRECT METHOD OF LYAPUNOV
Lyapunov Stability Theorems
Theorem 1: If there exists a function V(x), definite with respect to sign, whose total
derivative with respect to time is also a function of definite sign, opposite in sense to
that of V, then the system ẋ = f(x), with f(0) = 0, is asymptotically stable.
• Theorem 1 is overly restrictive in two senses.
• First, asymptotic stability is assured only in an arbitrarily small region about the origin.
• Second, the requirement that dV/dt be negative definite, rather than semidefinite,
causes difficulties when we attempt to generate suitable Lyapunov
functions for nonlinear systems. Both of these shortcomings are overcome by
Theorem 2.
Theorem 2: If there exists a real scalar function V(x), continuous with continuous first
partial derivatives, such that V(0) = 0, V(x) > 0 for all x ≠ 0 in a region Ω, and
dV/dt ≤ 0 in Ω with dV/dt not identically zero along any solution other than x = 0,
then the system ẋ = f(x) is asymptotically stable in Ω.
STABILITY BY THE DIRECT METHOD OF LYAPUNOV
Generation of Lyapunov Functions For Autonomous Systems
• The major difficulty in applying the second method of Lyapunov is the lack of
a systematic method of determining a suitable V function, especially for
nonlinear systems.
• We shall present here a method for the generation of Lyapunov functions for
linear systems only.
• Consider the autonomous linear system ẋ = A x.
• Let V be equal to V(x) = xᵀP x.
• The time derivative of V(x) can be determined as dV/dt = xᵀ(AᵀP + P A)x = -xᵀQ x,
where AᵀP + P A = -Q.
• If Q is chosen to be symmetric and positive definite, solving this equation for the n(n + 1)/2
unknown elements of the symmetric matrix P proves the asymptotic stability of the system if
the resulting matrix P is positive definite.
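This procedure can be carried out with SciPy's Lyapunov-equation solver. A minimal sketch (the example matrix A is assumed, not from the slides; note SciPy's convention solves A X + X Aᴴ = Q, so we pass Aᵀ and -Q):

```python
# Solve A^T P + P A = -Q for P, then test P for positive definiteness.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

A = np.array([[0.0, 1.0], [-2.0, -3.0]])  # eigenvalues -1, -2 (stable)
Q = np.eye(2)                             # symmetric positive definite choice

# solve_continuous_lyapunov(a, q) solves a X + X a^H = q,
# so passing (A.T, -Q) yields A^T P + P A = -Q.
P = solve_continuous_lyapunov(A.T, -Q)
print(P)
print("P positive definite:", bool(np.all(np.linalg.eigvalsh(P) > 0)))
```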
STABILITY BY THE DIRECT METHOD OF LYAPUNOV
Example 2: As an illustration of this procedure, consider the determination of the
conditions for the asymptotic stability of the third-order system with characteristic
equation s³ + a1 s² + a2 s + a3 = 0 (state equations in companion form). Solving the
Lyapunov equation AᵀP + P A = -Q yields the symmetric matrix P.
STABILITY BY THE DIRECT METHOD OF LYAPUNOV
• By applying Sylvester's theorem, P is positive definite if its three leading principal minors are positive (conditions 1, 2, and 3 below).
• From condition 3 it is obvious that a3 must be positive.
• Using this condition in condition 1, we see that either both a1a2 – a3 and
a1 are positive or both are negative.
• However, only the first situation satisfies condition 2, and hence the
conditions for asymptotic stability of the system are
a1 > 0, a3 > 0, and a1 a2 - a3 > 0.
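These conditions agree with a direct eigenvalue check. A numeric sketch (the coefficient sets are assumed examples) cross-checks a1 > 0, a3 > 0, a1 a2 - a3 > 0 against the roots of s³ + a1 s² + a2 s + a3:

```python
# Cross-check the third-order stability conditions against root locations.
import numpy as np

def conditions_hold(a1, a2, a3):
    return a1 > 0 and a3 > 0 and a1 * a2 - a3 > 0

def roots_stable(a1, a2, a3):
    roots = np.roots([1.0, a1, a2, a3])
    return bool(np.all(roots.real < 0))

for a1, a2, a3 in [(3, 3, 1), (1, 1, 2), (2, 1, 1), (1, 2, 3)]:
    print((a1, a2, a3), conditions_hold(a1, a2, a3), roots_stable(a1, a2, a3))
```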
STABILITY BY THE DIRECT METHOD OF LYAPUNOV
Example 3: Consider the system shown in the figure below, where the state variables
have been chosen as real physical variables.
The system matrix A can be written as:
Then the symmetric matrix P is found
from the relation AᵀP + P A = -Q as:
STABILITY BY THE DIRECT METHOD OF LYAPUNOV
Hence the system is asymptotically stable when K is positive and less than 30.