
Multivariable Control - Algebraic Riccati Equation

Let $A$, $Q$, and $R$ be real $n \times n$ matrices with $Q$ and $R$ symmetric.

An algebraic Riccati equation (ARE) is
\[
A^T X + X A + X R X + Q = 0.
\]
We associate with the ARE a $2n \times 2n$ matrix called the Hamiltonian matrix,
\[
H = \begin{bmatrix} A & R \\ -Q & -A^T \end{bmatrix}.
\]

The Hamiltonian matrix has some useful properties. The eigenvalues of $H$ are symmetric about the imaginary axis. To prove this assertion, first note that $H$ has a special type of skew symmetry. In particular, let
\[
J = \begin{bmatrix} 0 & -I \\ I & 0 \end{bmatrix};
\]
then
\[
JH = \begin{bmatrix} 0 & -I \\ I & 0 \end{bmatrix} \begin{bmatrix} A & R \\ -Q & -A^T \end{bmatrix} = \begin{bmatrix} Q & A^T \\ A & R \end{bmatrix},
\]
which is a symmetric matrix. Moreover, $J^2 = -I$, which means that $J^{-1} = -J$. So we can readily see that
\[
J^{-1} H J = -JHJ = -H^T.
\]
$J$ is, of course, a similarity transformation, so the eigenvalues of $H$ are also the eigenvalues of $-H^T$. But we also know that if $\lambda$ is an eigenvalue of $H$ then $\bar{\lambda}$ is an eigenvalue of $H^T$ (where $\bar{\lambda}$ is the complex conjugate of $\lambda$). This implies that $-\bar{\lambda}$ must also be an eigenvalue of $H$, which means the eigenvalues of $H$ are symmetric with respect to the imaginary axis.

Let $\lambda$ be an eigenvalue of $H$ with associated eigenvector $x$. The pair $(\lambda, x)$ generates a one-dimensional subspace
\[
S = \{ v : v = \alpha x, \ \alpha \in \mathbb{C} \},
\]
so that if $v \in S$, then $Hv \in S$ also. We therefore say that $S$ is an $H$-invariant subspace. We now show how to construct solutions to the ARE using $H$-invariant subspaces. Let $V \subset \mathbb{C}^{2n}$ be an $n$-dimensional $H$-invariant subspace and let $X_1$ and $X_2$ be two matrices in $\mathbb{C}^{n \times n}$ such that
\[
V = \mathrm{Im} \begin{bmatrix} X_1 \\ X_2 \end{bmatrix}.
\]

Since $V$ is $H$-invariant, there is a matrix $\Lambda \in \mathbb{C}^{n \times n}$ such that
\[
H \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} = \begin{bmatrix} A & R \\ -Q & -A^T \end{bmatrix} \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} = \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} \Lambda.
\]

Let us assume $X_1$ is invertible; we can then post-multiply by $X_1^{-1}$ to obtain
\[
\begin{bmatrix} A & R \\ -Q & -A^T \end{bmatrix} \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} X_1^{-1} = \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} \Lambda X_1^{-1},
\]
or equivalently
\[
\begin{bmatrix} A & R \\ -Q & -A^T \end{bmatrix} \begin{bmatrix} I \\ X \end{bmatrix} = \begin{bmatrix} I \\ X \end{bmatrix} X_1 \Lambda X_1^{-1},
\]
where we let $X = X_2 X_1^{-1}$.

If we now pre-multiply by $\begin{bmatrix} X & -I \end{bmatrix}$ we obtain
\[
\begin{bmatrix} X & -I \end{bmatrix} \begin{bmatrix} A & R \\ -Q & -A^T \end{bmatrix} \begin{bmatrix} I \\ X \end{bmatrix} = \begin{bmatrix} X & -I \end{bmatrix} \begin{bmatrix} I \\ X \end{bmatrix} X_1 \Lambda X_1^{-1} = 0.
\]
Simplifying the left-hand side of the above equation reduces it to
\[
XA + XRX + Q + A^T X = 0,
\]
which is our algebraic Riccati equation. This implies that $X = X_2 X_1^{-1}$ solves the ARE.
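The construction above is easy to sketch numerically. The following is a minimal illustration using NumPy; the matrices $A$, $R$, $Q$ below are illustrative choices of my own (with $R = -BB^T$ and $Q = C^T C$, the pattern that appears in the $H_2$ problems discussed later), not data from the notes:

```python
import numpy as np

# Illustrative data (not from the notes): R = -B B^T, Q = C^T C.
A = np.array([[0.0, 1.0], [-2.0, -3.0]])
R = np.array([[0.0, 0.0], [0.0, -1.0]])
Q = np.array([[1.0, 0.0], [0.0, 0.0]])
n = A.shape[0]

# Hamiltonian H = [[A, R], [-Q, -A^T]].
H = np.block([[A, R], [-Q, -A.T]])

# Basis [X1; X2] of the invariant subspace spanned by the n
# eigenvectors with stable eigenvalues (any n-dimensional invariant
# subspace with X1 invertible would do).
evals, evecs = np.linalg.eig(H)
stable = np.argsort(evals.real)[:n]
X1 = evecs[:n, stable]
X2 = evecs[n:, stable]

# X = X2 X1^{-1} solves A^T X + X A + X R X + Q = 0.
X = np.real(X2 @ np.linalg.inv(X1))
print(np.allclose(A.T @ X + X @ A + X @ R @ X + Q, 0))  # True
```

The choice of the $n$ stable eigenvectors here is one particular selection; other selections of $n$ eigenvectors give the other solutions of the ARE, as discussed below.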

Note that the solution is independent of the choice of basis spanning $V$. If we were to choose any other basis spanning $V$, represented by the image space of $\begin{bmatrix} X_1 T \\ X_2 T \end{bmatrix}$ where $T$ is any invertible matrix (a change of basis), then it is clear that
\[
X = (X_2 T)(X_1 T)^{-1} = X_2 T T^{-1} X_1^{-1} = X_2 X_1^{-1},
\]

thereby showing that any other choice of basis for $V$ results in the same matrix $X$ satisfying the ARE.

The converse of the above result also holds: if $X$ solves the ARE, we claim we can always find $X_1$ and $X_2$, with $X_1$ invertible, such that $X = X_2 X_1^{-1}$ and the columns of $\begin{bmatrix} X_1 \\ X_2 \end{bmatrix}$ span an $n$-dimensional invariant subspace of $H$. To prove this assertion, let $\Lambda = A + RX$ and note that
\[
X \Lambda = XA + XRX.
\]
Using the ARE, the above equation becomes
\[
X \Lambda = -Q - A^T X.
\]
We can now rewrite the equation for $\Lambda$ and the last equation in matrix form as
\[
\begin{bmatrix} A & R \\ -Q & -A^T \end{bmatrix} \begin{bmatrix} I \\ X \end{bmatrix} = \begin{bmatrix} I \\ X \end{bmatrix} \Lambda.
\]
This implies that $\mathrm{Im}\begin{bmatrix} I \\ X \end{bmatrix}$ is an $n$-dimensional $H$-invariant subspace, so the result is satisfied by simply taking $X_1 = I$ and $X_2 = X$. Note that this is useful because it shows that solving the ARE, a quadratic matrix equation, is equivalent to a linear-algebraic problem: finding an invariant subspace of $H$. Note that there may be many solutions to a given ARE, obtained by making different selections for the basis of $V$. Consider the ARE obtained when
\[
A = \begin{bmatrix} -3 & 2 \\ -2 & 1 \end{bmatrix}, \quad
R = \begin{bmatrix} 0 & 0 \\ 0 & -1 \end{bmatrix}, \quad
Q = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}.
\]
One can readily verify that the following matrices satisfy the ARE:
\[
X = \begin{bmatrix} -10 & 6 \\ 6 & -4 \end{bmatrix}, \quad
X = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}, \quad
X = \begin{bmatrix} -2 & 2 \\ 2 & -2 \end{bmatrix}.
\]
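These claims are straightforward to check numerically; a small sketch assuming NumPy:

```python
import numpy as np

# The example ARE: A^T X + X A + X R X + Q = 0 with Q = 0.
A = np.array([[-3.0, 2.0], [-2.0, 1.0]])
R = np.array([[0.0, 0.0], [0.0, -1.0]])

candidates = [
    np.array([[-10.0, 6.0], [6.0, -4.0]]),
    np.zeros((2, 2)),
    np.array([[-2.0, 2.0], [2.0, -2.0]]),
]
for X in candidates:
    residual = A.T @ X + X @ A + X @ R @ X
    print(np.allclose(residual, 0))  # True for each candidate
```

Checking the closed-loop matrix $A + RX$ for each candidate shows that only $X = 0$ gives a Hurwitz $A + RX$ in this example.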

If we use the ARE solver in Matlab, however, we only get the zero solution above. In general, we are interested in whether our ARE has a stabilizing solution; in other words, is $A + RX$ a Hurwitz matrix? Assume $H$ has no eigenvalues on the $j\omega$-axis. By the symmetry properties discussed earlier, $H$ must then have $n$ eigenvalues in the open right half of the complex plane and $n$ in the open left half. This means that the spectrum of $H$ can be partitioned into two sets of stable and unstable eigenvalues. Consider the two $n$-dimensional subspaces spanned by the eigenvectors associated with the stable and unstable eigenvalues of $H$:
\[
\mathcal{X}_-(H) = \text{stable subspace}, \qquad \mathcal{X}_+(H) = \text{unstable subspace}.
\]
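For the example above, this even split of the spectrum can be observed directly (a sketch assuming NumPy):

```python
import numpy as np

# Hamiltonian of the earlier example; it has no imaginary-axis
# eigenvalues, so its spectrum splits evenly about that axis.
A = np.array([[-3.0, 2.0], [-2.0, 1.0]])
R = np.array([[0.0, 0.0], [0.0, -1.0]])
Q = np.zeros((2, 2))
H = np.block([[A, R], [-Q, -A.T]])

evals = np.linalg.eigvals(H)
print(np.sum(evals.real < 0), np.sum(evals.real > 0))  # 2 2
```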

Let's consider the stable subspace $\mathcal{X}_-(H)$ and determine a basis so that
\[
\mathcal{X}_-(H) = \mathrm{Im} \begin{bmatrix} X_1 \\ X_2 \end{bmatrix}.
\]

If $X_1^{-1}$ exists, then we can set $X = X_2 X_1^{-1}$, which is uniquely determined by the Hamiltonian matrix $H$. We therefore introduce the operator

\[
\mathrm{Ric} : H \mapsto X
\]
as a map from the Hamiltonian matrix onto the ARE solution $X$ associated with the stable subspace $\mathcal{X}_-(H)$ of $H$. The domain of the Ric operator is denoted $\mathrm{dom(Ric)}$, and the value that this operator takes for a specified Hamiltonian matrix $H$ is denoted $\mathrm{Ric}(H)$. We can now state and prove the following theorem.

Theorem: Suppose $H \in \mathrm{dom(Ric)}$ and $X = \mathrm{Ric}(H)$. Then the following hold:
- $X$ is real and symmetric,
- $X$ satisfies the ARE,
- $A + RX$ is Hurwitz.

Proof: Consider $X_1$ and $X_2$ constructed as discussed above. (Since $H$ is real and its stable eigenvalues come in conjugate pairs, the stable subspace has a real basis, so $X_1$ and $X_2$ may be taken real.) Note that there exists a stable real $n \times n$ matrix $\Lambda$ such that
\[
H \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} = \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} \Lambda.
\]
If we pre-multiply by $\begin{bmatrix} X_1 \\ X_2 \end{bmatrix}^T J$ we obtain
\[
\begin{bmatrix} X_1 \\ X_2 \end{bmatrix}^T J H \begin{bmatrix} X_1 \\ X_2 \end{bmatrix}
= \begin{bmatrix} X_1 \\ X_2 \end{bmatrix}^T J \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} \Lambda
= (X_2^T X_1 - X_1^T X_2)\Lambda.
\]
Recall that $JH$ is symmetric, so the left-hand side of the above equation is symmetric; since $(X_2^T X_1 - X_1^T X_2)^T = -(X_2^T X_1 - X_1^T X_2)$, we can conclude that
\[
(X_2^T X_1 - X_1^T X_2)\Lambda = -\Lambda^T (X_2^T X_1 - X_1^T X_2).
\]
If we let $Z = X_2^T X_1 - X_1^T X_2$, then the above equation can be rewritten as
\[
Z\Lambda + \Lambda^T Z = 0,
\]
which is a Lyapunov equation. Because $\Lambda$ is Hurwitz, we can show that $Z = 0$, which implies that $X_1^T X_2 = X_2^T X_1$, thereby showing that $X_1^T X_2$ is symmetric. Note that since $X_1$ is nonsingular,
\[
X = X_2 X_1^{-1} = (X_1^{-1})^T (X_1^T X_2) X_1^{-1},
\]
and since we know $X_1^T X_2$ is symmetric, we can conclude $X$ is symmetric as well.

The second assertion is established by starting with
\[
H \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} = \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} \Lambda.
\]
Post-multiplying by $X_1^{-1}$ we obtain
\[
H \begin{bmatrix} I \\ X \end{bmatrix} = \begin{bmatrix} I \\ X \end{bmatrix} X_1 \Lambda X_1^{-1}, \tag{1}
\]
and pre-multiplying by $\begin{bmatrix} X & -I \end{bmatrix}$ we obtain
\[
\begin{bmatrix} X & -I \end{bmatrix} H \begin{bmatrix} I \\ X \end{bmatrix} = 0.
\]
Expanding this out, we obtain the ARE.

To show the third assertion, pre-multiply equation (1) by $\begin{bmatrix} I & 0 \end{bmatrix}$ to obtain
\[
A + RX = X_1 \Lambda X_1^{-1}.
\]
This shows that $A + RX$ has the same eigenvalues as $\Lambda$, and so $A + RX$ is Hurwitz.

When is $H$ in $\mathrm{dom(Ric)}$? We can obtain testable conditions if we're willing to assume that $H$ has no imaginary eigenvalues. Recall that this allows us to partition the spectrum of $H$ into $n$ stable and $n$ unstable eigenvalues, and we can then generate an $n$-dimensional $H$-invariant subspace, $\mathcal{X}_-(H)$, from the stable eigenvalues. The main result we'll prove is that $H \in \mathrm{dom(Ric)}$ if and only if $(A, R)$ is stabilizable, assuming that $R$ is either positive semi-definite or negative semi-definite.

To prove this assertion, note that $H \in \mathrm{dom(Ric)}$ means there exist matrices $X_1$ and $X_2$ such that
\[
\mathcal{X}_-(H) = \mathrm{Im} \begin{bmatrix} X_1 \\ X_2 \end{bmatrix}
\]
with $X_1$ invertible. This can only occur if $\ker(X_1) = \{0\}$ (the trivial subspace), so we'll focus on the kernel of $X_1$. First note that $\ker(X_1)$ is $\Lambda$-invariant. Now assume that $\ker(X_1)$ is not trivial; then there exists a pair $(\lambda, x)$ with $x \neq 0$ and $x \in \ker(X_1)$ such that
\[
\Lambda x = \lambda x.
\]
Now consider the bottom block row of $H \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} = \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} \Lambda$, which implies that
\[
-Q X_1 - A^T X_2 = X_2 \Lambda.
\]
Multiplying through by $x \in \ker(X_1)$, we obtain
\[
-Q X_1 x - A^T X_2 x = X_2 \Lambda x.
\]
The first term is zero and the right-hand side reduces to $\lambda X_2 x$. We can therefore simplify this to
\[
(A^T + \lambda I) X_2 x = 0,
\]
which says that $(-\lambda, X_2 x)$ is an eigenvalue/eigenvector pair for $A^T$. One can also show that $R X_2 x = 0$ (from the top block row, $A X_1 x + R X_2 x = X_1 \Lambda x = \lambda X_1 x = 0$), which means that
\[
(X_2 x)^* \begin{bmatrix} A + \bar{\lambda} I & R \end{bmatrix} = 0,
\]
where $\mathrm{Re}(-\bar{\lambda}) > 0$ since $\Lambda$ is stable. If $(A, R)$ is stabilizable, then $X_2 x = 0$; and we know $X_1 x = 0$, which (since the columns of $\begin{bmatrix} X_1 \\ X_2 \end{bmatrix}$ are linearly independent) implies $x = 0$. So $\ker(X_1)$ is trivial and $X_1$ is invertible.
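The stabilizability condition on $(A, R)$ can be tested with a PBH-style rank check. A sketch assuming NumPy; the helper name is mine:

```python
import numpy as np

def pbh_stabilizable(A, R):
    """PBH test: [A - s I, R] must have full row rank at every
    eigenvalue s of A with nonnegative real part."""
    n = A.shape[0]
    for s in np.linalg.eigvals(A):
        if s.real >= 0:
            pencil = np.hstack([A - s * np.eye(n), R])
            if np.linalg.matrix_rank(pencil) < n:
                return False
    return True

# Example pair from earlier in the notes.
A = np.array([[-3.0, 2.0], [-2.0, 1.0]])
R = np.array([[0.0, 0.0], [0.0, -1.0]])
print(pbh_stabilizable(A, R))  # True (A is already Hurwitz here)
```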

If we apply the above results to specific Hamiltonians associated with AREs used in $H_2$ and $H_\infty$ synthesis, we obtain the following results. In particular, consider the Hamiltonian matrix
\[
H = \begin{bmatrix} A & -BB^T \\ -C^T C & -A^T \end{bmatrix}.
\]
This is associated with the $H_2$ Full-Information problem. Using our prior results, we see that the associated ARE has a stabilizing positive semi-definite solution if $(A, B)$ is stabilizable and $(C, A)$ is detectable. The following Hamiltonian arises in the output-feedback problem:
\[
H = \begin{bmatrix} A & 0 \\ -C^T C & -A^T \end{bmatrix}
  - \begin{bmatrix} B \\ -C^T D \end{bmatrix} R^{-1} \begin{bmatrix} D^T C & B^T \end{bmatrix}
  = \begin{bmatrix} A - BR^{-1}D^T C & -BR^{-1}B^T \\ -C^T (I - DR^{-1}D^T) C & -(A - BR^{-1}D^T C)^T \end{bmatrix},
\]
where $R = D^T D$.
The ARE associated with this Hamiltonian matrix has a stabilizing solution if $(A, B)$ is stabilizable and
\[
\begin{bmatrix} A - j\omega I & B \\ C & D \end{bmatrix}
\]
has full column rank for all $\omega$ (the detectability condition). If we return to our original OF problem, with the plant's state-space realization being
\[
G = \left[ \begin{array}{c|cc} A & B_1 & B_2 \\ \hline C_1 & D_{11} & D_{12} \\ C_2 & D_{21} & 0 \end{array} \right],
\]
we saw that the assumptions we placed on the OF problem to obtain a solution were:
- $(A, B_2)$ stabilizable and $(C_2, A)$ detectable,
- orthogonality assumptions,
- the matrix $\begin{bmatrix} A - j\omega I & B_2 \\ C_1 & D_{12} \end{bmatrix}$ has full column rank and the matrix $\begin{bmatrix} A - j\omega I & B_1 \\ C_2 & D_{21} \end{bmatrix}$ has full row rank.

We now see that the first and third assumptions are required for the existence of a stabilizing solution to the ARE.
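As a closing illustration, $\mathrm{Ric}(H)$ for an $H_2$ full-information Hamiltonian can be computed with an ordered real Schur decomposition and cross-checked against SciPy's ARE solver. This is a sketch under my own choice of $A$, $B$, $C$; note that `solve_continuous_are(a, b, q, r)` solves $a^T x + x a - x b r^{-1} b^T x + q = 0$, which matches our ARE with $R = -BB^T$, $Q = C^T C$:

```python
import numpy as np
from scipy.linalg import schur, solve_continuous_are

# Illustrative full-information data (my own choice).
A = np.array([[0.0, 1.0], [-2.0, -3.0]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
n = A.shape[0]

# H2 full-information Hamiltonian H = [[A, -BB^T], [-C^T C, -A^T]].
H = np.block([[A, -B @ B.T], [-C.T @ C, -A.T]])

# Real Schur form ordered so the stable eigenvalues come first;
# the leading n Schur vectors span the stable subspace X_-(H).
T, U, sdim = schur(H, sort='lhp')
X1, X2 = U[:n, :n], U[n:, :n]
X = X2 @ np.linalg.inv(X1)          # Ric(H)

# Cross-check against SciPy's solver (q = C^T C, r = I).
X_are = solve_continuous_are(A, B, C.T @ C, np.eye(1))
print(np.allclose(X, X_are))  # True
```

The ordered Schur route is the numerically preferred way to evaluate $\mathrm{Ric}(H)$, since it avoids forming possibly ill-conditioned eigenvector matrices.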