
Tutorial-Quiz 2: Solving Linear Algebraic Equations

Date: 27.10.2010 Time: 7 pm to 8:30 pm


Solution Key

1. Prove the following inequalities

(a) $\|AB\| \leq \|A\| \, \|B\|$ (3 points)


Solution: Consider
$$\|ABx\| = \|A(Bx)\|$$
From the definition of the induced matrix norm, we have
$$\|Ay\| \leq \|A\| \, \|y\|$$
which implies
$$\|A(Bx)\| \leq \|A\| \, \|Bx\| \leq \|A\| \, \|B\| \, \|x\|$$
Dividing both sides of the inequality by $\|x\|$ and maximizing both sides over all $x$ with $\|x\| \neq 0$, we have
$$\max_{x,\,\|x\| \neq 0} \frac{\|ABx\|}{\|x\|} \leq \max_{x,\,\|x\| \neq 0} \|A\| \, \|B\| = \|A\| \, \|B\|$$
or
$$\|AB\| \leq \|A\| \, \|B\|$$
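This submultiplicative property is easy to spot-check numerically. Below is a minimal sketch (not part of the original solution key) that tests the induced 2-norm on random matrices using NumPy; the matrix size, trial count, and round-off tolerance are illustrative choices.

```python
# Spot-check of ||AB|| <= ||A|| ||B|| for the induced 2-norm on random
# matrices; size, trial count, and tolerance are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
for _ in range(1000):
    A = rng.standard_normal((4, 4))
    B = rng.standard_normal((4, 4))
    lhs = np.linalg.norm(A @ B, 2)                     # spectral norm of AB
    rhs = np.linalg.norm(A, 2) * np.linalg.norm(B, 2)
    assert lhs <= rhs + 1e-12                          # tolerance for round-off
print("||AB|| <= ||A|| ||B|| held in all trials")
```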

(b) $c(AB) \leq c(A)\,c(B)$ (3 points)


Solution: Using the result proved in part (a), we have
$$\|AB\| \leq \|A\| \, \|B\|$$
and, since $(AB)^{-1} = B^{-1}A^{-1}$,
$$\|(AB)^{-1}\| \leq \|A^{-1}\| \, \|B^{-1}\|$$
Multiplying the above inequalities, we have
$$\|AB\| \, \|(AB)^{-1}\| \leq \left(\|A^{-1}\| \, \|A\|\right)\left(\|B\| \, \|B^{-1}\|\right)$$
or
$$c(AB) \leq c(A)\,c(B)$$
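The condition-number inequality can be spot-checked the same way. A minimal sketch, assuming NumPy and the 2-norm condition number; the diagonal shift is only there to keep the random matrices comfortably invertible.

```python
# Spot-check of c(AB) <= c(A) c(B) using the 2-norm condition number;
# the diagonal shift keeps the random matrices well away from singular.
import numpy as np

rng = np.random.default_rng(1)
for _ in range(1000):
    A = rng.standard_normal((4, 4)) + 4 * np.eye(4)
    B = rng.standard_normal((4, 4)) + 4 * np.eye(4)
    lhs = np.linalg.cond(A @ B)
    rhs = np.linalg.cond(A) * np.linalg.cond(B)
    assert lhs <= rhs * (1 + 1e-10)                    # tolerance for round-off
print("c(AB) <= c(A) c(B) held in all trials")
```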

2. Consider the system
$$A = \begin{bmatrix} 1 & 1/2 & 1/3 & 1/4 \\ 1/2 & 1/3 & 1/4 & 1/5 \\ 1/3 & 1/4 & 1/5 & 1/6 \\ 1/4 & 1/5 & 1/6 & 1/7 \end{bmatrix}; \qquad b = \begin{bmatrix} 1 \\ -1 \\ 1 \\ -1 \end{bmatrix}$$

(a) It is desired to solve $Ax = b$ using the relaxation method with $\omega = 1.5$. If the initial guess to start the iterations is chosen as
$$x^{(0)} = \begin{bmatrix} 1 & 1 & 1 & 1 \end{bmatrix}^T$$
then will the relaxation iterations converge? Justify your answer. (4 points)
Additional Information: $\det(A) = 1.6534 \times 10^{-7}$.
Solution: The given matrix $A$ is symmetric. Taking determinants of the leading principal minors, we have
$$1 > 0, \quad \det \begin{bmatrix} 1 & 1/2 \\ 1/2 & 1/3 \end{bmatrix} = \frac{1}{12} > 0 \quad \text{and} \quad \det \begin{bmatrix} 1 & 1/2 & 1/3 \\ 1/2 & 1/3 & 1/4 \\ 1/3 & 1/4 & 1/5 \end{bmatrix} = \frac{24}{216 \times 240} = \frac{1}{2160} > 0$$
and $\det(A) > 0$. Thus, it follows that $A$ is a symmetric and positive definite matrix. From Theorem 5, it follows that the relaxation iterations, with $\omega = 1.5 < 2$, will converge to the solution starting from ANY initial guess.
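Theorem 5 guarantees convergence but says nothing about speed, and the iterations can be slow here because $A$ (a segment of the Hilbert matrix) is ill-conditioned; see part (b). Below is a minimal sketch (not part of the original solution key) of the relaxation (SOR) sweep with $\omega = 1.5$ started from $x^{(0)} = [1\;1\;1\;1]^T$; the sweep cap and residual tolerance are illustrative choices.

```python
# Relaxation (SOR) sweeps with omega = 1.5 on the 4 x 4 system above,
# started from x0 = [1, 1, 1, 1]^T. Sweep cap and tolerance are
# illustrative; convergence itself is guaranteed by Theorem 5.
import numpy as np

A = np.array([[1, 1/2, 1/3, 1/4],
              [1/2, 1/3, 1/4, 1/5],
              [1/3, 1/4, 1/5, 1/6],
              [1/4, 1/5, 1/6, 1/7]])
b = np.array([1.0, -1.0, 1.0, -1.0])
omega, x = 1.5, np.ones(4)

for sweep in range(1, 200001):
    for i in range(4):
        sigma = A[i] @ x - A[i, i] * x[i]   # uses already-updated entries
        x[i] = (1 - omega) * x[i] + omega * (b[i] - sigma) / A[i, i]
    if np.linalg.norm(A @ x - b) < 1e-6:
        break
print(f"after {sweep} sweeps, residual = {np.linalg.norm(A @ x - b):.1e}")
```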
(b) Find the condition number $c_1(A)$ OR $c_\infty(A)$ of matrix $A$ if
$$A^{-1} = \begin{bmatrix} 16 & -120 & 240 & -140 \\ -120 & 1200 & -2700 & 1680 \\ 240 & -2700 & 6480 & -4200 \\ -140 & 1680 & -4200 & 2800 \end{bmatrix}$$
and comment upon the conditioning of matrix $A$. (3 points)


Solution: Since $A$ and $A^{-1}$ are symmetric, the 1-norm and $\infty$-norm coincide:
$$\|A\|_1 = \|A\|_\infty = \frac{25}{12}$$
$$\|A^{-1}\|_1 = \|A^{-1}\|_\infty = 13620$$
$$c_1(A) = c_\infty(A) = \frac{25}{12} \times 13620 = 28375$$
Matrix $A$ is highly ill-conditioned.
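These norms are easy to cross-check numerically. A minimal sketch, assuming NumPy, whose `np.linalg.norm(·, 1)` returns exactly the maximum absolute column sum used above.

```python
# Cross-check of the 1-norm condition number using the exact inverse above.
import numpy as np

A = np.array([[1, 1/2, 1/3, 1/4],
              [1/2, 1/3, 1/4, 1/5],
              [1/3, 1/4, 1/5, 1/6],
              [1/4, 1/5, 1/6, 1/7]])
Ainv = np.array([[16, -120, 240, -140],
                 [-120, 1200, -2700, 1680],
                 [240, -2700, 6480, -4200],
                 [-140, 1680, -4200, 2800]])

n1 = np.linalg.norm(A, 1)        # max absolute column sum = 25/12
n2 = np.linalg.norm(Ainv, 1)     # = 13620
print(n1, n2, n1 * n2)           # 2.0833..., 13620.0, 28375.0
```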

3. A linear algebraic equation of the form $Ax = b$ with a tridiagonal $n \times n$ matrix $A$ appears when the finite difference method is used to solve a second order ODE-BVP. If it is desired to solve the linear algebraic equation using the Jacobi method, then the Jacobi matrix $J$, defined as $J = S^{-1}T = -D^{-1}(L + U)$, that appears in the iterative calculations is as follows
$$A = \begin{bmatrix} 2 & -1 & 0 & \cdots & 0 \\ -1 & 2 & -1 & \cdots & 0 \\ \vdots & \ddots & \ddots & \ddots & \vdots \\ 0 & \cdots & -1 & 2 & -1 \\ 0 & \cdots & 0 & -1 & 2 \end{bmatrix}; \qquad J = \frac{1}{2}\begin{bmatrix} 0 & 1 & 0 & \cdots & 0 \\ 1 & 0 & 1 & \cdots & 0 \\ \vdots & \ddots & \ddots & \ddots & \vdots \\ 0 & \cdots & 1 & 0 & 1 \\ 0 & \cdots & 0 & 1 & 0 \end{bmatrix}$$

(a) Show that the vector
$$x^{(k)} = \begin{bmatrix} \sin(k\pi h) & \sin(2k\pi h) & \sin(3k\pi h) & \cdots & \sin((n-1)k\pi h) & \sin(nk\pi h) \end{bmatrix}^T$$
is an eigenvector of $J$, i.e. $x^{(k)}$ satisfies $Jx^{(k)} = \lambda_k x^{(k)}$, with eigenvalue $\lambda_k = \cos(k\pi h)$. Here, $h = 1/(n+1)$ and $k$ can take any value from the set $\{1, 2, \ldots, n\}$. (5 points)
Hint: It is sufficient to show the calculations carried out for the first row of matrix $J$, any $j$th row of matrix $J$, and the last row of matrix $J$.
Solution: (i) First row of $J$:
$$\frac{1}{2}\sin(2k\pi h) = \sin(k\pi h)\cos(k\pi h)$$
The above equality follows from the trigonometric identity $\sin(2\theta) = 2\sin(\theta)\cos(\theta)$.
(ii) Any $j$th row of $J$:
$$\frac{1}{2}\left[\sin((j-1)k\pi h) + \sin((j+1)k\pi h)\right] = \sin(jk\pi h)\cos(k\pi h)$$
The above equality follows from the trigonometric identity $\sin(\theta+\beta) + \sin(\theta-\beta) = 2\sin(\theta)\cos(\beta)$.
(iii) Last row of $J$:
$$\lambda_k x_n^{(k)} = \sin(nk\pi h)\cos(k\pi h) = \frac{1}{2}\left[\sin((n-1)k\pi h) + \sin((n+1)k\pi h)\right]$$
Using the fact that $h = 1/(n+1)$, we have
$$\sin((n+1)k\pi h) = \sin(k\pi) = 0$$

Thus,
$$\lambda_k x_n^{(k)} = \sin(nk\pi h)\cos(k\pi h) = \frac{1}{2}\sin((n-1)k\pi h) = \text{last element of vector } Jx^{(k)}$$
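All three row calculations can be confirmed numerically in one shot. A minimal sketch, assuming NumPy; the size $n = 10$ is an illustrative choice.

```python
# Numerical check that x^(k) is an eigenvector of J with eigenvalue
# cos(k*pi*h) for every k; n = 10 is an illustrative choice.
import numpy as np

n = 10
h = 1.0 / (n + 1)
J = 0.5 * (np.eye(n, k=1) + np.eye(n, k=-1))   # (1/2) tridiag(1, 0, 1)

for k in range(1, n + 1):
    x = np.sin(np.arange(1, n + 1) * k * np.pi * h)
    assert np.allclose(J @ x, np.cos(k * np.pi * h) * x)
print("J x^(k) = cos(k pi h) x^(k) verified for k = 1, ..., n")
```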

(b) Will the Jacobi iterations converge? Justify your answer. (2 points)
Solution: Since
$$|\cos(k\pi h)| = \left|\cos\left(\frac{k\pi}{n+1}\right)\right| < 1$$
for $k = 1, 2, \ldots, n$, it follows that
$$\rho[J] = \rho\left[S^{-1}T\right] < 1$$
and the necessary and sufficient condition for convergence of the Jacobi iterations is satisfied. Thus, the Jacobi iterations will converge starting from ANY initial guess.
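Note that $\rho[J] = \cos(\pi h) = \cos(\pi/(n+1))$ approaches 1 as $n$ grows, so convergence slows on fine grids. A minimal sketch (not part of the original solution key) confirming the spectral radius numerically; the values of $n$ are illustrative.

```python
# Spectral radius of the Jacobi matrix J for a few grid sizes; it matches
# cos(pi/(n+1)) and creeps toward 1 as n grows. The n values are illustrative.
import numpy as np

for n in (5, 20, 100):
    J = 0.5 * (np.eye(n, k=1) + np.eye(n, k=-1))
    rho = max(abs(np.linalg.eigvals(J)))
    print(f"n = {n:3d}: rho(J) = {rho:.6f}, cos(pi/(n+1)) = {np.cos(np.pi / (n + 1)):.6f}")
```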

Additional Information
Theorem 1 A sufficient condition for the convergence of the Jacobi and Gauss-Seidel methods is that the matrix A of the linear system Ax = b is strictly diagonally dominant.

Theorem 2 The Gauss-Seidel iterations converge if matrix A is symmetric and positive definite.

Theorem 3 For an arbitrary matrix A, a necessary condition for the convergence of the relaxation method is $0 < \omega < 2$.

Theorem 4 When matrix A is strictly diagonally dominant, a sufficient condition for the convergence of the relaxation method is that $0 < \omega \leq 1$.

Theorem 5 For a symmetric and positive definite matrix A, the relaxation method converges if and only if $0 < \omega < 2$.
