
 

Module 4 : Solving Linear Algebraic Equations



4 Direct Methods for Solving Sparse Linear Systems

A system of linear equations

$Ax = b$ -------(32)

is called sparse if only a relatively small number of its matrix elements ($a_{ij}$) are nonzero. The sparse patterns that frequently occur are

Tridiagonal

Band diagonal with band width M


 
Block diagonal matrices

Lower / upper triangular and block lower / upper triangular matrices

We have encountered numerous situations in the module on Problem Discretization using Approximation Theory where such matrices arise. It is wasteful to apply general direct methods to these problems. Special methods have been developed for solving such sparse systems, which can achieve considerable reductions in computation time and memory requirements. In this section, some of the sparse matrix algorithms are discussed in detail. This is meant to be only a brief introduction to sparse matrix computations, and the treatment of the topic is by no means exhaustive.
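As a quick illustration of the sparsity patterns listed above, here is a short Python/NumPy sketch that builds small dense examples of each structure (the sizes and numerical entries are arbitrary placeholders, not taken from the source):

```python
import numpy as np

n = 6

# Tridiagonal: nonzeros only on the main diagonal and the two adjacent diagonals
T = (np.diag(np.full(n, 2.0))
     + np.diag(np.full(n - 1, -1.0), k=1)
     + np.diag(np.full(n - 1, -1.0), k=-1))

# Band diagonal with band width M: a_ij = 0 whenever |i - j| > M
M = 2
B = np.array([[1.0 if abs(i - j) <= M else 0.0 for j in range(n)]
              for i in range(n)])

# Block diagonal: independent square blocks placed along the diagonal
D = np.zeros((5, 5))
D[:2, :2] = np.array([[4.0, 1.0], [1.0, 3.0]])
D[2:, 2:] = 3.0 * np.eye(3)

# Lower triangular: zero-valued elements above the main diagonal
L = np.tril(np.arange(1.0, n * n + 1).reshape(n, n))
```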

4.1 Block Diagonal Matrices

In some applications, such as solving ODE-BVP / PDE using orthogonal collocation on finite elements, we encounter equations with a block diagonal matrix, i.e.

$\begin{bmatrix} A_1 & 0 & \cdots & 0 \\ 0 & A_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & A_m \end{bmatrix} \begin{bmatrix} x^{(1)} \\ x^{(2)} \\ \vdots \\ x^{(m)} \end{bmatrix} = \begin{bmatrix} b^{(1)} \\ b^{(2)} \\ \vdots \\ b^{(m)} \end{bmatrix}$

where $A_i$ for $i = 1, 2, \ldots, m$ are square sub-matrices and $x^{(i)}$, $b^{(i)}$ for $i = 1, 2, \ldots, m$ are sub-vectors of matching dimensions. Such a system of equations can be solved by solving the following sub-problems

$A_i x^{(i)} = b^{(i)}, \quad i = 1, 2, \ldots, m$ -------(33)

where each sub-problem is solved using, say, Gaussian elimination. Defining $n_i$ as the dimension of the sub-vector $x^{(i)}$, each Gaussian elimination sub-problem requires approximately $n_i^3/3$ multiplications and divisions, whereas the number of multiplications and divisions in conventional Gaussian elimination applied to the full system of dimension $N = n_1 + n_2 + \ldots + n_m$ is approximately $N^3/3$. It can be easily shown that

$\displaystyle\sum_{i=1}^{m} \frac{n_i^{3}}{3} \ll \frac{1}{3}\left(\sum_{i=1}^{m} n_i\right)^{3} = \frac{N^{3}}{3}$

i.e. solving the sub-problems separately requires far fewer computations than treating the full system as a dense matrix.
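A minimal Python/NumPy sketch of this idea (the function name and the example blocks below are illustrative assumptions, not from the source); each diagonal block is solved independently, so the cost scales with $\sum_i n_i^3$ instead of $N^3$:

```python
import numpy as np

def solve_block_diagonal(blocks, rhs_parts):
    """Solve a block diagonal system by solving each A_i x^(i) = b^(i) independently."""
    # np.linalg.solve applies LU factorization (Gaussian elimination with pivoting) to each block
    return [np.linalg.solve(A_i, b_i) for A_i, b_i in zip(blocks, rhs_parts)]

# Example: three independent 2x2 blocks instead of one dense 6x6 system
blocks = [np.array([[4.0, 1.0], [1.0, 3.0]]) for _ in range(3)]
rhs_parts = [np.array([1.0, 2.0]) for _ in range(3)]
x_parts = solve_block_diagonal(blocks, rhs_parts)
x = np.concatenate(x_parts)  # full solution vector
```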

4.2 Thomas Algorithm for Tridiagonal and Block Tridiagonal Matrices [2]

Consider the system of equations $Ax = d$ given by

$\begin{bmatrix} b_1 & c_1 & 0 & \cdots & 0 \\ a_2 & b_2 & c_2 & \cdots & 0 \\ 0 & a_3 & b_3 & \ddots & \vdots \\ \vdots & & \ddots & \ddots & c_{n-1} \\ 0 & \cdots & 0 & a_n & b_n \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} d_1 \\ d_2 \\ \vdots \\ d_n \end{bmatrix}$ -------(34)

where matrix $A$ is a tridiagonal matrix. The Thomas algorithm is the Gaussian elimination algorithm tailored to solve this type of sparse system.

Step 1: Triangularization (forward sweep with normalization)

$\gamma_1 = \dfrac{c_1}{b_1}$ -------(35)

$\beta_1 = \dfrac{d_1}{b_1}$ -------(36)

$\gamma_k = \dfrac{c_k}{b_k - a_k \gamma_{k-1}}, \quad k = 2, 3, \ldots, n-1$ -------(37)

$\beta_k = \dfrac{d_k - a_k \beta_{k-1}}{b_k - a_k \gamma_{k-1}}, \quad k = 2, 3, \ldots, n$ -------(38)

This sequence of operations finally results in the following upper triangular system of equations

$\begin{bmatrix} 1 & \gamma_1 & 0 & \cdots & 0 \\ 0 & 1 & \gamma_2 & \cdots & 0 \\ \vdots & & \ddots & \ddots & \vdots \\ 0 & \cdots & 0 & 1 & \gamma_{n-1} \\ 0 & \cdots & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} \beta_1 \\ \beta_2 \\ \vdots \\ \beta_n \end{bmatrix}$

Step 2: Backward sweep leads to the solution vector

$x_n = \beta_n$ -------(39)

$x_k = \beta_k - \gamma_k x_{k+1}, \quad k = n-1, n-2, \ldots, 1$ -------(40)

The total number of multiplications and divisions in the Thomas algorithm is of the order of $5n$, which is significantly smaller than the $n^3/3$ operations (approximately) necessary for carrying out the Gaussian elimination and backward sweep for a dense matrix.
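The following Python/NumPy sketch implements the scalar Thomas algorithm as written in equations (35)-(40); the variable names follow the notation above, and the small check at the end is illustrative, not part of the source.

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system via the Thomas algorithm.

    a : sub-diagonal (a[0] unused), b : main diagonal,
    c : super-diagonal (c[-1] unused), d : right-hand side; all of length n.
    """
    n = len(b)
    gamma = np.zeros(n)
    beta = np.zeros(n)

    # Step 1: forward sweep with normalization, equations (35)-(38)
    gamma[0] = c[0] / b[0]
    beta[0] = d[0] / b[0]
    for k in range(1, n):
        denom = b[k] - a[k] * gamma[k - 1]
        if k < n - 1:
            gamma[k] = c[k] / denom
        beta[k] = (d[k] - a[k] * beta[k - 1]) / denom

    # Step 2: backward sweep, equations (39)-(40)
    x = np.zeros(n)
    x[-1] = beta[-1]
    for k in range(n - 2, -1, -1):
        x[k] = beta[k] - gamma[k] * x[k + 1]
    return x

# Illustrative check against a dense solve on a small system
n = 5
a = np.full(n, -1.0); b_diag = np.full(n, 2.0); c = np.full(n, -1.0); d = np.ones(n)
A = np.diag(b_diag) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
assert np.allclose(thomas(a, b_diag, c, d), np.linalg.solve(A, d))
```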

The Thomas algorithm can be easily extended to solve a system of equations that involves a block tridiagonal matrix. Consider a block tridiagonal system of the form

$\begin{bmatrix} B_1 & C_1 & 0 & \cdots & 0 \\ A_2 & B_2 & C_2 & \cdots & 0 \\ \vdots & \ddots & \ddots & \ddots & \vdots \\ 0 & \cdots & A_{n-1} & B_{n-1} & C_{n-1} \\ 0 & \cdots & 0 & A_n & B_n \end{bmatrix} \begin{bmatrix} x^{(1)} \\ x^{(2)} \\ \vdots \\ x^{(n)} \end{bmatrix} = \begin{bmatrix} d^{(1)} \\ d^{(2)} \\ \vdots \\ d^{(n)} \end{bmatrix}$ -------(41)

where $A_k$, $B_k$ and $C_k$ are square sub-matrices and $x^{(k)}$, $d^{(k)}$ represent sub-vectors of appropriate dimensions.

Step 1: Block Triangularization

$\Gamma_1 = B_1^{-1} C_1, \qquad \Gamma_k = \left(B_k - A_k \Gamma_{k-1}\right)^{-1} C_k, \quad k = 2, 3, \ldots, n-1$ -------(42)

$\beta^{(1)} = B_1^{-1} d^{(1)}, \qquad \beta^{(k)} = \left(B_k - A_k \Gamma_{k-1}\right)^{-1} \left(d^{(k)} - A_k \beta^{(k-1)}\right), \quad k = 2, 3, \ldots, n$ -------(43)

Step 2: Backward sweep

$x^{(n)} = \beta^{(n)}$ -------(44)

$x^{(k)} = \beta^{(k)} - \Gamma_k x^{(k+1)}, \quad k = n-1, n-2, \ldots, 1$ -------(45, 46)
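A sketch of the block version in the same Python/NumPy style (the function name and data layout are illustrative assumptions); np.linalg.solve is used so that the inverses appearing in equations (42)-(43) are never formed explicitly:

```python
import numpy as np

def block_thomas(A, B, C, d):
    """Solve a block tridiagonal system, equations (42)-(46).

    A : list of sub-diagonal blocks (A[0] unused)
    B : list of diagonal blocks
    C : list of super-diagonal blocks (C[-1] unused)
    d : list of right-hand-side sub-vectors
    """
    n = len(B)
    Gamma = [None] * n
    beta = [None] * n

    # Step 1: block triangularization; solve with B_k - A_k Gamma_{k-1} instead of inverting it
    Gamma[0] = np.linalg.solve(B[0], C[0])
    beta[0] = np.linalg.solve(B[0], d[0])
    for k in range(1, n):
        Bk = B[k] - A[k] @ Gamma[k - 1]
        if k < n - 1:
            Gamma[k] = np.linalg.solve(Bk, C[k])
        beta[k] = np.linalg.solve(Bk, d[k] - A[k] @ beta[k - 1])

    # Step 2: backward sweep on the sub-vectors
    x = [None] * n
    x[-1] = beta[-1]
    for k in range(n - 2, -1, -1):
        x[k] = beta[k] - Gamma[k] @ x[k + 1]
    return x
```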

4.3 Triangular and Block Triangular Matrices

A triangular matrix is a sparse matrix with zero-valued elements above or below the main diagonal. For example, a lower triangular matrix can be represented as follows

$L = \begin{bmatrix} l_{11} & 0 & \cdots & 0 \\ l_{21} & l_{22} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ l_{n1} & l_{n2} & \cdots & l_{nn} \end{bmatrix}$

To solve the system $Lx = b$, the following forward substitution algorithm is used

$x_1 = \dfrac{b_1}{l_{11}}$ -------(47)

$x_k = \dfrac{1}{l_{kk}} \left( b_k - \displaystyle\sum_{j=1}^{k-1} l_{kj} x_j \right), \quad k = 2, 3, \ldots, n$ -------(48)

The operational count, i.e. the number of multiplications and divisions, for this elimination process is

$\displaystyle\sum_{k=1}^{n} k = \frac{n(n+1)}{2}$ -------(49)

which is considerably smaller than that of Gaussian elimination for a dense matrix.
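A direct transcription of equations (47)-(48) into Python/NumPy (the function name and the small check are illustrative, not from the source):

```python
import numpy as np

def forward_substitution(L, b):
    """Solve L x = b for a lower triangular matrix L, equations (47)-(48)."""
    n = len(b)
    x = np.zeros(n)
    x[0] = b[0] / L[0, 0]
    for k in range(1, n):
        # subtract the contributions of the already-computed unknowns x_1 ... x_{k-1}
        x[k] = (b[k] - L[k, :k] @ x[:k]) / L[k, k]
    return x

# Illustrative check on a small lower triangular system
L_mat = np.tril(np.random.rand(4, 4)) + np.eye(4)
b = np.ones(4)
assert np.allclose(forward_substitution(L_mat, b), np.linalg.solve(L_mat, b))
```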

In some applications we encounter equations with block triangular matrices. For example,

$\begin{bmatrix} A_{11} & 0 & \cdots & 0 \\ A_{21} & A_{22} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ A_{m1} & A_{m2} & \cdots & A_{mm} \end{bmatrix} \begin{bmatrix} x^{(1)} \\ x^{(2)} \\ \vdots \\ x^{(m)} \end{bmatrix} = \begin{bmatrix} b^{(1)} \\ b^{(2)} \\ \vdots \\ b^{(m)} \end{bmatrix}$

where $A_{ij}$ are sub-matrices while $x^{(i)}$ and $b^{(i)}$ are sub-vectors of appropriate dimensions for $i = 1, 2, \ldots, m$.

The solution of this type of system is completely analogous to that of lower triangular systems, except that sub-matrices and sub-vectors are used in place of scalars. Thus, the equivalent algorithm for a block lower triangular matrix can be stated as follows

$x^{(1)} = A_{11}^{-1} b^{(1)}$ -------(50)

$x^{(k)} = A_{kk}^{-1} \left( b^{(k)} - \displaystyle\sum_{j=1}^{k-1} A_{kj} x^{(j)} \right), \quad k = 2, 3, \ldots, m$ -------(51)

The above form does not imply that the inverses $A_{kk}^{-1}$ should be computed explicitly. For example, we can find each $x^{(k)}$ by Gaussian elimination.
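A block forward substitution sketch in the same Python/NumPy style (the data layout is an illustrative assumption); in line with the remark above, each diagonal block is handled by np.linalg.solve, i.e. by Gaussian elimination, rather than by forming $A_{kk}^{-1}$:

```python
import numpy as np

def block_forward_substitution(A_blocks, b_parts):
    """Solve a block lower triangular system, equations (50)-(51).

    A_blocks[k][j] holds the sub-matrix in block row k, block column j (j <= k);
    b_parts[k] holds the corresponding right-hand-side sub-vector.
    """
    m = len(b_parts)
    x = []
    for k in range(m):
        rhs = b_parts[k].astype(float)
        # subtract the contributions of the previously computed sub-vectors
        for j in range(k):
            rhs -= A_blocks[k][j] @ x[j]
        # Gaussian elimination on the diagonal block instead of an explicit inverse
        x.append(np.linalg.solve(A_blocks[k][k], rhs))
    return x
```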

4.4 Solution of a Large System By Partitioning

If matrix $A$ in equation (32) is very large, then we can partition matrix $A$ and vectors $x$ and $b$ as

$A = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}, \qquad x = \begin{bmatrix} x^{(1)} \\ x^{(2)} \end{bmatrix}, \qquad b = \begin{bmatrix} b^{(1)} \\ b^{(2)} \end{bmatrix}$

where $A_{11}$ is a square matrix. This results in two equations

$A_{11} x^{(1)} + A_{12} x^{(2)} = b^{(1)}$ -------(52)

$A_{21} x^{(1)} + A_{22} x^{(2)} = b^{(2)}$ -------(53)

which can be solved sequentially as follows. From equation (52),

$x^{(1)} = A_{11}^{-1} \left( b^{(1)} - A_{12} x^{(2)} \right)$ -------(54)

Substituting this expression for $x^{(1)}$ into equation (53) gives

$\left( A_{22} - A_{21} A_{11}^{-1} A_{12} \right) x^{(2)} = b^{(2)} - A_{21} A_{11}^{-1} b^{(1)}$ -------(55)

which can be solved for $x^{(2)}$ as

$x^{(2)} = \left( A_{22} - A_{21} A_{11}^{-1} A_{12} \right)^{-1} \left( b^{(2)} - A_{21} A_{11}^{-1} b^{(1)} \right)$ -------(56)

and $x^{(1)}$ is then recovered from equation (54).
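The sequential procedure in equations (54)-(56) can be sketched in Python/NumPy as follows (the function name and the consistency check are illustrative assumptions; np.linalg.solve replaces the explicit inverses):

```python
import numpy as np

def solve_partitioned(A11, A12, A21, A22, b1, b2):
    """Solve the partitioned system (52)-(53) via equations (54)-(56)."""
    # Reusable pieces A11^{-1} A12 and A11^{-1} b1, obtained by Gaussian elimination
    A11_inv_A12 = np.linalg.solve(A11, A12)
    A11_inv_b1 = np.linalg.solve(A11, b1)

    # Equations (55)-(56): reduced system for x^(2)
    x2 = np.linalg.solve(A22 - A21 @ A11_inv_A12, b2 - A21 @ A11_inv_b1)

    # Equation (54): recover x^(1)
    x1 = A11_inv_b1 - A11_inv_A12 @ x2
    return x1, x2

# Illustrative consistency check against a dense solve
n1, n2 = 3, 2
A = np.random.rand(n1 + n2, n1 + n2) + (n1 + n2) * np.eye(n1 + n2)
b = np.random.rand(n1 + n2)
x1, x2 = solve_partitioned(A[:n1, :n1], A[:n1, n1:], A[n1:, :n1], A[n1:, n1:], b[:n1], b[n1:])
assert np.allclose(np.concatenate([x1, x2]), np.linalg.solve(A, b))
```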
It is also possible to work with a larger number of partitions, say 9, 16, ..., and solve a given large-dimensional problem in the same manner.
