1. If $A$ is a $3 \times 3$ matrix with 3 pivot positions, then there are elementary matrices $E_1, \dots, E_p$ such that $E_p \cdots E_1 A = I_3$.

2. If $A$ is a $3 \times 3$ matrix and $Ax = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}$ has a unique solution, then $A$ is invertible.
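Statement 1 can be checked numerically. Below is a small NumPy sketch (the matrix and the helper name are my own, not from the notes): row-reduce one invertible $3 \times 3$ matrix to $I$, recording each row operation as an elementary matrix, then verify that the product of those elementary matrices times $A$ is $I_3$.

```python
import numpy as np

def elementary_ops_to_identity(A):
    """Row-reduce the square matrix A to I, returning the list of
    elementary matrices E1, ..., Ep in the order they were applied."""
    n = A.shape[0]
    M = A.astype(float).copy()
    ops = []
    for j in range(n):
        # Swap in a nonzero pivot if needed (an interchange matrix).
        if M[j, j] == 0:
            k = j + next(i for i in range(n - j) if M[j + i, j] != 0)
            E = np.eye(n)
            E[[j, k]] = E[[k, j]]
            ops.append(E); M = E @ M
        # Scale the pivot row so the pivot becomes 1 (a scaling matrix).
        E = np.eye(n); E[j, j] = 1.0 / M[j, j]
        ops.append(E); M = E @ M
        # Eliminate the other entries in the pivot column (replacements).
        for i in range(n):
            if i != j and M[i, j] != 0:
                E = np.eye(n); E[i, j] = -M[i, j]
                ops.append(E); M = E @ M
    return ops

A = np.array([[2., 1., 1.], [4., 3., 3.], [8., 7., 9.]])  # has 3 pivots
Es = elementary_ops_to_identity(A)
P = np.eye(3)
for E in Es:                     # form the product Ep ... E2 E1
    P = E @ P
print(np.allclose(P @ A, np.eye(3)))   # True
```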
Partitions and Multiplication of Matrices
Suppose $A$, $B$ are the correct sizes for the product $AB$ to be defined. If we partition $A$ and $B$ into blocks ("submatrices") so that the column partition sizes of $A$ match the row partition sizes of $B$, then we can compute the product $AB$ by using the row-column rule on the blocks as if the blocks were scalars.
In this example, $A$ is $3 \times 4$ and $B$ is $4 \times 4$. The column partition sizes of $A$ are $3, 1$; the row partition sizes of $B$ match: $3, 1$. ($A$ is also partitioned into rows of sizes $2, 1$, and $B$ into columns of sizes $2, 2$.)

$$
A = \left[\begin{array}{ccc|c}
* & * & * & * \\
* & * & * & * \\ \hline
* & * & * & *
\end{array}\right]_{3 \times 4},
\qquad
B = \left[\begin{array}{cc|cc}
* & * & * & * \\
* & * & * & * \\
* & * & * & * \\ \hline
* & * & * & *
\end{array}\right]_{4 \times 4}
$$

Writing the partitioned matrices in block form and using the row-column rule on the blocks:

$$
\begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}
\cdot
\begin{bmatrix} B_{11} & B_{12} \\ B_{21} & B_{22} \end{bmatrix}
=
\begin{bmatrix}
\underbrace{A_{11}B_{11} + A_{12}B_{21}}_{2 \times 2} & \underbrace{A_{11}B_{12} + A_{12}B_{22}}_{2 \times 2} \\
\underbrace{A_{21}B_{11} + A_{22}B_{21}}_{1 \times 2} & \underbrace{A_{21}B_{12} + A_{22}B_{22}}_{1 \times 2}
\end{bmatrix}
$$

Other partitions are possible, and the product can also be computed as an ordinary $3 \times 4$ times $4 \times 4$ product without partitioning; every choice gives the same matrix $AB$.
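As a quick numerical sanity check of the block rule (my own example, not from the notes): partition a $3 \times 4$ matrix $A$ with row sizes $2, 1$ and column sizes $3, 1$, and a $4 \times 4$ matrix $B$ with row sizes $3, 1$ and column sizes $2, 2$, then compare the block formula with the ordinary product.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(0, 10, (3, 4)).astype(float)   # 3 x 4
B = rng.integers(0, 10, (4, 4)).astype(float)   # 4 x 4

# Column partition of A (sizes 3, 1) matches row partition of B (sizes 3, 1).
A11, A12 = A[:2, :3], A[:2, 3:]   # row partition of A: sizes 2, 1
A21, A22 = A[2:, :3], A[2:, 3:]
B11, B12 = B[:3, :2], B[:3, 2:]   # column partition of B: sizes 2, 2
B21, B22 = B[3:, :2], B[3:, 2:]

# Row-column rule on the blocks, as if the blocks were scalars.
blockAB = np.block([
    [A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
    [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22],
])
print(np.allclose(blockAB, A @ B))   # True
```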
Example  A matrix-vector product can be thought of as a product of partitioned matrices. (The point is not that you should now compute matrix-vector products in a different way; it is just to see how the new idea of multiplying partitioned matrices by blocks is related to products we have worked with before.)
Suppose $A$ is $2 \times 3$ and $b = \begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix}$ is a vector from $\mathbb{R}^3$.

$$
Ab = \begin{bmatrix} * & * & * \\ * & * & * \end{bmatrix}
\begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix}
= b_1 a_1 + b_2 a_2 + b_3 a_3
$$

(a linear combination of the columns of $A$ with the $b_i$'s as weights, a vector in $\mathbb{R}^2$)

Viewed as a partitioned product, with $A$ partitioned into its columns and $b$ into its individual entries:

$$
[\, a_1 \mid a_2 \mid a_3 \,]
\begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix}
= [\, b_1 a_1 + b_2 a_2 + b_3 a_3 \,]
$$
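In NumPy this looks like the following (a sketch with made-up numbers): the weighted sum of the columns of $A$ agrees with the usual matrix-vector product.

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.]])        # 2 x 3
b = np.array([10., 20., 30.])       # vector in R^3

# b1*a1 + b2*a2 + b3*a3: a linear combination of the columns of A
combo = sum(b[i] * A[:, i] for i in range(3))
print(combo)        # [140. 320.]
print(A @ b)        # the same vector in R^2
```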
The "block matrices" are

$$
\underbrace{\begin{bmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{bmatrix}}_{A}
\cdot
\underbrace{\begin{bmatrix} 7 & 8 & 9 & 10 \\ 11 & 12 & 13 & 14 \end{bmatrix}}_{B}
=
\underbrace{\begin{bmatrix} * & * & * & * \\ * & * & * & * \\ * & * & * & * \end{bmatrix}}_{D}
$$
Here $A$ is partitioned into its individual columns (the column partition sizes are $1, 1$), and $B$ is partitioned into its individual rows (the row partition sizes are $1, 1$).
$$
C_1 = \begin{bmatrix} 1 \\ 3 \\ 5 \end{bmatrix}, \quad
R_1 = [\, 7 \;\; 8 \;\; 9 \;\; 10 \,], \qquad
C_2 = \begin{bmatrix} 2 \\ 4 \\ 6 \end{bmatrix}, \quad
R_2 = [\, 11 \;\; 12 \;\; 13 \;\; 14 \,]
$$

$$
D = AB = [\, C_1 \;\; C_2 \,] \begin{bmatrix} R_1 \\ R_2 \end{bmatrix}
= \underbrace{C_1 R_1}_{3 \times 4} + \underbrace{C_2 R_2}_{3 \times 4}
$$

$$
= \begin{bmatrix} 29 & 32 & 35 & 38 \\ 65 & 72 & 79 & 86 \\ 101 & 112 & 123 & 134 \end{bmatrix} = AB
$$
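The same computation in NumPy: each column-times-row product $C_k R_k$ is an outer product, and their sum is $AB$.

```python
import numpy as np

A = np.array([[1, 2], [3, 4], [5, 6]])
B = np.array([[7, 8, 9, 10], [11, 12, 13, 14]])

C1, C2 = A[:, 0], A[:, 1]    # columns of A
R1, R2 = B[0, :], B[1, :]    # rows of B

D = np.outer(C1, R1) + np.outer(C2, R2)   # C1 R1 + C2 R2, each 3 x 4
print(D)
# [[ 29  32  35  38]
#  [ 65  72  79  86]
#  [101 112 123 134]]
print(np.array_equal(D, A @ B))   # True
```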
Look now at the "same" example, but with more rows and columns. Watch how $AB$ is computed by the "same" formula as in the preceding example (**).

Suppose $A$ is $m \times n$, partitioned into its $n$ columns $C_1, \dots, C_n$, and $B$ is $n \times p$, partitioned into its $n$ rows $R_1, \dots, R_n$. Then, just as before,

$$
AB = C_1 R_1 + C_2 R_2 + \cdots + C_n R_n .
$$
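A quick numerical check of this column-row expansion for general sizes (random matrices of my own choosing):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, p = 5, 3, 4
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, p))

# Sum of n outer products: (column k of A) times (row k of B).
S = sum(np.outer(A[:, k], B[k, :]) for k in range(n))
print(np.allclose(S, A @ B))   # True
```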
Partitioning matrices into smaller blocks for multiplication can also give theoretical insight.
The following are the highlights of what was done in class. You need to read the text and the notes about LU decompositions that will be posted in the syllabus for additional explanation and examples.
$A = LU$, where

    $U$ is an echelon form of $A$ (so $U$ is also $m \times n$),
    $L$ is square ($m \times m$), lower triangular, with $1$'s on the diagonal,

so $L$ looks like

$$
L = \begin{bmatrix}
1 & 0 & 0 & \cdots & 0 \\
* & 1 & 0 & & \vdots \\
* & * & 1 & & 0 \\
* & * & * & \cdots & 1
\end{bmatrix}
$$
When $A$ is large, this can be useful if we need to solve $Ax = b$ many times, changing only the vector $b$ each time. Rather than row reduce $[A \mid b]$ many times, the row reduction information can sometimes be "coded" once and for all into $L$ and $U$. More details about finding $L, U$ later.
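A minimal sketch of this factor-once, solve-many idea in NumPy, assuming $A$ is square and no zero pivot is encountered (so no row swaps are needed); the function names are my own, not a standard API:

```python
import numpy as np

def lu_no_pivot(A):
    """Doolittle LU factorization (no row swaps): A = L @ U, with 1's
    on the diagonal of L. Assumes every pivot is nonzero."""
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float).copy()
    for j in range(n - 1):
        for i in range(j + 1, n):
            L[i, j] = U[i, j] / U[j, j]     # multiplier for the row op
            U[i, :] -= L[i, j] * U[j, :]    # row replacement on U
    return L, U

def solve_lu(L, U, b):
    """Solve Ax = b as Ly = b (forward substitution),
    then Ux = y (backward substitution)."""
    n = len(b)
    y = np.zeros(n)
    for i in range(n):                  # work down from the top
        y[i] = b[i] - L[i, :i] @ y[:i]
    x = np.zeros(n)
    for i in reversed(range(n)):        # work up from the bottom
        x[i] = (y[i] - U[i, i+1:] @ x[i+1:]) / U[i, i]
    return x

A = np.array([[2., 1., 1.], [4., 3., 3.], [8., 7., 9.]])
L, U = lu_no_pivot(A)                       # factor once...
for b in ([1., 1., 0.], [0., 1., 2.]):      # ...solve for many b's
    x = solve_lu(L, U, np.array(b))
    print(np.allclose(A @ x, b))            # True, True
```

Each new right-hand side $b$ costs only two triangular solves, not a full row reduction of $[A \mid b]$.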
Solve $Ax = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}$ using this factorization:

$$
Ax = L(Ux) = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}
$$

Substitute $y = Ux$ to get 2 equations:

$$
Ly = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}, \qquad y = Ux .
$$
Solve the 1st equation for $y$ (easy because $L$ is lower triangular; fill in the details, working down from the top equation: "forward substitution").

Then solve $y = Ux$ for $x$ (easy because $U$ is an echelon form; fill in the details, working up from the bottom equation: "backward substitution"):

$$
Ux = \begin{bmatrix} 2 & 1 \\ 0 & 2 \\ 0 & 0 \end{bmatrix} x = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}
$$
The point is that each of the two solves is relatively easy because of the special forms that $L$ and $U$ have.
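For the displayed system $Ux = (1, 1, 0)$, backward substitution can be carried out directly (a sketch; the code is mine, and exact fractions are used to avoid rounding):

```python
import numpy as np
from fractions import Fraction

# U x = (1, 1, 0): an echelon-form system with a zero bottom row.
#   2*x1 + 1*x2 = 1
#          2*x2 = 1
#             0 = 0   (consistent, so a solution exists)

x2 = Fraction(1, 2)        # from the 2nd equation: 2*x2 = 1
x1 = (1 - x2) / 2          # back-substitute into the 1st equation
print(x1, x2)              # 1/4 1/2

U = np.array([[2., 1.], [0., 2.], [0., 0.]])
print(np.allclose(U @ [float(x1), float(x2)], [1., 1., 0.]))   # True
```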