
Orthogonal Matrices

Notes by Vanchinathan

DEFINITION We call a matrix A an orthogonal matrix if

all its columns are unit vectors and any two column vectors are
perpendicular (i.e., their dot product is zero).
One can easily check that this happens for a matrix Q iff
Q^{-1} = Q^t.
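As a quick numerical sketch (not part of the notes themselves), the equivalence can be checked with NumPy: the columns of Q are orthonormal exactly when the Gram matrix Q^t Q is the identity, which in turn means Q^t is the inverse of Q.

```python
import numpy as np

# An example orthogonal matrix: a rotation by an arbitrary angle.
theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Columns orthonormal  <=>  Q^t Q = I.
assert np.allclose(Q.T @ Q, np.eye(2))

# Hence the transpose equals the inverse: Q^{-1} = Q^t.
assert np.allclose(np.linalg.inv(Q), Q.T)
```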

Example: Rotation matrices and reflection matrices are orthogonal matrices:
\[
R_\theta = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}, \qquad
S_\theta = \begin{pmatrix} \cos 2\theta & \sin 2\theta \\ \sin 2\theta & -\cos 2\theta \end{pmatrix}
\]
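A short sketch verifying this claim numerically for a few angles; the names R and S below just mirror the matrices above.

```python
import numpy as np

def R(t):
    # Rotation by angle t.
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

def S(t):
    # Reflection matrix with parameter t, as in the notes.
    return np.array([[np.cos(2 * t),  np.sin(2 * t)],
                     [np.sin(2 * t), -np.cos(2 * t)]])

# Both families satisfy Q^t Q = I for every angle.
for t in (0.0, 0.3, 1.2):
    assert np.allclose(R(t).T @ R(t), np.eye(2))
    assert np.allclose(S(t).T @ S(t), np.eye(2))
```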

Take the identity matrix: any column is a unit vector and


any two column vectors have dot product zero.
So if we interchange columns any number of times, these dot
products do not change. So the following 4 × 4 matrix,
for example, is also an orthogonal matrix: P = (e3 | e1 | e4 | e2).
Such matrices are called permutation matrices.
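To illustrate, here is the matrix P = (e3 | e1 | e4 | e2) built from the standard basis vectors of R^4, with a check that it is orthogonal; this is just a numerical sketch.

```python
import numpy as np

I = np.eye(4)
# Columns of P are e3, e1, e4, e2 (standard basis vectors, 1-indexed).
P = np.column_stack([I[:, 2], I[:, 0], I[:, 3], I[:, 1]])

# Permuting the columns of the identity keeps them orthonormal.
assert np.allclose(P.T @ P, np.eye(4))
```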
One can combine both kinds and create orthogonal matrices
of any size:
\[
Q = \begin{pmatrix}
\cos x & -\sin x & 0 & 0 & 0 \\
\sin x & \cos x & 0 & 0 & 0 \\
0 & 0 & 0 & 1 & 0 \\
0 & 0 & 0 & 0 & 1 \\
0 & 0 & 1 & 0 & 0
\end{pmatrix}
\]
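The same combined matrix can be assembled block by block and checked numerically; a sketch, with x an arbitrary angle:

```python
import numpy as np

x = 0.5  # arbitrary angle
Q = np.zeros((5, 5))
# Rotation block in the first two coordinates.
Q[:2, :2] = [[np.cos(x), -np.sin(x)],
             [np.sin(x),  np.cos(x)]]
# Permutation block on the last three coordinates.
Q[2, 3] = Q[3, 4] = Q[4, 2] = 1.0

# Combining the two kinds still gives an orthogonal matrix.
assert np.allclose(Q.T @ Q, np.eye(5))
```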
Properties of Orthogonal Matrices:
When we regard an orthogonal matrix Q as a linear transfor-
mation the following property makes it very special.

(i) For any vector v, the length of Qv is the same as the
length of v.
(ii) For any two vectors v, w the angle between Qv and Qw
is the same as the angle between v and w.
In words, “an orthogonal transformation preserves lengths
and angles”.
It is easy to see that the angle between u = (1, 1)^t and v = (0, 1)^t
is 45◦.
Now multiply these two vectors by some random 2 × 2 matrix
A. The angle between Au and Av is unlikely to be the
same.
Take
\[
A = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}.
\]
Then Au = (2, 1)^t, Av = (1, 1)^t, and their angle is
\(\cos^{-1}\frac{3}{\sqrt{10}}\), which is not 45◦.
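The computation above can be checked numerically, along with the contrast to an orthogonal matrix (here a 90° rotation, chosen as an illustration):

```python
import numpy as np

def angle_deg(a, b):
    # Angle between vectors a and b, in degrees.
    c = (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(c))

u = np.array([1.0, 1.0])
v = np.array([0.0, 1.0])
assert np.isclose(angle_deg(u, v), 45.0)

# A non-orthogonal matrix changes the angle: arccos(3/sqrt(10)) is about 18.4 degrees.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
assert not np.isclose(angle_deg(A @ u, A @ v), 45.0)

# An orthogonal matrix (rotation by 90 degrees) preserves it.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])
assert np.isclose(angle_deg(R @ u, R @ v), 45.0)
```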
