
THE UNIVERSITY OF HONG KONG

DEPARTMENT OF MATHEMATICS

MATH2101 Linear Algebra I


Tutorial 1 solution

1. The matrix A is symmetric if and only if A = A^T. This is equivalent to
\[
\begin{cases}
x + y = 0,\\
x^2 = y + 6,\\
z = 2z - 1.
\end{cases}
\]

The last equation implies z = 1. From the first equation, we have y = −x. Substituting into
the second equation, we obtain x^2 = y + 6 = −x + 6. The roots of the quadratic equation
x^2 + x − 6 = 0 are x = 2 and x = −3. Correspondingly, we get the solutions (x, y, z) = (2, −2, 1)
and (x, y, z) = (−3, 3, 1). In these cases, we have
\[
A = \begin{bmatrix} -2 & 0 & 4 \\ 0 & 0 & 1 \\ 4 & 1 & -3 \end{bmatrix}
\quad\text{or}\quad
A = \begin{bmatrix} 3 & 0 & 9 \\ 0 & 0 & 1 \\ 9 & 1 & 2 \end{bmatrix},
\]
both of which are symmetric.
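As an optional sanity check (not part of the original solution), the two matrices can be verified to be symmetric numerically; the sketch below assumes NumPy is available.

```python
import numpy as np

# The matrices corresponding to (x, y, z) = (2, -2, 1) and (-3, 3, 1).
A1 = np.array([[-2, 0,  4],
               [ 0, 0,  1],
               [ 4, 1, -3]])
A2 = np.array([[ 3, 0,  9],
               [ 0, 0,  1],
               [ 9, 1,  2]])

# A matrix is symmetric exactly when it equals its transpose.
print(np.array_equal(A1, A1.T))  # True
print(np.array_equal(A2, A2.T))  # True
```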
2. Suppose A + B = kB for some scalar k. This can be rewritten as A = (k − 1)B. Since A is not
the zero matrix, we have k ≠ 1, and hence B = (1/(k − 1))A. This shows B is a scalar multiple of A.
Conversely, suppose B is a nonzero scalar multiple of A, i.e. B = rA for some nonzero scalar
r. Then we have
\[
A + B = \frac{1}{r}B + B = \left(\frac{1}{r} + 1\right)B,
\]
which shows that A + B is a scalar multiple of B.
For the remaining case where B is the zero matrix, we have A + B = A ≠ O, and hence it is
not a scalar multiple of B.
To conclude, B is of the form
\[
B = \begin{bmatrix} r & r \\ r & r \end{bmatrix}
\]
for some r ≠ 0.
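As an illustration (assuming, as the stated conclusion suggests, that A is the 2 × 2 all-ones matrix), one can check numerically that A + B is a scalar multiple of B whenever B = rA with r ≠ 0:

```python
import numpy as np

A = np.ones((2, 2))   # assumed form of A, inferred from the conclusion B = [[r, r], [r, r]]
r = 3.0               # any nonzero scalar
B = r * A

# A + B should equal (1/r + 1) * B, i.e. a scalar multiple of B.
print(np.allclose(A + B, (1 / r + 1) * B))  # True
```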

3. (a) The equations combine to give 2A = A + A^T = O. Hence A = O is a zero matrix. In order
for the equations to be well-defined, A and A^T should be of the same size, i.e. A is a
square matrix. This shows A can be any zero matrix of size n × n, which clearly satisfies
both conditions.
(b) (Existence)
Let S = (1/2)(A + A^T) and K = (1/2)(A − A^T). Clearly, A = S + K, and S is symmetric since
\[
S^T = \left[\frac{1}{2}(A + A^T)\right]^T = \frac{1}{2}(A + A^T)^T = \frac{1}{2}(A^T + A) = S.
\]
Also K is skew-symmetric since
\[
K^T = \left[\frac{1}{2}(A - A^T)\right]^T = \frac{1}{2}(A - A^T)^T = \frac{1}{2}(A^T - A) = -K.
\]
(Uniqueness)
Suppose there exist symmetric matrices S and S′ and skew-symmetric matrices K and
K′ such that A = S + K = S′ + K′. Then we have S − S′ = K′ − K. Note that

\[
(S - S')^T = S^T - (S')^T = S - S',
\]

so S − S ′ is symmetric. On the other hand, we have

\[
(K' - K)^T = (K')^T - K^T = -K' - (-K) = -(K' - K),
\]

so K′ − K is skew-symmetric. Let B = S − S′ = K′ − K. Then B^T = B and B + B^T = O.


It follows from (a) that B = O, and hence we must have S = S′ and K = K′, proving the
uniqueness of S and K.
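A minimal numerical sketch of the decomposition in (b): for an arbitrary square matrix A, the matrices S = (1/2)(A + A^T) and K = (1/2)(A − A^T) are symmetric and skew-symmetric and sum to A. The random matrix below is for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-5, 6, size=(3, 3)).astype(float)  # an arbitrary square matrix

S = (A + A.T) / 2   # symmetric part
K = (A - A.T) / 2   # skew-symmetric part

print(np.allclose(S, S.T))    # True: S is symmetric
print(np.allclose(K, -K.T))   # True: K is skew-symmetric
print(np.allclose(S + K, A))  # True: A = S + K
```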
4. (a) No example exists. Suppose the matrix is n × n. Since there are 10 entries, we have
n^2 ≥ 10 and hence n ≥ 4. As the matrix is a diagonal matrix, there should be at least
2(1 + 2 + ⋯ + (n − 1)) = n(n − 1) ≥ 12 zero entries. This is a contradiction.
(b) A = \begin{bmatrix} 0 & 0 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}. It is easy to see that Ae_1 = e_2.
(c) One example is
\[
\begin{cases}
x = 1,\\
y = 2,\\
x + y = 3.
\end{cases}
\]
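The examples in parts (b) and (c) can also be checked numerically; this is only an illustrative sketch, not part of the solution.

```python
import numpy as np

# Part (b): A sends e1 to e2.
A = np.array([[0, 0, 0],
              [1, 0, 0],
              [0, 0, 0]])
e1 = np.array([1, 0, 0])
e2 = np.array([0, 1, 0])
print(np.array_equal(A @ e1, e2))  # True

# Part (c): (x, y) = (1, 2) satisfies x = 1, y = 2 and x + y = 3.
C = np.array([[1, 0],
              [0, 1],
              [1, 1]])
b = np.array([1, 2, 3])
print(np.array_equal(C @ np.array([1, 2]), b))  # True
```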

5. (a) True. Let A = \begin{bmatrix} 1 & -1 \\ 0 & 0 \end{bmatrix}. Then A(e_1 + e_2) = A \begin{bmatrix} 1 \\ 1 \end{bmatrix} = 0.
(b) False. For example, consider A = \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}. For any v = \begin{bmatrix} v_1 \\ v_2 \end{bmatrix} ∈ \mathbb{R}^2, we have
\[
Av = \begin{bmatrix} v_1 \\ 0 \end{bmatrix}
\quad\text{and}\quad
A(Av) = A \begin{bmatrix} v_1 \\ 0 \end{bmatrix} = \begin{bmatrix} v_1 \\ 0 \end{bmatrix} = Av.
\]
But A is different from I and O.
(c) False. Consider the 2 × 3 zero matrix. It is in reduced row-echelon form. However, it has
2 zero rows but 3 zero columns.
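The counterexamples in parts (a) and (b) can likewise be verified with a short numerical check (an illustration only, assuming NumPy):

```python
import numpy as np

# Part (a): a nonzero A and a nonzero vector v = e1 + e2 with Av = 0.
A = np.array([[1, -1],
              [0,  0]])
v = np.array([1, 1])                       # e1 + e2
print(np.array_equal(A @ v, np.zeros(2)))  # True

# Part (b): A(Av) = Av for every v (equivalently A @ A = A),
# yet A is neither the identity nor the zero matrix.
A = np.array([[1, 0],
              [0, 0]])
print(np.array_equal(A @ A, A))                 # True
print(np.array_equal(A, np.eye(2, dtype=int)))  # False
print(np.array_equal(A, np.zeros((2, 2))))      # False
```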
