
Today we will discuss the following topics.

1. Linear independence/dependence.
2. Rank of a matrix.
3. Vector space.
4. Basis.
5. Dimension.
Let us start with the following system of equations:
$$a_1 x + b_1 y = c_1 \qquad (1)$$
$$a_2 x + b_2 y = c_2$$
All of you may already know when the above system of equations has a unique
solution, infinitely many solutions, and so on.
[Find the answer to this question in your books.]
I will use a (probably) "new" word for explaining the above system of
equations. I will say the following.
The system will have infinitely many solutions if the vectors
$(a_1, b_1, c_1)$ and $(a_2, b_2, c_2)$ are linearly
dependent. This is one way of saying what you knew earlier. Many of
you may already know this, but let us revise and understand what is meant by
independent vectors.
More generally, let $v_1, v_2, v_3, \ldots, v_n$ be vectors. What is meant by linear
independence of these vectors?
We say vectors $v_1, v_2, v_3, \ldots, v_n$ are linearly independent if, whenever
$c_1 v_1 + c_2 v_2 + \cdots + c_n v_n = 0$ for constants $c_1, c_2, \ldots, c_n$,
we must have $c_i = 0$ for each $i$.
Suppose one of these constants is not zero, say $c_1 \neq 0$; then what happens?
(Hint: can you express $v_1$ in terms of the other vectors?) This means the vectors
are dependent.
Let us take an example:
$$2x + 3y = 5 \qquad (2)$$
$$4x + 6y = 10$$
Then $v_1 := (2, 3, 5)$ and $v_2 := (4, 6, 10)$. We see that $v_2 = 2 v_1$. Therefore $v_1$
and $v_2$ are dependent and we can discard the second equation.
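This dependence can be checked numerically; the following is a small sketch using NumPy (not part of the original notes), with the vectors taken from the example above:

```python
import numpy as np

# Row vectors of the example system: 2x + 3y = 5 and 4x + 6y = 10.
v1 = np.array([2, 3, 5])
v2 = np.array([4, 6, 10])

# Stack them as rows; the matrix rank counts the linearly
# independent rows.
A = np.vstack([v1, v2])
print(np.linalg.matrix_rank(A))  # 1: the rows are dependent
print(np.allclose(v2, 2 * v1))   # True: v2 = 2 * v1
```

Since the rank is 1 rather than 2, only one of the two equations carries information.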
I am not sure whether you get it or not; overall, I want to say that if $v_1$ and $v_2$ are
two vectors which are independent, then both corresponding equations
are required to understand the system.
In other words, in the case of linear dependence, we can get rid of some
vectors until we arrive at a linearly independent set that is optimal to work
with.
It is smallest possible in the sense that it consists of the really essential
vectors, which can no longer be expressed linearly in terms of the others. This
smallest set is called a basis.
I will discuss the basis more precisely later, but for now, in our
context, a basis is:
the smallest possible set of vectors which are linearly independent
and which are required to understand the corresponding system of linear equations.
Rank of a matrix. The rank of a matrix is the maximum number of
linearly independent row vectors or column vectors of the matrix. It is
denoted by rank(A).
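The rank is easy to compute in practice; here is a minimal sketch using NumPy, with a hypothetical matrix chosen so that its third row is the sum of the first two:

```python
import numpy as np

# Illustrative 3x3 matrix: row 3 = row 1 + row 2, so at most
# two of the three rows are linearly independent.
A = np.array([[1, 2, 3],
              [0, 1, 4],
              [1, 3, 7]])
print(np.linalg.matrix_rank(A))  # 2
```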
Vector space
Before explaining vector space, suppose $u$, $v$ and $w$ are vectors in $\mathbb{R}^n$
and $s, t \in \mathbb{R}$. Then we see that
1. v + w = w + v
2. (u + v) + w = u + (v + w)
3. v + 0 = v
4. v + (−v) = 0
5. s(v + w) = sv + sw
6. (s + t)v = sv + tv
7. s(tv) = (st)v
8. $1 \cdot v = v$.
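These rules can be spot-checked numerically for vectors in $\mathbb{R}^3$ (a sanity check, not a proof; the specific vectors and scalars below are arbitrary choices, not from the notes):

```python
import numpy as np

# Arbitrary vectors in R^3 and scalars, chosen for illustration.
rng = np.random.default_rng(0)
u, v, w = rng.random(3), rng.random(3), rng.random(3)
s, t = 2.0, -1.5

print(np.allclose(v + w, w + v))                # rule 1: v + w = w + v
print(np.allclose((u + v) + w, u + (v + w)))    # rule 2: associativity
print(np.allclose(s * (v + w), s * v + s * w))  # rule 5: distributivity
print(np.allclose((s + t) * v, s * v + t * v))  # rule 6
print(np.allclose(1 * v, v))                    # rule 8: identity
```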
All vectors from $\mathbb{R}^n$ satisfy the above properties with the usual addition and scalar
multiplication. Now let us define a vector space.
A vector space is a nonempty set $V$ of vectors such that for any two
vectors $v, w \in V$, all their linear combinations $\alpha v + \beta w$ lie in $V$, and the vectors
from $V$ satisfy the set of rules mentioned above.
Let $A$ be a subset of $V$; then the linear span of $A$ is the set of all possible linear combinations
of vectors from $A$. It is denoted by $LS(A)$. Examples are discussed
in the class.
A subset $A$ is called a basis if
1. $LS(A) = V$, and
2. the vectors in $A$ are linearly independent.
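For a finite subset of $\mathbb{R}^n$, both conditions can be tested at once with a rank computation. The helper below is a hypothetical sketch (the function name and the candidate sets are my own choices, not from the notes), stacking the vectors as rows of a matrix:

```python
import numpy as np

def is_basis(vectors, n):
    """Check whether `vectors` (as rows) form a basis of R^n:
    exactly n vectors of length n, with rank n (so they are
    linearly independent and hence span R^n)."""
    A = np.array(vectors)
    return A.shape == (n, n) and np.linalg.matrix_rank(A) == n

print(is_basis([[1, 0, 0], [1, 1, 0], [1, 1, 1]], 3))  # True
print(is_basis([[1, 0, 0], [2, 0, 0], [0, 0, 1]], 3))  # False: first two rows dependent
```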