Linear Algebra Optimization Techniques

Optimization is about finding the best solution by maximizing or minimizing an objective function within a set of constraints. It involves variables to adjust and a feasible region where solutions lie. Methods include analytical and numerical techniques, applied in fields like engineering, economics, and machine learning.

EC6618: Optimization Techniques

Linear Algebra
By
Prof. AJIT KUMAR SAHOO
Associate Professor (ECE)
NIT, Rourkela

Vectors and Linear Combinations

Let 𝒗 and 𝒘 be two vectors. Their sum 𝒗 + 𝒘 is formed component by component: "you can't add apples and oranges", so first components add to first components and second components to second.

 For one vector 𝒖, the only linear combinations are 𝑐𝒖 (these fill a line).
 For two vectors 𝒖 and 𝒗, the combinations 𝑐𝒖 + 𝑑𝒗 fill a plane (provided 𝒖 and 𝒗 are not parallel).
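
A minimal NumPy sketch of these combinations (the vectors 𝒖, 𝒗 and scalars c, d below are arbitrary illustrative values, not taken from the slides):

```python
# Sketch: linear combinations of vectors with NumPy (assumed example values).
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([0.0, 1.0, -1.0])
c, d = 2.0, -3.0

line_point = c * u            # combinations c*u trace out a line
plane_point = c * u + d * v   # combinations c*u + d*v fill a plane

print(line_point)    # [2. 4. 6.]
print(plane_point)   # [2. 1. 9.]
```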
Matrix

 An 𝑚 × 𝑛 matrix:

𝐴 = [𝒂𝟏 , 𝒂𝟐 , … , 𝒂𝒏 ], where
 𝒂𝟏 , 𝒂𝟐 , … , 𝒂𝒏 are the columns of matrix 𝐴.
 𝑎11 , … , 𝑎𝑖𝑗 , … , 𝑎𝑚𝑛 are the entries of 𝐴, with 𝑎𝑖𝑗 in row 𝑖 and column 𝑗.
 𝐴 is a square matrix if 𝑚 = 𝑛, otherwise it is a rectangular
matrix.
 𝐴 is a zero matrix if all of its elements are zero.
Linear Operations

 Two matrices are equal if they have the same size and if
their corresponding columns are equal.

 Sums and Scalar Multiples


• If 𝐴 and 𝐵 are 𝑚 × 𝑛 matrices and 𝑟 is a scalar:
o Sum: 𝐴 + 𝐵 = [𝒂𝟏 + 𝒃𝟏 , 𝒂𝟐 + 𝒃𝟐 , … , 𝒂𝒏 + 𝒃𝒏 ]

o Scalar multiple: 𝑟𝐴 = [𝑟𝒂𝟏 , 𝑟𝒂𝟐 , … , 𝑟𝒂𝒏 ]


Properties of Linear Operations
 Let 𝐴, 𝐵 and 𝐶 be matrices of the same size, and let 𝑟 and 𝑠
be scalars, then

• 𝐴 + 𝐵 = 𝐵 + 𝐴
• (𝐴 + 𝐵) + 𝐶 = 𝐴 + (𝐵 + 𝐶)
• 𝐴 + 0 = 𝐴
• 𝑟(𝐴 + 𝐵) = 𝑟𝐴 + 𝑟𝐵
• (𝑟 + 𝑠)𝐴 = 𝑟𝐴 + 𝑠𝐴
• 𝑟(𝑠𝐴) = (𝑟𝑠)𝐴
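
A short NumPy sketch of the sum, the scalar multiple, and one of the properties above (the matrices A, B and scalar r are assumed example values):

```python
# Sketch: matrix sum, scalar multiple, and the property r(A + B) = rA + rB
# (assumed example matrices).
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[0, 5], [6, 7]])
r = 3

print(A + B)                                    # entrywise (column-by-column) sum
print(r * A)                                    # scalar multiple
print(np.allclose(r * (A + B), r * A + r * B))  # True
```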
Matrix Multiplications
 If 𝐴 is an 𝑚 × 𝑛 matrix and 𝐵 is an 𝑛 × 𝑝 matrix,

$$A=\begin{bmatrix} a_{11} & \cdots & a_{1n}\\ \vdots & \ddots & \vdots\\ a_{m1} & \cdots & a_{mn}\end{bmatrix},\qquad B=\begin{bmatrix} b_{11} & \cdots & b_{1p}\\ \vdots & \ddots & \vdots\\ b_{n1} & \cdots & b_{np}\end{bmatrix}$$

 The matrix product 𝐶 = 𝐴𝐵 is defined to be an 𝑚 × 𝑝 matrix

$$C=\begin{bmatrix} c_{11} & \cdots & c_{1p}\\ \vdots & \ddots & \vdots\\ c_{m1} & \cdots & c_{mp}\end{bmatrix},\quad\text{such that}\quad c_{ij}=a_{i1}b_{1j}+a_{i2}b_{2j}+\cdots+a_{in}b_{nj}=\sum_{k=1}^{n} a_{ik}b_{kj}$$

for 𝑖 = 1, 2, … , 𝑚 and 𝑗 = 1, 2, … , 𝑝.
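
As a sketch, the entry formula can be checked against NumPy's built-in product (the matrices below are assumed examples):

```python
# Sketch: the entry formula c_ij = sum_k a_ik * b_kj versus NumPy's product.
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])        # 2 x 3
B = np.array([[7, 8],
              [9, 10],
              [11, 12]])         # 3 x 2

C = A @ B                        # 2 x 2 product

# Recompute one entry by hand: c_12 = a_11*b_12 + a_12*b_22 + a_13*b_32
c_12 = sum(A[0, k] * B[k, 1] for k in range(A.shape[1]))
print(C)                         # [[ 58  64] [139 154]]
print(c_12 == C[0, 1])           # True
```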
Properties of Matrix Multiplications
 Let 𝐴 be an 𝑚 × 𝑛 matrix, and let 𝐵 and 𝐶 be matrices for which the indicated sums and products are defined. Then:

• Associative law: 𝐴(𝐵𝐶) = (𝐴𝐵)𝐶
• Left distributive law: 𝐴(𝐵 + 𝐶) = 𝐴𝐵 + 𝐴𝐶
• Right distributive law: (𝐵 + 𝐶)𝐴 = 𝐵𝐴 + 𝐶𝐴
• 𝑟(𝐴𝐵) = (𝑟𝐴)𝐵 = 𝐴(𝑟𝐵)
• 𝐼𝑚 𝐴 = 𝐴 = 𝐴𝐼𝑛
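
A quick numerical spot-check of two of these laws, using assumed random matrices of compatible sizes:

```python
# Sketch: checking the associative law A(BC) = (AB)C and I_m A = A = A I_n.
import numpy as np

A = np.random.rand(2, 3)
B = np.random.rand(3, 4)
C = np.random.rand(4, 5)

print(np.allclose(A @ (B @ C), (A @ B) @ C))                          # True
print(np.allclose(np.eye(2) @ A, A), np.allclose(A @ np.eye(3), A))   # True True
```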
Powers and Transpose of a Matrix
 Powers of a Matrix
• If 𝐴 is a square matrix and 𝑘 is a positive integer, then 𝐴𝑘 = 𝐴𝐴 ⋯ 𝐴 (𝑘 factors of 𝐴).
 Transpose of a Matrix
• Given an 𝑚 × 𝑛 matrix 𝐴, the transpose of 𝐴 is the 𝑛 × 𝑚
matrix, denoted by 𝐴𝑇 , whose columns are formed from the
corresponding rows of 𝐴.
 Properties of the Transpose
• (𝐴𝑇 )𝑇 = 𝐴, (𝐴 + 𝐵)𝑇 = 𝐴𝑇 + 𝐵𝑇 , (𝐴𝐵)𝑇 = 𝐵𝑇 𝐴𝑇 and for any
scalar 𝑟: (𝑟𝐴)𝑇 = 𝑟𝐴𝑇 .
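
A brief sketch checking the transpose rules numerically (random matrices assumed for illustration):

```python
# Sketch: the reversal rule (AB)^T = B^T A^T and (A^T)^T = A.
import numpy as np

A = np.random.rand(2, 3)
B = np.random.rand(3, 4)

print(np.allclose((A @ B).T, B.T @ A.T))   # True
print(np.allclose(A.T.T, A))               # True
```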
Systems of Linear Equations
Vectors and Linear equations
A linear system in the unknowns 𝑥 and 𝑦 is written in matrix form as 𝐴𝒙 = 𝒃, where the unknown vector is

$$\boldsymbol{x}=\begin{bmatrix} x\\ y\end{bmatrix}$$

When 𝐴 is square and invertible, the solution is

𝒙 = 𝐴−1 𝒃
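
A minimal sketch of solving 𝐴𝒙 = 𝒃 (the 2 × 2 system below is an assumed example; np.linalg.solve is generally preferred to forming 𝐴−1 explicitly):

```python
# Sketch: solving A x = b with NumPy (assumed example system).
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

x = np.linalg.solve(A, b)                     # solves A x = b directly
print(x)                                      # [0.8 1.4]
print(np.allclose(np.linalg.inv(A) @ b, x))   # same answer via x = A^{-1} b
```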
Three Possible Solutions
 For a given linear system, it could
• Have a unique solution.
• Have infinitely many solutions.
• Have no solution.

 For the first two situations, the corresponding linear system is consistent, i.e. it has at least one solution; otherwise the linear system is inconsistent.
Example: the system

𝑥 + 𝑦 = 4
3𝑥 + 3𝑦 = 6

has no solution (the two lines are parallel), with

$$A=\begin{bmatrix}1 & 1\\ 3 & 3\end{bmatrix},\qquad \boldsymbol{b}=\begin{bmatrix}4\\ 6\end{bmatrix}$$

Example: the system

4𝑥 − 2𝑦 = 1
16𝑥 − 8𝑦 = 4

has an infinite number of solutions (the second equation is 4 times the first), with

$$A=\begin{bmatrix}4 & -2\\ 16 & -8\end{bmatrix},\qquad \boldsymbol{b}=\begin{bmatrix}1\\ 4\end{bmatrix}$$
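
The two systems above can be classified programmatically by comparing rank(𝐴) with rank([𝐴 | 𝒃]); the helper function classify below is a hypothetical illustration, not part of the slides:

```python
# Sketch: classifying a linear system by ranks (Rouche-Capelli test).
import numpy as np

def classify(A, b):
    rA = np.linalg.matrix_rank(A)
    rAb = np.linalg.matrix_rank(np.column_stack([A, b]))  # augmented matrix [A | b]
    if rA < rAb:
        return "no solution (inconsistent)"
    return "unique solution" if rA == A.shape[1] else "infinitely many solutions"

print(classify(np.array([[1, 1], [3, 3]]), np.array([4, 6])))      # no solution
print(classify(np.array([[4, -2], [16, -8]]), np.array([1, 4])))   # infinitely many
```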
Linear Independence, Spanning,
Basis and Dimension
Linear Independence or Dependence
 Given a set of vectors 𝒗𝟏 , 𝒗𝟐 , … , 𝒗𝒏 , we look at their
linear combinations 𝑐1 𝒗𝟏 + 𝑐2 𝒗𝟐 + ⋯ + 𝑐𝑛 𝒗𝒏 .

 If the only combination that gives 𝑐1 𝒗𝟏 + 𝑐2 𝒗𝟐 + ⋯ + 𝑐𝑛 𝒗𝒏 = 𝟎 is the trivial one 𝑐1 = 𝑐2 = ⋯ = 𝑐𝑛 = 0, then the vectors 𝒗𝟏 , 𝒗𝟐 , … , 𝒗𝒏 are linearly independent.

 If the zero vector can be produced with some 𝑐𝑖 non-zero, then the 𝒗𝒊 ’s are linearly dependent. In that case, one vector is a combination of the others.
Linear Independence or Dependence
 Example 1: If 𝒗𝟏 = 𝟎, then the set is linearly dependent. We may choose 𝑐1 = 3 and all other 𝑐𝑖 = 0; this is a non-trivial combination that produces zero.

 Example 2: The columns of the matrix

$$\begin{bmatrix} 1 & 3 & 3 & 2\\ 2 & 6 & 9 & 5\\ -1 & -3 & 3 & 0\end{bmatrix}$$

are linearly dependent, since the second column is three times the first.
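
A sketch confirming the dependence claim for Example 2 (the rank is smaller than the number of columns):

```python
# Sketch: dependence of the columns of the Example 2 matrix.
import numpy as np

M = np.array([[ 1,  3, 3, 2],
              [ 2,  6, 9, 5],
              [-1, -3, 3, 0]])

print(np.linalg.matrix_rank(M))              # 2, fewer than 4 columns -> dependent
print(np.allclose(M[:, 1], 3 * M[:, 0]))     # True: second column = 3 * first
```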
Linear Independence or Dependence
 Example 3: Consider the columns of the matrix

$$A=\begin{bmatrix}3 & 4 & 2\\ 0 & 1 & 5\\ 0 & 0 & 2\end{bmatrix}$$

Look for a combination 𝑐1 𝒂𝟏 + 𝑐2 𝒂𝟐 + 𝑐3 𝒂𝟑 = 𝟎 of the columns that makes zero.

 Solving these equations gives only the trivial combination 𝑐1 = 𝑐2 = 𝑐3 = 0 (working upward: 2𝑐3 = 0, then 𝑐2 + 5𝑐3 = 0, then 3𝑐1 + 4𝑐2 + 2𝑐3 = 0). In other words, the null space of 𝐴 contains only the zero vector, so the columns of matrix 𝐴 are linearly independent.

 The columns of 𝐴 are independent exactly when 𝑁(𝐴) = {𝑧𝑒𝑟𝑜 𝑣𝑒𝑐𝑡𝑜𝑟}.
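
A sketch of the null-space test for Example 3, assuming SciPy is available for its null_space routine:

```python
# Sketch: independence of the columns of A via the null space.
import numpy as np
from scipy.linalg import null_space

A = np.array([[3, 4, 2],
              [0, 1, 5],
              [0, 0, 2]])

N = null_space(A)                               # orthonormal basis of N(A)
print(N.shape[1])                               # 0 -> N(A) = {zero vector}
print(np.linalg.matrix_rank(A) == A.shape[1])   # True: columns independent
```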
VECTOR SPACES AND SUBSPACES

 The space 𝑹𝑛 consists of all column vectors 𝒗 with 𝑛 components.

 Definition: A vector space is a nonempty set 𝑉 of objects, called vectors, on which are defined two operations, called addition and multiplication by scalars (real numbers), subject to the ten axioms (or rules) listed below. The axioms must hold for all vectors 𝒖, 𝒗, and 𝒘 in 𝑉 and for all scalars 𝑐 and 𝑑.
VECTOR SPACES AND SUBSPACES

1. The sum of 𝒖 and 𝒗, denoted by 𝒖 + 𝒗, is in 𝑉.
2. 𝒖 + 𝒗 = 𝒗 + 𝒖
3. (𝒖 + 𝒗) + 𝒘 = 𝒖 + (𝒗 + 𝒘)
4. There is a zero vector 𝟎 in 𝑉 such that 𝒖 + 𝟎 = 𝒖
5. For each 𝒖 in 𝑉, there is a vector −𝒖 in 𝑉 such that 𝒖 + (−𝒖) = 𝟎
6. The scalar multiple of 𝒖 by 𝑐, denoted by 𝑐𝒖, is in 𝑉.
7. (𝑐 + 𝑑)𝒖 = 𝑐𝒖 + 𝑑𝒖
8. 𝑐(𝑑𝒖) = (𝑐𝑑)𝒖
9. 𝑐(𝒖 + 𝒗) = 𝑐𝒖 + 𝑐𝒗
10. 1𝒖 = 𝒖
SUBSPACES
Column space

The column space of a matrix is the vector space spanned by the columns of the matrix. When a matrix is multiplied by a column vector, the resulting vector is in the column space of the matrix, as can be seen from

𝐴𝒙 = 𝑥1 𝒂𝟏 + 𝑥2 𝒂𝟐 + ⋯ + 𝑥𝑛 𝒂𝒏

The column space of an 𝑚 × 𝑛 matrix 𝐴 is a subspace of 𝑅𝑚 .
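
A small sketch (with an assumed 3 × 2 example matrix) showing that 𝐴𝒙 is a combination of the columns of 𝐴, hence lies in the column space:

```python
# Sketch: A @ x is a combination of the columns of A; the rank gives the
# dimension of the column space.
import numpy as np

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])               # 3 x 2: column space is a subspace of R^3
x = np.array([2, -1])

combo = 2 * A[:, 0] - 1 * A[:, 1]    # x1*a1 + x2*a2
print(np.allclose(A @ x, combo))     # True
print(np.linalg.matrix_rank(A))      # 2: dimension of the column space
```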

Null space

The null space 𝑁(𝐴) of a matrix 𝐴 is the set of all solutions of 𝑨𝒙 = 𝟎. If 𝑨𝒙 = 𝟎 has only the solution 𝒙 = 𝟎, then the columns of 𝐴 are independent.

Any two bases for a vector space 𝑉 contain the same number
of vectors. This number, which is shared by all bases and
expresses the number of “degrees of freedom” of the space,
is the dimension of 𝑉.
Rank of a Matrix
The maximum number of linearly independent columns (or rows) of a matrix is called its rank. The rank of a matrix cannot exceed the number of its rows or columns.

The true size of a matrix is given by its rank.

The rank is the dimension of the column space. It is also the dimension of the row space.
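
A sketch computing the rank of the Example 2 matrix and checking the facts above numerically:

```python
# Sketch: rank = dimension of column space = dimension of row space <= min(m, n).
import numpy as np

A = np.array([[ 1,  3, 3, 2],
              [ 2,  6, 9, 5],
              [-1, -3, 3, 0]])           # the 3 x 4 matrix from Example 2

r = np.linalg.matrix_rank(A)
print(r)                                  # 2
print(np.linalg.matrix_rank(A.T) == r)    # True: row rank = column rank
print(r <= min(A.shape))                  # True
```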
Thank You

