
Chapter 1.

Matrices, Vectors and Vector Calculus

• Under a coordinate transformation, the mass and the number of the particles are not
affected by the change in the coordinate axes. → Scalars :
quantities that are invariant under a coordinate transformation.

• The coordinates of a point charge change when the coordinate system changes. →
Vectors :
quantities whose components change under a coordinate transformation.
x1′ = x1 cos θ + x2 sin θ

x2′ = x1 cos(π/2 + θ) + x2 cos θ = −x1 sin θ + x2 cos θ

Rewriting,

x1′ = λ11 x1 + λ12 x2
x2′ = λ21 x1 + λ22 x2

where λ11 = cos θ , λ12 = sin θ , λ21 = −sin θ , λ22 = cos θ
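As a quick numerical sketch of the rotation formulas above (using NumPy, which the notes do not assume, and an arbitrary angle chosen for illustration):

```python
import numpy as np

theta = 0.7  # arbitrary rotation angle in radians
lam = np.array([[np.cos(theta),  np.sin(theta)],
                [-np.sin(theta), np.cos(theta)]])  # the 2-D transformation matrix

x = np.array([2.0, 3.0])   # coordinates in the original frame
x_prime = lam @ x          # rotated coordinates

# Compare component by component with the explicit formulas
assert np.isclose(x_prime[0], x[0] * np.cos(theta) + x[1] * np.sin(theta))
assert np.isclose(x_prime[1], -x[0] * np.sin(theta) + x[1] * np.cos(theta))
```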

Likewise for three dimensions we have

𝑥1′ = 𝜆11 𝑥1 + 𝜆12 𝑥2 + 𝜆13 𝑥3


𝑥2′ = 𝜆21 𝑥1 + 𝜆22 𝑥2 + 𝜆23 𝑥3
𝑥3′ = 𝜆31 𝑥1 + 𝜆32 𝑥2 + 𝜆33 𝑥3

In summation notation,

xi′ = Σ_{j=1}^{3} λij xj ,  i = 1, 2, 3

The inverse transformation is

xi = Σ_{j=1}^{3} λji xj′ ,  i = 1, 2, 3

The quantity λij is called the direction cosine of the xi′ axis relative to the xj axis.
It is convenient to arrange the λij in a square array called a matrix.

     λ11 λ12 λ13
λ =  λ21 λ22 λ23   : Transformation matrix
     λ31 λ32 λ33

The direction cosines satisfy the following relation.

Σ_j λij λkj = δik ,   δik = 0 if i ≠ k ;  1 if i = k
where 𝛿𝑖𝑘 is the Kronecker delta symbol.
When the coordinate systems satisfy the above relation, such systems are said to be
orthogonal, and the relation is called the orthogonality condition. Also

Σ_i λij λik = δjk

<H.W.> Apply the orthogonality condition to the rotational transformation.
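A minimal numerical check of both orthogonality relations, using the 2-D rotation matrix as the test case (NumPy assumed; the angle is arbitrary):

```python
import numpy as np

theta = 0.4  # arbitrary angle for illustration
lam = np.array([[np.cos(theta),  np.sin(theta)],
                [-np.sin(theta), np.cos(theta)]])

# Sum over the second index:  sum_j lam[i,j] lam[k,j] = delta_ik
assert np.allclose(lam @ lam.T, np.eye(2))
# Sum over the first index:   sum_i lam[i,j] lam[i,k] = delta_jk
assert np.allclose(lam.T @ lam, np.eye(2))
# Equivalent statement: for an orthogonal matrix the transpose is the inverse
assert np.allclose(lam.T, np.linalg.inv(lam))
```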
• The matrix AT obtained when the rows and columns of an original matrix A are interchanged :
Transposed matrix

(λT)ij = λji

Evidently, (AT)T = A

And the coordinates are


𝑿′ = 𝝀 𝑿, 𝑿 = 𝝀𝑇 𝑿′

Also the inverse 𝝀−𝟏 of a matrix is defined as


𝝀 𝝀−𝟏 = 𝝀−𝟏 𝝀 = 𝟏

For orthogonal matrices,

λT = λ−1

<H.W.> Prove the rotational transformation satisfies this relation.
Some important rules of matrix algebra

1. Matrix multiplication is not commutative in general :

𝑨𝑩 ≠ 𝑩𝑨

But in special cases,

𝑨𝑨−𝟏 = 𝑨−𝟏 𝑨 = 𝟏
𝑨𝟏 = 𝟏𝑨 = 𝑨

2. Matrix multiplication is associative :


[𝑨𝑩]𝑪 = 𝑨[𝑩𝑪]

3. Matrix addition is performed by adding corresponding elements of the two matrices.


𝑪=𝑨+𝑩
𝐶𝑖𝑗 = 𝐴𝑖𝑗 + 𝐵𝑖𝑗
      0  1  0
λ1 = −1  0  0
      0  0  1

      1  0  0
λ2 =  0  0  1
      0 −1  0

               0  1  0
X″ = λ2 λ1 X = 0  0  1  X
               1  0  0

          0  0  1
λ1 λ2 =  −1  0  0   ≠ λ2 λ1
          0 −1  0
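The non-commutativity of these two rotations is easy to verify numerically (NumPy assumed):

```python
import numpy as np

lam1 = np.array([[0, 1, 0], [-1, 0, 0], [0, 0, 1]])
lam2 = np.array([[1, 0, 0], [0, 0, 1], [0, -1, 0]])

# Successive application lam2 after lam1
assert np.array_equal(lam2 @ lam1, np.array([[0, 1, 0], [0, 0, 1], [1, 0, 0]]))
# Reversed order gives a different matrix
assert np.array_equal(lam1 @ lam2, np.array([[0, 0, 1], [-1, 0, 0], [0, -1, 0]]))
assert not np.array_equal(lam1 @ lam2, lam2 @ lam1)
```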
The inversion matrix is

     −1  0  0
λ =   0 −1  0
      0  0 −1
Consider a coordinate transformation of the type

xi′ = Σ_{j=1}^{3} λij xj ,  i = 1, 2, 3   with   Σ_j λij λkj = δik

If a quantity φ is unaffected by such a transformation, φ is called a scalar (or scalar invariant).

When a set of quantities (A1, A2, A3) is transformed by the transformation matrix λ with the
result

Ai′ = Σ_{j=1}^{3} λij Aj

the quantity A is termed a vector.


The vectors A and B and the scalars φ, ψ, ξ satisfy the following relations.

Addition :
Ai + Bi = Bi + Ai : Commutative law

Ai + (Bi + Ci) = (Ai + Bi) + Ci : Associative law

φ + ψ = ψ + φ : Commutative law

φ + (ψ + ξ) = (φ + ψ) + ξ : Associative law

Multiplication by a scalar :

ξ A = B : a vector

ξ φ = ψ : a scalar
The multiplication of two vectors A and B to form the scalar product (or dot product)
is defined to be

A ∙ B = Σ_i Ai Bi

The magnitude of the vector is

|A| = +√(A1² + A2² + A3²) ≡ A

Then the scalar product is

A ∙ B = AB cos θ
Consider the coordinate transformation of the scalar product. With

Ai′ = Σ_{j=1}^{3} λij Aj ,   Bi′ = Σ_{k=1}^{3} λik Bk

A′ ∙ B′ = Σ_i Ai′ Bi′ = Σ_i (Σ_{j=1}^{3} λij Aj)(Σ_{k=1}^{3} λik Bk)

= Σ_{j,k} (Σ_i λij λik) Aj Bk = Σ_{j,k} δjk Aj Bk = Σ_j Aj Bj

= A ∙ B

Therefore the scalar product is indeed a scalar!
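The invariance can also be spot-checked numerically. Here a random orthogonal matrix is built by QR decomposition (a construction assumed here, not taken from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
# QR decomposition of a random matrix yields an orthogonal Q
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
assert np.allclose(Q @ Q.T, np.eye(3))   # Q is orthogonal

A = rng.normal(size=3)
B = rng.normal(size=3)
# The dot product is unchanged by the transformation: A'.B' = A.B
assert np.isclose((Q @ A) @ (Q @ B), A @ B)
```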

The scalar product obeys the commutative and distributive laws :

𝑨∙𝑩 = 𝑩∙𝑨

𝑨 ∙ (𝑩 + 𝑪) = (𝑨 ∙ 𝑩) + (𝑨 ∙ 𝑪)
When the vector A is represented by the unit vectors,

A = (A1, A2, A3)   or   A = e1 A1 + e2 A2 + e3 A3

or 𝐴 = 𝐴1 𝑖 + 𝐴2 𝑗 + 𝐴3 𝑘

where 𝑒𝑖 ∙ 𝑒𝑗 = 𝛿𝑖𝑗

< Vector product >

C = A × B

Ci ≡ Σ_{j,k} εijk Aj Bk

where εijk is the permutation symbol (or Levi-Civita density).

And the magnitude of the vector product is

|C| = AB sin θ
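A small sketch (NumPy assumed) showing that the Levi-Civita definition of Ci reproduces the usual cross product:

```python
import numpy as np

# Build the permutation symbol eps[i,j,k]
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1    # even permutations
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1   # odd permutations

A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])

# C_i = sum_{j,k} eps_ijk A_j B_k
C = np.einsum('ijk,j,k->i', eps, A, B)
assert np.allclose(C, np.cross(A, B))
```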
The vector product satisfies the following properties :

𝑨 × 𝑩 = −𝑩 × 𝑨

In general,  A × (B × C) ≠ (A × B) × C

A × (B × C) = (A ∙ C) B − (A ∙ B) C

Simply,

            | e1  e2  e3 |
C = A × B = | A1  A2  A3 |
            | B1  B2  B3 |
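The "BAC−CAB" identity and the non-associativity of the vector product can be checked on random vectors (NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(1)
A, B, C = rng.normal(size=(3, 3))   # three random 3-D vectors

# A x (B x C) = (A.C) B - (A.B) C
lhs = np.cross(A, np.cross(B, C))
rhs = np.dot(A, C) * B - np.dot(A, B) * C
assert np.allclose(lhs, rhs)

# For generic vectors the product is not associative
assert not np.allclose(lhs, np.cross(np.cross(A, B), C))
```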

The identities for the vector product


Also the derivatives of the vector sums and products are

d(A + B)/ds = dA/ds + dB/ds

d(A ∙ B)/ds = A ∙ dB/ds + dA/ds ∙ B

d(A × B)/ds = A × dB/ds + dA/ds × B

d(φA)/ds = φ dA/ds + (dφ/ds) A

For example,

r = Σ_i xi e_i ,   v = dr/dt = Σ_i (dxi/dt) e_i ,   a = dv/dt = Σ_i (d²xi/dt²) e_i
In cylindrical coordinates,

de_r = dθ e_θ ,   de_θ = −dθ e_r

ė_r = θ̇ e_θ ,   ė_θ = −θ̇ e_r

Therefore

v = dr/dt = ṙ e_r + r θ̇ e_θ + ż e_z

a = dv/dt = (r̈ − r θ̇²) e_r + (r θ̈ + 2 ṙ θ̇) e_θ + z̈ e_z
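The cylindrical-coordinate velocity and acceleration can be verified symbolically by differentiating the Cartesian position and projecting onto the moving basis (a sketch assuming SymPy, which the notes do not use):

```python
import sympy as sp

t = sp.symbols('t')
r = sp.Function('r')(t)
th = sp.Function('theta')(t)
z = sp.Function('z')(t)

# Moving unit vectors expressed in the fixed Cartesian basis
er = sp.Matrix([sp.cos(th), sp.sin(th), 0])
eth = sp.Matrix([-sp.sin(th), sp.cos(th), 0])
ez = sp.Matrix([0, 0, 1])

pos = sp.Matrix([r * sp.cos(th), r * sp.sin(th), z])
v = pos.diff(t)
a = v.diff(t)

# v = r' e_r + r th' e_th + z' e_z
assert sp.simplify(v.dot(er) - r.diff(t)) == 0
assert sp.simplify(v.dot(eth) - r * th.diff(t)) == 0
assert sp.simplify(v.dot(ez) - z.diff(t)) == 0
# a = (r'' - r th'^2) e_r + (r th'' + 2 r' th') e_th + z'' e_z
assert sp.simplify(a.dot(er) - (r.diff(t, 2) - r * th.diff(t)**2)) == 0
assert sp.simplify(a.dot(eth) - (r * th.diff(t, 2) + 2 * r.diff(t) * th.diff(t))) == 0
```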

<H.W> Find the expressions in spherical coordinates.


<Angular Velocity >

When a point or a particle moving arbitrarily in space follows a circular path about a
certain axis (the so-called instantaneous axis of rotation), the rate of change of the
angular position is called the angular velocity :

ω = dθ/dt = θ̇

Then for motion in a circle of radius R, the instantaneous magnitude of the linear
velocity is v = Rω, and in vector form

v = ω × r
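A quick numerical illustration of v = ω × r for circular motion about the z-axis (NumPy assumed; the values are arbitrary):

```python
import numpy as np

omega_mag = 2.0                            # angular speed (arbitrary)
omega = np.array([0.0, 0.0, omega_mag])    # angular velocity along the rotation axis
R = 1.5
r = np.array([R, 0.0, 0.0])                # position on a circle of radius R

v = np.cross(omega, r)
# The speed matches |v| = R * omega, and v is perpendicular to r
assert np.isclose(np.linalg.norm(v), R * omega_mag)
assert np.isclose(v @ r, 0.0)
```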
< Gradient Operator >

For a scalar φ, the derivative transforms under a coordinate transformation as

∂φ/∂xi′ = Σ_j (∂φ/∂xj)(∂xj/∂xi′) = Σ_j λij ∂φ/∂xj

since xj = Σ_k λkj xk′ ,   ∂xj/∂xi′ = Σ_k λkj δik = λij

Defining the gradient operator :

grad = ∇ = Σ_i e_i ∂/∂xi

grad φ = ∇φ = Σ_i e_i ∂φ/∂xi

div A = ∇ ∙ A = Σ_i ∂Ai/∂xi

curl A = ∇ × A = Σ_i e_i Σ_{j,k} εijk ∂Ak/∂xj
Rewriting the derivative,

dφ = (∇φ) ∙ ds

(dφ)max = |∇φ| ds   for ∇φ ∥ ds

|∇φ| = (dφ/ds)max

∇φ is in the direction of the greatest change in φ.
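A finite-difference sketch (NumPy assumed; the function φ = x² + 3y² is chosen here for illustration) showing that ∇φ points along the direction of greatest increase:

```python
import numpy as np

phi = lambda p: p[0]**2 + 3 * p[1]**2
p0 = np.array([1.0, 2.0])
h = 1e-6

# Central-difference gradient; analytically grad(phi) = (2x, 6y)
grad = np.array([(phi(p0 + h * e) - phi(p0 - h * e)) / (2 * h)
                 for e in np.eye(2)])
assert np.allclose(grad, [2 * p0[0], 6 * p0[1]], atol=1e-4)

# The directional derivative grad(phi).n is maximal for n parallel to grad(phi)
n = grad / np.linalg.norm(grad)
dphi_max = grad @ n
for ang in np.linspace(0, 2 * np.pi, 16):
    m = np.array([np.cos(ang), np.sin(ang)])   # other unit directions
    assert grad @ m <= dphi_max + 1e-9
```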

The successive operation of the gradient operator is called the Laplacian :

∇ ∙ ∇ = ∇² = Σ_i ∂²/∂xi²
< Some important integrals >

Divergence theorem (Gauss's theorem) :

∮_S A ∙ da = ∫_V ∇ ∙ A dv

Line integrals (Stokes's theorem) :

∮_C A ∙ ds = ∫_S (∇ × A) ∙ da
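The divergence theorem can be spot-checked with a midpoint-rule quadrature on the unit cube; the test field A = (xy, yz, zx) and the grid are assumptions made here for illustration:

```python
import numpy as np

n = 50
c = (np.arange(n) + 0.5) / n                 # cell midpoints in [0, 1]
X, Y, Z = np.meshgrid(c, c, c, indexing='ij')

# div A = y + z + x for A = (x y, y z, z x)
div_A = Y + Z + X
volume_integral = div_A.mean()               # mean times unit volume

# Outward flux: A.n vanishes on the x=0, y=0, z=0 faces; on the far faces
# A.n equals y (at x=1), z (at y=1) and x (at z=1), each over a unit area.
U, V = np.meshgrid(c, c, indexing='ij')
flux = U.mean() + V.mean() + U.mean()
assert np.isclose(flux, volume_integral)
```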
