
Formulae: Mathematical Physics

Mathematical Representation of Complex Numbers

1. Cartesian Form

z = x + iy

where x is the real part Re(z) and y is the imaginary part Im(z)

2. Polar Form

z = re^(iθ)

3. x = r cos θ, y = r sin θ

r = √(x² + y²), θ = tan⁻¹(y/x)

z = r(cos θ + i sin θ)

re^(iθ) = r(cos θ + i sin θ)

Euler's Formula

e^(iθ) = cos θ + i sin θ


4. Complex Conjugate (z* or z̄)

The complex conjugate (z*) of z = x + iy is z* = x − iy

Similarly, for z = re^(iθ), z* = re^(−iθ)

5. Modulus / Absolute Value / Magnitude

|z| = r = √(x² + y²) = √((Real Part)² + (Imaginary Part)²)

|z − z₀| = R ⟶ circle with center at z₀ = x₀ + iy₀ and radius R units

⇒ |(x + iy) − (x₀ + iy₀)| = R
⇒ |(x − x₀) + i(y − y₀)| = R
⇒ √((x − x₀)² + (y − y₀)²) = R
⇒ (x − x₀)² + (y − y₀)² = R²

This is the equation for a circle centered at (x₀, y₀).

6. |z₁ z₂| = |z₁| ⋅ |z₂|

7. |z₁ / z₂| = |z₁| / |z₂|

8. |z₁ + z₂| = √(|z₁|² + |z₂|² + 2 |z₁| |z₂| cos(θ₁ ∼ θ₂))

9. |z₁ − z₂| = √(|z₁|² + |z₂|² − 2 |z₁| |z₂| cos(θ₁ ∼ θ₂))

10. Argument / Phase

arg(z) = θ = tan⁻¹(Imaginary Part / Real Part) = tan⁻¹(y/x)

Phase difference between z₁ and z₂ = arg(z₁) ∼ arg(z₂) = θ₁ ∼ θ₂

11. arg(z₁ / z₂) = arg(z₁) − arg(z₂)

12. Calculation of Principal Argument

Step - 1: Calculate θ = tan⁻¹|y/x|
Step - 2:
If z lies in the 1st quadrant, then Arg(z) = θ
If z lies in the 2nd quadrant, then Arg(z) = π − θ
If z lies in the 3rd quadrant, then Arg(z) = −(π − θ)
If z lies in the 4th quadrant, then Arg(z) = −θ
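The modulus and argument identities above can be checked numerically with Python's built-in cmath module. This is a quick sketch with arbitrarily chosen values, not part of the original notes.

```python
# Verify the modulus/argument identities (items 6-11) with cmath.
import cmath
import math

z1 = 3 + 4j
z2 = 1 - 2j

# |z| = sqrt(x^2 + y^2)
assert math.isclose(abs(z1), math.sqrt(z1.real**2 + z1.imag**2))

# |z1 z2| = |z1| |z2| and |z1 / z2| = |z1| / |z2|
assert math.isclose(abs(z1 * z2), abs(z1) * abs(z2))
assert math.isclose(abs(z1 / z2), abs(z1) / abs(z2))

# |z1 +/- z2|^2 = |z1|^2 + |z2|^2 +/- 2 |z1||z2| cos(theta1 - theta2)
t1, t2 = cmath.phase(z1), cmath.phase(z2)
assert math.isclose(abs(z1 + z2)**2,
                    abs(z1)**2 + abs(z2)**2 + 2*abs(z1)*abs(z2)*math.cos(t1 - t2))
assert math.isclose(abs(z1 - z2)**2,
                    abs(z1)**2 + abs(z2)**2 - 2*abs(z1)*abs(z2)*math.cos(t1 - t2))

# arg(z1/z2) = arg(z1) - arg(z2)  (up to a multiple of 2*pi)
d = cmath.phase(z1 / z2) - (t1 - t2)
assert math.isclose(math.cos(d), 1.0)
print("complex identities OK")
```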
Lecture 2

Euler's Formula Derivation; Modulus-Argument Form


Cube Roots of Unity

1. e^(i2mπ) = 1 (m = 0, ±1, ⋯)

2. e^(i(2m+1)π) = −1 (m = 0, ±1, …)

3. Complex cube roots of unity: 1, ω, ω²

ω² = ω*
1 + ω + ω² = 0
ω ⋅ ω² = ω³ = 1

4. zz* = |z|², but z² ≠ |z|²

5. Euler's Formula

e^(iθ) = cos θ + i sin θ

6. e^(inθ) = cos(nθ) + i sin(nθ), n = positive/negative integer

7. e^(inθ) = cos(nθ) + i sin(nθ), n = p/q = rational number
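The cube-root-of-unity relations and De Moivre's theorem (items 1-3 and 6) can be confirmed numerically; a minimal sketch with cmath, values chosen for illustration:

```python
# Check the cube roots of unity and De Moivre's theorem.
import cmath
import math

omega = cmath.exp(2j * cmath.pi / 3)   # primitive cube root of unity

assert cmath.isclose(omega**3, 1)
assert abs(1 + omega + omega**2) < 1e-12
assert cmath.isclose(omega**2, omega.conjugate())   # omega^2 = omega*

# e^(i 2m pi) = 1 and e^(i(2m+1) pi) = -1
for m in (-1, 0, 1, 2):
    assert cmath.isclose(cmath.exp(1j * 2 * m * cmath.pi), 1)
    assert cmath.isclose(cmath.exp(1j * (2 * m + 1) * cmath.pi), -1)

# De Moivre: e^(in theta) = cos(n theta) + i sin(n theta)
theta, n = 0.7, 5
assert cmath.isclose(cmath.exp(1j * n * theta),
                     complex(math.cos(n * theta), math.sin(n * theta)))
print("unity roots OK")
```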

Lecture 3

Lecture 4, 5 Matrices

1. A⁻¹ denotes the inverse in matrix multiplication (it is an inverse, not a power): 1/A is meaningless, because A is not a number, it is a matrix.

2. Important Properties

1. (A⊤)⊤ = A
2. (AB)⊤ = B⊤A⊤
3. (A⊤)ⁿ = (Aⁿ)⊤
4. (A⁻¹)⊤ = (A⊤)⁻¹
5. (A†)† = A
6. (AB)† = B†A†
7. (Aⁿ)† = (A†)ⁿ
8. (A⁻¹)† = (A†)⁻¹
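Properties 2 and 6 above (reversal of order under transpose and dagger) can be verified on small matrices. A pure-Python 2×2 sketch; the helper names are my own, not from the notes:

```python
# Check (AB)^T = B^T A^T and (AB)^dagger = B^dagger A^dagger on 2x2 matrices.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(A):
    return [[A[j][i] for j in range(2)] for i in range(2)]

def dagger(A):  # conjugate transpose
    return [[A[j][i].conjugate() for j in range(2)] for i in range(2)]

A = [[1 + 2j, 3], [0, 4 - 1j]]
B = [[2, 1j], [1, 1]]

assert transpose(matmul(A, B)) == matmul(transpose(B), transpose(A))
assert dagger(matmul(A, B)) == matmul(dagger(B), dagger(A))
print("transpose/dagger rules OK")
```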

3. Upper Triangular Matrix

aᵢⱼ = 0 for all i > j

4. Lower Triangular Matrix

aᵢⱼ = 0 for all j > i

5. Periodic Matrix
If A^(k+1) = A, the period is k.

6. Idempotent Matrix
A² = A, i.e. a periodic matrix of period 1.

7. Nilpotent Matrix
A is nilpotent if Aᵖ = 0,
where p is the lowest positive integer for which Aᵖ is 0.

8. Involutory Matrix [|A| = +1 or −1]

A² = I

9. Idempotent Matrix [|A| = +1 or 0]

A² = A

10. Properties of Trace

1. Tr(A ± B) = Tr(A) ± Tr(B) → valid for any number of matrices

2. Tr(A) = Tr(A⊤)

3. Tr(K ⋅ A) = K ⋅ Tr(A)

4. Tr(AB) = Tr(BA)

5. Tr(ABC) = Tr(BCA) = Tr(CAB) → cyclic permutations; valid for any number of matrices

6. Tr(ACB) = Tr(BAC) = Tr(CBA)

Note: (5) ≠ (6) in general.
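The cyclic rule and the (5) ≠ (6) caveat are easy to demonstrate on concrete 2×2 matrices; a pure-Python sketch with arbitrary values:

```python
# Check Tr(AB) = Tr(BA), the cyclic rule Tr(ABC) = Tr(BCA) = Tr(CAB),
# and that a non-cyclic reordering generally differs.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def tr(A):
    return A[0][0] + A[1][1]

A = [[1, 2], [3, 4]]
B = [[0, 1], [5, 2]]
C = [[2, 0], [1, 3]]

assert tr(matmul(A, B)) == tr(matmul(B, A))
ABC = tr(matmul(matmul(A, B), C))
assert ABC == tr(matmul(matmul(B, C), A)) == tr(matmul(matmul(C, A), B))
# Tr(ACB) is a non-cyclic reordering, so in general it differs:
assert ABC != tr(matmul(matmul(A, C), B))
print("trace rules OK")
```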

11. Properties of Determinants

1. Det(A) = Det(A⊤)

2. Det(AB) = Det(A) ⋅ Det(B) ⟶ valid for any number of matrices

3. Det(Aⁿ) = [Det(A)]ⁿ ⟶ n = positive integer

4. Det(K ⋅ A) = Kⁿ ⋅ Det(A) ⟶ K is a scalar quantity & n = order of the matrix

5. A adj(A) = det(A) I

6. Det(A†) = [Det(A)]*
12. Symmetric Matrix

A⊤ = A, i.e. Aᵢⱼ = Aⱼᵢ

Number of L.I. elements for an n × n symmetric matrix = n(n + 1)/2

13. Anti-Symmetric Matrix

A⊤ = −A, i.e. Aᵢⱼ = −Aⱼᵢ

For the principal diagonal elements (i.e., i = j):

Aᵢᵢ = −Aᵢᵢ

Thus the principal diagonal elements can be nothing but 0.

Number of L.I. elements for an n × n anti-symmetric matrix = n(n − 1)/2

Lecture 6

Symmetric and Anti-Symmetric Matrices

1. Every square matrix can be written as the sum of a "symmetric matrix" and a "skew-symmetric matrix":

A = ½(A + A⊤) + ½(A − A⊤)

2. Product of two symmetric matrices will be

i) a symmetric matrix if the matrices commute with each other, and
ii) a skew-symmetric matrix if the matrices anti-commute with each other.

If AB is symmetric:
(AB)⊤ = AB ⇒ B⊤A⊤ = AB ⇒ BA = AB → A and B commute

If AB is an anti-symmetric matrix:
(AB)⊤ = −(AB) ⇒ B⊤A⊤ = −AB ⇒ BA = −AB → A and B anti-commute

3. Product of two skew-symmetric matrices will be

i) a symmetric matrix if the matrices commute with each other, and
ii) a skew-symmetric matrix if the matrices anti-commute with each other.

4. Product of one symmetric & one skew-symmetric matrix will be

i) a symmetric matrix if the matrices anti-commute with each other, and
ii) a skew-symmetric matrix if the matrices commute with each other.

5. Any power of a symmetric matrix is also a symmetric matrix.
A², A³, …, Aⁿ → symmetric matrices

6. An even power of a skew-symmetric matrix is a symmetric matrix, and an odd power of a skew-symmetric matrix is a skew-symmetric matrix.
A ⟶ skew-symmetric matrix
A², A⁴, A⁶, … ⟶ symmetric matrices
A³, A⁵, A⁷, … → skew-symmetric matrices

7. The inverse of a symmetric matrix is also a symmetric matrix.

(A⁻¹)⊤ = (A⊤)⁻¹ = A⁻¹

8. The inverse of a skew-symmetric matrix is also a skew-symmetric matrix.

(A⁻¹)⊤ = (A⊤)⁻¹ = −A⁻¹
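The decomposition in point 1 and the even/odd power rule in point 6 can be verified exactly with Fractions; a pure-Python sketch (helper names are my own):

```python
# Check A = (A + A^T)/2 + (A - A^T)/2 and the power rule for a
# skew-symmetric 2x2 matrix.
from fractions import Fraction

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(A):
    return [[A[j][i] for j in range(2)] for i in range(2)]

def add(A, B, sign=1):
    return [[A[i][j] + sign * B[i][j] for j in range(2)] for i in range(2)]

A = [[Fraction(1), Fraction(4)], [Fraction(2), Fraction(3)]]
S = [[x / 2 for x in row] for row in add(A, transpose(A))]        # (A + A^T)/2
P = [[x / 2 for x in row] for row in add(A, transpose(A), -1)]    # (A - A^T)/2

assert transpose(S) == S                                   # symmetric part
assert transpose(P) == [[-x for x in row] for row in P]    # skew part
assert add(S, P) == A                                      # S + P recovers A

M = [[0, 2], [-2, 0]]                                      # skew-symmetric
M2 = matmul(M, M)
M3 = matmul(M2, M)
assert transpose(M2) == M2                                 # even power -> symmetric
assert transpose(M3) == [[-x for x in row] for row in M3]  # odd power -> skew
print("symmetric/skew rules OK")
```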

Hermitian and Skew Hermitian Matrices

1. Every square matrix can be written as the sum of a "Hermitian matrix" and a "skew-Hermitian matrix":

A = ½(A + A†) + ½(A − A†)

2. Product of two Hermitian matrices will be

i) a Hermitian matrix if the matrices commute with each other, and
ii) a skew-Hermitian matrix if the matrices anti-commute with each other.

3. Product of two skew-Hermitian matrices will be

i) a Hermitian matrix if the matrices commute with each other, and
ii) a skew-Hermitian matrix if the matrices anti-commute with each other.

4. Product of one Hermitian & one skew-Hermitian matrix will be

i) a Hermitian matrix if the matrices anti-commute with each other, and
ii) a skew-Hermitian matrix if the matrices commute with each other.

5. Any power of a Hermitian matrix is also a Hermitian matrix.

6. An even power of a skew-Hermitian matrix is a Hermitian matrix, and an odd power of a skew-Hermitian matrix is a skew-Hermitian matrix.

7. The inverse of a Hermitian matrix is also a Hermitian matrix.

8. The inverse of a skew-Hermitian matrix is also a skew-Hermitian matrix.

9. If a Hermitian matrix is multiplied by a general complex number, it will be neither Hermitian nor skew-Hermitian.

10. If a Hermitian matrix is multiplied by a purely imaginary number, it will become a skew-Hermitian matrix.

11. If a Hermitian matrix is multiplied by a real number, it will remain a Hermitian matrix (not so for an imaginary number).

Lecture 7

9. The sum & difference of two Hermitian matrices will also be Hermitian matrices.
10. The commutator bracket of two Hermitian matrices will be a skew-Hermitian matrix.
11. Multiplying a Hermitian matrix by a real number does not change its nature.
12. Multiplying a Hermitian matrix by a purely imaginary number transforms it into a skew-Hermitian matrix.
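Points 10-12 can be checked directly on small Hermitian matrices; a pure-Python 2×2 sketch with values of my own choosing:

```python
# Check: the commutator of two Hermitian matrices is skew-Hermitian;
# i * Hermitian is skew-Hermitian; real * Hermitian stays Hermitian.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def dagger(A):
    return [[A[j][i].conjugate() for j in range(2)] for i in range(2)]

def scale(c, A):
    return [[c * x for x in row] for row in A]

H1 = [[2, 1 - 1j], [1 + 1j, 3]]      # Hermitian: H^dagger = H
H2 = [[0, 2j], [-2j, 1]]             # Hermitian

assert dagger(H1) == H1 and dagger(H2) == H2

comm = [[a - b for a, b in zip(r1, r2)]
        for r1, r2 in zip(matmul(H1, H2), matmul(H2, H1))]
assert dagger(comm) == scale(-1, comm)                     # [H1, H2] is skew-Hermitian

assert dagger(scale(1j, H1)) == scale(-1, scale(1j, H1))   # i*H is skew-Hermitian
assert dagger(scale(2.5, H1)) == scale(2.5, H1)            # real*H stays Hermitian
print("Hermitian rules OK")
```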

Dot Product & Inner Product

The dot product is only valid for real vectors; it is not valid for complex vectors.
For complex vectors we use the inner product, which is basically the analogue of
the dot product for real vectors.
The inner product does not commute.

Inner Product

A†B = [Ax* Ay* Az*] [Bx, By, Bz]⊤ = Ax*Bx + Ay*By + Az*Bz

A†B ≠ B†A

Also, B†A = (A†B)*

For real vectors it reduces to the dot product:

A ⋅ B = AxBx + AyBy + AzBz

Normalization Condition

A ⋅ A = 1 (for real vectors)

A†A = 1 (for complex vectors)

|A| = 1 ⟶ normalized vector

The last two lines are normality conditions.

|A| = √(A†A) = √(Ax*Ax + Ay*Ay + Az*Az)

⇒ |A| = √(|Ax|² + |Ay|² + |Az|²) (for complex vectors)

Orthogonality Condition

∠(A, B) = 90°

A ⋅ B = 0 (for real vectors)

A†B = 0 (for complex vectors)

Normalized Vector

A_N = K A

A_N: normalized vector
A: unnormalized vector
K: the normalization constant

K = ± 1/√(A†A) = ± 1/|A|
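The inner-product, normalization, and orthogonality conditions above can be sketched for complex 3-vectors in plain Python (function names are my own, not from the notes):

```python
# Inner product, normalization, and orthogonality for complex 3-vectors.
import math

def inner(A, B):                       # A^dagger B = sum of Ai* Bi
    return sum(a.conjugate() * b for a, b in zip(A, B))

def norm(A):                           # |A| = sqrt(A^dagger A)
    return math.sqrt(inner(A, A).real)

A = [1 + 1j, 2j, 0]
B = [1j, 1, 3]

# The inner product does not commute; B^dagger A = (A^dagger B)*
assert inner(A, B) != inner(B, A)
assert inner(B, A) == inner(A, B).conjugate()

# Normalization: A_N = K*A with K = 1/|A| gives |A_N| = 1
K = 1 / norm(A)
A_N = [K * a for a in A]
assert math.isclose(norm(A_N), 1.0)

# Orthogonality: A^dagger B = 0
assert inner([1, 1j, 0], [1j, 1, 5]) == 0
print("inner product OK")
```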
Lecture 5

Orthogonal Matrix

AA⊤ = A⊤A = I

Properties

1. Det(A) = +1 or −1
2. A⊤ = A⁻¹
3. Each row and each column can be treated as a normalized vector:
R₁ ⋅ R₁ = R₂ ⋅ R₂ = R₃ ⋅ R₃ = 1, C₁ ⋅ C₁ = C₂ ⋅ C₂ = C₃ ⋅ C₃ = 1
4. Any two rows and any two columns can be treated as orthogonal vectors:
R₁ ⋅ R₂ = R₂ ⋅ R₃ = R₃ ⋅ R₁ = 0
C₁ ⋅ C₂ = C₂ ⋅ C₃ = C₃ ⋅ C₁ = 0

Unitary Matrix

AA† = A†A = I

Properties

1. Det(A) will be a real or complex number, but with unit modulus.
2. A† = A⁻¹
3. Each row and each column can be treated as a normalized vector.
4. Any two rows and any two columns can be treated as orthogonal to each other.

By analogy with e^x = 1 + x + x²/2! + x³/3! + ⋯, the matrix exponential is

e^B = I + B + B²/2! + B³/3! + ⋯

Important Points to Remember {Unitary & Orthogonal Matrix}

1. Product of two orthogonal matrices will also be an orthogonal matrix.
2. Product of two unitary matrices will also be a unitary matrix.
3. Transpose & inverse of an orthogonal matrix are also orthogonal matrices.
4. Transpose conjugate & inverse of a unitary matrix are also unitary matrices.
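A rotation matrix is a concrete orthogonal matrix, so the properties above can be checked on one; a minimal 2×2 sketch:

```python
# A rotation matrix: A A^T = I, det A = +1, rows are normalized and
# mutually orthogonal.
import math

t = 0.6
A = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]

AAT = [[sum(A[i][k] * A[j][k] for k in range(2)) for j in range(2)]
       for i in range(2)]                       # A A^T

for i in range(2):
    for j in range(2):
        assert math.isclose(AAT[i][j], 1.0 if i == j else 0.0, abs_tol=1e-12)

detA = A[0][0] * A[1][1] - A[0][1] * A[1][0]
assert math.isclose(detA, 1.0)                  # det = +1 for a rotation

# Each row is a unit vector; distinct rows are orthogonal
assert math.isclose(sum(x * x for x in A[0]), 1.0)
assert abs(sum(a * b for a, b in zip(A[0], A[1]))) < 1e-12
print("orthogonal matrix OK")
```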
Lecture 6

C → λC

If λ > 0, then C and λC will be parallel.

If λ < 0, then C and λC will be anti-parallel.

If |λ| > 1, the vector magnitude will increase.

If |λ| < 1, the vector magnitude will decrease.

If |λ| = 1, the vector magnitude will remain the same.

Important Points to Remember


(1) The rank of a matrix & its transpose are the same.
(2) The rank of a null matrix is 0.
(3) The rank of a non-zero matrix is always ≥ 1.
(4) The rank of a non-singular matrix is equal to its order.
(5) The rank of a singular matrix is less than its order.

Lecture 7

AX = λX is the eigenvalue equation of a matrix. If a vector X satisfies the
above equation, we can say that X is an eigenvector of the matrix A corresponding to
the eigenvalue λ.

Method for Calculating Eigenvalues & Eigenvectors


STEP - 1: Solve the characteristic equation det(A − λI) = 0 & obtain the eigenvalues.
STEP - 2: By putting the values of λ in (A − λI )X = 0 , obtain the relation between
elements of the eigenvector.

STEP - 3: Using the relation obtained in STEP - 2, find the linearly independent
eigenvectors.
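The three steps above can be walked through by hand for a 2×2 matrix; a sketch (the matrix is my own example):

```python
# STEP 1: solve the characteristic polynomial; STEPS 2-3: read off an
# eigenvector from (A - lambda*I) X = 0.
import math

A = [[4, 1], [2, 3]]

# det(A - lambda*I) = lambda^2 - (Tr A) lambda + det A = 0
tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
disc = math.sqrt(tr**2 - 4 * det)
lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2
assert (lam1, lam2) == (5.0, 2.0)

# From (A - lambda*I) X = 0, the first row gives (a11 - lambda) x + a12 y = 0.
# Pick x = 1, so y = (lambda - a11) / a12 -> a linearly independent eigenvector.
for lam in (lam1, lam2):
    x, y = 1.0, (lam - A[0][0]) / A[0][1]
    # check A X = lambda X componentwise
    assert math.isclose(A[0][0] * x + A[0][1] * y, lam * x)
    assert math.isclose(A[1][0] * x + A[1][1] * y, lam * y)
print("eigen method OK")
```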
Lecture 8

GM = (n − r)

where n = order of the (A − λI) coefficient matrix (characteristic matrix)

r = rank of the (A − λI) coefficient matrix (characteristic matrix)

Properties of Eigenvalues & Eigenvectors

1. Sum of the eigenvalues = Trace of the matrix

2. Product of the eigenvalues = Determinant of the matrix

3. For real matrices, complex eigenvalues occur in complex conjugate pairs.

4.

Lecture 9
5.

6. Eigenvectors corresponding to non-degenerate eigenvalues of a Hermitian matrix

are orthogonal to each other. This property is also valid for real symmetric, real
orthogonal & unitary matrices.

7. Consider a matrix whose rows & columns are scalar multiples of a particular row
or column respectively. Its eigenvalues are

λ = Tr(A), 0, 0, ⋯, 0
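Properties 1-2 and point 7 can be checked arithmetically; a sketch reusing the 2×2 example from Lecture 7 plus a rank-1 matrix of my own choosing:

```python
# Check: sum of eigenvalues = trace, product = determinant; and for a
# rank-1 matrix the eigenvalues are Tr(A), 0, ..., 0.
A = [[4, 1], [2, 3]]                 # eigenvalues 5 and 2 (from Lecture 7)
lam = (5.0, 2.0)
assert sum(lam) == A[0][0] + A[1][1]                          # sum = trace
assert lam[0] * lam[1] == A[0][0]*A[1][1] - A[0][1]*A[1][0]   # product = det

R = [[1, 2], [3, 6]]                 # rank-1: row 2 = 3 * row 1
tr = R[0][0] + R[1][1]               # trace = 7
det = R[0][0]*R[1][1] - R[0][1]*R[1][0]
assert det == 0                      # so one eigenvalue must be 0
# lambda^2 - tr*lambda + det = 0 is satisfied by lambda = tr and lambda = 0
assert tr**2 - tr * tr + det == 0 and 0 - tr * 0 + det == 0
print("eigenvalue properties OK")
```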

Lecture 10
