

Dirac Notation

I. THREE DIMENSIONAL VECTOR ALGEBRA

A three-dimensional vector can be represented by specifying its components a_i, i = 1, 2, 3,
with respect to a set of three mutually perpendicular unit vectors {~e_i} as

  \vec{a} = \vec{e}_1 a_1 + \vec{e}_2 a_2 + \vec{e}_3 a_3.    (1)

The vectors {~e_i} form a basis and are called basis vectors. If any three-dimensional vector
can be represented in terms of the basis vectors, then the basis is said to be complete.
A basis is not unique. In a different basis {~e′_i}, the components will be different from
those in the basis {~e_i}.

• Matrix format

  We can represent the vector ~a by a column matrix as

  a = \begin{pmatrix} a_1 \\ a_2 \\ a_3 \end{pmatrix}  in the basis {~e_i}

  or

  a' = \begin{pmatrix} a'_1 \\ a'_2 \\ a'_3 \end{pmatrix}  in the basis {~e′_i}.

• The scalar or dot product of two vectors ~a and ~b is defined as

  \vec{a} \cdot \vec{b} = a_1 b_1 + a_2 b_2 + a_3 b_3 = \sum_i a_i b_i.    (2)

• ~a · ~a = a_1^2 + a_2^2 + a_3^2. This is the square of the length of the vector.


• It is also true that

  \vec{a} \cdot \vec{b} = \sum_i \sum_j (\vec{e}_i \cdot \vec{e}_j)\, a_i b_j.    (3)

  This has to be identical to the result of Eq. 2. Hence

  \vec{e}_i \cdot \vec{e}_j = \delta_{ij} = \delta_{ji} =
  \begin{cases} 1, & \text{if } i = j, \\ 0, & \text{otherwise}. \end{cases}    (4)

  Here δ_ij is the Kronecker delta symbol. This says that the basis vectors are mutually
  orthogonal and are normalized.

• Picking individual components: Given a vector, its component along ~e_j can be found
  by taking the scalar product of the vector with ~e_j as follows:

  \vec{e}_j \cdot \vec{a} = \sum_i (\vec{e}_j \cdot \vec{e}_i)\, a_i = \sum_i \delta_{ij} a_i = a_j.    (5)
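  The following minimal numpy sketch (not part of the original notes) illustrates Eqs. (2)-(5).
  It assumes the standard Cartesian basis and two arbitrary example component lists, and checks
  orthonormality and component extraction.

    # Components, dot products and component extraction in an orthonormal basis.
    import numpy as np

    e = np.eye(3)                   # rows e[0], e[1], e[2] play the role of the basis vectors e_i
    a = np.array([1.0, 2.0, 3.0])   # components a_i in that basis
    b = np.array([4.0, -1.0, 0.5])

    print(np.dot(a, b))                        # scalar product, Eq. (2)
    print(np.dot(a, a))                        # squared length of a
    print(np.allclose(e @ e.T, np.eye(3)))     # orthonormality of the basis, Eq. (4)
    print(np.dot(e[1], a))                     # picks out the second component a_2, Eq. (5)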

• Operators

  An operator is defined as an entity which, when acting on a vector ~a, converts it into
  a vector ~b,

  \hat{A}\vec{a} = \vec{b}.    (6)

• Example

  Consider y = f(x). Then dy/dx = df(x)/dx, so d/dx is an operator. Similarly, if
  Â = B̂ = d/dx, then Â(B̂y) = d²y/dx², so d²/dx² is also an operator.
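  As an illustration (my own construction, not from the notes), the sketch below represents
  d/dx as a central finite-difference matrix D on a periodic grid; applying D twice
  approximates d²/dx². The grid size and the test function sin(x) are arbitrary choices.

    import numpy as np

    n, L = 200, 2.0 * np.pi
    x = np.linspace(0.0, L, n, endpoint=False)
    h = x[1] - x[0]

    # Central-difference derivative matrix with periodic boundary conditions:
    # (D f)[i] = (f[i+1] - f[i-1]) / (2 h).
    D = (np.roll(np.eye(n), 1, axis=1) - np.roll(np.eye(n), -1, axis=1)) / (2.0 * h)

    f = np.sin(x)
    print(np.max(np.abs(D @ f - np.cos(x))))          # D f approximates df/dx = cos(x)
    print(np.max(np.abs(D @ (D @ f) + np.sin(x))))    # D(D f) approximates d2f/dx2 = -sin(x)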

• The operator Â is linear if the following is true:

  \hat{A}(x\vec{a} + y\vec{b}) = x\hat{A}\vec{a} + y\hat{A}\vec{b}.    (7)

  A linear operator is completely determined if its effect on every vector is known.


• Operator applied on a basis vector

  Since Â~e_i is a vector, it can be written as a linear combination of the basis vectors
  {~e_i},

  \hat{A}\vec{e}_i = \sum_{j=1}^{3} \vec{e}_j A_{ji}, \qquad i = 1, 2, 3.    (8)

  The number A_ji is the component of Â~e_i along ~e_j. The nine numbers can be arranged
  in a two-dimensional array (matrix) as

  A = \begin{pmatrix} A_{11} & A_{12} & A_{13} \\ A_{21} & A_{22} & A_{23} \\ A_{31} & A_{32} & A_{33} \end{pmatrix}.

  A is the matrix representation of the operator Â in the basis {~e_i}.

Problems:

1. Show that b_i = \sum_j A_{ij} a_j if \hat{A}\vec{a} = \vec{b}.

• If A and B are the matrix representations of the operators Â and B̂, the matrix
  representation of the operator Ĉ, which is the product of Â and B̂, is found as follows:

  \hat{C}\vec{e}_j = \sum_i \vec{e}_i C_{ij}    (9)
                   = \hat{A}\hat{B}\vec{e}_j    (10)
                   = \hat{A} \sum_k \vec{e}_k B_{kj}    (11)
                   = \sum_{ik} \vec{e}_i A_{ik} B_{kj}
  \;\Rightarrow\; C_{ij} = \sum_k A_{ik} B_{kj}.    (12)

  The above equation is the definition of matrix multiplication.
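  A quick numerical check of Eq. (12) is sketched below; the 3 × 3 matrices are random
  examples, and the explicit double sum is compared against numpy's built-in matrix product.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((3, 3))
    B = rng.standard_normal((3, 3))

    # C_ij = sum_k A_ik B_kj, written out explicitly.
    C_explicit = np.array([[sum(A[i, k] * B[k, j] for k in range(3))
                            for j in range(3)] for i in range(3)])
    print(np.allclose(C_explicit, A @ B))    # True: Eq. (12) is matrix multiplication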

• Commutators: For two operators Â and B̂, the commutator is defined as

  [\hat{A}, \hat{B}] = \hat{A}\hat{B} - \hat{B}\hat{A}    (13)

  and the anticommutator is

  \{\hat{A}, \hat{B}\} = \hat{A}\hat{B} + \hat{B}\hat{A}.    (14)
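  The sketch below (an illustration, not part of the notes) defines commutator and
  anticommutator helpers for matrix representations and checks them on the Pauli matrices,
  which satisfy [σ_x, σ_y] = 2iσ_z and {σ_x, σ_y} = 0.

    import numpy as np

    def commutator(A, B):
        return A @ B - B @ A

    def anticommutator(A, B):
        return A @ B + B @ A

    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]])
    sz = np.array([[1, 0], [0, -1]], dtype=complex)

    print(np.allclose(commutator(sx, sy), 2j * sz))    # [s_x, s_y] = 2i s_z
    print(np.allclose(anticommutator(sx, sy), 0))      # {s_x, s_y} = 0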

Problems:

2. Calculate [A, B] for the following matrices:

   A = \begin{pmatrix} 1 & 1 & 0 \\ 1 & 2 & 2 \\ 0 & 2 & -1 \end{pmatrix},
   \qquad
   B = \begin{pmatrix} 1 & -1 & 1 \\ -1 & 0 & 0 \\ 1 & 0 & 1 \end{pmatrix}.

II. MATRICES

• The components of a vector ~a, {a_i}, are written as the elements of a column matrix
  as

  a = \begin{pmatrix} a_1 \\ a_2 \\ a_3 \end{pmatrix}.

• For an N × M matrix A, if A a = b, then

  b_i = \sum_{j=1}^{M} A_{ij} a_j, \qquad i = 1, 2, 3, \ldots, N.    (15)

• The adjoint of an N × M matrix A, denoted A†, is an M × N matrix with elements

  (A^\dagger)_{ij} = A^*_{ji},    (16)

  i.e. take the complex conjugate of each of the elements of A and interchange rows
  and columns. Hence, the adjoint of a column matrix is a row matrix containing the
  complex conjugates of the elements of the column matrix.

• A matrix is diagonal if all its off-diagonal elements are zero, A_ij = A_ii δ_ij.

• The trace of a matrix is the sum of its diagonal elements, tr A = \sum_i A_{ii}.

• The unit matrix is defined by 1A = A1 = A.

• The inverse of a matrix A, denoted A^{-1}, is a matrix such that A^{-1}A = AA^{-1} = 1.

• A unitary matrix A is one whose inverse is its adjoint, A^{-1} = A†. A real unitary matrix
  is called orthogonal.

• A Hermitian matrix is self-adjoint, i.e. A = A†. (A short numerical sketch of these
  definitions follows this list.)
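  The following numpy sketch illustrates the definitions above; the matrices A, U and H are
  arbitrary examples chosen to be, respectively, generic, unitary and Hermitian.

    import numpy as np

    A = np.array([[1.0, 2.0 + 1.0j],
                  [0.5j, 3.0]])

    A_dag = A.conj().T                              # adjoint, Eq. (16)
    print(np.trace(A))                              # trace = sum of diagonal elements

    U = np.array([[1.0, 1.0j], [1.0j, 1.0]]) / np.sqrt(2.0)
    print(np.allclose(U.conj().T @ U, np.eye(2)))   # unitary: U^dagger is the inverse of U

    H = np.array([[2.0, 1.0 - 1.0j], [1.0 + 1.0j, 0.0]])
    print(np.allclose(H, H.conj().T))               # Hermitian: H = H^dagger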

Problems:

3. If A is an N × M matrix and B is an M × K matrix, show that (AB)† = B†A†.


III. STATE KETS AND OPERATORS

A quantum state is represented by the ket |ψ⟩. The Hermitian conjugate is the bra, ⟨ψ|.
The inner product is

  \langle\phi|\psi\rangle = c \quad \text{(a number)}.    (17)

If c = ⟨φ|ψ⟩, then the complex conjugate is c* = ⟨φ|ψ⟩* = ⟨ψ|φ⟩. Kets and bras exist in
a Hilbert space, which is a generalization of the three-dimensional linear vector space of
Euclidean geometry to a complex-valued space with possibly infinite dimensions. The inner
product is linear:

  \langle\phi|\left( a_1|\psi_1\rangle + a_2|\psi_2\rangle \right) = a_1\langle\phi|\psi_1\rangle + a_2\langle\phi|\psi_2\rangle.    (18)

Operators are denoted by a hat, Â.

• An operator is linear if

  \hat{A}(c_1|\psi_1\rangle + c_2|\psi_2\rangle) = c_1\hat{A}|\psi_1\rangle + c_2\hat{A}|\psi_2\rangle.    (19)

• The matrix element of an operator is

  \langle\phi|\hat{A}|\psi\rangle = \langle\phi|\left(\hat{A}|\psi\rangle\right) = \left(\langle\phi|\hat{A}\right)|\psi\rangle = c \quad \text{(a number)}.    (20)

• The expectation value of an operator for a system in state |ψ⟩ is

  \langle\hat{A}\rangle = \langle\psi|\hat{A}|\psi\rangle.    (21)

• The complex conjugate of the matrix element is

  \langle\phi|\hat{A}|\psi\rangle^* = \langle\psi|\hat{A}^\dagger|\phi\rangle = c^*,    (22)

  where Â† is the Hermitian conjugate of Â. When Â is represented by a matrix, the
  Hermitian conjugate is found by transposing the matrix and then taking the complex
  conjugate of each matrix element. The operation of taking the Hermitian conjugate
  of a combination of numbers, states and operators involves changing c → c*, |ψ⟩ → ⟨ψ|,
  Â → Â†, and reversing the order of the factors. For example,

  \left( c_1 \hat{A}^\dagger \langle\phi|\hat{B}|\psi\rangle \langle\xi| \right)^\dagger = c_1^* \, |\xi\rangle \, \langle\psi|\hat{B}^\dagger|\phi\rangle \, \hat{A}.    (23)
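  In a finite-dimensional space these rules can be checked numerically by representing kets
  as column vectors and bras as their conjugate transposes. The sketch below is an
  illustration with randomly chosen three-component states and a random operator.

    import numpy as np

    rng = np.random.default_rng(1)
    psi = rng.standard_normal(3) + 1j * rng.standard_normal(3)      # |psi>
    phi = rng.standard_normal(3) + 1j * rng.standard_normal(3)      # |phi>
    A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

    c = phi.conj() @ psi                      # <phi|psi>, Eq. (17)
    print(np.isclose(np.conj(c), psi.conj() @ phi))                 # <phi|psi>* = <psi|phi>

    m = phi.conj() @ A @ psi                  # matrix element <phi|A|psi>, Eq. (20)
    print(np.isclose(np.conj(m), psi.conj() @ A.conj().T @ phi))    # Eq. (22)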


IV. OBSERVABLES

Observables are represented by Hermitian operators, which satisfy Â† = Â. The expectation
value of a Hermitian operator is real:

  a^* = \langle\hat{A}\rangle^* = \langle\psi|\hat{A}|\psi\rangle^* = \langle\psi|\hat{A}^\dagger|\psi\rangle = \langle\psi|\hat{A}|\psi\rangle = a.    (24)

Denoting the eigenstates of a Hermitian operator by |n⟩, the eigenvalues are real, since

  \langle m|\hat{A}|m\rangle = \langle m|a_m|m\rangle = a_m\langle m|m\rangle = a_m    (25)

and

  a_m^* = \langle m|\hat{A}|m\rangle^* = \langle m|\hat{A}^\dagger|m\rangle = \langle m|\hat{A}|m\rangle = a_m.    (26)

Also the eigenstates are orthogonal:

  \langle m|\hat{A}|n\rangle = \langle m|a_n|n\rangle = a_n\langle m|n\rangle

and

  \langle m|\hat{A}|n\rangle = \left(\langle m|\hat{A}\right)|n\rangle = \left(\hat{A}^\dagger|m\rangle\right)^\dagger|n\rangle = \left(a_m|m\rangle\right)^\dagger|n\rangle = a_m^*\langle m|n\rangle = a_m\langle m|n\rangle.

Thus

  (a_m - a_n)\langle m|n\rangle = 0.    (27)

So ⟨m|n⟩ = 0 if a_m ≠ a_n: eigenstates belonging to different eigenvalues are orthogonal.
The eigenstates |n⟩ of a Hermitian operator form a complete set. Therefore any arbitrary
ket can be expanded as

  |\psi\rangle = \sum_{n=0}^{\infty} c_n |n\rangle,    (28)

where

  \langle n|\psi\rangle = \langle n| \sum_{j=0}^{\infty} c_j |j\rangle = \sum_{j=0}^{\infty} c_j \langle n|j\rangle = \sum_{j=0}^{\infty} c_j \delta_{nj} = c_n.    (29)

Thus

  |\psi\rangle = \sum_{n=0}^{\infty} c_n |n\rangle = \sum_{n=0}^{\infty} \langle n|\psi\rangle |n\rangle = \left( \sum_{n=0}^{\infty} |n\rangle\langle n| \right) |\psi\rangle.    (30)

Here we have the identity operator

  \hat{I} = \sum_{n=0}^{\infty} |n\rangle\langle n|, \quad \text{the closure property}.    (31)
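A finite-dimensional numpy sketch of Eqs. (28)-(31): expand a random state in the eigenbasis
of a randomly generated Hermitian matrix and verify the expansion and the closure property.
The dimension (4) and the random seed are arbitrary choices of this illustration.

    import numpy as np

    rng = np.random.default_rng(2)
    M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
    A = (M + M.conj().T) / 2.0                 # a Hermitian operator

    evals, vecs = np.linalg.eigh(A)            # columns of vecs are the eigenkets |n>
    psi = rng.standard_normal(4) + 1j * rng.standard_normal(4)

    c = vecs.conj().T @ psi                    # c_n = <n|psi>, Eq. (29)
    print(np.allclose(vecs @ c, psi))          # |psi> = sum_n c_n |n>, Eq. (30)

    closure = sum(np.outer(vecs[:, n], vecs[:, n].conj()) for n in range(4))
    print(np.allclose(closure, np.eye(4)))     # sum_n |n><n| = identity, Eq. (31)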


• Compatible observables: Observables Â and B̂ are defined to be compatible when the
  corresponding operators commute, [Â, B̂] = 0; otherwise they are incompatible.

• For two compatible observables, it is always true that they have simultaneous eigenkets.

• A component of |ψ⟩ can be found by operating with the projection operator P̂_n = |n⟩⟨n|,

  \hat{P}_n|\psi\rangle = |n\rangle\langle n| \sum_{j=0}^{\infty} c_j |j\rangle = |n\rangle \sum_{j=0}^{\infty} c_j \langle n|j\rangle = |n\rangle \sum_{j=0}^{\infty} c_j \delta_{nj} = c_n |n\rangle.    (32)

• The projection operator is idempotent: P̂_n² = |n⟩⟨n|n⟩⟨n| = |n⟩⟨n| = P̂_n.

• The inner product of two states can be expressed in terms of the coefficients of their
  decomposition. Let

  |\psi\rangle = \sum_n c_n |n\rangle, \qquad |\phi\rangle = \sum_n b_n |n\rangle.    (33)

  Then

  \langle\phi|\psi\rangle = \sum_m b_m^* \langle m| \sum_n c_n |n\rangle = \sum_m \sum_n b_m^* c_n \delta_{mn} = \sum_n b_n^* c_n.    (34)

• The spectral representation of an operator is found as follows:

  \hat{A} = \hat{I}\hat{A}\hat{I} = \left(\sum_m |m\rangle\langle m|\right) \hat{A} \left(\sum_n |n\rangle\langle n|\right)
          = \sum_m \sum_n |m\rangle\langle m|\hat{A}|n\rangle\langle n| = \sum_m \sum_n a_n |m\rangle \delta_{mn} \langle n|.    (35)

  Thus

  \hat{A} = \sum_n a_n |n\rangle\langle n|.    (36)
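  Similarly, the idempotence of the projection operator and the spectral representation,
  Eqs. (32) and (36), can be verified numerically; the Hermitian matrix below is again a
  random example of my own choosing.

    import numpy as np

    rng = np.random.default_rng(3)
    M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
    A = (M + M.conj().T) / 2.0                           # a Hermitian operator

    evals, vecs = np.linalg.eigh(A)

    P0 = np.outer(vecs[:, 0], vecs[:, 0].conj())         # P_0 = |0><0|
    print(np.allclose(P0 @ P0, P0))                      # projection operator is idempotent

    spectral = sum(evals[n] * np.outer(vecs[:, n], vecs[:, n].conj())
                   for n in range(4))
    print(np.allclose(spectral, A))                      # A = sum_n a_n |n><n|, Eq. (36)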


V. MEASUREMENTS

P.A.M. Dirac:

“A measurement always causes the system to jump into an eigenstate of the dynamical
observable that is being measured.”

If |ψ⟩ = Σ_n c_n |n⟩, the average of the results of measurement (the expectation value) is

  \langle\hat{A}\rangle = \langle\psi|\hat{A}|\psi\rangle = \sum_m c_m^* \langle m|\hat{A} \sum_n c_n |n\rangle    (37)
                        = \sum_m \sum_n c_m^* c_n a_n \langle m|n\rangle = \sum_n |c_n|^2 a_n.    (38)

The probability of the measurement giving a_n (or the probability of finding the system
in |n⟩) is

  P(a_n) = |c_n|^2 = |\langle n|\psi\rangle|^2, \quad \text{provided } |\psi\rangle \text{ is normalized}.    (39)

The probabilistic interpretation, Eq. (39), of the squared inner product is one of the funda-
mental postulates of quantum mechanics, so it cannot be proven.
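The sketch below illustrates Eqs. (37)-(39) for a random Hermitian operator and a normalized
random state: the probabilities |c_n|² sum to one, and Σ_n |c_n|² a_n reproduces ⟨ψ|Â|ψ⟩.
The operator and state are assumptions made up for the example.

    import numpy as np

    rng = np.random.default_rng(4)
    M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
    A = (M + M.conj().T) / 2.0                 # a Hermitian observable

    evals, vecs = np.linalg.eigh(A)
    psi = rng.standard_normal(4) + 1j * rng.standard_normal(4)
    psi = psi / np.linalg.norm(psi)            # normalize |psi>

    c = vecs.conj().T @ psi                    # c_n = <n|psi>
    probs = np.abs(c) ** 2                     # P(a_n) = |c_n|^2, Eq. (39)
    print(np.isclose(probs.sum(), 1.0))        # probabilities sum to one
    print(np.isclose(probs @ evals, (psi.conj() @ A @ psi).real))   # Eq. (38)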


VI. POSITION REPRESENTATION

Let x′ be an eigenvalue of the observable x and let |x′⟩ be the corresponding eigenvector,

  x|x'\rangle = x'|x'\rangle.

Let us define an operator as follows,

  J(\Delta x) = \exp\!\left(-\frac{i\, p\, \Delta x}{\hbar}\right),

where ∆x ∈ ℝ. It is easy to show that J(∆x) is unitary.

Theorem: Let x′ be an eigenvalue of the observable x and let |x′⟩ be the corresponding
eigenvector. Then the ket vector J(∆x)|x′⟩ is a normalised eigenvector of x with
eigenvalue x′ + ∆x.

Proof: It can be proved that [x, B(p)] = i\hbar\, dB/dp. Hence

  [x, J(\Delta x)] = i\hbar \left( \frac{\Delta x}{i\hbar} \right) J(\Delta x) = \Delta x\, J(\Delta x)

  \Rightarrow\; x\, J(\Delta x)|x'\rangle = \left( [x, J(\Delta x)] + J(\Delta x)\, x \right)|x'\rangle = (x' + \Delta x)\, J(\Delta x)|x'\rangle.

Hence the ket vector J(∆x)|x′⟩ is an eigenvector of x with eigenvalue x′ + ∆x. It follows
from the theorem that

  J(\Delta x)|x'\rangle = |x' + \Delta x\rangle.

The important point to note here is that the spectrum of eigenvalues of the operator x is
continuous and contains all real numbers.

• The position wavefunction ψ_α(x′) of a state vector |α⟩ is defined as ψ_α(x′) = ⟨x′|α⟩.

• ⟨x′|x|α⟩ = x′⟨x′|α⟩ = x′ ψ_α(x′).

• Let A(x) be a function of x with the expansion A(x) = Σ_n a_n x^n. Then

  \langle x'|A(x)|\alpha\rangle = \sum_n a_n x'^{\,n} \langle x'|\alpha\rangle = A(x')\, \psi_\alpha(x').


• ⟨x′|J(∆x)|α⟩ = ⟨x′|J†(−∆x)|α⟩ = ⟨x′ − ∆x|α⟩ = ψ_α(x′ − ∆x).

• Since, to first order in ∆x,

  J(-\Delta x) = \exp\!\left(\frac{i\, p\, \Delta x}{\hbar}\right) \simeq 1 + \frac{i\,\Delta x}{\hbar}\, p,

  \Rightarrow\; \langle x'|J(-\Delta x)|\alpha\rangle = \psi_\alpha(x') + \frac{i\,\Delta x}{\hbar}\, \langle x'|p|\alpha\rangle.

  As ⟨x′|J(−∆x)|α⟩ = ψ_α(x′ + ∆x),

  \Rightarrow\; \langle x'|p|\alpha\rangle = -i\hbar\, \frac{\psi_\alpha(x' + \Delta x) - \psi_\alpha(x')}{\Delta x}.

  In the limit ∆x → 0,

  \langle x'|p|\alpha\rangle = -i\hbar\, \frac{d\psi_\alpha}{dx'}.
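  The result ⟨x′|p|α⟩ = −iħ dψ_α/dx′ can be checked on a grid. The sketch below uses an
  arbitrary Gaussian wave packet with ħ = 1 (both are assumptions of the illustration) and
  compares a finite-difference derivative with the analytic one.

    import numpy as np

    hbar = 1.0
    x = np.linspace(-10.0, 10.0, 2001)
    dx = x[1] - x[0]
    k0, sigma = 1.5, 1.0

    # Gaussian wave packet and its exact derivative.
    psi = np.exp(1j * k0 * x) * np.exp(-x**2 / (2.0 * sigma**2))
    dpsi_exact = (1j * k0 - x / sigma**2) * psi

    p_psi_fd = -1j * hbar * np.gradient(psi, dx)       # -i hbar d/dx via finite differences
    p_psi_exact = -1j * hbar * dpsi_exact
    print(np.max(np.abs(p_psi_fd - p_psi_exact)))      # small discretization error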

VII. MOMENTUM REPRESENTATION

Let p′ be an eigenvalue of the observable p, and let |p′⟩ be the corresponding eigenvector,
i.e.

  p|p'\rangle = p'|p'\rangle.

• Closure relation: \int dp'\, |p'\rangle\langle p'| = 1.

• Orthonormality: \langle p''|p'\rangle = \delta(p' - p'').

• Momentum wavefunction: φ_α(p′) = ⟨p′|α⟩.

• The probability density to measure a momentum p′ is |φ_α(p′)|² = |⟨p′|α⟩|².

• |\alpha\rangle = \int_{-\infty}^{\infty} dp'\, |p'\rangle\langle p'|\alpha\rangle = \int_{-\infty}^{\infty} dp'\, \phi_\alpha(p')\, |p'\rangle.

• \langle\beta|\alpha\rangle = \int_{-\infty}^{\infty} dp'\, \langle\beta|p'\rangle\langle p'|\alpha\rangle = \int_{-\infty}^{\infty} dp'\, \phi_\beta^*(p')\, \phi_\alpha(p').


VIII. TRANSFORMATION FUNCTION

What is the relation between the position wavefunction ψ_α(x′) and its momentum counterpart
φ_α(p′)?

  \langle x'|p|p'\rangle = p'\langle x'|p'\rangle = -i\hbar \frac{\partial}{\partial x'} \langle x'|p'\rangle.

The solution of the above differential equation is

  \langle x'|p'\rangle = N \exp\!\left(\frac{i p' x'}{\hbar}\right),

and fixing the normalization N by requiring \langle p''|p'\rangle = \delta(p' - p'') gives

  \langle x'|p'\rangle = \frac{1}{\sqrt{2\pi\hbar}} \exp\!\left(\frac{i p' x'}{\hbar}\right).

Hence

  \psi_\alpha(x') = \langle x'|\alpha\rangle = \int_{-\infty}^{\infty} dp'\, \langle x'|p'\rangle\langle p'|\alpha\rangle = \frac{1}{\sqrt{2\pi\hbar}} \int_{-\infty}^{\infty} dp'\, e^{i p' x'/\hbar}\, \phi_\alpha(p'),

  \phi_\alpha(p') = \langle p'|\alpha\rangle = \int_{-\infty}^{\infty} dx'\, \langle p'|x'\rangle\langle x'|\alpha\rangle = \frac{1}{\sqrt{2\pi\hbar}} \int_{-\infty}^{\infty} dx'\, e^{-i p' x'/\hbar}\, \psi_\alpha(x').
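As an illustration (with ħ = 1 and a Gaussian ψ_α chosen for convenience; grid and widths are
arbitrary), the last relation can be evaluated by direct quadrature and compared with the
known Gaussian momentum wavefunction.

    import numpy as np

    hbar, sigma = 1.0, 1.0
    x = np.linspace(-20.0, 20.0, 4001)
    dx = x[1] - x[0]
    psi = (np.pi * sigma**2) ** -0.25 * np.exp(-x**2 / (2.0 * sigma**2))   # normalized Gaussian

    p = np.linspace(-5.0, 5.0, 101)
    kernel = np.exp(-1j * np.outer(p, x) / hbar)          # e^{-i p' x' / hbar}
    phi_num = np.sum(kernel * psi, axis=1) * dx / np.sqrt(2.0 * np.pi * hbar)

    # Known momentum-space Gaussian for comparison.
    phi_exact = (sigma**2 / (np.pi * hbar**2)) ** 0.25 * np.exp(-sigma**2 * p**2 / (2.0 * hbar**2))
    print(np.max(np.abs(phi_num - phi_exact)))            # small quadrature error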
