
TENSOR ANALYSIS

Thomson and Arun

January 2024

INTRODUCTION

Welcome to the realm of Tensor Analysis, a fascinating journey into the mathematical framework that unveils the complexities of multidimensional relationships. In this project, we embark on a quest to understand and harness the power of tensors, extending our grasp beyond conventional vectors and matrices. As we delve into the depths of tensor analysis, we unlock a toolkit that has revolutionized fields ranging from physics and engineering to cutting-edge technologies like machine learning and data science.

Tensors, with their ability to represent complex relationships, form the backbone of this exploration. Whether you're a student, researcher, or enthusiast, this project offers an opportunity to explore practical applications and witness the transformative impact of tensor analysis in solving real-world problems.

Together, let's unravel the mathematical elegance of tensors and explore how this versatile tool can elevate our understanding of the physical world, enhance problem-solving capabilities, and pave the way for innovative applications in diverse domains. Welcome to the world of Tensor Analysis, where mathematical abstraction meets tangible solutions.

PRELIMINARY

• Magnitude

Refers to the size or extent of a quantity without regard to its direction.

• Direction

Refers to the orientation of a quantity in space, that is, the line and sense along which it acts.

• Vector

A vector is a mathematical object that represents a quantity with magnitude and direction.

• Coordinate System

It is a mathematical framework used to uniquely determine the position of points or objects in space.

• Single Valued Function

It is a mathematical function that assigns a unique output value to each input value in its domain.

• Dimension

In the context of tensors, the dimension refers to the number of indices needed to specify an element within the tensor. For example, a scalar (0-dimensional tensor) has no indices, a vector (1-dimensional tensor) has one index, a matrix (2-dimensional tensor) has two indices, and so on. Each index corresponds to a particular axis or direction within the tensor.

• Coordinate of Tensors

It refers to the indices associated with each component of the tensor within a given coordinate system.

• Symmetric Matrix

It is a square matrix that is equal to its transpose.

• Self Diffusion Coefficient

The self diffusion coefficient of a tensor refers to the diffusion coefficient associated with the random motion of tensor quantities within a material.

• Cross Diffusion Coefficient

The cross diffusion coefficient of a tensor describes the mutual diffusion behaviour between different components or modes of the tensor field.

• Commutative law

It states that for certain binary operations, the order of the operands does not affect the result.
That is, a ∗ b = b ∗ a for a binary operation '∗'.

• Associative Law

It states that the grouping of elements does not affect the result of a binary operation:
(a ∗ b) ∗ c = a ∗ (b ∗ c), for a binary operation '∗'.

• Distributive Law

It states that multiplication distributes over addition and subtraction:
a ∗ (b ± c) = a ∗ b ± a ∗ c.

• Arbitrary Tensor

It is a mathematical object that generalizes the concepts of scalars, vectors and matrices.

Chapter-1
TENSORS

Some physical quantities are specified by their magnitude only, while others by their magnitude and direction. But certain quantities are associated with two or more directions. Such a quantity is called a tensor. The stress at a point of an elastic solid is an example of a tensor, which depends on two directions: one normal to the area and the other that of the force acting on it.

The tensor was first introduced by William Rowan Hamilton in 1846 and later became known to scientists through the publication of Levi-Civita's book on the absolute differential calculus. Because of its structured representation of data and its ability to reduce the complexity of multidimensional arrays, the tensor has gradually been applied in various fields, such as Dictionary Learning, Magnetic Resonance Imaging (MRI), Spectral Data Classification and Image Deblurring.

The properties of tensors are independent of the frames of reference used to describe them. That is why Einstein found tensors a convenient tool for the formulation of his relativity theory. Since then the subject of tensor analysis has shot into prominence and is of great use in the study of Riemannian Geometry, Mechanics, Elasticity, Electromagnetic theory and numerous other fields of science and engineering. Tensors are generally defined in terms of their ranks and components.

1 Some Basic Concepts of Tensors

1.1 Summation Conventions

Consider a sum of the type

a_1 x_1 + a_2 x_2 + ⋯ + a_n x_n,

that is,

Σ_{i=1}^{n} a_i x_i    (1)

In tensor analysis, the subscripts of the symbols x_1, x_2, …, x_n are replaced by superscripts and we write these as x^1, x^2, …, x^n. The superscripts do not denote powers of x, but act as labels to distinguish different symbols. The power of a symbol (say x^i) will be indicated as (x^i)^2, (x^i)^3, etc. Hence equation (1) is written as

Σ_{i=1}^{n} a_i x^i    (2)

A still simpler notation is to drop the summation sign and write

a_i x^i    (3)

In this notation the repeated index i successively takes the values 1, 2, 3, …, n and the expression (3) represents the sum of all such terms. A repeated index over which the summation is to be done is called a dummy index, since it does not appear in the final result. This device, due to Einstein, is called the summation convention.
eg:
Write out the terms contained in S = a_{ij} x^i x^j, taking n = 3.

Solution:
Since the index i occurs once as a subscript and once as a superscript, we sum over i from 1 to 3:

S = a_{1j} x^1 x^j + a_{2j} x^2 x^j + a_{3j} x^3 x^j

Now each term in S has to be summed with respect to the repeated index j from 1 to 3:

S = a_{11} x^1 x^1 + a_{12} x^1 x^2 + a_{13} x^1 x^3 + a_{21} x^2 x^1 + a_{22} x^2 x^2 + a_{23} x^2 x^3 + a_{31} x^3 x^1 + a_{32} x^3 x^2 + a_{33} x^3 x^3

S = a_{11} (x^1)^2 + a_{22} (x^2)^2 + a_{33} (x^3)^2 + [a_{12} + a_{21}] x^1 x^2 + [a_{13} + a_{31}] x^1 x^3 + [a_{23} + a_{32}] x^2 x^3
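The worked example above can be checked numerically. The sketch below uses hypothetical values for a_{ij} and x^i and evaluates S = a_{ij} x^i x^j with numpy's einsum, which implements exactly this repeated-index summation:

```python
import numpy as np

# Hypothetical coefficients a_ij and components x^i for n = 3.
a = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0]])
x = np.array([1.0, 2.0, 3.0])

# Summation convention: the repeated indices i and j are summed over.
S = np.einsum("ij,i,j->", a, x, x)

# Expanded form from the worked example above:
S_expanded = (a[0, 0]*x[0]**2 + a[1, 1]*x[1]**2 + a[2, 2]*x[2]**2
              + (a[0, 1] + a[1, 0])*x[0]*x[1]
              + (a[0, 2] + a[2, 0])*x[0]*x[2]
              + (a[1, 2] + a[2, 1])*x[1]*x[2])

assert np.isclose(S, S_expanded)
```

The einsum index string "ij,i,j->" mirrors the tensor notation directly: every index that appears more than once is summed, and the empty output slot means the result is a scalar.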

1.2 Transformation of Coordinates

In a three dimensional space the coordinates of a point are (x^1, x^2, x^3) referred to a particular frame of reference. Similarly, in an n-dimensional space the coordinates of a point are n independent variables (x^1, x^2, …, x^n) with respect to a certain frame of reference. Let (x̄^1, x̄^2, …, x̄^n) be the coordinates of the same point referred to another frame of reference, and suppose x̄^1, x̄^2, …, x̄^n are independent single valued functions of x^1, x^2, …, x^n, so that

x̄^1 = φ^1(x^1, x^2, …, x^n)
x̄^2 = φ^2(x^1, x^2, …, x^n)
⋮
x̄^n = φ^n(x^1, x^2, …, x^n)

or, more briefly,

x̄^i = φ^i(x^1, x^2, …, x^n)    (4)

We can solve the equations (4) and express the x^i as functions of the x̄^i, so that

x^i = ψ^i(x̄^1, x̄^2, …, x̄^n)    (5)

The equations (4) and (5) are said to define a transformation of the coordinates from one frame of reference to another.

1.3 Kronecker Delta

The quantity δ_{ij} defined by the relations

δ_{ij} = 0 when j ≠ i,  δ_{ij} = 1 when j = i

is called the Kronecker Delta.

Evidently

δ_{11} = δ_{22} = δ_{33} = ⋯ = δ_{nn} = 1,

while

δ_{12} = δ_{23} = ⋯ = δ_{n n+1} = 0.

We note that, summing with respect to the repeated index j,

a_{3j} δ_{2j} = a_{31} δ_{21} + a_{32} δ_{22} + a_{33} δ_{23} + a_{34} δ_{24} + …
= 0 + a_{32} + 0 + 0 + …
= a_{32}

In general,

a_{ij} δ_{kj} = a_{i1} δ_{k1} + a_{i2} δ_{k2} + ⋯ + a_{ik} δ_{kk} + ⋯ + a_{in} δ_{kn}
= 0 + 0 + ⋯ + a_{ik} + ⋯ + 0
= a_{ik}
note: The Kronecker Delta is named after the German mathematician Leopold Kronecker [1823-91], who made important contributions in algebra and group theory.
Eg:
Show that a_{ij} A_{kj} = ∆ δ_{ik}, where ∆ = |a_{ij}| is a determinant of order 3 and A_{ij} is the cofactor of a_{ij}.

Solution:
By expansion of the determinant we have

a_{11} A_{11} + a_{12} A_{12} + a_{13} A_{13} = ∆
a_{11} A_{21} + a_{12} A_{22} + a_{13} A_{23} = 0
a_{11} A_{31} + a_{12} A_{32} + a_{13} A_{33} = 0

which can be written compactly as

a_{1j} A_{1j} = ∆
a_{1j} A_{2j} = 0
a_{1j} A_{3j} = 0

Using the Kronecker Delta notation these can be combined into the single equation

a_{1j} A_{kj} = ∆ δ_{1k}    (i = 1)

Similarly,

a_{2j} A_{kj} = ∆ δ_{2k}
a_{3j} A_{kj} = ∆ δ_{3k}

All these equations are included in

a_{ij} A_{kj} = ∆ δ_{ik}
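The identity a_{ij} A_{kj} = ∆ δ_{ik} can also be verified numerically. In this sketch the cofactor matrix is obtained as det(a)·inv(a)^T, which assumes a is invertible; the matrix entries are made up for illustration:

```python
import numpy as np

# Hypothetical invertible 3x3 matrix.
a = np.array([[2.0, 0.0, 1.0],
              [1.0, 3.0, 0.0],
              [0.0, 1.0, 4.0]])
det = np.linalg.det(a)
cof = det * np.linalg.inv(a).T   # cof[i, j] is the cofactor A_ij

# a_ij A_kj summed over j equals det * delta_ik:
lhs = np.einsum("ij,kj->ik", a, cof)
assert np.allclose(lhs, det * np.eye(3))
```

The diagonal entries of `lhs` reproduce the determinant expansion along each row, while the off-diagonal entries vanish, exactly as in the three displayed equations above.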

1.4 Diffusion Tensor

Diffusion Tensor refers to a mathematical representation used in the field of Diffusion Tensor Imaging [DTI], a specialised Magnetic Resonance Imaging [MRI] technique. The diffusion tensor characterizes the diffusion of water molecules within biological tissues, providing valuable insights into tissue microstructure, particularly in the brain.

The diffusion tensor is a 3 × 3 symmetric matrix that describes diffusion within each 3-dimensional pixel (voxel) of the image. Each element of the matrix represents the rate and direction of diffusion along the corresponding axes. Mathematically, the diffusion tensor is denoted as

    D = [ Dxx  Dxy  Dxz
          Dyx  Dyy  Dyz
          Dzx  Dzy  Dzz ]

Where,

• Dxx, Dyy, Dzz represent the self diffusion coefficients along the x, y, z axes.

• Dxy, Dzx, Dyz, … represent the cross diffusion coefficients, which describe the correlation between diffusion along different axes.
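As a rough illustration (with made-up entries, not real MRI data), the symmetry of D and the diffusivities along its principal axes can be examined with numpy:

```python
import numpy as np

# Illustrative diffusion tensor (hypothetical values, units of 1e-3 mm^2/s).
D = np.array([[1.7, 0.2, 0.1],
              [0.2, 0.5, 0.0],
              [0.1, 0.0, 0.4]])

# D must be symmetric; its eigenvalues and eigenvectors give the principal
# diffusivities and the principal diffusion directions.
assert np.allclose(D, D.T)
eigvals, eigvecs = np.linalg.eigh(D)

# Physically meaningful diffusivities are positive.
assert np.all(eigvals > 0)
```

`np.linalg.eigh` is used rather than the general `eig` because it exploits the symmetry of D and returns real eigenvalues in ascending order.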

Chapter-2
DIFFERENT TYPES OF TENSORS

2 Different Types Of Tensors

2.1 Scalars and Invariants

Let φ be a function of the coordinates x^j, and let φ̄ be its functional value under a transformation to the coordinates x̄^j. If φ = φ̄, then φ is called a scalar or invariant.

note: The order/rank of a tensor represents the number of directions along which the tensor can vary. A scalar is also called a tensor of order zero.

2.2 Contravariant Tensors of First Order

If N quantities A^i (i = 1, 2, …, N) in a coordinate system x^i (i = 1, 2, …, N) are related to N other quantities Ā^j (j = 1, 2, …, N) in another coordinate system x̄^j (j = 1, 2, …, N) by the system of equations

Ā^k = Σ_{s=1}^{N} (∂x̄^k/∂x^s) A^s,

which can be written using the summation convention as

Ā^k = (∂x̄^k/∂x^s) A^s,  k = 1, 2, …, N,

then the A^s are called the components of a contravariant tensor of first order or first rank.

2.3 Covariant Tensor of First Order

If N quantities A_i in a coordinate system x^i are related to N other quantities Ā_j in another coordinate system x̄^j by the equations

Ā_k = (∂x^s/∂x̄^k) A_s,

then the A_s are called the components of a covariant tensor of first order or first rank.
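The two first-order laws can be contrasted numerically for a linear change of coordinates x̄ = Mx, where the Jacobian ∂x̄^k/∂x^s is M[k, s] and ∂x^s/∂x̄^k is inv(M)[s, k]. A minimal sketch with a hypothetical M and components A:

```python
import numpy as np

# Linear coordinate change xbar = M @ x.
M = np.array([[2.0, 1.0],
              [0.0, 3.0]])
Minv = np.linalg.inv(M)

A = np.array([1.0, 2.0])   # components in the x system

# Contravariant law: Abar^k = (dxbar^k/dx^s) A^s
A_contra = np.einsum("ks,s->k", M, A)
# Covariant law:     Abar_k = (dx^s/dxbar^k) A_s
A_co = np.einsum("sk,s->k", Minv, A)

# For a non-orthogonal M the two transformed sets of components differ:
assert not np.allclose(A_contra, A_co)
```

When M is orthogonal (a rotation), inv(M) equals M transposed and the two laws give the same numbers, which is why the distinction is invisible in Cartesian vector algebra.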

2.4 Contravariant Tensors of Higher Rank

1. If N^2 quantities A^{ij} (i, j = 1, 2, …, N) of a coordinate system x^i are related to N^2 other quantities Ā^{rs} (r, s = 1, 2, …, N) of another coordinate system x̄^j by the system

   Ā^{rs} = (∂x̄^r/∂x^l)(∂x̄^s/∂x^m) A^{lm},

   then the quantities A^{lm} are called the components of a contravariant tensor of rank two.
2. If N^5 quantities A^{ijklm} (1 ≤ i, j, k, l, m ≤ N) of a coordinate system x^i are related to N^5 other quantities Ā^{pqrst} (1 ≤ p, q, r, s, t ≤ N) of another coordinate system x̄^j by the system

   Ā^{pqrst} = (∂x̄^p/∂x^i)(∂x̄^q/∂x^j)(∂x̄^r/∂x^k)(∂x̄^s/∂x^l)(∂x̄^t/∂x^m) A^{ijklm},

   then the quantities A^{ijklm} are called the components of a contravariant tensor of rank five.

Similarly, we can define contravariant tensors of higher rank.
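For a rank-two contravariant tensor the transformation law amounts to multiplying by the Jacobian on both sides. A small sketch with a hypothetical constant Jacobian J:

```python
import numpy as np

# Jacobian J[r, l] = dxbar^r/dx^l for a hypothetical linear coordinate change.
J = np.array([[1.0, 2.0],
              [3.0, 4.0]])
A = np.array([[1.0, 0.0],
              [0.0, 2.0]])   # A^{lm} in the x system

# Abar^{rs} = (dxbar^r/dx^l)(dxbar^s/dx^m) A^{lm}
Abar = np.einsum("rl,sm,lm->rs", J, J, A)

# For rank two this is just J A J^T:
assert np.allclose(Abar, J @ A @ J.T)
```

The einsum form generalizes directly to higher ranks (one Jacobian factor per index), whereas the matrix form J A Jᵀ is special to rank two.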

2.5 Covariant Tensor of Higher Rank

1. If N^2 quantities A_{ij} (i, j = 1, 2, …, N) of a coordinate system x^i are related to N^2 quantities Ā_{rs} of the x̄^j system by

   Ā_{rs} = (∂x^p/∂x̄^r)(∂x^q/∂x̄^s) A_{pq},

   then the A_{pq} are the components of a covariant tensor of rank two or of second order.
2. If N^4 quantities A_{ijkl} (1 ≤ i, j, k, l ≤ N) of the system x^i are related to Ā_{pqrs} of the system x̄^j by

   Ā_{pqrs} = (∂x^i/∂x̄^p)(∂x^j/∂x̄^q)(∂x^k/∂x̄^r)(∂x^l/∂x̄^s) A_{ijkl},

   then the A_{ijkl} are the components of a covariant tensor of rank four or of order four.

2.6 Mixed Tensor

It is a tensor which is of both contravariant and covariant nature.

1. If N^2 quantities A^k_l of x^i are related to Ā^p_q of x̄^j by

   Ā^p_q = (∂x̄^p/∂x^k)(∂x^l/∂x̄^q) A^k_l,

   then the A^k_l are said to be the components of a mixed tensor of rank two or of second order, with one index of contravariant and one of covariant nature.
2. If N^5 quantities A^{qst}_{kl} of x^i are related to Ā^{prm}_{ij} of x̄^j by

   Ā^{prm}_{ij} = (∂x̄^p/∂x^q)(∂x̄^r/∂x^s)(∂x̄^m/∂x^t)(∂x^k/∂x̄^i)(∂x^l/∂x̄^j) A^{qst}_{kl},

   then the A^{qst}_{kl} are said to be the components of a mixed tensor of rank five or order five, of which three indices are of contravariant and two of covariant nature.

note:

1. The Kronecker delta δ^i_j is a mixed tensor of order two.

2. The number of components of a tensor = (Dimension)^{rank}.

3. If the components of two tensors are equal in one coordinate system, then they are equal in all coordinate systems.

2.7 Symmetric and Skew-Symmetric Tensors

A tensor is said to be symmetric with respect to two contravariant or two covariant indices if its components remain unaltered upon interchange of the two indices. Thus the tensors A^{kl} and A_{mp} are symmetric if A^{kl} = A^{lk} for all k, l and A_{mp} = A_{pm} for all m, p.

In general a tensor is called symmetric if it is symmetric with respect to any two contravariant or any two covariant indices.

Thus A^{ij}_{klm} is symmetric if A^{ij}_{klm} = A^{ji}_{klm}, or A^{ij}_{klm} = A^{ij}_{lkm}, or A^{ij}_{klm} = A^{ij}_{kml}, etc.

A tensor is said to be skew-symmetric with respect to two contravariant or two covariant indices if its components change sign upon interchange of the two indices.

∴ A^{ij} is skew-symmetric if A^{ij} = −A^{ji}, and A^{ij}_{klm} is called skew-symmetric if A^{ij}_{klm} = −A^{ij}_{kml}, etc.

Chapter-3
FUNDAMENTAL OPERATIONS WITH TENSORS

3 Fundamental Operations with Tensors

3.1 Addition

The sum (difference) of two tensors of the same order and type is another tensor of the same order and type.

Let A_{ij} and B_{ij} be two tensors of the same order and the same type. Their components in the coordinate system x̄^1, x̄^2, …, x̄^n are Ā_{ij} and B̄_{ij}, such that

Ā_{ij} = (∂x^k/∂x̄^i)(∂x^l/∂x̄^j) A_{kl}

and

B̄_{ij} = (∂x^k/∂x̄^i)(∂x^l/∂x̄^j) B_{kl}.

Hence

Ā_{ij} ± B̄_{ij} = (∂x^k/∂x̄^i)(∂x^l/∂x̄^j) (A_{kl} ± B_{kl}),

that is,

C̄_{ij} = (∂x^k/∂x̄^i)(∂x^l/∂x̄^j) C_{kl}.

Thus C_{ij} transforms in exactly the same manner as A_{ij} and B_{ij} and is therefore a tensor of the same order and the same type.

3.2 Outer Product of Two Tensors

The outer product of two tensors is a tensor whose rank is the sum of the ranks of the two. If A^{ij} is a contravariant tensor of rank two and B_k is a covariant tensor of rank one, then their outer product is a mixed tensor C^{ij}_k of order 3 such that

C̄^{ij}_k = Ā^{ij} B̄_k = [(∂x̄^i/∂x^p)(∂x̄^j/∂x^q) A^{pq}] [(∂x^l/∂x̄^k) B_l]

= (∂x̄^i/∂x^p)(∂x̄^j/∂x^q)(∂x^l/∂x̄^k) A^{pq} B_l

= (∂x̄^i/∂x^p)(∂x̄^j/∂x^q)(∂x^l/∂x̄^k) C^{pq}_l,

where C^{pq}_l = A^{pq} B_l is a mixed tensor of rank 3.

As the product involves ordinary multiplication of the components of the tensors it is called the outer or open product.

note:

1. Not every tensor can be written as a product of two tensors of lower rank. For this reason division of tensors is not always possible.

2. The outer product of tensors satisfies the commutative and associative laws and is distributive over addition.
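The rank-adding property of the outer product is easy to see numerically; in this sketch A^{ij} has rank two and B_k rank one, so their outer product C^{ij}_k carries 2 + 1 = 3 indices:

```python
import numpy as np

A = np.arange(9.0).reshape(3, 3)   # A^{ij}: rank 2
B = np.array([1.0, 2.0, 3.0])      # B_k:    rank 1

# Outer product C^{ij}_k = A^{ij} B_k: no index is repeated, so nothing is summed.
C = np.einsum("ij,k->ijk", A, B)

assert C.ndim == 3                 # rank 2 + 1 = 3
assert C.shape == (3, 3, 3)        # (Dimension)^rank = 27 components
assert np.isclose(C[1, 2, 0], A[1, 2] * B[0])
```

Each component of C is an ordinary product of one component of A and one of B, which is exactly why the operation is called the outer or open product.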

3.3 Contraction of a Tensor

Consider a mixed tensor A^{ijk}_l of order four. By the law of transformation we have

Ā^{ijk}_l = (∂x̄^i/∂x^p)(∂x̄^j/∂x^q)(∂x̄^k/∂x^r)(∂x^s/∂x̄^l) A^{pqr}_s.

We now put the covariant index l equal to the contravariant index i, so that

Ā^{ijk}_i = (∂x̄^i/∂x^p)(∂x̄^j/∂x^q)(∂x̄^k/∂x^r)(∂x^s/∂x̄^i) A^{pqr}_s

= (∂x̄^j/∂x^q)(∂x̄^k/∂x^r)(∂x^s/∂x^p) A^{pqr}_s

= (∂x̄^j/∂x^q)(∂x̄^k/∂x^r) δ^s_p A^{pqr}_s

= (∂x̄^j/∂x^q)(∂x̄^k/∂x^r) A^{pqr}_p.

This shows that A^{ijk}_i is a contravariant tensor of order two.

The process of getting a tensor of lower order (reduced by 2) by setting a covariant index equal to a contravariant index and performing the summation indicated is known as contraction.

The tensors A^{ijk}_i and A^{ijk}_j obtained by contraction of the same tensor A^{ijk}_l are generally different from each other, unless the tensor A^{ijk}_l is symmetric with respect to i and j (ie, A^{ijk}_j = A^{ijk}_i).
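Contraction can be sketched with einsum, which sums any repeated index. Starting from an order-four mixed tensor (random components for illustration), setting l = i yields an order-two tensor, and contracting on j instead generally gives a different result:

```python
import numpy as np

# A mixed tensor A^{ijk}_l of order four, stored as A[i, j, k, l].
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3, 3, 3))

# Contract the covariant index l with the contravariant index i:
B = np.einsum("ijki->jk", A)   # B^{jk} = A^{ijk}_i, order two
assert B.shape == (3, 3)       # order reduced by 2

# Contracting l with j instead generally gives a different tensor:
C = np.einsum("ijkj->ik", A)
assert not np.allclose(B, C)
```

The two contractions coincide only when A is symmetric in i and j, matching the remark above.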

3.4 Inner Product of Two Tensors

Given the tensors A^{ij}_k and B^p_{qr}, if we first form their outer product A^{ij}_k B^p_{qr} and contract this by putting p = k, then the result is A^{ij}_k B^k_{qr}, which is also a tensor, called an inner product of the given tensors.

Hence the inner product of two tensors is obtained by first taking their outer product and then contracting it. We can get several inner products of the same two tensors by contracting in different ways.
Example:
Show that any inner product of the tensors A^p_r and B^{qs}_t is a tensor of rank 3.

Solution:
The transformation laws for A^p_r and B^{qs}_t are

Ā^p_r = (∂x̄^p/∂x^i)(∂x^k/∂x̄^r) A^i_k    (6)

and

B̄^{qs}_t = (∂x̄^q/∂x^j)(∂x̄^s/∂x^l)(∂x^m/∂x̄^t) B^{jl}_m    (7)

∴ Putting r = q, the inner product Ā^p_q B̄^{qs}_t is

Ā^p_q B̄^{qs}_t = [(∂x̄^p/∂x^i)(∂x^k/∂x̄^q)] [(∂x̄^q/∂x^j)(∂x̄^s/∂x^l)(∂x^m/∂x̄^t)] A^i_k B^{jl}_m

= (∂x̄^p/∂x^i)(∂x̄^s/∂x^l)(∂x^m/∂x̄^t) δ^k_j A^i_k B^{jl}_m

(since (∂x^k/∂x̄^q)(∂x̄^q/∂x^j) = ∂x^k/∂x^j = δ^k_j)

= (∂x̄^p/∂x^i)(∂x̄^s/∂x^l)(∂x^m/∂x̄^t) A^i_j B^{jl}_m

Hence the inner product A^p_q B^{qs}_t is a tensor of rank 3.

Similarly, putting p = t in the product of (6) and (7) and noting that

(∂x̄^p/∂x^i)(∂x^m/∂x̄^p) = ∂x^m/∂x^i = δ^m_i,

A^p_r B^{qs}_p is found to be a tensor of rank 3. Similarly, A^p_r B^{qr}_t can also be shown to be a tensor of rank 3.
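The inner product, an outer product followed by a contraction, can be formed in a single einsum call. A sketch with random components, contracting the lower index r of A^p_r against the first upper index of B^{qs}_t:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))     # A^p_r, stored as A[p, r]
B = rng.standard_normal((3, 3, 3))  # B^{qs}_t, stored as B[q, s, t]

# Outer product has rank 2 + 3 = 5; contracting r with q removes two
# indices, leaving the rank-3 inner product A^p_r B^{rs}_t:
inner = np.einsum("pr,rst->pst", A, B)

assert inner.shape == (3, 3, 3)
```

Contracting a different pair of indices (for example r with t) gives another, generally distinct, rank-3 inner product of the same two tensors.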

3.5 Quotient Law

To ascertain whether a given set of functions forms the components of a tensor, we have to verify whether the functions obey the tensor transformation laws. But this is a very tedious job. A simpler test is provided by the quotient law, which states that if the inner product of a set of functions with an arbitrary tensor is a tensor, then the given set of functions are the components of a tensor.

The proof of this law is given below for a particular case.

Example:
Show that the expression A(i, j, k) is a tensor if its inner product with an arbitrary tensor B^{jl}_k is a tensor.

Solution:
Let

A(i, j, k) B^{jl}_k = C^l_i    (8)

where C^l_i is a tensor. In the coordinate system x̄^i, let (8) transform to

Ā(p, q, r) B̄^{qs}_r = C̄^s_p    (9)

where B̄^{qs}_r and C̄^s_p are the components of the tensors B^{jl}_k and C^l_i. Expressing B̄^{qs}_r in terms of B^{jl}_k and C̄^s_p in terms of C^l_i, (9) takes the form

Ā(p, q, r) (∂x̄^q/∂x^j)(∂x̄^s/∂x^l)(∂x^k/∂x̄^r) B^{jl}_k = (∂x̄^s/∂x^l)(∂x^i/∂x̄^p) C^l_i    (10)

Multiplying (8) by (∂x̄^s/∂x^l)(∂x^i/∂x̄^p) and subtracting from (10), we get

[ Ā(p, q, r) (∂x̄^q/∂x^j)(∂x̄^s/∂x^l)(∂x^k/∂x̄^r) − A(i, j, k) (∂x̄^s/∂x^l)(∂x^i/∂x̄^p) ] B^{jl}_k = 0

Now, B^{jl}_k being an arbitrary tensor, the quantity within the brackets must be identically zero, that is,

Ā(p, q, r) (∂x̄^q/∂x^j)(∂x̄^s/∂x^l)(∂x^k/∂x̄^r) = A(i, j, k) (∂x̄^s/∂x^l)(∂x^i/∂x̄^p)

whence

Ā(p, q, r) = (∂x^i/∂x̄^p)(∂x^j/∂x̄^q)(∂x̄^r/∂x^k) A(i, j, k)

But this is the law of tensor transformation. Hence A(i, j, k) is a tensor of order 3, with i, j as covariant indices and k as a contravariant index.

Chapter-4
APPLICATIONS OF TENSOR ANALYSIS

Tensor analysis finds application in various real-life scenarios across different fields. Here are some practical applications.

• Medical Imaging

Application: Tensor analysis is applied in medical imaging techniques like MRI and diffusion tensor imaging (DTI).

Working Principle: In DTI, tensors are utilised to describe the diffusion of water molecules in biological tissues. The diffusion tensor, represented by D, is a 3 × 3 symmetric matrix. The eigenvalues (λ1, λ2, λ3) and eigenvectors (v1, v2, v3) of this tensor provide information about the magnitude and direction of water diffusion.
One crucial metric derived from the diffusion tensor is the Fractional Anisotropy (FA), which quantifies the degree of anisotropy in the diffusion process. The formula for FA is

FA = √(3/2) · √( [(λ1 − MD)² + (λ2 − MD)² + (λ3 − MD)²] / (λ1² + λ2² + λ3²) ),

where MD is the mean diffusivity, given by

MD = (λ1 + λ2 + λ3) / 3.

This calculation involves obtaining the eigenvalues from the diffusion tensor. FA values range from 0 to 1, where 0 indicates isotropic diffusion (equal diffusion in all directions) and 1 indicates highly anisotropic diffusion.
These calculations are crucial for understanding microstructural changes in tissues, particularly in neurological studies where alterations in FA values may indicate abnormalities or pathologies.

note: Keep in mind that actual implementation and interpretation involve specialised software and expertise in MRI data analysis.
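The FA computation can be sketched as follows, using the mean-diffusivity form FA = √(3/2)·√(Σ(λᵢ − MD)² / Σλᵢ²); the tensor entries are illustrative, not real MRI data:

```python
import numpy as np

# Illustrative diffusion tensor (made-up values, units of 1e-3 mm^2/s).
D = np.array([[1.7, 0.0, 0.0],
              [0.0, 0.3, 0.0],
              [0.0, 0.0, 0.2]])

lam = np.linalg.eigvalsh(D)   # eigenvalues lambda_1..lambda_3
MD = lam.mean()               # mean diffusivity

# Fractional anisotropy from the eigenvalues:
FA = np.sqrt(1.5 * np.sum((lam - MD) ** 2) / np.sum(lam ** 2))

assert 0.0 <= FA <= 1.0
# One dominant eigenvalue means strongly directional diffusion, so FA is high:
assert FA > 0.7
```

With all three eigenvalues equal (isotropic diffusion) the numerator vanishes and FA is 0; the more one eigenvalue dominates, the closer FA gets to 1.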

• Image Processing

Application: Tensor analysis is used in image processing for tasks like facial recognition and object detection.

Working Principle: Image data can be represented as tensors, and tensor operations help in extracting features, recognising patterns and enhancing the quality of images.

• Machine Learning and Deep Learning

Application: Tensor operations are fundamental in neural networks and deep learning models.

Working Principle: Tensors represent the data in these models, and tensor operations are used for computations during training and inference, enabling the development of advanced machine learning algorithms.

• Weather Prediction

Application: Tensor analysis is utilised in atmospheric modeling for weather prediction.

Working Principle: Tensors represent complex atmospheric data, and their analysis helps meteorologists model and simulate weather patterns, improving the accuracy of weather forecasts.

• Material Science

Application: Tensor analysis is employed in studying the properties of materials, such as crystallography and elasticity.

Working Principle: Tensors help characterize material properties and predict their behaviour under different conditions, aiding in the development of new materials for various applications.

• Robotics

Application: Tensor analysis is used in robotic systems for tasks like motion planning and control.

Working Principle: Tensors represent the spatial relationships and dynamics of robot components, facilitating precise control and movement in complex environments.

CONCLUSION

In conclusion, the project on tensor analysis and its applications has illuminated the profound significance of this mathematical framework across various scientific disciplines. Through rigorous study and analysis, we have delved into the fundamental concepts of tensors, exploring their mathematical properties and operations. Furthermore, we have witnessed the indispensable role tensors play in fields such as physics, engineering and machine learning, where they serve as powerful tools for describing complex systems and phenomena. From fluid dynamics to image processing, tensor analysis provides a versatile and elegant language for modeling and understanding real-world phenomena. As we conclude this project, we acknowledge the ongoing relevance and potential for exploration within the realm of tensor analysis, affirming its enduring importance in advancing scientific inquiry and technological innovation.

BIBLIOGRAPHY

• Tensor Analysis: Mathematical Foundation and Applications - Luigi Capogna, Donatella Danielli and Scott D. Pauls.

• Tensor Analysis: Theory and Applications - I. S. Sokolnikoff.

• Higher Engineering Mathematics - B. S. Grewal.

• Engineering Mathematics - N. P. Bali and Manish Goyal.

• Introduction to the Mathematics of Medical Imaging - Charles L. Epstein.

