
International Journal of Computer Applications (0975 – 8887)

Volume 61– No.10, January 2013

Optical Flow Measurement using Lucas-Kanade Method

Dhara Patel
ME-CSE Student, Department of Computer Science and Engineering,
S.P.B. Patel Engineering College, Saffrony, Mehsana, India.

Saurabh Upadhyay
Associate Professor, Department of Computer Science and Engineering,
S.P.B. Patel Engineering College, Saffrony, Mehsana, India.

ABSTRACT
Motion estimation is a demanding field among researchers; its goal is to compute an independent estimate of the motion at each pixel in the most general case. Motion estimation is generally known as optical or optic flow. In this paper, an overview of some basic concepts of motion estimation, optical flow and the Lucas-Kanade method is provided. The Lucas-Kanade method is one of the methods for optical flow measurement; it is a differential method for optical flow estimation.

1. INTRODUCTION
Motion is a very important part of image sequences. Motion estimation is the process of determining motion vectors that describe the transformation from one 2-D image to another [1]. According to Ja-Ling Wu [1], motion estimation means the estimation of the velocity of image structures from one frame to another frame in a time sequence of 2-D images.

Fig.1: 2-D image of the pixel location (the pixel at point p at time t moves to p' at time t + Δt).

Fig.1 shows the 2-D displacement of the pixel located at point p from the frame at time t to the frame at time t + Δt. The motion field can almost always be unambiguously related to translational and rotational velocities of rigid surfaces [1]. The standard rigid-motion equations for translation and rotation are:

Translation: p' = p + t
Rotation: p' = R p
Rotation + translation: p' = R p + t

Here I is the identity matrix; pure translation corresponds to the special case R = I.

Motion estimation is mainly classified, as Ja-Ling Wu suggested, into the following categories:

Motion Segmentation
Motion Estimation
Motion Parameter Estimation

Motion segmentation means identifying the moving-object boundaries in a frame. A number of parameters are involved in motion estimation, such as rigid bodies, translational movement, rotation and illumination. Motion estimation techniques are also used in many other applications: computer vision, target tracking and industrial monitoring. The motion vectors may relate to the whole image or to its parts, such as blocks or pixels. Chiara Ercole [2] suggested that 2D motion results from the projection of moving 3D objects onto the image plane. The 2D projection is called apparent motion or optical flow. The projected motion is recovered from the intensity variation information of a frame sequence. Fig.2 [4] shows various motion fields. Generally there are three types of motion: forward, horizontal and rotation.

Fig.2: various motion fields.

2. OPTICAL FLOW
Optical flow or optic flow is the pattern of apparent motion of objects, edges and surfaces in a visual scene caused by the relative motion between an observer (an eye or a camera) and the scene [3].

Fig.3: The optical flow experienced by a rotating observer (shown in both a 3-D and a 2-D representation).
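The rigid-motion relations for translation (p' = p + t), rotation (p' = R p) and their composition can be illustrated with a short numeric sketch; the rotation angle, point and translation below are arbitrary example values, not taken from the paper:

```python
import numpy as np

def rotation_matrix(theta):
    """2-D rotation matrix R; for theta = 0 it reduces to the identity I."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

p = np.array([1.0, 0.0])        # example pixel position
t = np.array([2.0, 3.0])        # example translation vector
R = rotation_matrix(np.pi / 2)  # example 90-degree rotation

p_translated = p + t    # translation:            p' = p + t
p_rotated = R @ p       # rotation:               p' = R p
p_rigid = R @ p + t     # rotation + translation: p' = R p + t

print(p_translated)             # [3. 3.]
print(np.round(p_rotated, 6))   # [0. 1.]
print(np.round(p_rigid, 6))     # [2. 4.]
```

Setting theta = 0 recovers R = I, the identity matrix mentioned above, so the rigid motion degenerates to pure translation.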


Fig.3 [3] indicates the direction and magnitude of the optic flow at each location by the direction and length of each arrow. Fig.4 [4] shows the relation between the motion field and the optical flow field:

• Motion Field = real-world 3D motion.
• Optical Flow Field = projection of the motion field onto the 2-D image.

Fig.4: The relation between motion field & optical flow field.

J.L. Barron and N.A. Thacker [5] suggested that the 2D motion field can be defined as the 2D velocities of all visible points, and the optical flow field as an estimate of the 2D motion field. Optical flow is an approximation of the local image motion based upon local derivatives in a given sequence of images. Optical flow techniques such as motion detection, object segmentation, time-to-collision and expansion calculations, motion encoding, and stereo disparity measurement utilize this motion of the objects' surfaces and edges.

The 2D image sequences used here are formed under perspective projection via the motion of a camera and scene objects. The 3D volume sequences used here were formed under orthographic projection for a moving object. Optical flow is tightly linked to the recovery of the 2D projected motion derived from the acquisition and projection on the image plane of 3D object motion in the real world [5]. All the general considerations made on 3D motion projection remain valid when dealing with optical flow.

The optical flow constraint on images according to [6] is derived as follows.

Fig.5: 2D motion of pixel.

The calculation of optical flow is:

0 = I(x+u, y+v, t+1) - I(x, y, t)
  ≈ I(x, y, t+1) + I_x u + I_y v - I(x, y, t)
  ≈ I_t + I_x u + I_y v
  ≈ I_t + ∇I · (u, v)

In the limit as u and v go to zero, this equation becomes exact.

The 2D and 3D motion estimation of optical flow according to J.L. Barron and N.A. Thacker [5] is as follows.

2D motion equation - Assume I(x, y, t) is the center pixel in an n×n neighbourhood and moves by δx, δy in time δt to I(x+δx, y+δy, t+δt). Since I(x, y, t) and I(x+δx, y+δy, t+δt) are the images of the same point (and are therefore the same) we have [5]:

I(x, y, t) = I(x+δx, y+δy, t+δt).   (1)

This assumption yields the 2D motion equation for small local translations, provided δx, δy and δt are not too big. We can perform a 1st-order Taylor series expansion about I(x, y, t) in equation (1) to obtain [5]:

I(x+δx, y+δy, t+δt) = I(x, y, t) + (∂I/∂x)δx + (∂I/∂y)δy + (∂I/∂t)δt + H.O.T.   (2)

where H.O.T. are the higher-order terms, which we assume are small and can safely be ignored. Using the above two equations we obtain:

(∂I/∂x)δx + (∂I/∂y)δy + (∂I/∂t)δt = 0

or

(∂I/∂x)(δx/δt) + (∂I/∂y)(δy/δt) + ∂I/∂t = 0,

and finally we get:

I_x v_x + I_y v_y + I_t = 0.

Here v_x = δx/δt and v_y = δy/δt are the x and y components of the image velocity or optical flow, and I_x, I_y and I_t are the image intensity derivatives at (x, y, t). We normally write these partial derivatives as:

I_x = ∂I/∂x, I_y = ∂I/∂y and I_t = ∂I/∂t.

This equation can be rewritten more compactly as:

(I_x, I_y) · (v_x, v_y) = -I_t
∇I · v = -I_t,

where ∇I = (I_x, I_y) is the spatial intensity gradient and v = (v_x, v_y) is the image velocity or optical flow at pixel (x, y) at time t. ∇I · v = -I_t is called the 2D motion constraint equation.

3D motion equation - The 3D motion equation is a simple extension of the 2D motion equation. In 3D, a voxel at (x, y, z) at time t moves by (δx, δy, δz) to (x+δx, y+δy, z+δz) over time δt. Since I(x, y, z, t) = I(x+δx, y+δy, z+δz, t+δt), we can perform a 1st-order Taylor series expansion and obtain [5]:

I_x v_x + I_y v_y + I_z v_z + I_t = 0,

where v = (v_x, v_y, v_z) is the 3D volume velocity or 3D optical flow and I_x, I_y, I_z and I_t are the 3D spatio-temporal derivatives. This equation can be written more compactly as:

∇I · v = -I_t,

where ∇I = (I_x, I_y, I_z) is the 3D spatial intensity gradient and v is the 3D velocity.
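The 2D motion constraint equation can be checked numerically. The sketch below is a minimal illustration (not from the paper): it uses a synthetic linear-ramp image, for which finite-difference derivatives are exact, translates it by a known velocity, and verifies that I_x v_x + I_y v_y + I_t = 0:

```python
import numpy as np

# Synthetic linear-ramp image: I(x, y) = 2*x + 3*y, so Ix = 2 and Iy = 3 exactly.
h, w = 32, 32
ys, xs = np.mgrid[0:h, 0:w].astype(float)
frame0 = 2.0 * xs + 3.0 * ys

# Translate the whole image by a known flow (vx, vy) = (1, 2) pixels per frame.
vx, vy = 1.0, 2.0
frame1 = 2.0 * (xs - vx) + 3.0 * (ys - vy)

# Spatial derivatives by finite differences, temporal derivative by frame difference.
Iy, Ix = np.gradient(frame0)
It = frame1 - frame0

# 2D motion constraint: Ix*vx + Iy*vy + It should vanish everywhere.
residual = Ix * vx + Iy * vy + It
print(np.abs(residual).max())  # ~0 for this linear image
```

For real images the constraint holds only approximately, since the Taylor expansion discards higher-order terms and image noise perturbs the derivatives.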


3. OPTICAL FLOW BASED MOTION ESTIMATION
There are several methods for motion estimation [2]:
 Feature matching methods;
 Pel-recursive methods;
 Deterministic model based methods;
 Stochastic model based methods;
 Optical flow based methods.

Several techniques have been developed to approach the computation of optical flow, such as [2]:
 Gradient based approach
 Correlation-based approach
 Spatio-temporal energy based approach
 Phase-based approach

A. Gradient based approach
The first method proposed within this approach is that of Horn and Schunck, which initially defines the moving object and the sensor. The second method is that of Lucas and Kanade. The iteration is not done in a strict pixel-by-pixel way but considers a small neighbourhood of each pixel [2], so a minimization rule is used, corresponding to a least-squares fit of the brightness invariance constraint.

B. Correlation-based approach
This method is very close to block matching. The frame is divided into fixed-size blocks, and each block is matched within a searching window. The displacement to the block that shows the highest likelihood, or the lowest dissimilarity, is taken as the measure of the translational motion vector of the reference block [2]. The first method in this approach is Anandan's method. The computation starts at the highest level of a resolution pyramid (at the lowest resolution), where a full search is performed; the lowest level is the one at maximum resolution. The passage from a higher level, L+1, to a lower one, L, is done by interpolation and then subtraction from the upper level, L+1, which gives the error image at level L. The image at level L can thus be recovered by adding the error at level L to the interpolated estimate from level L+1 [2].

Singh's method divides the information into two categories: conservation information and neighbourhood information. Conservation information - the estimate propagates in the neighbouring area and the motion vector is then iteratively updated. Neighbourhood information - the neighbourhood information is collected from areas surrounding the pixel of interest which have sufficient intensity variation.

C. Spatio-temporal energy based approach
In this approach the spatio-temporal frequencies define the velocity of a translational moving object. The signal corresponding to the movement is composed of several spatial frequencies, and motion can be evaluated by using spatio-temporally oriented filters.

D. Phase-based approach
Differential techniques such as this show better accuracy but are less efficient in implementation and lack a single useful confidence measure. The 3-D structure tensor technique yields the best results with respect to systematic errors and noise sensitivity.

According to [8], optical flow based techniques correctly estimate normal flow. If the temporal sampling theorem can be assured to be fulfilled, optical flow based techniques are generally the better choice. In other cases, when large displacements of small structures are expected, correlation-based approaches usually perform better.

Section 4 explains one optical flow technique, the Lucas-Kanade technique, and its calculation for video.

4. LUCAS KANADE METHOD FOR OPTICAL FLOW MEASUREMENT
The Lucas-Kanade method is a widely used differential method for optical flow estimation in computer vision [9]. The method solves the basic optical flow equations for all the pixels in a neighbourhood by the least squares criterion. It is a purely local method, so it cannot provide flow information in the interior of uniform regions of the image. It assumes that the flow is essentially constant in a local neighbourhood of the pixel under consideration [9].

The advantages and disadvantages of the Lucas-Kanade method are:
Advantages - the method is simple compared with other methods, the calculation is very fast, and the time derivatives are accurate.
Disadvantage - errors arise on the boundaries of moving objects [10].

The Lucas-Kanade method assumes that the displacement of the image contents between two nearby frames is small and approximately constant within a neighbourhood of the point p [9]. Thus the optical flow equation can be assumed to hold for all pixels within a window centered at p; namely, the local image velocity vector v = (V_x, V_y) must satisfy:

I_x(p_1) V_x + I_y(p_1) V_y = -I_t(p_1)
I_x(p_2) V_x + I_y(p_2) V_y = -I_t(p_2)
...
I_x(p_n) V_x + I_y(p_n) V_y = -I_t(p_n)

where p_1, p_2, ..., p_n are the pixels inside the window and I_x(p_i), I_y(p_i), I_t(p_i) are the partial derivatives of the image I with respect to position x, y and time t, evaluated at the point p_i at the current time. The equations can be written in matrix form A v = b, with:

A = [ I_x(p_1)  I_y(p_1) ; I_x(p_2)  I_y(p_2) ; ... ; I_x(p_n)  I_y(p_n) ],
v = [ V_x ; V_y ],
b = [ -I_t(p_1) ; -I_t(p_2) ; ... ; -I_t(p_n) ].

Using the least squares principle, the solution of the Lucas-Kanade system A v = b is obtained from the normal equations A^T A v = A^T b [9].
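The per-window least-squares solve just described can be sketched as follows. This is a minimal illustration, not the paper's implementation: real implementations add Gaussian weighting, thresholds on the invertibility of A^T A, and better derivative filters; the quadratic test image and window size below are arbitrary example choices:

```python
import numpy as np

def lucas_kanade_at(frame0, frame1, y, x, half=2):
    """Least-squares optical flow v = (Vx, Vy) at pixel (y, x), using all
    pixels in a (2*half+1) x (2*half+1) window to build A v = b."""
    Iy, Ix = np.gradient(frame0)      # spatial derivatives Ix, Iy
    It = frame1 - frame0              # temporal derivative It
    win = (slice(y - half, y + half + 1), slice(x - half, x + half + 1))
    A = np.stack([Ix[win].ravel(), Iy[win].ravel()], axis=1)  # n x 2 matrix
    b = -It[win].ravel()                                      # right-hand side
    v, *_ = np.linalg.lstsq(A, b, rcond=None)  # solves A^T A v = A^T b
    return v

# Demo: a quadratic test image translated by a known flow (Vx, Vy) = (1, 0.5).
h, w = 64, 64
ys, xs = np.mgrid[0:h, 0:w].astype(float)
frame0 = xs**2 + ys**2
frame1 = (xs - 1.0)**2 + (ys - 0.5)**2

v = lucas_kanade_at(frame0, frame1, y=20, x=40)
print(np.round(v, 2))  # close to [1.  0.5]
```

The recovered flow is close to, but not exactly, the true displacement, because the first-order Taylor expansion behind the optical flow equation discards higher-order terms.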


Here A^T is the transpose of the matrix A, so we can write:

v = (A^T A)^-1 A^T b,

that is,

[ V_x ; V_y ] = [ Σ I_x(p_i)^2   Σ I_x(p_i) I_y(p_i) ; Σ I_x(p_i) I_y(p_i)   Σ I_y(p_i)^2 ]^-1 [ -Σ I_x(p_i) I_t(p_i) ; -Σ I_y(p_i) I_t(p_i) ],

with the sums running over i = 1 to n. The matrix A^T A is often called the structure tensor of the image at the point p.

Weighted window - The plain least squares solution above gives the same importance to all n pixels in the window. In practice it is usually better to give more weight to the pixels that are closer to the central pixel p. For that, one uses the weighted version of the least squares equation [9],

A^T W A v = A^T W b,

which we can write in another form, so we get:

v = (A^T W A)^-1 A^T W b,

where W is an n×n diagonal matrix containing the weights W_ii = w_i to be assigned to the equation of pixel p_i. That is, it computes:

[ V_x ; V_y ] = [ Σ w_i I_x(p_i)^2   Σ w_i I_x(p_i) I_y(p_i) ; Σ w_i I_x(p_i) I_y(p_i)   Σ w_i I_y(p_i)^2 ]^-1 [ -Σ w_i I_x(p_i) I_t(p_i) ; -Σ w_i I_y(p_i) I_t(p_i) ].

The weight w_i is usually set to a Gaussian function of the distance between p_i and p.

Lucas and Kanade and others implemented a weighted least squares (LS) fit of local first-order constraints to a constant model for v in each small spatial neighbourhood Ω by minimizing [5]:

Σ_{x∈Ω} W^2(x) [∇I(x, t) · v + I_t(x, t)]^2,

where W(x) denotes a window function that gives more influence to constraints at the center of the neighbourhood.

Iterative LK is computed on an image pyramid, starting at the highest (coarsest) level. For each level i [7]:
 Take the flow u(i-1), v(i-1) from level i-1.
 Upsample the flow to create u*(i), v*(i) matrices of twice the resolution for level i.
 Multiply u*(i), v*(i) by 2.
 Compute I_t from a block displaced by u*(i), v*(i).
 Apply LK to get u'(i), v'(i) (the correction in flow).
 Add the corrections to obtain the flow u(i), v(i) at the i-th level, i.e., u(i) = u*(i) + u'(i), v(i) = v*(i) + v'(i).

Errors in Lucas-Kanade according to Ce Liu [6] - the procedure assumes that A^T A is easily invertible and that there is not much noise in the image. Errors arise when these particular assumptions are violated:
 Brightness constancy is not satisfied.
 The motion is not always small.
 A pixel does not move like its neighbours:
- the window size is too large;
- what is the ideal window size?

5. LUCAS KANADE FEATURE TRACKING
A Lucas-Kanade feature tracker was used in order to determine the movement of each sub-target. The feature-finding algorithm was also modified using eigenvalues in order to determine the trackable features, according to Justin Graham [11]. This involves keeping track of the highest eigenvalues over the search area. The steps of the feature tracker are:
1. First compute the intensity for each pixel.
2. For each pixel position compute the gradient matrix and store an eigenvalue of the matrix.
3. Store each pixel position in the score matrix S and separate the high-scoring pixels by the flag matrix F, the region size k and the flag region size f.
4. Take the top n eigenvalues and use those for the trackable features.

This algorithm effectively finds the edges of the image, which contain the most information. The Lucas-Kanade implementation could keep track of targets throughout affine transformations and occlusions, although there are still some improvements to be made to handle occlusion and maintain stability [11].

A particle filter supports the Lucas-Kanade tracker when the tracker is no longer able to maintain a good track on the target; otherwise the Lucas-Kanade tracker does not require a particle filter.

Particle filter - In this particle system implementation, the weighting system has two modes that can be activated at a time that is convenient for the tracking system. The first mode takes the final output of the Lucas-Kanade tracker and uses it as input to the particle system measurements [11].

The particle system still goes through all of the previously described steps, which consist of:
 Predict the next frame with a constant velocity motion model.
 Weight each particle based on its distance from the Lucas-Kanade targeting output, using the current frame.
 Perturb the speed with a Gaussian random distribution.
 Resample the particles based on the weights.
 Repeat from the first step until complete.

The Lucas-Kanade algorithm returns a high pixel-difference score, i.e. a state of rapid change, for the target.

6. CONCLUSION
In this paper, an overview of some fundamental concepts related to motion estimation and optical flow has been provided. The paper presented the Lucas-Kanade method, which is a widely used differential method for optical flow estimation in computer vision. The Lucas-Kanade method has great potential for authentication techniques. In future work we would use the Lucas-Kanade method for feature extraction in intelligent video authentication.

7. REFERENCES
[1] Ja-Ling Wu, "Motion Estimation for Video Coding Standards", National Taiwan University.


[2] Chiara Ercole, "LPA-ICI approximation: applications to motion estimation and video denoising", University of Roma TRE, 2003-2004.
[3] Huston SJ, Krapp HG, Kurtz R, "Visuomotor Transformation in the Fly Gaze Stabilization System", Bielefeld University, Germany, 2008.
[4] Lihi Zelnik-Manor, "Motion Estimation", Cambridge University, 2008.
[5] J.L. Barron and N.A. Thacker, "Computing 2D and 3D Optical Flow", University of Manchester, 2005.
[6] Ce Liu, "Motion Estimation", Microsoft Research New England, 2011.
[7] Mester R., "Some steps towards a unified motion estimation procedure", in Proceedings of the 45th IEEE Midwest Symposium on Circuits and Systems, 2002.
[8] Viral Borisagar and Hiral Raveshiya, "Motion Estimation Using Optical Flow Concepts", Gujarat University.
[9] Bruce D. Lucas, "Image Matching by the Method of Differences", Carnegie Mellon University, 1984.
[10] Mariya Zhariy, "Introduction to Optical Flow", Uttendorf, 2005.
[11] Justin Graham, "Target Tracking with Lucas Kanade Optical Flow", University of Texas, 2010.

