Real Time Eye Blinking Detection and Tracking Using OpenCV
ABSTRACT
A robust and accurate real-time eye tracking algorithm has been a fundamental and challenging problem in computer vision. This paper proposes a new method to estimate eye position and direction based on an initial centroid analysis technique. The proposed method was validated by tracking eye position under both high- and low-occlusion conditions. In this paper, we present a methodology for robust detection of eye blinking in a real-time environment. We use the connected component technique and the centroid method to track the eyes and detect blinking on the OpenCV platform, which is open source and was developed by Intel.
Keywords - OpenCV, blink detection, connected component, eye tracking, DR, FAR, success rate
www.ijera.com 1780 | P a g e
Dhaval Pimplaskar et al Int. Journal of Engineering Research and Application www.ijera.com
ISSN : 2248-9622, Vol. 3, Issue 5, Sep-Oct 2013, pp.1780-1787
been associated with decreased information processing. So the alpha, beta and delta rhythms are independent and strongly related to the fatigued state, and these three are therefore selected as study indicators [6].

Electroencephalogram (EEG) [1][2-7] is the most commonly studied signal for vigilance estimation. In most EEG-based vigilance research, EEG data of certain people is captured for experimentation and is labeled according to vigilance states by an observer. Different machine learning techniques can then be applied to such data to find the correlation between the EEG signal and the observed fatigue state.

The work in [7] claims that most existing methods have focused on employing supervised learning methods to estimate vigilance states. However, there is still no standard criterion for vigilance scale labeling, and the existing vigilance labeling methods are complex, expensive and sometimes unreliable; it therefore chooses clustering methods to mine the latent distribution of EEG for vigilance estimation. The work extracts features from the EEG using a spatial filter and further clusters them into 4 clusters: wakefulness (W), middle state 1 (M1), middle state 2 (M2), and sleepiness (S).

The paper [8] proposes a system that combines electroencephalogram (EEG) power spectra estimation, independent component analysis and fuzzy neural network models to estimate subjects' cognitive state in a dynamic virtual-reality-based driving environment. It first uses independent component analysis (ICA) to remove artifacts and extract possible sources of brain activity from continuous EEG recordings. After calculating the power spectrum of each ICA component, it correlates the information between human alertness and the ICA power spectra. Here human alertness is given by a performance index (driving error expressed as the deviation between the center of the vehicle and the center of the lane). The alpha-band power in the two major ICA components with the maximum correlation coefficients for each individual subject is selected as the input feature of the estimation models. The correlation between the subject's driving performance index (SDPI) and 33 ICA components is calculated to select the most useful representative components, and the power spectra of the 2 most highly correlated ICA components were selected. The selected alpha-band features are then fed to the self-constructing neural fuzzy inference network (SONFIN) to learn the relationship between the EEG spectra and driving performance.

The paper [9] investigates the use of RF (Random Forest) to select the key EEG features from a broad pool of available features, for the purpose of multilevel mental fatigue EEG classification. The EEG data were labeled with 5 levels of mental fatigue. The EEG data passed through feature extraction windows, one window per EEG channel. The length of each feature extraction window was 2 s (or 334 EEG samples), with a half-second (or 84 EEG samples) interval between two adjacent calculations of feature extraction. Four features were chosen to characterize the power spectral density. As a result, in total 304 quantitative EEG features (4 kinds of features, 19 channels, and 4 frequency bands) were extracted. RF is used as the machine learning technique for classification/recognition of the features.

The work in [10] senses different physiological parameters such as galvanic skin resistance (GSR), heart rate variability (HRV) and body temperature and, in the context of vehicle driving, different mechanical data such as the steering wheel position and brake pedal position of the driver, through sensors placed on the steering wheel. A fuzzy logic classifier trained with offline statistical results is used to classify the vigilance state of the driver. The classifier continuously monitors the data to detect a possible decrease in driver vigilance. If the driver is found to be sleepy, the system is put on alert to focus on detecting a possible sleep attack and to alert the driver.

2.2 BEHAVIORAL MEASURES
Behavioral measures [5][1][11] are also accurate and objective. This category of devices, most commonly known as actigraphs, is used to measure sleep based on the frequency of body movement. The number of body movements recorded during a specified time period, or epoch, has been found to significantly correlate with the presence of sleep and has a significant correlation with EEG.

In [11], jerk profiles for the machine-human interfaces of a vehicle are sensed as measures for assessing the vigilance of the vehicle driver; responding to a stimulus was considered a sign of vigilance. The work in [12] claims that short pauses in performance are more indicative measures of vigilance.

2.3 VISUAL MEASURES
Increasing research interest has focused on developing systems that use a video camera to detect the visual facial feature changes associated with fatigue. These facial features include the eyes, head position, face, or mouth. This approach is non-intrusive and becomes more and more practical with the rapid development of camera and computer vision technology [13-14][15].

People in fatigue exhibit certain visual behaviors that are easily observable from changes in facial features such as the eyes, head, and face. Visual behaviors that typically reflect a person's level of fatigue include eyelid movement, head movement, gaze, and facial expression. Various studies have shown that eyelid activities are strongly related to the level of vigilance, intention, and needs. Percentage of eyelid closure (PERCLOS) has been found to be the most reliable and valid measure of a person's alertness level among many drowsiness detection measures. PERCLOS measures the percentage of eyelid closure over the pupil over time and reflects slow eyelid
closures (droops). Another potentially good fatigue indicator is the average eye closure and opening speed (AECS). Since eye opening/closing is controlled by the muscles near the eyes, a person in fatigue may open/close the eyes slowly due to either tired muscles or slower cognitive processing. Other potentially good fatigue parameters include various parameters that characterize pupil movement, which relates to one's gaze and one's awareness of the happenings in the surroundings. The movement of a person's pupil (gaze) may have the potential to indicate one's intention and mental condition. For example, for a driver the nominal gaze is frontal; looking in other directions for an extended period of time may indicate fatigue or inattention. Furthermore, when people are drowsy, their visual awareness cannot cover a wide enough area and concentrates on one direction. Hence, gaze (deliberate fixation) and saccadic eye movement may contain information about one's level of alertness. Besides eye activities, head movement such as nodding or inclination is a good indicator of a person's fatigue or of its onset; it can also indicate one's attention. Head movement parameters such as head orientation, movement speed, frequency, etc. could potentially indicate one's level of vigilance. Finally, facial expression may also provide information about one's vigilance. For example, a typical facial expression that indicates the onset of fatigue is yawning. The major benefit of the visual measures is that they can be acquired non-intrusively [5].

Most fatigue detection work treats the eyes as an interesting and useful visual measure, and fatigue is detected by detecting the geometry of the eyes and iris. The eye detection method in [14] uses a new circle detection operator and then uses a neural network to validate the detected regions as eyes. An appearance-based model in [16] uses a Support Vector Machine to detect eyes and needs a large number of training sets composed of different objects, orientations, lighting conditions, and eye closure degrees. The hypo-vigilance detection based on eyelid behavior in [17] first detects the face in the image; then the percentage of eye closure (PERCLOS) and eyelid distance changes (ELDC) are computed for fatigue detection, and the eye closure rate (CLOSNO) is used for distraction detection. Finally, the computed values are compared with simple thresholds to estimate driver hypo-vigilance. Work in [18] is based on the fact that 'variations in the pupillary size, which is known to be controlled by the autonomic nervous system, were found to be positively correlated with the level of vigilance in healthy normal subjects'. It computes the size of the pupil by finding its minimum enclosing rectangle and finds the variation in pupil size over time to detect fatigue.

Work in [13] detects eyes and tracks pupils. It also associates a state with every eye status, such as CLOSED, CLOSING etc., and changes state based on certain observations of the eyes. For example, when the pupil ratio is above 80% of its nominal size or the pupils are lost while in the CLOSING_STATE, an FSM transition to the CLOSED_STATE is provoked, which means that the eyes are closed. A new detection of the pupils from the CLOSED_STATE produces a change to the OPENING_STATE.

In [5] a probabilistic model (Bayesian model) of fatigue is presented. Fatigue is modeled as the hypothesis variable that we intend to infer, while other contextual factors that could cause fatigue (e.g. sleep quality, circadian rhythm, work condition, work environment, and physical condition) and visual cues, which are symptoms of fatigue, are information variables. When a person is fatigued, he/she tends to exhibit various visual behaviors that deviate from the nominal behaviors. The behaviors that typically capture the cognitive state of a person include eye movement, head movement, and facial expression. The Conditional Probability Tables (CPTs) required for the Bayesian model are constructed using several series of large-scale subjective surveys.

If H is the hypothesis variable, o is the observed evidence and s are the hidden variables, H can be inferred with the following probability, marginalizing over the hidden variables:

P(H | o) = Σs P(H, s | o)

Since fatigue develops over a period of time, in [5] it is actually modeled as a Dynamic Bayesian Network (DBN). If T is a time boundary, hidden variables S = {s0, ..., sT−1}, observations O = {o0, ..., oT−1} and hypotheses H = {h0, ..., hT−1}, the probability distribution of the DBN factorizes as follows:

P(H, S, O) = ∏t P(ot | st, ht) P(st | st−1) P(ht | ht−1)

(with the t = 0 factors understood as the priors P(s0) and P(h0)), as H and S at each point in time also depend on H and S at the previous point in time, respectively.

Different eye features are proposed in [15], such as blink duration, eye closure, blinking rate, head nodding frequency etc. Drowsiness classification is then defined as combining these different features into a single continuous-valued drowsiness measure or into the discrete classes awake, questionable and drowsy. An artificial neural network (ANN) was used for classification.

Yawning is used as a fatigue measure in [19], which detects yawning by first detecting the mouth in an image and then checking whether its size and shape are increasing or decreasing. In [20], the face and features of the face such as the eyes, mouth, and facing direction are identified and tracked, which can be used for driver fatigue detection.
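The PERCLOS measure described above can be sketched in a few lines. This is an illustrative plain-Python sketch, not the paper's implementation: it assumes a per-frame eyelid-openness measure normalized to [0, 1] and an illustrative closure threshold of 0.2; neither value comes from the source.

```python
def perclos(openness, closed_threshold=0.2):
    """Fraction of frames in which the eye counts as closed.

    `openness` is a per-frame eyelid-opening measure, normalized so that
    1.0 is fully open and 0.0 is fully closed.  Frames at or below
    `closed_threshold` count as closures; both the measure and the 0.2
    threshold are illustrative assumptions, not values from the paper.
    """
    if not openness:
        return 0.0
    closed = sum(1 for v in openness if v <= closed_threshold)
    return closed / len(openness)
```

Over a sequence such as [1.0, 0.9, 0.1, 0.0, 0.8], two of five frames fall at or below the threshold, giving a PERCLOS of 0.4.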
3.1.4 Removal of Noise
The removal of noise in the binary image is very straightforward. Starting at the top, (x2, y2), move left one pixel by decrementing x2, and set each y value

Fig.2 Different Threshold Values for Binarization (c: threshold value 200)
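The binarization step behind Fig.2 maps each grayscale pixel to 0 or 1 by comparison with a threshold. A minimal plain-Python sketch is shown below; it assumes dark pixels (e.g. the pupil) are foreground and uses the threshold value 200 from the figure caption. In the actual system this step would be a single OpenCV threshold call.

```python
def binarize(gray, threshold=200):
    """Binarize a grayscale image given as rows of 0-255 intensities.

    Pixels darker than `threshold` become 1 (foreground, e.g. the pupil),
    the rest become 0.  The default threshold of 200 echoes Fig.2(c);
    treating dark pixels as foreground is an illustrative assumption.
    """
    return [[1 if px < threshold else 0 for px in row] for row in gray]
```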
IV. ALGORITHM
This section explains the drowsiness detection procedure. Video is captured from a camera; OpenCV supports capturing images from a camera or from a video file (AVI). First, capture is initialized with index zero, which grabs frames from the default camera. Each frame is then converted to grayscale using OpenCV's color conversion, and the grayscale frame is thresholded to obtain a binary image. The eye is then located with the centroid method, which finds the eye center in the face for tracking purposes. To find the centroid of an image, the image first has to be binarized; the centroid is then calculated from where the majority of black pixels are located. For example, for a completely black 20 x 20 BMP, the centroid lies at the exact center of the square. After obtaining the centroid, the eye is marked with a rectangle using the connected component technique, which makes it easy to determine how many components of pixels are connected to each other. If the eyes are found during tracking, their X and Y coordinates are obtained by finding the centroid of the contours, which gives the center coordinates of the eye pupil. This procedure detects eye blinking; based on the blink detection, a decision loop determines whether the driver is drowsy or active.
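The centroid and connected-component steps above can be sketched in plain Python (the paper's implementation is C++/OpenCV; `blink_count` is a hypothetical helper illustrating the decision loop, counting transitions from eyes-found to eyes-lost):

```python
from collections import deque

def centroid(binary):
    """Centroid (x, y) of the foreground (value 1) pixels of a binary image.

    Mirrors the text's example: for a completely black (all-foreground)
    20 x 20 image the centroid is the exact centre of the square.
    """
    xs = ys = n = 0
    for y, row in enumerate(binary):
        for x, px in enumerate(row):
            if px:
                xs += x
                ys += y
                n += 1
    if n == 0:
        return None  # no foreground pixels: treat as "eyes not found"
    return (xs / n, ys / n)

def count_components(binary):
    """Number of 4-connected foreground components, a stand-in for the
    connected-component step that marks the eye regions."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    total = 0
    for sy in range(h):
        for sx in range(w):
            if binary[sy][sx] and not seen[sy][sx]:
                total += 1
                queue = deque([(sy, sx)])
                seen[sy][sx] = True
                while queue:  # breadth-first flood fill over neighbours
                    cy, cx = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if 0 <= ny < h and 0 <= nx < w and binary[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return total

def blink_count(found_per_frame):
    """Count blinks as transitions from eyes-found to eyes-lost across
    consecutive frames -- a hypothetical sketch of the decision loop."""
    blinks = 0
    prev = True
    for found in found_per_frame:
        if prev and not found:
            blinks += 1
        prev = found
    return blinks
```

In the real pipeline the binary image would come from the thresholded grayscale frame, and OpenCV's own moment and contour routines would replace these hand-rolled loops.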
Fig.4. Drowsiness Detection Algorithm
Table.1: DR, FAR, Success Rate

Parameter         Test 1   Test 2   Test 3   Test 4   Test 5
TP                43       62       35       51       60
TN                10       16       4        7        5
FP                0        2        4        1        2
FN                0        0        2        1        1
DR (%)            100      100      94.59    98.07    98.36
FAR (%)           0        3.125    10.25    1.92     3.22
Success Rate (%)  100      96.97    96.49    98.07    96.83

CONCLUSION
In this paper we presented a self-developed algorithm for drowsiness detection. The proposed method easily detects blinks and drowsiness, and the success rate is high because centroid and connected component methods are used instead of a Haar classifier.

Our system is designed for drowsiness detection in real time. The implementation of this approach runs at 25-30 frames per second. The application is implemented in C++ using the OpenCV library in a Windows environment with a single camera view. The methods we presented for a video surveillance system show promising results under good lighting conditions.

REFERENCES
Journal Papers:
[1] Patrick K. Sullivan et al, "Systems and methods for intelligent medical vigilance with alert cause indication", U.S. Patent Publication number: US 2010/0052917 A1, Pub. Date 4 March 2010.
[2] Mu Li, Jia-Wei Fu and Bao-Liang Lu, "Estimating Vigilance in Driving Simulation using Probabilistic PCA", 30th Annual International IEEE EMBS Conference, Vancouver, British Columbia, Canada, August 20-24, 2008.
[3] Weirwille, W.W. (1994), "Overview of Research on Driver Drowsiness Definition and Driver Drowsiness Detection," 14th International Technical Conference on Enhanced Safety of Vehicles, pp. 23-26.
[4] B. F. Momin and Parag P. Abhyankar, "Current Status and Future Research Directions in Monitoring Vigilance of Individual or Mass Audience in Monotonous Working Environment", International Journal on Soft Computing (IJSC), Vol. 3, No. 2, May 2012.
[5] Qiang Ji, Peilin Lan, Carl Looney, "A Probabilistic Framework for Modeling and Real-Time Monitoring Human Fatigue", IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, Vol. 36, No. 5, September 2006.
[6] Li Shiwu, Wang Linhong, Yang Zhifa, Ji Bingkui, Qiao Feiyan, Yang Zhongkai, "An Active Driver Fatigue Identification Technique Using Multiple Physiological Features", 2011 International Conference on Mechatronic Science.
[7] Li-Chen Shi, Bao-Liang Lu, "Dynamic Clustering for Vigilance Analysis Based on EEG", Proceedings of the International Conference of the IEEE Engineering in Medicine and Biology Society, pp. 54-57, Vancouver, BC, Canada, Aug. 2008.
[8] Chin-Teng Lin, Yu-Chieh Chen, Ruei-Cheng Wu, Sheng-Fu Liang, Teng-Yi Huang, "Assessment of driver's driving performance and alertness using EEG-based fuzzy neural networks", IEEE International Symposium on Circuits and Systems (ISCAS 2005), 23-26 May 2005, pp. 152-155, Vol. 1, Print ISBN: 0-7803-8834-8.
[9] Lifang Deng, Xingliang Xiong, Jin Zhou, Ping Gan, Shixiong Deng, "Fatigue Detection Based on Infrared Video Pupillography", 4th International Conference on Bioinformatics and Biomedical Engineering (iCBBE), 2010, pp. 1-4.
[10] Alessandro Giusti, Chiara Zocchi, and Alberto Rovetta, "A Noninvasive System for Evaluating Driver Vigilance Level Examining both Physiological and Mechanical Data", IEEE Transactions on Intelligent Transportation Systems, Vol. 10, No. 1, March 2009.
[11] A.V. Desai, M.A. Haque, "Vigilance monitoring for operator safety: A simulation study on highway driving", Journal of Safety Research 37 (2006) 139-147.
[12] Jokinen Tapio, Saarenpää-Heikkilä Outi, Laippala Pekka, "A new visual performance test offers new variables for assessing daytime vigilance: Short pauses in performance", Behavior Research Methods, Springer New York, Volume 31, Issue 1, March 1999.
[13] Bergasa, L.M., Nuevo, J., Sotelo, M.A., Barea, R., Lopez, M.E., "Real-time system for monitoring driver vigilance", IEEE Transactions on Intelligent Transportation Systems, Volume 7, Issue 1, March 2006, pp. 63-77, ISSN: 1524-9050.
[14] T. D'Orazio, M. Leo, A. Distante, "Eye detection in face images vigilance system", IEEE Intelligent Vehicles Symposium.
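The DR and FAR columns of Table 1 follow from the confusion counts. A plain-Python sketch is shown below, assuming DR = TP/(TP+FN) and FAR = FP/(TP+FP); these formulas are inferred from the table rather than stated in the text, but they reproduce the tabulated values (e.g. Test 3: DR 35/37 ≈ 94.59%, FAR 4/39 ≈ 10.25%).

```python
def detection_rate(tp, fn):
    """DR in percent: detected blinks over all true blinks.

    Assumed formula TP/(TP+FN); it matches every DR entry in Table 1.
    """
    return 100.0 * tp / (tp + fn)

def false_alarm_rate(tp, fp):
    """FAR in percent: false detections over all detections.

    Assumed formula FP/(TP+FP); it matches every FAR entry in Table 1.
    """
    return 100.0 * fp / (tp + fp)
```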