
HEART BEAT MONITORING USING ICA, PCA, FFT

A PROJECT REPORT

Submitted by

PRIYA S (312417104070)

PREETHA CATHERINE S (312417104068)

in partial fulfilment of the requirements for the award of the degree

of

BACHELOR OF ENGINEERING
IN
COMPUTER SCIENCE AND ENGINEERING

St. JOSEPH’S INSTITUTE OF TECHNOLOGY

CHENNAI - 119

ANNA UNIVERSITY: CHENNAI 600 025

JULY-2021

BONAFIDE CERTIFICATE

Certified that this project report “HEART BEAT MONITORING USING ICA, PCA, FFT” is the bonafide work of “PRIYA S (312417104070) and PREETHA CATHERINE S (312417104068)”, who carried out the project under my supervision.

SIGNATURE

Dr. J. DAFNI ROSE, M.E., Ph.D.,
Professor and Head of the Department,
Computer Science and Engineering,
St. Joseph’s Institute of Technology,
Old Mamallapuram Road,
Chennai - 600119.

SIGNATURE

Dr. S. Padmakala, M.E., Ph.D.,
Professor & Supervisor,
Computer Science and Engineering,
St. Joseph’s Institute of Technology,
Old Mamallapuram Road,
Chennai - 600119.

ACKNOWLEDGEMENT

We take this opportunity to thank our honourable Chairman Dr. B. Babu Manoharan, M.A., M.B.A., Ph.D., for the guidance he offered during our tenure in this institution.

We extend our heartfelt gratitude to our honourable Director Mrs. B. Jessie Priya, M.Com., for providing us with the required resources to carry out this project.

We express our deep gratitude to our honourable CEO Mr. B. Sashi Sekar, M.Sc. (Intl. Business), for his constant guidance and support for our project.

We are indebted to our Principal Dr. P. Ravichandran, M.Tech., Ph.D., for granting us permission to undertake this project.

Our earnest gratitude to our Head of the Department Dr. J. Dafni Rose,
M.E., Ph.D., for her commendable support and encouragement for the completion of
the project with perfection.

We express our profound gratitude to our guide Dr. S. Padmakala, M.E., Ph.D., for her guidance, constant encouragement, immense help and valuable advice for the completion of this project.

We wish to convey our sincere thanks to all the teaching and non-teaching staff of the Department of Computer Science and Engineering, without whose cooperation this venture would not have been a success.

CERTIFICATE OF EVALUATION

College Name : St. JOSEPH’S INSTITUTE OF TECHNOLOGY

Branch : COMPUTER SCIENCE AND ENGINEERING
Semester : VIII

Sl. No   Name of the Students       Title of the Project     Name of the Supervisor with designation

1        PRIYA S                    Heart beat monitoring    Dr. S. Padmakala, M.E., Ph.D.,
         (312417104070)             using ICA, PCA, FFT      Professor & Supervisor,
2        PREETHA CATHERINE S                                 Department of CSE,
         (312417104068)                                      St. Joseph’s Institute of Technology

The reports of the project work submitted by the above students in partial fulfilment for the award of the Bachelor of Engineering Degree in Computer Science and Engineering of Anna University were evaluated and confirmed to be reports of the work done by the above students.

(INTERNAL EXAMINER) (EXTERNAL EXAMINER)


ABSTRACT

Heart Rate (HR) is one of the most important physiological parameters and a vital indicator of a person’s physiological state, and it is therefore important to monitor. Monitoring of HR often involves high costs and the complex application of sensors and sensor systems. Research over the last decade has focused more on non-contact systems, which are simple, low-cost and comfortable to use. Still, most non-contact systems are fit only for lab environments in offline situations and need to progress considerably before they can be applied in real-time applications. This project presents a real-time HR monitoring method using the webcam of a laptop computer.

The heart rate is obtained from facial skin color variation caused by blood circulation. Three different signal processing methods, Fast Fourier Transform (FFT), Independent Component Analysis (ICA) and Principal Component Analysis (PCA), have been applied to the color channels extracted from the facial regions of the video. HR is subsequently quantified and compared to corresponding reference measurements.

The obtained results show that there is a high degree of agreement between the proposed experiments and the reference measurements. This technology has significant potential for advancing personal health care and telemedicine.

TABLE OF CONTENTS

CHAPTER NO   TITLE                                                                     PAGE NUMBER

             ABSTRACT                                                                  iv
             LIST OF FIGURES                                                           viii
1.           INTRODUCTION                                                              1
             1.1 Overview                                                              1
             1.2 Problem Statement                                                     1
             1.3 Existing System                                                       2
             1.4 Proposed System                                                       2
2.           LITERATURE SURVEY                                                         4
3.           SYSTEM DESIGN                                                             5
             System Requirements                                                       5
             3.1 UML Flow Diagrams                                                     5
             3.1.1 Use Case Diagram of Heart beat monitoring using ICA, PCA, FFT       7
             3.1.2 Class Diagram of Heart beat monitoring using ICA, PCA, FFT          9
             3.1.3 Sequence Diagram of Heart beat monitoring using ICA, PCA, FFT       10
             3.1.4 Activity Diagram of Heart beat monitoring using ICA, PCA, FFT       10
             3.1.5 Collaboration Diagram of Heart beat monitoring using ICA, PCA, FFT  12
             3.1.6 Component Diagram of Heart beat monitoring using ICA, PCA, FFT      13
             3.1.7 Deployment Diagram of Heart beat monitoring using ICA, PCA, FFT     14
             3.1.8 Package Diagram of Heart beat monitoring using ICA, PCA, FFT        15
             3.1.9 Object Diagram of Heart beat monitoring using ICA, PCA, FFT         16
4.           SYSTEM ARCHITECTURE                                                       18
             4.1 Architectural Design                                                  18
             4.2 Architectural Description                                             18
5.           SYSTEM IMPLEMENTATION                                                     19
             5.1 System Description                                                    19
             5.1.1 ICA
             5.1.2 PCA
             5.1.3 FFT
             5.2 Pseudo code for Heart beat monitoring using ICA, PCA, FFT             22
6.           RESULTS AND CODING                                                        23
             6.1 Sample Code                                                           23
             6.2 Screenshots                                                           25
7.           CONCLUSION AND FUTURE WORK                                                38
             Conclusion                                                                38
             Future work
             REFERENCES                                                                39

LIST OF FIGURES

FIGURE NO    NAME OF THE FIGURE                                               PAGE NO

3.1          Use Case Diagram of Heart beat monitoring using ICA, PCA, FFT        8
3.2          Class Diagram of Heart beat monitoring using ICA, PCA, FFT           9
3.3          Sequence Diagram of Heart beat monitoring using ICA, PCA, FFT        10
3.4          Activity Diagram of Heart beat monitoring using ICA, PCA, FFT        11
3.5          Collaboration Diagram of Heart beat monitoring using ICA, PCA, FFT   12
3.6          Component Diagram of Heart beat monitoring using ICA, PCA, FFT       13
3.7          Deployment Diagram of Heart beat monitoring using ICA, PCA, FFT      14
3.8          Package Diagram of Heart beat monitoring using ICA, PCA, FFT         15
3.9          Object Diagram of Heart beat monitoring using ICA, PCA, FFT          17
4.1          Architecture Diagram of Heart beat monitoring using ICA, PCA, FFT    17

LIST OF ABBREVIATIONS

HR     Heart Rate
HRV    Heart Rate Variability
ICA    Independent Component Analysis
PCA    Principal Component Analysis
FFT    Fast Fourier Transform
CHAPTER 1

INTRODUCTION

Nowadays people are becoming more interested in their personal health. This is evident from the rapid rise in personal health-related applications for smart phones, which today are capable of complex calculations. Vital signs are good indicators of personal health, and their measurement is therefore of interest. The most commonly measured vital sign is the heart rate.

1.1 OVERVIEW

In recent years, the use of home health monitoring systems has increased, since cardiovascular disease is among the most dangerous conditions: heart disease is the number-one killer compared to all cancers combined. In this report, an automatic cardiovascular pulse rate monitoring system using a webcam integrated with a personal computer is described. Respiratory disease is a medical term for conditions affecting the respiratory organs of higher organisms, including the upper respiratory tract, trachea, bronchi, bronchioles, alveoli, pleura and pleural cavity. Heart rate is the speed of the heartbeat, measured by the number of contractions of the heart per minute (bpm). Depending on the body’s physical needs, the heart rate may vary from person to person.

One respiratory disease, severe acute respiratory syndrome (SARS), is caused by a coronavirus. An outbreak of SARS in southern China resulted in 774 deaths between November 2002 and July 2003. No cases of SARS have been reported worldwide since 2004. As of 2015, there is no treatment for SARS that is proven safe for humans. The identification and development of novel vaccines and medicines to treat SARS is a priority for governments and public health agencies around the world. Some international airports adopted a screening method using thermography.
1.2 PROBLEM STATEMENT

The problem is to monitor the heart rate of an individual without using any sensor, in order to reduce cost and time and to make monitoring easily available in remote areas.
1.3 EXISTING SYSTEM

There have been many methods developed to ensure that the heartbeat rate of a human is under control. An optical heart rate sensor measures pulse waves, which are changes in the volume of a blood vessel that occur when the heart pumps blood. Pulse waves are detected by measuring the change in volume using an optical sensor and a green LED.

The problem with these optical monitors is the presence of noise in the received signals, because signal and noise have similar frequencies. In some cases the measured heartbeat is wrong, depending on the physical body and the intensity of the user’s activity, and motion artifacts caused by small motions or movement must also be removed. All these methods share the same drawback of limited accuracy.

1.4 PROPOSED SYSTEM

Heart Rate Variability analysis has been gaining attention from researchers due to its wide range of applications. There are many ways to sense the heart beat, but very few techniques can sense it without any physical contact. The subtle changes in a static scene that are invisible to the naked eye can be detected using signal and image processing.

This technique involves color change detection and evaluation of heart rate using frequency change. It also monitors abnormal movements that cannot be seen by the naked eye. Three algorithms, FFT, ICA and PCA, are applied at the same time but separately to extract HR in real time using only facial video.

For the FFT method, the average of the R, G and B signals is calculated. ICA is used because it can remove motion artifacts by separating out the fluctuations caused by small motions or movement. PCA applies a transformation defined in such a way that the first principal component has the largest possible variance, and each succeeding component in turn has the highest variance possible under the constraint that it is orthogonal to the preceding components.

ICA algorithm

ICA is used because it can remove motion artifacts by separating out the fluctuations caused by small motions or movement. Notably, ICA returns the independent components in random order, so the component whose power spectrum contains the highest peak is selected for further analysis.

PCA

Similarly, the normalized raw traces are also decomposed by PCA to find the principal components. This transformation is defined in such a way that the first principal component has the largest possible variance, and each succeeding component in turn has the highest variance possible under the constraint that it is orthogonal to the preceding components.

The resulting vectors form an uncorrelated orthogonal basis set. The principal components are orthogonal because they are the eigenvectors of the covariance matrix, which is symmetric. PCA is sensitive to the relative scaling of the original variables.

Fast Fourier Transform (FFT)

Finally, the Fast Fourier Transform (FFT) is applied to the selected source signal to obtain its power spectrum. The pulse frequency is designated as the frequency corresponding to the highest power of the spectrum within an operational frequency band.
CHAPTER 2

LITERATURE REVIEW

Toshiaki Negishi et al. (2018) proposed a Contactless Vital Signs Measurement System Using RGB-Thermal Image Sensors and Its Clinical Screening Test on Patients with Seasonal Influenza for the screening of infectious diseases, which demonstrates a heart rate and respiration rate measurement system using RGB-thermal image sensors. The RGB camera measures the blood volume pulse (BVP) through variations in the light absorbed from human facial areas. IRT is used to estimate the respiration rate by measuring the change in temperature near the nostrils or mouth accompanying respiration.

Hamidreza Shirzadfar et al. (2018) proposed Heart Beat Rate Monitoring Using Optical Sensors, which presents a heart rate measuring device with a power button: the user places a fingertip on the sensors, the microcontroller key is pressed, and the heart rate is measured in 15 seconds and displayed on the LCD screen. Using this device, heart rate can be measured easily and in a short time with a non-invasive method, without harm to the human body.

Guha Balakrishnan et al. (2013) proposed Detecting Pulse from Head Motions in Video, which extracts heart rate and beat lengths from videos by measuring the subtle head motion caused by the Newtonian reaction to the influx of blood at each beat. Their method tracks features on the head and performs principal component analysis (PCA) to decompose their trajectories into a set of component motions.

Prajakta A. Pawar et al. (2014) proposed a heart rate monitoring system using an IR-based sensor and an Arduino Uno, which monitors physical parameters like heart beat and sends the measured data directly to a doctor through SMS. The system consists of an IR-based heart beat sensor, an Arduino Uno and a GSM module. The device can measure the heart beat of anyone from an infant to an elderly person. Its low cost will help provide an effective home-based monitoring system.

Tatsuya Mori et al. proposed Continuous Real-time Heart Rate Monitoring from Face Images, in which RPEM efficiency is investigated through the brightness change of face images with three fundamental experiments. RPEM is used to measure the light reflection from a face covered by copper film and to remove the motion noise from the green light absorption variation.

Ramin Irani et al. proposed Improved Pulse Detection from Head Motions Using DCT, a system in which the heartbeats (pulses) are detected from the subtle motions that appear on the face due to blood circulation. The proposed system has been tested with different facial expressions, and the experimental results show that it is accurate and robust and outperforms the state-of-the-art. The Viola-Jones algorithm, based on Haar-like rectangular features extracted from integral images, has been employed for face detection.

Alaleh Alivar et al. proposed Motion Artifact Detection and Reduction in Bed-Based Ballistocardiogram, whose goal is a reliable estimation of the beat-to-beat (B-B) interval. The proposed algorithm includes two main stages: a motion detection algorithm followed by a motion artifact removal system. Motion detection involves a sequential detection algorithm in which successive data frames are compared to two thresholds, upper and lower. Each motion-corrupted frame can then be reconstructed by an approach that relies on a parametric model of the BCG signal.

Nayan A. Nai et al. proposed Heart Beat Sensing Without Physical Contact Using Signal and Image Processing, in which the system measures the heartbeat without any physical contact with the person. It uses concepts of image and signal processing on the MATLAB platform and provides a graphical representation of the calculated heart rate at the end. MATLAB is a high-performance language for technical computing that integrates computation, visualization and programming in an easy-to-use environment where problems and solutions are expressed in mathematical notation. MATLAB has various inbuilt functions and its own libraries, which help in implementing programs.

Larissa Carvalho et al. proposed Analysis of Heart Rate Monitoring Using a Webcam and implemented a non-invasive heart rate monitoring system to monitor subjects of different age groups using digital image processing. The main fields of research are image processing and computer vision. Variations in videos that are difficult or impossible to see with the naked eye are revealed by taking a standard video of the subject as input, performing face tracking, applying pyramid decomposition and then filtering the frames. The resulting signal is amplified to reveal hidden information, making it possible to visualize the flow of blood as it fills the face and, from this result, to extract the subject’s heart rate. This method is based on the Eulerian Video Magnification algorithm presented at SIGGRAPH 2012.

S. S. Lokhande, Binu K. Nair et al. proposed Heart Rate and Respiratory Rate Measurement Using Image Processing, in which non-contact measurement of multiple vital signs, i.e. respiratory rate and heart rate, based on RGB image processing with a CMOS camera is proposed. Monitoring the periodic temperature changes of RGB images in the nasal area allows the respiratory rate to be calculated. Heart rate is measured by capturing the brightness variations of RGB facial images caused by fluctuations in skin blood flow. This non-contact method can prevent the transmission of disease.
CHAPTER 3

SYSTEM DESIGN

In this chapter, the various UML diagrams for Heart Rate Monitoring Using FFT, ICA, PCA are presented and the various functionalities are explained.

3.1 UNIFIED MODELLING LANGUAGE

Unified Modeling Language (UML) is a standardized modeling language enabling developers to specify, visualize, construct and document the artifacts of a software system. Thus, UML helps make these artifacts scalable, secure and robust in execution.

It uses graphic notation to create visual models of software systems. UML is designed to enable users to develop an expressive, ready-to-use visual modeling language. In addition, it supports high-level development concepts such as frameworks, patterns and collaborations. Some of the UML diagrams are discussed below.

3.1.1 USE CASE DIAGRAM OF Heart Rate Monitoring Using FFT, ICA, PCA

Use case diagrams are used for high-level requirement analysis of a system. When the requirements of a system are analyzed, the functionalities are captured in use cases; use cases are essentially the system functionalities written in an organized manner. The second element relevant to use case diagrams is the actors. Actors can be defined as something that interacts with the system: human users, internal applications, or external applications.

Use case diagrams are used to gather the requirements of a system, including internal and external influences. These requirements are mostly design requirements. Hence, when a system is analyzed to gather its functionalities, use cases are prepared and actors are identified.

Figure 3.1 Use case diagram of Heart Rate Monitoring Using FFT, ICA, PCA

The functionalities are represented as use cases in the diagram. Each use case is a function that the user or the server can access. The names of the use cases indicate the functionalities performed. Where extra notes are needed to clarify a requirement to the user, a note structure is added to the use case diagram. Only the main relationships between the actors and the functionalities are shown, because showing every relationship would clutter the diagram. The use case diagram shown in Figure 3.1 provides details of Heart Rate Monitoring Using FFT, ICA, PCA.

3.1.2 CLASS DIAGRAM OF Heart Rate Monitoring Using FFT, ICA, PCA

A class diagram is basically a graphical representation of the static view of the system and represents different aspects of the application, so a collection of class diagrams represents the whole system.

Figure 3.2 Class diagram of Heart Rate Monitoring Using FFT, ICA, PCA

The name of the class diagram should be meaningful and describe the aspect of the system. Each element and its relationships should be identified in advance; the responsibility (attributes and methods) of each class should be clearly identified; and for each class, a minimum number of properties should be specified. All of these specifications for the system are displayed in the class diagram in Figure 3.2.
3.1.3 SEQUENCE DIAGRAM OF Heart Rate Monitoring Using FFT, ICA, PCA

UML sequence diagrams model the flow of logic within the system in a visual manner, enabling one both to document and to validate the logic; they are commonly used for both analysis and design purposes.

Figure 3.3 Sequence diagram of Heart Rate Monitoring Using FFT, ICA, PCA

The various actions that take place in the application, in the correct sequence, are shown in Figure 3.3. Sequence diagrams are the most popular UML diagrams for dynamic modeling.

3.1.4 ACTIVITY DIAGRAM OF Heart Rate Monitoring Using FFT, ICA, PCA

An activity is a particular operation of the system. Activity diagrams are suitable for modeling the activity flow of the system.
Figure 3.4 Activity diagram of Heart Rate Monitoring Using FFT, ICA, PCA

Activity diagrams are not only used for visualizing the dynamic nature of a system; they are also used to construct the executable system using forward and reverse engineering techniques. The only thing missing in an activity diagram is the message part.

An application can have multiple systems. Activity diagrams also capture these systems and describe the flow from one system to another; this specific usage is not available in other diagrams. These systems can be databases, external queues, or any other systems.

It does not show any message flow from one activity to another. An activity diagram is sometimes considered a flow chart; although it looks like one, it is not. It shows different kinds of flow: parallel, branched, concurrent and single. Figure 3.4 shows the activity diagram of the developed application.

3.1.5 COLLABORATION DIAGRAM OF Heart Rate Monitoring Using FFT, ICA, PCA

Figure 3.5 Collaboration diagram of Heart Rate Monitoring Using FFT, ICA, PCA

The next interaction diagram is the collaboration diagram, which shows the object organization. In a collaboration diagram the method-call sequence is indicated by a numbering technique: the numbers indicate how the methods are called one after another. The method calls are similar to those of a sequence diagram, but the difference is that the sequence diagram does not describe the object organization, whereas the collaboration diagram does. The various objects involved and their collaboration are shown in Figure 3.5.

To choose between these two diagrams, the main emphasis is on the type of requirement: if the time sequence is important, a sequence diagram is used; if the organization is required, a collaboration diagram is used.

3.1.6 COMPONENT DIAGRAM OF Heart Rate Monitoring Using FFT, ICA, PCA

Figure 3.6 Component diagram of Heart Rate Monitoring Using FFT, ICA, PCA

A component diagram displays the structural relationship of the components of a software system. Component diagrams are mostly used when working with complex systems that have many components, such as sensor nodes, cluster heads and base stations. A component diagram does not describe the functionality of the system but the components used to realize those functionalities. Components communicate with each other using interfaces, and the interfaces are linked using connectors. Figure 3.6 shows the component diagram.
3.1.7 DEPLOYMENT DIAGRAM OF Heart Rate Monitoring Using FFT, ICA, PCA

Figure 3.7 Deployment diagram of Heart Rate Monitoring Using FFT, ICA, PCA

A deployment diagram shows the hardware of the system and the software deployed on that hardware. Deployment diagrams are useful when a software solution is deployed across multiple machines, each with a unique configuration. The diagram shows how modules such as the FFT algorithm, ICA algorithm, PCA and filtration are deployed in the system.
3.1.8 PACKAGE DIAGRAM OF Heart Rate Monitoring Using FFT, ICA, PCA

Figure 3.8 Package diagram of Heart Rate Monitoring Using FFT, ICA, PCA

Package diagrams are used to reflect the organization of packages and their elements. When used to represent class elements, package diagrams provide a visualization of the namespaces. Package diagrams are used to structure high-level system elements, and they can simplify complex class diagrams by grouping classes into packages. A package is a collection of logically related UML elements.

Packages are depicted as file folders and can be used in any of the UML diagrams. Figure 3.8 shows the package diagram for the developed application, which represents how the elements are logically related.

A package is a namespace used to group together elements that are semantically related and might change together. It is a general-purpose mechanism for organizing elements into groups to give the system model a better structure.

3.1.9 OBJECT DIAGRAM OF Heart Rate Monitoring Using ICA, PCA and FFT

An object is an instance at a particular moment in runtime, including objects and data values. A static UML object diagram is an instance of a class diagram; it shows a snapshot of the detailed state of a system at a point in time. Thus an object diagram encompasses objects and their relationships at a point in time.

An Object Diagram can be referred to as a screenshot of the instances in a


system and the relationship that exists between them. Since object diagrams
depict behavior when objects have been instantiated, we are able to study the
behavior of the system at a particular instant. Object diagrams are vital to
portray and understand functional requirements of a system.

Object Diagrams use real world examples to depict the nature and structure of
the system at a particular point in time. Since we are able to use data
available within objects, Object diagrams provide a clearer view of the
relationships that exist between objects.

Object diagrams use a subset of the elements of a class diagram in order to


emphasize the relationship between instances of classes at some point in
time. They are useful in understanding class diagrams. They don’t show
anything architecturally different to class diagrams, but reflect multiplicity and
roles.

Object diagrams are derived from class diagrams, so object diagrams are dependent upon class diagrams. An object diagram represents an instance of a class diagram, and the basic concepts are similar for both.

Object diagrams also represent the static view of a system but this static view is a
snapshot of the system at a particular moment. Object diagrams are used to
render a set of objects and their relationships as an instance.

Figure 3.9 Object diagram of Heart Rate Monitoring Using ICA, PCA and FFT

Figure 3.9 shows the object diagram for the developed application and the relationships between the objects, such as the 1:1 relationship between the client object and the server object.

CHAPTER 4

SYSTEM ARCHITECTURE

In this chapter, the system architecture for Heart Rate Monitoring using ICA/PCA/FFT is presented and the modules are explained.

ARCHITECTURE DESCRIPTION

In the system architecture, the system modules and the working of each module are described in detail, as shown in Figure 4.1.

Figure 4.1 System Architecture of Heart rate monitoring using PCA/ICA/FFT
Design is a multi-step process that focuses on data structures, software architecture, procedural details and the interfaces among modules. The design process also translates the requirements into a representation of the software that can be assessed for quality before coding begins. Computer software design changes continuously as novel methods, improved analysis and broader understanding evolve. Software design is at a relatively early stage in its evolution; therefore, software design methodology lacks the depth, flexibility and quantitative nature usually associated with more conventional engineering disciplines.
CHAPTER 5

SYSTEM IMPLEMENTATION

IMPLEMENTATION OF Heart rate monitoring using PCA/ICA/FFT

Initially, the patients were asked to sit at a table in front of a laptop computer at a distance of approximately 0.5 m from the built-in webcam (HP HD webcam). During recording, the patients were asked to keep still, breathe spontaneously, and face the webcam while their video was captured for 2 minutes to detect their facial features. The experiments were carried out indoors with a sufficient amount of ambient sunlight. HR was extracted in real time and displayed in the GUI. The real-time HR monitoring system extracts a number of image frames one by one over a period of time defined by the user. It is also important to note that the resolution of the video should remain the same during each image frame extraction for the further calculations to hold. All facial image frames (24-bit RGB) during real-time HR extraction were recorded sequentially at 30 frames per second (fps) with a pixel resolution of 640 × 480 on the laptop.

• This project has 4 modules:

Module 1: Video display and extraction of each image frame.
Module 2: Face detection and facial image extraction.
Module 3: RGB signal extraction and pre-processing.
Module 4: Extraction of HR using the ICA, PCA and FFT methods and display of the results.

5.1 Module Description

• Video display and extraction of each image frame.

• Face tracking and Region of Interest selection: the automatic face detection function ‘CascadeObjectDetector’ of the Computer Vision Toolbox is used.

• RGB signal extraction and pre-processing:

The R, G and B color values of each pixel of the facial image frames are the most essential data for this experiment. Hence a suitable Region of Interest (ROI) was searched for over the detected face.
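The report performs face detection with MATLAB’s ‘CascadeObjectDetector’; as a sketch of the equivalent averaging step in Python/NumPy, the following turns a detected face box into one averaged (R, G, B) sample per frame. The `roi_mean_rgb` name and the `shrink` factor are illustrative assumptions; the face box is assumed to have been produced by any face detector.

```python
import numpy as np

def roi_mean_rgb(frame, face_box, shrink=0.6):
    """Average the R, G, B values inside a central region of the
    detected face.  `face_box` = (x, y, w, h) from a face detector;
    the ROI is shrunk toward the centre to avoid hair and background."""
    x, y, w, h = face_box
    dx = int(w * (1 - shrink) / 2)
    dy = int(h * (1 - shrink) / 2)
    roi = frame[y + dy : y + h - dy, x + dx : x + w - dx]
    return roi.reshape(-1, 3).mean(axis=0)   # one (R, G, B) sample per frame

# Example: a synthetic 480x640 RGB frame with a uniform "face" patch
frame = np.zeros((480, 640, 3), dtype=np.uint8)
frame[100:300, 200:360] = (180, 120, 100)    # skin-coloured block
r, g, b = roi_mean_rgb(frame, (200, 100, 160, 200))
```

Repeating this per frame yields the raw R(t), G(t), B(t) traces used below.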

• Signal detrending:

The RGB signal is detrended using a method based on the smoothness priors approach, and the average of the R, G and B signals is calculated for the FFT method.

• Filtering:

Before applying PCA, ICA and FFT, the Red, Green and Blue signals formed from all the red, green and blue image frames are filtered with a Hamming window over the heart-rate band.
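One way to realize this Hamming-window filtering is a Hamming-windowed FIR band-pass restricted to plausible heart rates; the 0.75–4 Hz band (45–240 bpm) and the tap count are assumptions, since the report does not specify them.

```python
import numpy as np
from scipy.signal import firwin, filtfilt

FS = 30.0                        # webcam frame rate (fps), as in the report
LOW, HIGH = 0.75, 4.0            # assumed heart-rate band: 45-240 bpm

# FIR band-pass whose impulse response is shaped by a Hamming window
taps = firwin(151, [LOW, HIGH], pass_zero=False, window="hamming", fs=FS)

def bandpass(trace):
    """Zero-phase filtering of one colour trace."""
    return filtfilt(taps, [1.0], trace)

# Example: a 1.25 Hz pulse (75 bpm) buried under slow illumination drift
t = np.arange(0, 30, 1 / FS)     # 30 s of samples -> 900 frames
raw = np.sin(2 * np.pi * 1.25 * t) + 3 * np.sin(2 * np.pi * 0.1 * t)
clean = bandpass(raw)            # drift removed, pulse retained
```

Zero-phase filtering (`filtfilt`) is chosen so the peaks used later for HR counting are not shifted in time.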

• Normalization:

After averaging, the RGB signal is normalized using

X_i(t) = (Y_i(t) − μ_i) / δ_i        (1)

for each i = R, G and B, where μ_i is the mean and δ_i is the standard deviation of Y_i.
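As a sketch, equation (1) can be implemented directly in Python/NumPy (the function name is illustrative):

```python
import numpy as np

def normalize(trace):
    """X(t) = (Y(t) - mu) / delta: zero-mean, unit-variance colour trace."""
    return (trace - trace.mean()) / trace.std()

# Example: a trace with mean 5 and standard deviation 2
y = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
x = normalize(y)                 # mean 0, standard deviation 1
```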

• Extraction of HR using the ICA, PCA and FFT methods and display of the results.

• For the real-time monitoring system:

To extract HR in real time, the number of peaks in the frequency domain is first calculated over the first 50 image frames. HR is then calculated as HR = 60 × f_h bpm (beats per minute), where f_h is the extracted frequency of the HR:

HR = 60 × f_h bpm = [60 × (number of peaks / time)] bpm = 60 × (25 / 20) bpm = 75 bpm.
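The worked example above maps directly to a small sketch (the function name is illustrative):

```python
def heart_rate_bpm(num_peaks, duration_s):
    """HR = 60 * f_h, with f_h = number of peaks / observation time (Hz)."""
    f_h = num_peaks / duration_s
    return 60.0 * f_h

# The report's example: 25 peaks counted over 20 s
hr = heart_rate_bpm(25, 20.0)    # 75.0 bpm
```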

Three algorithms, FFT, ICA and PCA, are applied at the same time but separately to extract HR in real time using only facial video. The average of the R, G and B signals is calculated for the FFT method. For the ICA method, the normalized raw traces are decomposed into three independent source signals (R, G and B).

5.1.1 ICA algorithm

ICA is used here because it can remove motion artifacts by separating out the fluctuations caused by small movements. ICA returns the independent components in arbitrary order, so the component whose power spectrum contains the highest peak is selected for further analysis.
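The component-selection rule can be sketched as follows. The decomposition itself would come from an ICA routine (e.g. FastICA) and is not reproduced here, so synthetic components stand in for the ICA outputs:

```python
import numpy as np

def select_component(components):
    """Pick the component whose power spectrum has the strongest peak.
    `components` is (n_components, n_samples); in the full pipeline these
    would be the outputs of an ICA decomposition such as FastICA."""
    best, best_power = None, -np.inf
    for c in components:
        power = np.abs(np.fft.rfft(c - c.mean())) ** 2
        if power.max() > best_power:
            best_power, best = power.max(), c
    return best

fs = 20.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
comps = np.array([
    rng.normal(size=t.size),              # noise-dominated component
    3 * np.sin(2 * np.pi * 1.2 * t),      # pulse-dominated component
    0.5 * rng.normal(size=t.size),        # low-amplitude noise component
])
chosen = select_component(comps)          # picks the pulse component
```

Because a periodic pulse concentrates its energy in one narrow frequency band while motion noise spreads broadly, the highest-spectral-peak rule reliably picks out the cardiac component despite ICA's arbitrary ordering.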

5.1.2 PCA

Principal component analysis (PCA) computes the principal components of a data set and uses them to perform a change of basis on the data, often retaining only the first few components and ignoring the rest. In this project the normalized raw traces are also decomposed by PCA to find the principal components. The transformation is defined so that the first principal component has the largest possible variance, and each succeeding component has the highest variance possible under the constraint that it is orthogonal to the preceding components. The resulting vectors form an uncorrelated orthogonal basis; they are orthogonal because they are the eigenvectors of the covariance matrix, which is symmetric. PCA is sensitive to the relative scaling of the original variables.
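A minimal NumPy version of this decomposition, built directly on the eigenvectors of the covariance matrix (the project's own implementation may differ in detail):

```python
import numpy as np

def pca(X):
    """Project zero-mean observations onto the eigenvectors of the
    covariance matrix, ordered by decreasing variance."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)   # symmetric matrix, ascending order
    order = np.argsort(eigvals)[::-1]        # largest variance first
    return Xc @ eigvecs[:, order], eigvals[order]

# synthetic traces: the first channel varies most, the last least
rng = np.random.default_rng(2)
rgb = rng.normal(size=(300, 3)) * np.array([5.0, 2.0, 0.5])
scores, variances = pca(rgb)   # scores are uncorrelated, variances descending
```

The covariance matrix of the returned scores is diagonal, which is exactly the "uncorrelated orthogonal basis" property described above.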

5.1.3 Fast Fourier Transform (FFT)

A fast Fourier transform is an algorithm that computes the discrete Fourier transform of a sequence, or its inverse; Fourier analysis converts a signal between its original domain and the frequency domain. Finally, the FFT is applied to the selected source signal to obtain its power spectrum. The pulse frequency is designated as the frequency corresponding to the highest power of the spectrum within an operational frequency band.
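A sketch of the band-limited peak search; the operational band of 0.75–4 Hz (45–240 bpm) is an assumed range:

```python
import numpy as np

fs = 20.0                                   # assumed frame rate in Hz
t = np.arange(0, 12, 1 / fs)
# pulse component plus a slow respiration-like drift
source = (np.sin(2 * np.pi * 1.25 * t) +
          0.3 * np.sin(2 * np.pi * 0.3 * t))

freqs = np.fft.rfftfreq(t.size, d=1 / fs)
power = np.abs(np.fft.rfft(source)) ** 2

# restrict the search to an operational band, e.g. 0.75-4 Hz (45-240 bpm)
band = (freqs >= 0.75) & (freqs <= 4.0)
pulse_freq = freqs[band][np.argmax(power[band])]
print(pulse_freq * 60)   # heart rate in bpm
```

Restricting the search to a physiologically plausible band prevents slow components such as respiration or illumination drift from being mistaken for the pulse.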

5.2 PSEUDO CODE FOR HEART BEAT RATE MONITORING USING PCA/ICA/FFT

Input : Real time image using webcam.

Output : Heart Rate is displayed.

Algorithm :

Step 1: Start.

Step 2: Detect facial features and landmarks.

Step 3: Select the ROI and extract the average RGB signal.

Step 4: Apply ICA and PCA to the filtered and normalized signal.

Step 5: Use FFT to find the spectral peak.

Step 6: Calculate and display the heart rate.

Step 7: Stop.

CHAPTER 6

CODING AND SCREENSHOTS

6.1 CODING

FACE_DETECTION.py

import cv2
import numpy as np
import dlib
from imutils import face_utils
import imutils


class FaceDetection(object):
    def __init__(self):
        self.detector = dlib.get_frontal_face_detector()
        self.predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")
        self.fa = face_utils.FaceAligner(self.predictor, desiredFaceWidth=256)

    def face_detect(self, frame):
        face_frame = np.zeros((10, 10, 3), np.uint8)
        mask = np.zeros((10, 10, 3), np.uint8)
        ROI1 = np.zeros((10, 10, 3), np.uint8)
        ROI2 = np.zeros((10, 10, 3), np.uint8)
        status = False

        if frame is None:
            return

        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # detect faces in the grayscale image (assumption: only 1 face)
        rects = self.detector(gray, 0)

        if len(rects) > 0:
            status = True

            # convert dlib's rectangle to an OpenCV-style bounding box
            # [i.e., (x, y, w, h)]
            (x, y, w, h) = face_utils.rect_to_bb(rects[0])
            if y < 0:
                # face bounding box extends above the frame; skip this frame
                return frame, face_frame, ROI1, ROI2, status, mask

            face_frame = frame[y:y + h, x:x + w]
            if face_frame.shape[:2][1] != 0:
                face_frame = imutils.resize(face_frame, width=256)

            face_frame = self.fa.align(frame, gray, rects[0])  # align face

            grayf = cv2.cvtColor(face_frame, cv2.COLOR_BGR2GRAY)
            rectsf = self.detector(grayf, 0)

            if len(rectsf) > 0:
                # facial landmarks as a NumPy array of (x, y) coordinates
                shape = self.predictor(grayf, rectsf[0])
                shape = face_utils.shape_to_np(shape)

                for (a, b) in shape:
                    cv2.circle(face_frame, (a, b), 1, (0, 0, 255), -1)  # draw facial landmarks

                # draw rectangles on the right and left cheeks
                cv2.rectangle(face_frame, (shape[54][0], shape[29][1]),
                              (shape[12][0], shape[33][1]), (0, 255, 0), 0)
                cv2.rectangle(face_frame, (shape[4][0], shape[29][1]),
                              (shape[48][0], shape[33][1]), (0, 255, 0), 0)

                ROI1 = face_frame[shape[29][1]:shape[33][1],   # right cheek
                                  shape[54][0]:shape[12][0]]
                ROI2 = face_frame[shape[29][1]:shape[33][1],   # left cheek
                                  shape[4][0]:shape[48][0]]

                # get the outline of the face for color amplification
                rshape = self.face_remap(shape)
                mask = np.zeros((face_frame.shape[0], face_frame.shape[1]))
                cv2.fillConvexPoly(mask, rshape[0:27], 1)
        else:
            cv2.putText(frame, "No face detected",
                        (200, 200), cv2.FONT_HERSHEY_PLAIN, 1.5, (0, 0, 255), 2)
            status = False

        return frame, face_frame, ROI1, ROI2, status, mask

    # some points in the facial landmarks need to be re-ordered
    def face_remap(self, shape):
        remapped_image = shape.copy()

        # left eyebrow
        remapped_image[17] = shape[26]
        remapped_image[18] = shape[25]
        remapped_image[19] = shape[24]
        remapped_image[20] = shape[23]
        remapped_image[21] = shape[22]

        # right eyebrow
        remapped_image[22] = shape[21]
        remapped_image[23] = shape[20]
        remapped_image[24] = shape[19]
        remapped_image[25] = shape[18]
        remapped_image[26] = shape[17]

        # neatening
        remapped_image[27] = shape[0]

        remapped_image = cv2.convexHull(shape)
        return remapped_image

GUI.py

import cv2
import numpy as np
from PyQt5 import QtCore
from PyQt5.QtCore import *
from PyQt5.QtGui import *
from PyQt5.QtWidgets import *

import pyqtgraph as pg
import sys
import time
from process import Process
from webcam import Webcam
from video import Video
from interface import waitKey, plotXY
import json
import os


class Communicate(QObject):
    closeApp = pyqtSignal()


class GUI(QMainWindow, QThread):
    def __init__(self):
        if os.path.exists("data.json"):
            os.remove("data.json")

        super(GUI, self).__init__()
        self.initUI()
        self.webcam = Webcam()
        self.video = Video()
        self.input = self.webcam
        self.dirname = ""
        print("Input: webcam")
        self.statusBar.showMessage("Input: webcam", 5000)
        self.btnOpen.setEnabled(False)
        self.process = Process()
        self.status = False
        self.frame = np.zeros((10, 10, 3), np.uint8)
        self.bpm = 0

    def initUI(self):
        # set font
        font = QFont()
        font.setPointSize(16)

        # widgets
        self.btnStart = QPushButton("Start", self)
        self.btnStart.move(440, 520)
        self.btnStart.setFixedWidth(200)
        self.btnStart.setFixedHeight(50)
        self.btnStart.setFont(font)
        self.btnStart.clicked.connect(self.run)

        self.btnOpen = QPushButton("Open", self)
        self.btnOpen.move(230, 520)
        self.btnOpen.setFixedWidth(200)
        self.btnOpen.setFixedHeight(50)
        self.btnOpen.setFont(font)
        self.btnOpen.clicked.connect(self.openFileDialog)

        self.cbbInput = QComboBox(self)
        self.cbbInput.addItem("Webcam")
        self.cbbInput.addItem("Video")
        self.cbbInput.setCurrentIndex(0)
        self.cbbInput.setFixedWidth(200)
        self.cbbInput.setFixedHeight(50)
        self.cbbInput.move(20, 520)
        self.cbbInput.setFont(font)
        self.cbbInput.activated.connect(self.selectInput)

        self.lblDisplay = QLabel(self)  # label to show frame from camera
        self.lblDisplay.setGeometry(10, 10, 640, 480)
        self.lblDisplay.setStyleSheet("background-color: #000000")

        self.lblROI = QLabel(self)  # label to show face with ROIs
        self.lblROI.setGeometry(660, 10, 200, 200)
        self.lblROI.setStyleSheet("background-color: #000000")

        self.lblHR = QLabel(self)  # label to show HR change over time
        self.lblHR.setGeometry(900, 20, 300, 40)
        self.lblHR.setFont(font)
        self.lblHR.setText("Frequency: ")

        self.lblHR2 = QLabel(self)  # label to show stable HR
        self.lblHR2.setGeometry(900, 70, 300, 40)
        self.lblHR2.setFont(font)
        self.lblHR2.setText("Heart rate: ")

        # dynamic plots
        self.signal_Plt = pg.PlotWidget(self)
        self.signal_Plt.move(660, 220)
        self.signal_Plt.resize(480, 192)
        self.signal_Plt.setLabel('bottom', "Signal")

        self.fft_Plt = pg.PlotWidget(self)
        self.fft_Plt.move(660, 425)
        self.fft_Plt.resize(480, 192)
        self.fft_Plt.setLabel('bottom', "FFT")

        self.timer = pg.QtCore.QTimer()
        self.timer.timeout.connect(self.update)
        self.timer.start(200)

        self.statusBar = QStatusBar()
        self.statusBar.setFont(font)
        self.setStatusBar(self.statusBar)

        # event close
        self.c = Communicate()
        self.c.closeApp.connect(self.close)

        # config main window
        self.setGeometry(100, 100, 1160, 640)
        self.setWindowTitle("Heart rate monitor")
        self.show()

    def update(self):
        self.signal_Plt.clear()
        self.signal_Plt.plot(self.process.samples[20:], pen='g')

        self.fft_Plt.clear()
        self.fft_Plt.plot(np.column_stack((self.process.freqs, self.process.fft)), pen='g')

    def center(self):
        qr = self.frameGeometry()
        cp = QDesktopWidget().availableGeometry().center()
        qr.moveCenter(cp)
        self.move(qr.topLeft())

    def closeEvent(self, event):
        reply = QMessageBox.question(self, "Message", "Are you sure you want to quit?",
                                     QMessageBox.Yes | QMessageBox.No, QMessageBox.Yes)
        if reply == QMessageBox.Yes:
            event.accept()
            self.input.stop()
            cv2.destroyAllWindows()
        else:
            event.ignore()

    def selectInput(self):
        self.reset()
        if self.cbbInput.currentIndex() == 0:
            self.input = self.webcam
            print("Input: webcam")
            self.btnOpen.setEnabled(False)
            self.statusBar.showMessage("Input: webcam", 5000)
        elif self.cbbInput.currentIndex() == 1:
            self.input = self.video
            print("Input: video")
            self.btnOpen.setEnabled(True)
            self.statusBar.showMessage("Input: video", 5000)

    def mousePressEvent(self, event):
        self.c.closeApp.emit()

    def key_handler(self):
        """
        cv2 window must be focused for keypresses to be detected.
        """
        self.pressed = waitKey(1) & 255  # wait for keypress
        if self.pressed == 27:  # exit program on 'esc'
            print("[INFO] Exiting")
            self.webcam.stop()
            sys.exit()

    def openFileDialog(self):
        # getOpenFileName returns a (filename, filter) tuple in PyQt5
        self.dirname, _ = QFileDialog.getOpenFileName(self, 'OpenFile',
                                                      r"C:\Users\uidh2238\Desktop\test videos")
        self.statusBar.showMessage("File name: " + self.dirname, 5000)

    def reset(self):
        self.process.reset()
        self.lblDisplay.clear()
        self.lblDisplay.setStyleSheet("background-color: #000000")

    @QtCore.pyqtSlot()
    def main_loop(self):
        frame = self.input.get_frame()

        self.process.frame_in = frame
        self.process.run()

        cv2.imshow("Processed", frame)

        self.frame = self.process.frame_out  # frame to show in the GUI
        self.f_fr = self.process.frame_ROI   # face to show in the GUI
        self.bpm = self.process.bpm          # bpm change over time

        self.frame = cv2.cvtColor(self.frame, cv2.COLOR_RGB2BGR)
        cv2.putText(self.frame, "FPS " + str(float("{:.2f}".format(self.process.fps))),
                    (20, 460), cv2.FONT_HERSHEY_PLAIN, 1.5, (0, 255, 255), 2)
        img = QImage(self.frame, self.frame.shape[1], self.frame.shape[0],
                     self.frame.strides[0], QImage.Format_RGB888)
        self.lblDisplay.setPixmap(QPixmap.fromImage(img))

        self.f_fr = cv2.cvtColor(self.f_fr, cv2.COLOR_RGB2BGR)
        self.f_fr = np.transpose(self.f_fr, (0, 1, 2)).copy()
        f_img = QImage(self.f_fr, self.f_fr.shape[1], self.f_fr.shape[0],
                       self.f_fr.strides[0], QImage.Format_RGB888)
        self.lblROI.setPixmap(QPixmap.fromImage(f_img))

        self.lblHR.setText("Freq: " + str(float("{:.2f}".format(self.bpm))))

        if len(self.process.bpms) > 50:
            # show HR only if it is stable (change below 5 bpm)
            if max(self.process.bpms - np.mean(self.process.bpms)) < 5:
                self.lblHR2.setText("Heart rate: " +
                                    str(float("{:.2f}".format(np.mean(self.process.bpms)))) + " bpm")

            entry = [
                "FPS: " + str(float("{:.2f}".format(self.process.fps))),
                "Freq: " + str(float("{:.2f}".format(self.bpm))),
                "Heart rate: " + str(float("{:.2f}".format(np.mean(self.process.bpms))))
            ]
            with open('data.json', 'a') as outfile:
                outfile.write(json.dumps(entry))
                outfile.write(",")

        self.key_handler()  # without this the GUI cannot show anything

    def run(self, input):
        self.reset()
        input = self.input
        self.input.dirname = self.dirname
        if self.input.dirname == "" and self.input == self.video:
            print("choose a video first")
            self.statusBar.showMessage("choose a video first", 5000)
            return
        if self.status == False:
            self.status = True
            input.start()
            self.btnStart.setText("Stop")
            self.cbbInput.setEnabled(False)
            self.btnOpen.setEnabled(False)
            self.lblHR2.clear()
            while self.status == True:
                self.main_loop()
        elif self.status == True:
            self.status = False
            input.stop()
            self.btnStart.setText("Start")
            self.cbbInput.setEnabled(True)


if __name__ == '__main__':
    app = QApplication(sys.argv)
    ex = GUI()
    while ex.status == True:
        ex.main_loop()

    sys.exit(app.exec_())

6.2 SCREENSHOTS

CHAPTER 7

CONCLUSION AND FUTURE WORK

CONCLUSION

A real-time non-contact HR extraction method using facial video is described in this project; it is easy to implement, low cost and comfortable for real-time applications.

The main idea is to extract HR from the color variation in the facial skin caused by the cardiac pulse. The implementation has been done using a simple webcam in an indoor environment with constant ambient light. Better results (i.e. about 90% accuracy) can be achieved by taking the average of the three methods and by using recorded video.

FUTURE WORK

Creating a real-time, multi-parameter physiological measurement platform based on this technology, measuring for example respiration rate (RR), heart rate variability (HRV) and arterial blood oxygen saturation from higher-resolution video in driving situations, will be the subject of future work.

REFERENCES

[1] Hamidreza Shirzadfar, Mahsa Sadat Ghaziasgar, Zeinab Piri, Mahtab Khanahmad, (2011), ‘Heart beat rate monitoring using optical sensors’.

[2] Guha Balakrishnan, Fredo Durand, John Guttag, (2010), ‘Detecting Pulse from Head Motions in Video’.

[3] Ming-Zher Poh, Daniel J. McDuff, Rosalind W. Picard, (2012), ‘Non-contact, automated cardiac pulse measurements using video imaging and blind source separation’.

[4] Alaleh Alivar, (2009), ‘Motion Artifact Detection and Reduction in Bed-Based Ballistocardiogram’.

[5] Ching-Wei Wang, Andrew Hunter, Neil Gravill, Simon Matusiewicz, (2011), ‘Unconstrained Video Monitoring of Breathing Behavior and Application to Diagnosis of Sleep Apnea’.

[6] Ramin Irani, Kamal Nasrollahi, Thomas B. Moeslund, (2011), ‘Improved Pulse Detection from Head Motions using DCT’.

[7] Tatsuya Mori, (2008), ‘Continuous Real-time Heart Rate Monitoring from Face Images’.

[8] Pradyumna Chari, Krish Kabra, Doruk Karinca, Soumyarup Lahiri, Diplav Srivastava, Kimaya Kulkarni, Tianyuan Chen, Maxime Cannesson, Laleh Jalilian, Achuta Kadambi, (2010), ‘Diverse R-PPG: Camera-Based Heart Rate Estimation for Diverse Subject Skin-Tones and Scenes’.

[9] Daniel McDuff, Sarah Gontarek, Rosalind W. Picard, (2014), ‘Improvements in Remote Cardio-Pulmonary Measurement Using a Five Band Digital Camera’.

[10] Wim Verkruysse, Lars O. Svaasand, J. Stuart Nelson, (2013), ‘Remote plethysmographic imaging using ambient light’.

[11] Frédéric Bousefsaf, Alain Pruski, Choubeila Maaoui, (2012), ‘3D Convolutional Neural Networks for Remote Pulse Rate Measurement and Mapping from Facial Video’.
