
A

Project Report on

GSM-BASED SMART HELMET DROWSY DETECTION SYSTEM

Submitted for partial fulfilment of the requirements for the award of the degree of

BACHELOR OF TECHNOLOGY

in
ELECTRONICS AND COMMUNICATION ENGINEERING
by

NAME OF THE STUDENT Roll No

NAME OF THE STUDENT Roll No

NAME OF THE STUDENT Roll No

NAME OF THE STUDENT Roll No

Under the Guidance of


GUIDE NAME
DESIGNATION

DEPARTMENT OF <NAME>

St. MARTIN'S ENGINEERING COLLEGE


UGC Autonomous
Affiliated to JNTUH, Approved by AICTE,
Accredited by NBA & NAAC A+, ISO 9001:2008 Certified
Dhulapally, Secunderabad - 500 100
www.smec.ac.in

MARCH - 2024
St. MARTIN'S ENGINEERING COLLEGE
UGC Autonomous
NBA & NAAC A+ Accredited
Dhulapally, Secunderabad - 500 100

Certificate
This is to certify that the project entitled “<Title>” is being submitted by <NAME OF THE STUDENT> <ROLL NUMBER>, in fulfilment of the requirement for the award of the degree of BACHELOR OF TECHNOLOGY in <DEPARTMENT NAME>, and is a record of bonafide work carried out by them. The results embodied in this report have been verified and found satisfactory.

Signature of Guide Head of the Department


<Name> <HOD Name>
<Designation> <Designation>
<Department> <Department>

Internal Examiner External Examiner

Date:

Place:
St. MARTIN'S ENGINEERING COLLEGE
UGC Autonomous
NBA & NAAC A+ Accredited
Dhulapally, Secunderabad - 500 100

DEPARTMENT OF <<COMPUTER SCIENCE AND ENGINEERING>>

DECLARATION
We, the students of 'Bachelor of Technology in the Department of Computer Science and Engineering', session 2020 - 2024, St. Martin's Engineering College, Dhulapally, Kompally, Secunderabad, hereby declare that the work presented in this Project Work entitled <<Title of the Project>> is the outcome of our own bonafide work and is correct to the best of our knowledge, and that this work has been undertaken taking care of Engineering Ethics. The results embodied in this project report have not been submitted to any other university for the award of any degree.

<Name of the Student1> <Roll Number>


<Name of the Student2> <Roll Number>
<Name of the Student3> <Roll Number>
<Name of the Student4> <Roll Number>
ACKNOWLEDGEMENT

The satisfaction and euphoria that accompany the successful completion of any task would be incomplete without mention of the people who made it possible and whose encouragement and guidance have crowned our efforts with success.

First and foremost, we would like to express our deep sense of gratitude and
indebtedness to our College Management for their kind support and permission to use
the facilities available in the Institute.

We would especially like to express our deep sense of gratitude and indebtedness to Dr. P. SANTOSH KUMAR PATRA, Professor and Group Director, St. Martin's Engineering College, Dhulapally, for permitting us to undertake this project.

We wish to record our profound gratitude to Dr. M. SREENIVAS RAO, Principal, St. Martin's Engineering College, for his motivation and encouragement.

We are also thankful to <HOD NAME>, Head of the Department, Computer Science and Engineering, St. Martin's Engineering College, Dhulapally, Secunderabad, for his support and guidance throughout our project, as well as to Project Coordinator <COORDINATOR NAME>, Professor, Department of Computer Science and Engineering, for his valuable support.

We would like to express our sincere gratitude and indebtedness to our project supervisor <GUIDE NAME>, Professor, Department of Computer Science and Engineering, St. Martin's Engineering College, Dhulapally, for his support and guidance throughout our project.

Finally, we express our thanks to all those who have helped us in successfully completing this project. Furthermore, we would like to thank our family and friends for their moral support and encouragement.

<<Name of Student>> <<Roll No>>


<<Name of Student>> <<Roll No>>
<<Name of Student>> <<Roll No>>

ABSTRACT

This project concentrates on the implementation of a fully capable interactive system with which humans can interact. The design of this system is very similar to the fictional character “Wall-E”. We use a Raspberry Pi 3 B+ as the single-board computer, speakers for the output and a USB microphone for the input. The programming language used is Python, and the system can access the internet for various tasks.

Many home automation technologies have been introduced over the years, from Zigbee automation to Amazon Echo, Google Home and Apple's HomeKit, and they have become a craze these days. Google Home is priced around $150 (USD); with the additional cost of the devices to be connected, the total cost of the system reaches over $250 (USD). Apple HomeKit is even more expensive, over $100 (USD) more than Google Home for just a basic setup. Philips Hue, a smart light that can be controlled by Google Assistant, Amazon Echo and Siri, Apple's voice assistant, is priced around $145 (USD). Similarly, Belkin's Wemo light is priced around $44 (USD) per unit and can be controlled by both Siri and Google Assistant. Overall, we can see that to make a home smart we need to invest quite a lot, say around $250 (USD) for a basic setup. What if we could automate our house for a limited amount (the cost of the smartphone is not included, as it is assumed to be owned by every individual these days) and control up to 8 appliances using the Life Assistant? This report describes the implementation of such a system. The system is implemented using a simple Python script, and the communication between the human and the Life Assistant is established via Wi-Fi.

LIST OF FIGURES

Figure No. Figure Title Page No.

3.1 Use Case diagram for text to speech system 12

3.2 Sequence diagram for text to speech system 13

3.3 Text to speech conversion 15

4.1 Speech Recognition Process 20

4.2 Implementation of source code 21

5.1 Execution of command 25

LIST OF TABLES

Table No. Table Name Page No.

3.1 Use Case diagram for text to speech system 12

3.2 Sequence diagram for text to speech system 13

3.3 Text to speech conversion 15

4.1 Speech Recognition Process 20

4.2 Implementation of source code 21

5.1 Execution of command 25

LIST OF ACRONYMS AND DEFINITIONS

S.NO ACRONYM DEFINITION

01. STT Speech To Text

02. TTS Text To Speech

03. API Application Programming Interface

04. UML Unified Modeling Language

05. JS JavaScript

06. gTTS Google Text To Speech

07. HTML Hyper Text Markup Language

08. XML Extensible Markup Language

09. pyttsx3 Python Text To Speech

10. JSON JavaScript Object Notation

CONTENTS

ACKNOWLEDGEMENT i
ABSTRACT ii
LIST OF FIGURES iii
LIST OF TABLES iv
LIST OF ACRONYMS AND DEFINITIONS v
CHAPTER 1 INTRODUCTION 01
1.1 Objective 02
1.2 Overview 03
CHAPTER 2 LITERATURE SURVEY 07
CHAPTER 3 SYSTEM ANALYSIS AND DESIGN 13
3.1 Existing System 13
3.2 Proposed System 13

CHAPTER 4 SYSTEM REQUIREMENTS & SPECIFICATIONS 15

4.1 Database 15
4.2 CNN Algorithm 15
4.3 Design 20
4.3.1 System Architecture 20
4.3.2 Data Flow Diagram 21
4.3.3 UML Diagram 22
4.3.4 Use Case Diagram 23
4.3.5 Class Diagram 24
4.3.6 Sequence Diagram 25
4.3.7 Activity Diagram 26
4.4 Modules 27
4.4.1 Modules Description 27
4.5 System Requirements 28
4.5.1 Hardware Requirements 28
4.5.2 Software Requirements 28
4.6 Testing 29
4.6.1 Unit Testing 29
4.6.2 Integration Testing 29
4.6.3 Functional Testing 29
4.6.4 System Testing 30
4.6.5 White Box Testing 30
4.6.6 Black Box Testing 30
4.6.7 Unit Testing 31
4.6.8 Integration Testing 31
4.6.9 Acceptance Testing 31
CHAPTER 5 SOURCE CODE 32
CHAPTER 6 EXPERIMENTAL RESULTS 41
CHAPTER 7 CONCLUSION & FUTURE ENHANCEMENT 48
7.1 CONCLUSION 48
7.2 FUTURE ENHANCEMENT 48
REFERENCES 49
Patent/Publication
CHAPTER 1
INTRODUCTION

1.1 NECESSITY OF INTERACTIVE SYSTEM


An interactive system is built around a user-friendly interaction engine that uses cognitive technology. With systems quickly becoming more able to assist us, human-system interaction is the next big challenge that must be solved for such systems to successfully enter our society. A system not only needs to perform tasks for us but needs to do so in a way that makes sense to us. This requires systems with the social intelligence to understand us, natural interaction capabilities to talk with us, and the ability to adapt to us. Our interaction engine makes it possible to develop interactive scenarios for an application quickly, and interactive systems' solutions deliver optimal interaction with people.

1.2 TYPES OF INTERACTIVE SYSTEMS


Task Type
When discussing human-system interaction, the task to be accomplished sets the tone for the system's design and use, so the task must be identified as part of the system's classification. The task should be specified at a high level: for example, the TASK classification could be urban search and rescue, a walking aid for the blind, a toy, or a delivery robot. The task type also allows the system's environment to be represented implicitly.

Task Criticality

Task criticality measures the importance of getting the task done correctly in terms of the negative effects should problems occur. Criticality is a highly subjective measure; to counteract this problem, we define a critical task as one where a failure affects the life of a human. For example, the failure of a robotic wheelchair to recognize a down staircase could severely injure or kill its user, whereas the failure of a Furby to act properly threatens no one. A hospital delivery robot does have some criticality in its task, since failure to bring a critical sample to the lab in time could be harmful. However, food delivery is a much less critical task, since a late delivery is unlikely to seriously harm a person. Due to its subjective nature, CRITICALITY is broken into three categories: high, medium and low. Urban search and rescue has CRITICALITY = high; it is dangerous for its user to be near the disaster situation and
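To make the three-way classification above concrete, a small sketch is given below; the enum and the mapping are merely one possible way to encode the examples from the text, not part of the report's design.

    # Illustrative encoding of the CRITICALITY categories described above
    from enum import Enum

    class Criticality(Enum):
        HIGH = "high"      # failure can cost a human life
        MEDIUM = "medium"  # failure can cause harm but is usually recoverable
        LOW = "low"        # failure threatens no one

    # Example tasks from the text mapped to their criticality
    TASK_CRITICALITY = {
        "urban search and rescue": Criticality.HIGH,
        "robotic wheelchair": Criticality.HIGH,
        "hospital sample delivery": Criticality.MEDIUM,
        "food delivery": Criticality.LOW,
        "toy (Furby)": Criticality.LOW,
    }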
