
ABSTRACT

A fire incident is a disaster that can cause loss of life, property damage, and permanent disability to the
affected victims. Firefighting is a momentous and perilous job: a fire has to be extinguished rapidly and safely by a
firefighter to prevent further damage and destruction. Fire detection and extinguishment are hazardous tasks that
invariably put the life of a firefighter in danger, and one of the most efficient tools for extinguishing a fire early is a
firefighting robot. In most industries, fire sensing is essential to prevent heavy losses, and robots with this
type of embedded system can save the lives of engineers at industrial sites with dangerous conditions. It is therefore
desirable to design a robot that can detect a fire and extinguish it as quickly as possible. In this work, a novel
firefighting robot based on Arduino is presented. The robot is designed and built around an embedded system: the
fire detection subsystem of the Fire Sensing and Extinguishing Robot uses two flame sensors, and the fire detection
and extinguishing procedure is programmed using a sensor-based method. The robot's main components are an
Arduino UNO, a flame sensor, a water dispenser, and an ultrasonic sensor. The robot can be moved in the forward,
backward, left, and right directions using an Android-based application; it stops on detection of an obstacle and
starts again once the path is clear.

Keywords: Fire Fighting Robot · Deep Learning · FFR

TABLE OF CONTENTS

CHAPTER NO. TITLE PAGE NO.

ABSTRACT v

LIST OF FIGURES ix

1 INTRODUCTION 1

1.1 Overview 1

1.2 Objective 1

1.3 Summary 2

2 LITERATURE SURVEY 3

2.1 Introduction of literature survey 3

2.2 Machine Learning 4

2.3 Data Forecasting 4

2.4 Industrial Algorithm 6

2.5 Systematic Review 7

2.6 Summary 7

3 RESEARCH METHODOLOGY 8

3.1 Existing System 8

3.2 Architecture Diagram 9

3.3 Summary 10

4 SYSTEM SPECIFICATION 11

4.1 Hardware Requirement 11

4.2 Software Requirement 11

4.3 Software Environment 12

4.4 Summary 16

5 PROJECT DESCRIPTION 17

5.1 Overview of the project 17

5.2 Module Description 17

5.2.1 Arduino Board 17

5.2.2 Automatic Function 17

5.2.3 Power Consumption 18

5.2.4 Sensor 18

5.2.5 Ultrasonic 19

5.3 Recommendation 20

5.4 Summary 22

6 SYSTEM TESTING 23

6.1 Unit Testing 24

6.2 Functional Testing 25

6.3 Integration Testing 25

6.4 Summary 26

7 RESULT AND DISCUSSION 27

7.1 Support vector Machine Algorithm 27

7.2 Classifier 28

7.3 Summary 29

8 ALGORITHM 30

8.1 Source Code 30

8.2 Web application 37

8.3 Style code 43

8.4 Screenshots 71

8.5 Battery 72

8.6 Summary 73

9 CONCLUSION 74

9.1 Conclusion 74

9.2 Future Enhancement 74

9.3 Summary 75

10 REFERENCE 76

LIST OF FIGURES

S.NO FIGURES PAGE NO.

3.1 Methodology Flow Chart 8

3.2 System Architecture 9

3.3 Autonomous Fire Protection 10

4.1 Block diagram 14

4.2 Flow chart 16

7.1 Automatic Dispenser Specification 28

7.2 Parallel Connection with Arduino 29

8.1 IR Sensors 71

8.2 Ultrasonic 71

CHAPTER 1

INTRODUCTION

1.1 OVERVIEW

The Firefighting Robot is a compact and portable emergency responder robot that assists firefighters in
fighting high-rise fires, especially in highly dangerous environments where it is not safe for people to enter.
It is equipped with a thermal imager that can detect hot zones in a room autonomously without being
impeded by smoke, and the scenes captured by the robot's camera are transmitted live to the operator's control
unit, allowing firefighters to assess the fire scene from a safe distance while remotely guiding the robot. With
the flexibility to discharge foam from its 9-litre on-board foam solution tank or water through its water
monitor, the Firefighting Robot can put out small yet potentially lethal fires in a confined space. It thereby
reduces risk exposure, alleviates cognitive load, and gives firefighters more bandwidth during an
emergency to conduct highly demanding tasks such as casualty rescue.

1.2 OBJECTIVES

The primary objective of a firefighting robot is to extinguish fires safely and effectively,
while reducing the risk of injury and death to firefighters. Firefighting robots can also be used to search for
and rescue victims, clear debris, and monitor fire conditions.

• Navigate through hazardous environments that are too dangerous for human firefighters to enter.

• Detect and locate fires quickly and accurately.

• Select and apply the appropriate firefighting agent to extinguish the fire.

• Operate autonomously or under remote human control.

• Communicate with human firefighters to provide updates on the fire situation and to receive
instructions.

Firefighting robots can also reduce the cost of firefighting by automating tasks that are currently performed
by human firefighters, improve the efficiency of firefighting operations by providing firefighters with
real-time data and analysis, and minimize the environmental impact of firefighting by using less water and
other resources. Firefighting robots are still under development, but they have the potential to revolutionize
the way that fires are fought: by automating dangerous and repetitive tasks, they can help to keep firefighters
safe and improve the overall effectiveness of firefighting operations.

Protecting firefighters: Firefighting robots can help to protect firefighters by performing tasks in dangerous
or hazardous environments. This can include tasks such as entering burning buildings to search for victims,
extinguishing fires in remote or inaccessible areas, and clearing debris from fire scenes.

Extinguishing fires: Firefighting robots can be equipped with a variety of tools and technologies to
extinguish fires, including water cannons, foam cannons, and dry powder cannons. They can also be used to
deliver extinguishing agents to fires that are difficult or dangerous for human firefighters to reach.

Supporting rescue operations: Firefighting robots can be used to support rescue operations in a
variety of ways. For example, they can be used to search for victims in collapsed buildings, clear debris from
rescue paths, and provide medical assistance to victims.

In addition to these general objectives, firefighting robots may also have specific objectives depending
on their intended use. For example, a firefighting robot designed for use in industrial facilities may have the
objective of preventing fires from spreading to other areas of the facility. A firefighting robot designed for
use in wildfires may have the objective of creating firebreaks to slow the spread of the fire.

1.3 SUMMARY

The Firefighting Robot is a compact and portable emergency responder robot that
assists firefighters in fighting high-rise fires, especially in highly dangerous environments where it is not
safe for people to enter. The machine learning behind the AI robot grew out of pattern recognition and
the theory that computers can learn to perform specific tasks without being explicitly programmed.
Using a dispenser is quite easy: place the glass under the tap and turn the water tap, and the water will
come out of the dispenser.

CHAPTER 2

LITERATURE REVIEW

2.1 INTRODUCTION TO LITERATURE REVIEW

Firefighting robots are an emerging technology that has the potential to revolutionize the way fires are
fought. These robots can perform a variety of tasks, including extinguishing fires, searching for
victims, rescuing victims, clearing debris, and monitoring fire conditions. Firefighting robots offer a
number of advantages over traditional firefighting methods, including increased safety, increased
effectiveness, and reduced costs. However, they also have some disadvantages, such as high cost,
limited capabilities, and safety concerns.

One of the most promising applications of firefighting robots is in the area of firefighting safety.
Firefighting robots can reduce the risk of injury and death to firefighters by performing dangerous
tasks, such as entering burning buildings or extinguishing fires in hazardous areas. For example,
firefighting robots can be used to extinguish fires in chemical plants or refineries, where the risk of
explosion or toxic gas exposure is high.

Firefighting robots can also be used to increase the effectiveness of firefighting operations. For
example, firefighting robots can be equipped with cameras and sensors that can be used to search for
victims in burning buildings or to identify the source of a fire. This information can help firefighters
to develop a more effective firefighting plan.

In addition, firefighting robots can be used to reduce the costs associated with firefighting. For
example, firefighting robots can be used to automate tasks, such as extinguishing fires or clearing
debris. This can free up firefighters to focus on more important tasks, such as rescuing victims.

2.2 MACHINE LEARNING
Machine learning algorithms can be trained to detect fires with greater accuracy and speed than
traditional fire detection methods. This is important because it allows firefighting robots to respond to
fires more quickly and effectively.

The firefighting robot QRob uses machine learning to detect fires with greater accuracy and speed
than traditional fire detection methods. QRob is equipped with a thermal camera and a machine
learning algorithm that is trained to detect the presence of flames. Once QRob detects a fire, it can
send an alert to firefighters and autonomously navigate to the fire to extinguish it.

The FFr firefighting robot uses machine learning to navigate through complex and hazardous environments,
such as burning buildings. It is equipped with a variety of sensors, including ultrasonic sensors, lidar
sensors, and cameras. FFr uses machine learning to fuse the data from these sensors into a map of its
surroundings, and then uses this map to navigate to the fire and avoid obstacles. A minimal, hedged sketch
of the flame-detection idea follows.
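As a hedged, minimal sketch of the machine-learning flame-detection idea described above (not QRob's
actual algorithm), the following trains a logistic-regression classifier on two hypothetical thermal-frame
statistics, the mean and maximum pixel temperatures:

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical thermal-frame features: [mean temperature C, max temperature C]
X = np.array([[22, 30], [25, 35], [40, 320], [55, 600]])
y = np.array([0, 0, 1, 1])  # 1 = flame present, 0 = no flame

model = LogisticRegression().fit(X, y)
print(model.predict([[35, 450]]))  # expected: [1], i.e. flame

In practice the features, the training data, and the choice of classifier would come from the robot's
thermal camera and field trials.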

2.3 DATA FORECASTING

A quick review of machine learning algorithms; the author is Ms. Susmita Ray, published in February 2019.
In machine learning, a computer program is assigned to perform some tasks, and it is said that the machine
has learnt from its experience if its measurable performance in these tasks improves as it gains more and
more experience in executing them. The machine thus takes decisions and makes predictions and forecasts
based on data.

Take the example of a computer program that learns to detect and predict cancer from the medical
investigation reports of a patient. It will improve in performance as it gathers more experience by
analyzing medical investigation reports of a wider population of patients. Its performance will be
measured by the count of correct predictions and detections of cancer cases as validated by an
experienced oncologist.

Machine learning is applied in a wide variety of fields, namely robotics, virtual personal
assistants (like Google), computer games, pattern recognition, natural language processing, data
mining, traffic prediction, online transportation networks (e.g. estimating surge prices in peak hours),
product recommendation, share market prediction, medical diagnosis, online fraud prediction,
agriculture advisory, search engine result refining (e.g. the Google search engine), and IoT (chatbots for
online customer support).

It is even used in e-mail spam filtering, crime prediction through video surveillance systems, and social
media services (face recognition in Facebook). Machine learning generally deals with three types of
problems, namely classification, regression, and clustering. Depending on the availability of types and
categories of training data, one may need to select from the available techniques of supervised learning,
unsupervised learning, semi-supervised learning, and reinforcement learning to apply the appropriate
machine learning algorithm. In this paper the author does a brief review of the machine learning
algorithms that are most frequently used and therefore the most popular ones. The author highlights the
merits and demerits of these algorithms from an application perspective, to aid informed decision making
when selecting the appropriate learning algorithm for the specific requirements of the application.

The data collected in this study was obtained through observation techniques. Observation was used to
determine the design and the level of accuracy before planning the manufacture of the tool. The tool is
designed based on observations of dispensers already in circulation, with an automatic system added to
help visually impaired people, adjusting for ergonomic values and the needs of the visually impaired.
Using a dispenser is quite easy: place the glass under the tap and turn the water tap, and the water will
come out of the dispenser. However, using a dispenser can be troublesome for the blind because they
have to touch the glass and place it in the right position.

2.4 INDUSTRIAL ALGORITHM

Improving the accuracy of machine learning; the author is Mr. Jonathan G, published on 11 August 2020.
This raises the question of why existing approaches struggle, across all existing industrial algorithms,
including Bayesian model-based and deep learning approaches. Machine learning uses a series of steps
to identify, train, and test computer algorithms to identify a feature of interest. Some of these techniques
have been applied to classify brain magnetic resonance imaging or computed tomography with machine
controls, and to distinguish different types or stages of cerebrovascular and accelerated features of aging.
However, the recent rapid increase in publications using different machine learning techniques in different
populations, types of images, and criteria makes it difficult to obtain an objective view of the current
accuracy of machine learning. Enhancing a model's performance can be challenging at times; I'm sure a
lot of you would agree with me if you've found yourself stuck in a similar situation.

You try all the strategies and algorithms that you've learned, yet you fail at improving the accuracy
of your model. You feel helpless and stuck, and this is where 90% of data scientists give up. But
this is where the real story begins! This is what differentiates an average data scientist from a master
data scientist. Do you also dream of becoming a master data scientist? If yes, you need these 8 proven
ways to re-structure your model approach. A predictive model can be built in many ways; there is no
'must-follow' rule. But if you follow these ways (shared below), you'd surely achieve high accuracy in
your models (given that the data provided is sufficient to make predictions). I've learnt these methods
with experience, and I've always preferred learning practically over digging through theory. In this
article, I've shared 8 proven ways to create a robust machine learning model, in the hope that this
knowledge can help people achieve great heights in their careers. The main pitfall is that it allows one
to ascribe causal explanations to data.

In the last decade, improvements in medical imaging, the exponential increase in the computational power
of affordable computing platforms, and the greater availability of imaging datasets, for example, have
increased opportunities to develop machine learning approaches to automate the detection, classification,
and quantification of IoT devices. Machine learning uses a series of steps to identify, train, and test
computer algorithms to identify a feature of interest. Some of these techniques have been applied to
classify brain magnetic resonance imaging or computed tomography scans, comparing patients with
healthy controls, and to distinguish different types or stages of cerebrovascular disease.

2.5 SYSTEMATIC REVIEW

Machine learning of assisted impairment: a systematic review; the author is Enrico Pellegrini, published in
2016. In the last decade, improvements in medical imaging, the exponential increase in the computational
power of affordable computing platforms, and the greater availability of imaging datasets, for example,
have increased opportunities to develop machine learning approaches to automate the detection,
classification, and quantification of IoT devices. Machine learning uses a series of steps to identify, train,
and test computer algorithms to identify a feature of interest. Some of these techniques have been applied
to classify brain magnetic resonance imaging or computed tomography scans, comparing patients with
healthy controls, and to distinguish different types or stages of cerebrovascular disease and accelerated
features of aging. However, the recent rapid increase in publications using different machine learning
techniques in different populations, types of images, and criteria makes it difficult to obtain an objective
view of the current accuracy of machine learning. We undertook this systematic review to critically
appraise the accuracy of machine learning in differentiating healthy aging from MCI and in predicting
future risk. The main pitfall is that generating sufficient ground truth for a reliable validation is difficult.

2.6 SUMMARY

The firefighting robot project was designed so that the robot can be operated using a smoke detection
sensor and a temperature sensor that detect temperature and smoke. When a fire burns, it emits a small
amount of infrared light, and this light is received by the IR receiver on the sensor module.


CHAPTER 3

RESEARCH METHODOLOGY

3.1 EXISTING SYSTEM

There are many sections in the development of an autonomous robot's design, and these sections can be
simplified using a methodical approach.

Figure 3.1 shows the block diagram for this robot and how the autonomous firefighting robot is
implemented. The sensors act as inputs, and these inputs are handled by the program in the PIC, helped
by the SK40C board and an additional circuit. From these inputs, the main board sets the references and
uses the data to drive the outputs, where the outputs are the brushless motor, the actuator, and a message
notification from the GSM modem.


Fig 3.1 Methodology Flow Chart

3.2 ARCHITECTURE DIAGRAM
Software architecture involves the high-level structure of software system abstraction, by
using decomposition and composition, with architectural style and quality attributes. A software
architecture design must conform to the major functionality and performance requirements of the
system, as well as satisfy the non-functional requirements such as reliability, scalability, portability,
and availability. Software architecture must describe its group of components, their connections,
interactions among them and deployment configuration of all components.

Fig 3.2 System Architecture

[Flowchart: after MCU initialization (I/O pins initialized per the hardware design, DC motor PWM enabled,
ADC initialized, serial communication at a 9600 bps baud rate, servo motor driver interrupt (PWM)
initialized), the robot runs a while(1) loop: if fire is detected at the center sensor, it injects liquid and sends
an SMS to the user; once the fire is extinguished, it stops the liquid; otherwise it navigates.]

Fig 3.3 Autonomous Fire Protection Robot with Notification

3.3 SUMMARY
DC motors generally convert electrical energy into mechanical energy. To quench the fire, a pump is
used to spray water onto it, and a simple motor drives the pump. The pumping motor in the
extinguishing system controls the flow of water coming out of the pump.

CHAPTER 4

SYSTEM SPECIFICATION

4.1 HARDWARE REQUIREMENTS

• Microcontroller : Arduino Uno
• Crystal : 16 MHz
• LCD : 16×2 LCD
• H-Bridge : L293D
• Bluetooth Module : HC-05
• Fire Sensor : IR type, DC 5 V
• Motor : DC 5 V servo motor
• Robot Drive Motor : DC gear motor

4.2 SOFTWARE REQUIREMENTS

• Operating System : Windows OS
• Language : Python (AI & IoT)
• IDE : PyCharm
• Back End : MySQL

4.3 SOFTWARE ENVIRONMENT

Front End: PYTHON


Python is an interpreted high-level programming language for general-purpose programming.
Created by Guido van Rossum and first released in 1991, Python has a design philosophy that
emphasizes code readability, notably using significant whitespace. It provides constructs that enable
clear programming on both small and large scales. In July 2018, Van Rossum stepped down as the
leader in the language community.

Python features a dynamic type system and automatic memory management. It supports
multiple programming paradigms, including object-oriented, imperative, functional, and procedural,
and has a large and comprehensive standard library. Python interpreters are available for many
operating systems. CPython, the reference implementation of Python, is open-source software and
has a community-based development model, as do nearly all of Python's other implementations.
Python and CPython are managed by the non-profit Python Software Foundation. Rather than having
all its functionality built into its core, Python was designed to be highly extensible. This compact
modularity has made it particularly popular as a means of adding programmable interfaces to existing
applications. Van Rossum's vision of a small core language with a large standard library and easily
extensible interpreter stemmed from his frustrations with ABC, which espoused the opposite approach.

While offering choice in coding methodology, the Python philosophy rejects exuberant syntax (such as
that of Perl) in favor of a simpler, less-cluttered grammar. As Alex Martelli put it: "To describe something
as 'clever' is not considered a compliment in the Python culture." Python's philosophy rejects the Perl
"there is more than one way to do it" approach to language design in favor of "there should be one—and
preferably only one—obvious way to do it".

Python's developers strive to avoid premature optimization and reject patches to non-critical
parts of CPython that would offer marginal increases in speed at the cost of clarity. When speed is
important, a Python programmer can move time-critical functions to extension modules written in
languages such as C, or use PyPy.

Python places an emphasis on readability; by contrast, code that is difficult to understand or reads like a
rough transcription from another programming language is called unpythonic. Users and admirers of
Python, especially those considered knowledgeable or experienced, are often referred to as Pythonistas.
Python is an interpreted, object-oriented, high-level programming language with dynamic semantics. Its
high-level built-in data structures, combined with dynamic typing and dynamic binding, make it very
attractive for Rapid Application Development, as well as for use as a scripting or glue language to connect
existing components together. Python's simple, easy-to-learn syntax emphasizes readability and therefore
reduces the cost of program maintenance. Python supports modules and packages, which encourages
program modularity and code reuse. The Python interpreter and the extensive standard library are available
in source or binary form without charge for all major platforms and can be freely distributed.

Often, programmers fall in love with Python because of the increased productivity it
provides. Since there is no compilation step, the edit-test-debug cycle is incredibly fast. Debugging
Python programs is easy: a bug or bad input will never cause a segmentation fault. Instead, when the
interpreter discovers an error, it raises an exception. When the program doesn't catch the exception,
the interpreter prints a stack trace. A source level debugger allows inspection of local and global
variables, evaluation of arbitrary expressions, setting breakpoints, stepping through the code a line at a
time, and so on. The debugger is written in Python itself, testifying to Python's introspective power.
On the other hand, often the quickest way to debug a program is to add a few print statements to the
source: the fast edit-test-debug cycle makes this simple approach very effective. For instance, bad input
raises an exception rather than crashing the interpreter, as the minimal illustration below shows.
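def divide(a, b):
    return a / b

try:
    divide(1, 0)
except ZeroDivisionError as exc:
    print('caught:', exc)  # the interpreter raises an exception instead of crashing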

Python’s initial development was spearheaded by Guido van Rossum in the late 1980s. Today,
it is developed by the Python Software Foundation. Because Python is a multiparadigm language,
Python programmers can accomplish their tasks using different styles of programming: object
oriented, imperative, functional, or reflective. Python can be used in Web development, numeric
programming, game development, serial port access and more.

PyPy is a just-in-time compiler; Cython is also available, which translates a Python script into C and
makes direct C-level API calls into the Python interpreter. An important goal of Python's developers is
keeping it fun to use. This is reflected in the language's name, a tribute to the British comedy group
Monty Python, and in occasionally playful approaches to tutorials and reference materials, such as
examples that refer to spam and eggs (from a famous Monty Python sketch) instead of the standard foo
and bar. A common neologism in the Python community is pythonic, which can have a wide range of
meanings related to program style. To say that code is pythonic is to say that it uses Python idioms well
and is natural or fluent in the language.

There are two attributes that make development time in Python faster than in other programming
languages:

1. Python is an interpreted language, which precludes the need to compile code before executing
a program because Python does the compilation in the background. Because Python is a high-level
programming language, it abstracts many sophisticated details from the programming code. Python
focuses so much on this abstraction that its code can be understood by most novice programmers.

2. Python code tends to be shorter than comparable code in other languages. Although Python offers
fast development times, it lags slightly in terms of execution time. Compared to fully compiled
languages like C and C++, Python programs execute more slowly. Of course, with the processing
speeds of computers these days, the speed differences are usually only observed in benchmarking
tests, not in real-world operations. In most cases, Python is already included in Linux distributions
and Mac OS X machines.

Fig 4.1 Block Diagram

Back End: MY SQL

MySQL is the world's most used open-source relational database management system
(RDBMS) as of 2008. It runs as a server providing multi-user access to a number of databases.
The MySQL development project has made its source code available under the terms of the GNU
General Public License, as well as under a variety of proprietary agreements. MySQL was owned
and sponsored by a single for-profit firm, the Swedish company MySQL AB, which is now owned by
Oracle Corporation.

Interfaces:

MySQL is primarily an RDBMS and ships with no GUI tools to administer MySQL
databases or manage data contained within the databases. Users may use the included command
line tools, or use MySQL "front-ends": desktop software and web applications that create and
manage MySQL databases, build database structures, back up data, inspect status, and work with
data records. The official set of MySQL front-end tools, MySQL Workbench, is actively
developed by Oracle and is freely available for use.

Graphical:

The official MySQL Workbench is a free integrated environment developed by MySQL
AB that enables users to graphically administer MySQL databases and visually design database
structures. MySQL Workbench replaces the previous package of software, MySQL GUI Tools.
Like other third-party packages, but still considered the authoritative MySQL front end, MySQL
Workbench lets users manage database design & modeling and SQL development (replacing MySQL
Query Browser). A minimal back-end sketch follows.
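Since this project pairs Python with MySQL, a minimal, hedged sketch of the back end might log
fire-detection events as below; the database name, table, and credentials are hypothetical, and the
mysql-connector-python package is assumed to be installed:

import mysql.connector  # from the mysql-connector-python package

conn = mysql.connector.connect(
    host='localhost', user='robot', password='secret', database='ffr')  # hypothetical credentials
cur = conn.cursor()
cur.execute("""CREATE TABLE IF NOT EXISTS fire_events (
    id INT AUTO_INCREMENT PRIMARY KEY,
    detected_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    distance_cm FLOAT)""")
cur.execute("INSERT INTO fire_events (distance_cm) VALUES (%s)", (42.5,))
conn.commit()
conn.close()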

Figure 4.2 Flow Chart

4.4 SUMMARY

Python code tends to be shorter than comparable code in other languages. Although Python offers fast
development times, it lags slightly in terms of execution time: compared to fully compiled
languages like C and C++, Python programs execute more slowly, although with the processing speeds
of today's computers the difference rarely matters in practice.

CHAPTER 5

PROJECT DESCRIPTION

5.1 OVERVIEW OF THE PROJECT


The project entitled "INDUSTRIAL CART AUTOMATION USING IOT AND AI" is
developed using Python as the front end and MySQL as the back end. The modules of the project are:

• Arduino Board
• Automatic Function
• Power Consumption
• Sensor
• Ultrasonic

5.2 MODULE DESCRIPTION

5.2.1 ARDUINO BOARD

In this module, the admin and user logins are created, and the admin can upload the related datasets.
A data set (or dataset, although this spelling is not present in many contemporary dictionaries like
Merriam-Webster) is a collection of data. Most commonly a data set corresponds to the contents of a
single database table, or a single statistical data matrix, where every column of the table represents a
particular variable, and each row corresponds to a given member of the data set in question. The data set
lists values for each of the variables, such as the height and weight of an object, for each member of the
data set. Each value is known as a datum. The data set may comprise data for one or more members,
corresponding to the number of rows.

5.2.2 AUTOMATIC FUNCTION

Data pre-processing is an important step in the data mining process. The phrase "garbage in,
garbage out" is particularly applicable to data mining and machine learning projects. Data-gathering
methods are often loosely controlled, resulting in out-of-range values, impossible data combinations,
missing values, etc. Analyzing data that has not been carefully screened for such problems can produce
misleading results. Thus, the representation and quality of data come first and foremost before running
an analysis. If there is much irrelevant and redundant information present, or noisy and unreliable data,
then knowledge discovery during the training phase is more difficult. Data preparation and filtering steps
can take a considerable amount of processing time. In this module, we eliminate the irrelevant values and
estimate the missing values of the data, and finally provide a structured dataset, as sketched below.
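A minimal, hedged sketch of this pre-processing step using pandas; the file and column names are
hypothetical:

import pandas as pd

df = pd.read_csv('sensor_log.csv')                       # hypothetical raw dataset
df = df.drop(columns=['device_notes'])                   # drop an irrelevant column
df = df[df['temp_c'].between(-40, 125)]                  # remove out-of-range values
df['temp_c'] = df['temp_c'].fillna(df['temp_c'].mean())  # estimate missing values
df.to_csv('sensor_log_clean.csv', index=False)           # write the structured dataset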

5.2.3 POWER CONSUMPTION

Feature selection refers to the process of reducing the inputs for processing and analysis, or of
finding the most meaningful inputs. A related term, feature engineering (or feature extraction), refers
to the process of extracting useful information or features from existing data. Filter feature selection
methods apply a statistical measure to assign a score to each feature; the features are ranked by the
score and either kept or removed from the dataset. The methods are often univariate and consider each
feature independently, or with regard to the dependent variable. In this module, feature selection is used
to select multiple features from the uploaded datasets, as the sketch below illustrates.
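As a hedged illustration of the filter method described above (not necessarily the exact procedure used in
this project), SelectKBest from scikit-learn scores each feature with a univariate statistic and keeps the
top-ranked ones:

from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif

X, y = make_classification(n_samples=100, n_features=8, random_state=0)
selector = SelectKBest(score_func=f_classif, k=3)
X_selected = selector.fit_transform(X, y)
print(selector.scores_)   # per-feature scores
print(X_selected.shape)   # (100, 3): only the 3 best-scoring features are kept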

5.2.4 SENSOR

IR technology is used in a wide range of wireless applications, including remote controls and sensing.
The infrared part of the electromagnetic spectrum can be separated into three main regions: near IR,
mid IR, and far IR, whose wavelengths vary based on the application. For the near IR region the
wavelength ranges from 700 nm to 1400 nm, for the mid IR region from 1400 nm to 3000 nm, and for
the far IR region from 3000 nm to 1 mm.

The IR sensor, or infrared sensor, is an electronic component used to detect specific characteristics in
its surroundings by emitting or detecting IR radiation. These sensors can also be used to detect or
measure the heat of a target and its motion. In many electronic devices, the IR sensor circuit is a very
essential module; this kind of sensor is similar to a human's visual sense in detecting obstacles. A sensor
which simply measures IR radiation instead of emitting it is called a PIR, or passive infrared, sensor.
Generally, the thermal radiation that targets emit in the IR spectrum is not visible to the eye but can be
sensed through IR sensors.

Working principle: an infrared sensor includes two parts, the emitter and the receiver (transmitter and
receiver), so it is jointly called an optocoupler or photo-coupler. Here, an IR LED is used as the emitter
whereas an IR photodiode is used as the receiver. The incidence can be direct or indirect; in the indirect
type, the infrared LED is arranged ahead of a photodiode so that the receiver picks up radiation reflected
from the target. A minimal polling sketch follows.
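A minimal, hedged sketch of polling a digital IR flame-sensor module through pyFirmata, consistent with
the Arduino setup in Chapter 8; the pin number and serial port are assumptions:

import time
import pyfirmata

board = pyfirmata.Arduino('/dev/ttyACM0')  # adjust the port for your OS
it = pyfirmata.util.Iterator(board)        # keeps input pin values updated
it.start()

flame = board.get_pin('d:2:i')  # hypothetical digital pin for the flame sensor

while True:
    if flame.read() == 0:  # many flame modules pull the output LOW on detection
        print('Flame detected!')
    time.sleep(0.1)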

5.2.5 ULTRASONIC

Ultrasonic sensors are electronic devices that calculate a target's distance by emitting ultrasonic sound
waves and converting the reflected waves into electrical signals; emitted ultrasonic waves travel faster
than audible sound. There are two essential elements, the transmitter and the receiver. Using piezoelectric
crystals, the transmitter generates sound, which travels to the target and returns to the receiver. The
distance is

D = 1/2 × T × C

where T is the time measured in seconds and C is the speed of sound, 343 m/s. The working principle is
similar to sonar or radar, which evaluate a target's attributes from the received echoes of sound or radio
waves respectively. These sensors produce high-frequency sound waves and analyze the received echo,
measuring the time interval between transmitted and received echoes to determine the distance to the
target (see the worked example below).

Specifications: knowing the specifications of an ultrasonic sensor helps in understanding the reliable
approximations of distance measurements.
• The sensing range lies between 40 cm and 300 cm.
• The response time is between 50 ms and 200 ms.
• The beam angle is around 50 degrees.

This section explains the interfacing of the ultrasonic sensor (HC-SR04) with an Arduino, covering the
sensor pinout, its specifications, the wiring diagram, and how the sensor connects to the Arduino.
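A small worked example of the D = 1/2 × T × C relation above, in pure Python with no hardware assumed:

SPEED_OF_SOUND = 343.0  # m/s at about 20 °C

def echo_to_distance_cm(echo_time_s):
    # Sound travels to the target and back, so halve the round-trip time
    return (echo_time_s * SPEED_OF_SOUND / 2) * 100

print(echo_to_distance_cm(0.00087))  # about 14.9 cm, just inside the 15 cm obstacle threshold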

The ultrasonic sensor pins are: VCC, which has to be connected to a +5 V power supply; TRIG, the
triggering input pin, which receives the control signal from the Arduino board; ECHO, the output pin,
which sends the echo signal back to the Arduino, where the pulse duration is used to compute the
distance; and GND, which has to be connected to ground. In the distance-measurement block diagram,
the target's distance is measured by the ultrasonic sensor, the sensor output goes to a signal-conditioning
section, and it is then processed by the Arduino microcontroller. The results from the microcontroller are
fed to the LCD display and then passed to a personal computer.

The ultrasonic sensor can be mounted on the servo motor to sweep the polar distance around the sensor
through approximately 180° of rotation. In general, an ultrasonic sensor has two sections, the transmitter
and the receiver. These sections are closely placed so that the sound travels in a straight line from the
transmitter to the target and back to the receiver; keeping the distance between the transmitter and
receiver sections minimal keeps the calculation errors minimal. These devices are also termed ultrasonic
transceivers because both the transmitter and receiver sections are combined in a single unit.
5.3 RECOMMENDATION
Independent cart technology is a new approach to linear motors. Traditional conveyors rely
on gears, chains, and belts; independent cart technology uses magnets to precisely control motion
with frictionless propulsion. The result is fewer parts to worry about, reduced energy consumption,
and the ability to quickly start and stop loads without losing control or putting wear on parts.
It also delivers more flexible design and quick changeover: customers want more options and customized
products, so machines need to be flexible and easy to change over. Fortunately, we've got that covered.
Curved or straight sections of track can be combined in endless combinations to suit your design and
space requirements.

And once it’s built, changeovers can be as simple as selecting a different software-
configured move profile. Reduce bottlenecks and increase output: Each cart in the system is its
own intelligently controlled axis of motion. That means that the carts can speed up or slow down
based on where other carts are in the system. The system also tracks what each cart is moving, so
there’s no need to slow down for sortation. The result? Fewer bottlenecks and increased output.
Whether you're moving material within a single machine or from one end of your facility to the
other, independent cart technology can help you do it better. All our motors and carts are IP65-
rated or higher and modular which make them a great choice for a wide range of applications.
industrial automation is pervading most industries these days. It wouldn’t be surprising if much of
the industrial intelligentsia have already begun looking into the prospects of precision agriculture,
smart manufacturing, or digital medicine.

These industries, including automotive, aren't novices to automation technologies such as Artificial
Intelligence (AI) or machine learning. The recent deterring speech by automotive titan Tesla's CEO Elon
Musk on the use of level-5 AI and robotics in automobiles made one thing clear: the stellar leaps in
current automation and robotics are indeed causing shockwaves and beaming our industries into the
future, which may usher in a new industrial revolution.

Of course, any claim of Skynet metamorphosing into reality or robots taking over all our jobs can be
considered hyperbole at this point, although robots already build power trains and other components in
some companies, save for a few simpler parts that are made by human workers.

Raspberry Pi Model A / Model B / Model B+ comparison:

Target price: Model A US$25; Models B and B+ US$35
SoC: Broadcom BCM2835 (CPU, GPU, DSP, SDRAM, and single USB port)
CPU: 700 MHz ARM1176JZF-S core (ARM11 family, ARMv6 instruction set)
GPU: Broadcom VideoCore IV @ 250 MHz; OpenGL ES 2.0 (24 GFLOPS); MPEG-2 and VC-1 (with
license); 1080p30 H.264/MPEG-4 AVC high-profile decoder and encoder
Memory (SDRAM): Model A 256 MB (shared with GPU); Model B 512 MB (shared with GPU) as of
15 October 2012
USB 2.0 ports: Model A 1 (direct from the BCM2835 chip); Model B 2 (via the on-board 3-port USB
hub); Model B+ 4 (via the on-board 5-port USB hub)
Video input: 15-pin MIPI camera interface (CSI) connector, used with the Raspberry Pi Camera add-on
Video outputs: Composite RCA (PAL and NTSC), in Model B+ via a 4-pole 3.5 mm jack; HDMI (rev
1.3 & 1.4); raw LCD panels via DSI; 14 HDMI resolutions from 640×350 to 1920×1200 plus various
PAL and NTSC standards
Audio outputs: 3.5 mm jack; HDMI; and, as of revision 2 boards, I²S audio (also potentially for audio
input)
Onboard storage: SD / MMC / SDIO card slot (3.3 V card power support only); Model B+ MicroSD
Onboard network: Model A none; Models B and B+ 10/100 Mbit/s Ethernet (8P8C) via a USB adapter
on the third/fifth port of the USB hub
Low-level peripherals: 8× GPIO, UART, I²C bus, SPI bus with two chip selects, I²S audio, +3.3 V,
+5 V, ground; Model B+ 17× GPIO
Power ratings: Model A 300 mA (1.5 W); Model B 700 mA (3.5 W); Model B+ 600 mA (3.0 W)
Power source: 5 V via MicroUSB or GPIO header
Size: 85.60 mm × 56 mm (3.370 in × 2.205 in), not including protruding connectors
Weight: 45 g (1.6 oz)

5.4 SUMMARY
Python code tends to be shorter than comparable code in other languages. Although Python offers
fast development times, it lags slightly in terms of execution time.

CHAPTER 6

SYSTEM TESTING
Software testing is a method of assessing the functionality of a software program. There are
many different types of software testing, but the two main categories are dynamic testing and static
testing. Dynamic testing is an assessment that is conducted while the program is executed; static
testing, on the other hand, is an examination of the program's code and associated documentation.
Dynamic and static methods are often used together.

Testing is a planned activity that can be conducted systematically. Testing begins at the
module level and works towards the integration of the entire computer-based system. Nothing is
complete without testing, as testing is vital to the success of the system.
• Testing Objectives:

There are several rules that can serve as testing objectives:

1. Testing is a process of executing a program with the intent of finding an error.

2. A good test case is one that has a high probability of finding an undiscovered error.

3. A successful test is one that uncovers an undiscovered error.

If testing is conducted successfully according to the objectives stated above, it will uncover errors in
the software. Testing also demonstrates that the software functions appear to be working according to
the specification and that the performance requirements appear to have been met.
There are three ways to test a program:

1. For Correctness

2. For Implementation efficiency

3. For Computational Complexity.

Tests for correctness are supposed to verify that a program does exactly what it was designed
to do. This is much more difficult than it may at first appear, especially for large programs.

Tests used for implementation efficiency attempt to find ways to make a correct program
faster or use less storage. It is a code-refining process, which re-examines the implementation
phase of algorithm development. Tests for computational complexity amount to an experimental
analysis of the complexity of an algorithm or an experimental comparison of two or more
algorithms which solve the same problem.

The data is entered in all forms separately, and whenever an error occurs it is corrected
immediately. A quality team deputed by the management verified all the necessary documents
and tested the software while entering the data at all levels. The development process involves
various types of testing, and each test type addresses a specific testing requirement. The most common
types of testing involved in the development process are:

• Unit Test
• Integration Test
• Functional Test

6.1 UNIT TESTING:


The first test in the development process is the unit test. The source code is normally divided
into modules, which in turn are divided into smaller units. These units have specific behavior, and
the test done on these units of code is called a unit test. Unit testing depends upon the language in
which the project is developed. Unit tests ensure that each unique path of the project performs
accurately to the documented specifications and contains clearly defined inputs and expected results.
Unit testing produces tests for the behavior of components (nodes and vertices) of a product to ensure
their correct behavior prior to system integration. Writing unit test code provides quick, almost
instantaneous feedback about the correctness of the coding, including its design and implementation.

Test passes and failures confirm whether the software works as intended, and the vetting can be
updated every time someone changes a specific piece of code. Unit testing saves time and money by
fixing problems early in the development process, as opposed to later during system testing, integration
testing, and even beta testing. Other benefits of unit testing include: it gives developers insight into the
testing code base, so they can make any code changes quickly; more bugs caught early in the process
enables the development team to spend less time debugging later and more time creating value with their
work; new team members can get up to speed more easily, without having to worry about damaging
existing code, because any problems will be caught quickly; and a well-constructed unit test can be used
as documentation, which is updated each time the test is run. Because each unit test is a standalone
function, it can test different parts of a project without waiting for others to be completed, as the small
sketch below illustrates.
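A minimal, hedged unit-test sketch using Python's built-in unittest framework; the echo_to_distance_cm
helper is the hypothetical one from Chapter 5:

import unittest

def echo_to_distance_cm(echo_time_s, speed=343.0):
    return (echo_time_s * speed / 2) * 100

class TestDistance(unittest.TestCase):
    def test_round_trip_is_halved(self):
        self.assertAlmostEqual(echo_to_distance_cm(0.002), 34.3)

if __name__ == '__main__':
    unittest.main()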

6.2 FUNCTIONAL TESTING:

A functional test can be defined as testing two or more modules together with the intent of
finding defects, demonstrating that defects are not present, verifying that each module performs its
intended functions as stated in the specification, and establishing confidence that a program does
what it is supposed to do. It is a type of software testing used to verify the functionality of the software
application, i.e., whether each function works according to the requirement specification. In functional
testing, each function is tested by giving it a value, determining the output, and verifying the actual
output against the expected value. Functional testing is performed as black-box testing, which is
intended to confirm that the functionality of an application or system behaves as expected. It is done to
verify the functionality of the application.

Functional testing is also called black-box testing because it focuses on the application
specification rather than the actual code: the tester tests the program rather than the internals of the
system. The goal of functional testing is to check the primary entry functions, the necessarily usable
functions, and the flow of the screen GUI. Functional testing also checks error messages, so that the
user can easily navigate throughout the application. Testers follow these steps in functional testing:
the tester verifies the requirement specification of the software application; after analysis, the tester
plans the tests; after planning, the tester designs the test cases; after designing, the tester documents
the traceability matrix; the tester executes the test case design; coverage is analyzed to examine the
tested area of the application; and defect management is done to manage defect resolution.

6.3 INTEGRATION TESTING:


In integration testing, modules are combined and tested as a group. Modules are typically
code modules, individual applications, or source and destination applications on a network.
Integration testing follows unit testing and precedes system testing, which tests the product after it is
code complete; betas are often widely distributed, or even distributed to the public at large, in the hope
that users will buy the final product when it is released. Even though each component might undergo
rigorous testing, that can't guarantee the system will work as a whole; this is where integration testing
comes in. Different programmers usually develop different modules with varying approaches, creating
unwanted incongruence between component designs, and integration testing helps by ensuring the
entire system runs in unison. Modules can undergo several code changes even after being unit tested,
leading to untested code being released with the product; integration testing becomes a necessary step
here.

The interfaces between software modules, the database, and external hardware can be erroneous.
This can be confirmed via integration testing. Integration testing shouldn’t be left for the end of the
testing cycle. It’s crucial to test as early and often as possible to catch all the defects. Testers should
clearly define each unit and its relation to the other units in the test case to allow for easy reuse in
the future. Organizations should use test automation to conduct integration testing much faster.

6.4 SUMMARY

The tester executes the test case design, coverage is analyzed to examine the tested area of the
application, and defect management is done to manage defect resolution. A data flow diagram is a
two-dimensional diagram that explains how data is processed and transferred in a system. The graphical
depiction identifies each source of data and how it interacts with other data sources to reach a common
output. Individuals seeking to draft a data flow diagram must identify the external inputs and outputs,
determine how the inputs and outputs relate to each other, and explain with graphics how these
connections relate.

CHAPTER 7

RESULT AND DISCUSSION

7.1 SUPPORT VECTOR MACHINE ALGORITHM

Support Vector Machine (SVM) is a supervised machine learning algorithm used for
both classification and regression. Though it can handle regression problems, it is best suited for
classification. The objective of the algorithm is to find a hyperplane in an N-dimensional space that
distinctly classifies the data points. The dimension of the hyperplane depends upon the number of
features: if the number of input features is two, the hyperplane is just a line; if the number of input
features is three, the hyperplane becomes a 2-D plane; and it becomes difficult to imagine when the
number of features exceeds three. A kernel is a function that takes a low-dimensional input space and
transforms it into a higher-dimensional space, i.e., it converts a non-separable problem into a separable
problem. It is mostly useful in non-linear separation problems: simply put, the kernel performs some
extremely complex data transformations and then finds out how to separate the data based on the
labels or outputs defined.

SVM is a supervised machine learning algorithm which can be used for both classification and
regression problems, though it is most ordinarily used in classification work. In this work, we plot each
data item as a point in n-dimensional space, with the value of every feature being the value of a
particular coordinate. Then we perform classification by finding the hyperplane that differentiates the
two classes best. Support vectors are simply the coordinates of individual observations.

Two-dimensional linearly separable data can be separated by a line whose function is y = ax + b.
Renaming x as x1 and y as x2, we get:

ax1 − x2 + b = 0

If we define x = (x1, x2) and w = (a, −1), we get:

w ⋅ x + b = 0

This equation is derived from two-dimensional vectors, but in fact it also works for any
number of dimensions. This is the equation of the hyperplane.

Fig 7.1 Automatic Dispenser Specification

Testing of the automatic dispenser tool for blind people is done to assess the performance of the
automatic dispenser. The tests are divided into 3 types: detection of the presence of a glass,
cold water system testing, and hot water system testing. The automatic dispenser testing is as
follows.

7.2 CLASSIFIER

Once we have the hyperplane, we can use it to make predictions. We define the hypothesis
function h as:

h(xi) = +1 if w ⋅ x + b ≥ 0
h(xi) = −1 if w ⋅ x + b < 0

A point on or above the hyperplane will be classified as class +1, and a point below the
hyperplane will be classified as class −1.

So basically, the goal of the learning algorithm is to find a hyperplane which separates the data
accurately. There might be many such hyperplanes, and we need to find the best one, which is often
referred to as the optimal hyperplane.

Support Vector Machines are highly effective in higher-dimensional spaces, and are even effective on
data sets where the number of dimensions is greater than the number of samples, mainly because of
the kernel trick described above. Further advantages of Support Vector Machines are memory
efficiency, speed, and general accuracy in comparison to other classification methods like k-nearest
neighbors or deep neural networks. A hedged illustration in code follows.
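As a hedged illustration (not the exact classifier used in this project), the sketch below trains a linear SVM
on two hypothetical flame-sensor features and applies the hypothesis function h described above:

import numpy as np
from sklearn.svm import SVC

# Hypothetical training data: [IR intensity, temperature C] per reading
X = np.array([[0.1, 25], [0.2, 30], [0.8, 70], [0.9, 85]])
y = np.array([-1, -1, +1, +1])  # -1 = no fire, +1 = fire

clf = SVC(kernel='linear').fit(X, y)
w = clf.coef_[0]       # hyperplane normal vector
b = clf.intercept_[0]  # bias term

def h(x):
    # Classify a reading by which side of the hyperplane it falls on
    return 1 if np.dot(w, x) + b >= 0 else -1

print(h([0.85, 80]))  # expected: +1 (fire)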

Fig 7.2 Parallel Connection with Arduino R3 UNO

7.3 SUMMARY

The equation is derived from two-dimensional vectors, but in fact it also works for any number of
dimensions: it is the equation of the hyperplane. Testing of the automatic dispenser tool for blind
people is done to assess its performance; the tests are divided into 3 types, namely detection of the
presence of a glass, cold water system testing, and hot water system testing.

CHAPTER 8

ALGORITHM

8.1 SOURCE CODE

-import time

import pyfirmata

from adafruit_servokit import

ServoKit from pydub import

AudioSegment from pydub.playback

import play

from speech_recognition import Microphone,

Recognizer # Board Constants

ARDUINO_PORT = '/dev/ttyACM0' # Change to the correct port when using Windows or macOS

BAUD_RATE = 9600

# Constants for ultrasonic sensor

pins TRIGGER_PIN = 7

ECHO_PIN = 6

MAX_DISTANCE = 200

# Threshold distance for obstacle avoidance

OBSTACLE_THRESHOLD = 15 # centimeters

# Pins for motor control

LEFT_MOTOR_EN = 3

34
LEFT_MOTOR_IN1 = 4

LEFT_MOTOR_IN2 = 5

35
RIGHT_MOTOR_EN = 6

RIGHT_MOTOR_IN1 = 7

RIGHT_MOTOR_IN2 = 8

MOTOR_SPEED = 150

# Bluetooth control commands

COMMAND_FORWARD = b'F'

COMMAND_BACKWARD =

b'B' COMMAND_LEFT = b'L'

COMMAND_RIGHT = b'R'

COMMAND_STOP = b'S'

# Initialize board components

board = pyfirmata.Arduino(ARDUINO_PORT, BAUD_RATE)

ultrasonic_trigger = board.get_pin('d:' + str(TRIGGER_PIN) +

':o') ultrasonic_echo = board.get_pin('d:' + str(ECHO_PIN) + ':i')

led = board.get_pin('d:13:o')

servo_kit =

ServoKit(channels=16) servo =

servo_kit.servo[0]

# Initialize voice

recognition recognizer =

Recognizer() microphone =

Microphone() # Move the

board.digital[LEFT_MOTOR_IN1].write(1)

board.digital[LEFT_MOTOR_IN2].write(0)

36
robot forward def

move_forward():

board.digital[LEFT_MOTOR_IN1].write(1)

board.digital[LEFT_MOTOR_IN2].write(0)

37
board.digital[RIGHT_MOTOR_IN1].write(1)

board.digital[RIGHT_MOTOR_IN2].write(0)

board.digital[LEFT_MOTOR_EN].write(MOTOR_SPEED)

board.digital[RIGHT_MOTOR_EN].write(MOTOR_SPEED)

# Move the robot backward

def move_backward():

board.digital[LEFT_MOTOR_IN1].write(0)

board.digital[LEFT_MOTOR_IN2].write(1)

board.digital[RIGHT_MOTOR_IN1].write(0)

board.digital[RIGHT_MOTOR_IN2].write(1)

board.digital[LEFT_MOTOR_EN].write(MOTOR_SPEED)

board.digital[RIGHT_MOTOR_EN].write(MOTOR_SPEED)

# Turn the robot left

def turn_left():

board.digital[LEFT_MOTOR_IN1].write(0)

board.digital[LEFT_MOTOR_IN2].write(1)

board.digital[RIGHT_MOTOR_IN1].write(1

board.digital[RIGHT_MOTOR_IN2].write(0

board.digital[LEFT_MOTOR_EN].write(MOTOR_SPEED)

board.digital[RIGHT_MOTOR_EN].write(MOTOR_SPEED)

board.digital[LEFT_MOTOR_IN1].write(1)

board.digital[LEFT_MOTOR_IN2].write(0)

38
# Turn the robot right

def turn_right():

board.digital[LEFT_MOTOR_IN1].write(1)

board.digital[LEFT_MOTOR_IN2].write(0)

39
board.digital[RIGHT_MOTOR_IN1].write(0)

board.digital[RIGHT_MOTOR_IN2].write(1)

board.digital[LEFT_MOTOR_EN].write(MOTOR_SPEED)

board.digital[RIGHT_MOTOR_EN].write(MOTOR_SPEED)

# Stop the robot

def stop_robot():

board.digital[LEFT_MOTOR_IN1].write(0)

board.digital[LEFT_MOTOR_IN2].write(0)

board.digital[RIGHT_MOTOR_IN1].write(0)

board.digital[RIGHT_MOTOR_IN2].write(0)

board.digital[LEFT_MOTOR_EN].write(0)

board.digital[RIGHT_MOTOR_EN].write(0)

# Turn the robot around

def turn_around():

# Back up

move_backward(

) time.sleep(1)

stop_robot()

# Turn around

turn_left()

time.sleep(0.5

) stop_robot()

# Speak a command using text-to-

speech def speak_command(text):

40
file_name =

'temp.mp3' tts =

gTTS(text=text)

tts.save(file_name)

audio = AudioSegment.from_file(file_name,

format='mp3') play(audio)

# Recognize a command using voice

recognition def recognize_command():

with microphone as source:

recognizer.adjust_for_ambient_noise(source

) audio = recognizer.listen(source,

timeout=5) try:

speech_text =

recognizer.recognize_google(audio) except:

return None

if 'forward' in speech_text:

return

COMMAND_FORWARD elif

'backward' in speech_text:

return

COMMAND_BACKWARD elif

'left' in speech_text:

return COMMAND_LEFT

elif 'right' in speech_text:

41
return COMMAND_RIGHT

elif 'stop' in speech_text:

return COMMAND_STOP

42
else:

return None

# Control the robot using Bluetooth commands or voice recognition
def control_robot():
    # Read Bluetooth commands. pyfirmata exposes the underlying pyserial
    # object as board.sp; the command bytes are assumed to arrive on that
    # same serial link.
    if board.sp.in_waiting > 0:
        command = board.sp.read()
        if command == COMMAND_FORWARD:
            move_forward()
        elif command == COMMAND_BACKWARD:
            move_backward()
        elif command == COMMAND_LEFT:
            turn_left()
        elif command == COMMAND_RIGHT:
            turn_right()
        elif command == COMMAND_STOP:
            stop_robot()
    # Otherwise, listen for a voice command
    else:
        command = recognize_command()
        if command == COMMAND_FORWARD:
            speak_command('Moving forward')
            move_forward()
        elif command == COMMAND_BACKWARD:
            speak_command('Moving backward')
            move_backward()
        elif command == COMMAND_LEFT:
            speak_command('Turning left')
            turn_left()
        elif command == COMMAND_RIGHT:
            speak_command('Turning right')
            turn_right()

# Read distance from the ultrasonic sensor
def get_ultrasonic_distance():
    # Send a short trigger pulse
    ultrasonic_trigger.write(1)
    time.sleep(0.001)  # required delay between trigger and echo
    ultrasonic_trigger.write(0)
    # Note: plain pyfirmata cannot time the echo pulse itself; reading the
    # echo pin returns a logic level, so an HC-SR04 normally needs a
    # ping-capable Firmata sketch for accurate timing.
    duration = ultrasonic_echo.read()
    if duration is None:
        return float('inf')  # no reading yet; treat as no obstacle
    distance = duration * 0.034 / 2  # speed of sound ~0.034 cm/us
    return distance
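# Worked example (illustrative figures): an echo pulse of 580 microseconds
# would give 580 * 0.034 / 2 = 9.86, i.e. an obstacle roughly 10 cm away.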

# Main program loop
def main():
    while True:
        # Read distance from the ultrasonic sensor
        distance = get_ultrasonic_distance()
        # If an obstacle is detected, stop and turn around
        if distance < OBSTACLE_THRESHOLD:
            stop_robot()
            turn_around()
        else:
            # Otherwise, continue moving forward
            move_forward()
        # Control the robot using Bluetooth or voice commands
        control_robot()
        time.sleep(0.1)

if __name__ == '__main__':
    main()
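For bench testing without the Android application, the same single-byte protocol can be exercised from a PC. The following is a minimal sketch, assuming the robot's Bluetooth module is paired and exposed as a serial port (the port name /dev/rfcomm0 is a placeholder) and that the pyserial package is installed:

import time
import serial  # pyserial

# Placeholder port; substitute the port assigned to the paired Bluetooth module.
bt = serial.Serial('/dev/rfcomm0', 9600, timeout=1)

bt.write(b'F')   # drive forward
time.sleep(2)    # let the robot move for two seconds
bt.write(b'S')   # stop
bt.close()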

8.2 WEB APPLICATION (CODE)

<!DOCTYPE html>

<html lang="en">

<head>

<title>I C A</title>

<link rel="stylesheet" href="STYLE.CSS">

</head>

<body>

<div class="main">

<div class="navbar">

<div class="icon">

<h2 class="logo">I C A</h2>

</div>

<div class="menu">

<ul>

<li><a href="#"">HOME</a></li>

<li><a href="#"">ABOUT</a></li>

<li><a href="#"">SERVICE</a></li>

<li><a href="#"">DESIGN</a></li>

<li><a href="#"">CONTACT</a></li>

</ul>

</div>

<div class="search">

<input class="srch" type="search" name=""placeholder="type to text">

<a href="#"> <button class="btn">search</button></a>

</div>

</div>

<div class="content">

<h1>INDUSTRY CART AUTOMATION
<br><span>Join</span><br>New Update</h1>

<p class="par">Easy carry
<br>
<br>Nemo placeat cumque explicabo?</p>

<button class="cn"><a href="#">JOIN US</a></button>

<!-- Add icon library -->

<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/font-awesome/4.7.0/css/

font- awesome.min.css">

49
<!-- Fonts -->

<link href="https://fonts.googleapis.com/css?family=Dancing+Script" rel="stylesheet">

<link href="https://fonts.googleapis.com/css?family=Pacifico" rel="stylesheet">

<link href="https://fonts.googleapis.com/css?family=EB+Garamond" rel="stylesheet">

<link href="https://fonts.googleapis.com/css?family=Ledger" rel="stylesheet">

<link href="https://fonts.googleapis.com/css?family=Arimo" rel="stylesheet">


<header class="v-header container">

<div class="fullscreen-video-wrap">

<video src="vid/dock.mov" autoplay="" loop=""></video>

</div>

<div class="header-overlay"></div>

<div class="header-content ">

<div class="logo">

<img src="./img/logo4.png" alt="Puzzle Lightbulb">

</div>

<div class="logo-name">

<img src="./img/name3.png" alt="Denis S Dujota">

<a href="#about" class="btn fa fa-chevron-circle-down fa-5x"></a>

</div>


</header>

<section class="section section-light">

<h1>Who Am I?</h1>

<p>
I am an adventurer.
It can take me to the Far East for a bowl of the best ramen on earth, or through a marathon
of Neil deGrasse Tyson. It's that spark of curiosity which drives my passion. Nothing has
captivated me more than the journey of being a web developer. I'd like to keep it up.
</p>

</section>

<div class="parallax-img2">

<div class="ptext">

<span class="border">

<img src="./img/quote5.png" alt="Isaac Newton Quote">

</span>

</div>

</div>

<div class="section section-dark">

<h2>What can I do?</h2>

<p class="skill-name"><i class="fa fa-cogs" aria-hidden="true"></i> Back-End

(Rails, PostgreSQL)</p>

<div class="skill-container">

<div class="skills back-end">90%</div>

</div>

<p class="skill-name"><i class="fa fa-picture-o" aria-hidden="true"></i> Front-End

(HTML, CSS, JS)</p>

<div class="skill-container">

<div class="skills front-end">85%</div>

</div>

<p class="skill-name"><i class="fa fa-shield" aria-hidden="true"></i> Testing

(RSpec, Jasmine, PhantomJS)</p>

<div class="skill-container">

<div class="skills testing">80%</div>

</div>

<p class="skill-name"><i class="fa fa-pencil-square-o" aria-hidden="true"></i>

Web Design</p>

<div class="skill-container">

<div class="skills design">75%</div>

</div>

<p class="skill-name"><i class="fa fa-graduation-cap" aria-hidden="true"></i> Learning

a new skill</p>

<div class="skill-container">

<div class="skills learn">100%</div>

</div>

</div>

<div class="parallax-img3">

<div class="ptext">
<span class="border">

<img src="./img/quote6-a.png" alt="Socrates Quote">

</span>

</div>

</div>

<div class="section section section-dark">

<h2>What have I done?</h2>

<p>

Lorem ipsum dolor sit amet, consectetur adipisicing elit. Officia quisquam voluptatum

quidem praesentium temporibus eaque commodi nihil, rerum saepe in perspiciatis. Odit,

blanditiis eius debitis, quisquam et optio! Iure nesciunt commodi nulla aliquam culpa fugit nihil

laboriosam soluta repellat consequuntur magnam repudiandae odio reiciendis, veniam illo,

modi quibusdam impedit deleniti?

</p>

</div>

<div class="form">

<h2>login Here</h2>

<input type="email" name="email" placeholder="enter email here">

<input type="password" name="" placeholder="enter password here">

<button class="btnn"><a href="#">login</a></button>

<p class="link">Don't have an account<br>

<a href="#">Sign up </a>here</a></p>

<p class="liw">log in with</p>

</div>

<div class="icon">
<a href="#"><ion-ion name="logo-facebook"></ion-ion></a>

<a href="#"><ion-ion name="logo-instagram"></ion-ion></a>

<a href="#"><ion-ion name="logo-whatsapp"></ion-ion></a>

<a href="#"><ion-ion name="logo-twitter"></ion-ion></a>

<a href="#"><ion-ion name="logo-google"></ion-ion></a>

</div>

</div>

</div>

</div>

<script src="https://unpkg.com/ionicons@5.4.0/dist/ionicons.js"></script>

</body>

</html>

8.3 STYLE CODE

* {
    margin: 0;
    padding: 0;
}

.main {
    width: 100%;
    background: linear-gradient(to top, rgba(0,0,0,0.5) 50%, rgba(0,0,0,0.5) 50%), url(1.jpg);
    background-position: center;
    background-size: cover;
    height: 109vh;
}

.navbar {
    width: 1200px;
    height: 75px;
    margin: auto;
}

h1, h2 {
    font-family: 'Pacifico', cursive;
    font-size: 40px;
}

p, span {
    font-family: 'Arimo', serif;
    font-size: 20px;
}

/* Scrolling effect */
.parallax-img1, .parallax-img2, .parallax-img3 {
    position: relative;
    opacity: 0.70;
    background-position: center;
    background-size: cover;
    background-repeat: no-repeat;
    /* fixed = parallax, scroll = normal */
    background-attachment: fixed;
}
/* Import images and position them */
.parallax-img1 {
    background-image: url('../img/img1.jpeg');
    min-height: 100%;
}

.parallax-img2 {
    background-image: url('../img/img2.jpeg');
    min-height: 400px;
}

.parallax-img3 {
    background-image: url('../img/img3-3.jpeg');
    min-height: 400px;
}

/* Section styling */
.section {
    text-align: center;
    padding: 50px 80px;
}

.section-light {
    background-color: #E8E8E8;
    color: #666;
}

.section-dark {
    background-color: #282e34;
    color: #ddd;
}

/* Image text and content styling */
.ptext {
    position: absolute;
    top: 50%;
    width: 100%;
    text-align: center;
    color: #000;
    font-size: 27px;
    letter-spacing: 8px;
    text-transform: uppercase;
}

.ptext .border img {
    position: relative;
    top: -195px;
    left: -20px;
    max-height: 390px;
    opacity: 0.90;
}

.ptext .border.trans {
    background-color: transparent;
}

/* Header video content styling */
.v-header {
    height: 100vh;
    display: flex;
    align-items: center;
    color: #fff;
}

.container {
    padding-left: 1rem;
    padding-right: 1rem;
    margin: auto;
    text-align: center;
    overflow: hidden;
}

.fullscreen-video-wrap {
    position: absolute;
    top: 0;
    left: 0;
    width: 100%;
    height: 100vh;
    overflow: hidden;
}

.fullscreen-video-wrap video {
    min-width: 116%;
    min-height: 100%;
    margin-bottom: -500px;
    margin-left: -290px;
}

.header-overlay {
    height: 100vh;
    width: 100%;
    position: absolute;
    top: 0;
    left: 0;
    background: #d3d3d3;
    z-index: 1;
    opacity: 0.40;
}

.header-content {
    /* border: solid 5px white; */
    z-index: 2;
    max-height: 100vh;
    max-width: 100vw;
    margin: 0 auto;
}

/* Logo & name */
.logo img {
    /* background-color: lightgray; */
    margin-top: 30px;
    max-width: 100vw;
    /* margin-left: -100px; */
}

.logo-name img {
    /* background-color: lightblue; */
    max-width: 100vw;
    height: 270px;
    position: relative;
    bottom: 320px;
}

.header-content h1 {
    font-size: 50px;
    margin-bottom: 0;
    /* align-content: center; */
}

.header-content p {
    font-size: 1.5rem;
    display: block;
    padding-bottom: 2rem;
}

/* About Me button */
.logo-name a {
    /* background-color: lightgreen; */
    display: block;
    left: 0;
}

.btn {
    color: #fff;
    font-size: 1.2rem;
    text-decoration: none;
    position: relative;
    bottom: 320px;
    left: 28px;
}

/* Hover shaking effect */
.btn:hover, .ptext .border img:hover {
    animation: shake 0.82s cubic-bezier(.36, .07, .19, .97) both;
    transform: translate3d(0, 0, 0);
    backface-visibility: hidden;
    perspective: 1000px;
}

@keyframes shake {
    10%, 90% {
        transform: translate3d(-1px, 0, 0);
    }
    20%, 80% {
        transform: translate3d(2px, 0, 0);
    }
    30%, 50%, 70% {
        transform: translate3d(-4px, 0, 0);
    }
    40%, 60% {
        transform: translate3d(4px, 0, 0);
    }
}

/* Rating styling */
.checked {
    color: orange;
}

.rating {
    display: inline-block;
    padding: 20px;
}

/* Skills */
.skill-container {
    width: 100%;
    background-color: #ddd;
}

.skills {
    text-align: right;
    padding-right: 20px;
    line-height: 40px;
    color: white;
}

.skill-name {
    text-align: left;
    margin: 15px 0;
}

.learn {
    width: 100%;
    background-color: #616161;
}

.back-end {
    width: 90%;
    background-color: #616161;
}

.front-end {
    width: 85%;
    background-color: #616161;
}

.testing {
    width: 80%;
    background-color: #616161;
}

.design {
    width: 75%;
    background-color: #616161;
}

/* Mobile friendly */
@media (max-width: 960px) {
    .fullscreen-video-wrap video {
        min-width: 116%;
        min-height: 100%;
        margin-bottom: -500px;
        margin-left: -520px;
    }
}

@media (max-width: 720px) {
    .fullscreen-video-wrap video {
        min-width: 116%;
        min-height: 100%;
        margin-bottom: -500px;
        margin-left: -640px;
    }
}

@media (max-width: 568px) {
    .fullscreen-video-wrap video {
        margin-left: -800px;
    }

    .parallax-img1, .parallax-img2, .parallax-img3 {
        background-attachment: scroll;
    }

    .header-content {
        max-width: 400px;
    }

    .container {
        max-width: 568px;
        padding-left: 1rem;
        padding-right: 1rem;
        margin: auto;
    }

    .logo {
        max-width: 367px;
    }

    .logo img {
        position: relative;
    }

    .logo-name img {
        /* background-color: red; */
        position: relative;
        left: -110px;
        bottom: 200px;
        max-width: 600px;
    }

    .ptext .border img {
        position: relative;
        top: -180px;
        left: 20px;
        max-width: 500px;
    }
}

.form h2 {
    width: 220px;
    font-family: sans-serif;
    text-align: center;
    color: #ff7200;
    font-size: 22px;
    background-color: #fff;
    border: 10px;
    margin: 2px;
    padding: 8px;
}

.form input {
    width: 240px;
    height: 35px;
    background: transparent;
    border-bottom: 1px solid #ff7200;
    border-top: none;
    border-right: none;
    border-left: none;
    color: #fff;
    font-size: 15px;
    letter-spacing: 1px;
    margin: 30px;
    font-family: sans-serif;
}

.form input:focus {
    outline: none;
}

::placeholder {
    color: #fff;
    font-family: Arial;
}

.btnn {
    width: 240px;
    height: 40px;
    background: #ff7200;
    border: none;
    margin-top: 30px;
    font-size: 18px;
    border-radius: 10px;
    cursor: pointer;
    color: #fff;
    transition: 0.4s ease;
}

.btnn:hover {
    background: #fff;
    color: #ff7200;
}

.btnn a {
    text-decoration: none;
    color: #000;
    font-weight: bold;
}

.form .link {
    font-family: Arial, Helvetica, sans-serif;
    font-size: 17px;
    padding-top: 20px;
    text-align: center;
}

.form .link a {
    text-decoration: none;
    color: #ff7200;
}

.liw {
    padding-top: 15px;
    padding-bottom: 10px;
    text-align: center;
}

8.4 SCREENSHOTS

IR SENSORS

Fig. 8.1: IR sensor

ULTRASONIC

Fig. 8.2: Ultrasonic sensor

8.5 BATTERY

An electrical battery is one or more electrochemical cells that convert stored chemical
energy into electrical energy.[1] Since the invention of the first battery (or "voltaic pile") in 1800 by
Alessandro Volta, batteries have become a common power source for many household and industrial
applications. According to a 2005 estimate, the worldwide battery industry generates US$48 billion
in sales each year, with 6% annual growth. There are two types of batteries: primary batteries
(disposable batteries), which are designed to be used once and discarded when exhausted, and
secondary batteries (rechargeable batteries), which are designed to be recharged and used multiple
times. Miniature cells are used to power devices such as hearing aids and wristwatches; larger
batteries provide standby power for telephone exchanges or computer data centers.

The usage of "battery" to describe electrical devices dates to Benjamin Franklin, who in 1748
described multiple Leyden jars (early electrical capacitors) by analogy to a battery of cannons. Thus
Franklin's usage to describe multiple Leyden jars predated Volta's use of multiple galvanic cells. It is
speculated, but not established, that several ancient artifacts consisting of copper sheets and iron bars
and known as Baghdad batteries may have been galvanic cells.

Volta's work was stimulated by the Italian anatomist and physiologist Luigi Galvani, who in
1780 noticed that dissected frog's legs would twitch when struck by a spark from a Leyden jar, an
external source of electricity.[8] In 1786 he noticed that twitching would occur during lightning
storms.[9] After many years, Galvani learned how to produce twitching without using any external
source of electricity. In 1791 he published a report on "animal electricity".[10] He created an electric
circuit consisting of the frog's leg (FL) and two different metals A and B, each metal touching the
frog's leg and each other, thus producing the circuit A-FL-B-A-FL-B, and so on. In modern terms, the
frog's leg served as both the electrolyte and the sensor, and the metals served as electrodes. He
noticed that even though the frog was dead, its legs would twitch when he touched them with the
metals.

Within a year, Volta realized the frog's moist tissues could be replaced by cardboard soaked
in salt water, and the frog's muscular response could be replaced by another form of electrical
detection. He already had studied the electrostatic phenomenon of capacitance, which required
measurements of electric charge and of electrical potential ("tension"). Building on this experience,
Volta was able to detect electric current through his system, also called a Galvanic cell.

The terminal voltage of a cell that is not discharging is called its electromotive force (emf)
and has the same unit as electrical potential, the volt, named in honor of Volta. In 1800, Volta
invented the battery by placing many voltaic cells in series, literally piling them one above the other.
This voltaic pile gave a greatly enhanced net emf for the combination,[11] with a voltage of about
50 volts for a 32-cell pile.[12] In many parts of Europe, batteries continue to be called piles.
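As a quick sanity check of the figures above, the emf of a series pile is the per-cell emf multiplied by the number of cells, so the reported 50 volts across 32 cells implies roughly 1.5 volts per cell. A minimal sketch of the arithmetic, using the reported values as inputs:

# Series cells add their emfs: V_pile = n_cells * V_cell.
cells = 32
pile_voltage = 50.0                    # volts, as reported for Volta's pile
per_cell_emf = pile_voltage / cells
print(round(per_cell_emf, 2))          # ~1.56 volts per cell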

8.6 SUMMARY

All the components have been assembled, the Python code has been incorporated into the rover, and
simulation has been carried out. This chapter also reviewed the history of the battery that powers the
robot: the use of "battery" for electrical devices dates to Benjamin Franklin, who in 1748 described
multiple Leyden jars (early electrical capacitors) by analogy to a battery of cannons, so Franklin's
usage predated Volta's use of multiple galvanic cells. It is speculated, but not established, that several
ancient artifacts consisting of copper sheets and iron bars, known as Baghdad batteries, may have
been galvanic cells.

CHAPTER 9

CONCLUSION

9.1 CONCLUSION

Nowadays we need everything computerized. Earlier, situations could only be monitored
with the help of cameras, and to reduce manual overhead in industries we have implemented the
Internet of Things (IoT). Because a manual response can come too late, delays in this process can
harm both property and life. For this purpose, we are developing a system for automation using IoT,
with the help of Artificial Intelligence, to make the system automated. The system developed uses a
basic conveyor-belt mechanism for bottle-size detection and for counting the number of bottles. This
system also gives us the flexibility of controlling the conveyor belt from the computer side.

The main part of the project is the industrial automation, which lets the industry be
run through the internet. To control this action, we create a web page where the input is given: by
clicking a button on the web page from any location, the machinery in the industry is started.

The web page appears on the user's PC, and the internet is connected to the machines through
a Wi-Fi modem. At the receiver side, motors are connected to the application that drives them. The
motors are controlled with the help of a microcontroller. The board we are using is the Freescale
ARM mbed board KL25Z, which controls the robotic arm. The user interface used to control the
robotic arm is a web page or an app. The control signal is sent via the internet to the Wi-Fi module,
which acts as the receiver and passes the received signal to the microcontroller (KL25Z). The
microcontroller then acts as per the given instructions.
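A minimal sketch of such a control path, assuming a Flask web server relaying single-byte commands to the controller over a serial link (the endpoint, port name, and command bytes here are illustrative, not the project's actual code):

import serial
from flask import Flask, abort

app = Flask(__name__)
# Placeholder serial port for the link to the microcontroller.
mcu = serial.Serial('/dev/ttyACM0', 9600, timeout=1)

COMMANDS = {'forward': b'F', 'backward': b'B', 'left': b'L',
            'right': b'R', 'stop': b'S'}

@app.route('/motor/<action>')
def motor(action):
    # Map the button press from the web page to a one-byte machine command.
    if action not in COMMANDS:
        abort(404)
    mcu.write(COMMANDS[action])
    return 'sent ' + action

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=8080)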

The controller board is already connected to the server through the Wi-Fi module. The
signal given to the industrial machine is sent through the internet, and hence we can access it from
any place. However, the web page or the app must require a login ID and a password for security
reasons before a person can control the industrial appliance. What happens in the industry in the
absence of manual attention can be seen through the ICA.

9.2 FUTURE ENHANCEMENT

In future, we can extend the framework to implement various deep learning algorithms to
improve the accuracy rate, and the system can be analysed so that the prediction of multiple tasks can
be implemented. This system can also mark images: because SLAM corresponds with 3D point cloud
information, information and identification tags can be integrated with SLAM for positioning, to
determine the 3D spatial environment and to conduct object marking and modelling.

Users can learn the relationship between images and other objects through image content
semantics. Integrating segmented objects, positioning information, and the 3D SLAM point cloud
enables the system to track and reconstruct information on the surrounding environment. A map
modelled using SLAM is the most crucial element of the navigation system: as with Google Maps,
pre-established map information must be created for the system to access information while in
operation. After the system acquires image data, it analyses the map-related information pertaining to
the image. Through the aforementioned method, the basic system structure was completed.

As for general applications, the robot would be very useful in all the suggested
circumstances, with most interest in using it for travelling to new destinations, followed by railway
stations, supermarkets, airports, and public buildings; everyday use was rated last, but still very
highly, at 4 out of 5. Suggestions of other circumstances in which the robot would be useful included
large public buildings and campuses, the countryside and open spaces, 'shared spaces', navigating
around road works, and leisure facilities.

9.3 SUMMARY

The signal given to the industrial machine is sent through the internet, and hence we
can access it from any place. However, the web page or the app must require a login ID and a
password for security reasons before a person can control the industrial appliance. What happens in
the industry in the absence of manual attention can be seen through the ICA.

CHAPTER 10

REFERENCES
3. RobotShop Distribution Inc., "History of Robotics: Timeline", 2008.

4. Tanaka, F., et al., "Telepresence robot helps children in communicating with teachers who speak a different language", Proceedings of the 2014 ACM/IEEE International Conference on Human-Robot Interaction, ACM, Bielefeld, Germany, pp. 399-406, 2014.

5. Harik, E. H. and Korsaeth, A., "Combining Hector SLAM and Artificial Potential Field for Autonomous Navigation Inside a Greenhouse", Robotics, 7(2), p. 22, 2018.

6. Singh, Harbhinder, "Design and Development of an Autonomous Robot", 2019.

7. Lee, J., et al., "Industrial robot calibration method using Denavit-Hartenberg parameters", 17th International Conference on Control, Automation and Systems (ICCAS), pp. 1834-1837, 2017.

8. Jeelani, S., et al., "Robotics and medicine: A scientific rainbow in hospital", Journal of Pharmacy & Bioallied Sciences, 7(Suppl 2), pp. S381-S383, 2015.

[7] Yusof, M. and Dodd, T., "Pangolin: A Variable Geometry Tracked Vehicle With Independent Track Control", Field Robotics, pp. 917-924, 2012.

9. Xin, C., et al., "Design and Implementation of Debris Search and Rescue Robot System Based on Internet of Things", International Conference on Smart Grid and Electrical Automation (ICSGEA), pp. 303-307, 2018.

10. Day, C.-P., "Robotics in Industry: Their Role in Intelligent Manufacturing", Engineering, 4(4), pp. 440-445, 2018.

11. Aliff, M., et al., "Simple Trajectory Control Method of Robot Arm Using Flexible Pneumatic Cylinders", Journal of Robotics and Mechatronics, 27(6), pp. 698-705, 2015.

12. Aliff, M., et al., "Control and analysis of simple-structured robot arm using flexible pneumatic cylinders", International Journal of Advanced and Applied Sciences, 4(12), pp. 151-157, 2017.

13. Aliff, M., et al., "Control and analysis of robot arm using flexible pneumatic cylinder", Mechanical Engineering Journal, 1(5), pp. DR0051, 2014.

14. Abhilash Dhumatkar, et al., "Automatic Fire Fighting Robot", International Journal of Recent Research in Mathematics, Computer Science and Information Technology, April 2015.

15. Joga, D., et al., "Virtual Reality Simulation of Fire Fighting Robot, Dynamic and Motion", ICIUS, October 24-26, 2007.

Chris Flesher, et al., "Fire Protection Robot", Final Report, pp. 1-78, 2004.

16. Myles Durkin, et al., "Firefighting Robot: A Proposal", May 5, 2008.

17. Gerald Weed, et al., "Pokey the Fire-Fighting Robot: A Logical Design Using Digital and Analog Circuitry", May 11, 1999.

18. Kim, J.-H., et al., "Feature Selection for Intelligent Firefighting Robot Classification of Fire, Smoke, and Thermal Reflections Using Thermal Infrared Images", Journal of Sensors, p. 13, 2016.

19. Haksar, R. N. and Schwager, M., "Distributed Deep Reinforcement Learning for Fighting Forest Fires with a Network of Aerial Robots", IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 1067-1074, 2018.

20. Jiang, Shouchao, et al., "Buckling Behaviour of Aluminium Alloy Columns under Fire Conditions", Thin-Walled Structures, vol. 124, pp. 523-537, 2018.
