
Visually Impaired Audible Helpmate Android Application

BY
Sidra Asghar, Faiza Pervaiz
2015-GCWUF-2502, 2015-GCWUF-2567

Project Report submitted in partial fulfillment of requirements for the degree


of

MASTER OF SCIENCE
IN
COMPUTER SCIENCE

FACULTY OF SCIENCE AND TECHNOLOGY


GC WOMEN UNIVERSITY
FAISALABAD PAKISTAN

May 2019
DECLARATION
We hereby declare that the contents of the thesis, “Visually Impaired Audible
Helpmate”, are the product of our own research and no part has been copied from any
published source (except the references, standard mathematical and genetic
models/equations/formulas/protocols, etc.). We further declare that this work has not
been submitted for the award of any diploma/degree. The university may take action
if the information provided is found inaccurate at any stage (in case of default the
scholar will be proceeded against as per HEC plagiarism policy).

Sidra Asghar
2015-GCWUF-2502
Faiza Pervaiz
2015-GCWUF-2567
The Controller of Examinations,
The members of the Supervisory Committee find the thesis submitted by Sidra
Asghar (2015-GCWUF-2502), Faiza Pervaiz (2015-GCWUF-2567) satisfactory
and recommend that it be processed for evaluation by the External Examiner(s) for
the award of degree.

SUPERVISOR-I: Ms. Sahar Hussnain

MEMBER: Ms. Zunaira Sattar

MEMBER: Ms. Huma Sarwar


Chapter 1

Introduction
It is estimated that there are about 253 million people with visual impairment in
the world. Visually impaired persons face difficulties in their daily lives without
assistance from family or friends, and their growing number has drawn the attention
of many researchers, who are working to understand the characteristics and needs of
visually impaired people and to protect them against the risks they may face in their
daily living activities. With advances in artificial intelligence and mobile computing,
mobile devices have grown in popularity to become one of the most common consumer
devices, and this technology, with a friendly interface, has brought greater convenience
to blind and visually impaired people. Cell phones are a very important part of modern
life: many of us need to make a call or send a message at any time from anywhere. For
blind and motion-impaired people this issue is more obvious, but other people also face
it, e.g., when driving or using a smartphone in bright sunlight. Even sighted users often
find themselves in situations where non-visual interaction is required.

Visually impaired people need to know about life outside their home, and they desire
access to Internet and mobile services just as sighted people do. This application focuses
on supporting visually impaired people with their daily mobile needs. “Visually Impaired
Person’s Audible Helpmate” is an Android application which supports voice commands
and is developed for visually impaired people. After the mobile phone is unlocked, the
application launches without any voice command. The system accepts voice commands
and performs the corresponding operations: it first translates the voice into text, carries
out the requested task, and then produces output in the form of voice.
It performs basic functions such as calling, messaging and contact operations
(add, show, delete). Previously, visually impaired people needed to operate the phone
keys manually by remembering the position of the keys; with this application they
operate the phone entirely by voice commands.

1.1 Project Background/Overview


People who are visually challenged still struggle every day to perform actions
that should be simple. Blind persons have to remember the placement of buttons
on their phones, which is very difficult, and they are unable to use a smartphone
like others. This application therefore addresses those aspects, giving them the
ability to use a smartphone freely by controlling the phone with their voice in a
smooth, easy way.
As technology has advanced, the use of smartphones has increased. Previously,
blind people used button phones and had to remember the position of the keys in
order to make a call or send a message; now blind persons also use touch phones.
Some visually impaired persons find smartphones difficult to use because of the
touch interface, although many systems have been developed to facilitate them.
The existing personal assistants are still incapable of providing adequate help to
visually impaired people, so there is a need to build a smart personal assistant
that is equally assistive for them.
In this application the user is able to access the services of the smartphone with
voice commands. The user can easily send a message to a recipient in the contact
list, or to a dialed mobile number, by voice command. Physically disabled persons,
or persons with little knowledge of how to use a smartphone, can also access the
phone easily with voice or speech commands.
The visually impaired person’s audible assistant will help them perform the basic
functions of the phone without being dependent on anyone. This application also
spares them from pressing buttons or remembering button positions.

1.2 Objectives
• The objective of this project is to develop a software application that is user
friendly, simple, fast and cost effective.
• The main objective is to develop a user-friendly application that facilitates
visually impaired persons, allowing them to control their smartphone with only
their voice.
• The app will provide step-by-step instructions with voice feedback to get the
user from a location to a selected destination.
• The principal aim of this Android application is to provide a communication
framework for the blind in an effective way by exploiting the capabilities of the
platform to the maximum.
• Users simply command their phone to perform the activity they want.
• The application handles incoming as well as outgoing calls using voice commands.
• The application handles incoming as well as outgoing SMS using voice commands.
• All features of the application are guided through voice-based commands, so
that it is easier for visually impaired people to use the application.
• The user can start the speech recognition process by tapping anywhere on the
application screen.
• The user can set or stop an alarm using only voice commands.
• The user can create an event by speaking its date and time, or delete a previously
created event.
• The application is also beneficial for busy persons.
• Physically disabled persons, or persons with little knowledge of how to use a
smartphone, can access the phone easily with voice or speech commands.
• The application interacts with the user through voice commands to perform
emergency options.

1.3 Problem Description


What:
With advances in new technologies, cell phones have grown in popularity to become
one of the most common consumer devices, and they are a vital part of modern life.
Many of us need to make a call or send a message at any time from anywhere. For
visually impaired and motion-impaired people this problem is more obvious, but
other people also face it regularly, e.g., when driving or using a smartphone in
bright sunlight. Even sighted users frequently find themselves in situations where
non-visual interaction is required.
The present personal assistants are incapable of providing adequate assistance to
visually impaired people. There's a need to develop a smart personal assistant that
is equally assistive for them.
People who are visually challenged still struggle every day to perform actions
that should be simple. Software systems that do not satisfy their users often have
poor and incomplete designs, and such designs can result from designers’ and
developers’ failure to involve users in development. According to the National Eye
Institute (NEI), in 2010 the U.S. had 66% of prevalent blindness cases in females
and 34% in males. It is an evident fact that much of the software developed today
is still inaccessible to visually challenged people because of accessibility and
usability issues. Until recently, most touch screens provided few or no accessibility
features, leaving them largely unusable by blind people.
The biggest problem faced by visually impaired persons is using a mobile phone on
their own without depending on anyone. Many blind and visually impaired people feel
embarrassed to use their phone while out in public; it is other people’s wrongly
perceived ideas and assumptions that make them feel this way. Previously, visually
impaired people needed to operate the phone keys manually by remembering the
position of the keys. Blind people face challenges daily in communicating with the
world around them and have to depend on their sighted colleagues to make a phone
call or access other mobile functionality. With the advancement of technology,
there should be an app that facilitates visually impaired persons.
In graphical user interfaces (GUIs), objects such as buttons, menus and documents
are presented for users to manipulate in ways similar to how they are manipulated
in the real workspace, except that they are displayed on the screen as icons. The
first and most essential step of interaction with a GUI is locating the target:
sighted people can intuitively find the target object with a quick visual scan, but
without visual access to the screen this simple task is often very hard to complete,
which is the main cause of difficulty in non-visual interaction. A Voice Command
Device (VCD) is a device controlled by means of the human voice, and Android ships
with a built-in text-to-speech engine called Pico TTS. Voice commands also apply to
anyone unable to look at and touch their device, leaving voice their only option.
Why:
In this project we explore the automation and planning capabilities of the agent,
and highlight the requirements on external data and knowledge sources. The purpose
of this research project is to create an application that enables the visually
challenged to use some fundamental features of mobile phones, thereby making their
lives a little simpler. Through this application the visually challenged user can
always stay connected with the world around them.
Hence, we introduce a smart personal assistant that is smart in the sense that it
provides audible assistance. The visually impaired person’s audible helpmate can be
used as an interface to the digital world, making the consumption of information
timely and efficient for the user’s specific tasks. The goal of the project is to
design a personal assistant that understands the semantics of a task, is able to
identify the task correctly, and makes appropriate decisions with minimal
human-computer interaction (HCI).
Our application has all the basic functionalities of a phone, such as call, SMS,
contacts, alarm and events. We built this app to provide these combined features in
a single product. The system is a voice-recognizing application for mobile phones
that allows access to most of the functionality of the phone and makes it possible
for visually impaired people to connect with society. It provides a barrier-free
user interface to guarantee autonomous usage of the mobile phone by a blind user.
The research is devoted to finding an algorithm that requires less time for pattern
recognition and is efficient; using multiple algorithms can help improve efficiency.
The reason for using Android-based mobile phones is that Android is an open-source
technology nowadays. This project is dedicated to the millions of differently-abled
people whom the world has wrongly labeled as disabled, and it will hopefully
contribute something to society and help make the lives of millions of visually
challenged people easier.
This app helps blind users in the following way: the user simply says “message”,
the messaging module opens, the app asks to whom the message should be sent, the
user speaks the name or number of the person, and the message is then sent to that
person. Similarly, all other activities for calls, contacts and alarms are performed
by voice alone.

1.4 Project scope


Mobile phones are a vital part of modern life. Many of us need to make a call or
send a message at any time from anywhere. For visually impaired and motion-impaired
people this problem is more obvious, but other people also face it frequently, e.g.,
when driving or using a smartphone in bright sunlight. Even sighted users often find
themselves in situations where non-visual interaction is required.
Handheld devices have been an inescapable part of our lives for nearly ten years,
and of all the mobile platforms, Android is the most sought-after operating system
among the majority of mobile users; an application developed for Android is
therefore likely to enjoy a wide reach. Keep in mind that people with disabilities
have a wide range of accessibility needs and that there are different approaches to
making technology accessible. This project looks at the state of accessibility in
the messaging framework and may provide directions for expanding communication
technologies worldwide. The aim of this project is to help blind people by building
a user-interactive application through which a user can access the system using
voice as input.
Blind people face challenges daily in communicating with the world around them;
they have to depend on their sighted colleagues to make a phone call or access other
mobile functionality. This system is a voice-recognizing application for mobile
phones that allows access to most of the functionality of the phone and makes it
possible for visually impaired people to connect with society. Sighted users and
people with limited reading ability can also use the application when they are
involved in activities that prevent reading (e.g., driving or other eyes-busy
situations).
The application supports voice commands and is developed for visually impaired
people. After the phone is unlocked, the application is launched without any voice
command. The system accepts voice commands and performs the corresponding
operations: it first translates the voice into text and then produces output in the
form of voice.
In this application the user is able to access the services of the smartphone with
voice commands. The user can easily send a message to a recipient in the contact
list, or to a dialed mobile number, by voice command. Physically disabled persons,
or persons with little knowledge of how to use a smartphone, can also access the
phone easily with voice or speech commands.

Chapter 2
Literature Review
Requirements analysis is a software engineering task that bridges the gap between
system-level requirements engineering and software design. It provides the software
designer with a representation of information, interface, function and behavior that
can be translated into data, architectural, interface and component-level designs.
In terms of scope, this undertaking is very useful for visually impaired people.
The goal is to develop a software application that is user friendly, simple, fast
and practical, and in particular to build an easy-to-use application that helps
blind people use a smartphone without someone’s assistance.
The other related applications, their authors’ reviews and their work are as
follows.
Just Speak Enabling Universal Voice Control
[Yu Zhong, T.V. Raman, Casey Burkhardt, Fadi Biadsy and Jeffrey P. Bigham]
JustSpeak: Enabling Universal Voice Control is an Android application. The authors
designed and implemented voice user interfaces in the form of voice navigation,
commands and the launching of other applications, built as an Android accessibility
service on top of Google’s speech recognition. An important feature of JustSpeak is
its support for multiple commands in a single utterance: it saves time to combine
multiple commands into one sentence rather than repeating the whole dialog. The
interaction for launching applications via voice commands is simple and fully
accessible for non-visual use, and the application has been released on the Google
Play Store for free download. However, it provides limited functionality and cannot
handle extra voice commands in a noisy environment, whereas our application handles
different commands even in noisy environments and gives the user better results.
Visually impaired
[Jawahar, 2017]
Visually Impaired is an Android application based on voice commands, developed by
Jawahar in November 2017. Its features allow the user to call any contact and check
recently used applications; other features include switching Bluetooth and Wi-Fi on
or off. However, the application is activated by shaking the phone, and even when
the screen is off it can be activated without the user’s permission. It also lacks
basic facilities such as contacts, messages and alarms. Our application provides all
the basic facilities a mobile user wants, and the user does not have to shake the
phone to start it, since shake activation triggers the application every time and
irritates the user.
Blind communicator
[Leounardo Javier]
Blind Communicator is an open-source launcher that helps blind people use
smartphones and tablets. The launcher has a voice guide that tells the user
everything that is happening on the device (screen off/on, screen rotation,
incoming call, etc.), and it is compatible with TalkBack. Its major feature is the
use of speech commands. However, the application relies on finger gestures (sliding
up, down, left or right) to perform different functions, which is a big drawback
because blind persons get confused about when to slide left or right. Our
application does not use finger gestures; it uses only voice commands to perform
the different functions.
Voice Based System in Desktop and Mobile Devices for Blind People
Voice Based System in Desktop and Mobile Devices for Blind People describes the use
of ASR (automatic speech recognition) and TTS (text-to-speech) to convert speech to
text and vice versa. The work includes the development of text-Braille systems,
screen magnifiers and screen readers. Two web-browser frameworks for the blind were
used to access the Internet, including email (Gmail, where the system reads the
messages in the recipient’s mailbox), RSS (Really Simple Syndication) for news, a
song player, a book reader and a drive browser for searching drives and folders. A
voice-mail architecture helped blind people access email and other multimedia
functions of the operating system easily and efficiently. However, the system mainly
focuses on the mailing system and offers few features for visually impaired persons.
Google Now and Siri
Voice control systems are applications or devices controllable by human voice
commands, with various levels of voice control supported on mobile and computing
devices. The most popular applications are Google Now and Siri. Google Now is a
Google product that supports dialogue interaction between Android and the user; it
supports various functions, mostly controlling the Google products on the phone
(Gmail, Google Drive, YouTube, Google Photos, Google Books, the Play Store, etc.).
Apple introduced Siri as a voice assistant in iOS, using Nuance technology for its
speech recognition. Google Now and Siri are both constrained by the way they handle
commands. Every Siri input is divided into words to check for the presence of a
function keyword, such as “Calendar”; if a function keyword is present, the required
function is carried out, otherwise a Google search is initiated with the utterance
as the web search query. Hence it is difficult for them to integrate third-party
applications. Further, if a task requires multiple instructions, the user has to
repeat them several times; for instance, a task like “send email and disable
internet connection” will not be recognized and has to be broken into two
instructions: “send email” and “disable internet connection”.
Voice Assistant for Visually Impaired
[Nevon projects]
Voice Assistant for Visually Impaired in Android is an innovative system that acts
as a voice assistant for visually impaired people. It gives them access to the most
important features of the phone using custom layouts and speech-to-text, and it
includes custom messaging features as well as a dialer. It can also announce the
current time and location, and it reads out the contents of a message so the user
can check it. Since the system does not use the Internet, the data is saved offline
and is phone dependent: if the phone is formatted or lost, the data is lost. The
application also does not allow adding contacts to the phone, which our application
does; our voice assistant gives a better solution for visually impaired people.
Google TalkBack
[Google, 2017]
TalkBack is an application that is part of Google’s Android Accessibility Service,
developed to help visually impaired people use their cell phones easily. The
application reads text aloud, and the movements of the user are carefully evaluated
and spoken by the app. To enable it, go to Settings, Accessibility, and turn on the
TalkBack service. With this application, a lot of repetitive work such as continuous
tapping and speaking speeds up the drain on the device’s battery, whereas our app
works continuously with normal battery consumption.

Vlingo
[Nuance Communication, 2012]
Vlingo is a free application designed to recognize the user’s voice, and it can be
commanded to perform various functions on the phone. The user holds down the voice
key on the phone or other mobile device and waits for the tone before speaking a
command, or taps once to start and once to stop. From there the app can place a
call, send a text, or connect with other applications on the phone. The speech
recognition only works in English and can connect with only a limited number of
apps. For the visually impaired, a number of touch movements are still needed,
which may make it difficult to use, whereas our application needs only the user’s
voice and does not require extra taps.
Dragon-go
[Nuance Communication, 2013]
Dragon Go! is a voice recognition application whose recognition is more
sophisticated than Vlingo’s. It can be slow at following commands and not all iOS
devices are supported, but its command recognition is excellent. The carousel tabs
it uses cannot be customized, yet it functions more efficiently than Vlingo.
Available only in US English, Dragon Go! is very capable when interacting with
dozens of other apps, but it lacks the ability to talk back to the user. This makes
it a difficult voice recognition tool for the visually impaired, who still need to
be able to read the screen of the phone. For the visually impaired, Dragon Go! is a
viable solution for voice dialing from a cell phone but not much else, unlike our
application; its interface is user friendly, but without the ability to translate
text to voice this feature is of no use to someone who is blind.
Android based voice assistant for blind people
[Marcos Brata]
Android-based voice assistant for blind people gives blind people and people with
low vision simple access to an Android device, as well as access to library
resources on devices on which the application has been installed. This facility can
improve library accessibility for the blind and visually challenged community. The
application still has limitations in reaching users, it is built only for the
Android platform, and its dependence on Google libraries causes difficulties in
supporting the local dialect, which is understood only by the local community.
Voice Over
[Junar Arciete Landicho]
Voice Over is an Android application based on voice commands. Its features allow
the user to call any contact and check recently used applications; other features
include switching Bluetooth and Wi-Fi on or off. However, the application is
activated by shaking the phone, and even when the screen is off it can be activated
without the user’s permission. It does not provide basic facilities such as
contacts, messages and alarms. Our application provides every basic facility a
mobile user needs, and the user does not have to shake the phone to start it, since
shake activation triggers the application every time and annoys the user.
Ideal Accessibility Installer
[Ideal Group, Inc. Android Development Team, 2013]
This application is also called the Platform Access Installer. It contains packages
such as the KickBack, TalkBack, and SoundBack (TKS) applications for visually
impaired and blind people, which add audible, vibration and spoken feedback to
Android devices. However, it is compatible with only some devices and does not work
on many other phones.
Voice Message Sender
[Mega Phone Apps, 2018]
Voice Message Sender writes messages by voice and converts voice into text. Its
main feature, sending messages by voice, creates a safer environment for users who
are multitasking, such as driving while texting. It can also work without the
Internet and supports multiple languages, but its drawback is that it cannot delete
previous messages.
Voice Call Dialer
[Delux App Zone, Dec 16, 2017]
Voice Call Dialer is an easy and smart contacts dialer that helps with voice
dialing and making calls. It is a simple application that enables voice dialing and
calls on Android smartphones, and it can directly send a message, add a number to
contacts, or search contacts. However, its user interface is very complex and
difficult to understand, especially for visually impaired persons.

However, these applications are not specifically designed for visually impaired
people. Not all of their features are useful for visually impaired people, and it
takes time and skill to figure the features out; it is very difficult for visually
impaired people to use the above-mentioned applications, as none of them offers a
voice-assisted approach.
The present personal assistants are incapable of providing adequate assistance to
visually impaired people, so there is a need to develop a smart personal assistant
that is equally assistive for them. Small computing devices have largely replaced
the big computers once required for large and smart computations; these mini
computers are portable, support many sensors, connect to the Internet, provide
cloud support and offer many more services. Implementing extra intelligence in the
system will dramatically reduce the required human-computer interaction (HCI).
In this project we explore the automation and planning capabilities of the agent
and highlight the requirements on external data and knowledge sources. Hence we
present a smart personal assistant that provides audible assistance: the visually
impaired person’s audible helpmate can serve as an interface to the digital world,
making the consumption of information timely and efficient for the user’s specific
tasks. The objective of the project is to design a personal assistant that
understands the semantics of a task, identifies the task correctly, and makes
appropriate decisions with little human-computer interaction.

Chapter 3

Requirement Specifications
This project is concerned with the advanced use of voice recognition; as we know,
the use of mobile phones has increased greatly in recent years. Today, the roughly
6 billion mobile phones in the world see an increase of around 6 million users
every month. We have devised new ways to facilitate visually impaired persons so
that they can use mobile phones on their own without depending on anybody. All the
features of the application are guided through voice-based commands so that it is
easier for visually impaired people to use, and the user can start the speech
recognition process by tapping anywhere on the application screen. To improve
visually impaired people’s independence, this Android application is developed to
assist them with their daily mobile activities, providing multiple features that
the visually impaired user can operate through voice.
3.1 Existing System
The existing systems consist of many elements and attributes different from this
project. For example, they provide user interfaces that are inconvenient for users,
who cannot tell where to tap to perform an action, and some existing projects offer
only limited features.

There are many Android applications built for the visually challenged, but most do
not address the basic problems they face. The applications, though good in their
approach, do not cater to the basic needs of a blind person. After analysis of the
various applications available for the visually challenged, the following
conclusions can be drawn:

1. Most of the available applications are built for navigation purposes, for
example Walky-Talky, Explorer and Intersection. These applications cannot help the
user with basic mobile phone features such as calling and messaging.

2. Some other applications have calling and messaging features, for example Mobile
Accessibility, but these take voice as input and are not very efficient for the
Pakistani English accent. Applications like Vlingo Virtual Assistant and Speaktoit
are based on voice support, which makes it difficult for the application to
understand the accents of many users.

3. Applications like VOICe for Android, which is also meant for the visually
challenged, are universal translators that map images to sounds.

However, these applications are not specifically designed for visually impaired
people. Not all of their features are useful for visually impaired people, and it
takes some time and skill to figure out how they work. It is very difficult for
visually impaired people to use the above-mentioned applications, as none of them
offers a voice-assisted approach.

All the above-mentioned reasons are the primary motivation for developing a
dedicated application that helps visually impaired people with their daily mobile
usage needs. The projects and applications mentioned above do not address the
connectivity problem faced by the visually challenged, who are therefore unable to
perform basic mobile operations such as calling and messaging. This provokes the
need for a new application that enables people to use the basic operations of the
mobile phone while keeping in mind the needs of the vocally and visually
challenged. This research is motivated by prior attempts to create accessible
touch-screen user interfaces for blind people. The basic objective of this project
is to overcome the limitations of the applications stated above and to help
visually challenged users stay connected with the world. To achieve this objective
the following features have been implemented: speed-dial calling, messaging,
location retrieval, battery alerts, and current time and date.

As noted earlier, the present personal assistants are incapable of providing
adequate assistance to visually impaired people, and there is a need to develop a
smart personal assistant that is equally assistive for them: one that acts as an
interface to the digital world, understands the semantics of a task, identifies the
task correctly, and makes appropriate decisions with little human-computer
interaction.

This application, by contrast, provides a very easy-to-use interface with all the
basic features. The application has a single centered button: the user taps the
center of the screen and the voice system guides the user through the next steps.
The main objective of this project is to develop a user-friendly application that
facilitates visually impaired persons and helps improve their confidence in using
smartphones without depending on anyone.
3.2 Proposed System
“Visually Impaired Person’s Audible Helpmate” is an Android application which
supports voice commands and is developed for visually impaired people. After the
mobile phone is unlocked, the application is launched without any voice command.
The system accepts voice commands and performs the corresponding operations: it
first translates the voice into text and then produces output in the form of voice.
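As a rough illustration of this voice-in, voice-out loop, the sketch below uses the
standard Android RecognizerIntent and TextToSpeech APIs to capture one spoken
command as text and speak a confirmation back. The platform classes are real; the
activity structure and utterance strings are illustrative assumptions, not the
project's actual source code.

    import android.app.Activity;
    import android.content.Intent;
    import android.os.Bundle;
    import android.speech.RecognizerIntent;
    import android.speech.tts.TextToSpeech;
    import java.util.Locale;

    public class MainActivity extends Activity implements TextToSpeech.OnInitListener {

        private static final int REQ_SPEECH = 100;
        private TextToSpeech tts;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            tts = new TextToSpeech(this, this);   // initialise the speech output engine
        }

        @Override
        public void onInit(int status) {
            if (status == TextToSpeech.SUCCESS) {
                tts.setLanguage(Locale.US);       // language used for voice feedback
            }
        }

        // Called when the user taps anywhere on the screen.
        private void startListening() {
            Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
            intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                    RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
            intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Speak a command");
            startActivityForResult(intent, REQ_SPEECH);
        }

        @Override
        protected void onActivityResult(int requestCode, int resultCode, Intent data) {
            super.onActivityResult(requestCode, resultCode, data);
            if (requestCode == REQ_SPEECH && resultCode == RESULT_OK && data != null) {
                // The recognizer returns candidate transcriptions; take the best one.
                String command = data.getStringArrayListExtra(
                        RecognizerIntent.EXTRA_RESULTS).get(0);
                tts.speak("You said " + command, TextToSpeech.QUEUE_FLUSH, null, "reply");
            }
        }
    }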

This project demonstrates the idea of a messaging and calling system for visually
impaired people, allowing environmental barriers to be removed for people with a
wide range of disabilities. In this project we present the system design and use
cases of the application “Visually Impaired Audible Helpmate”, a universal
voice-control assistant for the Android operating system. The application provides
enhancements to the applications running on the mobile system, and it can benefit a
large number of users with universal eyes-free and hands-free voice control of
their mobile devices.

It performs basic functions such as calling, messaging and contact operations
(add, show, delete). Previously, visually impaired people needed to operate the
phone keys manually by remembering the position of the keys, but with this
application they operate the phone by voice command alone. The visually impaired
person’s audible assistant helps them perform the basic functions of the phone
without being dependent on anyone, and it spares them from pressing buttons or
remembering button positions.

The purpose of this research project is to create an application that enables the
visually challenged to use some basic features of mobile phones, thus making their
lives a bit simpler. The research is devoted to finding an algorithm that requires
less time for pattern recognition and is efficient; using multiple algorithms can
help improve efficiency. The reason for using Android-based mobile phones is that
Android is an open-source technology nowadays. This project is dedicated to the
millions of differently-abled people whom the world has wrongly labeled as
disabled, and it will hopefully contribute something to society and help make the
lives of millions of visually challenged people easier.
Through this application, dialing any of ten pre-saved phone numbers is
implemented, and the user can also send any of ten pre-saved message templates to
these numbers. To use the calling and messaging features, the user enters the
calling/messaging module by giving voice commands on the main screen; he or she
then taps the center of the touch screen, and the application asks the user to
speak the number for the desired activity. The application recognizes this number
and asks the user whether to make a call or send a message. To make a call, the
user gives a voice command; the application asks for confirmation by shaking the
mobile (Call) or pressing the right button (End). To send a message, the user says
keywords related to the message. The device can also respond via audio with its
current date and time. Similarly, the user can access the alarm module of the
mobile phone and set an alarm.
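The sketch below illustrates one possible shape of the speed-dial flow just
described: a spoken digit selects one of the pre-saved numbers, and the next
utterance decides between calling and messaging. The pre-saved numbers, the
helper methods speak() and placeCall(), and the class name are all illustrative
assumptions, not the actual implementation.

    import java.util.Locale;

    // Illustrative dispatcher for the speed-dial flow; speak() and placeCall()
    // stand in for the TTS output and calling code sketched elsewhere.
    class SpeedDialFlow {
        private static final String[] SPEED_DIAL = {
                "03001234567", "03007654321" /* ... up to ten pre-saved numbers ... */ };
        private String pendingNumber;

        void onCommandRecognized(String spokenText) {
            String text = spokenText.toLowerCase(Locale.ROOT).trim();
            if (text.matches("\\d")) {                      // the user spoke a slot number, e.g. "3"
                int slot = Integer.parseInt(text);
                if (slot < SPEED_DIAL.length) {
                    pendingNumber = SPEED_DIAL[slot];
                    speak("Do you want to call or message " + pendingNumber + "?");
                }
            } else if (text.contains("call") && pendingNumber != null) {
                speak("Calling " + pendingNumber);
                placeCall(pendingNumber);
            } else if (text.contains("message") && pendingNumber != null) {
                speak("Which message template should I send?");
            } else {
                speak("Sorry, I did not understand. Please repeat.");
            }
        }

        private void speak(String s)       { /* forward to the TextToSpeech engine */ }
        private void placeCall(String num) { /* forward to the calling intent (Chapter 4) */ }
    }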

3.3 Requirement Specifications

This project relates to the advanced use of voice recognition; as we know, the
usage of mobile phones has increased in recent years. Today, the roughly 6 billion
mobile phones in the world see an increase of around 6 million users every month.
We have devised new ways to facilitate visually impaired persons so that they can
use mobile phones on their own without depending on anyone. All the features of the
application are guided through voice-based commands, so that it is easier for
visually impaired people to use the application, and the user can start the speech
recognition process by tapping anywhere on the application screen. To enhance
visually impaired people’s independence, this Android application is developed to
help them with their daily mobile activities; it offers multiple features that the
visually impaired user can operate through voice. The functional and non-functional
requirements of the application are described below:

Functional Requirements

3.3.1 Use-Case Application Launch

Identifier: Application Launch

Purpose: First interaction of the user with the system

Priority: High

Pre-conditions: The user clicks the application icon to open the application and is
then instructed by the system.

Post-conditions: After successful launching, the system allows the user to use the
main screen. A splash screen is shown.

Typical Course of Action

1. Actor: This use case begins when the user clicks the application icon to open
   the application.
   System: Responds with a splash screen and then the main screen.
2. Actor: The user starts the application.
   System: Shows a splash screen and then the main screen with all main section
   icons and names, to proceed to the next step.
3. Actor: The user enters the application and taps or presses a section icon.
TABLE 1

3.3.2 Use-Case 2

Identifier: Entering Pre-Requisite Sections

Purpose: Entering basic sections of the application such as Call, Contact, Message,
Alarm and Event.

Priority: High

Typical Course of Action

1. Actor: This use case begins when the user wants to enter any section of the
   application.
   System: A screen is displayed with a mic in the center of the phone; the user
   taps it and the application asks the user to speak to go to any activity.
2. Actor: The user starts the application.
3. Actor: The user taps any section or item.
4. Actor: The user selects one of the available sections of the application.
   System: The section selected by the user is displayed.
5. Actor: The user enters the newly selected section.
TABLE 2

3.3.3 Use-Case 3

Identifier: Running

Purpose: Viewing the current state of the desired section of the system. For
example, if the user selects the Contact section, it shows all contacts and the
user can delete or add a contact.

Priority: High

Typical Course of Action

1. Actor: This use case begins when the user wants to enter a specific part of the
   application.
   System: A menu appears; for example, if the user selected the Contact section,
   it shows all contacts with their names and phone numbers.
2. Actor: The user starts the application.
3. Actor: The user taps the Contact section.
   System: The required section is displayed and the user can add or delete
   contacts.
4. Actor: The user selects one of the available sections provided by the system.
5. Actor: The user is prompted to enter the next part of the application if he or
   she wants to move on and learn more about the application.
   System: The required section is displayed as designed, keeping the requirements
   in view.
TABLE 3

Non-Functional Requirements
3.3.4 Performance Requirements
• The product is developed for a client-server architecture; various clients can
simultaneously make different requests.
• Actions are performed on the user’s request and according to the user’s demand.
• The user can move to the next section by voice commands alone.
• Sections can be revisited after the first visit.
• The application can be operated daily.
• Operations are performed by giving accurate voice commands.
• Proper backup and recovery procedures are followed.
• The application provides various functions through the phone that help people
carry out their daily activities and tasks.
• Blind people and people with low eyesight only have to operate their phones
through their voice.
• Voice commands are received by the software, processed according to the task
requirements of the particular user, and the result is returned to the user as
voice output.
• Users need less effort to perform tasks they would otherwise do by hand, and the
possibility of errors or bugs while communicating with the phone is very low.
• The application provides step-by-step guidance with voice feedback to get the
user from a location to a selected destination.
• All features of the application are guided through voice-based commands, so that
they are easier for visually impaired people to use.

3.3.5 Safety Requirements


Backups of the database must be a regular process, executed on a daily basis or
before any recovery or update operation.
Security Requirements
All features of the product are secured: only privileged users are allowed to make
changes in the existing data. Critical entries, such as the return of new items,
changing the sections of system users, or opening new heads, categories and items,
are assigned to privileged users. Automatic backups of the database are carried out
in two ways, incremental and complete, and are performed by privileged users. An
item that is to be deleted is first copied into the backup database or table and
then removed from the database. When an item is deleted, modified or returned by a
user, its log information is stored, i.e., the name of the user, the time and date
of the change, and the section from which it was made.

3.4 Use case

Use cases depend on having at least a partial understanding of the requirements of
the system. They are used to understand deeply the working of the system and its
related activities. Use cases are usually referred to as behavior diagrams that
describe a set of actions (use cases) that a system or systems (the subject) should
or can perform in collaboration with one or more external users of the system
(actors). A use case is simply a narrative document that describes the sequence of
events of an actor using a system to complete a process. Use cases are described as
stories or cases of using a system, and they illustrate and imply requirements
through those stories.

3.4.1 Use case and domain processes


A use case describes a process. A process describes, from start to finish, all the
events, actions, responses and transactions required for an actor to produce or
complete something:

• Start a system/user action
• Perform the user’s actions
• Select the main section, category and items from the given list
• Move to the user’s required section
• Perform the items as per the user’s request

Use Case: Start Application

Actors: Application Manager/Developer


Purpose: Start the application; enter the given information, categories, items and
sections.

TYPICAL COURSE OF EVENTS

1. (Actor) This use case begins when the user wants to enter any section of the
   application.
2. (Actor) The user starts the application.
3. (System) A splash screen is displayed with animation, followed by the main menu
   screen.
4. (Actor) The user speaks to open sections and sub-sections.
5. (System) The menu screen is displayed for the section newly selected by the
   user.

Figure 3.1: Start application (use case diagram: user, splash screen fade in and
fade out)

Use Case: Entering Pre-Requisite Sections

Actors: User, Application Manager/Developer

Purpose: Entering basic sections of the application such as Phone and Contact.

A screen is displayed with sections and sub-sections; the section selected by the
user is then displayed.

TYPICAL COURSE OF EVENTS

1. (Actor) This use case begins when the user wants to enter any section of the
   application.
2. (Actor) The user starts the application.
3. (Actor) The user taps the mic in the center and speaks to enter a specific menu
   or section.
4. (Actor) The user enters the new section as selected.
5. (System) The user may select one of the available sections of the application,
   such as Phone, Contact, Message, Alarm, Event or Weather.
6. (System) A screen is displayed with the main sections.
7. (System) The section selected by the user is displayed.
Figure 3.2: Pre-requisite sections (use case diagram: user with the Phone, Contact,
Message, Alarm and Event sections)

Use Case: Phone Section

Actors: User

Purpose: The Phone section contains call log, make call, receive call and end call.

TYPICAL COURSE OF EVENTS

1. (Actor) This use case begins when the user wants to enter this specific part of
   the application.
2. (Actor) The user starts the application.
3. (Actor) The user taps the mic.
4. (System) The user speaks into the mic to reach one of the available sections
   provided by the application.
5. (Actor) The user makes a selection; further sub-sections are available to enter.
6. (System) A menu appears with call log, make call, receive call and end call.

Figure 3.3: Phone section (use case diagram: user with call log, make call, receive
call and end call)

Use Case: Contact

Actor: User

Purpose: The user can find contacts, save contact, delete contact

TYPICAL COURSE OF EVENTS

1. (Actor) This use case begins when the user wants to enter the Contact section of
   the application.
2. (Actor) The user starts the application.
3. (Actor) The user taps the mic and says “Contact” to open the Contact section.
4. (System) The Contact section provided by the application opens.
5. (Actor) The user is prompted to enter the next part of the application if he or
   she wants to move on and learn more about the application.
6. (System) The section displays options asking whether the user wants to find a
   contact, add a contact or delete a contact.

Figure 3.4: Contact section (use case diagram: user with find contact, add contact
and delete contact)


Use Case: Message Section
Actor: User

Purpose: In the Message section the user can read, send and delete messages.

TYPICAL COURSE OF EVENTS

1. (Actor) This use case begins when the user wants to enter the Message section of
   the application.
2. (Actor) The user starts the application.
3. (Actor) The user taps the mic.
4. (System) The user speaks into the mic to open the Message section.
5. (Actor) The user selects the Message section provided by the application.
6. (System) The user can send a new message, delete any previous message, or have
   the application read a received message aloud.

Figure 3.5: Message section (use case diagram: user with send message, delete
message and receive message)


Use Case: Alarm Section

Actor: User

Purpose: The user can trigger an alarm, set an alarm, and stop an alarm.

TYPICAL COURSE OF EVENTS

1. (Actor) This use case begins when the user wants to enter the Alarm section of
   the application.
2. (Actor) The user starts the application.
3. (Actor) The user taps the mic.
4. (System) The user speaks into the mic to open the Alarm section of the
   application.
5. (Actor) The user is prompted to enter the next part of the application if he or
   she wants to move on and learn more about the application.
6. (System) The user can trigger an alarm, set up a new alarm, or stop the alarm.

Figure 3.6: Alarm section (use case diagram: user with trigger alarm, set-up alarm
and stop alarm)
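As a hedged illustration of how the Alarm section could hand a recognized time over
to the system clock, Android provides the AlarmClock.ACTION_SET_ALARM intent. The
sketch below assumes the hour and minute have already been parsed from the user's
speech; the helper class name is illustrative, not the project's actual code.

    import android.content.Context;
    import android.content.Intent;
    import android.provider.AlarmClock;

    // Illustrative helper: hands a recognized time over to the system alarm clock.
    // Requires the com.android.alarm.permission.SET_ALARM permission in the manifest.
    class AlarmHelper {
        static void setAlarm(Context ctx, int hour, int minute) {
            Intent intent = new Intent(AlarmClock.ACTION_SET_ALARM)
                    .putExtra(AlarmClock.EXTRA_HOUR, hour)        // 24-hour format
                    .putExtra(AlarmClock.EXTRA_MINUTES, minute)
                    .putExtra(AlarmClock.EXTRA_MESSAGE, "Set by voice command")
                    .putExtra(AlarmClock.EXTRA_SKIP_UI, true);    // do not open the clock UI
            intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
            ctx.startActivity(intent);
        }
    }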


Use Case: Event Section

Actor: User

Purpose: The user can save a new event or delete an event.

TYPICAL COURSE OF EVENTS

1. (Actor) This use case begins when the user wants to enter the Event section of
   the application.
2. (Actor) The user starts the application.
3. (Actor) The user taps the mic and says “calendar”.
4. (System) The user enters the Events section provided by the application.
5. (Actor) The user is prompted to enter the next part of the application if he or
   she wants to move on and learn more about the application.
6. (System) The user can save a new event with a sound, or delete any previously
   created event.

Figure 3.7: Event section (use case diagram: user with calendar, save new event and
delete event)


Chapter 4

Design
In systems design, the design functions and operations are described in detail,
including screen layouts, process diagrams and other documentation. The output of
this stage describes the new system as a collection of modules or subsystems.
Design elements describe the desired system features in detail and generally
include functional hierarchy diagrams, screen layout diagrams, tables of process
diagrams, pseudo-code, and a complete entity-relationship diagram with a full data
dictionary. These design elements are intended to describe the system in sufficient
detail that skilled developers and engineers may develop and deliver the system
with minimal additional input.

4.1 System Architecture

Figure 4.1 System Architecture


4.2 Design Constraints
We chose the Android Studio 3.1.4 framework because everything is clear and it is
easy to use; anyone can create an application with it easily, and it is a
user-friendly tool for application development.
Android Studio is the official Integrated Development Environment (IDE) for Android
app development, based on IntelliJ IDEA [4]. Many features are available, such as a
fast and feature-rich emulator (providing virtual devices for testing applications
without the need for an actual Android device), a flexible Gradle-based build
system, an Instant Run function for pushing changes without building a new APK,
GitHub integration, and more.
4.3 Design Methodology
There are some main modules that are used in the application.

GUI: It is used to interact with the user. GUI reflects the basic appearance of the
application.

Voice Recognition: Voice recognition for the intelligent voice assistant
application is done using the Google server. This process involves the conversion
of acoustic speech into a set of words and is performed by a software component.
The accuracy of speech recognition systems differs with vocabulary size and
confusability, the modality of speech (isolated, discontinuous or continuous
speech; read or spontaneous speech), and task and language constraints. The
recognizer consists of five modules: feature extraction, phone model training,
dictionary preparation, grammar estimation, and sentence decoding.
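A minimal sketch of how such a recognition module might be wired up on Android,
assuming the standard SpeechRecognizer API (which forwards audio to Google's
recognition service). The wrapper class and the onCommand() callback are
illustrative assumptions; only the best transcription is extracted here.

    import android.content.Context;
    import android.content.Intent;
    import android.os.Bundle;
    import android.speech.RecognitionListener;
    import android.speech.RecognizerIntent;
    import android.speech.SpeechRecognizer;
    import java.util.ArrayList;

    // Illustrative wrapper around Android's SpeechRecognizer.
    // Requires the RECORD_AUDIO permission; onCommand() is a hypothetical callback.
    class VoiceRecognitionModule {
        private final SpeechRecognizer recognizer;

        VoiceRecognitionModule(Context ctx) {
            recognizer = SpeechRecognizer.createSpeechRecognizer(ctx);
            recognizer.setRecognitionListener(new RecognitionListener() {
                @Override public void onResults(Bundle results) {
                    ArrayList<String> matches =
                            results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
                    if (matches != null && !matches.isEmpty()) {
                        onCommand(matches.get(0));            // best transcription
                    }
                }
                @Override public void onError(int error) { /* restart listening or report */ }
                // Remaining callbacks left empty for brevity.
                @Override public void onReadyForSpeech(Bundle params) {}
                @Override public void onBeginningOfSpeech() {}
                @Override public void onRmsChanged(float rmsdB) {}
                @Override public void onBufferReceived(byte[] buffer) {}
                @Override public void onEndOfSpeech() {}
                @Override public void onPartialResults(Bundle partialResults) {}
                @Override public void onEvent(int eventType, Bundle params) {}
            });
        }

        void startListening() {
            Intent i = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
            i.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                    RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
            recognizer.startListening(i);
        }

        void onCommand(String text) { /* forwarded to the Voice Input Manager */ }
    }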

TTS Engine: A text-to-speech (TTS) system converts normal language text into
speech. Synthesized speech can be created by concatenating pieces of recorded
speech that are stored in a database, and the output is given in the form of
speech.
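As an illustration, the TTS module can be built on Android's standard TextToSpeech
class. The wrapper below is a sketch, assuming US English output; the class and
method names other than the platform API are illustrative.

    import android.content.Context;
    import android.speech.tts.TextToSpeech;
    import java.util.Locale;

    // Illustrative TTS wrapper: converts a response string into spoken output.
    class SpeechOutputModule {
        private TextToSpeech tts;

        SpeechOutputModule(Context ctx) {
            tts = new TextToSpeech(ctx, status -> {
                if (status == TextToSpeech.SUCCESS) {
                    tts.setLanguage(Locale.US);    // language of the spoken feedback
                }
            });
        }

        void say(String text) {
            tts.speak(text, TextToSpeech.QUEUE_FLUSH, null, "utteranceId");
        }

        void shutdown() {
            tts.shutdown();                        // release the engine when done
        }
    }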

Voice Input Manager: It manages the command given by the user and sends the input
to the Database Manager.

Database Manager: It compares the user’s input, which is in the form of voice, with
the database containing the vocabulary of words, and sends the response to the
Action Performer.
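A toy sketch of the comparison step described above: the transcribed speech is
matched against a keyword vocabulary and the resulting action name is handed to the
Action Performer. The vocabulary entries and action names are made up for
illustration; the real application may store and match them differently.

    import java.util.HashMap;
    import java.util.Locale;
    import java.util.Map;

    // Illustrative "database manager": maps recognized keywords to actions.
    class CommandMatcher {
        private final Map<String, String> vocabulary = new HashMap<>();

        CommandMatcher() {
            vocabulary.put("call",     "ACTION_CALL");
            vocabulary.put("message",  "ACTION_SMS");
            vocabulary.put("contact",  "ACTION_CONTACTS");
            vocabulary.put("alarm",    "ACTION_ALARM");
            vocabulary.put("calendar", "ACTION_EVENT");
        }

        // Returns the action keyword found in the transcribed speech, or null if none matches.
        String match(String transcript) {
            String text = transcript.toLowerCase(Locale.ROOT);
            for (Map.Entry<String, String> entry : vocabulary.entrySet()) {
                if (text.contains(entry.getKey())) {
                    return entry.getValue();       // handed to the Action Performer
                }
            }
            return null;
        }
    }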

Action Performer: It takes response from the database manager as Input and
decides which action should be performed. Action can be in the form of text
message or call.
1. Text message: Users are able to send an SMS to a specific person in the phonebook by giving a correct command that contains the messaging request keywords. The message should be sent to the destination immediately.

2. Calling service: The application allows users to make a call to a person in the contacts, or by speaking the mobile number of the person the user wants to call. By giving a correct command with a calling request for a stored person, the Android phone should successfully dial the number of the person requested.

4.4 High Level Design


Conceptual or Logical:

A data flow diagram (DFD) is a graphical representation of the "flow" of data


through an information system, modelling its process aspects. A DFD is often used
as a preliminary step to create an overview of the system, which can later be
elaborated.

The user gives input in the form of voice, and this voice command is recognized by the application. The command given is compared with the database, and the action is then performed as per the command.

Figure 4.2

Input is given by the user in the form of voice. Using the microphone, the voice is captured as binary data. The Google Voice API converts this voice data into text, and the action is then performed according to the user's command by comparing it with the database.

Figure 4.3

Process:

The activity diagram shows that when a user opens this app, the flow is as follows:

 The user enters the application and a splash screen appears first.
 Then the main menu screen is displayed.
 The application asks the user to speak to open an activity (e.g. call, message, alarm).
 The user's command is recognized and the targeted section opens for the user.
Figure 4.4

Physical:

After the application launches successfully, the mic starts and waits for the user to speak. When the mic receives the data, it converts it into text format. The text is then matched with the action that is to be performed, and the action is carried out.
Figure 4.5

Module:

A code is a group of characters and/or digits that identifies and describes something in the application. Codes serve many useful purposes. Because codes are often shorter than the data they represent, they save storage space and costs, reduce transmission time, and decrease data entry time. Finally, codes can reduce data input errors in situations where the coded data is easier to remember and enter than the original source data, where only certain valid codes are allowed, and where something within the code itself provides immediate verification that the code is correct.

Because users must deal with the coded data, the coding methods must be acceptable to them. Application analysts are frequently charged with analyzing and defining coding schemes.

There are some major modules that are used in this application.

Graphical User Interface: It is used to interact with the user. The GUI reflects the basic appearance of the application.

Voice Recognition: Speech recognition for the voice assistant is done using the Google server. This process involves the conversion of acoustic speech into a set of words and is performed by a software component. The accuracy of speech recognition systems varies with vocabulary size and confusability, the modality of speech (isolated, discontinuous, or continuous; read or spontaneous), and task and language constraints. The recognizer consists of five modules: feature extraction, phone model training, dictionary preparation, grammar estimation, and sentence decoding.

Text-To-Speech Engine: A text-to-speech (TTS) system converts normal language text into speech. Synthesized speech can be created by concatenating pieces of recorded speech stored in a database. The output is given in the form of speech.

Voice Input Manager: It manages the command given by the user and sends the user's input to the Database Manager.

Database Manager: It compares the user's input, which is in the form of voice, with the database that contains the vocabulary of words, and sends the response to the Action Performer.

Action Performer: It takes the response from the Database Manager as input and decides which action should be performed. The action can be a text message or a call.

1. Text message: Users can send an SMS to a specific person in the phonebook by giving a correct command that contains the messaging request keywords. The message should be sent to the destination immediately.

2. Calling service: The application allows users to make a call to a person in the contacts, or by speaking the mobile number of the person the user wants to call. By giving a correct command with a calling request for a stored person, the Android phone should successfully dial the number of the person requested.

Security:

All features of the product are secured, and only privileged users are allowed to make changes to the existing data. Critical kinds of entries, such as the return of new items, changing the system users, or opening new Heads, Categories and Items, are assigned to privileged users. Automatic backup of the database is performed in two different ways, incremental and complete backup, and is carried out by privileged users. An item that is to be deleted is first copied into the backup database or table and then removed from the database. When an item is deleted, modified or restored by a user, its log information is stored, for example the name of the user and the time and date of the operation, together with the section from which it was performed.

4.5 Low Level Design


A class diagram is a static structure diagram; it is used to describe the structure of the system. The structure of the whole system is described using different classes. The components of a class are its attributes, its methods and the relationships among classes. Some of the relationships between classes are inheritance, association and aggregation. A class is drawn as a box with three sections: the top section gives the name of the class, the second gives the attributes, and the last one lists the methods of the class. A class is a way to represent an object, and an object can be a person, a thing or even an event that can occur in the system.
Figure 4.6 is the class diagram of the Visually Impaired Audible Helpmate application. The class diagram contains all the major functionalities in its classes. The major classes present in this application are the User class, the Message class and the Speech Input class. All of these extend the base Activity class provided by the Android OS by default.

VIAH app
Figure 4.6 Class Diagram


4.5.1 Entities
The main entities are Call, Message, Contact, Alarm and Event; they help a blind person make a call, save a contact or set an alarm. All of these entities are categorized into the following sections:

Sr.# 1: Visually Impaired Audible Helpmate (head section)
 Sub-menus: Call, Message, Contact, Alarm, Event
 Description: The user can make a call, send a message or set an alarm using voice.

Sr.# 2: Call
 Sub-menus: Make Call
 Description: It makes a call using the user's voice.

Sr.# 3: Message
 Sub-menus: Send Message
 Description: It sends a message to the contact selected by the user.

Sr.# 4: Alarm
 Sub-menus: Set Alarm, Delete Alarm, Stop Alarm
 Description: It sets an alarm at the time specified by the user.

Sr.# 5: Contact
 Sub-menus: Save Contact, Delete Contact
 Description: It includes saving and deleting a contact number.

Sr.# 6: Event
 Sub-menus: Calendar, Set Event
 Description: It includes the calendar and the setting of events.

Sr.# 7: New Section / Category / Item
 Description: Created dynamically at run time.
4.5.2 Entity Diagrams

Figure 4.7: first screen

Figure 4.8: main menu


Figure 4.9: Call Section

Figure 4.10: Message section

Figure 4.11: Contact section


Figure 4.12: Alarm section

Figure 4.13: Event section


4.6 GUI Design
The graphical user interface is used to interact with the user and reflects the basic appearance of the application. The following section gives an overview of the application's user interface, i.e. where the user interacts with the application. The overall flow of the application and its modules is explained in this section.
4.6.1 Splash Screen
The first screen that appears just after launching the application is the splash screen. This splash screen appears for a few seconds and then directs the user to the Home Screen.

Figure 4.14 Splash Screen


4.6.2 Home Screen
The home screen is the second screen that the user visits upon opening the application. In this screen, the application features are provided as a list of options to the user. After the application opens, the speech recognition feature runs in the background and listens for the user's voice to determine which option has been selected.

Figure 4.15 Home Screen


4.6.3 Message

This message feature is invoked when the user responds with the word "Message" to the speech recognition running in the background of the home screen. In the screen below, the user first says "message" on the home screen; the Message activity then opens and asks the user to speak the phone number and the message to be sent.

Figure 4.16 Message Screen

When the user speaks the number and the message, they are written on the screen above; the application then asks the user to tap the screen, and the message is sent to the receiver.

Figure 4.17 Message Sending Screen


4.6.4 Call Service

The application should allow the users to make a call to the person in the contacts
or by saying mobile number of the person to whom user wants to call. By giving a
correct command with the calling request to a stored person, the Android phone
should successfully direct to the number of the person requested.

This Call feature is invoked when the user responds with the word "call" to the speech recognition running in the background of the home screen.

In the screen below, the user first says "call" on the home screen; the Call activity then opens and asks the user to speak the phone number to make the call.

Figure 4.18 Call Screen


4.6.5 Alarm Service

This Alarm feature is invoked when the user responds with the word "Alarm" to the speech recognition running in the background of the home screen.

In the screen below, the user first says "Alarm" on the home screen; the Alarm activity then opens and asks the user to speak the time, and the alarm is set by the user's voice command.

Figure 4.19 Alarm Screen


4.6.6 Event Service

This Event feature is invoked when the user responds with the word "Event" to the speech recognition running in the background of the home screen.

In the screen below, the user first says "Event" on the home screen; the Event activity then opens and asks the user to speak the date and time, and the event is set by the user's voice command.

Figure 4.20 Event Screen


Chapter 5

System Implementation

5.1 System Architecture

The Visually Impaired Audible Helpmate application is built on the Android platform, which consists of five layers: the Linux kernel, system libraries, the Android runtime, the application framework and the applications. This is shown in Figure 5.1.

A. Linux Kernel

Android relies on the Linux 2.6 kernel. It provides the core services: security, memory management, process management, the network stack and the driver model. The kernel acts as an abstraction layer between the hardware and the rest of the software in the system.

B. Library and Android Runtime

Android includes a set of C/C++ libraries used by various components of the Android system; their functions are exposed to developers through the Android application framework. Android's core libraries provide most of the functionality of the Java class libraries. Every Android application runs in its own process with its own instance of the Dalvik virtual machine, and multiple virtual machines can run efficiently on the same device.

C. Application Framework

The upper-level core applications of the Android system are developed on top of the framework APIs. The application architecture simplifies the reuse of components: any application can publish its own capabilities, and these capabilities can be used by any other application (subject, of course, to the security constraints enforced by the framework). The same reuse mechanism also allows components to be replaced.

D. Applications

Android applications are written in the Java programming language. The Android SDK tools compile the code, along with any data and resource files, into an archive file with an .apk suffix. All the code in a single .apk file is considered one application, and this is the file that Android-powered devices use to install the application. The Android platform includes a set of core applications by default, such as home, browser, communication services, contacts and other applications, which are also written in Java.

Fig. 5.1 Android System Architecture


5.1.2 Setting Application Permissions and Libraries

Figure 5.2 shows the permissions that are set in the Android Manifest file. The application needs permissions to place phone calls, send messages and create calendar events. It also requires permissions to read contacts, read messages and read the phone state.

Figure 5.2 Permissions
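Since the manifest listing itself appears only in Figure 5.2, the sketch below is a rough illustration (in Java, and only an assumption about how it could be done, not the project's actual code) of how the same dangerous permissions can additionally be checked and requested at run time on Android 6.0 and above. The permission list simply mirrors the permissions named above, the request code is arbitrary, and the method is assumed to live in the main activity.

// Sketch only: checking and requesting at run time the permissions named above.
// Uses android.Manifest, android.content.pm.PackageManager and the
// androidx.core ActivityCompat / ContextCompat helpers inside the main Activity.
private static final int PERMISSION_REQUEST_CODE = 100;  // arbitrary value
private static final String[] APP_PERMISSIONS = {
        Manifest.permission.CALL_PHONE,
        Manifest.permission.SEND_SMS,
        Manifest.permission.RECEIVE_SMS,
        Manifest.permission.READ_CONTACTS,
        Manifest.permission.READ_PHONE_STATE,
        Manifest.permission.WRITE_CALENDAR
};

private void requestAppPermissions() {
    for (String permission : APP_PERMISSIONS) {
        if (ContextCompat.checkSelfPermission(this, permission)
                != PackageManager.PERMISSION_GRANTED) {
            ActivityCompat.requestPermissions(this, APP_PERMISSIONS, PERMISSION_REQUEST_CODE);
            return;  // ask once for the whole list and wait for the callback
        }
    }
}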


5.1.3 Consuming speech to text

Figure 5.3 shows the speech-to-text method, which is used to fetch the command from the user and open the activity the user wants. This method converts the user's voice into text.

Figure 5.3. speech recognition

The method below is called right after the speech recognizer finishes and returns the recognized voice as a string.

Figure 5.4 speech recognition
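Because the original listing appears only in the figure, the following is a minimal, self-contained sketch of this pattern using Android's standard RecognizerIntent. It is an illustration under the assumptions stated in the comments, not the project's exact code, and the class and method names are ours.

// Sketch: launching Google speech recognition and receiving the result as text.
import android.content.Intent;
import android.speech.RecognizerIntent;
import androidx.appcompat.app.AppCompatActivity;
import java.util.ArrayList;
import java.util.Locale;

public class SpeechDemoActivity extends AppCompatActivity {

    private static final int SPEECH_REQUEST = 10;  // arbitrary request code

    // Called when the mic should start listening for a command.
    private void startListening() {
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Speak a command");
        startActivityForResult(intent, SPEECH_REQUEST);
    }

    // Called right after the recognizer finishes; the spoken command arrives as text.
    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == SPEECH_REQUEST && resultCode == RESULT_OK && data != null) {
            ArrayList<String> results =
                    data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
            String command = results.get(0).toLowerCase(Locale.ROOT);
            handleCommand(command);  // command matching, as in Figure 5.5
        }
    }

    private void handleCommand(String command) { /* see the sketch after Figure 5.5 */ }
}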


After recognizing the user's speech, conditions are used to match the voice command from the user with the application's defined commands. These conditions are used to open whichever activity the user wants. For example, if the user says "call", the command is converted into text, matched against the application's defined commands, and the Call activity opens.

Figure 5.5 command matching
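Continuing the sketch above, command matching can be done with simple string conditions. The activity class names (CallActivity, MessageActivity, AlarmActivity, EventActivity) and the tts field are illustrative assumptions rather than the project's real identifiers.

// Sketch: matching the recognized text against the application's defined commands
// and opening the corresponding activity (hypothetical activity names).
private void handleCommand(String command) {
    if (command.contains("call")) {
        startActivity(new Intent(this, CallActivity.class));
    } else if (command.contains("message")) {
        startActivity(new Intent(this, MessageActivity.class));
    } else if (command.contains("alarm")) {
        startActivity(new Intent(this, AlarmActivity.class));
    } else if (command.contains("calendar") || command.contains("event")) {
        startActivity(new Intent(this, EventActivity.class));
    } else {
        // unknown command: tell the user by voice (tts initialized as in Section 5.1.4)
        tts.speak("Invalid command", TextToSpeech.QUEUE_FLUSH, null, "invalid");
    }
}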


5.1.4 Consuming text to speech

Text to speech uses the OnInitListener interface, whose onInit callback is invoked when the engine is ready; the engine is released again in onDestroy. Only one of these methods is shown in the figure. The text-to-speech method converts text into voice, and we have used it to guide the user while performing the different activities.

Figure 5.6 Text to Speech
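A minimal sketch of this pattern is shown below; it assumes the standard android.speech.tts.TextToSpeech API and is not the project's exact listing.

// Sketch: initializing the text-to-speech engine and speaking a guidance prompt.
import android.os.Bundle;
import android.speech.tts.TextToSpeech;
import androidx.appcompat.app.AppCompatActivity;
import java.util.Locale;

public class TtsDemoActivity extends AppCompatActivity implements TextToSpeech.OnInitListener {

    private TextToSpeech tts;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        tts = new TextToSpeech(this, this);  // onInit is called once the engine is ready
    }

    @Override
    public void onInit(int status) {
        if (status == TextToSpeech.SUCCESS) {
            tts.setLanguage(Locale.US);
            tts.speak("Please speak a command", TextToSpeech.QUEUE_FLUSH, null, "prompt");
        }
    }

    @Override
    protected void onDestroy() {
        if (tts != null) {
            tts.stop();
            tts.shutdown();  // release the engine when the activity is destroyed
        }
        super.onDestroy();
    }
}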


5.1.5 Speech Notifications

The speak method is used to provide notifications to the user through voice.

Figure 5.7 shows the speech prompts given to the user while saving a contact.

Figure 5.7 Speech notifications
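For illustration only, queued prompts of this kind could be produced as below, reusing the tts engine from the previous sketch; the prompt wording is assumed, not taken from Figure 5.7.

// Sketch: voice prompts guiding the user step by step while saving a contact.
tts.speak("Please speak the contact name", TextToSpeech.QUEUE_FLUSH, null, "name");
tts.speak("Now speak the phone number", TextToSpeech.QUEUE_ADD, null, "number");
tts.speak("Contact saved", TextToSpeech.QUEUE_ADD, null, "saved");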


5.1.6 Make Call

The voice recognition method is used to provide the facility to make a call through voice. Figure 5.8 shows how a call is made through voice with this code.

Figure 5.8 Make Call
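As a rough sketch of the dialing step (not the code in Figure 5.8), the method below is assumed to live inside the Call activity, uses android.content.Intent and android.net.Uri, and presumes the CALL_PHONE permission is already granted as in Figure 5.2.

// Sketch: dialing a spoken number once it has been recognized as text.
private void makeCall(String spokenNumber) {
    String number = spokenNumber.replaceAll("\\s+", "");  // drop spaces such as "0 3 0 1"
    Intent callIntent = new Intent(Intent.ACTION_CALL);
    callIntent.setData(Uri.parse("tel:" + number));
    startActivity(callIntent);  // places the call immediately
}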

5.1.7 Send Message


The voice recognition method is used to provide the facility to send a message through voice.

Figure 5.9 shows how a message is sent through voice with this code.

Figure 5.9 send message
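One common way to implement this step is with android.telephony.SmsManager, sketched below. It is an assumption about the implementation rather than the listing in Figure 5.9, and the tts field again refers to the engine from Section 5.1.4.

// Sketch: sending the recognized message text to the recognized number.
// Assumes the SEND_SMS permission has already been granted.
private void sendMessage(String phoneNumber, String messageText) {
    SmsManager smsManager = SmsManager.getDefault();
    smsManager.sendTextMessage(phoneNumber, null, messageText, null, null);
    tts.speak("Message sent", TextToSpeech.QUEUE_FLUSH, null, "sent");
}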

5.1.8 Receive Message


In this method the application receives an incoming message and also reads that message aloud to the user. With this feature, the Visually Impaired Audible Helpmate application responds to the user by voice.

Figure 5.10 shows how a received message is handled with this code.

Figure 5.10 Receive Message

After this listener, a new method is called that checks the phone number to see whether it is in the contact list or not.

Figure 5.11 Checking Phone number of the Receiver
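A self-contained sketch of this behaviour is given below: a broadcast receiver picks up the incoming SMS and a helper looks the number up in the contact list. The class is illustrative (it would also have to be registered in the manifest with the RECEIVE_SMS permission), and the hand-off to the text-to-speech engine is only indicated in a comment.

// Sketch: receiving an incoming SMS, preparing it to be read aloud, and checking
// whether the sender's number exists in the contact list.
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.database.Cursor;
import android.net.Uri;
import android.provider.ContactsContract.PhoneLookup;
import android.provider.Telephony;
import android.telephony.SmsMessage;

public class SmsReceiver extends BroadcastReceiver {

    @Override
    public void onReceive(Context context, Intent intent) {
        for (SmsMessage sms : Telephony.Sms.Intents.getMessagesFromIntent(intent)) {
            String sender = sms.getOriginatingAddress();
            String body = sms.getMessageBody();
            String name = lookupContactName(context, sender);  // null if not saved
            String spoken = "New message from " + (name != null ? name : sender) + ". " + body;
            // hand 'spoken' over to the application's text-to-speech engine here
        }
    }

    // Returns the display name for a number, or null if it is not in the contacts.
    private String lookupContactName(Context context, String number) {
        Uri uri = Uri.withAppendedPath(PhoneLookup.CONTENT_FILTER_URI, Uri.encode(number));
        try (Cursor cursor = context.getContentResolver().query(
                uri, new String[]{PhoneLookup.DISPLAY_NAME}, null, null, null)) {
            if (cursor != null && cursor.moveToFirst()) {
                return cursor.getString(0);
            }
        }
        return null;
    }
}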

5.1.9 Set Alarm


The voice recognition method is used to provide the facility to set an alarm through voice.

Figure 5.12 shows how an alarm is set through voice with this code.

Figure 5.12 Set Alarm
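One possible way to implement this step is with the system alarm-clock intent, sketched below. This is an assumption (the project may instead use AlarmManager directly); the hour and minute are taken to be already parsed from the recognized text, and the SET_ALARM permission is assumed to be declared.

// Sketch: setting an alarm from a spoken time using android.provider.AlarmClock.
private void setAlarm(int hour, int minute) {
    Intent intent = new Intent(AlarmClock.ACTION_SET_ALARM);
    intent.putExtra(AlarmClock.EXTRA_HOUR, hour);
    intent.putExtra(AlarmClock.EXTRA_MINUTES, minute);
    intent.putExtra(AlarmClock.EXTRA_MESSAGE, "Alarm set by voice");
    intent.putExtra(AlarmClock.EXTRA_SKIP_UI, true);  // avoid a screen the user cannot see
    startActivity(intent);
}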

5.1.10 Create Event


The voice recognition method is used to provide the facility to create an event through voice, using the calendar.

Figures 5.13 and 5.14 show how an event is created through voice with this code.

First, the numbers, i.e. the date and year, are matched, and then the months are matched against the application's defined commands.

Figure 5.13 Commands for Event


After the commands are matched, the event is created. The event is inserted into the calendar according to the time and date spoken by the user. If any command spoken by the user is incorrect, the application generates the message "Invalid command".

Figure 5.14 Create Event
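To round off the chapter, the sketch below shows one standard way to insert such an event with android.provider.CalendarContract (together with android.content.ContentValues, android.net.Uri, java.util.Calendar and java.util.TimeZone). It is an illustrative assumption rather than the listing in Figure 5.14: it presumes the WRITE_CALENDAR permission, that a calendar with ID 1 exists on the device, and that the tts field is the engine from Section 5.1.4.

// Sketch: inserting an event into the device calendar once the spoken
// date and time have been matched against the defined commands.
private void createEvent(String title, int year, int month, int day, int hour, int minute) {
    Calendar begin = Calendar.getInstance();
    begin.set(year, month, day, hour, minute);

    ContentValues values = new ContentValues();
    values.put(CalendarContract.Events.CALENDAR_ID, 1);
    values.put(CalendarContract.Events.TITLE, title);
    values.put(CalendarContract.Events.DTSTART, begin.getTimeInMillis());
    values.put(CalendarContract.Events.DTEND, begin.getTimeInMillis() + 60 * 60 * 1000);
    values.put(CalendarContract.Events.EVENT_TIMEZONE, TimeZone.getDefault().getID());

    Uri eventUri = getContentResolver().insert(CalendarContract.Events.CONTENT_URI, values);
    if (eventUri == null) {
        tts.speak("Invalid command", TextToSpeech.QUEUE_FLUSH, null, "invalid");
    } else {
        tts.speak("Event created", TextToSpeech.QUEUE_FLUSH, null, "created");
    }
}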
