BY
Sidra Asghar, Faiza Pervaiz
2015-GCWUF-2502, 2015-GCWUF-2567
MASTER OF SCIENCE
IN
COMPUTER SCIENCE
May 2019
DECLARATION
We hereby declare that the contents of the thesis, “Visually Impaired Audible
Helpmate”, are the product of our own research and that no part has been copied
from any published source (except the references and standard mathematical and
genetic models/equations/formulae/protocols, etc.). We further declare that this
work has not been submitted for the award of any other diploma or degree. The
university may take action if the information provided is found inaccurate at any
stage (in case of default, the scholars will be proceeded against as per HEC
plagiarism policy).
Sidra Asghar
2015-GCWUF-2502
Faiza Pervaiz
2015-GCWUF-2567
The Controller of Examinations,
The members of the Supervisory Committee find the thesis submitted by Sidra
Asghar (2015-GCWUF-2502) and Faiza Pervaiz (2015-GCWUF-2567) satisfactory
and recommend that it be processed for evaluation by the External Examiner(s) for
the award of degree.
Introduction
It is estimated that there are about 253 million people with vision impairment in
the world. Visually impaired persons face trouble in their daily lives without
assistance from family or friends, and their growing number has drawn the
attention of many researchers, who are working to understand their characteristics
and needs and to protect them against risks they may face in daily living activities.
With advances in artificial intelligence and mobile computing, mobile devices have
grown in popularity to become one of the most common consumer devices, and
this technology has brought more convenience to blind and visually impaired
people through friendly interfaces. Cell phones are a very important part of
modern life: many of us need to make a call or send a message at any time, from
anywhere. For blind and motion-impaired people this issue is more obvious, but
other people also often face it, e.g., when driving or using a smartphone in bright
sunshine. Even sighted users often find themselves in situations where non-visual
interaction is required.
Visually impaired people need to know about life outside their homes, and they
want access to Internet and mobile services like everyone else. This application
focuses on supporting visually impaired people in their daily mobile needs.
“Visually Impaired person’s Audible Helpmate” is an Android application which
supports voice commands and is developed for visually impaired people. After the
mobile phone is unlocked, the application launches without any voice command.
The system then accepts voice commands and performs the corresponding
operations: it first translates the voice into text, and then produces output in the
form of voice.
It performs basic functions such as calling, messaging, and contact operations
(add, show, delete). Previously, visually impaired people had to operate the phone
keys manually by remembering the positions of the keys; with this application
they operate the phone entirely by voice commands.
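The command loop just described (voice in, matching action, voice out) can be sketched in plain Java. The class and action names below are illustrative, not taken from the actual application; on a device, recognition and speech would go through Android's SpeechRecognizer and TextToSpeech APIs.

```java
import java.util.LinkedHashMap;
import java.util.Locale;
import java.util.Map;

class CommandDispatcher {
    // Maps a spoken keyword to the spoken feedback for the action taken.
    private final Map<String, String> actions = new LinkedHashMap<>();

    CommandDispatcher() {
        actions.put("call", "opening phone section");
        actions.put("message", "opening message section");
        actions.put("contact", "opening contact section");
        actions.put("alarm", "opening alarm section");
    }

    // Returns the feedback for a recognized utterance, or a re-prompt
    // when no known keyword appears in it.
    String dispatch(String recognizedText) {
        String lower = recognizedText.toLowerCase(Locale.ROOT);
        for (Map.Entry<String, String> e : actions.entrySet()) {
            if (lower.contains(e.getKey())) {
                return e.getValue();
            }
        }
        return "command not recognized, please repeat";
    }
}
```

The returned string would then be spoken back to the user, closing the voice-in/voice-out loop.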
1.2 Objectives
The objective of this project is to develop a software application that is user
friendly, simple, fast, and cost effective.
The main objective is to develop a user-friendly application that facilitates
visually impaired persons by letting them control their smartphone with only
their voice.
The app will provide step-by-step instructions with voice feedback to get the user
from a location to a selected destination.
The principal target of this Android application is to give a communication
framework to blind users in an effective way by exploiting the powers of the
platform to the maximum.
Users simply command their phone to perform the activity they want.
The application handles incoming as well as outgoing calls using voice
commands.
The application handles incoming as well as outgoing SMS using voice
commands.
All the features in the application are guided through voice-based commands, so
that it is easier for visually impaired people to use the application.
The user can start the speech recognition process by clicking anywhere on the
application screen.
The user can set or stop an alarm using only voice commands.
The user can set an event by just speaking the date and time, or delete a
previously created event.
The application is also beneficial for busy persons.
A physically disabled person, or a person with little knowledge of smartphones,
can easily access the phone with voice or speech commands.
The application also lets the user perform some emergency options through voice
interaction.
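The alarm objective above implies parsing a spoken time out of free text. A minimal sketch in plain Java follows; the command format ("set alarm at H M") and the class name are assumptions for illustration, not taken from the application. On Android, the parsed result would feed an AlarmClock.ACTION_SET_ALARM intent.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

class AlarmCommandParser {
    // Matches commands like "set alarm at 7 30" or "set alarm at 18 05".
    private static final Pattern TIME =
            Pattern.compile("set alarm at (\\d{1,2})\\s+(\\d{1,2})");

    // Returns {hour, minute}, or null when the utterance does not match
    // or names an impossible time.
    static int[] parse(String utterance) {
        Matcher m = TIME.matcher(utterance.toLowerCase());
        if (!m.find()) return null;
        int hour = Integer.parseInt(m.group(1));
        int minute = Integer.parseInt(m.group(2));
        if (hour > 23 || minute > 59) return null;
        // On a device: fill AlarmClock.EXTRA_HOUR / EXTRA_MINUTES here.
        return new int[] { hour, minute };
    }
}
```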
Chapter 2
Literature Review
Requirements analysis is a software engineering task that bridges the gap between
system-level requirements engineering and software design. It provides the
software designer with a representation of information, interface, function, and
behavior that can be translated into data, architectural, interface, and
component-level designs.
In this respect the project is very useful for visually impaired people. Its goal is
to create a software application which is easy to understand, straightforward,
quick, and practical, and in particular to build an easy-to-use application that
enables blind people to use a cell phone without anybody's assistance.
The other related applications, their authors' reviews, and their work are as
follows.
Just Speak Enabling Universal Voice Control
[Yu Zhong, T.V. Raman, Casey Burkhardt, Fadi Biadsy and Jeffrey P. Bigham]
JustSpeak Enabling Universal Voice Control is an Android application. The
authors designed and implemented voice user interfaces in the form of voice
navigation, commands, and launching of other applications. The important feature
of JustSpeak is the handling of multiple commands in a single utterance: it is
faster to combine multiple commands into one sentence than to repeat the whole
dialog. The interaction to launch applications via voice commands is simple and
fully accessible for non-visual use, and the application has been released on the
Google Play Store for free download. However, it provides limited functionality
and is unable to handle voice commands in a noisy environment, whereas our
application will handle different commands even in noisy environments and give
the user better results.
Visually impaired
[Jawahar, 2017]
Visually Impaired is an Android application based on voice commands, developed
by Jawahar in November 2017. Its features let the user call any contact and check
recently used applications; other features include turning Bluetooth or Wi-Fi on
or off. However, this application is activated by shaking the phone: even if the
mobile screen is off, the application gets activated without the user's permission.
It also lacks basic facilities like contacts, messages, and alarms. Our application
has all the basic facilities a mobile user wants, and the user does not have to
shake the phone to start the application, since shake activation triggers the
application every time and irritates the user.
Blind communicator
[Leounardo Javier]
Blind Communicator is an open-source launcher for blind people using
smartphones and tablets. The launcher has a voice guide that tells the user
everything that is happening on the device (screen off/on, screen rotation,
incoming call, etc.), and it is compatible with TalkBack. Its major feature is that
it uses speech commands. However, it also uses finger gestures (sliding up, down,
left, or right) to perform different functions, which is a big drawback: blind
persons get confused about when to slide left or right. Our application does not
use finger gestures; it uses only voice commands to perform its functions.
Voice Based System in Desktop and Mobile Devices for Blind People
This application specified the use of ASR (automatic speech recognition) and
TTS (text-to-speech) for converting speech to text and vice versa. It included the
development of text Braille systems, screen magnifiers, and screen readers, as
well as a two-part web browser framework used by blind people to access the
Internet, including:
Gmail: the system reads messages in the recipient's mailbox.
RSS: really simple syndication for news.
Song: listen to songs.
Book reader: the system reads a book aloud.
Drive browser: search drives and folders.
The voice mail architecture described helped blind people access email and other
multimedia functions of the operating system easily and efficiently. However, this
system mainly focuses on the mailing system and has few features for visually
impaired persons.
Google Now and Siri
Voice control systems are applications or devices controllable by human voice
commands, and various levels of voice control are supported in mobile and
computing devices. The most popular applications are Google Now and Siri.
Google Now is a Google product that supports dialogue interaction between
Android and a user. It supports various functionalities, mostly controlling the
Google products on the phone (Gmail, Google Drive, YouTube, Google Photos,
Google Books, the Play Store, etc.). Apple introduced Siri as a voice assistant in
iOS and later partnered with Nuance to launch a more intelligent model of Siri.
Google Now and Siri are both constrained and restricted by the way they handle
commands. Every Siri input is divided into words to check for the presence of a
function keyword; an example of a function keyword is Calendar. If a function
keyword is present, the required function is carried out; otherwise a Google
search is initiated, treating the input as a web search query. Hence, it is difficult
for these assistants to integrate third-party applications. Further, if a task consists
of multiple instructions, the user has to issue them separately. For instance, a task
like "Send email and disable internet connection" will not be recognized; it has to
be broken into two separate instructions: "send email" and "disable internet
connection".
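The keyword routing just described (check each word against a set of function keywords, otherwise fall back to a web search) can be sketched in a few lines of plain Java. The keyword set and the return format are invented for illustration:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

class KeywordRouter {
    // Function keywords the assistant understands; anything else
    // falls back to a web search, as described above.
    private static final Set<String> FUNCTION_KEYWORDS =
            new HashSet<>(Arrays.asList("calendar", "email", "call", "alarm"));

    // Routes an utterance: the first function keyword found wins,
    // otherwise the whole input becomes a web search query.
    static String route(String input) {
        for (String word : input.toLowerCase().split("\\s+")) {
            if (FUNCTION_KEYWORDS.contains(word)) {
                return "function:" + word;
            }
        }
        return "websearch:" + input;
    }
}
```

This word-by-word matching also makes the multi-instruction limitation visible: "send email and disable internet" routes to the first keyword only, so compound tasks must be split by the user.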
Voice Assistant for Visually Impaired
[Nevon projects]
Voice Assistant for Visually Impaired in Android is an innovative system that acts
as a voice assistant for visually impaired people. It gives them access to the most
important features of the phone, using custom layouts and speech-to-text. The
system has custom messaging features as well as dialer options, can announce the
current time and location, and can read the contents of a message back to the user
for checking. Since the system does not use the Internet, the data is saved offline
and is phone dependent: if the phone is formatted or lost, the data is lost. The
application also does not allow adding contacts to the phone, as our application
does. Our voice assistant application gives a better solution for visually impaired
people.
Google TalkBack
[Google, 2017]
TalkBack is part of Google's Android Accessibility Service, developed to help
visually impaired people use their cell phones easily. The application reads text
aloud, and the user's movements are carefully evaluated and spoken by the app.
To enable it, go to Settings > Accessibility and enable the TalkBack service.
However, the repetitive work it requires, like continuous tapping and speaking,
speeds up the drain on the device's battery, whereas our app works continuously
with normal battery consumption.
Vlingo
[Nuance Communication, 2012]
Vlingo is a free application designed to recognize the user's voice; it can be
commanded to perform various functions on the phone. The user needs to hold
down the voice key on the phone or other mobile device and wait for the tone
before speaking a command, or tap once to start and once to stop. From there the
app can place a call, send a text, or connect with other applications on the phone.
The speech recognition only works in English and can only connect with a
limited number of apps. For the visually impaired, a number of touch movements
are still needed, which can make it difficult to use. Our application needs only the
user's voice and does not require extra clicks.
Dragon-go
[Nuance Communication, 2013]
Dragon Go! is a voice recognition application whose recognition is more
sophisticated than Vlingo's. It can be slow at following commands and not all
iOS devices are supported, but its command recognition is excellent. The carousel
tabs it uses cannot be customized, yet it functions more efficiently than Vlingo.
Available only in US English, Dragon Go! is very capable when interacting with
dozens of other apps, but it is missing the ability to talk back to the user. This
makes it difficult to use for the visually impaired, who still need to be able to
read the phone screen. For them, Dragon Go! is a viable solution for voice
dialing but not much else; its interface is user friendly, but without the ability to
translate text to voice, it is of little use to someone who is blind.
Android based voice assistant for blind people
[Marcos Brata]
Android Based Voice Assistant for Blind People gives blind people and people
with low vision simple access to an Android device, as well as access to library
resources on devices on which the application has been installed. This facility can
improve library accessibility for the blind and visually challenged community.
The application still has limitations in reaching users, and it is built only on
Android as its platform. Moreover, its dependence on Google libraries causes
difficulties in using the application with local dialects understood only by the
local community.
Voice Over
[Junar Arciete Landicho]
Voice Over is an Android application based on voice commands. Its features let
the user call any contact and check recently used applications; other features
include turning Bluetooth or Wi-Fi on or off. However, this application is
activated by shaking the phone: even if the mobile screen is off, the application
gets activated without the user's permission. It also lacks basic facilities like
contacts, messages, and alarms. Our application has all the basic facilities a
mobile user needs, and the user does not have to shake the phone to start the
application, since shake activation triggers the application every time and
aggravates the user.
Ideal Accessibility Installer
[Ideal Group, Inc. Android Development Team, 2013]
This application is also called the Platform Access Installer. It contains packages
such as the KickBack, TalkBack, and SoundBack (TKS) applications for visually
impaired and blind people, which add audible, vibration, and spoken feedback to
Android devices. However, the installer is compatible with only some devices and
does not work on many other mobiles.
Voice Message Sender
[Mega Phone Apps, 2018]
Voice Message Sender writes messages by voice, converting voice into text. Its
main feature, sending messages by voice, creates a safer environment for users
who are multitasking, such as driving while texting. It can also work without the
Internet and supports multiple languages. Its drawback is that it cannot delete
previous messages.
Voice Call Dialer
[Delux App Zone, Dec 16, 2017]
Voice Call Dialer is an easy, smart contacts dialer that helps with voice dialing
and making calls on an Android smartphone. It can directly send a message, add
a number to contacts, and search contacts. However, its user interface is very
complex and difficult to understand, especially for visually impaired persons.
In any case, these applications are not specifically intended for visually impaired
people. Not all of their features are useful for visually impaired people, and it
takes some time and skill to figure the features out. It is very difficult for visually
impaired people to use the above-mentioned applications, as none of them offers
a voice-assisted approach.
The present personal assistants are incapable of providing adequate assistance to
visually impaired people; there is a need to develop a smart personal assistant
that is equally assistive for them. Small computing devices have largely replaced
the big computers once required for large and smart computations. These mini
computers are portable, support many sensors, connect to the Internet, provide
cloud support, and offer many more services. Implementing extra intelligence in
the system will dramatically reduce the human-computer interaction (HCI)
required.
In this project we investigate the automation and planning capabilities of the
agent and microcontroller, and highlight the requirements on external data and
knowledge sources. We thus present a smart personal assistant that is intelligent
in the sense that it gives audible help. The visually impaired person's audible
helpmate can be used as an interface to the digital world, making the use of this
information timely and productive for the user's particular tasks. The objective is
to design a personal assistant that understands the semantics of a task, can
recognize the task correctly, and can make appropriate decisions without much
human-computer interaction.
Chapter 3
Requirement Specifications
This project is related to the advanced state of voice recognition; as we know, the
usage of cell phones has expanded in recent years. Today there are over 6 billion
cell phones in the world, with an increase of around 6 million users every month.
We have come up with better ways to help visually impaired people use cell
phones on their own without depending on anybody. All the features in the
application are guided through voice-based commands, so that it is simpler for
visually impaired people to use the application, and the user can begin speech
recognition by clicking anywhere on the application screen. To improve visually
impaired people's independence, this Android application is created to assist
them with their daily mobile activities, with numerous features that the visually
impaired user can drive entirely through voice.
3.1 Existing System
The existing systems consist of many different elements and attributes other than
this project's. They often provide user interfaces that are inconvenient for users,
who cannot tell where to click to perform an action, and some existing projects
have limited features.
There are many Android applications built for the visually challenged, but most
do not address the basic problems they face. The applications, though good in
their approach, do not cater to the basic needs of a blind person. After analysis of
the various applications available for the visually challenged, the following
conclusions can be drawn:
1. Most of the available applications are built for navigation purposes, for
example Walky-Talky, Explorer, and Intersection, but these applications cannot
help the user with basic mobile phone features such as calling and messaging.
2. Some other applications have calling and messaging features, for example
Mobile Accessibility, but these take voice as input and are not very efficient for
the Pakistani English accent. Applications like Vlingo Virtual Assistant and
Speaktoit are based on voice support, which makes it difficult for the application
to understand the accent of many users.
3. Applications like vOICe for Android, which is also meant for the visually
challenged, are universal translators mapping images to sounds.
But these applications are not specifically designed for visually impaired people.
Not all of their features are useful for visually impaired people, and it takes some
time and skill to figure the features out. It is very difficult for the visually
impaired to use the above-mentioned applications, as none of them offers a
voice-assisted approach.
All the above-mentioned reasons are the primary motivation to develop a
dedicated application that helps visually impaired people with their daily mobile
usage needs. The projects and applications mentioned above do not address the
connectivity problem faced by the visually challenged, who are therefore unable
to perform basic mobile operations such as calling and messaging. This provokes
the need for a new application that enables people to use the basic operations of
a mobile phone while keeping in mind the needs of the vocally and visually
challenged. This research is motivated by prior attempts to create accessible
touch-screen user interfaces for blind people. The basic objective of this project is
to overcome the limitations of the applications stated above and to help the
visually challenged user stay connected with the world. To achieve this objective
the following features have been implemented: speed-dial calling, messaging,
location retrieval, battery alerts, and current time and date.
The present personal assistants are incapable of giving sufficient help to visually
impaired people, and there is a need to build a smart personal assistant that is
equally assistive for them. Small computing devices have truly replaced the huge
PCs once required for extensive and smart calculations: these small devices are
portable, support many sensors, connect to the web, provide cloud support, and
offer many more services. Implementing extra intelligence in the system will
significantly reduce the human-computer interaction (HCI) required.
This application, in contrast, provides a very easy interface with all the basic
features. It has one centered button: the user clicks on the center of the screen
and the voice system guides the user through further steps. The main objective of
this project is to develop a user-friendly application that facilitates visually
impaired persons and helps improve their confidence in using smartphones
without depending on anyone.
3.2 Proposed System
“Visually Impaired person’s Audible Helpmate” is an Android application which
supports voice commands, developed for visually impaired people. After the
mobile phone is unlocked, the application launches without any voice command.
The system accepts voice commands and performs the corresponding operations:
it first translates the voice into text, and then produces output in the form of
voice.
This project demonstrates the idea of a messaging and calling system for visually
impaired people, removing environmental barriers for people with a wide range
of disabilities. In this project we present the system designs and use cases of the
application “Visually Impaired Audible Helpmate”, a universal voice-control
assistant on the Android operating system. The application provides
enhancements to all applications running on the mobile system and can benefit a
large number of users with universal eyes-free and hands-free voice control of
their mobile devices.
The purpose of this research project is to create an application that enables the
visually challenged to use some basic features of mobile phones, making their life
a bit simpler. The research is devoted to finding an algorithm that requires less
time for pattern recognition and is efficient; using multiple algorithms can help
improve efficiency. Android-based mobile phones were chosen because Android
is an open-source technology nowadays. This project is dedicated to those
millions of differently-abled people whom the world has wrongly labeled as
disabled, and will hopefully contribute something to society and help make the
lives of millions of visually challenged people easier.
Through this application, dialing any of ten pre-saved phone numbers is
implemented, and the user can also send any of ten pre-saved message templates
to these numbers. To use calling and messaging, the user enters the
calling/messaging module by giving a voice command on the main screen;
subsequently he or she clicks the center of the touch screen, and the application
asks the user to speak a number. The application recognizes this number and asks
the user whether to make a call or send a message. To make a call the user gives
a voice command, and the application asks for confirmation by shaking the
mobile (Call) or pressing the right button (End). To send a message the user says
keywords related to messages. The device can also respond via audio with the
current date and time. Similarly, the user can access the alarm module of the
phone and set an alarm.
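The speed-dial behavior described above can be sketched as a plain-Java lookup from a spoken slot number to a dialable URI. The slot contents below are placeholders, not real saved numbers; on a device the resulting URI would be handed to an Intent(Intent.ACTION_CALL, Uri.parse(uri)) to place the call.

```java
import java.util.HashMap;
import java.util.Map;

class SpeedDial {
    // Ten pre-saved slots; the numbers below are placeholders that a
    // real installation would fill from the user's saved contacts.
    private final Map<Integer, String> slots = new HashMap<>();

    SpeedDial() {
        slots.put(1, "+923001234567");
        slots.put(2, "+923007654321");
        // slots 3..10 are left empty in this sketch
    }

    // Builds the "tel:" URI for a spoken slot number, or returns null
    // when the slot is empty so the app can re-prompt the user.
    String telUriForSlot(int slot) {
        String number = slots.get(slot);
        return (number == null) ? null : "tel:" + number;
    }
}
```

The message-template flow would mirror this: the same slot lookup, with the chosen pre-saved template handed to the SMS sender instead of the dialer.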
This project is related to advanced voice recognition; the usage of mobile phones
has increased greatly in recent years. Today there are over 6 billion mobile
phones in the world, with an increase of around 6 million users every month. We
have devised new ways to facilitate visually impaired persons so that they can
use mobile phones on their own without depending on anyone. All the features in
the application are guided through voice-based commands, so that it is easier for
visually impaired people to use the application, and the user can start the speech
recognition process by clicking anywhere on the application screen. To enhance
visually impaired people's independence, this Android application is developed to
help them with their daily mobile activities; it offers multiple features, all usable
through voice. The functional and non-functional requirements of the application
are described below:
Functional Requirements
3.3.1 Use-Case 1
Priority: High
Pre-conditions: The user clicks the application icon to open the application and is
then instructed by the system.
Typical Course of Action:
2. The user starts the application. (System: shows a splash screen and then the
main screen with all main section icons and names.)
3. The user enters the application and clicks or presses a section icon. (System:
proceeds to the next step.)
TABLE 1
3.3.2 Use-Case 2
Priority: High
Typical Course of Action:
1. This use case begins when the user wants to enter any section of the
application. (System: a screen is displayed with a mic at the center of the phone;
the user clicks it, and the application asks the user to speak to go to any activity.)
TABLE 2
3.3.3 Use-Case 3
Identifier: Running
Priority: High
Typical Course of Action:
3. The user clicks on the Contact section. (System: displays the required section;
the user can add or delete contacts.)
4. The user clicks one of the available sections as provided by the system.
TABLE 3
Non-Functional Requirements
3.3.4 Performance Requirements
This product is developed for a client-server architecture; various clients can be
served simultaneously with different requests.
Actions are performed on user request, each according to the user's demand.
The user can move to the next section by voice commands alone.
Sections can be revisited after the first visit.
The application can operate daily.
Functions are performed by giving accurate voice commands.
Proper backup and recovery procedures are provided.
The application will provide various functions through the phone that help people
do their daily activities and tasks. Blind people and people with low eyesight will
operate their phones purely through voice: the voice commands are received by
the software, processed according to the particular user's task, and the result is
given back to the user as voice. Users will need to rely less on imagination for
tasks they would otherwise do by hand, and the possibility of errors or bugs
while communicating with the phone is much lower.
The key feature of this endeavor is to develop a straightforward application to
support visually impaired individuals.
The application will furnish step-by-step guidance with voice feedback to get the
user from a location to a chosen destination.
The main focus of this Android application is to give a communication structure
to blind users in an effective way by exploiting the powers of the platform to the
maximum.
All the features in the application are guided through voice-based commands, so
that it is simpler for visually impaired individuals to use the application.
4. The user speaks to open sections and sub-sections. (5. System: the menu
screen for the section selected by the user is displayed.)
[Diagram: the user starts the application and the splash screen fades in.]
A screen will be displayed with sections and sub-sections; the section selected by
the user is then displayed in the application.
3. The user speaks by clicking the mic in the center to enter any specific menu or
section. (6. System: a screen with the main sections is displayed.)
4. The user enters the new section as per the click. (7. System: the section
selected by the user is displayed.)
[Use-case diagram: the User interacts with the Phone, Contact, Message, Alarm,
and Event sections.]
Actors: User
Purpose: Phone section will contain call log, make Call, receive call, end
call
3. The user click on mic. 4. The user speak in mic to reach one
of the available sections as provided
by the application.
5.User makes the selection. Different 6. A menu appeared with call log,
types of more sections are available make call, receive call, end call.
for different selection to enter.
[Diagram: Phone Section]

Actor: User
Purpose: The user can find, save and delete contacts.
3. The user clicks the mic and speaks "Contact" to open the Contact section.
4. The user clicks the Contact section as provided by the application.
5. The user is prompted to enter the next part of the application if he wants to move on and learn more about the application.
6. The section displays the options to the user: whether he wants to find a contact, add a contact or delete a contact.

[Diagram: Contact Section, with Add Contact and Delete Contact]
Purpose: In the Message section the user can read, send and delete messages.
5. The user clicks the Message section as provided by the application.
6. The user can send a new message, delete any previous message, or have the application read a message aloud on receiving it.

[Diagram: Message Section, in which the User can Send Message, Delete Message and Receive Message]
Actor: User
Purpose: The user can trigger an alarm, set an alarm and stop an alarm.
3. The user clicks on the mic.
4. The user speaks into the mic to open the Alarm section of the application.
5. The user is prompted to enter the next part of the application if he wants to move on and learn more about the application.
6. The user triggers an alarm, sets up a new alarm and stops the alarm.

[Diagram: Alarm Section, in which the User can Trigger Alarm, Set-Up Alarm and Stop Alarm]
Actor: User
3. The user clicks on the mic and speaks "calendar".
4. The user enters the Events section as provided by the application.
5. The user is prompted to enter the next part of the application if he wants to move on and learn more about the application.
6. The user saves a new event with a song, or deletes any previously created event.

[Diagram: Event Section, in which the User can use the Calendar and Delete Event]
Design
In systems design, the design functions and operations are described in detail, including screen layouts, process diagrams and other documentation. The output of this stage describes the new system as a collection of modules or subsystems. Design elements describe the desired system features in detail, and generally include functional hierarchy diagrams, screen layout diagrams, tables of process diagrams, pseudo-code, and a complete entity-relationship diagram with a full data dictionary. These design elements are intended to describe the system in sufficient detail that skilled developers and engineers can develop and deliver the system with minimal additional design input.
GUI: It is used to interact with the user. The GUI reflects the basic appearance of the application.
TTS Engine: A text-to-speech (TTS) system converts normal language text into speech. Synthesized speech can be created by concatenating pieces of recorded speech stored in a database. The output is given in the form of speech.
Voice Input Manager: It manages the commands given by the user and sends the user's input to the Database Manager.
Database Manager: It compares the user's input, which is in the form of voice, with the database containing the vocabulary of words, and sends a response to the Action Performer.
Action Performer: It takes the response from the Database Manager as input and decides which action should be performed. The action can be in the form of a text message or a call.
1. Text message: Users are able to send an SMS to a specific person in the phonebook by giving a correct command containing the messaging request keywords. The message should be sent to the destination immediately.
2. Calling service: The application should allow the user to make a call to a person in the contacts, or by saying the mobile number of the person the user wants to call. Given a correct command with a calling request for a stored person, the Android phone should successfully dial the number of the person requested.
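As a sketch of how a messaging request keyword could be pulled apart, the snippet below splits a recognized command such as "message 03121234567 hello there" into a recipient and a body. The actual sending would use Android's SmsManager; that part is omitted so the sketch stays self-contained, and all names here are illustrative, not the thesis code.

```java
// Illustrative parser for a spoken messaging command of the form
// "message <number> <text...>". Names are ours, not the app's source.
public class MessageCommand {
    public final String number;
    public final String body;

    private MessageCommand(String number, String body) {
        this.number = number;
        this.body = body;
    }

    // Returns null when the utterance is not a messaging request.
    public static MessageCommand parse(String utterance) {
        String[] parts = utterance.trim().split("\\s+", 3);
        if (parts.length < 3 || !parts[0].equalsIgnoreCase("message")) {
            return null;
        }
        return new MessageCommand(parts[1], parts[2]);
    }

    public static void main(String[] args) {
        MessageCommand cmd = parse("message 03121234567 hello there");
        System.out.println(cmd.number + " -> " + cmd.body); // 03121234567 -> hello there
    }
}
```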
The user gives input in the form of voice, and this voice command is recognized by the application. The command is compared with the database, and the action is then performed as per the command given.
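For the calling service, one concrete step needed before dispatching is normalizing the recognizer's output, which often contains spaces or dashes, into a dialable digit string; the call itself would then be placed with Android's standard ACTION_CALL intent. The helper below is a hypothetical sketch of that normalization only.

```java
// Hypothetical normalizer turning a spoken phone number such as
// "0 3 1 2 - 123 4567" into a dialable digit string. Not thesis code.
public class DialNumber {
    public static String normalize(String spoken) {
        StringBuilder digits = new StringBuilder();
        for (char c : spoken.toCharArray()) {
            // Keep digits; allow a leading '+' for international numbers.
            if (Character.isDigit(c) || (c == '+' && digits.length() == 0)) {
                digits.append(c);
            }
        }
        return digits.toString();
    }

    public static void main(String[] args) {
        System.out.println(normalize("0 3 1 2 - 123 4567")); // 03121234567
    }
}
```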
Figure 4.2
Input is given by the user in the form of voice. Using the microphone, the voice is converted to binary data. The Google Voice API converts this voice data into text form, and the action is then performed according to the user's command after comparing it with the database.
Figure 4.3
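The comparison of recognized text with the vocabulary database can be made tolerant of small recognition errors. The sketch below uses a Levenshtein edit-distance threshold of one; the vocabulary, names and threshold are our assumptions, not the application's code.

```java
// Sketch: match recognized text against a small vocabulary, tolerating
// one-character recognition errors via Levenshtein edit distance.
// Names and the threshold are our assumptions, not the app's.
public class VocabularyMatcher {
    private static final String[] VOCABULARY = {"call", "message", "alarm", "event", "contact"};

    // Classic dynamic-programming edit distance between two strings.
    static int distance(String a, String b) {
        int[][] d = new int[a.length() + 1][b.length() + 1];
        for (int i = 0; i <= a.length(); i++) d[i][0] = i;
        for (int j = 0; j <= b.length(); j++) d[0][j] = j;
        for (int i = 1; i <= a.length(); i++) {
            for (int j = 1; j <= b.length(); j++) {
                int cost = a.charAt(i - 1) == b.charAt(j - 1) ? 0 : 1;
                d[i][j] = Math.min(Math.min(d[i - 1][j] + 1, d[i][j - 1] + 1),
                                   d[i - 1][j - 1] + cost);
            }
        }
        return d[a.length()][b.length()];
    }

    // Returns the first vocabulary word within distance 1, else null.
    public static String match(String word) {
        for (String v : VOCABULARY) {
            if (distance(word.toLowerCase(), v) <= 1) return v;
        }
        return null;
    }

    public static void main(String[] args) {
        System.out.println(match("alarn")); // alarm
    }
}
```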
Process:
As the activity diagram shows, when the user opens this app he/she proceeds as follows.
Physical:
After the successful launch of the application, the mic starts and waits for the user to speak. When the mic receives the data, it converts the data into text format. The text is then matched with the action to be performed, and that action is performed.
Figure 4.5
Module:
A code is a group of characters and/or digits that identifies and describes something in an application. Codes serve many useful purposes. Because codes are often shorter than the data they represent, they save storage space and costs, reduce transmission time, and decrease data entry time. Finally, codes can reduce data input errors in situations where the coded data is easier to remember and enter than the original source data, where only certain valid codes are allowed, and where something within the code itself can provide immediate verification that the code is correct.
Because users must deal with the coded data, the coding methods must be acceptable to them. Application analysts are frequently charged with analyzing and defining coding schemes.
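The last point, a code that can verify itself, can be illustrated with a check digit. The Luhn algorithm below is a well-known general example (used for card numbers); it is offered only as an illustration, not as a scheme used by the VIAH application.

```java
// Illustration of self-verifying codes: the Luhn check-digit algorithm.
// A general example, not a coding scheme from the VIAH application.
public class LuhnCheck {
    // Returns true if the digit string passes the Luhn check.
    public static boolean isValid(String code) {
        int sum = 0;
        boolean doubleIt = false;
        for (int i = code.length() - 1; i >= 0; i--) {
            int d = code.charAt(i) - '0';
            if (doubleIt) {
                d *= 2;
                if (d > 9) d -= 9; // same as summing the two digits
            }
            sum += d;
            doubleIt = !doubleIt;
        }
        return sum % 10 == 0;
    }

    public static void main(String[] args) {
        System.out.println(isValid("79927398713")); // true: classic Luhn test number
        System.out.println(isValid("79927398714")); // false: last digit altered
    }
}
```

A single mistyped digit changes the checksum, so the code itself reveals the error immediately, which is exactly the verification property described above.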
There are some major modules that are used in this application.
Graphical User Interface: It is used to interact with the user. The GUI reflects the basic appearance of the application.
Voice Recognition: For the intelligent voice assistant, recognition is done using the Google server. This process involves the conversion of acoustic speech into a set of words and is performed by a software component. The accuracy of speech recognition systems varies with vocabulary size and confusability, speaking mode (isolated, discontinuous or continuous speech; read or spontaneous speech), and task and language constraints. The system consists of five modules: feature extraction, phone model training, dictionary preparation, grammar estimation, and sentence decoding.
Voice Input Manager: It manages the commands given by the user and sends the user's input to the Database Manager.
Database Manager: It compares the user's input, which is in the form of voice, with the database containing the vocabulary of words, and sends a response to the Action Performer.
Action Performer: It takes the response from the Database Manager as input and decides which action should be performed. The action can be in the form of a text message or a call.
1. Text message: Users can send an SMS to a particular person in the phonebook by giving a correct command containing the messaging request keywords. The message should be sent to the destination immediately.
2. Calling service: The application should allow the user to make a call to a person in the contacts, or by saying the mobile number of the person the user wants to call. Given a correct command with a calling request for a stored person, the Android phone should successfully dial the number of the person requested.
Security:
All of the features of the product are secured; only privileged users are allowed to make changes to the existing data. Critical kinds of entries, such as the return of new items, changing the sections of the system users, or opening new Heads, Categories and Items, are assigned to privileged users. Automatic backup of the database is done in two different ways, incremental and complete backup, and is performed by privileged users. An item that needs to be deleted is first copied into the backup database or table and then removed from the database. For an item that is deleted, modified or restored by a user, its log information is stored, for example the name of the user and the time and date of the operation.
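The delete-with-backup behaviour described above can be sketched in plain Java. In-memory maps stand in for the real database tables, and every name below is illustrative rather than taken from the application.

```java
// Sketch of the described soft delete: copy the record to a backup
// table before removing it, and log who deleted it and when.
// In-memory maps stand in for the real database; names are illustrative.
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class BackupStore {
    final Map<String, String> items = new HashMap<>();
    final Map<String, String> backup = new HashMap<>();
    final List<String> log = new ArrayList<>();

    public boolean delete(String key, String user, String timestamp) {
        String value = items.remove(key);
        if (value == null) return false;   // nothing to delete
        backup.put(key, value);            // copy before removal
        log.add(user + " deleted " + key + " at " + timestamp);
        return true;
    }

    public static void main(String[] args) {
        BackupStore store = new BackupStore();
        store.items.put("contact:42", "Sidra");
        store.delete("contact:42", "admin", "2019-05-01 10:00");
        System.out.println(store.backup.get("contact:42")); // Sidra
        System.out.println(store.log.get(0));
    }
}
```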
VIAH app
Figure 4.6 Class Diagram
The above figure is the class diagram of the Visually Impaired Audible Helpmate application. The class diagram comprises all the significant functionalities in its classes. The major classes present in this application are the User class, the Message class, and the Speech Input class. All of these are extended from the base Activity that is provided by the Android OS by default.
4.5.1 Entities
Call, Message, Contact, Alarm and Events. It is also helpful for a blind person to make a call, save a contact and set an alarm.
All of these items are categorized into various sections and sub-menus, such as:
Alarm
Event
The Message feature is invoked when the user responds with the word "Message" to the speech recognition that runs in the background of the home screen. In the screen below, the user first says "message" on the home screen; the Message activity then opens and asks the user to speak the phone number and the message to be sent.
When the user speaks the number and the message, it is written on the above screen; the application then asks the user to tap the screen, and the message is sent to the receiver.
The application should allow the user to make a call to a person in the contacts, or by saying the mobile number of the person the user wants to call. Given a correct command with a calling request for a stored person, the Android phone should successfully dial the number of the person requested.
The Call feature is invoked when the user responds with the word "call" to the speech recognition that runs in the background of the home screen.
In the screen below, the user first says "call" on the home screen; the Call activity then opens and asks the user to speak the phone number to make the call.
The Alarm feature is invoked when the user responds with the word "Alarm" to the speech recognition that runs in the background of the home screen.
In the screen below, the user first says "Alarm" on the home screen; the Alarm activity then opens and asks the user to speak the time, and the alarm is set by the user's voice command.
The Event feature is invoked when the user responds with the word "Event" to the speech recognition that runs in the background of the home screen.
In the screen below, the user first says "Event" on the home screen; the Event activity then opens and asks the user to speak the date and time, and the event is set by the user's voice command.
System Implementation
A. Linux Kernel
Android relies on the Linux 2.6 kernel. The kernel provides core services: security, memory management, process management, the network stack, and the driver model. It also acts as an abstraction layer between the hardware and the rest of the software in the system.
B. Libraries
Android includes a set of C/C++ libraries. Various components of the Android system use them, and their functions are exposed to developers through the Android application framework. Android's core libraries provide most of the functionality of the Java class libraries. Every Android application runs in its own process with its own instance of the Dalvik virtual machine, which supports multiple virtual machines running efficiently on the same device.
C. Application Framework
D. Applications
Android applications are written in the Java programming language. The Android SDK tools compile the code, along with any data and resource files, into an archive file with an .apk suffix. All the code in a single .apk file is considered to be one application, and this is the file that Android-powered devices use to install the application. The Android platform by default includes a set of core applications, including home, browser, communication services, contacts and other applications. These applications are written in the Java programming language.
Figure 5.2 shows the permissions that are set in the Android manifest file. The application needs permission to call the phone, send messages and create events. It also requires permission to read contacts, read messages and read the phone state.
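Declarations of this kind go in AndroidManifest.xml. The entries below use the standard Android permission names matching the described features; they are a sketch of what Figure 5.2 would contain, and the application's exact set may differ.

```xml
<!-- Standard Android permission names matching the described features;
     the application's actual manifest (Figure 5.2) may differ. -->
<uses-permission android:name="android.permission.CALL_PHONE" />
<uses-permission android:name="android.permission.SEND_SMS" />
<uses-permission android:name="android.permission.READ_SMS" />
<uses-permission android:name="android.permission.READ_CONTACTS" />
<uses-permission android:name="android.permission.READ_PHONE_STATE" />
<uses-permission android:name="android.permission.WRITE_CALENDAR" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
```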
Figure 5.3 shows the speech-to-text method, which is used for fetching data from the user and opening the activity the user wants. This method converts the user's voice into text.
The method below is called right after the speech recognizer, to send back the recognized voice as a string.
Text-to-speech uses an OnInitListener with the two methods onInit and onDestroy; here only one of them is used, as shown in the figure.
The text-to-speech method is used to convert text into voice; we have used this method to guide the user through performing different activities.
The speak method is used to provide notifications to the user through voice.
Figure 5.7 shows the speech commands spoken to the user while saving a contact.
The voice recognition method is used to let the user make a call through voice. Figure 5.8 shows how a call is made through voice with this code.
Figure 5.9 shows how a message is sent through voice with this code.
After this listener, a new method is called to check the receiver's phone number, to determine whether it is in the contact list or not.
First the numbers, i.e. the date and the year, are matched, and then the months are matched against the application's defined commands.
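That matching order, numbers first and then month names, can be sketched as below. The command list and the spoken format ("15 January 2019") are our assumptions for illustration, not the application's actual code.

```java
// Sketch of matching a spoken date such as "15 January 2019":
// the numeric day and year are matched first, then the month name is
// matched against a list of defined commands. Names are illustrative.
import java.util.Arrays;
import java.util.List;
import java.util.Locale;

public class SpokenDate {
    static final List<String> MONTHS = Arrays.asList(
        "january", "february", "march", "april", "may", "june",
        "july", "august", "september", "october", "november", "december");

    // Returns {day, month(1-12), year}, or null when no full match.
    public static int[] parse(String spoken) {
        int day = -1, month = -1, year = -1;
        for (String token : spoken.toLowerCase(Locale.ROOT).split("\\s+")) {
            if (token.matches("\\d{1,2}") && day < 0) {
                day = Integer.parseInt(token);        // numbers matched first
            } else if (token.matches("\\d{4}") && year < 0) {
                year = Integer.parseInt(token);
            } else if (MONTHS.contains(token) && month < 0) {
                month = MONTHS.indexOf(token) + 1;    // then month names
            }
        }
        if (day < 0 || month < 0 || year < 0) return null;
        return new int[]{day, month, year};
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(parse("15 January 2019"))); // [15, 1, 2019]
    }
}
```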