
2020 International Conference on Decision Aid Sciences and Application (DASA)
978-1-7281-9677-0/20/$31.00 ©2020 IEEE | DOI: 10.1109/DASA51403.2020.9317014

A Low-cost Augmented Reality System for Wide Area Indoor Navigation

Vivek Dosaya, Shashwat Varshney, Vijaya Kumar Parameshwarappa, Akshay Beniwal, Shraddha Tak
Department of Information Science and Engineering, Ramaiah Institute of Technology, Bangalore, India
vkdosaya@gmail.com, s.varshney9761@gmail.com, vijaykbp@yahoo.co.in, akshaybeniwal12@gmail.com, shraddhatak810@gmail.com

Abstract — In today's world, there are a lot of outdoor navigation apps for visually challenged people, but there are none that can precisely tell a user's location inside a large structure. Indoor navigation is a complex task for the visually challenged as well as for the general public, especially in large structures like malls, airports, museums, and factories. Present solutions and technologies are neither cost-effective nor simple. Hence, we propose a low-cost model that uses Augmented Reality to place virtual anchors across a structure, so that a person can navigate from one location to another with the help of these anchors. The model does not use technologies like GPS, Machine Learning, or Artificial Intelligence; instead, the anchors placed are pervasive and persistent across the indoor environment for smooth navigation. Once placed, these virtual anchors remain at their location and can be used at any time by any person registered on our app. This model can be extended to the general public in any indoor space and can also be enhanced by gamification for better user interaction and retention. The model can further be extended to collaborate with the Aarogya Setu app, which can help us identify routes that pass through spaces covid-positive patients have passed through, which in turn helps us avoid those routes in real-time navigation.

Keywords: Augmented Reality, GPS, Blind Navigation, Indoor Navigation, Mixed Reality, Unity, Covid-19, Vector Mapping.

I. INTRODUCTION

Wayfinding and indoor navigation are important and essential in modern times. There are multiple scenarios where they are needed the most but are not available, or are inaccurate or slow, as in navigating museums or theme parks, finding an item in a grocery store, or helping a visually challenged person navigate through a large structure. We aim to make navigation accurate and easily accessible at all times and in all circumstances. One of the challenges of indoor navigation is pinpointing the exact location of a user inside a structure, because GPS will show the same location even if the person is on a different floor. Another challenge is using data from the Aarogya Setu app in our routes, as the data it provides refers to the outdoors and is not an exact location inside any indoor structure. Various studies show that Covid-19 is an airborne virus and hence can spread to anyone who inhales the air exhaled by an infected person. Hence, social distancing is of paramount importance in these times.

We chose AR for this application as it ensures that a smartphone is the only requirement to use our application. With almost everyone having a smartphone, the user has to pay no additional amount to acquire any new piece of hardware. All the updates to

© IEEE 2021. This article is free to access and download, along with rights for full text and data mining, re-use and analysis.

Authorized licensed use limited to: IEEE Xplore. Downloaded on May 14, 2021 at 22:27:20 UTC from IEEE Xplore. Restrictions apply.
the applications can be pushed remotely without the need of physically going to a location. AR maps a location on the basis of its surroundings instead of latitude and longitude, hence it can accurately tell us on which floor the user is standing, which is not possible with GPS. However, there are certain limitations to using AR; for example, it is not very effective in the dark. Any indoor location has to be moderately to highly lit in order for it to work.

The Aarogya Setu app collects four categories of data: demographic data (like travel history, name, age, gender), contact data (the proximate distance between individuals), self-assessment data (answers to various medical-related questions in the app) and location data (latitude and longitude of a user's position). This is collectively called response data. In a scenario where a covid-positive patient uses our app for navigation, we can see which virtual anchors were travelled by this person, which in turn can help us avoid using these anchors in future routes for other people so that the spread of the virus is minimised.

The remaining sections of the paper are as follows. Section II discusses related work in the area of Augmented Reality; these papers helped explain the limitations of navigation using augmented reality and how they tried to overcome those limitations. Section III describes the system design of our model and the entire flow of the app with respect to a user. Section IV discusses the implementation of the project, explaining how virtual anchors are placed in the real world and how navigation works with the help of spatial sound emitted from these anchors. Section V shows how the desired results were achieved despite the challenges we faced during the implementation. Concluding remarks are narrated in Section VI, along with how we can extend this project for better navigation in large factories and offices.

II. RELATED WORK

Some of the works relating augmented reality, cloud computing and mobile computing navigation are briefly explained in this section.

In [1], L. Atzori, T. Dessi and V. Popescu describe a system in which the location of the user in the indoor environment is estimated by relying only on the sensors of commercial smartphones. The implementation of mobile phone applications was written in [2] to achieve accurate and precise visualization. In [3], the system uses a camera to determine the user's position by detecting non-interfering fiducial marks in real time. In [4], Sebastian Kasprzak et al. proposed a system for navigation in indoor spaces using augmented reality. In [5], the augmentation consists of a space-time map of a maze overlaid on the real-world maze. Rainer Mautz and Sebastian Tilch [6] observed that by fusing data from images and sensors (such as INS, GNSS or magnetic sensors), the performance of optical positioning systems can be improved. In [7], Xiao A et al. designed a system to locate users in large indoor spaces; the system uses common objects, such as windows, tables and doors, as references to locate users. Giudice et al. [8] surveyed many mobility aids or Electronic Travel Aids for blind people and described four factors important for the implementation of technology aiding blind navigation: sensory translation rules, selection of information, device operation, and form and function. In [9], Vijaya Kumar B.P et al. discuss the scope of tasks in the horticulture business, stock administration offices and other farming partners. In [10], Naresh et al. describe that the primary emphasis of their work is to decide the ideal purpose of testing and the impediments faced while choosing an ideal point for a test.

III. PROPOSED MODEL

The proposed model aims at limiting the use of hardware, GPS, and other computationally intensive technologies such as Deep Learning, Artificial Intelligence, etc. in the field of navigation, and instead uses Augmented Reality to overcome the challenges faced in indoor navigation.
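The Aarogya Setu integration described above amounts to filtering candidate routes against anchors visited by covid-positive users. A minimal sketch of that filter, with all function and data names illustrative rather than taken from the paper:

```python
def flagged_anchors(visit_log, infected_users):
    """Collect every anchor id visited by a user later reported covid-positive."""
    return {anchor
            for user, anchors in visit_log.items()
            if user in infected_users
            for anchor in anchors}

def safe_routes(routes, visit_log, infected_users):
    """Keep only candidate routes that pass through no flagged anchor."""
    avoid = flagged_anchors(visit_log, infected_users)
    return [route for route in routes if not (set(route) & avoid)]

# Example: user "u2" is infected and walked through anchors A2 and A5,
# so any candidate route containing A2 or A5 is dropped.
visits = {"u1": ["A1", "A3"], "u2": ["A2", "A5"]}
routes = [["A1", "A2", "A4"], ["A1", "A3", "A4"]]
print(safe_routes(routes, visits, {"u2"}))  # [['A1', 'A3', 'A4']]
```

In a deployment, the visit log would come from the anchors each registered user passed, keyed against the response data shared by Aarogya Setu.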


A. SYSTEM DESIGN

Fig. 1 illustrates the proposed architecture of the implemented augmented reality model. The architecture is client-side: the mobile app interacts with the Microsoft Azure cloud. There are three components to the presented system: the user, the mobile application and the cloud. For the mobile application we have used Unity to compile a Swift app bundle that makes our application run on an iOS device. For the cloud we have used Microsoft Azure, which acts as our database for storing the virtual anchors. All the virtual anchors are associated with a beeping spatial sound whose frequency and loudness depend on the proximity and direction of the user with respect to the spatial anchor. If the user is walking in the direction of the spatial anchor, the loudness increases, prompting the user to keep walking in the same direction. If the user is not facing the anchor, the frequency of the sound decreases; he can then turn around and find the direction with the maximum frequency, which will be the correct direction.

Figure 2. User interaction and control flow diagram.

Fig. 2 illustrates the flow of the system procedure with the real-time functioning of the AR application. During deployment, the virtual anchors are placed across the structure, ensuring that the anchors are at a sufficient distance from each other and unblocked by any physical object. Placing an anchor involves scanning its surroundings, which creates a vector map that stores key features of the surroundings. Figure 3 illustrates the interaction of each part of the application; the parts complement each other. Once a user opens the application, he is asked to scan his surroundings and enter the destination. The scanning helps us make a vector map with which we can find the closest virtual anchor. Figure 5 illustrates how the anchors look when placed in the real world. The closest anchor then emits a spatial sound: its loudness depends on the proximity and its frequency depends on the direction. A user can then easily navigate between these anchors. After the user passes through an anchor, the next anchor en route to the destination starts emitting the spatial sound. Similarly, after every crossed anchor, the next anchor starts emitting spatial sound, until the user reaches his destination.

User Interface: The user interface consists of a mobile application which, when launched, scans the user's surroundings to find his location within the structure and the nearest anchor point. The user is then prompted for the destination, and all the anchors en route to the user's destination are lit up.
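The beeping behaviour described above maps proximity to loudness and facing direction to pitch. The paper gives no exact formulas, so the sketch below assumes a simple inverse-distance gain and a linear pitch drop as the phone turns away from the anchor; all names and constants are illustrative:

```python
import math

def beep_parameters(user_pos, heading_deg, anchor_pos,
                    max_gain=1.0, base_freq=440.0, max_freq=880.0):
    """Map the user's distance and facing direction to beep loudness and pitch.

    Loudness falls off with distance; frequency is highest when the phone
    points straight at the anchor and drops linearly as the user turns away.
    (Illustrative mapping, assumed here; not specified in the paper.)
    """
    dx = anchor_pos[0] - user_pos[0]
    dy = anchor_pos[1] - user_pos[1]
    distance = math.hypot(dx, dy)
    gain = max_gain / (1.0 + distance)          # closer => louder

    bearing = math.degrees(math.atan2(dy, dx))  # direction to the anchor
    off_axis = abs((bearing - heading_deg + 180) % 360 - 180)  # 0..180 degrees
    freq = max_freq - (max_freq - base_freq) * off_axis / 180.0
    return gain, freq
```

With such a mapping, a user sweeping the phone around hears the highest pitch when pointing straight at the anchor, which is exactly the cue the text relies on for finding the correct direction.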

Figure 3. Data flow for the proposed system.

B. Functional Requirement

First-time users will have to enter credentials to enter the application, whereas existing users are authenticated by our OAuth flow every time they open the app, without having to enter credentials again.
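As described in the system design, a fresh scan is matched against the stored vector maps to locate the nearest anchor. The paper does not specify the matching metric, so this sketch assumes fixed-length feature vectors compared by cosine similarity; it is illustrative only, since the real system matches richer 3-D maps through the cloud service:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length, non-zero feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def nearest_anchor(scan, stored_maps):
    """Return the id of the stored anchor whose vector map best matches the scan.

    stored_maps: {anchor_id: feature_vector}, as saved in the cloud when the
    structure was mapped. (Hypothetical representation for illustration.)
    """
    return max(stored_maps, key=lambda aid: cosine(scan, stored_maps[aid]))
```

Once the best-matching anchor is found, the app can request the route of anchors from that location to the chosen destination.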


Native development kits of iOS and Android, such as ARKit and ARCore, scan the user's location to create a vector of the user's surrounding data and, on completion, ask the user which destination he wants to go to. This data is sent to our API, which forwards it to the Azure cloud to find the destination anchor. Once found, all the anchors from the user's location to the destination appear on the screen. The user is asked to follow the beeping sound of the closest anchor. The sound intensity and frequency vary with the distance of the user from the anchor and the direction in which the user's phone is pointing. With the help of this sound the user crosses the anchor. On crossing every anchor, a success sound is emitted to notify the user.

a. Procedural Requirements for Reaching the Location
Description and Priority: A set of anchor points will be put on the screen of the phone from the user's present location to the destination location. The nearest anchor point will emit a spatial sound.
Stimulus/Response Sequence: There are various responses from the mobile application to keep the user on track. For example, on crossing an anchor successfully the anchor emits a success sound, while on reaching the destination there is a voice message informing the user that he has reached the destination; all the queries are voice based.
Detailed Requirement Sequence:
REQ-1: Active internet connection in the user's phone.
REQ-2: Move the device to scan the surroundings and to place the anchor points for the navigation.
REQ-3: A safe route is formed after taking the data from the Aarogya Setu app so that the user does not come in contact with a route travelled by an infected person.
REQ-4: Send a voice message to the user to follow the beeping sound to reach the destination.
REQ-5: As the person moves in the direction of the anchor, the loudness of the sound increases. To find the direction of the anchor, he has to follow the frequency of the beeping sound.
REQ-6: After reaching the location, the app notifies the user via voice message that he has reached his destination safely and successfully.

b. Hardware Functional Requirements
An Android phone must have the following attributes: RAM of 5466 MB, minimum screen size of 1080*2280, minimum screen density of 440 ppi. An iOS device must have the following attributes: A10 Fusion chipset, a minimum screen size of 750*1334, minimum RAM of 2024 MB.

c. Software Functional Requirements
For an Android phone the software attributes must be: Android 10 and above, ES version 3.2 and above. For an iOS phone, the iOS version should be 13 or above. Currently, almost 40% of Android users have Android 10 and above on their devices, and 82% of iOS users have iOS 13 and above. This share is poised to increase by the next phone renewal cycle, by which time more people will have access to this technology.

Pseudo Code

Procedure 1: For mapping a structure
Input: Live feed from the user's camera.
BEGIN
Step 1: Scan the environment (feature vector map)
Step 2: POST the data // * to the backend database *//
Step 3: On SUCCESS, walk to the next point // * to place the next anchor *//
Step 4: Repeat Steps 2 and 3 until the user reaches the destination.
END

Procedure 2: For navigating the structure
Input: Source S, Destination D, Anchors A, and live feed from the user's camera.
BEGIN
Step 1: Input (present location and D)
Step 2: Scan the environment (feature vector map)
Step 3: GET request // * to fetch the nearest anchor point on the basis of the feature vector map *//
Step 4: Render (all anchors from S to D)
Step 5: Identify the direction with the beep sound
Step 6: Rotate the phone towards the high-frequency beep
Step 7: Repeat Steps 5 and 6 // * until D is reached, with a success sound *//
END
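Procedure 2 can be sketched as a plain loop over the anchors returned by the GET request, with the audio and crossing checks stubbed out as callbacks; all names are illustrative:

```python
def navigate(route_anchors, has_crossed, play_beep, play_success):
    """Guide the user along a computed route of anchors (Procedure 2 sketch).

    route_anchors: ordered anchor ids for the path from source S to
    destination D, as rendered in Step 4.
    has_crossed(anchor): polls whether the user has passed the anchor.
    play_beep / play_success: audio callbacks (stubs in this sketch).
    """
    for anchor in route_anchors:
        play_beep(anchor)             # spatial beep pulls the user toward this anchor
        while not has_crossed(anchor):
            pass                      # Steps 5-6: user rotates and walks toward the beep
        play_success(anchor)          # success sound marks the crossing
    return route_anchors[-1] if route_anchors else None  # destination reached
```

In the real app the polling loop is driven by the AR session's pose updates rather than a busy wait; the loop structure, however, mirrors Steps 5 to 7 of the procedure.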


IV. IMPLEMENTATION PLATFORM

We have implemented this model using Unity. Unity gives us a compiled app bundle which can run on an iOS device. The scanning of the environment is done using the native framework provided by iOS, called ARKit, which helps us create a vector map of the user's surroundings. We have used the Azure cloud for the backend and database. All the anchors are stored in Azure and can be used at any time while the app is in use. Figure 5 shows the results of the working system, where the anchors are loaded along the requested path to guide the user. These anchors are saved in the cloud, persist indefinitely, and are shared with other users.

Figure 4. Proposed Functional Modules
Figure 5. A typical working module of the system

Internally, for placing the first anchor in a structure we scan the surroundings of that anchor, which creates a 3-dimensional feature vector map. This feature vector map is a set of unique values of the surroundings in 3 dimensions, so if we scan that location again we will be able to find that particular anchor again. The next set of anchors do not need any scanning of the surroundings; they are placed in relation to the first anchor, in terms of how far and how high or low they are with respect to the first anchor. Hence all we need to find is the first anchor, and the rest appear without any further processing.

V. RESULTS AND DISCUSSIONS

Figure 5 shows the prototype application in action. The augmented reality app was developed successfully. The initial usability test verified the system as a proof of concept for using augmented reality for indoor navigation; based on the results, the system will be further elaborated. The results clearly showed that the user reached his requested destination guided by the application. Once the application starts, the system issues a voice prompt, which the user answers with the destination; the virtual path is then loaded using the anchors and virtual arrow signs that guide the user. The user has to point his phone camera towards the anchors, and a sound is emitted: as the user moves towards an anchor the frequency of the sound increases, and as he points away the frequency decreases. These anchors are saved in the cloud, persist indefinitely, and are shared with other users. The user merely needs to pass each anchor to retrieve the arrow to the next anchor. Compared to existing techniques, the implemented application stands apart in its ability to constantly improve in performance and simplify navigation. A systematic method was followed


throughout the project, from coming up with the idea, gathering the requirements, designing the model, and consequently implementing it. Testing [14] was done to tackle parts of various kinds. There are a few limitations we faced while implementing this idea: the technique was not effective in poorly lit rooms, and it does not yet deal with notifying a visually impaired user if there is a person or an obstacle in his route.

VI. CONCLUSION AND FUTURE SCOPE

With this system we are bringing a change to the lives of visually impaired people by easing navigation in indoor spaces and also reducing the risk of getting infected by covid-19. This model can be extended to be more user friendly by gamification of the existing flow, which can place animated characters at certain checkpoints en route to a user's destination. We can use object detection algorithms to detect any moving objects which may come in the way while the user is following virtual anchors, and instruct users about them so that there are no collisions. We can also enhance the model for better functionality when stairs are involved in the routes. Individuals spend most of their time in indoor spaces and, as such, the importance of indoor positioning techniques increases.

ACKNOWLEDGMENTS

We would like to thank our beloved principal, MSRIT, for his support and encouragement. We would like to express our sincere thanks to all the teaching and non-teaching faculty of the ISE Department and our dear friends who helped in all ways while working on the project.

REFERENCES

[1] L. Atzori, T. Dessi and V. Popescu, "Indoor Navigation System using Image and Sensor Data Processing on a Smartphone", 2012 13th International Conference on Optimization of Electrical and Electronic Equipment (OPTIM), Brasov, Romania, July 2012.

[2] M. Kessel and M. Werner, "SMARTPOS: Accurate and precise indoor positioning on mobile phones", Proceedings of the 1st International Conference on Mobile Services, Resources and Users, Barcelona, pp. 158-163, 2011.

[3] A. Mulloni, D. Wagner, I. Barakonyi and D. Schmalstieg, "Indoor positioning and navigation with camera phones", IEEE Pervasive Computing, vol. 8, no. 2, pp. 22-31, 2009.

[4] S. Kasprzak, A. Komninos and P. Barrie, "Feature-based indoor navigation using augmented reality", 2013 9th International Conference on Intelligent Environments, 2013.

[5] B. F. Goldiez, "Techniques for assessing and improving performance in navigation and wayfinding using mobile augmented reality", 2004.

[6] S. Tilch and R. Mautz, "CLIPS proceedings", Indoor Positioning and Indoor Navigation (IPIN) 2011 International Conference on, pp. 1-6, 2011.

[7] A. Xiao, R. Chen, D. Li, Y. Chen and D. Wu, "An Indoor Positioning System Based on Static Objects in Large Indoor Scenes by Using Smartphone Cameras", Sensors (Basel), vol. 18, no. 7, p. 2229, 2018.

[8] N. A. Giudice and G. E. Legge, "Blind Navigation and the Role of Technology", in The Engineering Handbook of Smart Technology for Aging, Disability, and Independence, A. Helal, M. Mokhtari and B. Abdulrazak, Eds.

[9] Vijaya Kumar B. P., Mahadeva M. N. K., M. S. Pawan Ranjith, N. D. Nadig and Nikita M. K. P., "Augmentation on Satellite Imagery with Information Integrated Farming", 2019 IEEE International Conference on Electrical, Computer and Communication Technologies (ICECCT), Coimbatore, India, 2019, pp. 1-5, doi: 10.1109/ICECCT.2019.8869021.

[10] E. Naresh, B. P. Vijaya Kumar, M. Niranjanamurthy and B. Nigam, "Challenges and issues in test process management", Journal of Computational and Theoretical Nanoscience, vol. 16, no. 9, pp. 3744-3747, 2019.
