Augmented Reality in iOS Apps: ARKit Development Guide

How ARKit Works

Case Study #1: Augmented Reality in Moving Vehicles

Case Study #2: Scaling Scenes For Augmented Reality Experiences

Case Study #3: Indoor Navigation with Augmented Reality


Using Innovative Technologies for Indoor Positioning
Image Recognition & Machine Vision
Rendering Paths with Augmented Reality: The Importance of Occlusion

Case Study #4: Face Based Augmented Reality

Augmented reality was once considered science fiction, but it has evolved into an
integral part of today's digital experience. Brought to fruition by the power of
iPhone hardware and software technologies, AR has entered the mainstream for many
real-world applications. Thanks to Apple’s software development tools in ARKit,
developing advanced mobile AR solutions has never been easier.

Despite this unprecedented accessibility and power in the hands of developers,
there are still challenges to overcome. Let’s look at how non-standard solutions
can work around technological limitations in real-world projects and deliver the
results our clients’ businesses need.

How ARKit Works


To create an augmented reality experience on an iOS device, three steps are
required: tracking, scene analysis, and rendering. In more detail, AR applications
take input from sensors such as cameras, accelerometers, and gyroscopes. This
data is then processed to determine the motion of the camera in the real world.
Once this is complete, 3D virtual objects can be drawn on top of the camera image
and displayed to the user.
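These three steps can be sketched with a minimal ARKit setup. The class and node choices below are illustrative, not the only way to wire this up:

```swift
import UIKit
import ARKit
import SceneKit

// Minimal ARKit pipeline: the session fuses sensor data (tracking),
// detects surfaces (scene analysis), and ARSCNView draws virtual
// content over the camera image (rendering).
final class ARViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Tracking: fuse camera frames with accelerometer/gyroscope data.
        let configuration = ARWorldTrackingConfiguration()
        // Scene analysis: detect flat horizontal surfaces to anchor content.
        configuration.planeDetection = [.horizontal]
        sceneView.session.run(configuration)
    }

    // Rendering: attach a virtual object once a plane anchor is found.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                           length: 0.1, chamferRadius: 0))
        node.addChildNode(box)
    }
}
```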

The optimal environment for ARKit apps is a well-textured, well-lit area with
flat surfaces for visual odometry and a static scene for motion odometry. When
the environment does not meet these requirements, ARKit provides users with
information on the tracking state. The three possible states are: not available,
normal, and limited.
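An app can react to all three states through the session's delegate. The callback below is the standard ARSessionObserver method; the guidance strings are illustrative:

```swift
import ARKit

// ARSessionObserver callback: respond to ARKit's three tracking states.
func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
    switch camera.trackingState {
    case .notAvailable:
        print("Tracking not available — AR content cannot be placed yet.")
    case .limited(let reason):
        switch reason {
        case .insufficientFeatures:
            print("Point the camera at a better-lit, more textured area.")
        case .excessiveMotion:
            print("Move the device more slowly.")
        case .initializing, .relocalizing:
            print("Tracking is starting up — hold steady.")
        @unknown default:
            print("Tracking is limited.")
        }
    case .normal:
        print("Tracking normal — AR content can be placed reliably.")
    }
}
```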

ARKit has advanced far beyond simply displaying objects on flat surfaces.
Vertical surfaces, image recognition, and objects placed with geographic
coordinates are all possible now with the latest features of ARKit 5. However, it’s
important to consider that every virtual object requires a frame of reference.
Without a reference point, virtual objects cannot be displayed in a static location
on a user’s screen.

Standard solutions available to developers make it easy to add AR experiences to
any app. This has resulted in a great number of AR apps on the App Store. That is
a good thing: it increases user exposure to the technology and helps businesses
gain more engagement for their services and products.

However, AR is capable of much more than these standard solutions. To be
competitive, your business must be unique. Developing a brand-new experience
that offers users something new paves the way to innovating and cornering the
market. Let’s walk through some example cases that demonstrate this in action.

Case Study #1: Augmented Reality in Moving Vehicles


When ARKit was first announced, we began developing a product that would let
users in moving vehicles transform the environment outside their windows.
However, ARKit was not initially designed to support experiences in motion, and
we naturally ran into serious roadblocks as visual and motion data didn’t match
up. Without a static scene, AR experiences wouldn’t work properly.



This was a complicated issue to compensate for. Instead of constantly fighting
the motion data, we opted for a simpler solution: with geographic coordinates, we
could render objects more accurately. For example, while looking out the window
of a moving vehicle, AR pins can be displayed showing the names of locations as
you move past them, whether distant cities, buildings, or other landmarks.

Our solution then relied on periodic GPS location updates. We could use routing
to calculate intermediate positions and combine this with compass data to
correctly place these pins in the scene. This was simpler than what ARKit
provided, but nonetheless it worked to our advantage.
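The key computation for placing such a pin is the compass bearing from the user's interpolated position to the landmark, which can then be compared against heading data. A hypothetical helper using the standard great-circle bearing formula:

```swift
import Foundation
import CoreLocation

// Hypothetical helper: compass bearing (degrees, 0 = north, 90 = east)
// from the user's position to a landmark, so an AR pin can be drawn
// in the matching direction on screen.
func bearing(from user: CLLocationCoordinate2D,
             to target: CLLocationCoordinate2D) -> Double {
    let lat1 = user.latitude * .pi / 180
    let lat2 = target.latitude * .pi / 180
    let dLon = (target.longitude - user.longitude) * .pi / 180
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    let degrees = atan2(y, x) * 180 / .pi
    // Normalize atan2's -180..180 range to 0..<360 compass degrees.
    return (degrees + 360).truncatingRemainder(dividingBy: 360)
}
```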

Vibration was another obstacle we encountered in this project. As the image
shook, so did the virtual content. To correct this, we developed an image
stabilization algorithm: taking the position calculations from the previous task
and adding a data-smoothing phase based on a moving average improved the
stability of the system. However, this approach has a major flaw: averaging
angles naively can produce a completely different direction.



The average of 90 degrees and 92 degrees is 91, so the difference in direction is
marginal. However, the naive average of 360 degrees and 2 degrees is 181: a very
different result that can throw off the rendering of virtual objects in ARKit.
Smoothing also introduced some lag whenever the user moved their device quickly.
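The standard fix for this wraparound problem is a circular mean: convert each heading to a unit vector, average the vectors, and take the angle of the result. A minimal sketch:

```swift
import Foundation

// Circular mean of compass headings. Averaging unit vectors instead of
// raw degrees avoids the 360°/0° wraparound: the naive mean of 360° and
// 2° is 181°, while the circular mean is (correctly) about 1°.
func circularMean(_ anglesInDegrees: [Double]) -> Double {
    let radians = anglesInDegrees.map { $0 * .pi / 180 }
    let x = radians.map(cos).reduce(0, +)
    let y = radians.map(sin).reduce(0, +)
    let mean = atan2(y, x) * 180 / .pi
    return (mean + 360).truncatingRemainder(dividingBy: 360)
}
```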

The solution required conditional adaptation. We interpreted small vibrations as
shaking, while large movements meant the camera had been moved deliberately by
the user. By adjusting the smoothing of the data with this in mind, we were able
to improve the experience of displaying objects outside the window of a moving
vehicle.
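The idea can be sketched as an exponential filter whose smoothing factor depends on the size of the change. The threshold and factors below are illustrative values, not the ones used in the project:

```swift
import Foundation

// Sketch of conditional smoothing: small frame-to-frame heading changes
// are treated as vibration and smoothed heavily; large changes are
// treated as deliberate movement and followed quickly.
struct HeadingFilter {
    var value: Double
    let shakeThreshold = 3.0  // degrees; assumed, tune per device

    mutating func update(with newHeading: Double) {
        let delta = abs(newHeading - value)
        // alpha near 0 → heavy smoothing; alpha near 1 → follow the input.
        let alpha = delta < shakeThreshold ? 0.1 : 0.8
        value += alpha * (newHeading - value)
    }
}
```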

This method could be improved for vehicles closer to the ground with ARKit 5’s
enhanced location anchors. Introduced with ARKit 4, location anchors use Apple
Maps Look Around data to identify landmarks like buildings. This allows for
better visual positioning of the device and enables more accurate placement of
virtual objects with geographic coordinates. By using landmarks as reference
points, virtual objects like text, pins, and other elements can be placed in
supported areas with unprecedented accuracy.

Given good enough network infrastructure with ultra-wideband (UWB) technology, it
may be possible to use this approach to better determine one’s dynamic position
and display AR elements accordingly in dense city areas where GPS coverage is
limited. This would also reduce the need for image processing with VPS (visual
positioning system) solutions.

Case Study #2: Scaling Scenes For Augmented Reality Experiences

To best optimize AR experiences for users, ARKit must use a scale that matches
the real world. This helps immerse users and lets them believe the virtual
objects on their display could actually exist. Realistic scaling is also helpful
for placing objects on surfaces near users.

However, we run into challenges when we want to place virtual elements above
buildings or near surfaces rather than directly on them. For example, you may
want to display a pin above a destination in a navigation app, or some text above
a popular landmark. In the past, this required a complex solution; with ARKit 5,
it’s much easier.

The latest versions of ARKit have location anchors, a feature which was also
helpful in the moving vehicle example. If we wanted to place text or a pin above
a destination, we could simply give the app the geographic coordinates of the
location and where the virtual element should be displayed. When the user’s
camera spots the landmark and compares it to Apple Maps Look Around data,
the virtual object can be displayed on their screen correctly.
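In code, this comes down to creating an ARGeoAnchor at the destination's coordinates while the session runs an ARGeoTrackingConfiguration. A sketch (the coordinate is illustrative, and attaching pin geometry to the resulting anchor is left out):

```swift
import ARKit
import CoreLocation

// Sketch: anchoring a virtual pin to a geographic coordinate.
// Assumes the session is running an ARGeoTrackingConfiguration.
func placeDestinationPin(in session: ARSession) {
    // Geo tracking is only supported in certain regions, so check first.
    ARGeoTrackingConfiguration.checkAvailability { available, error in
        guard available else { return }
        let coordinate = CLLocationCoordinate2D(latitude: 37.3349,
                                                longitude: -122.0090)
        // Without an explicit altitude, ARKit derives ground level itself.
        let anchor = ARGeoAnchor(coordinate: coordinate)
        session.add(anchor: anchor)
    }
}
```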

However, location anchors are not available in all cities and not yet in
non-urban areas, so alternative placement methods such as GPS, UWB, or BLE
beacons may be necessary.

No matter what method is employed for placement, the scale of the virtual
object must always be proportional to the distance from the user. Naturally,
virtual objects that are farther away must be made smaller in the user’s view,
while closer objects are larger.

However, when solving this problem for ourselves, we decided to transform
coordinates by projecting them onto an invisible sphere centered on the user.
This helped when displaying coordinates that were very close to each other on
the map.
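The projection can be sketched as follows: each target keeps its true direction (bearing and elevation) from the user but is placed at a fixed radius, so pins at very different distances render at a uniform, manageable scale. The coordinate convention below assumes ARKit's default frame (+x east, +y up, -z north):

```swift
import Foundation

// Project a direction onto an invisible sphere of fixed radius around
// the user. Pins keep their real bearing/elevation but sit at a uniform
// distance, which keeps distant and nearby targets equally legible.
func pointOnSphere(bearingDegrees: Double,
                   elevationDegrees: Double,
                   radius: Double) -> (x: Double, y: Double, z: Double) {
    let bearing = bearingDegrees * .pi / 180
    let elevation = elevationDegrees * .pi / 180
    let x = radius * cos(elevation) * sin(bearing)   // east
    let y = radius * sin(elevation)                  // up
    let z = -radius * cos(elevation) * cos(bearing)  // -z is north
    return (x, y, z)
}
```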



Case Study #3: Indoor Navigation with Augmented Reality
Getting around has never been easier with the technologies available to us. In
cities, GPS’s broad accessibility has made getting lost outdoors almost
impossible. Augmented reality experiences let users hold up a mobile device and
get on-screen directions in the form of virtual elements drawn over the real
world.

However, indoor navigation with AR is not so simple. With GPS less available, it
is easy to lose one’s way inside a large indoor environment like a shopping
mall, convention center, or airport terminal.

Although major mapping apps focused primarily on outdoor GPS navigation have
begun to create extensive indoor maps of major venues, there is still a major
problem: scene positioning. GPS isn’t accurate enough indoors to reliably
position a user for augmented reality navigation, and without an accurate
position, drawing AR directions over the real world is much more difficult.

Using Innovative Technologies for Indoor Positioning


To get around this problem, we cannot rely on GPS. Other technologies, like
Bluetooth Low Energy (BLE), Wi-Fi Round-Trip Time (RTT), and Ultra-Wideband
(UWB), can give us a far more precise location for augmented reality navigation.
Once a user’s position is found, ARKit and the rendering engine can take care of
the rest.

By embracing an IoT solution with beacons, regardless of the technology chosen,
AR navigation becomes possible. By determining the device’s position from beacon
signals, users can find their way through a building. However, there are
obstacles to success. Developers must ensure that beacons are placed so they can
best serve users, and depending on the technology used, interference can still
be a problem. For example, UWB signals are blocked or heavily attenuated by
walls, people, and even plants.

Trilateration (often loosely called triangulation) is what makes indoor
positioning work: as long as the user is within range of three beacons whose
positions are known, the user’s position can be computed from the measured
distances. This can be challenging, however, since obstacles can impede the
system, and each beacon must be maintained.
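The math behind trilateration is compact: subtracting the three circle equations pairwise yields two linear equations in the unknown position. A 2D sketch (real deployments add noise handling and a third dimension):

```swift
import Foundation

// 2D trilateration: given distances d1..d3 to three beacons at known
// positions, solve the linearized circle equations for (x, y).
func trilaterate(p1: (x: Double, y: Double), d1: Double,
                 p2: (x: Double, y: Double), d2: Double,
                 p3: (x: Double, y: Double), d3: Double)
    -> (x: Double, y: Double)? {
    // Subtracting circle equations pairwise gives:
    //   a1*x + b1*y = c1   and   a2*x + b2*y = c2
    let a1 = 2 * (p2.x - p1.x), b1 = 2 * (p2.y - p1.y)
    let c1 = d1*d1 - d2*d2 + p2.x*p2.x - p1.x*p1.x + p2.y*p2.y - p1.y*p1.y
    let a2 = 2 * (p3.x - p2.x), b2 = 2 * (p3.y - p2.y)
    let c2 = d2*d2 - d3*d3 + p3.x*p3.x - p2.x*p2.x + p3.y*p3.y - p2.y*p2.y
    let det = a1 * b2 - a2 * b1
    guard abs(det) > 1e-9 else { return nil }  // beacons are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
}
```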

Image Recognition & Machine Vision


Beacon technology can be expensive in some conditions. It requires an
investment in infrastructure that may not be desired for a project. To solve this
problem, we created a solution that did not use any kind of beacon technology.



Our solution works similarly to how ARKit location anchors query a database of
Apple Maps Look Around images to find matches. Instead of looking for buildings
as landmarks, we used visual markers: QR codes or any other kind of marking
easily identified by a computer. These markers, which may be placed on walls or
floors, communicate location metadata to the mobile device when scanned.

Once a marker is scanned, the mobile device knows its 3D coordinates. Used this
way, visual markers eliminate the need for expensive equipment and routine
hardware maintenance; they only need to be kept clean and unobstructed so they
remain visible to the camera.
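One way to implement marker scanning is ARKit's built-in image detection. The sketch below assumes the markers are bundled in an asset catalog group (the group name is hypothetical) and that each image name encodes a known indoor position:

```swift
import ARKit

// Configure ARKit to detect printed visual markers. Each reference
// image must declare its real-world physical size in the asset catalog.
func makeMarkerConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()
    if let markers = ARReferenceImage.referenceImages(
        inGroupNamed: "IndoorMarkers", bundle: .main) {  // assumed group name
        configuration.detectionImages = markers
    }
    return configuration
}

// ARSessionDelegate callback: a detected marker's anchor gives its 3D
// pose, pinning the session's coordinate space to a known map location.
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for case let imageAnchor as ARImageAnchor in anchors {
        let name = imageAnchor.referenceImage.name ?? "unknown"
        print("Detected marker \(name) at \(imageAnchor.transform)")
    }
}
```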

Rendering Paths with Augmented Reality: The Importance of Occlusion

In most ARKit applications, rendering the content is the easy part. But indoor
navigation has one important consideration: route occlusion. For example, when a
drawn route goes around a corner, we would expect the path to be occluded by the
wall. When it is not, users can become confused.



Showing the route with an arrow instead of a drawn path is one possible
solution, but it is less convenient. We could also show the route only within a
certain distance of the user so that it does not extend too far around corners.
Both solutions are much easier to implement but may not be the best options.

To take this a step further, we chose to develop a very low-polygon 3D model of
the building. By overlaying this model on top of the real world as seen through
the user’s camera, occlusion can be handled properly. Although this requires
more development time and effort, it allowed for a much more natural and
understandable indoor navigation solution with ARKit. We believe this is the
most cost-effective solution that still provides the highest quality for users.
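In SceneKit, the overlay model can be made an invisible occluder by writing to the depth buffer but not the color buffer, so it hides route geometry behind walls while the camera image stays visible. A sketch of that material setup:

```swift
import SceneKit

// Turn the low-poly building model into an invisible occluder:
// it draws no pixels but still blocks anything rendered behind it.
func makeOccluder(from buildingGeometry: SCNGeometry) -> SCNNode {
    let material = SCNMaterial()
    material.colorBufferWriteMask = []   // write no color...
    material.writesToDepthBuffer = true  // ...but still write depth
    buildingGeometry.materials = [material]
    let node = SCNNode(geometry: buildingGeometry)
    // Render before the route so the depth buffer is populated first.
    node.renderingOrder = -1
    return node
}
```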

Best of all, creating the 3D model of a building is easier than ever with
advancing technology: the LiDAR scanner on the iPhone 12 Pro can be used for
this purpose, allowing entire rooms to be scanned into 3D environments.




Case Study #4: Face Based Augmented Reality
One of the advantages ARKit has over ARCore is hardware compatibility and
quality: iPhone TrueDepth cameras since the iPhone X have delivered a very
consistent, high level of quality for face-based augmented reality. Like other
kinds of AR, this technology works in three stages: tracking, scene analysis,
and rendering.

ARKit’s data processing yields tracking data, a face mesh, and blend shapes.
Tracking data tells us where to display content as the subject’s face moves. The
face mesh captures the geometry of the face. Blend shapes are facial features
expressed as analog values: for example, if the eyes were half open, the
corresponding blend shape would read 50%.
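All three outputs arrive on a single ARFaceAnchor. A sketch of reading them in an ARSessionDelegate callback (the two blend-shape keys shown are just examples of ARKit's coefficient set):

```swift
import ARKit

// ARSessionDelegate callback: read tracking data (transform), face mesh
// (geometry), and blend shapes (0.0–1.0 coefficients) from the anchor.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let faceAnchor as ARFaceAnchor in anchors {
        let mesh = faceAnchor.geometry  // vertices of the tracked face
        // 0.0 = neutral, 1.0 = fully expressed; for eyeBlinkLeft,
        // a value near 0.5 means the left eye is about half closed.
        let blink = faceAnchor.blendShapes[.eyeBlinkLeft]?.doubleValue ?? 0
        let jaw = faceAnchor.blendShapes[.jawOpen]?.doubleValue ?? 0
        print("vertices: \(mesh.vertices.count), blink: \(blink), jawOpen: \(jaw)")
    }
}
```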



Face-based AR is helpful for face masks that are commonly used in apps like
Snapchat, as well as animated virtual faces like Animoji. However, we can take
this a step further. Face-based AR can help users track their weight loss.

It may seem like a stretch, but it is a viable possibility. When a person gains
or loses weight, the width of their face changes. An iPhone TrueDepth camera can
detect these variations, and an app can track them over time, letting users
follow the progress of their weight loss efforts.



Another solution rising in popularity is the virtual fitting room. Face-based AR
lets consumers try on products like sunglasses and makeup without leaving the
house. Face-based AR can also be used to control on-screen elements through
blend shapes such as eye movements. With a little imagination, face-based AR
opens up a great many possibilities for ARKit solutions.



Imagination is really the most important part of developing augmented reality
solutions with ARKit. Although its commercial, functional, and business
applications may not be immediately obvious, its uses in indoor navigation, face
tracking, and the display of information based on geographic coordinates show
that AR has incredible potential. Many useful applications of AR may not even
have been thought of yet; time will tell what the future of AR looks like.

It’s up to you to decide what you want to explore to keep your business
competitive in an innovative market. Augmented reality services by MobiDev are
here to help you find the solutions you need to gain that edge.

