
visionOS: How Spatial UX Works
Gone are the days of flat or 2.5D designs.

Consumer UX has just acquired a new dimension. Literally. And it changes everything.
Before we dive in, there’s something to note: forget screens. The basic unit of a spatial interface is now a scene.

Want to test your product in a scene? Here’s how.
Simulator

When creating apps, you can simulate their behavior in the provided Simulator. The Simulator is included in the new Xcode, Apple’s development environment.

Besides interacting with apps, you can also look around, pan, orbit, and move.

Mixing apps and reality also means being mindful of different surroundings and lighting conditions. To simulate that, you can change the background scene.
Basics
For cross-platform apps, prioritize system colors over custom ones; otherwise, elements might not be displayed properly.

The same applies to semantic (adaptive) fonts.

In this example, the text needs to be optimized for the glass background. Using semantic colors and fonts in your app lets visionOS optimize them automatically.
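In SwiftUI terms, that just means using semantic fonts and system color styles. A minimal sketch (view and label names are illustrative):

```swift
import SwiftUI

// Semantic fonts and system color styles adapt automatically on
// visionOS, including against the default glass background.
struct RecipeCard: View {
    var body: some View {
        VStack(alignment: .leading) {
            Text("Spatial Recipes")
                .font(.title)                 // semantic font, no fixed size
            Text("Adapts to the glass material behind it")
                .font(.body)
                .foregroundStyle(.secondary)  // system color, auto-optimized
        }
        .padding()
    }
}
```

The same views render legibly on iOS and visionOS precisely because nothing here hardcodes a color or point size.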
Materials

“Materials” are the new thing in visionOS. They adjust their contrast based on the lighting conditions and the color of the objects behind them.

There’s no distinction between dark and light mode on visionOS anymore. One less thing to worry about!

Glass is the default texture now, so the surrounding area stays visible. The transparency also blends the UI with reality.

Hover effects help you understand which interactive element is being looked at (the equivalent of a mouse hover). Their range is slightly larger than on other platforms, because looking at objects is less precise than pointing.
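In SwiftUI, the system hover effect is a one-line modifier. A minimal sketch (the button is illustrative):

```swift
import SwiftUI

// The standard hover effect highlights an interactive element
// when the user looks at it -- the spatial analogue of mouse hover.
struct LookTarget: View {
    var body: some View {
        Button("Open") { print("activated") }
            .hoverEffect(.highlight)  // shown on eye focus
    }
}
```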
Inputs and interactions

The input system is brand new. Let’s start with the eyes.

Looking at an element, then pinching and releasing your fingers, equals a click or tap gesture. Looking, pinching, moving, and then releasing equals a pan gesture.
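The mapping above means existing SwiftUI gestures keep working: look + pinch drives a tap, look + pinch + move drives a drag. A sketch (view name is illustrative):

```swift
import SwiftUI

// Standard SwiftUI gestures map onto eye + pinch input on visionOS.
struct DraggableCard: View {
    @State private var offset = CGSize.zero

    var body: some View {
        Text("Card")
            .padding()
            .offset(offset)
            .gesture(TapGesture().onEnded {
                print("tap (pinch + release)")
            })
            .gesture(DragGesture().onChanged { value in
                offset = value.translation  // pan (pinch + move)
            })
    }
}
```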

If you’re close enough to the app (spatially), you can reach out and touch it to interact. You can also use a trackpad.

VoiceOver and Switch Control are also available.

The platform supports a maximum of two simultaneous inputs. For example, an app that requires a four-finger touch for an action needs an equivalent that uses at most two touches on visionOS.
Anatomy of an App

Classic apps are composed of:

Sheets
Alerts
Popovers

visionOS apps have a few differences:

Sheets

Sheets (e.g. settings) push the content back and are placed on top, while the previous content stays visible. They won’t close if you tap away.
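Presenting a sheet uses the same SwiftUI API as on other platforms; only the spatial treatment differs. A sketch (view and state names are illustrative):

```swift
import SwiftUI

// On visionOS, the sheet sits on top while the presenting content
// stays visible, pushed slightly back. It does not dismiss on an
// outside tap.
struct LibraryView: View {
    @State private var showSettings = false

    var body: some View {
        Button("Settings") { showSettings = true }
            .sheet(isPresented: $showSettings) {
                Text("Settings content")
            }
    }
}
```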

Alerts


Popovers

Popovers can move outside of the main window.

Ornaments

You can now place controls or content outside the app window. Ornaments are lifted slightly forward to add depth.

Examples: the tab bar is placed in an ornament, as are Safari’s navigation bar and Freeform’s toolbar.
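An ornament like those toolbars is declared with the `ornament` modifier. A sketch, assuming a simple browser-style view (names are illustrative):

```swift
import SwiftUI

// A toolbar-like ornament anchored below the window, lifted forward
// on its own glass background.
struct BrowserView: View {
    var body: some View {
        Text("Page content")
            .ornament(attachmentAnchor: .scene(.bottom)) {
                HStack {
                    Button("Back") {}
                    Button("Forward") {}
                }
                .padding()
                .glassBackgroundEffect()
            }
    }
}
```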
RealityKit

With RealityKit, you can create elements with depth (3D pixels in this example) that are affected by the surrounding light.
Different kinds of scenes

Window

Windows are mainly for 2D content. They can be resized, and they’re shown alongside other running apps.

Volume

Volumes are mainly for 3D content. Their size in all three dimensions is controlled by the app, and they’re also shown alongside other running apps.
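Both scene types are declared at the app level. A sketch, with placeholder views and an assumed default volume size:

```swift
import SwiftUI

struct MainView: View { var body: some View { Text("Main") } }
struct GlobeView: View { var body: some View { Text("Globe") } }

@main
struct SpatialApp: App {
    var body: some Scene {
        // Window: resizable 2D content
        WindowGroup(id: "main") {
            MainView()
        }

        // Volume: 3D content whose size the app controls
        WindowGroup(id: "globe") {
            GlobeView()
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 0.6, height: 0.6, depth: 0.6, in: .meters)
    }
}
```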

Each scene can be set as shared (displayed alongside other scenes) or immersive (occupying the full space).

Certain additional features, like ARKit hand tracking (not the same as the normal gesture tracking), are only available in an immersive space.

Immersion
There are three kinds of immersion settings:

1. Mixed immersion (the surroundings are fully visible)
2. Progressive immersion (adjustable, with a 180-degree passthrough view)
3. Full immersion (passthrough hidden)

Examples:

A full space within a virtual stage (full immersion).

A full space can still show the existing environment.
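The three immersion settings above can be offered from a single immersive space declaration. A sketch (view and id names are illustrative):

```swift
import SwiftUI

struct StageView: View { var body: some View { Text("Stage") } }

@main
struct ImmersiveApp: App {
    // The currently selected immersion style
    @State private var style: ImmersionStyle = .mixed

    var body: some Scene {
        ImmersiveSpace(id: "stage") {
            StageView()
        }
        // Allow all three styles: mixed, progressive, full
        .immersionStyle(selection: $style, in: .mixed, .progressive, .full)
    }
}
```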
Reality Composer Pro
Another new tool is Reality Composer Pro. While Xcode is mainly for code, Reality Composer Pro allows editing and previewing scenes and their 3D content.

You can add different components using its graphical interface.

Contrary to most 3D apps, the coordinate system here is centered on the user.

Additional information, such as the exact position of your hands, can be requested by the app while running in full immersion.
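Hand positions come from ARKit’s hand-tracking data provider, which only runs in an immersive space and requires user permission. A sketch, as I understand the visionOS ARKit API:

```swift
import ARKit

// Request hand anchors while running in a full (immersive) space.
func trackHands() async throws {
    let session = ARKitSession()
    let handTracking = HandTrackingProvider()

    try await session.run([handTracking])

    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor
        // The transform gives the hand's exact position in space.
        print(anchor.chirality, anchor.originFromAnchorTransform)
    }
}
```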

You can import your own content, and also use the existing, provided content. Some basic knowledge of 3D programs (Blender, C4D) or 3D engines (Unity, Unreal) helps.

Adding and placing objects is very intuitive.

Scenes can also act as reusable objects, like components. You can create a scene for a single object, e.g. a cloud, and reuse it across your main scene. In this example, the cloud scene is used three times across the main scene.
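From code, each Reality Composer Pro scene can be loaded as an entity from the package the tool generates. A sketch, assuming a scene named "Cloud" and the generated `RealityKitContent` package:

```swift
import SwiftUI
import RealityKit
import RealityKitContent  // package generated by Reality Composer Pro

// Load the reusable "Cloud" scene (name is illustrative) and place
// three instances side by side.
struct SkyView: View {
    var body: some View {
        RealityView { content in
            for x in [-0.5, 0.0, 0.5] {
                if let cloud = try? await Entity(
                    named: "Cloud",
                    in: realityKitContentBundle
                ) {
                    cloud.position.x = Float(x)
                    content.add(cloud)
                }
            }
        }
    }
}
```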

Another exciting component of a scene is audio sources. Here’s an overview of the different formats and their properties.

A view of composing a spatial audio source (a bird sound) together with the bird object.
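The same pairing can be done in RealityKit code: attach a spatial audio component to the entity so the sound is emitted from its position. A sketch (the file name and gain value are illustrative):

```swift
import RealityKit

// Attach a spatial audio source to the bird entity, so the chirp
// appears to come from the bird's position in space.
func addBirdSound(to bird: Entity) async throws {
    let resource = try await AudioFileResource(named: "BirdChirp.wav")
    bird.components.set(SpatialAudioComponent(gain: -6))  // gain in dB
    _ = bird.playAudio(resource)
}
```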

You then have the option to preview your scene on a device...

...if you happen to have one lying around.
That’s a wrap!
