
DESIGN FOR VR

To keep in mind: design VR-exclusive titles that can only be played in VR (and not the umpteenth FPS game).
1. LOCOMOTION
Implementing locomotion in VR is one of the most difficult things
to do, and one of the most important to get right.
Poorly designed or improperly implemented locomotion can
easily make even seasoned VR users uncomfortable or sick.
Any acceleration, rotation, or movement not initiated by real-world movement of the user is uncomfortable, because the user’s vision tells them they are moving through space but their body perceives the opposite. The big problem to deal with is the acceleration (and not necessarily the speed).
So far, we haven’t found the “magic bullet” for locomotion. Here are a few hints:

● Design experiences that do not need locomotion (or use transportation mechanisms that don’t expose users to vection, such as elevators, or fading the world to black as if losing consciousness).

● Teleportation allows users to move around a scene without having to deal with vection-inducing acceleration.

● Move the user through the scene at a constant velocity and along a straight line (as in a railroad car that moves within a scene). It is better to allow users to initiate and terminate the movement through some form of input.
● Blinks and snap turns omit visual information that’s likely to
cause vection by simply cutting to black or instantaneously
changing the viewpoint.

● Tunnel vision vignettes the eye buffers so that the field of view is narrowed and peripheral vision is blocked off.

● Move the environment, not the user. This one is tough to implement effectively, but done properly, users can comfortably “grab” the scene and pull it towards them.
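The snap-turn and tunnel-vision hints above can be sketched in a few lines. This is an engine-agnostic Python sketch; the function names and the 30° increment are illustrative assumptions, not a standard API:

```python
def snap_turn(yaw_degrees, direction, increment=30.0):
    """Rotate the player's yaw by a fixed increment with no intermediate
    frames, so the user never sees vection-inducing smooth rotation.
    direction is +1 (turn right) or -1 (turn left)."""
    return (yaw_degrees + direction * increment) % 360.0

def comfort_vignette(speed, max_speed):
    """Narrow the field of view as artificial movement speed rises:
    0.0 = full FOV, 1.0 = periphery fully blocked (tunnel vision)."""
    return max(0.0, min(1.0, speed / max_speed))
```

Coupling the vignette strength to the artificial velocity means peripheral vision is only restricted while vection is actually being induced.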

Most users will acclimate to your VR experience over time, making them less susceptible to discomfort, so you may choose to offer more comfortable experiences early, then introduce more intense ones later in the app.
2. INTERACTIONS
There is commonality among VR controllers; they all have:
● Home/System button
● Menu button
● Trigger
● Grip
But there is no standard for the controls: depending on the game and the platform, the functions of the buttons change. VR design doesn’t have the luxury of different buttons for different actions. Instead, most controls must be gestures within the environment you design.
Touch controllers give you access to hands in VR. When done
properly, virtual hands let you interact with the virtual world
intuitively; after all, you already know how to use your hands.
When the brain sees virtual hands and accepts them as a representation of the physical hands, the bet is won.
Tracked controllers can’t simulate the torque or resistance we
feel when manipulating weighty objects. Interactions that
involve significant resistance, like lifting a heavy rock or pulling a
large lever, don’t feel believable. However, lightweight
interactions, like flicking a light switch, are easily believable.
[Image: examples of good vs. bad affordance]

Another challenge is letting your user know what they can interact with, and how: for example, a big shiny doorknob to denote a door or travel point.
These cues or signals, called “affordances”, should be
self-explanatory.
Sometimes the innate properties of an object cannot imply what it affords. Signifiers make affordances clearer: they reduce the number of possible interpretations and/or make the intended way of using an object more explicit. Avoid text-based signifiers, as the text will have to be translated.
The best way to pick up an object in VR is to grab the object the
way it was designed to be held. When a person tries to pick up
an object that affords gripping in an obvious way, you should
snap the object into their hand at the correct alignment.
Some objects don’t have an obvious handle or grip (e.g. a
soccer ball). You shouldn’t snap or correct the object in this case,
just stick it to the hand at whatever positional offset it was at
when the grip was invoked. We recommend that you avoid
having users pick up objects off the floor.
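The two grab behaviours above (snap to an authored grip versus keep the free offset) can be sketched as follows; positions are simplified to plain 3-D offsets, and the function name is hypothetical:

```python
def attach_to_hand(obj_pos, hand_pos, grip_offset=None):
    """Return the object's offset relative to the hand when grabbed.
    If the object has an authored grip pose (e.g. a pistol handle),
    snap to it; otherwise (e.g. a soccer ball) preserve whatever
    offset the object had when the grip was invoked."""
    if grip_offset is not None:
        return grip_offset  # snap to the designed alignment
    # free grab: keep the current relative offset
    return tuple(o - h for o, h in zip(obj_pos, hand_pos))
```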
Throwing objects reliably with tracked controllers is harder than it looks. For example, a frisbee is thrown using a completely different motion than a paper airplane. A throw helper that automatically corrects the direction and the speed is very welcome.
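One common approach to such a throw helper (an assumption here, not the only technique) is to average the controller’s velocity over the last few frames, so one jittery frame at release doesn’t ruin the throw:

```python
from collections import deque

class ThrowHelper:
    """Average the controller velocity over the last few frames and use
    that as the release velocity, forgiving imprecise release timing."""
    def __init__(self, frames=5):
        self.samples = deque(maxlen=frames)

    def record(self, velocity):
        # velocity is a (vx, vy, vz) sample taken once per frame
        self.samples.append(velocity)

    def release_velocity(self):
        if not self.samples:
            return (0.0, 0.0, 0.0)
        n = len(self.samples)
        # per-axis mean over the retained samples
        return tuple(sum(axis) / n for axis in zip(*self.samples))
```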
Designing for VR means that you’ll need to be fairly forgiving of inaccurate motions. You could use Snap Zones (or “Snap Drop Zones”): a designated area that, when you bring an object near it, illuminates a shape to indicate you can place the object there. Upon releasing the held object, it will “snap” into place.
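A snap zone reduces to a simple distance check on release; this sketch assumes plain 3-D points and an illustrative radius:

```python
def try_snap(drop_pos, zone_pos, radius=0.15):
    """If an object is released within `radius` meters of the snap zone's
    center, snap it to the zone; otherwise leave it where it was dropped."""
    dist = sum((d - z) ** 2 for d, z in zip(drop_pos, zone_pos)) ** 0.5
    return zone_pos if dist <= radius else drop_pos
```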

Feedback communicates the results of a user action. Visually, objects can change shape, color, opacity and even location in space (like a button moving toward an approaching finger to signify that it’s meant to be pushed).
3. VISUAL & UI
People want to be immersed in an extraordinary world, one with
vivid, imaginative landscapes; they want to be taken into the
world of their maker. But that doesn’t mean your content has to
be high-poly with incredible life-like graphics. Keep the style
consistent to help the user feel immersed.
In VR, the player is immersed in 360° environments. You want to keep important things within the primary action panel as much as possible.
The simple act of illuminating an object will draw your user’s attention to it.
We can build massive worlds in VR, but not if it takes forever to teleport around them. Use the teleport distance as a unit to design your level. You should also work with perspective lines: the player should always see the next interesting point; in other words, don’t let the player get lost in your world.
Aside from visual challenges, you want to avoid making your user look back and forth between something close and something far away, so their eyes don’t have to switch focus too often. So be careful with distance: try to keep a task’s pertinent visuals at the same level of depth.

Keep menus in-world to encourage immersion. Menus and HUDs counteract VR immersion; a better alternative is to include all the necessary information within the world itself (you could place any timers or countdowns on the avatar’s watch) = a diegetic interface. This needs creative problem-solving, but users love it.
Flat: The interface is skinned for the 3D space. It’s difficult to
read text or images in perspective. There is no sense of
grounding in the space. It’s a wall.

Curved: Usually better. The content is curved around the user, so the tiles always face the user, making it much easier to read text or images.

Given the current technology constraints of screen-based VR displays, your eyes focus at a distance of 2 m, so placing content between 2–10 m feels most natural and comfortable.
Designing Screen Interfaces for VR (Google I/O '17)

The challenge is to design screens that can be easily read at any distance. Google’s teams solved this problem with “Distance-Independent Millimeters”.
A DMM is 1 millimeter viewed at 1 meter away. Measure your UI
elements in terms of DMMs and you can translate into world
space by multiplying the DMMs by distance.
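That conversion is a one-liner; the function names here are mine, but the arithmetic follows directly from the definition (1 DMM = 1 mm viewed at 1 m):

```python
def dmm_to_world_meters(dmm, distance_m):
    """Convert a size in distance-independent millimeters to world-space
    meters at a given viewing distance: world = dmm / 1000 * distance."""
    return dmm / 1000.0 * distance_m

def world_meters_to_dmm(size_m, distance_m):
    """Inverse: how many DMMs a world-space size represents at a distance."""
    return size_m * 1000.0 / distance_m
```

So the recommended 24-DMM body text is 24 mm tall on a panel 1 m away and 48 mm tall on a panel 2 m away, subtending the same visual angle in both cases.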
Notice that we have a tendency to look around 6° below the horizon line, which implies that UIs should not be placed dead center but slightly below the horizon line.
Small elements appear weird when curved; keep them flat. Displaying a big UI on a cylinder is weird too; a slighter curve is better.
The Google design team has put together guidelines for font sizes that are readable for most people; for body content, size your text at 24 DMMs.

For ray-based hit targets, use at least 64×64 DMMs with 16 DMMs of padding.
The easy way to use DMMs with Unity UI is to put your canvas inside a ParentUI game object (scale 1); as Unity dimensions are in meters, the Canvas has to be scaled down by a factor of 1000 (scale 1/1000 = 0.001). Now a button with a width of 100 is 100 DMMs. If you move the ParentUI 3× further away, just adjust its scale to 3.
4. AUDIO
Audio is incredibly important for feeling fully immersed in a VR
experience.
Use spatial sounds: if you want your user to look in a specific direction, emanating sound from that area will get their attention.

You can animate the position of the sound. This helps the user to
locate the sound more quickly.
Integrating collision and interaction sounds (audio feedback) into the experience at a very early stage of production is a good idea.
Use audio to tell your story: you can, for example, have a radio in a scene that plays music from the ’60s so that players clearly understand they are in a specific time period.
The goal of 3D spatial audio is to convincingly place sounds in a
three dimensional space so that the user perceives the sounds as
coming from the real physical objects in their VR experience.
REALTIME SOUND OCCLUSION & PROPAGATION REFLECTION

When working on immersive audio environments, audio spatializers are great for creating an immersive and realistic experience. The magic comes from the HRTF (head-related transfer function), but it’s a long story...
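As a toy illustration of what spatialization computes (a real HRTF model is far more sophisticated, filtering each ear’s signal with direction-dependent responses), here is a crude inverse-distance attenuation with left/right panning; all names and constants are illustrative:

```python
import math

def spatial_gains(listener_pos, source_pos, ref_dist=1.0):
    """Crude spatialization: attenuate by inverse distance and pan
    between the ears based on the source's horizontal (x) offset.
    Positions are (x, y, z); returns (left_gain, right_gain)."""
    dx = source_pos[0] - listener_pos[0]
    dz = source_pos[2] - listener_pos[2]
    dist = math.hypot(dx, dz)
    gain = ref_dist / max(dist, ref_dist)    # inverse-distance falloff
    pan = 0.0 if dist == 0.0 else dx / dist  # -1 = hard left, +1 = hard right
    return gain * (1.0 - pan) / 2.0, gain * (1.0 + pan) / 2.0
```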

Have a look: you can use the free Oculus Audio Spatializer Plugin, the free Steam Audio Plugin, or the paid DearVR.
5. TEST, TEST, TEST
Kaizen is an approach to creating continuous improvement.

Concentrate on removing the unfun elements during production; you should be left with the fun. Sure, it seems simple, but often simplicity is all that is required for success.
