
Available online at www.sciencedirect.com

ScienceDirect

Procedia Manufacturing 42 (2020) 2–7

www.elsevier.com/locate/procedia

International Conference on Industry 4.0 and Smart Manufacturing (ISM 2019)

Supporting Teamwork in Industrial Virtual Reality Applications


Josef Wolfartsberger*, Jan Zenisek, Norbert Wild
Center of Excellence for Smart Production
University of Applied Sciences Upper Austria, Campus Steyr, Hagenberg, Wels
* Corresponding author. Tel.: +43-5-0804-33152; E-mail address: josef.wolfartsberger@fh-ooe.at

Abstract

Virtual Reality (VR) systems allow for novel modes of visualization and interaction to support engineering design reviews. However, there are
still research challenges to be addressed until companies can fully benefit from the technology’s potential. Our previous research showed that
the social exclusion of VR users who share the same physical space with colleagues during a design review session has a negative influence on
the communication and cooperation among team members. The work in this paper presents approaches to counteract this issue in a shared VR
space for industry purposes. We describe the implementation of our concepts, based on touch input and visual cues, in an interactive VR
environment for design review. Our evaluation in a laboratory setup reveals that simple visual cues provide effective means to reduce the time
to find certain details in complex VR scenes. We conclude our work with thoughts on future development steps to foster communication
between team members in a diversified VR environment.

© 2020 The Authors. Published by Elsevier B.V.
This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)
Peer-review under responsibility of the scientific committee of the International Conference on Industry 4.0 and Smart Manufacturing.

Keywords: Virtual Reality; design review; engineering; multi-user experience; social exclusion

1. Introduction

Virtual, Mixed and Augmented Reality technologies are evolving at a fast pace in terms of display quality, ergonomics, interaction design and software tools. Virtual Reality (VR) – and the whole spectrum of “XR” technologies – provides huge potential for combining digital, virtual and physical worlds into one cyber-physical system. Especially in the field of engineering design review, VR allows for novel modes of visualization and interaction to examine prototypes in a realistic way starting in the earliest design stages. However, there are still research challenges to be addressed, like the lack of well-established usability guidelines, cybersickness, and social and cognitive aspects. In a previous work [16], the potential of VR for design review was analyzed in a realistic setup. Our findings show that VR isolates users from their team members, which led users to ask for features that make their colleagues (sharing the same room) perceptible in VR.

In this paper, we discuss the challenges that can arise with Virtual Reality applications for engineering design review. We present our ideas to counteract communication problems between team members in settings where only one user is wearing a VR headset and the others are following the scene on an external screen (as depicted in Fig. 1). Two concepts have been prototypically implemented and tested in a laboratory setup. The study reveals that simple visual cues provide effective means to reduce the time to find certain details in complex VR scenes. Thus, we support the communication process in a diversified VR environment.

The following section briefly describes our VR system and its core features. Afterwards, measures against social exclusion are discussed and our ideas to counteract the issue in VR for industry applications are presented. Finally, we make proposals for future work.


2. VR for Design Review

The purpose of VR is to allow a person to experience and manipulate the environment as if it were the real world. The topic of VR experiences hype at regular intervals, which fades again shortly thereafter. With the advent of more powerful graphics hardware and innovative tracking technologies, the topic has been revisited in recent years. According to Gartner's hype cycle [4], VR had already reached the plateau of productivity in 2017.

VR offers great potential that goes beyond just looking at virtual models. The idea of Virtual Prototyping or Virtual Design Review allows users to examine prototypes in a realistic way starting in the earliest design stages. Many companies conduct design reviews to detect errors in their products early on, before the physical product is manufactured. In fact, the reduction of costs and the enhancement of hardware and software quality have led to VR being widely used in the automotive industry [8] (see for example Volkswagen [14]). According to Kovar et al. [6], VR can reduce costs and time during the design of new machines. The key benefit of VR is the ability to test a number of factors without undertaking the time and costs of building the structure, thereby reducing the number of errors present in the completed product.

Fig. 1. A typical VR-supported design review setup: one person is in control of the VR headset, the team members are watching the scene on an external screen. The VR user is isolated from her team members.
Design review in general is a cognitive process in which expert information must be communicated to collaborators for efficient decisions [11]. Communication during the design process plays a substantial role because it exchanges messages and conveys ideas to people with different skills and interests. There has been a lot of research on the topic of VR-supported design review [1,2,3,5,9,10,12]. Analyzing the related work indicates that current VR tools to support design review can have a positive effect on the quality of the review outcomes. Nevertheless, further research on VR-supported design review in real-world industrial settings based on authentic CAD models is needed until companies can fully benefit from the technology's potential.

In our preliminary work [15,16] the potential of VR to support a decision-making process in an industrial setting was analyzed. The goal was to compare the effectiveness of VR-supported team-based design review to conventional approaches with CAD software support (see the test setup in Fig. 1 and 2). Our results indicate that a VR-supported design review allows users to spot more faults in a 3D model than a CAD software-based approach. Nevertheless, a number of VR-related weaknesses came to light. The observations and findings taken from the audio recordings during the evaluation gave some interesting indications of how the technologies influenced communication and cooperation among group members.

Fig. 2. Real design review session – the VR user examines a 3D model; his view is shared on a screen for his colleagues.

In detail, we made the following observations:

Issue 1: The VR user chooses the right view on a specific component by moving her head. This interaction is intuitive, but the image (also for the team members watching the screen) is always in motion. Many participants reported they had difficulty focusing on a particular component. The person in control of the VR headset was repeatedly asked to hold her head still so that the others could concentrate on a detail. In addition, colleagues sharing the same room were not visually perceptible in VR.

Issue 2: Design review is a collaborative process, but VR isolates users from their team members. Users asked for features to collaborate or to point out certain details in a 3D model. In the tested situation, colleagues told the VR user where to look, which caused misunderstandings and a noticeable loss of time. Spoken descriptions are often insufficient to indicate a specific detail of a machinery component.

3. Supporting communication in VR

Based on this feedback, our test system (called “VRSmart”) was augmented with features to support the communication between team members in a VR setup, with the goal of counteracting the social exclusion of the person in control of the VR headset. Our approach takes into account the added complexity of the third dimension, the different nature of interaction as well as different usability challenges [13].

3.1. Freezing the view

To address issue 1 (see above), a feature was added to “freeze” the current view of the VR user for those following the scene on the screen (without a VR headset). This way, users can focus on a particular component without being dependent on the VR user’s movement. In this mode, the view on the scene can be controlled with an Xbox controller, while the VR user still sees her own view, controlled by her head and body movement. To make colleagues perceptible, the current view of non-VR users is represented as virtual glasses in the VR environment, as depicted in Fig. 3. By this means, the VR user knows what her colleagues see at the moment. Details can be discussed from different angles and there is no need for the VR user to hold her head still.

Fig. 3. Representation of non-VR users in the VR environment.
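For illustration, the core logic of the freeze mode can be summarized in a few lines. The following is a minimal, engine-agnostic Python sketch; all names (Pose, GamepadState, ExternalCamera) are illustrative assumptions and not taken from the actual VRSmart implementation, which presumably runs inside a game engine.

```python
# Minimal sketch of the "freeze" mode, assuming a per-frame update loop.
from dataclasses import dataclass, field


@dataclass
class Pose:
    position: tuple = (0.0, 0.0, 0.0)   # x, y, z in meters
    yaw_pitch: tuple = (0.0, 0.0)       # orientation in degrees


@dataclass
class GamepadState:
    left_x: float = 0.0    # strafe
    left_y: float = 0.0    # forward/backward
    right_x: float = 0.0   # yaw
    right_y: float = 0.0   # pitch


@dataclass
class ExternalCamera:
    """Camera that renders the external screen for the non-VR colleagues."""
    pose: Pose = field(default_factory=Pose)
    frozen: bool = False   # True once the "freeze" button has been pressed

    def update(self, hmd_pose: Pose, pad: GamepadState, dt: float) -> None:
        if not self.frozen:
            # Default: the external screen simply mirrors the VR user's head.
            self.pose = hmd_pose
            return
        # Frozen: the non-VR users steer the view with the gamepad,
        # while the VR user keeps her own head-tracked view untouched.
        x, y, z = self.pose.position
        yaw, pitch = self.pose.yaw_pitch
        move, turn = 1.5, 60.0          # m/s and deg/s (tuning assumptions)
        x += pad.left_x * move * dt
        z += pad.left_y * move * dt
        yaw += pad.right_x * turn * dt
        pitch += pad.right_y * turn * dt
        self.pose = Pose((x, y, z), (yaw, pitch))


def glasses_avatar_pose(cam: ExternalCamera) -> Pose:
    """Pose of the 'virtual glasses' shown inside VR (cf. Fig. 3), telling the
    VR user what her colleagues are currently looking at."""
    return cam.pose
```

In such a setup, update() would be called every frame with the latest head pose and controller state; toggling frozen corresponds to pressing the freeze button, and glasses_avatar_pose() would drive the virtual glasses shown in Fig. 3.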

3.2. Visual Cues

Issue 2 is a more complex task, since it requires novel ways of interaction between VR and non-VR users sharing the same physical space. There has been research on this topic. For example, Lankes et al. [7] designed a collaborative VR system where users can intervene in the virtual world from outside via tablets in the same tracking environment. By this means, the authors try to counteract the feeling of isolation. Following this idea, the standard screen was replaced with a touch screen. Non-VR users can pinpoint details in the virtual environment on the touchscreen. These visual cues appear as bright flares in VR, as depicted in Fig. 4 and 5. Details in a 3D model can be intuitively highlighted. The VR user gets cues where to look and is no longer dependent on verbal descriptions.

Fig. 4. External (non-VR) user draws a visual cue on a touch screen.

Fig. 5. VR view of the same scene – the VR user sees the visual cue drawn by an external user.
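Conceptually, a touch on the external screen is turned into a ray from the external camera through the touched pixel, and the first hit point is where the flare is rendered. The sketch below spells out this idea for a simple pinhole-camera model and spherical targets; in practice an engine's built-in screen-point-to-ray and raycast calls would be used, and all names here are assumptions rather than the actual VRSmart code.

```python
import math
from dataclasses import dataclass


@dataclass(frozen=True)
class Vec3:
    x: float
    y: float
    z: float

    def add(self, o): return Vec3(self.x + o.x, self.y + o.y, self.z + o.z)
    def sub(self, o): return Vec3(self.x - o.x, self.y - o.y, self.z - o.z)
    def scale(self, s): return Vec3(self.x * s, self.y * s, self.z * s)
    def dot(self, o): return self.x * o.x + self.y * o.y + self.z * o.z
    def normalized(self): return self.scale(1.0 / math.sqrt(self.dot(self)))


def touch_to_ray(cam_pos, forward, right, up, u, v, fov_y_deg=60.0, aspect=16/9):
    """Ray from the external camera through normalized touch coords (u, v) in [0, 1]."""
    half_h = math.tan(math.radians(fov_y_deg) / 2.0)
    half_w = half_h * aspect
    x = (2.0 * u - 1.0) * half_w      # left/right on the image plane
    y = (1.0 - 2.0 * v) * half_h      # v grows downwards on the touch screen
    direction = forward.add(right.scale(x)).add(up.scale(y)).normalized()
    return cam_pos, direction


def flare_position(origin, direction, spheres):
    """First intersection with a list of (center, radius) targets, or None.
    This is where the bright flare / laser dot would be rendered in VR."""
    best_t = None
    for center, radius in spheres:
        oc = origin.sub(center)
        b = 2.0 * oc.dot(direction)
        c = oc.dot(oc) - radius * radius
        disc = b * b - 4.0 * c
        if disc < 0.0:
            continue
        t = (-b - math.sqrt(disc)) / 2.0
        if t > 0.0 and (best_t is None or t < best_t):
            best_t = t
    return None if best_t is None else origin.add(direction.scale(best_t))


# Example: touching the screen center hits a ball 3 m straight ahead.
origin, direction = touch_to_ray(Vec3(0, 1.7, 0), Vec3(0, 0, 1), Vec3(1, 0, 0), Vec3(0, 1, 0), 0.5, 0.5)
print(flare_position(origin, direction, [(Vec3(0, 1.7, 3), 0.2)]))  # hit point roughly at (0, 1.7, 2.8)
```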
4. Evaluation

In order to test the effectiveness of both approaches, an evaluation was planned and executed at the University of Applied Sciences Upper Austria. Based on our previous experiences with VR for industry purposes, we defined two research questions:

• Do visual cues provide means to reduce the time to find objects in simple/complex 3D scenes?
• Does the visualization of non-VR users reduce the feeling of isolation for the VR user?

4.1. Methodology

To answer these questions, an abstract use case was designed in which participants were asked to fulfill tasks in ten consecutive levels. The overall goal was to find certain objects in virtual rooms based on verbal descriptions and/or visual cues. Resembling a realistic design review session, a non-VR user watched the scenery on an external screen and gave instructions and hints on where to find the objects. We refrained from presenting real CAD models (like an engine or a power unit) because it would give experts an advantage over those who are not familiar with the particular technical components. Instead, the goal was to look for certain colored balls in rooms with a set of predefined objects. We compared two methods:

• Method 1: Solving the tasks by listening to the non-VR user’s spoken instructions.
• Method 2: Solving the tasks by listening to the non-VR user’s spoken instructions and seeing the visual cues as described in Section 3.2.

After a quick introduction, the users were asked to put on the VR headset (HTC Vive Pro with one controller). In a first step, they were given some time to get used to the setup and the navigation concept (looking around and teleporting through the environment). Afterwards, the users were told to navigate from room to room (levels 1-5) and to touch certain colored balls with the HTC Vive controller. Following the concept of method 1, the non-VR user described where to go and what to do by voice commands (without the usage of visual cues). Each level had a predefined difficulty, starting with an easy task (e.g. touching one red ball out of six differently colored balls), followed by medium and difficult tasks (e.g. touching up to three balls out of a bigger set of balls with the same color). A level overview with exemplary voice instructions is given in Table 1. Fig. 6 and 7 show screenshots from the different levels. As soon as the first five levels had been finished, users were asked to fill out a questionnaire. Afterwards, they completed levels six to ten, which included voice commands supported by visual cues (method 2). The cues were implemented as virtual laser pointers (see Fig. 8), triggered by touching the exact position on the external screen. These levels resembled the first five ones, but the positions and colors of the objects were changed. To prevent a learning effect, the starting method was alternated from user to user. As soon as all levels had been completed, a final questionnaire was filled out to gain insight into the usefulness of our approach.

We measured the time to fulfill the tasks for each level. In addition, we took notes of our observations during the evaluation. The following section discusses the results in detail.

Fig. 6. The level overview for the evaluation. Users teleport from room to room and follow the non-VR user’s instructions.

Fig. 7. (Upper left) easy level, (upper right) medium level, (bottom left and right) difficult levels.

Table 1: Test levels, their corresponding difficulty and exemplary spoken instructions.

Level 1 (easy): “Take the red ball from the back side of the table.”
Level 2 (easy): “Take the blue ball behind the green wall.”
Level 3 (medium): “Look at the yellow cylinders, left row, take the second ball from the top. Then, look at the blue cylinders, right row, take the third ball from the bottom.”
Level 4 (difficult): “In the back row, second row, take the red ball from the third column, then…” (+ instructions for balls 2 and 3)
Level 5 (difficult): “Walk to the back of the room, turn around, walk to the second row starting from your current position, turn your head to the left, take the blue ball at the bottom…” (+ instructions for balls 2 and 3)

Fig. 8. Representation of the non-VR user during the evaluation. By touching the screen, a visual cue (laser pointer) is triggered to pinpoint the object of interest.
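As a side note, each test level can be thought of as a small data record: a difficulty rating plus the set of target balls to be touched. The following sketch merely illustrates such a structure for level 1 of Table 1; the coordinates are made up and this is not the actual implementation.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Ball:
    color: str
    position: tuple            # (x, y, z) in the room, meters (made-up values)


@dataclass
class Level:
    number: int
    difficulty: str            # "easy", "medium" or "difficult", as in Table 1
    targets: tuple             # balls the VR user has to touch with the controller

    def completed(self, touched) -> bool:
        """A level counts as solved once every target ball has been touched."""
        return all(t in touched for t in self.targets)


# Level 1 from Table 1: "Take the red ball from the back side of the table."
level_1 = Level(1, "easy", (Ball("red", (0.0, 0.9, 2.0)),))

assert level_1.completed([Ball("red", (0.0, 0.9, 2.0))])
assert not level_1.completed([Ball("blue", (1.0, 0.9, 2.0))])
```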

4.2. Results from the VR sessions

Altogether, ten people took part in the evaluation (six male, four female) with an average age of 29 years (ranging from 25 to 35 years). Their previous knowledge about VR was estimated as “good” (M: 1.8, SD: 0.92, where 1 means “very good” and 5 means “very bad” on a 5-point Likert scale).

Table 2 shows the average time needed to complete each level. For example, level 1 (easy) was completed within 20 seconds on average without visual guidance. This time was reduced to 15 seconds with the support of visual cues. The data shows that nearly every level was completed quicker with the support of visual guides. With increasing difficulty, this effect intensified (see levels 4/5 and 9/10). Altogether, visual cues speeded up the process of finding the right objects by 81%. When we only look at the difficult levels, this value even increases to 114%.

Table 2: Results of the test sessions for levels 1-10. The bars indicate the average time it took the participants to complete the level with visual guides (orange) and without visual guides (blue).

Observations during the evaluation also showed that users quickly accepted this form of support as a very helpful and intuitive tool for cooperating with non-VR users. Especially in levels 5 and 10 (the most difficult ones, see Fig. 7, bottom right), users had problems following and interpreting the spoken instructions correctly, because there were hardly any reference points to clearly describe the position of the searched objects. Here, the visual cues had the biggest impact.
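The aggregate figures can be read as relative time savings. Assuming that the speed-up per level is defined as (time without cues - time with cues) / time with cues, figures of this kind would be computed as follows; only the 20 s / 15 s values for level 1 are taken from the text, the remaining per-level averages are those shown in Table 2.

```python
def speedup(t_without: float, t_with: float) -> float:
    """Relative speed-up of the visual-cue condition for one level,
    assuming speed-up = (t_without - t_with) / t_with."""
    return (t_without - t_with) / t_with


def mean_speedup(times_per_level: dict) -> float:
    """Average over {level: (avg. time without cues, avg. time with cues)}."""
    values = [speedup(a, b) for a, b in times_per_level.values()]
    return sum(values) / len(values)


# Level 1 example from the text: 20 s without cues vs. 15 s with cues.
print(f"{speedup(20.0, 15.0):.0%}")   # -> 33%
```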
4.3. Results from the questionnaires

The questionnaires were filled out after the first part of the VR session (before the method described in Section 4.1 was changed) and at the end of the evaluation. Besides demographic data, we wanted to know how the users rate the usefulness of the features and whether there was any effect on counteracting social exclusion.

The spoken instructions were rated as mostly understandable (M: 2.1, SD: 0.88). In combination with visual cues, all participants rated the level of helpfulness as “very high” (M: 1.0). Most of them felt comfortable while solving the tasks (M: 1.7, SD: 0.82), no matter whether they saw a visual representation of the non-VR user in the virtual environment or not. We asked them if they felt alone while interacting in VR. Those who only heard voice instructions in the first step did not feel isolated (M: 4.0, SD: 1.15, where 1 means “very isolated” and 5 means “not isolated at all”). Seeing a visual representation of the non-VR user in VR resulted in a higher value (M: 4.7, SD: 0.48). Surprisingly, social exclusion was no big issue for either group, maybe because the time spent in VR was comparatively short (with an average time of 20 minutes per session) and most of the participants were already familiar with VR technology.

In general, both groups wished for more ways to interact with people outside of the VR environment, like seeing realistic representations of facial expressions and gestures. The latter comment provides an interesting direction for future work.
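For clarity, the M and SD values reported above are simply the mean and standard deviation of the individual 5-point Likert ratings. A minimal sketch with made-up ratings (not the study data):

```python
import statistics

# Hypothetical ratings of ten participants on a 5-point Likert item
# (1 = "very isolated", 5 = "not isolated at all"); not the actual study data.
ratings = [4, 5, 3, 4, 5, 4, 3, 4, 5, 4]

m = statistics.mean(ratings)
sd = statistics.stdev(ratings)     # sample standard deviation
print(f"M: {m:.1f}, SD: {sd:.2f}")
```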

5. Conclusion and Future Work

In this paper, we presented our ideas to counteract social exclusion in Virtual Reality applications for engineering design review.

Referring to the research questions in Section 4, we can say that the visual cues presented in this paper provide effective means to reduce the time to find details in complex VR scenes. As we have seen in our previous research, design reviews of complex CAD data in VR often suffer from communication problems between VR users and team members who are watching the VR scene from an external point of view (e.g. a TV screen). Spoken descriptions are often insufficient to indicate a specific detail of a machinery component. With our work, we present an easy-to-implement and effective solution to support design reviews by the use of visual cues triggered by touch input. We help to make VR design review sessions less time-consuming and support the communication between both worlds (VR and the real environment).

In our evaluation, we could not reproduce the issue of social exclusion in VR, since the time spent in VR was too short. Nevertheless, for future work it will be important to take a closer look at means to counteract this issue. The current prototype was tested in a laboratory setup with ten participants. In a next step, an evaluation in a realistic industry setup with a bigger test group is planned. We want to evaluate the influence on the communication and cooperation among group members conducting design reviews in VR using the tools described in this paper.

Acknowledgements

The project "Smart Factory Lab" is funded by the European Fund for Regional Development (EFRE) as part of the program "Investing in Growth and Jobs 2014-2020".

References

[1] Bassanino, M., Wu, K., Yao, J., Khosrowshahi, F., Fernando, T. and Skjærbæk, J. The Impact of Immersive Virtual Reality on Visualisation for a Design Review in Construction. In 14th International Conference Information Visualisation, London, pp. 585-589, 2010.
[2] Bruno, F. and Muzzupappa, M. Product interface design: A participatory approach based on virtual reality. International Journal of Human-Computer Studies 68 (5), 254-269, 2010.
[3] Freeman, I.J., Salmon, J.L. and Coburn, J.Q. CAD integration in virtual reality design reviews for improved engineering model interaction. In ASME 2016 International Mechanical Engineering Congress and Exposition, The American Society of Mechanical Engineers (ASME), pp. 1-10, 2016.
[4] Gartner Inc., Top trends in the Gartner hype cycle for emerging technologies. https://www.gartner.com/smarterwithgartner/top-trends-in-the-gartner-hype-cycle-for-emerging-technologies-2017. Accessed: 2019-10-25.
[5] Gomes de Sá, A. and Zachmann, G. Virtual reality as a tool for verification of assembly and maintenance processes. Computers and Graphics 23 (3), 389-403, 1999.
[6] Kovar, J., Mouralova, K., Ksica, F., Kroupa, J., Andrs, O. and Hadas, Z. Virtual reality in context of Industry 4.0 proposed projects at Brno University of Technology. In International Conference on Mechatronics - Mechatronika (ME), Prague, pp. 1-7, 2016.
[7] Lankes, M., Hagler, J., Kostov, G. and Diephuis, J. Invisible Walls: Co-Presence in a Co-located Augmented Virtuality Installation. In Proceedings of the Annual Symposium on Computer-Human Interaction in Play (CHI PLAY), Amsterdam, Netherlands, ACM, pp. 553-560, 2017.
[8] Lawson, G. D., Salanitri, B. and Waterfield, B. VR processes in the automotive industry. In Human-Computer Interaction: Users and Contexts, Springer International Publishing, pp. 208-217, 2015.
[9] Madathil, K.C. and Greenstein, J.S. An investigation of the efficacy of collaborative virtual reality systems for moderated remote usability testing. Applied Ergonomics 65, pp. 501-514, 2017.
[10] Martini, A., Colizzi, L., Chionna, F., Argese, M., Bellone, P. and Cirillo, V. Palmieri. A novel 3D user interface for the immersive design review. In IEEE Symposium on 3D User Interfaces (3DUI), IEEE, pp. 175-176, 2015.
[11] Noël, F., Nguyen, A., Ba, N. and Sadeghi, S. Qualitative comparison of 2D and 3D perception for information sharing dedicated to manufactured product design. IEEE 3rd International Conference on Cognitive Infocommunications (CogInfoCom), Kosice, pp. 261-265, 2012.
[12] Santos, P., Stork, A., Gierlinger, T., Pagani, A., Paloc, C., Barandar, I., Conti, G., de Amicis, R., Witzel, M., Machui, O., Jimenez, J.M., Araújo, B., Joaquim, J. and Bodammer, G. Improve: An innovative application for collaborative mobile mixed reality design review. International Journal on Interactive Design and Manufacturing (IJIDeM), 1 (2), 115-126, 2007.
[13] Seidel, I., Gärtner, M., Froschauer, J., Berger, H. and Merkl, D. Towards a holistic methodology for engineering 3D Virtual World applications. In International Conference on Information Society, London, UK, IEEE, pp. 224-229, 2010.
[14] Volkswagen's Virtual Engineering Lab using Microsoft's HoloLens to design their future cars. https://mspoweruser.com/volkswagens-virtual-engineering-lab-using-microsofts-hololens-design-future-cars/. Accessed: 2019-10-25.
[15] Wolfartsberger, J., Zenisek, J., Sieve, C. and Silmbroth, M. A virtual reality supported 3D environment for engineering design review. In International Conference on Virtual System & Multimedia (VSMM), Dublin, Ireland, IEEE, pp. 1-8, 2017.
[16] Wolfartsberger, J. Analyzing the potential of Virtual Reality for engineering design review. Automation in Construction, vol. 104, pp. 27-37, 2019.
