Conference Paper · May 2019 · DOI: 10.1109/EIT.2019.8834182


Distance Estimation In Virtual Reality And
Augmented Reality: A Survey
Fatima El Jamiy Ronald Marsh
Computer Science Department Computer Science Department
University Of North Dakota University Of North Dakota
Grand Forks, North Dakota, USA Grand Forks, North Dakota, USA
fatima.eljamiy@ndus.edu ronald.marsh@und.edu

Abstract—The purpose of Virtual Reality (VR) is to provide a consistent simulation of the real world and make interaction between different worlds and objects possible. Perceiving depth and distance correctly in VR is essential, but much previous work has shown an underestimation of distance in Virtual Reality with Head Mounted Displays (HMDs). The present work gives a literature review of the design challenges of such systems and of the distance perception issue in VR, and surveys perceptual research in virtual environments. We review the history of the work and efforts done in visual perception to measure perceived distance, with a particular focus on the distance estimation methods and techniques developed for VR and AR. Depth perception is one of the most important elements in virtual reality, and the perceived depth is influenced by Head Mounted Displays (HMDs), which inevitably decrease the virtual content's depth perception.

Keywords—Virtual Reality, Augmented Reality, Perception, Distance Estimation, Head Mounted Displays

I. INTRODUCTION

Virtual Reality (VR) provides a natural environment for perception research, because different features of the world can be handled and controlled through the VR system. Experiments that cannot be done in the real world in real time, including unprecedented manipulations of a comprehensive visual world, allow studying the elements influencing space and distance perception. Much of the research conducted has confirmed a consistent underestimation of measured distance in VR with HMDs. This distance underestimation is not yet completely understood, and using photorealistic rendering does not improve the accuracy of the perceived distance.

Virtual reality environments suffer from several limitations, chief among them the quality and size of HMD displays. The screen resolution of most HMD displays is restricted, and the field of view is narrow compared with the real field of view. Another limitation that impacts VR systems is the peripheral optic flow, which can be changed when the display's peripheral geometry is deformed by the optics. Additional limitations concern the representation of perceived distance in HMD displays: eye movements are influenced by the HMD optics used to focus at near range, which removes parallax and, as a result, loses depth. In order to benefit from VR systems for the study of perception, it is necessary to take into consideration these limitations, which are challenging to cope with and should be solved. The same conclusion is reached by all research done, confirming a compression of depth perception in VR shaped by the effect of each input cue in the creation of the perceived distance and depth.

Studying the depth perception issue in order to improve it requires various methods to measure the perceived distance. These techniques and their effectiveness are investigated in this paper.

On the other hand, Augmented Reality (AR) is growing fast and promises to enhance human potential in different areas. A design challenge for AR systems is ensuring that virtual objects are accurately positioned in the real world. Correct distance perception, though, is difficult to achieve in AR systems, and some research has shown a great compression of distances in AR systems, as in VR systems. This paper presents the main research done to understand and explain this issue.

The paper is structured as follows: Section 2 gives a background on perception, depth cues, and the main methods used to measure distance in Virtual Reality. In Section 3, we present a review of related work on Virtual Reality depth perception. Section 4 examines related work in Augmented Reality depth perception. We conclude the paper in Section 5.

II. BACKGROUND

A. Perception and Depth cues

The process of creating a faithful description of the real world is a complex system that involves many computing resources. The operation by which the image is formed on the human eye and processed by the brain is not completely understood. Creating a virtual environment system able to fully simulate the human visual system is not reachable yet, because many input parameters called depth cues are involved in the computation of depth and distance and are not yet fully discerned.

Visual perception has been an active research area for decades, with a lot of work done to demystify the depth perception mechanism. Depth perception arises from the incorporation of various visual depth cues. Research done to understand the depth perception process involves studying the depth cues, the visual characteristics, and the mechanism by which these parameters are processed by the human visual system and integrated by the brain to generate the representation of the visual real world.

Loomis [12], Loomis et al. [13], and Loomis et al. [14] ran experiments to explain and understand how this visual function works. They examined the process of mapping between real space and its mental representation, and the relationship between perceived space and motor action.

Depth cues fall into two types, monocular cues and binocular cues:



Monocular cues:
• Use one eye
• Involve occlusions, lighting, motion parallax, accommodation, and perspective

Binocular cues:
• Use two eyes
• Execution of the accommodation-convergence reflex
• Application of disparities to infer the perceived scene
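The binocular disparity cue can be made concrete with the standard stereo triangulation relation used in computer vision. The sketch below illustrates the geometry only; it is not a method from this paper, and the numbers are invented for illustration:

```python
def depth_from_disparity(focal_length_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Triangulate depth from binocular disparity.

    Standard pinhole stereo relation Z = f * B / d, where f is the
    focal length in pixels, B the interocular baseline in meters,
    and d the horizontal disparity in pixels.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_m / disparity_px

# Illustrative numbers: f = 1000 px, human interocular baseline ~0.063 m,
# disparity 21 px -> depth of roughly 3 m.
print(depth_from_disparity(1000.0, 0.063, 21.0))
```

Note that nearer objects produce larger disparities, which is exactly the signal the visual system exploits when inferring the perceived scene.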

B. Virtual Reality

In 1992, Aukstakalnis and Blatner proposed a generic definition of Virtual Reality. They consider VR a means for users to visualize and interact with varied and complex computers and data. They introduced three components to take into account to allow a user to interact in real time in the virtual environment (see Fig. 1): Immersion, Interaction, and Imagination.

Fig. 1. The three Is

Tisseau [38] defines Virtual Reality as a universe of models that proposes the triple mediation of the senses, the action, and the mind (see Fig. 2). The mediation of the senses allows the perception of the real, the mediation of the actions makes it possible to carry out experiments, while the mediation of the mind allows a mental representation of reality. El Jamiy and Marsh [41] explained in their work the basic components of perception and Virtual Reality and how these components communicate and interact with each other to perform and process depth perception.

Fig. 2. Three mediations of the Reality

More recently, Fuchs et al. [39] proposed two definitions of VR, a technical one and a functional one:

• Technical definition of VR: VR is a scientific and technical domain exploiting computers and behavioral interfaces in order to simulate in a virtual world the behavior of 3D entities, which interact in real time with each other and with one or more users in pseudo-natural immersion via sensorimotor channels;

• Functional definition of VR: virtual reality allows one to leave physical reality in order to virtually change time, place, and type of interaction: interaction with an environment simulating reality, or interaction with an imaginary or symbolic world.

Finally, Fuchs et al. [39] proposed a definition that encompasses all the others, based on the ultimate goal of VR: "The ultimate goal of VR is to enable one or more sensory-motor and cognitive activities in an artificial world, created digitally, which can be imaginary, symbolic, or a simulation of certain aspects of the real world".

To summarize the different definitions of VR and AR, Milgram and Kishino [40] proposed a unification of concepts in the form of a continuum linking VR and AR (Fig. 3).

C. Distance Perception Methods

The depth perception process is an unseen cognitive state and therefore not accessible directly. Using the human capacity to compare things and quantities, researchers use egocentric distance comparison to judge depth and distance.

Fig. 3. Virtual Reality Continuum
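The measurement techniques described in this section all reduce to comparing a judged distance against ground truth, commonly summarized as the mean judged-to-actual ratio (1.0 is veridical, below 1.0 indicates compression). A minimal sketch; the function name and the sample trials are hypothetical, chosen only for illustration:

```python
def estimation_ratio(judged_m, actual_m):
    """Mean ratio of judged to actual distance across trials.

    1.0 means veridical judgments; values below 1.0 indicate the
    distance compression reported throughout this survey.
    """
    if len(judged_m) != len(actual_m) or not judged_m:
        raise ValueError("need equal-length, non-empty samples")
    return sum(j / a for j, a in zip(judged_m, actual_m)) / len(judged_m)

# Hypothetical blind-walking trials: distance walked vs. true target distance
real_world = estimation_ratio([4.9, 7.8, 10.1], [5.0, 8.0, 10.0])
vr_hmd = estimation_ratio([3.6, 5.5, 7.2], [5.0, 8.0, 10.0])
print(round(real_world, 2), round(vr_hmd, 2))  # near-veridical vs. compressed
```

A statistic of this kind is how results such as "real world estimations averaged 99.9% of the actual distance" are reported in the works this survey covers.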

According to Cutting and Vishton [1], human distance perception is still not completely understood even after being studied extensively for almost 100 years. The work done over these years has helped in demystifying the basics of how distance perception works and, most importantly, has generated different experimental measurements and protocols. Measuring distance perception is challenging: it cannot be assessed directly because it is a conscious event, and for that reason the different experimental techniques involve quantifying judgments reported by a subject. The main methods employed to measure distance perception include matching tasks, verbal reports, and the bisection task. In the matching task, the position of a marker in one orientation is adjusted by a subject to correspond to the distance to a target. In verbal reporting, subjects use a unit of measurement to communicate the distance between themselves and an object. In the bisection task, subjects position a marker at the midpoint of the distance between themselves and an object.

Thompson et al. [2] discuss other measurement protocols that are also used to quantify distance blindly, such as blind walking and blind reaching, where the subject tries to reach a previously seen object with eyes covered.

As mentioned by Cutting and Vishton [1], visual space is split into three subspaces: the personal space, the action space, and the distant space.

TABLE I. THE VISUAL SPACE TYPES

The personal space: up to 2 m; targets can be grasped and handled by hand.
The action space: between 2 and 30 m; within this subspace, actions such as walking and running are possible, allowing one to grasp objects, throw them, and also communicate and carry out conversations.
The distant space: beyond 30 m; in this subspace, running and walking are performed, and it includes targets that can be approached or moved away from.

a) Blind walking method:
Thompson et al. [2] state that blind walking is considered the main technique used to assess and quantify distance judgments. In this technique, a subject sees an object and then walks towards it with covered eyes. The work of Waller and Richardson [3] demonstrates blind walking's dominance as a technique: their experiments showed subjects performing the task with marked precision. Furthermore, the blind walking task shows that perceived distance can be measured without bias in the real world, with results mostly close to the expected value. Nevertheless, Loomis and Philbeck [4] showed that blind walking does not perform well for distances beyond 20 m; the method appears to have a limited distance range.

b) Verbal reporting method:
Verbal reporting is used to assess distance in all three subspaces: personal, action, and distant. The subject in this technique remains in a motionless position. Da Silva [5] used verbal reporting to investigate distances as long as 9 km. However, different research has shown that distances obtained with verbal reports are commonly compressed, and Loomis and Philbeck [6] further investigated the impact of cognitive ability on the verbal reporting method. Distances obtained by verbal reports are considered objective since they do not require arranging and controlling objects by hand. All these matters have been a driver for more research to come up with other possible measurement techniques.

c) Bisection method:
The literature shows that much research has been conducted to find and study the best methods to assess distances. Bisection, according to Da Silva [5], is also used to examine distances as long as hundreds of meters. Humans naturally move and control their bodies accurately and with a high degree of proficiency, specifically within personal and action space. This is why the perceived space is a major key for any distance estimation technique.

Gilinsky [7] performed experiments showing a great compression in distances reported using bisection. However, these results were not strong enough, since only two subjects were involved. Bodenheimer et al. [8], Da Silva [5], Purdy and Gibson [9], and Rieser, Ashmead, Talor, and Youngquist [10] conducted the same experiment with hundreds of observers and showed that real world distances were precisely reported using bisection. Interesting results were reported by Lappin et al. [11], who demonstrated a considerable impact of the environmental context on distance estimations: estimations differed for the same experiment in two different environments.

III. DEPTH PERCEPTION IN VIRTUAL REALITY

Swan et al. [15], Waller and Richardson [16], and Thompson et al. [2] did rigorous work studying distance estimation in virtual environments. Their work investigated distance estimations in virtual environments with Head Mounted Displays in the action subspace. A constant and recurrent result from their work is that distances in virtual environments are underestimated compared with distances perceived in the real world.

Waller and Richardson [3] performed three experiments to investigate the fundamental process of the impact of virtual environment interaction and the circumstances generated under it. A rich set of experiments included 28 distance estimations in various laboratories: 14 experiments were conducted for VR estimations and 14 others for real world distance estimations. Experiment 1 had subjects first interact with the immersive virtual environment and then assess distances. Subjects overestimated distances in the physical world, which suggests that interacting with the VE recalibrated the perceptual system. Experiment 2 concluded that the interaction process does not need visual information to give accurate distance judgments; body information alone is sufficient. The results obtained by the authors showed that real world estimations averaged 99.9% of the actual distance, while VR distance estimations averaged 71% of the actual distance.

Jones et al. [17], Mohler, Creem-Regehr, and Thompson [18], and Waller and Richardson [3] conducted studies to investigate the performance of distance estimation when subjects are not isolated from the real world. Results proved that distance estimation accuracy improved when subjects move and interact with the virtual environment and react to and control their actions.

The experiments run by Waller and Richardson [3] did not employ bisection as a technique to measure distance. However, two other works have applied bisection to investigate distance judgment with Virtual Reality Head Mounted Displays: Bodenheimer et al. [8] and Williams, Johnson, Shores, and Narasimham [19]. Results from both works showed a great compression in VR, although Bodenheimer et al. [8] demonstrated an expansion of bisected intervals in personal space at short distances, and accurate bisected intervals in the physical world.

Realism is a key element in identifying correct distance estimation and depth perception. It is considered a crucial issue in virtual reality, as depth perception is essential in various motor-based actions performed in virtual reality such as reaching, pointing, grasping, and intercepting. According to Gibson [20], monocular cues such as motion parallax, dynamic shadows, and textured objects impact the visual experience; when enhanced, they enhance body-based interactions. This is true for the real world environment: the richer it is in depth cues, the more accurate perception is. However, as Murgia and Sharkey [21] found, in the virtual environment depth estimation is not accurate no matter how rich or poor in depth cues the environment is. Results from experiments run by Knapp and Loomis [22] and Willemsen et al. [23] showed a consistent underestimation of distances measured using an HMD when compared with the same judgment in the real world. The same results were obtained by Messing et al. [24] and Thompson et al. [25]: distance judgments were underestimated using an HMD.

From the literature, many research works have suggested different explanations of depth underestimation in both the real world and the virtual world with head mounted displays. Knapp and Loomis [22] studied the impact of the field of view on depth underestimation in virtual environments using an HMD; results showed that the field of view has no influence on depth perception. Willemsen et al. [26] studied mechanical qualities and their contribution to the distance misestimation issue in VR HMDs; results showed that the mechanical aspects have a great deal to do with depth misestimation in VR. Other research by Thompson [25] concluded that the absence of realism in computer-generated scenes does not influence depth perception in VEs. Interrante et al. [27] studied how the difference between the real world and the virtual world causes the depth perception issue: observers are conscious that the experimental environment is not real. The work of Mon-Williams and Tresilian [28] studied other depth cues and their influence on depth perception, such as accommodation and convergence or directed actions.

IV. DEPTH PERCEPTION IN AUGMENTED REALITY

Augmented Reality (AR) involves incorporating virtual objects into the real world. AR systems deploy lighting and shadows as main depth cues to allow the virtual scene to be flawlessly integrated and mixed with the real environment, which makes the virtual scene more real. However, this is not an easy task, since realism as a property of a generated scene is not easy to quantify: it is complicated to determine whether the scene is the same as the real one.

In the literature, many works have been done to figure out the depth perception issue in Virtual Reality; however, few experiments for the same purpose have been conducted for Augmented Reality. Many of these studies examined virtual objects seen with Head Mounted Displays and used blind walking to measure distances in the action subspace. Swan et al. [15] demonstrated that distances in AR are underestimated compared with distances in the real world, but the underestimation in AR was commonly lower than in VR. Another work, by Jones et al. [29], performed an experiment comparing distances in three different environments: real world, VR, and AR. Results showed that distances were not underestimated in AR, but they were in VR. However, Grechkin et al. [30] found different results asserting the opposite, with distances underestimated in both AR and VR. Jones et al. [17] later revealed the reason behind these opposite results, finding that a key factor tunes the underestimation in AR: moving around and getting familiar with the scene in the real world makes observers produce correct estimates in both AR and VR for the same scene, while they underestimate distances if they stay stationary without moving around the real world scene, as happened in the experiment by Grechkin et al. [30].

What can be concluded from this chain of works is that, because observers are used to seeing virtual objects in the real world and can walk around the physical scene, the underestimation of distances in AR using an HMD is unlikely to happen as it does in VR.

The studies presented above all investigated depth perception in optical see-through AR systems, in which an HMD is worn and the scene is viewed through optical combiners. However, few studies have been conducted to investigate depth perception in video see-through AR, which involves wearing a VR HMD to view the world through an attached camera. Messing and Durgin [31] examined distance estimates in AR using a monocular HMD and the blind walking technique; results proved an underestimation equal to that found for VR. In 2013 and 2014, two works in the literature focused on studying the influence of stereo viewing and auxiliary augmentations with a stereo camera and HMD. Kytö et al. [32] performed this experiment using verbal reports to assess distances, and found that distance estimation performance improved. Kytö, Mäkinen et al. [33], on the other hand, ran the same experiments in the same conditions using matching tasks instead of verbal reports, and obtained the same improvement in distance accuracy. The authors believe that, in order to thoroughly understand the impact of optical and video see-through AR on distance estimates, it is essential to compare both conditions in the same experiment with the same measurement method. To the authors' knowledge, no experiment of this kind has been done yet.

Other research has studied depth perception in AR using other device types: tablet- and phone-based AR. Dey et al. [34], Dey et al. [35], Dey and Sandor [36], and Sandor et al. [37] studied depth perception in tablet- and phone-based AR. They developed various depth cue techniques and checked their efficiency in distant space, employing the verbal report method to evaluate distances. Dey et al. [35] also ran an experiment to validate whether screen size and resolution have any impact on depth perception; results showed that a large screen considerably enhanced distance estimates.

V. CONCLUSION AND DISCUSSION

The evident underestimation of distances found by most recent research is an occult problem not completely understood by the research community. Throughout the work done in this area, authors report that seeing the virtual world through a Virtual Reality HMD is the main cause behind the compression of distances in this type of environment compared with distances in the real world. Studying how a virtual system with an HMD operates will therefore help in understanding how perception works in this environment and which factors impact depth perception. In this paper, we mainly reviewed the state of the art of design challenges and limitations related to Virtual Reality and Augmented Reality. This survey is also a full overview of the main methods and techniques used to measure perceived distance, together with the main results found by authors in the different works performed to study depth perception.

This survey is part of a project the laboratory is working on to study depth perception in real-time rendering systems in VR HMDs.

REFERENCES
[1] J. E. Cutting and P. M. Vishton, "Perceiving layout and knowing distances: The integration, relative potency, and contextual use of different information about depth," in W. Epstein & S. J. Rogers (Eds.), Perception of Space and Motion. Handbook of Perception and Cognition, San Diego, CA, pp. 69–117, 1995.
[2] W. B. Thompson, R. Fleming, S. Creem-Regehr, and J. K. Stefanucci, Visual Perception from a Computer Graphics Perspective. Boca Raton, FL: CRC Press, 2011.
[3] D. Waller and A. R. Richardson, "Correcting distance estimates by interacting with immersive virtual environments: Effects of task and available sensory information," Journal of Experimental Psychology: Applied, 14 (1), 61–72, 2008.
[4] J. M. Loomis and J. W. Philbeck, "Measuring spatial perception with spatial updating and action," in M. Behrmann, R. L. Klatzky, & B. Macwhinney (Eds.), Embodiment, Ego-Space, and Action, pp. 1–43, New York, NY: Psychology Press, 2008.
[5] J. A. D. Da Silva, "Scales for perceived egocentric distance in a large open field: Comparison of three psychophysical methods," The American Journal of Psychology, 98 (1), 119–144, 1985.
[6] J. M. Loomis and J. W. Philbeck, "Measuring spatial perception with spatial updating and action," in M. Behrmann, R. L. Klatzky, & B. Macwhinney (Eds.), Embodiment, Ego-Space, and Action, pp. 1–43, New York, NY: Psychology Press, 2008.
[7] A. S. Gilinsky, "Perceived size and distance in visual space," Psychological Review, 58 (6), 460–482, 1951.
[8] B. Bodenheimer, J. Meng, H. Wu, G. Narasimham, B. Rump, T. P. McNamara, T. H. Carr, and J. J. Rieser, "Distance estimation in virtual and real environments using bisection," in R. Fleming & M. Langer (Eds.), Proceedings of the 4th Symposium on Applied Perception in Graphics and Visualization, pp. 35–40, ACM Press, 2007.
[9] J. Purdy and E. J. Gibson, "Distance judgment by the method of fractionation," Journal of Experimental Psychology, 50 (6), 374–380, 1955.
[10] J. J. Rieser, D. H. Ashmead, C. R. Talor, and G. A. Youngquist, "Visual perception and the guidance of locomotion without vision to previously seen targets," Perception, 19 (5), 675–689, 1990.
[11] J. S. Lappin, A. L. Shelton, and J. J. Rieser, "Environmental context influences visually perceived distance," Perception & Psychophysics, 68 (4), 571–581, 2006.
[12] J. Loomis, "Distal attribution and presence," Presence, vol. 1, pp. 113–119, 1992.
[13] J. M. Loomis, J. A. Da Silva, J. W. Philbeck, and S. S. Fukusima, "Visual perception of location and distance," Current Directions in Psychological Science, vol. 5, pp. 72–77, 1996.
[14] J. M. Loomis and J. M. Knapp, "Visual perception of egocentric distance in real and virtual environments," in L. J. Hettinger and M. W. Haas (Eds.), Virtual and Adaptive Environments, pp. 21–46, 2003.
[15] J. E. Swan, A. Jones, E. Kolstad, M. A. Livingston, and H. S. Smallman, "Egocentric depth judgments in optical, see-through augmented reality," IEEE Transactions on Visualization and Computer Graphics, 13 (3), 429–442, 2007.
[16] D. Waller and A. R. Richardson, "Correcting distance estimates by interacting with immersive virtual environments: Effects of task and available sensory information," Journal of Experimental Psychology: Applied, 14 (1), 61–72, 2008.
[17] J. A. Jones, J. E. Swan, G. Singh, and S. R. Ellis, "Peripheral visual information and its effect on distance judgments in virtual and augmented environments," in S. Creem-Regehr & K. Myszkowski (Eds.), Proceedings of the ACM SIGGRAPH Symposium on Applied Perception in Graphics and Visualization (APGV), pp. 29–36, New York, NY: ACM Press, 2011.
[18] B. J. Mohler, S. H. Creem-Regehr, and W. B. Thompson, "The influence of feedback on egocentric distance judgments in real and virtual environments," in R. Fleming & S. Kim (Eds.), Proceedings of the 3rd Symposium on Applied Perception in Graphics and Visualization, pp. 9–14, New York, NY: ACM Press, 2006.
[19] B. Williams, D. Johnson, L. Shores, and G. Narasimham, "Distance perception in virtual environments," in S. H. Creem-Regehr & K. Myszkowski (Eds.), ACM Symposium on Applied Perception in Graphics and Visualization, p. 193, New York, NY: ACM Press, 2008.
[20] J. Gibson, The Ecological Approach to Visual Perception. London: Lawrence Erlbaum Associates, 1986.
[21] A. Murgia and P. M. Sharkey, "Estimation of distances in virtual environments using size constancy," The International Journal of Virtual Reality, vol. 1, pp. 67–74, 2009.
[22] J. M. Knapp and J. M. Loomis, "Limited field of view of head-mounted displays is not the cause of distance underestimation in virtual environments," Presence, vol. 13, pp. 572–577, 2004.
[23] P. Willemsen, M. B. Colton, S. H. Creem-Regehr, and W. B. Thompson, "The effects of head-mounted display mechanical properties and field of view on distance judgments in virtual environments," ACM Trans. Appl. Percept., vol. 6, no. 2, pp. 1–14, 2009.
[24] R. Messing and F. H. Durgin, "Distance perception and the visual horizon in head-mounted displays," ACM Trans. Appl. Percept., vol. 2, no. 3, pp. 234–250, 2005.
[25] W. B. Thompson, P. Willemsen, A. A. Gooch, S. H. Creem-Regehr, J. M. Loomis, and A. C. Beall, "Does the quality of the computer graphics matter when judging distances in visually immersive environments?" 2002.
[26] P. Willemsen, A. A. Gooch, W. B. Thompson, and S. H. Creem-Regehr, "Effects of stereo viewing conditions on distance perception in virtual environments," Presence: Teleoper. Virtual Environ., vol. 17, no. 1, pp. 91–101, 2008.
[27] V. Interrante, B. Ries, and L. Anderson, "Distance perception in immersive virtual environments, revisited," in Proc. Virtual Reality Conference, pp. 3–10, 2006.
[28] M. Mon-Williams and J. R. Tresilian, "Ordinal depth information from accommodation?" Ergonomics, vol. 43, no. 3, pp. 391–404, Mar. 2000.
[29] J. A. Jones, J. E. Swan, G. Singh, E. Kolstad, and S. R. Ellis, "The effects of virtual reality, augmented reality, and motion parallax on egocentric depth perception," in L. T. De Paolis & A. Mongelli (Eds.), Augmented Reality, Virtual Reality, and Computer Graphics: Third International Conference, AVR 2016, Proceedings, Lecce, Italy, pp. 388–396, June 15–18, 2016.
[30] T. Y. Grechkin, T. D. Nguyen, J. M. Plumert, J. F. Cremer, and J. K. Kearney, "How does presentation method and measurement protocol affect distance estimation in real and virtual environments?" ACM Transactions on Applied Perception, 7 (4), pp. 1–18, 2010.
[31] R. Messing and F. H. Durgin, "Distance perception and the visual horizon in head-mounted displays," ACM Transactions on Applied Perception, 2 (3), pp. 234–250, 2005.
[32] M. Kytö, A. Mäkinen, J. Häkkinen, and P. Oittinen, "Improving relative depth judgments in augmented reality with auxiliary augmentations," ACM Transactions on Applied Perception, 10 (1), pp. 1–21, 2013.
[33] M. Kytö, A. Mäkinen, T. Tossavainen, and P. Oittinen, "Stereoscopic depth perception in video see-through augmented reality within action space," Journal of Electronic Imaging, 23 (1), 2014.
[34] A. Dey, A. Cunningham, and C. Sandor, "Evaluating depth perception of photorealistic mixed reality visualizations for occluded objects in outdoor environments," in T. Komura & Q. Peng (Eds.), Proceedings of the 17th ACM Symposium on Virtual Reality Software and Technology, pp. 211–218, New York, NY: ACM Press, 2010.
[35] A. Dey, G. Jarvis, C. Sandor, and G. Reitmayr, "Tablet versus phone: Depth perception in handheld augmented reality," in M. Gandy, K. Kiyokawa, & G. Reitmayr (Eds.), International Symposium on Mixed and Augmented Reality (ISMAR), pp. 187–196, Piscataway, NJ: IEEE, 2012.
[36] A. Dey and C. Sandor, "Lessons learned: Evaluating visualizations for occluded objects in handheld augmented reality," International Journal of Human–Computer Studies, 72 (10–11), 704–716, 2014.
[37] C. Sandor, A. Cunningham, A. Dey, and V. V. Mattila, "An augmented reality X-ray system based on visual saliency," in V. L. Park & T. Höllerer (Eds.), International Symposium on Mixed and Augmented Reality (ISMAR 2010), pp. 27–36, Piscataway, NJ: IEEE, 2010.
[38] J. P. Gerval, M. Popovici, M. Ramdani, O. El Kalai, V. Boskoff, and J. Tisseau, "Virtual environments for children," in Proceedings of the International Conference on Computers and Advanced Technology in Education (CATE), Cancun, Mexico, pp. 416–420, 2002.
[39] M. Auvray and P. Fuchs, "Perception, immersion et interactions sensorimotrices en environnement virtuel," in A. Grumbach & E. Klinger (Eds.), Réalité Virtuelle et Cognition, Numéro spécial de Intellectica, vol. 45, no. 1, pp. 23–35, 2007.
[40] P. Milgram and F. Kishino, "A taxonomy of mixed reality visual displays," IEICE Transactions on Information and Systems, 77 (12), 1321–1329, 1994.
[41] F. El Jamiy and R. Marsh, "A survey on depth perception in head mounted displays: Distance estimation in virtual reality, augmented reality and mixed reality," IET Image Processing, 2019.
