
Journal Pre-proof

Using virtual worlds to understand insect navigation for bio-inspired systems

Pavan Kumar Kaushik, Shannon B Olsson

PII: S2214-5745(20)30119-X
DOI: https://doi.org/10.1016/j.cois.2020.09.010
Reference: COIS 737

To appear in: Current Opinion in Insect Science

Please cite this article as: Kaushik PK, Olsson SB, Using virtual worlds to understand insect
navigation for bio-inspired systems, Current Opinion in Insect Science (2020),
doi: https://doi.org/10.1016/j.cois.2020.09.010

This is a PDF file of an article that has undergone enhancements after acceptance, such as
the addition of a cover page and metadata, and formatting for readability, but it is not yet the
definitive version of record. This version will undergo additional copyediting, typesetting and
review before it is published in its final form, but we are providing this version to give early
visibility of the article. Please note that, during the production process, errors may be
discovered which could affect the content, and all legal disclaimers that apply to the journal
pertain.

© 2020 Published by Elsevier.


Title
Using virtual worlds to understand insect navigation for bio-inspired systems

Authors
Pavan Kumar Kaushik1* and Shannon B Olsson1*
1. National Centre for Biological Sciences, Tata Institute of Fundamental Research, GKVK
Campus, Bellary Road, Bengaluru, India 560064
*correspondence to pavan@nice.ncbs.res.in or shannon@nice.ncbs.res.in

Short title
Insect VR for bio-inspired systems

Highlights

● Navigation is challenging in natural environments.
● Insects have evolved sophisticated navigation strategies with small nervous systems.
● Insect navigation is a model for developing strategies for artificial systems.
● Virtual reality (VR) is an excellent tool to study insect ethology.
● Natural history and ecology are essential for understanding real world navigation.

Abstract

Insects perform a wide array of intricate behaviors over large spatial and temporal scales in complex natural environments. A mechanistic understanding of insect cognition has direct implications for how brains integrate multimodal information and can inspire bio-based solutions for autonomous robots. Virtual Reality (VR) offers an opportunity to assess insect neuroethology while presenting complex, yet controlled, stimuli. Here, we discuss the use of insects as inspiration for artificial systems, recent advances in different VR technologies, current knowledge gaps, and the potential for application of insect VR research to bio-inspired robots. Finally, we advocate the need to diversify our model organisms and behavioral paradigms and to embrace the complexity of the natural world. This will help uncover the proximate and ultimate bases of brain and behavior and extract general principles for common challenging problems.

Keywords
neuroethology, autonomous navigation, search algorithm, sensory systems, ecology
Introduction
Autonomous navigation by artificial systems in complex natural environments has been a major
challenge of robotics. In real-world conditions, sensory input is spatiotemporally complex,
multimodal, has large dynamic ranges, and fluctuates stochastically, making precise predictions
virtually impossible for current technologies. However, 500 million years of evolutionary
adaptations have allowed insects to accomplish this task with less than half a million neurons.
Insects perform a wide array of complex behaviors over large spatial and temporal scales, ranging from rapid aerial maneuvers [1,2] and long-distance plume following [3,4] to intercontinental multigenerational migrations [5–7]. Insects execute these computationally intensive processes
[3,8–12] despite their limited computational resources, while our robot counterparts with ample
resources struggle in the real world [13–18]. A mechanistic understanding of how insects parse complex natural environments has direct implications for how brains integrate multimodal information and can inspire bio-based solutions for autonomous robots and smarter computer algorithms that can parse multiple inputs and respond appropriately in the real world [13,19].
Dissection of insect behavior demands precise quantification of where and when insects detect
and respond to objects in ethological contexts, while simultaneously being able to dynamically manipulate these stimuli to deduce causal relationships. However, this is a challenging
endeavor due to insects’ small size, high speeds, and the large spatio-temporal scales in which
they navigate. Field studies have limited manipulative power, and lab assays are constrained to small scales in simplified environments. Virtual Reality (VR) arenas overcome many of these
challenges by restraining the animal and artificially providing sensory stimuli that can be
updated based on the animal’s actions [20,21]. Present-day VRs incorporate multiple modalities [22–30], natural scene complexity [31], and can simultaneously record from the brain [32] or
present virtual stimuli while the animal is freely flying [33,34]. This enables fine-scale
manipulation of stimulus and response dynamics to assess brain and behavior [11,21]. Here, we
acknowledge these technological developments and discuss the potential to apply insect VR research to bio-inspired robots. To achieve this potential, we advocate the need for transdisciplinary approaches that integrate neuroscience, engineering, ecology, and natural history to properly translate insect behavior to artificial systems.
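As a minimal illustration of the closed-loop principle described above, the following Python sketch shows one frame of a hypothetical VR update cycle: a behavioral readout (e.g., the difference in wingbeat amplitudes of a tethered fly) is converted into an intended turn, the virtual pose is integrated, and the scene is re-rendered. All names, gains, and the rendering placeholder are illustrative assumptions, not the interface of any published system.

```python
import numpy as np

DT = 1.0 / 120.0   # assumed display refresh interval (s)
GAIN = 5.0         # assumed behavior-to-turn gain (rad/s per unit readout)

def closed_loop_step(heading, position, readout, forward_speed=0.1):
    """Advance the virtual pose by one frame given a behavioral readout."""
    yaw_velocity = GAIN * readout                     # intended turn from behavior
    heading = (heading + yaw_velocity * DT) % (2 * np.pi)
    # Advance along the current heading (fixed speed; 2D for simplicity).
    position = position + forward_speed * DT * np.array(
        [np.cos(heading), np.sin(heading)])
    return heading, position

heading, position = 0.0, np.zeros(2)
for frame in range(1000):
    readout = 0.0  # placeholder: replace with a real-time sensor measurement
    heading, position = closed_loop_step(heading, position, readout)
    # render_scene(heading, position)  # placeholder for the display update
```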

Why study insects?


Insects forage, disperse, and find mates and other objects in the noisy natural environment over
distances that are many orders of magnitude larger than their body size [4,5,31,35]. Insect vision is
fundamental for navigation, stable flight control, object avoidance and target tracking [12,36,37].
However, vision alone is insufficient for such long-range target searches due to the limited spatial acuity of insect eyes [4,35,36]. Insects overcome this limitation with long-range odor plume following and
multimodal sensory integration during search [10,23,26,31,38]. Mosquitoes, for example, locate
humans by integrating the olfactory, visual, and heat signatures of their warm-blooded hosts [26,39]. Nevertheless, plume following is a challenging task since advection and turbulence in
air chaotically disperse the plume into a meandering, intermittent stream of packets [3,10]. Wind
not only affects how insects follow plumes, but also the magnitude and direction of their
migrations [5–7,40]. Sound and vibration also play an important role in locating objects from a
distance for several insects [8,24,41,42]. Performing these diverse tasks necessitates that
insects parse the natural world’s complex features amidst a noisy background. These features
are spread across multiple sensory systems, from hyperspectral visual signals, polarization patterns, stochastic olfactory input, turbulent airflows, and near- and far-field sounds to the Earth’s magnetic field, among others (Figure 1). Along with these features, insects must maintain diverse internal states that encode goal-oriented behaviors, mating status, hunger levels, and other
physiological conditions. They must then integrate these multiple sensory modalities while accounting for their internal states and their position and orientation with respect to the outside world, and output motor commands (Figure 2).
If we want our robots to be truly autonomous in our world, they too must perform many of these
tasks and computations. The challenges currently facing our artificial systems arise due to the
structural complexity of real-world scenes, unmapped environments, unpredictability of
turbulence, noisy sensors, limited online computational power, range anxiety, and other
technological limitations. For example, despite our current computational capabilities,
foreground segmentation, monocular depth sensing, and object localization in real environments are still open problems in computer vision [14]. GPS-denied navigation with only onboard computation in uncharted territory remains a challenge [43]. Turbulence in open environments and rotor wash make plume following difficult for robots [17,18]. Insects measure the same sensory
quantities, face analogous algorithmic challenges, and have to perform similar tasks with limited resources [13]. We can take inspiration from the neuronal anatomy, stereotyped behaviors, and
underlying computational algorithms of insect navigation to enhance our artificial systems and
allow them to navigate in stochastic real-world environments.


Figure 1. An insect’s multisensory view of reality. Insects make use of a wide variety of cues
available in nature over large spatial and temporal scales to perform diverse behaviors. They
use landmarks, horizon, sky polarization, and magnetic fields for orientation, dispersal, and
migration. Using wide-field visual cues, wind, and odor plumes, they approach distant visual
targets. By adding ultraviolet, electric fields, humidity, and temperature cues, they make short-range decisions for landing.
Figure 2. Multimodal sensory integration during long-range search. Insect brains receive
sensory input from the environment and rapidly perform multiscale feature extraction to detect
edges, shapes, odor identity, wind speed, and other aspects. Using these features, they estimate self-motion, perform figure-ground discrimination, track targets, detect obstacles, localize sound, and more [24,31,44–48]. They integrate these multimodal features based on both past and
present internal and external states to select the appropriate motor response
[11,22,23,26,27,39,49–51]. An efference copy is sent back, and any discrepancy due to
environmental disturbances or motor errors is corrected using rapid stabilization reflexes. Over
time, these contextual and non-linear processes combine to maintain the current behavioral
state or transition to new states, allowing for complex decision making such as mate selection,
foraging, migration, and a number of other behaviors. In the flow chart, colors and shapes
indicate the different aspects of sensory integration from input to output as follows: environment
and sensory input (red), decision making (green), memory (yellow) and motor outputs (blue).
Table 1. List of recent virtual reality technologies developed for studying insects

| Modality | Advance | Description | Ref |
|---|---|---|---|
| Visual | 2D patterns | 2D optic flow stimuli with bars and gratings using drums, motors, and custom LED panels | [36,37,52–55] |
| | 3D objects | 3D optic flow stimuli with spots, cones, and spheres using displays and projectors | [56–59] |
| | Naturalistic scenes | 3D color scenery with naturalistic stimuli using gaming technology | [31] |
| | Ultraviolet | Multispectrum display with arbitrary visual and UV wavelengths | [60] |
| | Polarization | Polarized visual input with tunable angle of polarization | [59,61] |
| Olfactory | Odor | A freely rotating magnetic tether with odor input | [62,63] |
| | 360° odor | 360° closed-loop odor delivery with arbitrary odorscapes | [31] |
| Mechanosensory | Wind | Laminar airflow delivery with arbitrary direction | [22,23,64] |
| | 360° wind | 360° closed-loop wind delivery with arbitrary windscapes | [31] |
| | Free flight | A closed-loop arena for freely flying insects | [33,65] |
| | Sound | A 360° sound speaker array with arbitrary soundscapes | [24,25] |
| Magnetosensory | Compass | Magnetic field input with tunable field direction | [66] |
Advances in VR arenas for insect research
Insect VRs have been around for more than half a century and were originally created to
examine insect flight and visual systems, particularly optomotor reflexes. For comprehensive
historical overviews, see reviews [21,36,37]. Here we concentrate on recent advances, with
acknowledgment of the extensive history behind these techniques. Due to technological
constraints in sensors, computation, and displays, initial VRs were built by surrounding tethered
flies with rotating drums [55,67,68] (see Table 1 for a list of recent advances in VR arenas). The
drums were lined with mathematically simple visual patterns such as bars, gratings, dots, and
others that provided optic flow. With technological advances, these patterns were animated
using cathode-ray monitors [69] and custom LED panels to provide different kinds of optic flow
[28,48,52–54]. Recent advances in gaming displays and 3D projection technologies provide
high refresh rate visual input that allow the use of color [31], complex 3D patterns [33,57,70],
polarization cues [59,61], hyperspectral [60], and naturalistic scenes [31] while simultaneously
recording neural signals from the brain [32] and optogenetically manipulating them [27,57–59].
Recently, scientists have elucidated the neural basis of many behaviors, such as how insects detect motion [37], control their walking speed [71], perform spontaneous turns [22,37,72], track bars [28,44,73], avoid obstacles [28,37], and navigate to virtual 2D, 3D, and naturalistic objects [31,57,70]. VR has also helped reveal the role of the
insect central complex in navigation [11]. Using VR, scientists have discovered that the central complex performs diverse computations: it integrates angular headings and velocities [50,74], guides navigation toward internal goals [49], incorporates proprioceptive cues from the halteres [75], integrates polarization cues to encode sun position during menotaxis [61,76,77], and can represent heading even after arbitrary scene remapping [51,78].
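As a toy illustration of the angular integration attributed to the central complex [50,74], the following sketch integrates noisy angular-velocity samples into a wrapped heading estimate; the drift that accumulates illustrates why visual cues are needed to stabilize the heading representation [51,78]. The noise model and all values are illustrative assumptions.

```python
import numpy as np

def integrate_heading(angular_velocity, dt=0.01, sigma=0.05, seed=0):
    """Integrate noisy angular-velocity samples into a wrapped heading."""
    rng = np.random.default_rng(seed)
    heading = 0.0
    trace = []
    for omega in angular_velocity:
        heading += (omega + rng.normal(0.0, sigma)) * dt  # noisy integration
        heading %= 2 * np.pi                               # wrap to [0, 2*pi)
        trace.append(heading)
    return np.array(trace)

# Example: a constant 0.5 rad/s turn for 10 s; the estimate drifts slowly
# from the true heading, illustrating the need for corrective landmark cues.
trace = integrate_heading(np.full(1000, 0.5))
print(f"estimate: {trace[-1]:.2f} rad (true: {0.5 * 10 % (2 * np.pi):.2f} rad)")
```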
With advances in 3D printing, VRs are now capable of providing precise directional airflow (not
wind per se) even in closed loop [31]. Using VR, flies have been shown to use bilateral differences in arista deflection to measure wind speed and direction [79]. In VR, flies also integrate wind and visual
cues as a sum of spatiotemporally filtered signals [22] with selective attention [64]. Flies can
also navigate in virtual worlds using only wind speed and direction in the absence of any optic
flow, an ability critical for high flying night migrants [6,31]. Technological enhancements,
particularly faster valves [80,81] and newer tethering paradigms [62,63,82] enable the use of
olfaction in VR arenas. Flies in olfactory VR have been shown to use concentration differences
across their antennae to orient toward higher concentrations [62]. They also follow wind-tunnel-like virtual plumes in VR by integrating visual, wind, and odor cues [31]. In Drosophila, attractive
odors reverse the valence of visual objects [27]. Using recent advances in mosquito genetics,
thermotaxis was shown to be enhanced by visual landing cues and exposure to CO2 [26]. CO2
exposure also modulated vision but not vice versa [39]. Honeybees can also learn using cues
from VR [70,83,84]. Using stereo speakers in VR, flies were shown to localize sound using differences in inter-antennal vibrations [24]. However, the complexity of insects’ 3D sound localization strategies limits what can be inferred from stereo speakers alone [8]. Using a 360° speaker array and
a panoramic visual input, crickets have been shown to use virtual sound and visual input for decision making and source localization [25]. Finally, migrating moths have been found to use
the Earth’s magnetic field and visual landmarks to steer flight in VR [66].
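To make one of these integration schemes concrete, the sketch below models a turning response as a weighted sum of temporally filtered unimodal signals, in the spirit of the wind-plus-vision summation reported in [22]. The filter form, time constants, and gains are illustrative assumptions, not fitted values.

```python
import numpy as np

def lowpass(signal, tau, dt=0.01):
    """First-order low-pass filter (simple exponential smoothing)."""
    out = np.zeros_like(signal)
    alpha = dt / (tau + dt)
    for i in range(1, len(signal)):
        out[i] = out[i - 1] + alpha * (signal[i] - out[i - 1])
    return out

def predicted_turn(visual_input, wind_input,
                   tau_vis=0.1, tau_wind=0.5, g_vis=1.0, g_wind=0.6):
    """Turning response as a weighted sum of filtered unimodal responses."""
    return g_vis * lowpass(visual_input, tau_vis) + \
           g_wind * lowpass(wind_input, tau_wind)

t = np.arange(0, 5, 0.01)
visual = np.where(t > 1.0, 1.0, 0.0)   # a step of visual motion at t = 1 s
wind = np.where(t > 2.0, -1.0, 0.0)    # an opposing wind step at t = 2 s
turn = predicted_turn(visual, wind)    # summed, filtered multimodal response
```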
The VRs presented until now have used tethered insects, owing to convenience and the technological barriers to performing neurophysiological recordings while insects fly freely. To estimate the
amount of intended motion of the insect, different methods such as floating balls [57,85,86],
torque compensators [36,55], masked photodiodes [37,87], and camera tracking [61,88] are
used, all of which excel at estimating yaw torque. However, flying insects also simultaneously thrust, pitch, and roll during banking turns and evasive maneuvers, enabling 3D
flight [1,2]. We still lack the ability to measure the subtle changes in wing kinematics in real-time
for closed-loop 3D VR flight. Tethering can also restrict maneuverability, particularly in Dipterans. Flies use mechanosensory cues from their halteres, which act like gyroscopes, to estimate self-motion [89] and may also use them for timing information [29]. However, these cues are
hampered by tethering [37], making tethered flight an inaccurate model of free flight. This does not appear to affect higher-level behaviors such as decision making, as the insect choices observed are not measurably different between VR and nature in the cases tested [31,57].
However, tethering artifacts pose challenges for studying the biomechanics of insect flight in VR
as sensory conflicts can cause erroneous behaviors [33]. With the advent of newer tether
designs that can freely yaw, partial aspects of free flight have been recovered [82]. Recently,
using real-time 3D tracking and rapid computation, freely flying insects were provided with
closed-loop visual feedback [33,34,65]. What this approach loses in access for imaging and neural recording, it gains by simulating genuine free flight.

Knowledge gaps
Insect brains have evolved to convert the noisy, complex, multimodal sensory input of the
natural world into intricate movement patterns to perform various behaviors [90]. For neuroethologists whose long-term goal is to understand the organism and its brain, it is prudent to study the organism using ecologically relevant stimuli [3,9,10,90–94]. This is a good
strategy not only due to physiological constraints imposed by the organisms’ evolutionary
history but also due to the physics of the natural world. For example, visual systems are tuned
to the statistics of natural worlds and behave robustly over a wide range of input parameters,
such as contrast, spatial frequency, and spatiotemporal correlations [92–94]. However, natural visual stimuli differ from artificial, “simple” stimuli [92], and responses to the latter are sensitive to even slight changes in input parameters. Simple stimuli such as bars and stripes, which are mathematically well
defined, have been crucial to our characterization of insect visual systems
[28,36,37,44,46,48,95]. Nevertheless, their geometry is fundamentally incapable of providing
complex perspective and parallax cues that surround us in the real world [31]. The use of
naturalistic stimuli has been limited due to technological constraints. With the advent of faster
display technologies, we should use this opportunity to devise new ways of systematically
quantifying ethologically relevant stimuli [96] and use them in our VRs. Polarization and UV
cues in the sky also provide complex spatiotemporal cues for insects during navigation
[23,61,76,77]. These cues are particularly hard to study due to the limits of our own eyes and
limited commercial equipment to measure them. Devising techniques to incorporate these cues into our VRs would let us see our organisms in a new light, both literally and conceptually [59,60].
Similarly, the air surrounding us is almost always turbulent, stochastically tossing around odors.
Insects have evolved to track plumes of odorant mixtures in turbulent environments and behave more robustly there than in laminar environments [3,91,97]. Our inability to precisely measure, model, and simulate turbulent wind and odor plumes has required us to study insect behavior in the laminar airflow used in most wind tunnels [10]. This has restricted our understanding of
the mechanistic basis of olfactory processing in insects. By using turbulent wind tunnels, faster odor delivery systems [80,81], high-resolution odor data [98], and computational power, we can
understand plume tracking behavior in complex and turbulent environments [10].
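For reference, the classic cast-and-surge strategy discussed in [3,10] can be sketched in a few lines: surge upwind while odor is detected, and cast crosswind with widening sweeps after losing the plume. The wind direction convention, speeds, and casting schedule below are illustrative assumptions.

```python
import numpy as np

def cast_and_surge(odor_hits, dt=0.1, surge_speed=1.0, cast_speed=0.5):
    """Return a 2D trajectory given a boolean odor-detection time series.

    Wind is assumed to blow along -x, so upwind is +x.
    """
    pos = np.zeros(2)
    path = [pos.copy()]
    time_since_hit = 0.0
    for hit in odor_hits:
        if hit:
            time_since_hit = 0.0
            pos = pos + np.array([surge_speed, 0.0]) * dt   # surge upwind
        else:
            time_since_hit += dt
            # Cast crosswind, reversing direction with growing sweep duration.
            sweep = np.sin(2 * np.pi * time_since_hit / (1 + time_since_hit))
            pos = pos + np.array([0.0, cast_speed * np.sign(sweep)]) * dt
        path.append(pos.copy())
    return np.array(path)

# Example: sparse, intermittent odor packets, as in a turbulent plume.
rng = np.random.default_rng(1)
trajectory = cast_and_surge(rng.random(500) < 0.1)
```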
Despite the many shortcomings of the tethered paradigm, it does provide precise multimodal sensory stimulation and enables unfettered access to the brain for neuronal recording and
imaging. To close the loop, appendage movements are often transformed into yaw velocities,
limiting virtual flight to a 2D plane. But flies perform maneuvers in 3D by rapidly changing wing stroke dynamics [1]. These are challenging measurements to perform by themselves; building a system that can measure these parameters in real time is a daunting task. Another approach in
progress is the use of two-axis torque meters that can simultaneously measure yaw and thrust
[99]. With advances in deep learning for tracking and pose estimation, micro-electromechanical-system (MEMS) 3D torque sensors, and GHz radar, we may soon be able to measure these flight parameters on the fly and close the loop in 3D.
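To indicate what closing the loop in 3D would entail, the sketch below integrates a full six-degree-of-freedom virtual pose from per-frame estimates of body rates and thrust, rather than a single yaw readout. The small-angle rotation update and all values are illustrative assumptions; obtaining such estimates from wing kinematics in real time is precisely the open challenge described above.

```python
import numpy as np

def rotation_from_body_rates(omega, dt):
    """Small-angle rotation matrix from body rates (roll, pitch, yaw)."""
    rx, ry, rz = np.asarray(omega) * dt
    return np.array([[1, -rz, ry],
                     [rz, 1, -rx],
                     [-ry, rx, 1]])

def update_pose(R, position, body_rates, thrust, dt=1 / 120):
    """Integrate one frame of 3D virtual flight from measured kinematics."""
    R = R @ rotation_from_body_rates(body_rates, dt)  # reorient the body frame
    # Thrust acts along the body's forward axis (first column of R).
    position = position + thrust * dt * R[:, 0]
    return R, position

R, position = np.eye(3), np.zeros(3)
# Placeholder readouts: a real system would estimate these from wing
# kinematics in real time, every display frame.
R, position = update_pose(R, position, body_rates=[0.0, 0.1, 0.5], thrust=0.3)
```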

Future directions
VR is an excellent tool to study insect neuroethology. With the advent of faster displays, 3D
printing, larger computational power, optogenetics, wide-field imaging, CRISPR, and other
innovations, we can now manipulate sensorimotor, neural, and genetic mechanisms to elucidate
both behavior and its neural basis [100]. To reduce anthropomorphic bias in our VRs, we need
to sample the 3D world using high-resolution LIDARs, polarimeters, and hyperspectral imaging
[96]. We must also sample the invisible yet omnipresent microcosmos of airflow and odor in the
turbulent natural world around us using light-sheet PIV, anemometer arrays, and other
technologies in diverse environments [98]. Using this rich body of knowledge, we can harness
the full potential of VR by devising ways to comprehensively present these multimodal,
naturalistic stimuli matched to the insect’s physiology in VR while mimicking the 3D movement
in closed-loop.
In the process of understanding insect vision, multiple algorithms have been implemented in
robots [13,15–19,101]. By studying insects in more complex environments, we can understand
how they parse the complex visual input stream with limited resources. The mechanisms
underlying their ability to perform foreground segmentation from an uncalibrated moving
camera, sense depth using monocular motion parallax cues, and localize occluded objects
amidst clutter would undoubtedly uncover a plethora of mechanisms useful to artificial systems.
These mechanisms directly translate into better bio-inspired computer vision algorithms and
robust real-world navigation of autonomous robots.
The lack of portable, rapid, and sensitive volatile sensors has been a major barrier for devising
autonomous olfactory robots. Robots that mimic insect plume following strategies based on
laminar airflow wind tunnel studies perform well only in tunnel environments [17,18].
Understanding insect plume following in turbulent conditions would generate novel algorithms
for bio-inspired autonomous robots that operate in the real world. This would enable robots to
be used for disaster management, search and rescue, tracking pollutants and hazardous waste,
and many other applications.
Understanding how insects combine multiple ambiguous and noisy streams to perform precisely
targeted behaviors would also help us devise better decision-making algorithms for robust
autonomous performance. By unraveling flight algorithms in 3D by including pitch, yaw, and roll,
we can also improve our bio-inspired algorithms currently limited to 2D. With enhanced multisensory parsing and integration of natural scenes, we can truly navigate a flying drone
autonomously in 3D.
Insects perform a wide variety of complex cognitive tasks in nature, such as predation, parasitic behavior, migration, swarming, mimicry, social learning and memory, nest building, diapause,
and others [60]. However, the majority of VR studies on insects have so far focused on
understanding reflexes and sensory systems that necessitate the use of simple stimuli rather than more ecologically complex inputs. This problem is exacerbated by the knowledge gaps in
our organisms’ ecology and natural history. We need to know the natural history of the species,
how it forages, disperses, and mates [102]. We also must uncover the sensory stimuli that elicit
such responses, as no VR study can supplant the fundamental necessity of studying insects in
their natural habitat. In the absence of such crucial data, we are forced to provide abstract or
ecologically irrelevant stimuli to investigate their behavior. Moreover, to address the diverse
questions and behaviors that are commonplace in insects, there is a pressing need to diversify
our model systems beyond Drosophila [100,102]. Using comparative approaches, we must
study an array of emerging model organisms, performing a variety of complex behaviors in their
respective natural habitats [5,10,12,31,100] while simultaneously recording and imaging their
brains. This will help uncover the proximate and ultimate bases of brain and behavior and
extract general principles for common challenging problems. These approaches will not only
enhance our understanding of the brain but also help us design bio-inspired autonomous robots
and smarter computer algorithms.

Acknowledgements
We thank Sanjay Sane, Upinder Bhalla, Mukund Thattai, Tim C. Pearce, Holger Krapp, Manal
Shakeel and Sree Subha Ramaswamy for helpful discussions. P.K.K. was supported by the
National Centre for Biological Sciences, Tata Institute of Fundamental Research. S.B.O. was
supported by a Ramanujan Fellowship (Science and Engineering Research Board India), a grant from Microsoft Research, and the Department of Atomic Energy, Government of India,
under project no. 12-R&D-TFR-5.04-0800.
Declarations of interest: none.

References
1. Dickinson MH, Muijres FT: The aerodynamics and control of free flight
manoeuvres in Drosophila. Philos Trans R Soc Lond B Biol Sci 2016, 371:20150388.
2. Balebail S, Raja SK, Sane SP: Landing maneuvers of houseflies on vertical and
inverted surfaces. PLoS One 2019, 14:e0219861.
3. Cardé RT, Willis MA: Navigational strategies used by insects to find distant, wind-
borne sources of odor. J Chem Ecol 2008, 34:854–866.
4. Coyne JA, Bryant SH, Turelli M: Long-Distance Migration of Drosophila. 2. Presence in
Desolate Sites and Dispersal Near a Desert Oasis. Am Nat 1987, 129:847–861.
5. Chapman JW, Reynolds DR, Wilson K: Long-range seasonal migration in insects:
mechanisms, evolutionary drivers and ecological consequences. Ecol Lett 2015,
18:287–302.
6. Reynolds AM, Reynolds DR, Sane SP, Hu G, Chapman JW: Orientation in high-flying
migrant insects in relation to flows: mechanisms and strategies. Philos Trans R Soc
Lond B Biol Sci 2016, 371.

7. Hu G, Lim KS, Horvitz N, Clark SJ, Reynolds DR, Sapir N, Chapman JW: Mass seasonal
bioflows of high-flying insect migrants. Science 2016, 354:1584–1587.

8. Römer H: Directional hearing: from biophysical binaural cues to directional hearing
outdoors. Journal of Comparative Physiology A 2015.
9. Lewicki MS, Olshausen BA, Surlykke A, Moss CF: Scene analysis in the natural environment. Front Psychol 2014, 5:199.
10. Baker KL, Dickinson M, Findley TM, Gire DH, Louis M, Suver MP, Verhagen JV, Nagel KI,
Smear MC: Algorithms for Olfactory Search across Species. J Neurosci 2018,
38:9383–9389.
11. Honkanen A, Adden A, da Silva Freitas J, Heinze S: The insect central complex and the
neural basis of navigational strategies. J Exp Biol 2019, 222.

12. Webb B: The internal maps of insects. J Exp Biol 2019, 222.
**The review elaborates the complex computations involved in making internal maps and
outlines the diverse approaches taken by insects.
13. Webb B: Robots with insect brains. Science 2020, 368:244–245.

14. Yazdi M, Bouwmans T: New trends on moving object detection in video images
captured by a moving camera: A survey. Computer Science Review 2018, 28:157–177.
*The review shows the challenges our best machines face in performing the “simple” task of object detection in real environments. These challenges are not unique to robots, and insects that cope with them provide bio-inspiration.
15. Bagheri ZM, Wiederman SD, Cazzolato BS, Grainger S, O’Carroll DC: Performance of an

insect-inspired target tracker in natural conditions. Bioinspir Biomim 2017, 12:025006.


16. Skelton PSM, Finn A, Brinkworth RSA: Consistent estimation of rotational optical flow
in real environments using a biologically-inspired vision algorithm on embedded
hardware. Image Vis Comput 2019, 92:103814.
17. Neumann PP, Bennetts VH, Lilienthal AJ, Bartholmai M: From Insects to Micro Air
Vehicles—A Comparison of Reactive Plume Tracking Strategies. In Intelligent
Autonomous Systems 13. Springer International Publishing; 2016:1533–1548.
18. Li Z, Tian ZF, Lu T-F, Wang H: Assessment of different plume-tracing algorithms for
indoor plumes. Build Environ 2020, 173:106746.
19. Dalgaty T, Vianello E, De Salvo B, Casas J: Insect-inspired neuromorphic computing.
Curr Opin Insect Sci 2018, 30:59–66.
20. Naik H, Bastien R, Navab N, Couzin ID: Animals in Virtual Environments. IEEE Trans
Vis Comput Graph 2020, 26:2073–2083.
21. Dombeck DA, Reiser MB: Real neuroscience in virtual worlds. Curr Opin Neurobiol
2012, 22:3–10.
22. Currier TA, Nagel KI: Multisensory Control of Orientation in Tethered Flying
Drosophila. Curr Biol 2018, 28:3533–3546.e6.
23. Dacke M, Bell ATA, Foster JJ, Baird EJ, Strube-Bloss MF, Byrne MJ, El Jundi B:
Multimodal cue integration in the dung beetle compass. Proc Natl Acad Sci U S A
2019, 116:14248–14253.

24. Batchelor AV, Wilson RI: Sound localization behavior in Drosophila melanogaster
depends on inter-antenna vibration amplitude comparisons. J Exp Biol 2019, 222.
25. Makino K, Ando N, Shidara H, Hommaru N, Kanzaki R, Ogawa H: Auditory-Visual Virtual

Reality for the Study of Multisensory Integration in Insect Navigation. In Biomimetic
and Biohybrid Systems. Springer International Publishing; 2019:325–328.
26. Liu MZ, Vosshall LB: General Visual and Contingent Thermal Cues Interact to Elicit

Attraction in Female Aedes aegypti Mosquitoes. Curr Biol 2019, 29:2250–2257.e4.
27. Cheng KY, Colbath RA, Frye MA: Olfactory and Neuromodulatory Signals Reverse
Visual Object Avoidance to Approach in Drosophila. Curr Biol 2019, 29:2058–2065.e2.
28. Mongeau J-M, Cheng KY, Aptekar J, Frye MA: Visuomotor strategies for object
approach and aversion in Drosophila melanogaster. J Exp Biol 2019, 222.
29. Dickerson BH, de Souza AM, Huda A, Dickinson MH: Flies Regulate Wing Motion via

Active Control of a Dual-Function Gyroscope. Curr Biol 2019, 29:3517–3524.e3.


30. Nityananda V, Tarawneh G, Henriksen S, Umeton D, Simmons A, Read JCA: A Novel
Form of Stereo Vision in the Praying Mantis. Curr Biol 2018, 28:588–593.e4.
31. Kaushik PK, Renz M, Olsson SB: Characterizing long-range search behavior in Diptera using complex 3D virtual environments. Proc Natl Acad Sci U S A 2020, doi:10.1073/pnas.1912124117.
*The authors devise a novel VR paradigm with naturalistic visual, wind, and odor stimuli to study
multimodal integration and search in diverse emerging model insects.

32. Gray JR, Pawlowski V, Willis MA: A method for recording behavior and multineuronal
CNS activity from tethered insects flying in virtual space. J Neurosci Methods 2002,
120:211–223.

33. Stowers JR, Hofbauer M, Bastien R, Griessner J, Higgins P, Farooqui S, Fischer RM,
Nowikovsky K, Haubensak W, Couzin ID, et al.: Virtual reality for freely moving animals.
Nat Methods 2017, 14:995–1002.
**The authors built an unprecedented VR for freely flying insects. This enables a new paradigm
of VR manipulation without any mechanical artifacts of tethering.
34. Straw AD, Lee S, Dickinson MH: Visual control of altitude in flying Drosophila. Curr Biol
2010, 20:1550–1556.
35. Dickinson MH: Death Valley, Drosophila, and the Devonian toolkit. Annu Rev Entomol
2014, 59:51–72.
36. Mauss AS, Borst A: Motion vision in arthropods. In The Oxford handbook of invertebrate
neurobiology. oxfordhandbooks.com; 2017.
37. Mauss AS, Borst A: Optic flow-based course control in insects. Curr Opin Neurobiol
2020, 60:21–27.
38. Currier TA, Nagel KI: Multisensory control of navigation in the fruit fly. Curr Opin
Neurobiol 2019, 64:10–16.
39. Vinauger C, Van Breugel F, Locke LT, Tobin KKS, Dickinson MH, Fairhall AL, Akbari OS,
Riffell JA: Visual-Olfactory Integration in the Human Disease Vector Mosquito Aedes
aegypti. Curr Biol 2019, 29:2509–2516.e5.
40. Chapman JW, Reynolds DR, Mouritsen H, Hill JK, Riley JR, Sivell D, Smith AD, Woiwod IP: Wind selection and drift compensation optimize migratory pathways in a high-flying
moth. Curr Biol 2008, 18:514–518.
41. Rajaraman K, Nair A, Dey A, Balakrishnan R: Response Mode Choice in a Multimodally Duetting Paleotropical Pseudophylline Bushcricket. Frontiers in Ecology and Evolution
2018, 6:172.
42. Römer H, Schmidt AKD: Directional hearing in insects with internally coupled ears. Biol Cybern 2016, 110:247–254.
43. Balamurugan G, Valarmathi J, Naidu VPS: Survey on UAV navigation in GPS denied
environments. In 2016 International Conference on Signal Processing, Communication,
Power and Embedded System (SCOPES). 2016:198–204.
44. Keleş MF, Mongeau J-M, Frye MA: Object features and T4/T5 motion detectors
modulate the dynamics of bar tracking by Drosophila. J Exp Biol 2019, 222.

45. Celani A, Villermaux E, Vergassola M: Odor Landscapes in Turbulent Environments.


Phys Rev X 2014, 4:041015.
46. Currea JP, Smith JL, Theobald JC: Small fruit flies sacrifice temporal acuity to maintain
contrast sensitivity. Vision Res 2018, 149:1–8.

47. Palermo N, Theobald J: Fruit flies increase attention to their frontal visual field during
fast forward optic flow. Biol Lett 2019, 15:20180767.
48. Palavalli-Nettimi R, Theobald JC: Small eyes in dim light: Implications to spatio-
temporal visual abilities in Drosophila melanogaster. Vision Res 2020, 169:33–40.

49. Green J, Vijayan V, Mussells Pires P, Adachi A, Maimon G: A neural heading estimate is
compared with an internal goal to guide oriented navigation. Nat Neurosci 2019,
22:1460–1468.

50. Green J, Adachi A, Shah KK, Hirokawa JD, Magani PS, Maimon G: A neural circuit
architecture for angular integration in Drosophila. Nature 2017, 546:101–106.
51. Kim SS, Hermundstad AM, Romani S, Abbott LF, Jayaraman V: Generation of stable
heading representations in diverse visual scenes. Nature 2019, 576:126–131.
*This paper combines the power of Drosophila genetics with ethologically relevant input and provides mechanistic insights into how insects navigate in diverse scenes.
52. Strauss R, Schuster S, Götz KG: Processing of artificial visual feedback in the walking
fruit fly Drosophila melanogaster. J Exp Biol 1997, 200:1281–1296.
53. Lindemann JP, Kern R, Michaelis C, Meyer P, van Hateren JH, Egelhaaf M: FliMax, a
novel stimulus device for panoramic and highspeed presentation of behaviourally
generated optic flow. Vision Res 2003, 43:779–791.
54. Reiser MB, Dickinson MH: A modular display system for insect behavioral
neuroscience. J Neurosci Methods 2008, 167:127–139.
55. Kunze P: Untersuchung des Bewegungssehens fixiert fliegender Bienen. Z Vgl Physiol
1961.
56. Cabrera S, Theobald JC: Flying fruit flies correct for visual sideslip depending on
relative speed of forward optic flow. Front Behav Neurosci 2013, 7:76.
57. Haberkern H, Basnak MA, Ahanonu B, Schauder D, Cohen JD, Bolstad M, Bruns C,
Jayaraman V: Visually Guided Behavior and Optogenetically Induced Learning in
Head-Fixed Flies Exploring a Virtual Landscape. Curr Biol 2019, 29:1647–1659.e8.

58. Creamer MS, Mano O, Tanaka R, Clark DA: A flexible geometry for panoramic visual
and optogenetic stimulation during behavior and physiology. J Neurosci Methods
2019, 323:48–55.

59. Mathejczyk TF, Wernet MF: Modular assays for the quantitative study of visually
guided navigation in both flying and walking flies. J Neurosci Methods 2020,
340:108747.

60. Franke K, Maia Chagas A, Zhao Z, Zimmermann MJ, Bartel P, Qiu Y, Szatko KP, Baden T,
Euler T: An arbitrary-spectrum spatial visual stimulator for vision research. Elife 2019,
8.
*By enabling hyperspectral visual input tuned to the organism’s visual system, this method opens up uncharted territories in visual ecology in VR.

61. Warren TL, Weir PT, Dickinson MH: Flying Drosophila melanogaster maintain arbitrary
but stable headings relative to the angle of polarized light. J Exp Biol 2018, 221.
62. Duistermars BJ, Chow DM, Frye MA: Flies require bilateral sensory input to track odor
gradients in flight. Curr Biol 2009, 19:1301–1307.

63. Duistermars BJ, Frye M: A magnetic tether system to investigate visual and olfactory
mediated flight control in Drosophila. J Vis Exp 2008, doi:10.3791/1063.
64. Lawson KKK, Srinivasan MV: Flight control of fruit flies: dynamic response to optic
flow and headwind. J Exp Biol 2017, 220:2005–2016.

65. Fry SN, Rohrseitz N, Straw AD, Dickinson MH: TrackFly: virtual reality for a behavioral
system analysis in free-flying fruit flies. J Neurosci Methods 2008, 171:110–117.
66. Dreyer D, Frost B, Mouritsen H, Günther A, Green K, Whitehouse M, Johnsen S, Heinze S,
Warrant E: The Earth’s Magnetic Field and Visual Landmarks Steer Migratory Flight
Behavior in the Nocturnal Australian Bogong Moth. Curr Biol 2018, 28:2160–2166.e5.
67. Götz KG: Optomotorische Untersuchung des visuellen systems einiger
Augenmutanten der Fruchtfliege Drosophila. Kybernetik 1964, 2:77–92.
68. Reichardt W, Wenking H: Optical detection and fixation of objects by fixed flying flies.
Naturwissenschaften 1969, 56:424–425.
69. Reichardt W, Poggio T, Hausen K: Figure-ground discrimination by relative movement
in the visual system of the fly. Biol Cybern 1983, 46:1–30.
70. Rusch C, Roth E, Vinauger C, Riffell JA: Honeybees in a virtual reality environment
learn unique combinations of colour and shape. J Exp Biol 2017, 220:3478–3487.
71. Creamer MS, Mano O, Clark DA: Visual Control of Walking Speed in Drosophila.
Neuron 2018, 100:1460–1473.e6.
72. Ferris BD, Green J, Maimon G: Abolishment of Spontaneous Flight Turns in Visually
Responsive Drosophila. Curr Biol 2018, 28:170–180.e5.
73. Park EJ, Wasserman SM: Diversity of visuomotor reflexes in two Drosophila species.
Curr Biol 2018, 28:R865–R866.
74. Turner-Evans D, Wegener S, Rouault H, Franconville R, Wolff T, Seelig JD, Druckmann S,
Jayaraman V: Angular velocity integration in a fly heading circuit. Elife 2017, 6.
75. Kathman ND, Fox JL: Representation of Haltere Oscillations and Integration with Visual Inputs in the Fly Central Complex. J Neurosci 2019, 39:4100–4112.
76. Giraldo YM, Leitch KJ, Ros IG, Warren TL, Weir PT, Dickinson MH: Sun Navigation
Requires Compass Neurons in Drosophila. Curr Biol 2018, 28:2845–2852.e4.

77. Warren TL, Giraldo YM, Dickinson MH: Celestial navigation in Drosophila. J Exp Biol
2019, 222.
78. Fisher YE, Lu J, D’Alessandro I, Wilson RI: Sensorimotor experience remaps visual input to a heading-direction network. Nature 2019, 576:121–125.
79. Suver MP, Matheson AMM, Sarkar S, Damiata M, Schoppik D, Nagel KI: Encoding of
Wind Direction by Central Neurons in Drosophila. Neuron 2019, 102:828–842.e7.
80. Raiser G, Galizia CG, Szyszka P: A High-Bandwidth Dual-Channel Olfactory Stimulator
for Studying Temporal Sensitivity of Olfactory Processing. Chem Senses 2017,
42:141–151.

81. Olsson SB, Kuebler LS, Veit D, Steck K, Schmidt A, Knaden M, Hansson BS: A novel
multicomponent stimulus device for use in olfactory experiments. J Neurosci Methods
2011, 195:1–9.
82. Bender JA, Dickinson MH: Visual stimulation of saccades in magnetically tethered
Drosophila. J Exp Biol 2006, 209:3170–3182.


83. Buatois A, Pichot C, Schultheiss P, Sandoz J-C, Lazzari CR, Chittka L, Avarguès-Weber A,
Giurfa M: Associative visual learning by tethered bees in a controlled visual
environment. Sci Rep 2017, 7:12903.

84. Zwaka H, Bartels R, Lehfeldt S, Jusyte M, Hantke S, Menzel S, Gora J, Alberdi R, Menzel
R: Learning and Its Neural Correlates in a Virtual Environment for Honeybees. Front
Behav Neurosci 2018, 12:279.

85. Götz KG, Gambke C: [On the motion perception of the mealworm Tenebrio molitor].
Kybernetik 1968, 4:225–228.
86. Dahmen HJ: A simple apparatus to investigate the orientation of walking insects.
Experientia 1980, 36:685–687.
87. Götz KG: Course-control, metabolism and wing interference during ultralong tethered
flight in Drosophila melanogaster. J Exp Biol 1987.
88. Maimon G, Straw AD, Dickinson MH: Active flight increases the gain of visual motion
processing in Drosophila. Nat Neurosci 2010, 13:393–399.
89. Pringle JWS, Gray J: The gyroscopic mechanism of the halteres of Diptera. Philos
Trans R Soc Lond B Biol Sci 1948, 233:347–384.
90. Lihoreau M, Dubois T, Gomez-Moracho T, Kraus S, Monchanin C, Pasquaretta C: Chapter
One - Putting the ecology back into insect cognition research. In Advances in Insect
Physiology. Edited by Jurenka R. Academic Press; 2019:1–25.
*This review extensively addresses the need to study insect cognition from an ethological
context and how to embrace the complexity of natural systems.
91. Chan HK, Hersperger F, Marachlian E, Smith BH, Locatelli F, Szyszka P, Nowotny T:
Odorant mixtures elicit less variable and faster responses than pure odorants. PLoS
Comput Biol 2018, 14:e1006536.
92. Dyakova O, Nordström K: Image statistics and their processing in insect vision. Curr
Opin Insect Sci 2017, 24:7–14.

93. Harris LR, Jenkin MRM: Vision in 3D Environments. Cambridge University Press; 2011.
94. Chen J, Mandel HB, Fitzgerald JE, Clark DA: Asymmetric ON-OFF processing of visual motion cancels variability induced by the structure of natural scenes. Elife 2019, 8.
95. Toepfer F, Wolf R, Heisenberg M: Multi-stability with ambiguous visual stimuli in
Drosophila orientation behavior. PLoS Biol 2018, 16:e2003113.

96. Dyakova O, Müller MM, Egelhaaf M, Nordström K: Image statistics of the environment
surrounding freely behaving hoverflies. J Comp Physiol A Neuroethol Sens Neural
Behav Physiol 2019, 205:373–385.
97. Mafra-Neto A, Cardé RT: Influence of plume structure and pheromone concentration
on upwind flight of Cadra cautella males. Physiol Entomol 1995, 20:117–133.
98. Connor EG, McHugh MK, Crimaldi JP: Quantification of airborne odor plumes using
planar laser-induced fluorescence. Exp Fluids 2018, 59:137.


99. Lawson KKK, Srinivasan MV: A Robust Dual-Axis Virtual Reality Platform for Closed-
Loop Analysis of Insect Flight. In 2018 IEEE International Conference on Robotics and
Biomimetics (ROBIO). 2018:262–267.

100. Matthews BJ, Vosshall LB: How to turn an organism into a model organism in 10
“easy” steps. J Exp Biol 2020.
**This review details the steps needed to convert an emerging model system into a tractable
model system. Coupling this with knowledge of their natural histories will enable proximate and ultimate understanding of biological systems.


101. Agrawal S, Dean BK: Edge Detection Algorithm for Musca-Domestica Inspired
Vision System. IEEE Sens J 2019, 19:10591–10599.

102. Mansourian S, Enjin A, Jirle EV, Ramesh V, Rehermann G, Becher PG, Pool JE,
Stensmyr MC: Wild African Drosophila melanogaster Are Seasonal Specialists on
Marula Fruit. Curr Biol 2018, 28:3960–3968.e3.
