
Present Technology

Virtual reality (VR) uses technology to project a user into a simulated environment. VR works by fooling our senses into perceiving an emulation of a three-dimensional, usually computer-generated world (Charara, 2017). Being a versatile platform, VR can be used in many fields such as architecture and gaming. In its current state, virtual reality can fool our senses of sight, hearing, and movement, but a lack of realism causes many people to experience motion sickness.

VR HMDs (head-mounted displays), more commonly known as VR headsets, fool the sense of sight and, depending on the headset, hearing (Xin Reality, 2016). An HMD is typically equipped with a liquid crystal display (LCD) or organic light-emitting diode (OLED) display and a magnifying lens for each eye (Geng, 2013). A stereoscopic display, which presents a slightly different image to each eye, enables users to see images in 3D: the brain patches the two images together to create the illusion of depth. VR software also renders depth cues to reinforce this illusion (JRank Articles, 2017). HMDs additionally track head movement with six degrees of freedom (6DoF), replicating head turns so the user can look around the environment (Mullis, 2016). Current HMDs still have several shortcomings. A human has about a 114-degree field of view (FOV) per eye; with both eyes, this gives roughly a 200- to 220-degree view (VR Lens Lab, 2016). Current VR headsets (HTC Vive and Oculus Rift) offer only a 110-degree view with both eyes, resulting in lower immersiveness.
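As a rough illustration of how stereoscopic depth works, the sketch below computes the per-eye viewpoints and the resulting image disparity using the standard pinhole model. The 64 mm interpupillary distance and 40 mm focal length are illustrative assumptions, not the specs of any particular headset.

```python
# Sketch: how a stereoscopic HMD derives a separate view for each eye.
# The numbers (64 mm IPD, 40 mm focal length) are illustrative assumptions.

IPD_MM = 64.0  # assumed average interpupillary distance, millimeters


def eye_positions(head_x_mm):
    """Return (left_eye_x, right_eye_x), offset from the head center."""
    half = IPD_MM / 2.0
    return head_x_mm - half, head_x_mm + half


def disparity_mm(object_distance_mm, focal_mm=40.0):
    """Approximate horizontal disparity of a point (pinhole model):
    disparity = focal * IPD / distance. Nearer objects produce larger
    disparity, which the brain interprets as depth."""
    return focal_mm * IPD_MM / object_distance_mm


left, right = eye_positions(0.0)
near = disparity_mm(500.0)   # object 0.5 m away: large disparity
far = disparity_mm(5000.0)   # object 5 m away: small disparity
```

Because disparity shrinks with distance, distant objects look nearly identical to both eyes, which is why depth perception from stereo alone fades past a few meters.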

Movement in virtual reality is key to the illusion (Gilles, 2016). VR tracks two things: head movement and position. Head tracking, present in nearly all VR systems, is one of the most basic yet essential features for providing an immersive experience (Gilles, 2016). The head's position, orientation, and movement can be tracked by micro-electromechanical systems (MEMS) inertial measurement units (IMUs). IMUs include a gyroscope to measure angular velocity and an accelerometer to measure linear acceleration (LaValle, n.d.). Additional sensors (magnetometer, temperature) are included for accuracy (Benson, 2012). These sensors track the head's rotation about the X, Y, and Z axes as part of 6DoF. The rotations tracked are forward and backward (pitch), side to side (yaw), and shoulder to shoulder (roll) (Mullis, 2016).
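The gyroscope-plus-accelerometer fusion described above is often done with a complementary filter. The sketch below is a simplified single-axis (pitch) version; the 0.98 blend factor is an assumed value for illustration, not taken from any specific headset.

```python
def complementary_filter(pitch_deg, gyro_dps, accel_pitch_deg, dt, alpha=0.98):
    """Fuse one IMU sample: integrate the gyro's angular velocity for
    short-term accuracy, then blend in the accelerometer's gravity-based
    pitch estimate to cancel the gyro's long-term drift."""
    gyro_estimate = pitch_deg + gyro_dps * dt   # integrate angular velocity
    return alpha * gyro_estimate + (1 - alpha) * accel_pitch_deg


# One update step: head at 10 degrees, gyro still, accelerometer agrees.
pitch = complementary_filter(10.0, 0.0, 10.0, dt=0.01)
```

Run at a few hundred updates per second, this keeps the head orientation responsive (from the gyro) while staying anchored to gravity (from the accelerometer).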

Though not all HMDs can track positioning, there are several methods to track walking. The big three systems (Oculus Rift, Vive, and PS VR) use outside-in tracking. This method requires stationary sensors and cameras set up around the room in designated areas. The tracked object carries infrared markers, and the system calculates the object's position relative to the sensors using those markers and cameras. This method currently offers lower latency (delay), more efficiently translating movement in the physical world into the virtual one, which reduces the probability of motion sickness and nausea. One problem with the outside-in approach is occlusion, which happens when the tracked object is blocked or hidden: tracking errors occur when the sensors' and cameras' field of view loses the person's position, such as when the user walks behind furniture or is blocked by a passer-by. Another limitation is that when you walk out of the room, the system loses your position and produces tracking errors; this problem is more prominent in small spaces.

Inside-out tracking is the opposite of outside-in tracking: a camera is placed on the object being tracked (the HMD) and uses the surrounding environment, without any markers, to determine position, so mobility is greatly increased. The limitations of inside-out tracking are increased latency and the need for a computer with good vision processing; if the vision is poor, not only will latency suffer, but tracking errors will increase.

The final method to track position is omnidirectional movement platforms, which allow users to walk in VR. Virtuix Omni is the first commercial omnidirectional platform for VR. Users wear special shoes with sensors that track steps on the platform, and the concave platform allows consecutive steps. The platform reduces the need for cameras and a spacious room; latency is significantly lower, and the platform allows quick rotation, running, jumping, and crouching, while a harness prevents the user from falling. The price of the platform is $699, with an additional $199 for tracking pods and $99 for shoes, which only fit one person, making cost an issue. This doesn't even include a VR headset and a VR-ready computer, which can cost upwards of $1,500. The experience of walking on the platform isn't realistic and requires the user to drag their feet awkwardly. Infinadeck is the first commercial omnidirectional treadmill. The treadmill uses two belts to propel the user in any direction for any distance. A frame-mounted harness tracks the user's lean, prompting the treadmill accordingly so the user stays centered: the more you lean, the faster the treadmill moves, up to a maximum of 6 mph.
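The lean-based speed control can be pictured with a small sketch. The 6 mph cap comes from the treadmill described above; the 20-degree full-speed lean angle is an assumption for illustration.

```python
MAX_SPEED_MPH = 6.0   # speed cap of the treadmill described above
MAX_LEAN_DEG = 20.0   # assumed lean angle that maps to full speed


def belt_speed(lean_deg):
    """Map the user's lean angle to belt speed: the farther the user
    leans (in either direction), the faster the belt moves, clamped
    to the 6 mph maximum so the user stays centered safely."""
    fraction = min(abs(lean_deg) / MAX_LEAN_DEG, 1.0)
    return MAX_SPEED_MPH * fraction
```

A real controller would also smooth the speed changes over time to avoid jerking the user, but the clamp-and-scale idea is the core of it.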

History

Virtual reality has been in the works for more than two centuries. The first mainstream example of VR was the panoramic painting, invented by Robert Barker. Housed in a circular building with a closed-roof design, viewers walked from two entrances into viewing chambers where they saw the paintings in a room with uncontrolled light from various angles and distances. This created the illusion of being present in the moment. In 1838, Charles Wheatstone discovered stereoscopic vision. Wheatstone went on to create the stereoscope, the ancestor of devices like the View-Master glasses, which were (and still are) used for virtual tourism. A more recent development in this line is the Google Cardboard.

But true immersion isn't only about fooling sight; it's about fooling all of the senses. Morton Heilig was able to stimulate most of the senses with his Sensorama, which used speakers, a stereoscopic display, multiple fans, smell generators, and a moving chair to create a surreal experience. He later pioneered the concept of the HMD. Other stereoscopic displays were held in the hand, which made them uncomfortable for prolonged use. Heilig's HMD had no position tracking, but it had stereoscopic 3D wide vision and a built-in stereo speaker. The Headsight was the first HMD with motion tracking; it used magnetic motion tracking, a closed-circuit camera, and two individual screens.

In 1965, Ivan Sutherland defined the concept of virtual reality as a simulation in which one couldn't tell the difference between the real world and the fake one. This virtual world was viewed through an HMD and enhanced with 3D sound and tactile feedback. Sutherland's concept soon became the ultimate goal of the VR world. In 1968, Sutherland and Bob Sproull made the first VR HMD connected to a computer. It was a bulky device that hung from the ceiling, and the user had to be strapped into it. It allowed motion tracking, but movement was restricted, and the graphics were terrible even by Sutherland's and Sproull's standards. The concept of virtual reality took a small detour in 1969, when Myron Krueger, a virtual reality artist, developed experiences called artificial reality, in which multiple people could interact with their surroundings and each other within the simulation. The phrase "virtual reality" was born in 1987, when Jaron Lanier, the founder of the Visual Programming Lab (VPL), coined the term. VPL became the first company to sell virtual reality goggles and made major developments in haptics, technology that allows the user to perceive the sense of touch in a simulation; Lanier created a crude sense of touch using vibrations and sound. In 1991, VR arcade machines were released to the public, taking VR to the masses: users wore VR goggles and played on gaming machines. SEGA and Nintendo both created VR consoles, each with a myriad of problems that led to their eventual discontinuation.

One of the biggest cultural impacts came from The Matrix. Directed by the Wachowskis, this movie brought attention to the industry like none other; it won multiple Academy Awards and remains one of the best sci-fi movies to date. The characters in The Matrix live in a simulated world inside a computer and are oblivious to this fact. The Matrix introduced simulated reality, which is quite similar to virtual reality except for one key difference: a person experiencing VR knows they are in a simulation, whereas with simulated reality it's impossible to tell the difference between actual reality and the simulation.

In the 21st century, VR has grown rapidly due to the rise of mobile technology. The video game industry advanced this growth by creating depth-sensing camera suits, motion controllers, and natural human interfaces. Prime examples of present VR are the Google Cardboard, Oculus Rift, HTC Vive, Microsoft HoloLens (mixed reality), and the Samsung Gear VR. The virtual reality industry has changed rapidly in the last few decades: only 30 years ago the first bulky headsets were invented, but now full immersion is becoming a reality.

Future Technology

Although unusual, swarm microbots could represent a paradigm shift in virtual reality. Swarm microbots are well suited to true VR immersion because they are capable of working together to form structures and objects. Each microbot has two major parts: a spherical body and two hexagonal arms that allow it to move around using electromagnets. The body has two main components: a MEMS central processing unit (CPU) and a shell of hexagonal electromagnets that tessellate into a rough sphere. The MEMS CPU has specific functions that allow the microbots to cooperate with each other in a swarm. The main computer, located outside the simulation room, collects data from and shares data with the CPU of each microbot, such as the location of itself and the others. The RFID (radio-frequency identification) receiver first receives data from the main computer, which is then processed by the CPU. This data determines key information for the microbots such as positioning, angles, and coordinates, while the IMUs transmit each microbot's position back to the main computer. The CPU then signals the electromagnets (a soft metal core made into a magnet by the passage of electric current through a coil surrounding it), initiating a sequence of patterns from the set of algorithms sent. The electromagnetic patterns generate moving fields that control the microbot's arms through the magnets. The arms are shaped as elongated hexagonal prisms made of a ferromagnetic metal; this metal can attract and repel depending on the magnetic signal, allowing the arms to be manipulated by the electromagnets. The outer magnetic shell is composed of electromagnets that use the electromagnetic signals to control the arms of the microbot. The arms also have electromagnets on the outer side of the hexagonal prism, which allow them to attach to the arms of other microbots. This lets the microbots create crystalline-like structures (structures of ions, molecules, or atoms attached together and ordered in 3D arrangements) by attaching to and building on top of each other. These structures are stable and can be dismantled quickly by disabling the microbots. This requires a powerful main computer that can send all these signals to the microbots; with it, the microbots will be able to collaborate as a swarm to form object structures. Combining the microbot structures with the haptic suit and gloves, we'll be able to create objects with the feel from the gloves and resistance from the microbots, creating a more realistic virtual world.
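A very rough sketch of the kind of self-assembly coordination the main computer would run: each microbot is assigned a target cell in the desired structure and steps toward it on a grid. The function names and grid motion are our own illustrative assumptions; real coordination would also need collision avoidance, electromagnetic control, and latency handling, all omitted here.

```python
def step_toward(pos, target, step=1):
    """Move one grid unit along each axis toward the target cell."""
    def axis(p, t):
        if p < t:
            return p + step
        if p > t:
            return p - step
        return p
    return tuple(axis(p, t) for p, t in zip(pos, target))


def assemble(bots, targets, max_steps=100):
    """Greedy swarm assembly: every microbot steps toward its assigned
    target cell each tick until all bots have arrived (or we give up)."""
    bots = list(bots)
    for _ in range(max_steps):
        if all(b == t for b, t in zip(bots, targets)):
            break
        bots = [step_toward(b, t) for b, t in zip(bots, targets)]
    return bots
```

Disassembly is the same loop run toward "parked" positions, which is why structures can be dismantled as quickly as they are built.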

Breakthrough

Though we believe that the production of these swarm bots is feasible in the near future, some technological breakthroughs must occur first. The microbots require many simple and affordable MEMS CPUs (central processing units), a haptic suit with IMUs, an algorithm for programmable self-assembly swarm intelligence, and position recognition for the haptic suit. Each of these technologies has open problems that must be solved before it can be built.

A MEMS CPU is an essential component for each individual microbot. The MEMS CPU enables the microbot to receive data, process instructions, recognize contact with the user, and work with other microbots in a swarm. Years of design and development are needed for a MEMS CPU to keep pace with the incoming technology. The CPU must be specially designed to meet the criteria of a microbot, as described earlier. Making the CPU too intricate would drive up the cost of each individual microbot, undermining the goal of making the simulation cheap and accessible to many.

As of now, haptic suits (suits that replicate sensations by sending electrical signals to the skin) are still in development (https://teslasuit.io/). Considering that Teslasuit is actively developing them, haptic suits should be commercially available within a few years. After the suit is completely developed, IMUs must be integrated within it, allowing the entire body to be tracked. This enables body tracking without a harness, accessories, inside-out tracking, or outside-in tracking, and would allow simulations that require more precise movements, such as surgery. Since the suit only needs IMUs and not markers, this approach also benefits the user by preventing occlusion.

In order for the swarm bots to function, an algorithm must be developed for the intelligence and self-assembly of the microbots. Since swarm robots with omnidirectional movement have not yet been created, no algorithm for their self-assembly and collective intelligence exists yet either.

Position recognition developed specifically for haptic suits is essential for the entire VR setup to function properly. Though a general position (height and location) can be determined by most VR headsets, no commercially available headset offers full position recognition, which models the user's body in the game and translates most physical movement into in-game movement. Such a system would let users move around without a marker, special accessories, or a harness to indicate where they are walking. Developing the algorithm would take many years of testing and adjusting. To accomplish this, researchers could start with a base program that translates the user's haptic suit data into a 3D figure in software. The user would then perform basic tasks and movements that the software must recognize, and the code would be refined as more test data is collected. Accuracy in the software's recognition and translation is key to its success.
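The base program described above would need to turn suit IMU readings into a body model. A minimal sketch of that step is forward kinematics: given joint angles, compute where each limb segment ends up. The link lengths, angles, and function names below are illustrative assumptions, not real suit output, and the model is reduced to a single 2D arm for clarity.

```python
import math

# Toy forward kinematics for one arm: given joint angles (as a suit's
# IMUs might report them), compute the elbow and hand positions in a
# plane. Link lengths are illustrative assumptions.

UPPER_ARM = 0.30  # meters, shoulder to elbow (assumed)
FOREARM = 0.25    # meters, elbow to hand (assumed)


def arm_positions(shoulder_deg, elbow_deg):
    """Return ((elbow_x, elbow_y), (hand_x, hand_y)) relative to the
    shoulder; elbow_deg is measured relative to the upper arm."""
    s = math.radians(shoulder_deg)
    e = math.radians(shoulder_deg + elbow_deg)
    elbow = (UPPER_ARM * math.cos(s), UPPER_ARM * math.sin(s))
    hand = (elbow[0] + FOREARM * math.cos(e), elbow[1] + FOREARM * math.sin(e))
    return elbow, hand
```

A full suit would chain the same computation across every joint of a skeleton, which is what lets the software pose a 3D figure from angle data alone, without external markers.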

Design Process

In order to create the most effective nanobot, we drew inspiration from the movie Big Hero 6. Our original prototype, dubbed Metal Man, was planned as a haptic suit. One of the many problems with current haptic suits is that they cannot create a feeling of resistance: in a game, if the user punches a wall, all current haptic suits will only slow the user's hand down and produce vibration-like feedback. So we would add an exoskeleton containing small segments of metal. The metal would rapidly heat up and cool down, which could inhibit or allow certain limb movements. The biggest flaw was that the metal wouldn't be able to heat up or cool down fast enough; the user would feel motions from 2-7 seconds prior, making surgery simulations impractical. The suit would also weigh anywhere from 20-60 lbs, which would make it hard for the average person to use. The cost was high, and so was the risk of the heated metal leaking. After scrapping Metal Man, the Disco Bot idea took its place.

The Disco Bot consisted of a projector and nanobots with magnets. The hemispherical projector stood at the top of the room, commanding where the nanobots should go. The nanobots came in three sizes (small, medium, and big), used for precision and efficiency in each structure being built. After the projector directed these cube-shaped nanobots, they would build onto each other, creating the skeleton of the object shown in the VR simulation. Since the user is on an omnidirectional treadmill, the nanobots would move toward the user to simulate the person walking to an object or through terrain. This idea was scrapped because of its risks and design flaws. First, there would be far too much magnetic radiation, which would be unhealthy for the user. Second, the signals the projector sent would scatter all over the room. The visor is also made of magnetic material, which would cause it to fly off if a nanobot got close. Lastly, the magnetic fields of the nanobots would interfere with one another.

Our final scrapped idea was Grappling Hooks, a concept of nanobots working together in a mesh. Each nanobot would have four grappling hooks in order to attach to other nanobots, creating a mesh that could form objects. The grappling hooks would contain a special material that stiffens when electricity runs through it, which would allow the mesh to form objects. Though this idea seemed promising at first, we discarded it because the structures would not be able to collapse quickly enough to match the simulation, and it was confusing to determine what would happen to the parts after they collapsed. The idea was too complicated, so we discarded it, but we took inspiration from it for our final idea, which was much simpler and fixed all the problems this one had. Our final idea, which we called MicroVR, can form and collapse structures quickly by sending computer signals while still using magnets, through electromagnetic signals. The design also does not have to attach to walls for support, and it can attach to a modified, magnetized version of the haptic suit.

MicroVR includes individual smart nanobots, each with a CPU, that work together and rely less on the computer outside the simulation room. These nanobots use electromagnets instead of wires, making the design better than our grappling-hook idea, which would fail to stiffen if the electricity stopped flowing. The nanobots are entirely covered in metal, making them solid and sturdy, much better than flimsy grappling hooks. This idea is also better than our Disco Bot idea because it does not pose hazards to the user: the Disco Bot's projector would emit magnetic waves that could harm the user, potentially causing injury, illness, or even death from the magnetic radiation, whereas our final idea uses radio frequencies, which cause little to no harm. Our first idea, Metal Man, was far worse than our final idea. Metal Man's suit had metal bands attached to it that would cool down and heat up, controlling resistance relative to what's happening in the VR simulation. This design had many flaws: it weighed the user down, and some areas of the body had no resistance at all. We improved on this by making nanobots completely separate from the suit. Although the suit has some electromagnets, overall it is much lighter, and users can now feel object textures, making simulations more realistic. The machinery also requires far less power, since it does not have to constantly heat and cool the metal.

Consequences
Some of the pros of our project include more realistic engineering simulations, real-life object resemblance, and career training for tasks such as landing a plane, performing surgeries, and military aim training. Enhancements like these give VR simulations a more realistic feel. Surgeons can have a safer and more accurate environment while practicing certain high-risk surgeries, and the same goes for soldiers undergoing military training and for pilots preparing for a flight. Due to the realistic feel of the simulation, CAD (computer-aided design) programs could be improved by letting users control them from a VR point of view (POV). The nanobots can also be used outside the VR rig: by scaling up their numbers, the microbots may be able to help with disaster relief and create temporary shelters. However, some cons include power usage. The number of microbots at work amounts to a great deal of energy being consumed, which may overwork energy sources; this is why a dependable energy source is needed. The project may also increase the amount of screen time children have, which may distract them from school.

Bibliography (In order of appearance)

Present Technology

1. Charara, Sophie. Explained: How Does VR Actually Work? Wareable, 22 May 2017,

www.wareable.com/vr/how-does-vr-work-explained.

2. HTC Vive. HTC Vive - Virtual Reality and Augmented Reality Wiki - VR & AR Wiki,

21 Aug. 2017, xinreality.com/wiki/HTC_Vive.


3. Geng, Jason. Three-Dimensional Display Technologies. Advances in Optics and

Photonics, U.S. National Library of Medicine, 2013,

www.ncbi.nlm.nih.gov/pmc/articles/PMC4269274/#R35.

4. Depth Perception. Cues, Objects, Eye, and Eyes - JRank Articles,

psychology.jrank.org/pages/176/Depth-Perception.html.

5. Mullis, Alex. How Does Virtual Reality Work? Android Authority, 15 July 2016,

www.androidauthority.com/virtual-reality-work-702049/.

6. Jay. Field of View for Virtual Reality Headsets Explained. VR Lens Lab, 29 Mar. 2016,

vr-lens-lab.com/field-of-view-for-virtual-reality-headsets/.

7. Chapman, Ben. 7 Best VR Headsets. The Independent, Independent Digital News and

Media, 15 June 2017, www.independent.co.uk/extras/indybest/gadgets-tech/video-

games-consoles/best-vr-headsets-for-iphone-7-phone-note-4-s7-edge-apps-ps4-uk-

reviews-a7791841.html.

8. Xin Reality, Virtual Reality and Augmented Reality Wiki. Virtual Reality and

Augmented Reality Wiki - VR & AR Wiki, 30 Aug. 2017, xinreality.com/wiki/Main_Page.

9. Morrison, Melvin M. Patent US4711125 - Inertial Measurement Unit. Google Patents,

Google, 8 Dec. 1987, www.google.com/patents/US4711125.

10. Outside-in Tracking. Outside-in Tracking - Virtual Reality and Augmented Reality Wiki

- VR & AR Wiki, 24 July 2017, xinreality.com/wiki/Outside-in_tracking.


11. Langley, Hugh. Inside-out v Outside-in: How VR Tracking Works, and How It's Going

to Change. Wareable, VR Feature, 3 May 2017, www.wareable.com/vr/inside-out-vs-

outside-in-vr-tracking-343.

12. Jan. Final Omni Design at CES 2015. Virtuix Omni, www.virtuix.com/final-omni-

design-at-ces-2015/.

13. Shanklin, Will. Virtuix Omni: VR Treadmills Not Yet Living up to the Promise (Hands-

on).New Atlas - New Technology & Science News, New Atlas, 21 Jan. 2016,

newatlas.com/virtuix-omni-vr-treadmill-review-hands-on/41438/.

14. James, Paul. Hands on With the Latest Infinadeck Treadmill at CES 2016 Road to

VR.Road to VR, 7 Jan. 2016, www.roadtovr.com/hands-on-with-the-latest-infinadeck-

treadmill-at-ces-2016/.

History

1. History Of Virtual Reality. Virtual Reality Society, VRS, 2017,

www.vrs.org.uk/virtual-reality/history.html.

2. Kane, Kathryn. Robert Barkers Leicester Square Panorama: The Rotunda. The

Regency Redingote, 9 Sept. 2012, regencyredingote.wordpress.com/2012/08/03/robert-

barkers-leicester-square-panorama-the-rotunda.

Future Technology

1. Wireless Charging & How Inductive Chargers Work PowerbyProxi. PowerbyProxi,

2016, powerbyproxi.com/wireless-charging/.
2. What Is RFID? EPC-RFID, www.epc-rfid.info/rfid.

3. The Difference between Ferrous & Non-Ferrous Metals. Difference between Ferrous

and Nonferrous Metals | ASM Metal Recycling, ASM Metal Recycling , www.asm-

recycling.co.uk/ferrous-and-non-ferrous-metals.html.

4. Electromagnet | Definition of Electromagnet in English by Oxford Dictionaries. Oxford

Dictionaries | English, Oxford Dictionaries,

en.oxforddictionaries.com/definition/electromagnet.

5. Davis, Marauo. What Is Crystalline Structure? Study.com, Study.com,

study.com/academy/lesson/crystalline-structure-definition-structure-bonding.html.
