
CHAPTER 1

INTRODUCTION

Leap Motion:

It is a breakthrough in computer interaction, using a patented mathematical approach to 3D, touch-free motion sensing and motion-control software that is unlike anything that currently exists on the market. Through this new technology we can say goodbye to the mouse and keyboard: the Leap represents an entirely new way to interact with our computers.
It is more accurate than a mouse, as reliable as a keyboard and more sensitive than a touch screen. We can control a computer in three dimensions with our natural hand and finger movements. Leap Motion has, well, leapfrogged the state of the art in this young field, giving users the ability to control what's on their computers with hundredth-of-a-millimeter accuracy.
The main goal of the research presented in this paper was to analyze the precision and reliability
of the Leap Motion Controller in static and dynamic conditions and to determine its suitability as
an economically attractive finger/hand and object tracking sensor. The evaluation was performed
with the aid of a high-speed and highly accurate optical motion capture system.
To the best of our knowledge, no study has yet been conducted with the Leap Motion Controller in
combination with an optical motion capture system. The main contributions of this analysis are
the following:

• Precision and reliability (spatial dispersion of measurements through time) of the controller
• The spatial distortion of accuracy (variation of accuracy in different regions of the sensory space)
• Sampling frequency and its consistency

The user interface and the corresponding interaction modalities play an essential role in the
human-computer relationship. Advanced multimodal interfaces present yet another step in this
equation, providing users with the freedom and flexibility to choose the best input modality for
specific tasks. Users generally prefer multimodal interaction when it is available and intuitive to use.
Gesture-based user interfaces, in combination with the latest technical advances that incorporate
accurate yet affordable new types of input devices, provide realistic new opportunities for
specific application areas (e.g., entertainment, learning, health, engineering), especially for users
who are uncomfortable with more commonly used input devices and/or technology.

Human-Computer Interaction (HCI) is an important field of study that has gained increasing
attention with the emergence of new interaction devices (such as the Nintendo Wii Remote and
the Microsoft Kinect). The use of gestural interfaces is attracting growing interest, since interaction is
essential in several domains of application such as art, medical assistance, simulation, etc.
In this context, the computer music field has been witnessing a growing number of new Digital
Musical Instruments (DMIs) that benefit from new interface technologies.
Several of these DMIs can be found on internet sites such as YouTube and Vimeo. DMIs
such as the Kin Hackt developed by Adriano Clemente, and the work of Chris Vik in
collaboration with the dancer Paul Walker, integrate the Kinect sensor's capabilities with the
Ableton Live software to enable musical composition. Projects like the V Motion Project
built a DMI that uses two Kinect sensors to enhance precision and reduce latency while also
giving visual feedback for artistic purposes.
New instruments using other videogame motion controllers such as the Wii Remote (Silva, 2012;
Miller and Hammond, 2010), projects using multi-touch tablets and multi-touch tables, and
purpose-built innovative DMIs such as Michel Waisvisz's "Hands", developed in the 1980s, have
also been explored.
Moreover, several researchers have addressed the evaluation of new technologies and DMIs.
Studies have evaluated musical properties such as tempo, latency and precision with instruments
built with the Wii Remote controller alone, with the Wii sensor bar, and with several multi-touch
platforms available at the time. Others have approached piano movement analysis with inertial
sensors attached to the user.
Since the area is relatively new, there is a lack of consolidated methods for evaluating DMIs.
Researchers have therefore used different approaches to this problem, ranging from
methodologies based on HCI concepts and theories to more direct analyses focusing on the
comparison of acoustic and digital instruments.

In 2012, the Leap Motion controller was introduced. This gesture controller provides an
approximately 150° field of view and uses a depth sensor to track hand features to within
1/100th of a millimeter. This fine-grained control may represent an opportunity to create a new
generation of DMIs. However, to confirm the Leap Motion's potential, an evaluation should be
performed concerning latency and precision, two of the common bottlenecks in the use of
gestural interfaces in DMIs.
In this paper, we perform a preliminary study and evaluation of the Leap Motion sensor as a tool
for building DMIs. We start by listing the conventional musical gestures that can be recognized
by the device. Then, we analyze the precision and latency of these gestures. As part of the
evaluation method, we also propose a first case study integrating the Leap Motion with a virtual
music keyboard, which we called the "Crystal Piano".

The Leap Motion project has a high-level aim of producing an application that can recognise
Auslan signs. This functionality could then be incorporated into a system to help young Deaf and
hard of hearing children to learn Auslan signs. The system would be able to demonstrate specific
signs using video and images, and provide feedback to the child on their own Auslan sign
accuracy through the Leap Motion controller. This project is aimed specifically at Australian
Sign Language (Auslan), but the principles will be relevant to any sign-based communication
system.
This section reports the findings of the first phase of the Leap Motion project, which focuses on
evaluating the Leap Motion controller for its ability to recognise Auslan signs made in the
field of view of the controller. The second phase of the project will look at the ability to record
Auslan signs and to train the system to recognise them for later recognition and identification.
The Leap Motion controller was released in July 2013, and it presents the opportunity for a new
way of interacting with technology that is yet to be evaluated. This paper provides an early
evaluation of the technology specific to its recognition of hand and finger movements as they are
required for Auslan. We will first explore the use of gesture recognition technologies with sign
language, before describing the Leap Motion controller in more detail. We will describe the
approach taken in this evaluation and then present the strengths and weaknesses of the controller
that are key to its suitability for sign language recognition.

CHAPTER 2

LITERATURE SURVEY

This survey focuses mainly on two areas of computing: the use of robotics and the use
of innovative control systems. Both of these areas are incredibly popular in both the hobbyist
and professional communities, ranging from home automation to military applications, and as
such there is a vast amount of material for anyone who is interested to take a look at.

Title: Myoelectric Teleoperation of a Complex Robotic Hand

Authors: Farry et al.
Year: 1996
Description:
This paper discusses myoelectric signal processing approaches to robotics control, i.e.
control strategies that directly make use of the electrical signals passed through the human body.
While used previously in prosthetic limbs, the paper sought to find whether this could be used for
more than one degree of freedom at a time (read: one type of motion). The authors found a high
level of accuracy for multiple degrees of freedom, proving the working concept and making the
project a success. Not only did this bode well for the development of prosthetic limbs (which
seek to replace the functionality of lost body parts as far as possible), but for intuitive robotics
control in general. (Farry, 1996)

As a part of the International Symposium on Robotics and Intelligent Sensors 2012 (IRIS 2012)
a paper titled Internet Controlled Robotic Arm challenged the perception that robotics was an
industry-only technology, showing great success with the teleoperation of an Arduino Uno
powered robotic arm. Over an internet connection the results showed an accuracy of 97-99%
compared to direct communication with the device, proving the viability of teleoperated robotics
in non-industrial settings and helping to promote amateur robotics development too. (Kadir, 2012)

Title: Lorentz Levitation Technology: A New Approach To Fine Motion Robotics,
Teleoperation, Haptic Interfaces, and Vibration Isolation,
Authors: R. Hollis and S. Salcudean
Year: 1993
Description:
This takes an approach to robotics control using Lorentz force, i.e. the force
experienced by a conductor within a magnetic field. By using this they have found, among other
things, benefits for hardware assembly through fine motion control and “high fidelity
force/torque feedback for teleoperation”. This kind of innovative use of alternative control
methods is highly commendable and to an extent helps to push the methods used into the
mainstream through example and “unlock” an area of research that could have huge benefits
down the line. (Hollis, 1993)

Title: A New Approach to Visual Servoing in Robotics


Authors: Bernard Espiau, François Chaumette and Patrick Rives
Year: 1992
Description:
This approaches the control of robotics through a vision-based system, i.e. the use of a sensor to
extract information from the environment for a specific task, and a looping control algorithm to
adjust the hardware properly based on that information. This approach is referred to as visual
servoing in their paper, and is very similar to my own project (the sensor being the Leap Motion,
which processes and provides the information related to the user's hand, and the Arduino
providing the looping control algorithm adjusting the servos) though explored more in-depth in
the paper as a general approach rather than how it could be used for a specific task. This
approach can be used in a wide range of applications, though it is limited by the pattern
recognition algorithms used by the sensor – again, this is seen in my own project by the
limitations of the Leap Motion controller in detecting rotated hand details, more details on which
can be found in the Problems section. (Espiau, 1992)

Title: Evolutionary Robotics
Authors: Stefano Nolfi and Dario Floreano
Year: 2000
Description:
This takes a look at robotics through the perspective of Darwinian evolution theory
(i.e. selective reproduction or “natural selection”). It suggested that the creation of autonomous
robots can be automated through the use of concepts like neural networks and genetic
algorithms, with implications in several areas including artificial intelligence. This kind of
application shows the depth of research in this area of the industry, creating potential benefits
for the whole field – in this case through exponential improvements in work speed. (Nolfi, 2000)

Robotics has in large part become popular in the DIY community thanks to the surge in popularity
of affordable, programmable microcontrollers such as the Arduino and the Rhino. My project
makes use of the former to control the servos responsible for manipulating MARC, but this is by
no means the only solution on the market – the ability to perform general computing tasks rather
than specializing in one particular area, combined with the low demands of the MARC hardware,
meant I could have picked from a wide range of options. The choice of the Arduino Uno was in
part based on monetary cost and availability as opposed to raw statistical superiority, and this
study should by no means be seen as a promotion of that particular product, but of the methods.

The Arduino Uno used throughout the project has an operating voltage of 5V (powered through
either a USB connection or an external power supply), 14 digital I/O pins and a clock speed of
16 MHz among other things. While not as powerful as a modern desktop computer, it's more
than enough to handle smaller tasks and any hobbyist projects with low budgets. As such, it's
extremely popular in the DIY community and has been used in a large number of systems
since. (Arduino, 2015)

Plantduino was one such project, using the Arduino and several sensors to create an “automated
watering and temperature system” for a small backyard greenhouse. Combining the biology
interests of the author and tutorials they found on certain sites, the project is a testament to the
accessibility of the Arduino software for people who aren't specifically from an engineering
background as well as the versatility of the hardware itself. (Revolt Lab, 2011)

The growing concept of the Internet of Things (IoT) has been explored over recent years, noting
the rising prevalence of internet-connected objects and the role of microcontrollers (including the
Arduino) in the industry's future. Their accessibility, popularity and capability for general-purpose
computing have proved them ideal platforms for networked systems, and they will no doubt
continue to be adopted by professional engineers and hobbyists alike. (Doukas, 2012)

Title: Arduino for Robotics


Authors: John-David Warren and Harald Molle
Year: 2011
Description:
In this book the authors explore the use of the Arduino in small-scale robotics systems, an
area that's been thoroughly explored by the hobbyist community over the years and is the subject
of my own project. This book in particular provides an easy guide for robotics projects, a
testament to the accessibility of the platform as previously mentioned. (Warren, 2011)

Title: Integrating Arduino-based Educational Mobile Robots in ROS


Authors: André Araújo
Year: 2013
Description:
This work takes things a step further and explores the viability of the Arduino as a platform
for educational robotics. It was ultimately a success, proving further that the Arduino is perfectly
capable of supporting robotics-based projects that would in the past have been much more costly
for amateur developers. (Araújo, 2013)

Another popular choice of microcontroller is the Rhino, which has the added bonus of .NET
support, allowing it to run a range of Windows-based applications too. One such project is the
CRANE robot motion control suite, which uses the Rhino and the Grasshopper parametric
modelling plugin to control a robotic arm using the Leap Motion controller. Similar to my own
project, this one has the added benefit of connecting the controller straight to the microcontroller
at the expense of higher overall costs. This shows just how popular this hardware setup
(controller to microcontroller to robot arm) is and bodes well for the success of my own
project. (Harms, 2012)

Intuitive control systems have come in many forms over the years from camera-based motion
detection to measuring the human body's electrical activity and even to some extent pervading
the area of high-level design, with many developers now taking into account human-computer
interaction concepts when designing their systems.

Intuitive Control of a Planar Bipedal Walking Robot, published in 1998, showcased several
intuitive control strategies in the control of a walking robot, primarily for keeping it balanced
through stabilizing height, pitch and speed. The results of the work proved positive, suggesting
that the use of “simple physical intuition” can lead to the discovery of successful control
strategies that can be easily described to laymen. (Pratt, 1998)

The Microsoft Kinect (Microsoft, 2015), released publicly in 2010 for the Xbox 360 and later for
Windows, has been a particular favourite amongst hobbyists due to its capability to track full-body
motion very accurately. Using Kinect for Hand Tracking and Rendering in Wearable
Haptics was an article published in 2011 by IEEE and written by Valentina Frati and Domenico
Prattichizzo, in which the Kinect was explored as a solution for tracking hand motion with
wearable haptics (read: tactile feedback technology). This approach could have benefited my own
project, but the use of the Leap Motion instead seemed the better approach for higher
accessibility (as the user simply needs to start the application and plug the controller in, rather
than put on special gloves or any other accommodation the system would otherwise
require). (Frati, 2011)

Design and Implementation of an EMG Control System, written in 2013 by Ettedgui et al.,
attempted to create a control system for a wheel-based robot using a wireless Bluetooth
connection and electromyography sensors (i.e. sensors that detect "changes in electric potential
across muscles" and convert them into digital signals). Not only was the project successful in
creating an "inexpensive platform" for wireless robotics control, it also noted the rising
popularity of the approaches used, which have resulted in products such as the Myo armband
being developed. (Ettedgui, 2013)

The Leap Motion, too, has been used in many different projects over the years, making use of a
series of infrared sensors to detect hands placed above it and processing that information into an
easy-to-understand format for developers to use. The popularity of the device most likely owes
itself both to this and to the extensive documentation found on the company's website, along with
the relatively low price of the controller itself (£54.99 at time of writing, 10th April 2015).
(Leap Motion, Inc., 2015)

Title: Analysis of the Accuracy and Robustness of the Leap Motion Controller
Authors: Frank Weichert
Year: 2013
Description:
This was an in-depth study into the accuracy of the controller, intended to inform any future
work with the device about its capabilities. It found that the advertised precision of
0.01 mm was not achievable in a realistic environment, but that an average accuracy of 0.7 mm was
achieved instead. Any studies that would require greater accuracy than this would be better off
looking for other solutions, but this level of accuracy is sufficient for my own project's purposes.
(Weichert, 2013)

Development of a Leap Motion Controller-Based Program for Finger Range-of-Motion
Measurement, published in 2013 and written by Adrielle Cusi among others, showed another
application for this controller where doctors could use it to accurately measure “the maximum
degree and direction a joint can move". Automating certain medical tasks like this has wider
ramifications for the medical industry too: freeing up doctors to work on more important tasks
would lower the stress associated with a high workload, which could in turn improve the service
provided to healthcare patients and create a better experience for everyone involved. (Cusi, 2013)

Free-hand Interaction with Leap Motion Controller for Stroke Rehabilitation, published in 2014,
is an equally creative use of the Leap Motion controller, this time as an aid for recovering stroke
victims to practice finger movement and produce accurate clinical assessment scores as a result.
As with the range-of-motion measurement study mentioned before, the use of intuitive control
systems has once again been shown to be a benefit to the medical industry that could aid those in
need and not simply a passing fad in software design as some might think. (Khademi, 2014)

In 2013 the controller was explored as a solution for Australian Sign Language recognition,
though the study found issues similar to those in my own project with the reliability of tracking
information when the view of the hand is obstructed (i.e. at an angle to the controller instead of
palm down). The paper, titled The Leap Motion Controller: A View on Sign Language, while not
wholly successful, did note the potential for this kind of technology in opening up new methods
of interaction for the disabled community. It also noted that further development of the Leap
Motion API could improve on these issues and make the concept viable. (Potter, 2013)

Another paper in 2013, titled Air Painting with Corel Painter Freestyle and the Leap Motion
Controller, explores the use of the Leap Motion in the creation of artwork. It focuses on the use
of gestures and finger motion to “control every aspect of the painting process”, with each finger
representing its own brush. This kind of creative application helps to show the wide-reaching
benefits that the use of intuitive control systems can bring to the industry outside of the more
obvious uses it has in professional environments. (Sutton, 2013)

The Leap Motion controller seems to be a favourite amongst the music production community too,
as in the case of Ryo Fujimoto, otherwise known as Humanelectro. Fujimoto integrates the Leap
Motion into his musical act, using hand gestures to produce specific effects in his songs, and has
gained a certain level of renown through his work. Creative applications like this, as mentioned
before, show just how versatile intuitive interfaces can be when integrated well. (Fujimoto, 2015)

A marketplace specifically for apps created with the Leap Motion in mind exists and remains
fairly popular since its creation, seeing such projects as Lotus, the self-described “motion
controlled audio-reactive experiment” by Funktronic Labs, where the user can interact with
virtual “musical toys” through an intuitive control scheme. This and many other projects are
examples of how intuition can aid the concept of virtual reality that has been a popular point of
research for decades and has recently seen great leaps in terms of user interaction (such as the
case of the Oculus Rift (Oculus VR. LLC, 2015), one of the first commercially viable virtual
reality headsets that has seen a sharp rise in popularity). (Funktronic Labs, 2014)

The many benefits of studying these areas of the industry can be seen by both their direct and
indirect effects on it, though the wider implications should also be considered a positive of this
area of research. From teleoperated medical surgery to elderly care, the importance of these
technologies can't be overstated, and I believe they're well worth exploring for any computer
scientists who take an interest.

CHAPTER 3

SYSTEM SPECIFICATIONS

Leap Motion, a San Francisco-based motion-control software and hardware company, unveiled a
new way to control computers.
Using a USB connection, the device creates an interaction space of about four cubic feet in which
to interact with a computer. The controls are similar to using a touch-screen computer, but in the
air. What is more impressive is the accuracy with which the Leap can measure movement.
Currently, the Microsoft Kinect is the biggest player in 3D motion-control technology; however,
the Kinect is not accurate enough to track handwriting. Leap Motion says the Leap is 200 times
more sensitive than existing technologies.
"It was this gap between what's easy in the real world but very complicated to do digitally, like
molding a piece of clay or creating a 3D model, that inspired us to create the Leap and
fundamentally change how people work with their computers," Leap Motion chief executive
officer and co-founder Michael Buckwald said in a press release.

" In addition to the Leap for computers, our core software is versatile enough to be embedded in
a wide range of devices, including smart phones, tablets, cars and refrigerators. One day 3-D
motion control will be in just about every device we interact with, and thanks to the Leap, that
day is coming sooner than anyone expected."

The Leap retails for $69.99 and is available for pre-order now. The device will ship this winter.

3.1 SDK/Drivers
• Runs on Windows, Mac and Linux environments
• Supports development in C#, C++, Objective-C, Java, Python, JavaScript and more
• Supported by a rich developer community portal
• Contains APIs for common gestures for rapid prototyping (a minimal usage sketch follows below)
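To give a flavour of what development against these APIs looks like, here is a minimal sketch assuming the official Python bindings (the Leap module shipped with the SDK); the exact module layout can differ between SDK versions.

```python
# Minimal sketch, assuming the Leap module from the official SDK is
# importable. Polls the controller and prints fingertip positions.
import time
import Leap

def main():
    controller = Leap.Controller()
    time.sleep(1.0)                          # allow the service to connect
    for _ in range(100):
        frame = controller.frame()           # latest tracking frame
        for hand in frame.hands:
            for finger in hand.fingers:
                tip = finger.tip_position    # Leap.Vector, millimetres
                print("finger %d: (%.1f, %.1f, %.1f)"
                      % (finger.id, tip.x, tip.y, tip.z))
        time.sleep(0.02)

if __name__ == "__main__":
    main()
```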

3.2 Hardware and Software Requirements

Hardware

• Intel i3 / i5 / i7 processor
• 2 GB RAM
• USB 2.0 port

Software

• Windows 7 or 8
• Mac OS X 10.6

3.3 Activation & First Use

• Plug in the Leap and place it in a comfortable position in front of your computer screen
• Go to leapmotion.com/setup to download the drivers and the Leap control panel (available for Windows and Mac)
• After installation, the Leap Motion Orientation will run

Fig 3.3.1: Description of Leap Motion

3.4 Experimental Design:
The controller’s performance was evaluated through two types of measurements. In the first
measurement, a series of fixed static points in space were tracked and recorded for a longer
period of time to evaluate the consistency and dispersion of the results. The coordinates of the
points were systematically chosen to cover the majority of the controller’s sensory space. In the
second measurement, a constant distance was provided between two objects, which were then
moved freely around the sensory space. The tracking accuracy of the controller was then
evaluated based on the distortion of the distance between the two objects. The reference system
(a professional optical motion capture system) was used to determine the exact spatial positions
of the tracked objects and the distances between them.

3.4.1. The Leap Motion Controller


The Leap Motion Controller uses infrared (IR) imaging to determine the position of predefined
objects in a limited space in real time. Technically, very few details are known about the precise
nature of the algorithms used due to patent and trade secret restrictions. However, from
inspection of the controller, it is clear that three separate IR LED emitters are used in conjunction
with two IR cameras. Therefore, the controller can be categorized as an optical tracking system
based on the stereo vision principle. According to the official information, the Leap software
analyzes the objects observed in the device’s field of view. It recognizes hands, fingers, and
tools, reporting discrete positions, gestures, and motion. The controller’s field of view is an
inverted pyramid centered on the device. The effective range of the controller extends from
approximately 25 to 600 millimeters above the device (1 inch to 2 feet). The controller itself is
accessed and programmed through Application Programming Interfaces (APIs), with support for
a variety of programming languages, ranging from C++ to Python. The positions of the
recognized objects are acquired through these APIs. The Cartesian and spherical coordinate
systems used to describe positions in the controller’s sensory space are shown in Fig 3.4.1.
However, it should be noted that the sampling frequency is not stable, cannot be set, and varies
significantly.
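For reference, converting a Cartesian position to spherical coordinates is only a few lines of code. The sketch below assumes one plausible convention (radial distance from the controller's origin, polar angle measured from the vertical y axis, azimuth in the x-z plane); the controller's actual convention is the one defined in Fig 3.4.1.

```python
# Hedged helper: Cartesian (mm) to spherical coordinates under the
# convention assumed above; adjust the angles to match Fig 3.4.1.
import math

def cartesian_to_spherical(x, y, z):
    r = math.sqrt(x * x + y * y + z * z)         # radial distance (mm)
    theta = math.acos(y / r) if r > 0 else 0.0   # polar angle from +y (rad)
    phi = math.atan2(z, x)                       # azimuth in x-z plane (rad)
    return r, theta, phi
```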

Fig 3.4.1. The Cartesian and spherical coordinate systems used to describe positions in the
controller’s sensory space.

3.4.2. The Reference System


A high-precision optical tracking system [9] consisting of eight Oqus 3+ high-speed cameras and
Qualisys Track Manager software (version 2.8—build 1065) was used as the reference system
(Qualisys Inc., Gothenburg, Sweden). Such systems are widely used for the fast and precise
tracking of various objects in industrial applications, biomechanics, and media and entertainment
applications. The tracking precision depends on the number of cameras used, their spatial layout,
the calibration process, and the lighting conditions. In our case, only three markers were used,
one for static measurement and two for dynamic measurement. In the dynamic measurement, a
simple Automatic Identification of Markers (AIM) model was created from the two selected
markers and their connecting bone. All markers were seen by all cameras at all times. The
standard deviation of the noise for the static marker was measured for each individual
coordinate: stdx = 0.018 mm, stdy = 0.016 mm and stdz = 0.029 mm.

3.4.3. Technical Setup


The Leap Motion controller was placed on a table 60 × 60 cm in area and 73 cm in height. The
controller was firmly attached to the table, ensuring no undesired movement of the device. The
controller transmitted data on the identified objects to a desktop computer (Intel® Core™ i7-
2600 CPU 3.40 GHz with 8 GB of RAM). A set of scripts was written in the Python
programming language using the Leap Motion APIs specifically for this study. The scripts were
used for real-time data acquisition and logging. The operation of the controller was monitored in
real time using the Leap Motion Visualizer software. The optical reference system provided a
calibrated measurement volume of approximately 1 × 1 × 1 m in size, with a resolution of 1.3
million pixels and a constant frame rate of 500 frames per second. The cameras were set up
uniformly, encircling the Leap Motion controller so that each camera’s point of view was
directed towards the controller. A set of hard passive markers with diameters of 12.5 mm was
used in the measurements. The coordinate systems of the reference system and the controller
were aligned at the origin of the controller’s coordinate system. Two types of measurements
were performed within the experiment, under two experimental conditions:
• Static conditions: acquisition of a limited number of static points in space
• Dynamic conditions: tracking of moving objects with a constant inter-object distance within the calibrated space
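The study's actual scripts are not reproduced here, but a minimal sketch of the kind of acquisition and logging they performed might look as follows; it assumes the official Leap Python bindings, and the file name and CSV layout are illustrative only.

```python
# Illustrative acquisition/logging sketch (not the study's actual script).
import time
import Leap

def log_fingertips(duration_s=30.0, path="leap_log.csv"):
    controller = Leap.Controller()
    last_id = -1
    with open(path, "w") as out:
        out.write("timestamp_us,finger_id,x_mm,y_mm,z_mm\n")
        t_end = time.time() + duration_s
        while time.time() < t_end:
            time.sleep(0.002)          # poll faster than the controller samples
            frame = controller.frame()
            if frame.id == last_id:
                continue               # no new frame yet; sampling is not uniform
            last_id = frame.id
            for finger in frame.fingers:
                tip = finger.tip_position          # millimetres
                out.write("%d,%d,%.3f,%.3f,%.3f\n"
                          % (frame.timestamp, finger.id, tip.x, tip.y, tip.z))
```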

Our pre-experiment trials indicated the controller’s inability to track static objects that do not
resemble the human hand. We can only speculate that this limitation is due to the controller’s
internal algorithms, as they are protected by patents and therefore not publicly disclosed. A
pointed object, such as a pen tip (used for tracking in [15]), was successfully tracked only if
constantly in motion. When it was stationary and mounted on a stand, it was successfully tracked
for only approximately 8–10 s. After this period of time, the measurement was automatically
stopped by the controller. Therefore, a plastic arm model was used (Fig 3.4.3.1) instead of a
simpler object.

Fig 3.4.3 The setup of the experimental environment.

During the measurement of static locations, the arm model was firmly attached using a stand
(Fig 3.4.3.1) and directed perpendicular to the z = 0 plane, in the opposite direction from the z axis.
Additionally, a reflective marker was attached to the index fingertip of the plastic arm for
simultaneous tracking by the controller and by the reference motion capture system. The stability
of the stand was measured using the reference system, which indicated the dispersion of the
measured index fingertip location to be below 20 µm.

Fig 3.4.3.1 To improve the tracking capabilities of the Leap Motion Controller, the marker was
placed at the tip of the index finger of a plastic arm model. During the measurement of static
locations, the arm was fixed in place using a stand.

For dynamic measurements, the tracking objects were moved around the sensory space at an
approximately constant speed of 100 mm/s. Instead of the plastic arm, a special tool was used to
mimic two human fingers. It consisted of two wooden sticks with markers, fixed together to form
a V-shape (hereafter "the V-tool") (Fig 3.4.3.2). This tool provided a constant distance between
the two tracked objects, which was used to evaluate the tracking performance. It was tracked
flawlessly by the controller and the reference system simultaneously. The exact distance was
acquired using the reference system (d = 21.36 mm, stdd = 0.023 mm). The arm model with five
fingers proved to be very impractical for this type of measurement, as the controller usually
tracked the five fingers as five individual points that could not be identified separately. It was
therefore almost impossible to identify the results for two selected fingers and calculate their
inter-finger distance.

Fig 3.4.3.2 The V-tool used for dynamic measurements.
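Given logs of the two marker positions, the evaluation quantities described above reduce to a few lines of array arithmetic. The following is a plausible sketch (the function names are illustrative; the reference distance is the value measured above):

```python
import numpy as np

def static_dispersion(points_mm):
    """Per-axis standard deviation of a tracked static point, plus the
    RMS distance of the samples from their mean (overall dispersion)."""
    pts = np.asarray(points_mm)                  # shape (N, 3)
    std_xyz = pts.std(axis=0)
    rms = np.sqrt(((pts - pts.mean(axis=0)) ** 2).sum(axis=1).mean())
    return std_xyz, rms

def distance_distortion(pts_a, pts_b, d_ref_mm=21.36):
    """Per-sample deviation of the measured inter-marker distance from
    the reference distance obtained with the motion capture system."""
    d = np.linalg.norm(np.asarray(pts_a) - np.asarray(pts_b), axis=1)
    return d - d_ref_mm
```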

CHAPTER 4

APPLICATIONS

Two or three hundred thousand lines of code later, we’ve figured out how to use the Leap to
create an interaction space around your computer, in 3D. It is able to distinguish thumbs from
fingers, and even handheld items like pencils. This allows users to interact like never before,
using only natural movements. And we went a step further: you will be able to create custom
gestures that fit how you want to use your computer. You can even network more than one Leap
device to create even larger interaction areas.

Fig 4.1 Evolution of Human Computer Interfaces

Fig 4.2: Market Potential

4.1 Advantages
4.1.1. Accurate
100× more accurate than anything on the market: 1/100th mm, tip-of-a-pin accuracy

Fig 4.3: Accuracy presentation

4.1.2. Powerful
No visible latency – a real-time interaction experience with precise hand and finger movements

Fig 4.4: Motion Representation

4.1.3. Embeddable
Small form factor, low z-height, low CPU and power consumption

Fig: 4.5 Embeddable device

4.1.4. Content-rich
Airspace App Store with over 60,000 app developers across a wide range of categories

Fig 4.6 Airspace APP store

4.1.5. Other Advantages
• Can track movements at a rate of over 200 frames per second
• Plug and play (via USB)
• 200 times more sensitive than existing touch-free technologies (e.g., the Kinect)
• Affordable and inexpensive

4.2 APPLICATIONS
4.2.1. Data Visualization
• Complex data navigation
• Trend and correlation identification
• Decision efficiencies
• Data sharing and collaboration

Fig 4.7 Data Visualization

4.2.2.Art & Design


 More Intuitive 3D interface
 Greater accuracy and responsiveness
 Offers better designing as compare to mouse

Fig 4.8: Art & design

4.2.3. GAMING
• More intuitive compared to a mouse or even touch screens
• Great accuracy and responsiveness

Fig 4.9 Great responsiveness in gaming

4.2.4. MUSIC AND VIDEO

• Electronic music
• Playing or learning instruments
• Organic feel
• Efficient

Fig 4.10 DJ Music Mix

4.3 Limitation
An important limitation of the controller’s performance is its inconsistent sampling
frequency. Its mean value of less than 40 Hz is relatively low, and it varies significantly
under both static and dynamic conditions. The main drawback of the non-uniform
sampling is that it makes synchronizing the controller with other real-time systems very
difficult, since doing so requires non-trivial post-processing and resampling operations,
as sketched below.
4.3.1 Limitations of Leap Motion
We shifted the focus from a usable product to a research project due to the number of issues we
encountered with Leap Motion. Aside from facing problems with the gestures, we also realized
that there were many issues with the Leap technology interacting with other types of
technologies.
Leap Motion is an exceptional product for many desktop applications; however, the Leap Motion
JavaScript Software Development Kit (SDK), in its current form, is quite limited for development
and for the user experience. The JavaScript SDK does not offer a large number of gestures, and
the precision of gestures in the application is limited. In addition, both the Wi-Fi connection and
the choice of Web browser can change the user experience. There are instances where the Leap
Motion SDK does not support all the activity that a user may wish to perform, in which case the
user may revert to the mouse and keyboard. All of the limitations listed can be exemplified
through our Travel Application website, and they are what ultimately caused us to change our
project to be more research-based than application-based.
4.3.2 Gestures
The precision of the Leap Motion device using the Web API is not sufficient to perform
many tasks that require fine pointing, such as selecting a small point on the screen. Consider,
for instance, the KeyTapGesture. We had originally chosen the KeyTapGesture to click
on a hotel icon because it seemed intuitive. However, doing so was arduous due to the small size
of the markers as well as the proximity of markers to each other, requiring a high level of
precision. Due to the nature of gestures in general, it was difficult to pinpoint the exact location
where the gesture was made since a gesture itself is a movement. This meant that when a user
would make the gesture to select a hotel, the point being selected would change while the gesture
was being made. The result of this was either that no hotel or an undesired hotel was selected.
This held true not only for selecting hotel markers, but also for any gestures that depended upon
any amount of precision.
Another precision related issue concerned the number of fingers that were detected by the Leap
device. Many gestures required only one finger to be detected by the Leap Motion device. This
presented a serious usability issue. For example, at one point the swipe gesture was only to be
recognized when a single finger was on the screen, as it could help differentiate between similar
gestures. However, oftentimes the Leap device would inaccurately detect the number of fingers.
This resulted in some cases where a correctly completed gesture occurred but no gesture was
registered. In addition, there were other cases where a gesture was supposed to occur a single
time, but multiple instances of the gesture were registered. For example, when we implemented
the ScreenTapGesture to zoom in or out, we created the function to fire only when one finger was
making the gesture. When a ScreenTapGesture was performed with multiple fingers instead of
just one, the action would be performed multiple times instead of just once, or not at all. A sketch
of how such gestures can be gated and debounced follows below.
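To illustrate the two mitigations just described, the sketch below gates a tap gesture on the number of detected fingers and debounces repeated firings. It is written against the desktop Leap Python SDK rather than LeapJS (the equivalent JavaScript calls differ in detail) and assumes controller.enable_gesture(Leap.Gesture.TYPE_SCREEN_TAP) was called during setup.

```python
# Hedged sketch: gate ScreenTap gestures on finger count and debounce
# duplicates. Assumes the desktop Leap Python SDK, not the JS SDK.
import time
import Leap

DEBOUNCE_S = 0.5            # minimum time between accepted taps
_last_fire = 0.0

def poll_taps(controller, on_tap):
    global _last_fire
    frame = controller.frame()
    for gesture in frame.gestures():
        if gesture.type != Leap.Gesture.TYPE_SCREEN_TAP:
            continue
        if len(frame.fingers) != 1:
            continue            # gate: accept taps only when one finger is seen
        now = time.time()
        if now - _last_fire >= DEBOUNCE_S:
            _last_fire = now    # debounce: suppress repeated firings
            on_tap(Leap.ScreenTapGesture(gesture).position)
```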
There were also issues related to the limited number of available gestures and how these gestures
interacted with each other. When originally brainstorming gestures and their correlating actions,
we had thought that taking well-known gestures and translating them into Leap gestures would
be the best option because most users would already be familiar with them. For example, when
planning the zoom feature, an intuitive gesture would have been moving the user’s fingers closer
together or farther apart, similar to what is currently used for Android and iPhone zooming. In
order to translate this motion to a Leap gesture, we thought of having users use two hands instead
of two fingers. However, using opposite gestures to be representative of opposite actions, while
familiar was almost impossible to implement using Leap. Using the zoom example from before,
if a user wanted to zoom in, they would move their hands closer. However, if they wanted to
zoom in multiple times, they would have to take their hands out of the field of view for Leap
each time they wanted to zoom in to reset the gesture. In order to counteract this, specific
constraints had to be made in order to have a usable product. However, doing so reduced the ease
with which the application was used.
Another option, which was more viable, was to have two gestures for opposite tasks that are not
direct opposite representations of each other. For example, to zoom in, the gesture was swiping
horizontally, while the zoom-out gesture was swiping vertically. Ultimately, we had to weigh
how intuitive a gesture was against its effectiveness. Although this reduced the usability of the
application, we opted for gestures that worked instead of ones that may have been more
intuitive. We also faced issues with specific aspects of Leap Motion, such as creating a custom
gesture, and limitations with the interaction box, a box-shaped region that is completely within
the field of view of the Leap Motion controller.
The hover gesture we had wanted to create would allow the user to hover over a hotel marker
and view its information. If the user’s hand was in place for a certain number of seconds, Leap
would display the hotel information. After three weeks of attempted implementation, various
forum posts, and posting on the Leap Developer site as well, we determined that creating this
gesture was too time-consuming and that the KeyTapGesture would be capable of achieving the
same action without investing nearly as much time. The reason creating a gesture was so difficult
was that there was little to no documentation available on the Leap Developer website and, from
our research, it seemed that no other developer had created a custom Leap gesture, although
many had tried.

CHAPTER 5

Conclusion
The Leap represents a significant step forward for touch-free, motion-sensing technology, and the
sensible price point means we won't need to be stinking rich to get our hands on one. The
potential applications here are truly exciting, but if the device really is going to be the revolution
in UI control that Leap Motion makes it look like, then it will require some clever integration into
the software we use on a daily basis.

Based on the insights gained from these experiments, further study of the Leap Motion
Controller may include research on the precision and reliability of tracking more complex
hand/finger and tool movements, as well as its suitability for applications that rely heavily on the
gesture input modality. The Leap Motion Controller undoubtedly represents a revolutionary
input device for gesture-based human-computer interaction. In this study, we evaluated the
controller as a possible replacement for a fast, high-precision optical motion capture system
in a limited space and with a limited number of objects. Based on the current results and the
overall experience, we conclude that the controller in its current state could not be used as a
professional tracking system, primarily due to its rather limited sensory space and inconsistent
sampling frequency.

CHAPTER 6
Future Enhancements

Technology continues to evolve into new and different things and it continues to amaze me how
far we have come just in the last ten years — heck, even in the past five years. I am not that old,
but I have been able to witness the evolution of technology from black and white computer
screens to now where we have tablets and smartphones that are almost as powerful as our
computers. It just amazes me how far we have come.

Documenting these issues will be beneficial for developers looking to create Web applications
using Leap Motion. Because the technology is still relatively new, documenting such problems
will prevent developers from facing many of the same issues that we faced. One disadvantage of
using Leap Motion is that there is no way to type or create words using Leap Motion alone; users
must use a keyboard. In order to address this issue, Leap could implement some type of voice
interaction. Given more time, we could research how Leap interacts with all applications instead
of just Web applications, and with other languages as well. Once these issues are addressed and a
new version of the product is rolled out, we believe Leap Motion could be a very useful and
widely used technology.

REFERENCES

1. Collicutt, M., Casciato, C. and Wanderley, M. M., "From Real to Virtual: A Comparison of Input Devices for Percussion Tasks", Proceedings of the International Conference on New Interfaces for Musical Expression, 2009, pp. 1–6.
2. Costa Júnior, J. B., Calegario, F. C. A., Magalhães, F., Cabral, G., Teichrieb, V. and Ramalho, G. L., "Towards an Evaluation Methodology for Digital Music Instruments Considering the Performer's View: A Case Study", Anais do 13º Simpósio Brasileiro de Computação Musical, Vitória, ES, 2011.
3. Freitas, D. Q., Gama, A. E. F., Figueiredo, L., Chaves, T. M., Oliveira, D. M., Teichrieb, V. and Araujo, C., "Development and Evaluation of a Kinect Based Motor Rehabilitation Game", Simpósio Brasileiro de Jogos e Entretenimento Digital, Brasília, SBC – Proceedings of SBGames, 2012.
4. Gray, H., "Anatomy of the Human Body". Available at <http://www.bartleby.com/107/>. Accessed 30 August 2013.
5. Hadjakos, A. and Mühlhäuser, M., "Analysis of Piano Playing Movements Spanning Multiple Touches", Proceedings of the International Conference on New Interfaces for Musical Expression, 2010. Retrieved from http://www.nime.org/proceedings/2010/nime2010_335.pdf.
6. Jordà, S., "Digital Lutherie", Universitat Pompeu Fabra. Available at <http://dialnet.unirioja.es/servlet/tesis?codigo=19509>. Accessed 30 August 2013.
7. Jordà, S., Kaltenbrunner, M., Geiger, G. and Bencina, R., "The reacTable", Proceedings of the International Computer Music Conference, Barcelona, Spain, 2013, pp. 579–582. Citeseer.
8. Kiefer, C., Collins, N. and Fitzpatrick, G., "HCI Methodology for Evaluating Musical Controllers: A Case Study", Proceedings of the International Conference on New Interfaces for Musical Expression, 2013. Retrieved from <http://nime2008.casapaganini.org/documents/Proceedings/Papers/193.pdf>.
9. Leap Motion Incorporation. Retrieved from https://leapmotion.com/, 29 October 2012.
10. Malloch, J., Birnbaum, D., Sinyor, E. and Wanderley, M., "Towards a New Conceptual Framework for Digital Musical Instruments", Proceedings of the 9th International Conference on Digital Audio Effects, 2006, pp. 49–52. Retrieved from http://www.dafx.ca/proceedings/papers/p_049.pdf.
11. Miller, J. and Hammond, T., "Wiiolin: A Virtual Instrument Using the Wii Remote", Proceedings of the 2010 Conference on New Interfaces for Musical Expression, Sydney, Australia, 2010.
12. Montag, M., Sullivan, S., Dickey, S. and Leider, C., "A Low-Cost, Low-Latency Multi-Touch Table with Haptic Feedback for Musical Applications", Proceedings of the International Conference on New Interfaces for Musical Expression, June 2011, pp. 8–13.
13. Peng, L. and Gerhard, D., "A Wii-Based Gestural Interface for Computer Conducting Systems", Proceedings of the International Conference on New Interfaces for Musical Expression, 2009. Retrieved from www.nime.org/proceedings/2009/nime2009_155.pdf.
14. Silva, J. V. S., "Avaliando Interfaces Gestuais Para Prática de Instrumentos Virtuais de Percussão" (Evaluating Gestural Interfaces for Practising Virtual Percussion Instruments), Master's dissertation, Universidade Federal de Pernambuco, Recife, 2009.
15. Singer, E., Larke, K. and Bianciardi, D., "LEMUR GuitarBot: MIDI Robotic String Instrument", Proceedings of the 2003 Conference on New Interfaces for Musical Expression (NIME-03), Montreal, Canada, 2003.
16. Synthesia Game LLC. Retrieved from https://www.synthesiagame.com, 29 October 2012.
17. Wanderley, M. M. and Orio, N., "Evaluation of Input Devices for Musical Expression: Borrowing Tools from HCI", Computer Music Journal, 26(3), 2002, pp. 62–76.
18. Wong, E., Yuen, W. and Choy, C., "Designing Wii Controller: A Powerful Musical Instrument in an Interactive Music Performance System", Proceedings of the 6th International Conference on Advances in Mobile Computing and Multimedia, 2008, pp. 82–87. ACM.
