Louis Fellows
Declaration
COPYRIGHT
Attention is drawn to the fact that copyright of this dissertation rests with its author. The
Intellectual Property Rights of the products produced as part of the project belong to the
University of Bath (see http://www.bath.ac.uk/ordinances/#intelprop).
This copy of the dissertation has been supplied on condition that anyone who consults it
is understood to recognise that its copyright rests with its author and that no quotation
from the dissertation and no information derived from it may be published without the
prior written consent of the author.
Declaration
This dissertation is submitted to the University of Bath in accordance with the requirements
of the degree of Bachelor of Science in the Department of Computer Science. No portion of
the work in this dissertation has been submitted in support of an application for any other
degree or qualification of this or any other university or institution of learning. Except
where specifically acknowledged, it is the work of the author.
Signed:
Abstract
The aim of this project is to create an interface whereby the Wiimote can be used as an
instrument to create music. The idea allows for a brief exploration of the properties of the
Wiimote whilst providing a system with multiple methods of creating sound using all the
input methods available within the Wiimote, and attempts to keep the Wiimote and the
system as separate as possible by allowing the system to be controlled remotely.
Contents
1 Introduction 1
1.1 Problem Description . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 Aims . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.3 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
1.3.1 Functional Requirements . . . . . . . . . . . . . . . . . . . . . . . . 2
1.3.2 Non-Functional Requirements . . . . . . . . . . . . . . . . . . . . . . 3
1.4 Project Plan . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
1.4.1 The System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
1.4.2 Required Resources . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.4.3 Gantt Chart . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
2 Literature Survey 6
2.1 Gesture Capture Methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
2.1.1 Hidden Markov Models (HMM) . . . . . . . . . . . . . . . . . . . . . 7
2.1.2 Conditional Random Fields (CRF) . . . . . . . . . . . . . . . . . . . 8
2.2 Wii Remote Connection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
2.2.1 Java . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
2.2.2 C++ . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
2.2.3 C. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
2.2.4 Decision . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.3 Musical Interfaces . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.3.1 Physical Instruments . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.3.2 Synthetic Instruments . . . . . . . . . . . . . . . . . . . . . . . . . . 13
3 Design 14
3.1 Features vs Playability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
3.2 UI Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
3.3 Control . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
3.4 Gestures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
3.5 Instruments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
4 Requirements 19
4.1 Functional Requirements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
4.1.1 System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
4.1.2 Instruments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
4.1.3 Gesture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
4.1.4 Sound . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
4.2 Non-Functional Requirements . . . . . . . . . . . . . . . . . . . . . . . . . . 22
6 Conclusions 42
6.0.1 Further Developments . . . . . . . . . . . . . . . . . . . . . . . . . . 43
A Design Diagrams 47
D Code 59
D.1 wiinote.engine.ListenerFlute.java . . . . . . . . . . . . . . . . . . . . . . . . 60
D.2 wiinote.engine.ListenerLeds.java . . . . . . . . . . . . . . . . . . . . . . . . . 60
D.3 wiinote.engine.ListenerPitchRoll.java . . . . . . . . . . . . . . . . . . . . . . 60
D.4 wiinote.engine.MidiOut.java . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
D.5 wiinote.engine.MWProcess.java . . . . . . . . . . . . . . . . . . . . . . . . . 62
D.6 wiinote.engine.Wiinote.java . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
D.7 wiinote.gesture.AccDirectionObject.java . . . . . . . . . . . . . . . . . . . . 66
D.8 wiinote.gesture.AccelerationArray.java . . . . . . . . . . . . . . . . . . . . . 67
D.9 wiinote.gesture.ConvArray.java . . . . . . . . . . . . . . . . . . . . . . . . . 68
D.10 wiinote.gesture.GestureObject.java . . . . . . . . . . . . . . . . . . . . . . . 69
D.11 wiinote.gesture.GestureRecognisedEvent.java . . . . . . . . . . . . . . . . . 69
D.12 wiinote.gesture.GestureRecogniser.java . . . . . . . . . . . . . . . . . . . . . 70
D.13 wiinote.gesture.ListenerGestureCapture.java . . . . . . . . . . . . . . . . . . 71
D.14 wiinote.gesture.PathObject.java . . . . . . . . . . . . . . . . . . . . . . . . . 72
D.15 wiinote.ui.GestureGui.java . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
D.16 wiinote.ui.Gui.java . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
D.17 wiinote.ui.MessagesWindow.java . . . . . . . . . . . . . . . . . . . . . . . . 80
D.18 wiinote.ui.NoteWindow.java . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
List of Figures
Acknowledgements
• and Ellie for keeping me sane! (Well, as sane as I was to start with!).
Chapter 1
Introduction
Musical instruments have existed for many thousands of years and have defined cultures
since the dawn of time. New musical instruments are still being created to this day with
notable modern examples such as the electric guitar and the digital keyboard.
Most modern musical instruments are based heavily on classical examples (such as the
aforementioned guitar and keyboard). However, advances in technology have allowed people
to experiment with many weird and wonderful ways of creating music. One of the most
prominent of these is the Theremin, one of the earliest electronic instruments, played by
waving one's hands around two radio antennas to create an often eerie, electronic
sound.
Recent times have also brought a wave of new human-computer interaction (HCI) methods,
such as touchscreens and voice recognition. One of the more novel examples of HCI is
Nintendo's Wii Remote (known informally as the Wiimote), used to control the Wii games
console. It contains a three-axis linear accelerometer which can capture movements,
allowing users to interact with their games physically.
The Wiimote connects to the Wii console using Bluetooth. This means that a Wiimote can
also be connected to any system with Bluetooth (such as a PC) to provide accelerometer
support.
1.2 Aims
The aim of this project was born by combining these two ideas: using accelerometers to
interact with a computer, and creating music in new and interesting ways. The goal is to
use a computer to interpret data coming from a Wiimote and output musical notes in
accordance with the user's commands, thus turning the Wiimote into a musical instrument.
1.3 Objectives
The computer will need to take the raw data from the Wiimote, make the data useful to
the system and then use that data to decide what should be played. There should also be
some method of gesture recognition to control the system functionality.
For this to be viable as a musical instrument, this data refining and gesture recognition
must happen in real time, with a success rate high enough to be relied upon whilst giving
a musical performance. A musical instrument that cannot be performed with is not a
particularly useful instrument.
The gestures should also be customizable, to allow a user to perform whatever movement
seems the most comfortable to them to perform an action. The system should also output
in a format that can be used by many different systems and thus give the possibility of
extending the system at a later date.
Taking all this into account, below are the high level requirements of the system:
• The system must be able to take the input from a Wiimote connected by Bluetooth
in such a way that it is useful to the system.
• The system must be able to recognise a gesture made by the user with the Wiimote
and know how to react when that gesture is made.
• The user should be able to define new gestures and define an action that will be
performed when the gesture is recognised.
• The system must be able to output data in a musical format (e.g. MIDI) that is
widely used and understood by outside systems.
• The system must work in real time to make giving a musical performance using the
Wiimote possible.
• The gesture recognition must be sufficiently accurate at judging gestures that the
system can be reliably used to give a performance.
• The system should be able to play any song, given that the user has had enough
training/practice (i.e. the system should act as an actual instrument, rather than a
'toy').
This is the initial section of the system. Its purpose is to take a snapshot of the raw data
from the Wiimote and hand it to the system. Libraries designed for this task are already
available and can be used for this project, making this section the easiest to complete.
Music Generation
This section takes the raw input from the section above and uses it to create sound. This
would be done by taking the inputs and running an algorithm that generates a note and any
properties of the note that are needed. These details are then passed to the next section.
MIDI Synthesis
This is the final section: it takes the information from the previous section and creates a
MIDI message from it. These messages are then sent to a MIDI controller to control an
outside system which plays the music. Libraries for controlling MIDI events are already
available, which should make this section relatively straightforward.
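The message-building step described above can be sketched with the standard javax.sound.midi API (part of the Java class library). The class and method names here are illustrative only, not the project's actual code:

```java
import javax.sound.midi.InvalidMidiDataException;
import javax.sound.midi.ShortMessage;

// Sketch: wrapping a note decision into a MIDI NOTE_ON message.
public class MidiSketch {
    // Build a NOTE_ON message for the given note number (0-127) and
    // velocity (0-127) on MIDI channel 0.
    public static ShortMessage noteOn(int note, int velocity)
            throws InvalidMidiDataException {
        return new ShortMessage(ShortMessage.NOTE_ON, 0, note, velocity);
    }

    public static void main(String[] args) throws Exception {
        ShortMessage msg = noteOn(60, 93); // middle C, moderate velocity
        System.out.println(msg.getCommand() + " " + msg.getData1());
    }
}
```

In practice the message would then be delivered to a synthesizer via a Receiver, for example one obtained from MidiSystem.getReceiver().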
Gesture Recognition
This is by far the most complex section of the project. It will have to take the data from the
Wiimote, attempt to recognise any gestures made by it and perform the action related to
it. My initial reading has identified a number of methods which attempt to decipher
gestures from the data. This will also have to be done reliably and in real time.
Below is a list of resources that I foresee I will need to successfully complete the project.
Software Resources
• Libraries to communicate with and retrieve raw data from the Wiimote. These are
available freely on the Internet.
• Libraries to communicate with MIDI Systems. These are also available freely on the
internet.
• A compiler for the C, C++ or Java language, whichever becomes the more obvious
choice after the literature survey. Currently the most obvious language would seem
to be Java. Compilers for all three are available on the library computers.
Hardware Resources
• Computer for software creation and write-up of dissertation. I have one of these and
there are several available in the library.
Literature Resources
Figure 1.2: Gantt Chart Showing the Planned Timing of Work Throughout the Project
Chapter 2
Literature Survey
There are many areas of this project that need to be investigated before any major decisions
can be made. The title of the project leaves a large scope for creation of a system and whilst
simply ‘jumping in’ and seeing what happens could work well for building this system, it
leaves the possibility of suffering the same problems and making the same errors as found
by those who have attempted similar endeavours in the past. As Konrad Adenauer has
said, “History is the sum total of things that could have been avoided”.
This literature survey then, is to take the ideas of the project proposal, explore the pos-
sibilities within them and discover the methods by which the project can progress in the
most successful fashion possible.
I have split this document into three main parts: the first looks at various methods of
capturing gestures made by a user; the second briefly surveys the available libraries for
connecting to a Wii Remote and chooses one for the project; and the third looks at musical
instruments and how they can be represented in a computer system.
The ability to recognise human movement with a computer system has been experimented
with for nearly three decades (Moeslund and Granum, 2001) (Moeslund, Hilton and Krüger,
2006), and the field has recently started to accelerate with the expansion of physical
interfaces by many big-name companies such as Microsoft (Microsoft Surface, 2008) and
Apple (Apple iPhone, 2009).
The initial stage of capturing a gesture is to choose an input method. In the two examples
above (Microsoft Surface, 2008) (Apple iPhone, 2009) the input method is a multi-touch
screen, but there are many other methods, such as cameras (Wilson, 2004), IR cameras
with IR-emitting diodes (Kapoor, 2001), gyroscopes (Sakaguchi, Kanamori, Katayose, Sato
and Inokuchi, 1996) and many more.
Hidden Markov Models (HMMs) are a generative approach to gesture capture (Morency,
2007) and can be used when a system of states can be described as a Markov process (a
random process whose future probabilities are determined by its most recent values) with
unknown parameters. Given a set of observable states, an HMM can calculate the most
likely sequence of hidden state transitions from a sequence of the observed states.
An HMM is constructed using a Markov model to model the hidden states and the
probabilities of transition between each state, along with a start vector containing the
probabilities of starting at each state. The status of these hidden states is unknown to
the system.
In addition to these hidden states there are a number of observable states, along with a
matrix containing the probability that the system is in a certain hidden state when an
observed state is seen.
If a system can be described with an HMM then three separate problems can be solved.
• Firstly, given a sequence of observed states, the most likely sequence of hidden states
can be calculated (known as decoding).
• Secondly, given the HMM, the probability of a sequence of observed states occurring
can be calculated (known as evaluation).
• Thirdly, given a set of observed sequences, the parameters of an HMM that best
describes them can be estimated (known as learning).
To use this to recognise gestures we would use the learning algorithm to create a Hidden
Markov Model for each of the gestures that could be recognised. Then when a gesture is
performed, we can use an evaluation of the sequence of observed states with each of the
HMMs in the system, the HMM returning the highest probability is the most likely to be
the gesture the user was performing. Knowing this we can perform the action related to
the gesture.
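The evaluation step described above is normally solved with the forward algorithm. Below is a minimal, self-contained sketch; the class name and the toy model parameters are illustrative, not drawn from the project:

```java
// Forward algorithm for the HMM 'evaluation' problem: given an HMM
// (start vector pi, transition matrix a, emission matrix b) and a sequence
// of observed states, compute the probability of observing that sequence.
public class HmmForward {
    public static double evaluate(double[] pi, double[][] a, double[][] b,
                                  int[] obs) {
        int n = pi.length;
        double[] alpha = new double[n];
        // Initialisation: probability of starting in state i and emitting obs[0].
        for (int i = 0; i < n; i++) alpha[i] = pi[i] * b[i][obs[0]];
        // Induction: fold each later observation into the running probabilities.
        for (int t = 1; t < obs.length; t++) {
            double[] next = new double[n];
            for (int j = 0; j < n; j++) {
                double s = 0;
                for (int i = 0; i < n; i++) s += alpha[i] * a[i][j];
                next[j] = s * b[j][obs[t]];
            }
            alpha = next;
        }
        double p = 0;
        for (double v : alpha) p += v; // sum over all final hidden states
        return p;
    }

    public static void main(String[] args) {
        // A toy two-state model with two observable symbols.
        double[] pi = {0.6, 0.4};
        double[][] a = {{0.7, 0.3}, {0.4, 0.6}};
        double[][] b = {{0.5, 0.5}, {0.1, 0.9}};
        System.out.println(evaluate(pi, a, b, new int[]{0, 1}));
    }
}
```

Recognition would then amount to running evaluate once per stored gesture HMM and choosing the model that returns the highest probability.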
HMMs are popular because they are simple and flexible (Murphy, 2002). They have been
used extensively in speech recognition due to their ability to model speech in a mathemat-
ically tractable way (Warakagoda, 1996). Beyond speech recognition, they have also been
used in gesture recognition quite successfully (Pylvänäinen, 2005).
Conditional Random Fields (CRFs) are a discriminative (as opposed to generative as the
HMMs above) method of gesture recognition (Wang, Quattoni, Morency, Demirdjian and
Darrell, 2006). It uses a ‘discriminative sequence model with a hidden state structure’(Wang
et al., 2006) and attempts to model the entire sequence with the given input sequence.
A CRF is an undirected graphical model G, which contains a number of vertices V, each
of which represents a random variable (y ∈ Y), and a number of edges, each representing a
dependency between two random variables. The distribution of each variable in the set Y
is conditioned on an input sequence X.
A CRF also contains a number of 'potential functions' derived from the idea of conditional
independence. Each potential function works on a subset of vertices, and each of these
subsets must form a clique, so as to ensure that a function does not alter a conditional
distribution over two vertices where no such distribution exists.
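In the standard formulation (a textbook statement, not quoted from the cited papers), the conditional distribution factorises over the set of cliques C of G as:

```latex
P(Y \mid X) = \frac{1}{Z(X)} \prod_{c \in C} \psi_c(Y_c, X),
\qquad
Z(X) = \sum_{Y'} \prod_{c \in C} \psi_c(Y'_c, X)
```

where the ψ_c are the potential functions and Z(X) is the normalising partition function that makes the distribution sum to one.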
The graph can be laid out in any arbitrary manner; however, it is most often set out as a
chain of vertices with an edge between each consecutive pair Y(i-1) and Y(i). This layout
allows the use of efficient algorithms for solving three problems analogous to those
described for HMMs above.
Here it is useful to note the similarities of the solutions between a CRF and an HMM. The
same method of using these solutions can be applied to a CRF to determine the gesture
attempted by the user from the observed data.
According to Wallach (2004), the advantage of CRFs over HMMs lies in their conditional
nature, which allows the relaxation of the independence assumptions required by HMMs.
The CRF method has been extended further into the two following methods:
Dynamic CRFs
Dynamic CRFs (DCRFs) are an extension of the CRF method whose structure and
parameters are repeated over a sequence (Morency, 2007). When hidden variables are
used, however, the system becomes difficult to optimise (Morency, 2007).
Latent Dynamic CRFs
Latent Dynamic CRFs (LDCRFs) are an extension of the CRF and DCRF methods which
incorporates hidden states and sub-structure to better recognise gestures (Morency, 2007).
The original testing was based on video input from a camera, attempting to recognise the
movements of a human subject. This method compared favourably to other methods such
as HMMs in the paper by Morency et al. (Morency, 2007).
As I have already decided to use an existing library to carry data from the Wiimote into
my chosen programming language, I will now take a brief look at the currently available
libraries for connecting to the Wii Remote. Libraries exist for three languages: C, C++
and Java.
2.2.1 Java
Java would be my preferred language for the project: I have used it extensively in the
past, and this experience would mean less time spent learning a new language. There are
three possible libraries for use with Java: WiiRemoteJ, Wiimote-Simple and WiiuseJ.
WiiRemoteJ
WiiRemoteJ is a library written in Java, currently hovering between beta and full release.
There is very little documentation for it to be found on the net and no apparent official
home for the system, although many videos and images can be found displaying some of
its functionality (WiiRemoteJ Technical Demo, 2009). Attempting to run the WiiRemoteJ
demo on my system caused numerous errors and did not work, making it unfit for use in
this project. It is mentioned here, however, as it is considered one of the more
'feature-rich' libraries for the Wiimote.
Wiimote-Simple
http://code.google.com/p/Wiimote-simple/
Wiimote-Simple is an open-source library designed as an alternative to WiiRemoteJ. As
its designer mentions, it has less functionality than WiiRemoteJ, but is offered as an
alternative for people who could not get WiiRemoteJ to work.
It does provide the ability to read accelerometer data and IR data and to respond to
button presses, which should be enough to use the Wiimote to interface with the system.
As with WiiRemoteJ, there is little documentation available for the implementation, and
it is also not as well supported as WiiRemoteJ.
WiiuseJ
http://code.google.com/p/wiiusej/
WiiuseJ is another library written in Java. It provides a Java API for the wiiuse system
for C (which provides C libraries for communicating with the Wiimote). The API is open
source, although the wiiuse libraries it uses are not.
It provides a great deal more functionality than Wiimote-Simple such as filters to normalise
the accelerometers and support for a number of extension controllers. It also has a wealth
of documentation and has been used in many other projects (which can be found from the
libraries website). It also seems to be well supported by its creator.
2.2.2 C++
I have never used C++ before, and this would mean that choosing any of the following
libraries would mean learning the language from scratch, this puts the library in this
section at a disadvantage, however, I will still summarise what I learnt of the system
for completeness.
WiiYourself
http://wiiyourself.gl.tter.org/
WiiYourself is a library for C++ which is quite well featured and has been used in the
development of several projects (details of which can be found on the developer's site).
The system seems well supported and works with most commercially available Bluetooth
stacks, although there does not seem to be a great deal of documentation to accompany
it. WiiYourself currently only works on Windows, which may limit the abilities of any
system created with it.
2.2.3 C
The final language on the list is C. My knowledge of C is basic but functional making C
a viable choice for the development of the system. I discovered two C libraries during my
research, CWiid and WiimoteAPI.
CWiid
http://abstrakraft.org/cwiid/
CWiid is a library developed for use in C and is quite full of features (some unique to this
implementation, such as an IR tracker and an interface for Python). Like WiiuseJ and
Wiimote-Simple it is open source and well supported (including a roadmap of future
features, as well as bugfixes).
It is currently at version 0.6.00 (as of April 2009) and, as well as being well supported
(with a small community building around it), it is fairly well documented. However, only
Linux is supported at present, with no plans to port the system to any other platform.
This would limit any system developed using it.
WiimoteAPI
http://code.google.com/p/Wiimote-api/
WiimoteAPI is a basic library for C; it contains functionality for the IR sensor and the
buttons, but seemingly not the accelerometers. The lack of accelerometer support greatly
limits the feasibility of using this library.
There is a little documentation available, but not a great deal, and support for the system
seems to have ended about two years ago (as of April 2009).
2.2.4 Decision
From looking at the available options, it has become clear that, due to my proficiency
with Java relative to the other available languages, I should use a Java-based library.
This leaves WiiRemoteJ, Wiimote-Simple and WiiuseJ.
Of these, WiiRemoteJ is not compatible with the computer I will be coding on, and
Wiimote-Simple is very basic in nature, so the best choice of library would seem to be
WiiuseJ. As such, I will use it for the development of the system.
As this project is the creation of a musical instrument, some time should be dedicated to
studying current musical instruments and how they create music. I shall now briefly go
through the ideas found in my reading on music.
With most physical instruments the player creates sound by manipulating a part of the
instrument which vibrates (NH Fletcher, 1998). This could be a string on a guitar, the air
in a trombone, the skin of a drum, and so on. The note is altered by altering the vibrating
part (holding a fret on a guitar, lengthening the tube of a trombone, tightening the skin
of a drum).
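As a concrete illustration of 'altering the vibrating part': for an ideal stretched string of length L, tension T and linear density μ, the fundamental frequency is

```latex
f_1 = \frac{1}{2L}\sqrt{\frac{T}{\mu}}
```

so holding a fret (reducing L) or tightening the string (increasing T) raises the note, exactly the kind of manipulation described above.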
This holds true for all but a few instruments: those created in the electronic age which do
not rely on the player creating the vibration themselves (Glinsky, 1992). A major example
is the Theremin, the world's first electronic instrument and the only instrument in the
world played without physically touching it (Glinsky, 1992).
The Theremin was the first instrument where the player did not have to create the
vibrations: they were created by an oscillator which the player manipulated remotely.
This is, however, an exception to the rule; the majority of musical instruments are played
by manipulating them physically (NH Fletcher, 1998).
Creating a musical instrument from a Wiimote will have to sit somewhere between these
two ideas. It will involve directly manipulating the interface (the Wiimote) to choose the
usual elements of the sound wave (i.e. the frequency and the amplitude of the note).
However, the actual notes will be created within the computer.
This means that the system will need some method of creating synthetic sounds. For this
project, this will be handled by an external process: the system will take the information
from the Wii Remote, derive from it the information it needs (gestures, note frequency,
volume etc.), and pass this to a digital synthesizer which will produce the sound to be
played.
As I see it, there are two ways to achieve this: either have the external synthesizer written
into the system so that data is passed directly into it, or have the system output an
intermediate representation which can be understood by one (or possibly several different)
synthesizers.
One such intermediary is MIDI. I feel that outputting via MIDI is the best approach for
the system, as it allows a far wider range of musical synthesis systems to be used at will.
However, MIDI uses discrete codes for notes, which restricts the instrument to certain
notes instead of allowing all possible frequencies. This is a limitation on the instrument,
but may well make it easier to play (and to code!).
Chapter 3
Design
The design of the system presented many challenges, and in this section I will discuss the
decisions which led to the final system.
One of the biggest issues throughout the development was the balancing of the feature-set
of the system with its usability. From the outset, I wanted the system to behave as much
like a musical instrument as possible. By this I envisioned the ‘computer-system’ as being
as invisible as possible, thus making the journey between the Wiimote and the sounds
produced perceivable as a single step, without any intermediate factors.
To do this effectively, it seemed to me that the software between the Wiimote and the
MIDI output would need enough features that nothing could be seen as 'missing', yet be
lightweight enough that the user could focus on playing the instrument and not on fiddling
with options and settings within the system.
This led to one of the largest questions of the project ‘What features need to be in this
system, and which should be removed?’.
The set of features that I had originally envisioned was cut down to a subset which could be
implemented and complement each other well without overburdening the user whilst they
were playing the instrument. This reduced set was more useful to the system in several
ways. It helped create a ‘cleaner’ interface where all the options could be presented to the
user on a single frame without over-complicating the design of the interface.
3.2 UI Design
To keep the software as simple as possible whilst playing the instruments, the UI (see
Fig 3.1) was reduced to a single frame for all the musical systems (a decision aided by the
choice to limit the number of features; see section 3.1, 'Features vs Playability', above).
This decision however led to a further issue, which was how to display information from
the system to the user without further complicating the interface and in such a way that
the user could ignore the messages if they needed to.
This led to the development of two new Java classes, the MessageWindow class and the
NoteWindow class (see Fig 3.2). I chose to add these windows to the system as a method
of presenting useful information to the user without interrupting the use of the system (for
example with dialog boxes).
Figure 3.2: The Message Window (left) and the Note Window (Right)
3.3 Control
As the system was intended to perform as a musical instrument, it seemed that it should
be controllable away from the computer. There were two ways this could be approached.
The first was to map the system commands to different buttons on the Wiimote, choosing
an option by pressing the corresponding button. The second was to introduce a method
of gesture recognition, such that each gesture would trigger a different option within the
system.
The idea of using separate buttons to select options was the most straightforward method
to implement, as it would take very little to set up a system which reacted based on button
presses on the Wiimote. However, this would also have removed the buttons from being
used to create music, limiting the instrumental options. A second problem with this idea
was that the Wiimote has only a limited number of buttons; once they were exhausted,
no new features could be added to the system, imposing a limitation on the function of
the system.
By comparison, using a gesture recognition system would leave the buttons free for use
as an instrument, and a much larger set of gestures could be added than there are buttons
on a Wiimote.
One issue would be the removal of the accelerometers from the musical half of the system.
This can be avoided by using the 'Nunchuk' extension controller to interpret gestures. As
such, the entire Wiimote is free to be used as an instrument, and using their off-hand,
the user can control system functions whilst still performing with the Wiimote.
3.4 Gestures
After the decision to use gesture recognition to issue system commands remotely, a new
set of choices had to be made, the most significant being 'Which method of gesture
recognition should be implemented?'. After a look at the domain of gesture recognition
(see section 2.1 above), the choices were narrowed down to two: Hidden Markov Models,
with their proven abilities for gesture recognition, or a new method which was unproven.
Hidden Markov Models were the simpler choice: implementations were already available
in Java, along with documentation covering both how to create and use them and their
effectiveness in practice. They have also been used previously in Wiimote-based gesture
recognition projects (Schlömer, Poppinga, Henze and Boll, 2008).
The new method was an idea I imagined whilst thinking about gesture capture. After
researching the field it became apparent that there were no implementations of the method
previously attempted. This option seemed more risky as the implementation may not be a
useful method of recognition (There may well be a reason why it has not been implemented
before!). Notes on the implementation of this method can be found in the next chapter.
In the end, the decision was taken to go with the unproven method. The thinking behind
this was that re-implementing an HMM would provide less worth than attempting a new
method and observing its usefulness; the chance to create something new outweighed the
rewards of taking the 'secure' option.
3.5 Instruments
The central part of the system was always the instruments. For this project, the decision
was to create an instrument for each of the inputs on the Wiimote (Requirement FI04,
Table 4.1.2).
This led to three different instruments: one using the accelerometer and orientation func-
tions, one using the buttons on the face of the Wiimote and the last using the IR camera
built into the front of the Wiimote.
The two instruments considered for the accelerometer were one using the orientation of
the Wiimote (its pitch and roll) to represent the note and the volume of the note, and one
using the
CHAPTER 3. DESIGN 18
acceleration of the Wiimote to determine the volume and the roll to determine the note.
The second instrument proved too inaccurate when choosing a volume and was also far
more tiring in practice, reducing the time it could be played for. The first method was far
less tiring and seemed more intuitive to play. Therefore, the first style of instrument was
chosen for the system.
The button input had fewer choices available to it (any way of playing it will involve
pressing the buttons to play notes); the only choice was how the buttons map to notes.
There were not enough buttons to assign one to each note in a MIDI octave (as
required by requirement FI02, table 4.2). This led to the idea of taking inspiration from
wind instruments, which use combinations of different finger positions to reach different
notes; thus a combination of buttons is used to represent each note.
The third input method was the infra-red camera mounted on the front of the Wiimote.
This picks up IR light sources and reports them as dots in an X-Y plane; these values
are passed to the system through the WiiuseJ library. There were many ways this could
have been used to create music. The two foremost (see fig 3.3) were: first, to set
up an array of IR LEDs at which the Wiimote could be pointed, using them to measure the
Wiimote’s position in space and mapping this position to a note (much as the position of a
musician’s hand near a theremin determines the note produced); second, to hold the
Wiimote in a static position and move IR sources in front of it to create music,
with the position of each LED controlling a different aspect of the music (such as volume,
pitch etc.).
In the end, the second idea was chosen. Both would require some manner of hardware
construction (i.e. building an array of LEDs); however, the hardware for the second idea
was far less complex, requiring only a cell and an LED (see circuit diagram, fig 5.7) with
no capacitors or resistors. This simplicity made it far easier (and cheaper) to build two
LED ‘instruments’ than it would have been to build an array of LEDs for the Wiimote to
‘look at’.
Figure 3.3: The two proposed IR instruments. Idea 1 (left) and the chosen method, idea 2
(right)
Chapter 4
Requirements
The project contains a number of broad areas of work. As such it seems appropriate to
order the functional requirements of the system under these headings. The identified areas
are:
1. System Requirements
2. Instrumental Requirements
3. Gesture Requirements
4. Sound Requirements
The following section will look at each of these areas separately in order to devise a set of
requirements that the system must meet.
4.1 Functional Requirements
4.1.1 System
The system requirements are those that relate to the system as a whole and not to any of
the other areas noted above. As mentioned in the literature survey, the preferable method
of connection between Java and a Wiimote was chosen to be the WiiuseJ library. As such,
one of the more straightforward requirements will be to use this library to connect the
Wiimote to the system.
Another feature that the system will require is a simple-to-use graphical interface. It
would be beneficial for the user to be able to spend as little time as possible interfacing
with the computer, in order to get the most out of playing the instrument. Because of
this a graphical interface is preferable to a text interface, as it can
CHAPTER 4. REQUIREMENTS 20
present all options to the user without the need to learn a list of commands. It also opens
the instruments up to less confident computer users.
A less important extension to the user interface would be to display musical information
to the user. A useful example would be to display the currently playing note (or MIDI
note number etc.) as the user plays, much as a digital tuner is used with ‘real’
instruments. This could then be used to help users create a tune or develop a ‘musical
ear’, expanding the possibilities of the system.
Staying with the user interface, it would also be a useful extension to provide a
window in which system events can be displayed to the user without interrupting their
musical session. This is not a high-priority requirement; however, it would be a
valuable addition to the system.
ID Description
FSy01 The system should present the user with a graphical user interface
FSy02 The system should gather Wiimote data using WiiuseJ
FSy03 The system should inform the user of the note playing
FSy04 The system should inform the user of system developments
Table 4.1: Table detailing the requirements covered in the system section.
4.1.2 Instruments
The instruments section is central to the system, and much of the rest of the
development is designed around it. In order to create a musical interface for the Wiimote, it
is a high priority that the system delivers a method capable of supporting multiple musical
instruments; by this it is meant that the system should contain some way of switching
between different methods of using the Wiimote as a tool to interface with the system.
It is also a high priority that any instruments developed for the purposes of the project are
capable of playing the entire range of notes in 12-TET tuning. This would (theoretically)
mean that any musical piece written in this tuning could be played using the Wiimote
instruments, supporting the idea of them being used as ‘real’ instruments.
A feature which is of low priority but which would enhance the functionality of the system
would be to be able to customise and create new instruments within the system. This
would allow the system to be extended with new instruments and allow users to customise
their experience of the system.
In order to make sure that the instruments created within the frame of the project cover all
aspects of the Wiimote’s input interfaces, there should be an instrument created for each
input method on the Wiimote. This is a requirement of the instruments created as part of
the project and can be discarded once the system and the project are completed. However,
in order to fully evaluate the abilities of the Wiimote and the interface, this should be a
high priority for the project.
ID Description
FI01 The system must support multiple instruments
FI02 Each instrument must be able to play notes A-G#
FI03 Users should be able to add new instruments to the system
FI04 The system should contain instruments based on all the Wiimote’s input methods
Table 4.2: Table detailing the requirements covered in the instruments section.
4.1.3 Gesture
The gesture system is another large section of the system, and a number of requirements
govern how it fits in with and interacts with the rest. The first requirement to note
is that the system MUST have some method of recognising gestures. This is of very high
priority, as the gestures will be used to control the system remotely. It should also be a
requirement that the system be controllable using the gesture recognition functionality.
One thing to note here is how much functionality the gestures should be able to control
within the system. Giving too much control could lead to users having too many gestures
to memorise and the system becoming overwhelming. Having too few undermines the
functionality of the feature. This collision of requirements can be solved by providing the
ability to control as many features of the system as possible, and at the same time allowing
the user to select which actions are performed by which gestures. This allows the user to
dictate what actions are useful to them, and how to perform them.
This connection of actions to gestures will need some user interface through which to
perform these operations, which extends the requirements to include such an interface.
Also useful would be the ability to give each gesture a ‘human-friendly’ name, to make it
simpler to differentiate between gestures in the system.
Also, if it is possible to assign gestures to actions, it should be possible to define new
gestures within the system. This would support both the preferences of the user and
future growth in the system’s functionality.
ID Description
FG01 The system must have some method of identifying gestures.
FG02 The gesture system must be able to control system functionality.
FG03 The system must be able to learn new gestures.
FG04 Gestures should have a ’friendly’ name available to the user.
FG05 The user should be able to assign actions to gestures.
FG06 The system should provide some interface for managing gestures.
Table 4.3: Table detailing the requirements covered in the gesture section.
4.1.4 Sound
The final area of the system covers the creation of sound based on the inputs from the in-
struments created. As decided in previous sections, the most straightforward method
of output was to send MIDI messages to another system which could then take care
of the generation of sound (for example, a MIDI synth or CSound). As such, the major
requirement of this section is that the system be able to connect to a MIDI device and
send it messages describing what to play, determined by the user input.
Further to this are two lower priority requirements. The first is the option to switch between
playing a single note and playing a chord. This functionality would allow a user to create
a range of different sounds and make it possible to form a lead/rhythm dynamic musically.
Leading on from this, the system could be made a more powerful tool by the addition of
basic recording functionality. This could be achieved by noting the MIDI messages sent
by the system and, at the same time as sending them, recording a copy into a MIDI
sequence file. The recording could then be played as a backing track, allowing the user a
richer musical experience. However, it is not core to the system’s functionality and as such
is a low priority.
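As a sketch of the recording idea, the standard javax.sound.midi API could be used to mirror each outgoing message into a Sequence, which can later be saved or replayed as a backing track. This is an illustration only, not part of the actual system: the class name is invented and timing is reduced to a simple tick counter.

```java
import javax.sound.midi.*;

// Illustrative recorder: a copy of every outgoing MIDI message is appended
// to a Sequence as it is sent, building up a recording of the performance.
public class RecorderSketch {
    private final Sequence sequence;
    private final Track track;
    private long tick = 0;

    public RecorderSketch() throws InvalidMidiDataException {
        sequence = new Sequence(Sequence.PPQ, 24); // 24 ticks per quarter note
        track = sequence.createTrack();
    }

    // record a copy of a message at the current tick position
    public void record(ShortMessage msg) {
        track.add(new MidiEvent(msg, tick++));
    }

    public Sequence getSequence() { return sequence; }
}
```

The resulting Sequence could then be written to a MIDI file with MidiSystem.write(), or played back through a Sequencer as the backing track described above.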
ID Description
FSo01 The system should support MIDI.
FSo02 The system should be able to record the instruments playing.
FSo03 The system should allow the playing of both chords and single notes
Table 4.4: Table detailing the requirements covered in the sound section.
4.2 Non-Functional Requirements
As always, alongside the functional requirements of the system are the non-functional
requirements. These are the requirements used to judge the operation of the system.
A major requirement here is that the delay between the playing of a note on the Wiimote
and the sounding of the note by the computer is perceived as ‘instantaneous’; by this it is
meant that there is no perceptible delay between cause and effect. This should hold under
all usual circumstances.
The second requirement is placed on the gesture recognition system. For it to be found
useful, it should be able to recognise gestures with a sufficiently high accuracy. For the
purposes of this project that accuracy has been set at 75%, as this is seen as the minimum
accuracy at which the gesture system can be deemed to work successfully.
ID Description
NF01 The system should be able to play sounds in real time
NF02 The system must recognise gestures with at least 75% accuracy
Table 4.5: Table detailing the requirements covered in the non-functional requirements
section.
Chapter 5
Implementation and Testing
As the gesture recognition system used is a new one, I will now describe how it works.
Following that, I will describe how it was implemented in Java.
The gesture recogniser is split into three distinct sections. The first captures the movements
and organises them into a state that the next section can use. The second takes the input
captured by the first and attempts to match what it has found to gestures it has learnt
previously. The third section then acts upon what the second has found.
In this project, the system knows a gesture is occurring as the user holds the ‘C’ button
on the nunchuk extension whilst they are performing the gesture. During this time all
readings coming from the accelerometers within the nunchuk are stored as ‘samples’ of the
whole gesture.
Each of these samples is assigned a direction based on the accelerometer measurements
taken at that time. All the samples are stored in an array with their directions.
When the user releases the C button, the system traverses the array and removes any
sample whose direction is the same as that of the previous sample. For a visual
representation of this procedure, see fig 5.1.
This completes the gesture capture section. The final array of directions is passed
to the recognition section. All the possible gestures are stored as an n-ary tree, where each
branch represents a possible direction in the array. The directions are used to move
through the tree until a final node is reached. This final node contains an ID of the gesture
performed to reach that node. This ID is returned from the tree and used to represent
the gesture that has been performed. If the final node has no registered ID, or the branch
CHAPTER 5. IMPLEMENTATION AND TESTING 25
which needs to be followed is null, the tree returns a ‘Gesture Not Recognised’ value. A
visual representation is found in figure 5.2.
The third section takes the ID returned from the tree and uses it to determine the action
of the system. This section is different in every system as each system will have different
actions to perform.
The recogniser was implemented in its own Java package so as to allow its reuse in future
projects. This package contains several classes, each covering a separate part of the imple-
mentation. A class diagram of the gesture recognition system can be found in appendix
A.1.
The system recognises gestures in two dimensions, the X and Z directions. Thus performing
a gesture is akin to drawing an image on a blackboard. This was done to reduce the number
of directions being used. The system recognises 9 directions (as seen in fig 5.3):
‘North’, ‘North-East’, ‘East’, ‘South-East’, ‘South’, ‘South-West’, ‘West’, ‘North-West’ and
a point representing no movement called ‘Hold’. Figure 5.3 also shows a possible expansion
of this scheme to cover 3D space.
Figure 5.3: Diagram showing the 9 directions recognised in the system (left) and a possible
set of 15 directions for use in 3-D space (right)
The AccDirectionObject class, as part of its construction, takes the X and Z acceleration
as input and decides upon the direction this represents. Each sample taken is stored in an
AccDirectionObject, and all of the AccDirectionObjects are stored, in order, in an
AccelerationArray object. This AccelerationArray object can then be thought of as the
entire gesture.
When the gesture has finished being captured, the AccelerationArray object calls its re-
moveLikeMotions() function. This removes every AccDirectionObject whose direction is
the same as the previous direction (fig 5.1, above), condensing the array down to the
series of directions which define the gesture.
Figure 5.4: Diagram showing how the contents of an AccelerationArray object represent a
gesture before and after calling the function removeLikeMotions()
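The capture stage described above can be sketched as follows. This is an illustrative reconstruction, not the project’s actual classes: the integer direction encoding, the ‘Hold’ threshold and the 45-degree sectors are assumptions.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the capture stage: quantise each accelerometer sample into one of
// the nine directions, then collapse runs of identical directions into one.
public class GestureCapture {
    // assumed encoding: 0 = Hold, then 1..8 = N, NE, E, SE, S, SW, W, NW
    public static int quantise(double ax, double az) {
        double mag = Math.sqrt(ax * ax + az * az);
        if (mag < 0.2) return 0;                      // below threshold: 'Hold'
        double deg = Math.toDegrees(Math.atan2(ax, az)); // angle from 'North'
        if (deg < 0) deg += 360.0;
        // each of the eight compass directions covers a 45-degree sector
        return 1 + (int) (((deg + 22.5) % 360.0) / 45.0);
    }

    // removeLikeMotions: drop any sample whose direction equals the previous one
    public static List<Integer> removeLikeMotions(List<Integer> samples) {
        List<Integer> out = new ArrayList<>();
        for (int d : samples) {
            if (out.isEmpty() || out.get(out.size() - 1) != d) out.add(d);
        }
        return out;
    }
}
```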
The learnt gestures are all stored in a tree; in this implementation the tree is built using
PathObject objects as nodes. Within each PathObject there is an array of PathObjects
representing the connected nodes beneath the current node. The PathObject tree is
traversed recursively by passing in an array of integers2 and following the nodes
until one of three possibilities occurs:
until one of three possibilities occurs:
1. The array is exhausted and the current PathObject contains a gesture ID. Return
the ID up the tree to the calling function.
2. The array is exhausted and the current PathObject contains no gesture ID. Return
a ‘Gesture Not Recognised’ ID to the calling function.
3. The array is not exhausted and the next required node doesn’t exist. Return a
‘Gesture Not Recognised’ ID to the calling function.
2 Each integer represents a direction; these are defined in AccDirectionObject.java
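The three traversal outcomes can be sketched as follows. This is a minimal illustration rather than the project’s PathObject code: it uses a map of children where the real implementation uses an array indexed by direction, and a simple teach() helper is included only so that the lookup can be exercised.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the recognition tree: each node maps a direction to a child node,
// and may hold a gesture ID at the point where a taught gesture ends.
public class PathNode {
    public static final int NOT_RECOGNISED = -1;
    private final Map<Integer, PathNode> children = new HashMap<>();
    private int gestureId = NOT_RECOGNISED;

    // helper: walk the direction sequence, creating nodes as needed
    public void teach(int[] dirs, int pos, int id) {
        if (pos == dirs.length) { gestureId = id; return; }
        children.computeIfAbsent(dirs[pos], d -> new PathNode())
                .teach(dirs, pos + 1, id);
    }

    // recognise: the three outcomes described above
    public int lookup(int[] dirs, int pos) {
        if (pos == dirs.length) return gestureId;    // outcomes 1 and 2
        PathNode next = children.get(dirs[pos]);
        if (next == null) return NOT_RECOGNISED;     // outcome 3
        return next.lookup(dirs, pos + 1);
    }
}
```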
Once the tree has been traversed and a gesture ID returned (either the ID of the gesture
which has been performed or the ‘Gesture Not Recognised’ ID), a GestureRecognisedEvent
is created and thrown to all GestureListeners listening to the GestureRecogniser class.
The GestureListeners are where the system decides what to do when a gesture occurs. Each
system would create a different GestureListener from the GestureListener interface.
In this system, the ListenerGestureCapture object implements GestureListener; within this
object several system events are defined. When a GestureRecognisedEvent occurs, the Ges-
tureListener receives the gesture ID and looks up the system event related to it in the
ConvArray (short for Conversion Array), which is stored in the GestureRecogniser class.
This array stores GestureObjects, which contain the gesture ID, the related system event
and a name for the gesture which can be displayed in the GUI. Once the ListenerGesture-
Capture object has received the ID of the system event, the event is performed and the
system returns to a dormant state to listen for further gestures.
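The dispatch stage can be sketched as below. The names are illustrative: the real ConvArray stores GestureObjects holding an ID, a system event and a display name, whereas this sketch reduces each entry to a Runnable.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the dispatch stage: a recognised gesture ID is looked up in a
// conversion table and the system action bound to it is performed.
public class GestureDispatch {
    // the listener interface each system implements for its own actions
    public interface GestureListener { void gestureRecognised(int gestureId); }

    private final Map<Integer, Runnable> convArray = new HashMap<>();

    // bind a system action to a gesture ID (the user-configurable mapping)
    public void bind(int gestureId, Runnable action) {
        convArray.put(gestureId, action);
    }

    // run the bound action; unrecognised or unbound IDs are simply ignored
    public void dispatch(int gestureId) {
        Runnable action = convArray.get(gestureId);
        if (action != null) action.run();
    }
}
```

Keeping the mapping in a table like this is what allows the user to reassign which gesture triggers which action, as required by FG05.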
New gestures are added by first capturing the gesture (as above, using the Acceleration-
Array object) and reducing the array in the same manner as before. Now, instead of
traversing the tree to its end, we start at the root node and move through the tree following
the directions in the array. There are four outcomes that can occur at each branch:
1. If the branch we need to follow has a PathObject node at its end, we move forward
to that node, remove the top object in the array and repeat.
2. If the branch we need to follow has a null value at its end, we create a new PathObject
object and place it at the end of the branch as a new node. Then we move forward
to that node, remove the top object in the array and repeat.
3. If we reach a node with no gesture ID and the array is exhausted then we set the
Gesture ID of that node to be the gesture being recorded.
4. If we reach a node with a gesture ID and the array is exhausted there are two possible
outcomes:
(a) The gesture ID of the node and of the current gesture are equal, therefore we
have re-recorded a gesture and no action need occur.
(b) The gesture ID of the node and of the current gesture are not equal; we flag an
error message to the user and leave the current gesture ID intact.
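These four outcomes can be sketched as follows. This is a self-contained illustration; the node structure and names are assumptions, not the project’s PathObject code.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of teaching a new gesture to the tree, following the four outcomes
// described above.
public class LearnSketch {
    static final int NO_ID = -1;

    static class Node {
        Map<Integer, Node> branches = new HashMap<>();
        int gestureId = NO_ID;
    }

    // Returns true on success, false when the path already ends in a
    // different gesture ID (outcome 4b: flag an error, leave the old ID intact).
    public static boolean teach(Node root, int[] dirs, int id) {
        Node node = root;
        for (int d : dirs) {
            // outcomes 1 and 2: follow the branch, creating the node if absent
            node = node.branches.computeIfAbsent(d, k -> new Node());
        }
        if (node.gestureId == NO_ID || node.gestureId == id) {
            node.gestureId = id;   // outcomes 3 and 4a
            return true;
        }
        return false;              // outcome 4b: conflicting gesture ID
    }
}
```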
For the system to work effectively, new gestures have to be taught to the system several
times (so that slight variations are learnt as the same gesture). In this implementation
this is all handled in the GestureRecogniser class.
5.2 Instruments
The instruments in the system are all implemented in the MWProcess class. Each instru-
ment implements a different ActionListener which receives the necessary data from the
Wiimote and calls the functions in the MWProcess class with a different function for each
instrument.
The ActionListener classes (named ListenerPitchRoll, ListenerFlute and ListenerLeds) are
attached to WiiuseJ’s Wiimote object when the instrument is being played; to switch in-
struments, the listener is swapped out for the listener of the new instrument. This method
makes it simple to direct the correct data to the functions requiring it, without creating
a single, complex listener. It also allows new instruments to be added simply in the future.
The MWProcess class contains the functions that turn the raw data input from the Wiimote
into a MIDI message output (which is sent using the MidiOut class). Each function takes
the raw data and determines a MIDI note number, which is stored in a global variable.
Each instrument then calls the function ‘play()’, which sends two MIDI messages to the
connected MIDI device: the first stops the current note, and the second starts the new
note determined by the function3. An excerpt of the play function can be found below:
Listing 5.1: function play()
// if the notes are the same, continue playing the current note
if (newNote != playingNote) {
    // if there is currently a note playing
    if (playingNote != -1) {
        ShortMessage off = null;
        try {
            off = Wiinote.midiout.createShortMessage(
                    ShortMessage.NOTE_OFF, 0, playingNote, 90);
            Wiinote.midiout.sendMSG(off);
            ...
        } catch (InvalidMidiDataException e1) {
            Wiinote.gui.msgWindow.newMessage(
                    "Midi Message contains Invalid Data", 3);
        } catch (MidiPortNotSetException e) {
            Wiinote.gui.msgWindow.newMessage("Midi Port Not Set", 3);
        }
    }
3 If the system is set to play chords then six messages are sent: three to stop the current
three notes and three to start the new three.
    // if there is a new note to play
    if (newNote != -1) {
        ShortMessage on = null;
        try {
            on = Wiinote.midiout.createShortMessage(
                    ShortMessage.NOTE_ON, 0, newNote, 90);
            Wiinote.midiout.sendMSG(on);
            ...
        } catch (InvalidMidiDataException e1) {
            e1.printStackTrace();
        } catch (MidiPortNotSetException e) {
            Wiinote.gui.msgWindow.newMessage("Midi Port Not Set", 3);
        }
    }
    // the new note is now the note playing
    playingNote = newNote;
}
5.2.1 Pitch/Roll Instrument
The Pitch/Roll instrument takes the roll of the Wiimote as the note to play, so by leaning
the Wiimote to the left or right different notes can be chosen. The volume of the current
note is determined by the pitch of the instrument, with raising the Wiimote to vertical
creating silence and lowering it to horizontal creating maximum volume.
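This mapping can be sketched as below. The angle ranges and the base MIDI note are assumptions for illustration; the project’s actual constants may differ.

```java
// Sketch of the Pitch/Roll mapping: roll selects a semitone within one octave,
// pitch scales the volume (vertical = silent, horizontal = full).
public class PitchRollSketch {
    // roll in degrees, mapped from [-90, 90] onto the octave above middle C (60)
    public static int rollToNote(double rollDeg) {
        double clamped = Math.max(-90, Math.min(90, rollDeg));
        return 60 + (int) Math.round((clamped + 90) / 180.0 * 11);
    }

    // pitch in degrees: 0 (horizontal) -> full volume 127, 90 (vertical) -> 0
    public static int pitchToVolume(double pitchDeg) {
        double clamped = Math.max(0, Math.min(90, pitchDeg));
        return (int) Math.round((1.0 - clamped / 90.0) * 127);
    }
}
```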
5.2.2 Flute Instrument
The flute instrument works by pressing a combination of buttons to produce a note. The
buttons used are ‘Up’, ‘A’, ‘B’, ‘1’ and ‘2’ (as seen in figure 5.5).
The ‘Up’ button raises the selected note by an octave, and the ‘2’ button raises the
selected note by a semitone (thus playing the sharp of the note). Table 5.1 shows the
combinations used to play each note.
5.2.3 IR Instrument
The IR instrument converts the (x, y) positions of two IR light sources in view of the
Wiimote’s IR camera into MIDI notes. The note is selected based on the x distance between
the two sources, the octave is based on the y-position of the rightmost point (i.e. the light
source in the user’s right hand) and the volume is based on the y-position of the leftmost
point (i.e. the light source in the user’s left hand). Figure 5.6 displays how these are
measured.
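A sketch of this mapping is given below, assuming the Wiimote camera’s 1024x768 coordinate range; the note span, octave span and base note are illustrative choices, not the project’s actual values.

```java
// Sketch of the IR mapping: the x distance between the two dots selects the
// note, the right dot's height the octave, the left dot's height the volume.
public class IrSketch {
    // inputs are dot coordinates in the camera's 1024x768 range;
    // returns { midiNote, volume }
    public static int[] map(int lx, int ly, int rx, int ry) {
        int dx = Math.abs(rx - lx);
        int note = (dx * 12) / 1024;         // 12 semitones across the view
        int octave = (ry * 4) / 768;         // 4 octaves of vertical range
        int volume = (ly * 127) / 768;       // left dot height -> velocity
        int midi = 36 + octave * 12 + note;  // 36 = C2, an assumed base note
        return new int[] { midi, volume };
    }
}
```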
Combination  Note
1            A
A            B
1+A          C
B            D
1+B          E
A+B          F
1+B+A        G
Table 5.1: Button combinations used to play each note
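The combinations in Table 5.1 can be expressed as a simple lookup, sketched below. The base MIDI note (A3 = 57) and the treatment of an empty combination are assumptions for illustration.

```java
// Sketch of the flute-style note lookup. The '1', 'A' and 'B' buttons select
// a natural note per Table 5.1; '2' sharpens it and 'Up' raises it an octave.
public class FluteSketch {
    // semitone offsets from A for the natural notes A, B, C, D, E, F, G
    private static final int[] OFFSETS = {0, 2, 3, 5, 7, 8, 10};

    public static int note(boolean b1, boolean bA, boolean bB,
                           boolean b2, boolean bUp) {
        // encode the combination as a 3-bit value; Table 5.1's rows happen to
        // enumerate the seven non-empty combinations in this order
        int combo = (b1 ? 1 : 0) + (bA ? 2 : 0) + (bB ? 4 : 0);
        if (combo == 0) return -1;            // no note selected
        int midi = 57 + OFFSETS[combo - 1];   // 57 = A3, an assumed base note
        if (b2)  midi += 1;                   // sharp: up one semitone
        if (bUp) midi += 12;                  // up one octave
        return midi;
    }
}
```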
Hardware
To play the IR instrument a pair of handheld IR devices had to be created. These were
made from a simple circuit composed of an AA battery, a switch and an infra-red LED.
Each was then attached to a drumstick to make the circuit more robust and to aid in
playing the instrument. An image of the drumstick controllers and a circuit diagram can
be found in figure 5.7.
5.3 Testing
Due to the split nature of the system between the gesture recogniser and the musical
interface, the testing was performed in two parts. The following sections (5.3.1 and 5.3.2)
will describe the method used for testing each section.
5.3.1 Gesture Recogniser Testing
The testing of the gesture recogniser was carried out by six volunteers. Each volunteer
was shown an image of a gesture to map into the system and asked to perform the gesture
a number of times so that the system could ‘learn’ it. The test went as follows:
This was performed for all four gestures, giving a view of how well the gesture recogniser
worked with different levels of complexity and learning.
Figure 5.7: The IR drumsticks (left), along with a circuit diagram of their design (right)
5.3.2 Instrument Testing
Using the same group of volunteers, each instrument was given to each user in turn, and
the user was given 5 minutes to ‘play’ with the instrument in order to learn how it worked.
After the 5 minutes of play time the experiment proceeded as follows:
1. Ask the user to play a specific note (C) and time their response
2. Ask the user to play another note (E) and time their response
3. Ask the user to play a third note (F#) and time their response
4. Ask the user to play a three note tune (C, E, F#) and time their response
Of the six volunteers in this test, three were musicians and three were non-musicians. This
was done purposely, to see how the instruments handled in the hands of people skilled in
playing music compared to those less skilled. Each volunteer was also asked to complete a
brief questionnaire about themselves and their experiences of the instruments. The
questions asked were:
1. Name.
2. Instruments played.
5.4 Results
The next sections display the results of the tests documented above. A critique of each
method is supplied with the added benefit of hindsight. We will begin with the gesture
recognition testing.
5.4.1 Results
Tables 5.2, 5.3, 5.4 and 5.5 show the results of the gesture recognition testing (described
in section 5.3.1). These results display the number of successful recognitions as a count
(out of 50 repetitions) and as a percentage; the final column is the mean of the six columns
of results.
Table 5.2: Successful gestures by number of repeat teachings for a 1 sided gesture
Table 5.3: Successful gestures by number of repeat teachings for a 2 sided gesture
Table 5.4: Successful gestures by number of repeat teachings for a 3 sided gesture
Table 5.5: Successful gestures by number of repeat teachings for a 4 sided gesture
Figure 5.9: Graph depicting recognition accuracy against gesture complexity after 10, 50
and 100 repetitions
It should be noted that with each volunteer, the four stages of tests (1, 2, 3 and 4 sides)
were performed within one session (usually lasting around an hour). Something noted by
almost all of the volunteers at some point was that it was quite a tiring exercise physically,
and some noted that performing the same gesture 250 times was quite tedious. This issue
with the testing method should be taken into account if the test were ever to be re-run.
There are two ways in which this issue could be seen to have affected the results. First,
the physical fatigue could have subtly changed the motion of the user and lowered the
average number of successes. This would mean that we should look (increasingly with the
later tests) at the numbers as being lower than the true count.
However, the constant repetition may also have ‘solidified’ the idea of the gesture in the
user’s mind; this would mean that the average number of successes would be higher than
usual, and that the numbers should be read as a higher than true count.
This ambiguous outcome means that without an in-depth look at the testing method we
cannot tell which interpretation should be followed. Due to the time constraints on this
project it is not possible to perform testing on the testing of the project. However, the
results taken should be close enough to what would be found in ‘real-world’ use to draw
conclusions from them.
Figure 5.6: Charts depicting the answers given in the questionnaires by the volunteers to
the questions4
Figure 5.11 displays two graphs. The first depicts the average time taken to hit each of the
three notes individually for each of the three instruments. The graph on the right shows
the average time taken to play the three note tune for each of the instruments.
I believe these results give a good idea of how difficult each instrument was to play. The
only downside I can see to this method of quantifying each instrument was the uncertainty
in the timing (which involved a human operator and a stopwatch). As such there may be a
margin of error on each side of the measurements taken. Most of this has been taken
care of by rounding the numbers to a single decimal place; the gap between each
instrument’s results is sufficiently large to make the ordering of the instruments obvious
and, as such, the remaining uncertainty can be ignored.
With hindsight, it may also have been a good idea to change the notes between the
individual-note and tune sections of the timing. It was noted by the author whilst observing
the testing process that the users were better able to play the tune after learning where the
individual notes were a moment earlier. However, this happened with all three instruments
and all six volunteers, so we can rule out any changes in timing this may have made, but
it should be borne in mind for any further testing using this method.
I believe this worked well as a method of testing the instruments against each other;
however, it can only show how easy the instruments are relative to one another. A possible
way to solve this would be to run the experiment again and include a number of ‘real’ mu-
sical instruments, as a comparison of real instruments to their synthetic siblings. This,
however, is beyond the specifications of the project. Here it is enough to know that they
work.
4 Full versions of the questionnaires can be found in appendix B.1
Figure 5.11: Graphs depicting the average times taken to hit 3 notes on each instrument
(left), and the average time to play a three-note tune on each instrument (right)
Chapter 6
Conclusions
The gesture recognition system worked reasonably well. If we look at figure 5.9 we can
see that as the complexity of the gesture increased, the system's accuracy in recognising it
decreased. By adding a line at 75% it is possible to compare the gesture recognition
ability of the system to requirement NF02 (section 4.2). We can see that the gesture
recognition abilities of the system are within the requirements only up to gestures of 2
sides. We can also assume that this trend continues: as the complexity of gestures
increases, the accuracy in recognising them will continue to fall.
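The NF02 check described above can be sketched in code. The accuracy figures used below are illustrative placeholders, not the measured values from figure 5.9; only the 75% threshold itself comes from the requirement.

```java
// Illustrative check of requirement NF02: find the highest gesture
// complexity (number of sides) whose recognition accuracy still meets
// the 75% threshold. Accuracy values here are placeholders.
public class Nf02Check {

    static final double THRESHOLD = 0.75;

    // index i holds the accuracy for a gesture with (i + 1) sides
    static int maxSidesWithinRequirement(double[] accuracyBySides) {
        int maxSides = 0;
        for (int i = 0; i < accuracyBySides.length; i++) {
            if (accuracyBySides[i] >= THRESHOLD) {
                maxSides = i + 1;
            } else {
                // accuracy falls with complexity, so stop at the first failure
                break;
            }
        }
        return maxSides;
    }

    public static void main(String[] args) {
        double[] accuracy = {0.95, 0.80, 0.65, 0.50}; // placeholder values
        System.out.println(maxSidesWithinRequirement(accuracy));
    }
}
```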
If we look at the relationship between the number of repetitions and accuracy in figure 5.9,
we can see that the accuracy of the system increases after each set of repetitions. We can
also see that the 2-sided gesture only reaches an average of more than 75% accuracy after
it has been trained 100 times. With another round of training, the 3-sided gesture may
also cross the 75% threshold.
It is worth noting that this may hold true for any gesture: after an ever-greater amount
of training, any more complex gesture might be recognised with significant accuracy.
Therefore, I believe it is possible to call the gesture recogniser a success, if only a 'limited
success': it works well, but for it to work well with a more complex range of gestures a
great deal of training must be performed.
The instrumental section of the project was, in my opinion, a greater success. Each of
the three instruments provided a useful interface to the system and all were able to
play simple musical pieces (and, with a little more practice from the operator, had the
potential to play more complex pieces).
We can see that the Pitch/Roll instrument needs to be improved to remove the 'overhead'
of playing multiple notes (see fig 5.11, where it was fastest at playing individual notes but
slowed remarkably when attempting to chain notes together) and also to be made simpler
to play, as evidenced in figure 5.6, where most users regarded it as both the most difficult
to play and their least favourite overall. Many of the users regarded it as "too inaccurate" or "too
42
CHAPTER 6. CONCLUSIONS 43
track could also be controlled using gestures to perform actions upon it (such as telling the
system to repeat a bar or skip forward a bar mid-song). These extensions of the project
would build on the possibilities of the system by providing further functionality useful to
musicians.
Appendix A
Design Diagrams
APPENDIX A. DESIGN DIAGRAMS 48
Appendix B
Raw Results Output
Pitch/Roll Results
timed note (C): 1.3s
timed note (E): 1.3s
timed note (f#): 1.2s
tune (c, e, f#): 4.8s
Flute Results
timed note (C): 2.3s
timed note (E): 2.5s
timed note (f#): 2.0s
tune (c, e, f#): 5.9s
IR Results
timed note (C): 1.6s
timed note (E): 1.5s
timed note (f#): 1.7s
tune (c, e, f#): 5.5s
APPENDIX B. RAW RESULTS OUTPUT
Name: Paul
Instruments Played Guitar/Keyboard
Years Played Guitar - 9 Years, Keys - 6 Months
Favourite Wiinote Instrument (+ Why)
Flute: it felt like a real instrument and the notes were easier to find.
Easiest Wiinote Instrument to Play (+ Why)
Flute: See Above
Least Favourite Wiinote Instrument (+ Why)
IR: It was difficult to get right, seemed more like a gimmick to have the IR sensor working.
Most Difficult Wiinote Instrument to Play (+ Why)
Pitch/Roll: Finding the notes with any accuracy was too difficult.
Pitch/Roll Results
timed note (C): 1.1s
timed note (E): 1.0s
timed note (f#): 1.1s
tune (c, e, f#): 4.5s
Flute Results
timed note (C): 2.1s
timed note (E): 2.3s
timed note (f#): 1.9s
tune (c, e, f#): 5.4s
IR Results
timed note (C): 1.3s
timed note (E): 1.5s
timed note (f#): 1.6s
tune (c, e, f#): 5.0s
Name: Daniel
Instruments Played Drums
Years Played 6 Years(On and Off)
Favourite Wiinote Instrument (+ Why)
The Flute - Nicest to play, less messing around!
Easiest Wiinote Instrument to Play (+ Why)
The Flute - it was easier to hit the correct notes once I’d got used to where they were
Least Favourite Wiinote Instrument (+ Why)
The Pitch Roll Instrument - Playing it too long made my wrist ache! and it was tough to find the notes.
Most Difficult Wiinote Instrument to Play (+ Why)
The Pitch Roll Instrument - It was tough to find the right notes.
Pitch/Roll Results
timed note (C): 1.3s
timed note (E): 1.2s
timed note (f#): 1.4s
tune (c, e, f#): 5.7s
Flute Results
timed note (C): 2.5s
timed note (E): 2.6s
timed note (f#): 2.0s
tune (c, e, f#): 5.8s
IR Results
timed note (C): 1.6s
timed note (E): 1.6s
timed note (f#): 1.7s
tune (c, e, f#): 5.7s
Name: Ellie
Instruments Played None
Years Played n/a
Favourite Wiinote Instrument (+ Why)
The IR Instrument was my favourite, it was fun to play around with!
Easiest Wiinote Instrument to Play (+ Why)
The IR Instrument was the easiest, after I found where the notes were it was quite easy to play them again!
Least Favourite Wiinote Instrument (+ Why)
The Pitch/Roll Instrument wasn’t nice to play, it was too awkward
Most Difficult Wiinote Instrument to Play (+ Why)
The Pitch/Roll Instrument as the notes were too close together, which made them difficult to get right.
Pitch/Roll Results
timed note (C): 2.0s
timed note (E): 1.8s
timed note (f#): 1.9s
tune (c, e, f#): 7.9s
Flute Results
timed note (C): 2.8s
timed note (E): 2.8s
timed note (f#): 2.2s
tune (c, e, f#): 6.0s
IR Results
timed note (C): 1.9s
timed note (E): 2.0s
timed note (f#): 2.1s
tune (c, e, f#): 6.4s
Name: Liam
Instruments Played None
Years Played n/a
Favourite Wiinote Instrument (+ Why)
IR Instrument: It was quite fun using the drumsticks to play it.
Easiest Wiinote Instrument to Play (+ Why)
IR Instrument: It was easiest to figure out where different notes were
Least Favourite Wiinote Instrument (+ Why)
Flute Instrument: The Button Combinations made it too difficult to play
Most Difficult Wiinote Instrument to Play (+ Why)
Flute Instrument: Remembering all the button combinations was too much.
Pitch/Roll Results
timed note (C): 1.5s
timed note (E): 1.3s
timed note (f#): 1.5s
tune (c, e, f#): 6.2s
Flute Results
timed note (C): 3.0s
timed note (E): 3.4s
timed note (f#): 2.6s
tune (c, e, f#): 6.8s
IR Results
timed note (C): 1.7s
timed note (E): 1.8s
timed note (f#): 1.9s
tune (c, e, f#): 6.1s
Name: Liz
Instruments Played None
Years Played n/a
Favourite Wiinote Instrument (+ Why)
The IR Instrument was good! It was very different!
Easiest Wiinote Instrument to Play (+ Why)
The Flute Instrument was easiest to get the notes right with!
Least Favourite Wiinote Instrument (+ Why)
The Pitch/Roll Instrument wasn't much fun after the first couple of minutes!
Most Difficult Wiinote Instrument to Play (+ Why)
The Pitch/Roll Instrument had too many notes on it to make it easy to play.
Pitch/Roll Results
timed note (C): 2.1s
timed note (E): 1.9s
timed note (f#): 2.0s
tune (c, e, f#): 8.3s
Flute Results
timed note (C): 2.7s
timed note (E): 2.9s
timed note (f#): 2.1s
tune (c, e, f#): 6.4s
IR Results
timed note (C): 2.1s
timed note (E): 1.9s
timed note (f#): 2.0s
tune (c, e, f#): 6.5s
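For reference, the per-instrument averages plotted in figure 5.11 can be recomputed from the result sheets above. This is a minimal sketch; the array holds the six Pitch/Roll "timed note (C)" values transcribed from the tables in this appendix, and the rounding mirrors the one-decimal-place convention used for the reported timings.

```java
// Sketch of recomputing the figure 5.11 averages from the raw sheets
// in appendix B. The array holds the six Pitch/Roll "timed note (C)"
// results transcribed from the tables above.
public class ResultAverages {

    static double mean(double[] times) {
        double sum = 0;
        for (double t : times) {
            sum += t;
        }
        return sum / times.length;
    }

    // round to one decimal place, as done for the reported timings
    static double roundToOneDp(double value) {
        return Math.round(value * 10.0) / 10.0;
    }

    public static void main(String[] args) {
        double[] pitchRollNoteC = {1.3, 1.1, 1.3, 2.0, 1.5, 2.1};
        System.out.println(roundToOneDp(mean(pitchRollNoteC)));
    }
}
```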
APPENDIX C. A FINAL VIEW OF THE SYSTEM REQUIREMENTS 58
Functional Requirements
System Requirements
ID Description Complete?
FSy01 The system should present the user with a graphical user interface Yes
FSy02 The system should gather Wiimote data using WiiuseJ Yes
FSy03 The system should inform the user of the note playing Yes
FSy04 The system should inform the user of system developments Yes
Instrument Requirements
ID Description Complete?
FI01 The system must support multiple instruments Yes
FI02 Each instrument must be able to play notes A-G# Yes
FI03 Users should be able to add new instruments to the system No
FI04 The system should contain instruments based on all the Wiimote's input methods Yes
Gesture Requirements
ID Description Complete?
FG01 The system must have some method of identifying gestures Yes
FG02 The gesture system must be able to control system functionality Yes
FG03 The system must be able to learn new gestures Yes
FG04 Gestures should have a 'friendly' name available to the user Yes
FG05 The user should be able to assign actions to gestures Yes
FG06 The system should provide some interface for managing gestures Yes
Sound Requirements
ID Description Complete?
FSo01 The system should support MIDI Yes
FSo02 The system should be able to record the instruments playing No
FSo03 The system should allow the playing of both chords and single notes Yes
Non-Functional Requirements
ID Description Complete?
NF01 The system should be able to play sounds in real time Yes
NF02 The system must recognise gestures with at least 75% accuracy Partial
Table C.1: Table containing the final status of the system requirements.
Appendix D
Code
APPENDIX D. CODE
D.1 wiinote.engine.ListenerFlute.java

/**
 * A Listener for the Flute instrument. Takes the Button Data from the
 * Wiimote and passes it to the MWProcess object.
 * <p>
 * Whilst the wiimote object is attached to this the flute instrument
 * will be active. Add this to the wiimote's action listeners to use
 * the flute instrument and remove it to stop using it.
 * <p>
 * 05-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class ListenerFlute implements WiimoteListener {

    @Override
    public void onButtonsEvent(WiimoteButtonsEvent arg0) {
        if (arg0.isButtonAPressed()
                || arg0.isButtonBPressed()
                || arg0.isButtonUpPressed()
                || arg0.isButtonOnePressed()
                || arg0.isButtonTwoPressed()
                || arg0.isButtonAJustReleased()
                || arg0.isButtonBJustReleased()
                || arg0.isButtonUpJustReleased()
                || arg0.isButtonOneJustReleased()
                || arg0.isButtonTwoJustReleased()) {
            Wiinote.process.ButtonstoMidi(arg0.getButtonsHeld());
        }
    }
}

D.2 wiinote.engine.ListenerLeds.java

/**
 * A Listener for the LED based instrument. Takes the LED Data from the
 * Wiimote and passes it to the MWProcess object.
 * <p>
 * Whilst the wiimote object is attached to this the IR instrument will
 * be active. Add this to the wiimote's action listeners to use the IR
 * instrument and remove it to stop using it.
 * <p>
 * 05-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class ListenerLeds implements WiimoteListener {

    @Override
    public void onIrEvent(IREvent arg0) {
        IRSource[] pts = arg0.getIRPoints();
        if (pts.length == 2) {
            Wiinote.process.LedsToMidi(pts[0].getX(), pts[0].getY(),
                    pts[1].getX(), pts[1].getY());
        }
    }
}

D.3 wiinote.engine.ListenerPitchRoll.java

/**
 * A Listener for the Pitch/Roll instrument. Takes the Accelerometer
 * Data from the Wiimote and passes it to the MWProcess object.
 * <p>
 * Whilst the wiimote object is attached to this the Pitch/Roll
 * instrument will be active. Add this to the wiimote's action
 * listeners to use the Pitch/Roll instrument and remove it to stop
 * using it.
 * <p>
 * 05-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class ListenerPitchRoll implements WiimoteListener {

    @Override
    public void onMotionSensingEvent(MotionSensingEvent arg0) {
        Orientation o = arg0.getOrientation();
        Wiinote.process.motionToMidi(o.getPitch(), o.getRoll());
    }
}

D.4 wiinote.engine.MidiOut.java

/**
 * Contains a number of MIDI helper functions which are used to connect
 * to a MIDI Device and to build and send MIDI Messages to it.
 * <p>
 * The aim of this class is to keep all MIDI functions together in one
 * class to increase re-use and reduce the work in altering the MIDI
 * functions of the Project.
 * <p>
 * 05-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class MidiOut {

    private MidiDevice md;

    // ...
        return myMsg;
    // ...
        sendMSG(createShortMessage(ShortMessage.NOTE_OFF, 0, 62, 90));
        sendMSG(createShortMessage(ShortMessage.NOTE_ON, 0, 64, 90));
        hold(200);
    // ...

    public void hold(int time) {
        try {
            Thread.sleep(time);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}

D.5 wiinote.engine.MWProcess.java

/**
 * Contains many functions for converting different types of input into
 * MIDI Commands. All the functions set the global variable newNote,
 * which is the MIDI note number to play, and then call the function
 * play(), which handles the sending of MIDI messages using the MidiOut
 * class.
 * <p>
 * Any new code for instruments should be added here and use the play
 * class to handle MIDI messages. The sending of a Volume Message in
 * the IR instrument should be moved to play() eventually too!
 * <p>
 * @see MidiOut
 * <p>
 * 05-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class MWProcess {

    public static final int NOTE_A = 9;
    public static final int NOTE_ASHP = 10;
    public static final int NOTE_B = 11;
    public static final int NOTE_C = 12;
    public static final int NOTE_CSHP = 13;
    public static final int NOTE_D = 14;
    public static final int NOTE_DSHP = 15;
    public static final int NOTE_E = 16;
    public static final int NOTE_F = 17;
    public static final int NOTE_FSHP = 18;
    public static final int NOTE_G = 19;
    public static final int NOTE_GSHP = 20;

    public int newNote;
    public int octave;
    public int outType;
    public int playingNote;
    public int volume = 90;

    public void ButtonstoMidi(short buttonsHeld) {
        // ...
        /*
         * the binary string representing the buttons is as follows:
         * +UDRL??-AB12  Where: U = Up, D = Down, R = Right, L = Left,
         * ? = unknown
         */
        if (binButtons.equals("0000000000000")) {
            newNote = -1;
        } else if (binButtons.equals("0000000000001")) {
            newNote = -1;
        } else if (binButtons.equals("0000000000010")) {
            newNote = (octave * 12) + NOTE_A;
        } else if (binButtons.equals("0000000000011")) {
            newNote = (octave * 12) + NOTE_ASHP;
        } else if (binButtons.equals("0000000001000")) {
            newNote = (octave * 12) + NOTE_B;
        } else if (binButtons.equals("0000000001001")) {
            newNote = (octave * 12) + NOTE_B;
        } else if (binButtons.equals("0000000001010")) {
            newNote = (octave * 12) + NOTE_C;
        } else if (binButtons.equals("0000000001011")) {
            newNote = (octave * 12) + NOTE_CSHP;
        } else if (binButtons.equals("0000000000100")) {
            newNote = (octave * 12) + NOTE_D;
        } else if (binButtons.equals("0000000000101")) {
            newNote = (octave * 12) + NOTE_DSHP;
        } else if (binButtons.equals("0000000000110")) {
            newNote = (octave * 12) + NOTE_E;
        } else if (binButtons.equals("0000000000111")) {
            newNote = (octave * 12) + NOTE_E;
        } else if (binButtons.equals("0000000001100")) {
            newNote = (octave * 12) + NOTE_F;
        } else if (binButtons.equals("0000000001101")) {
            newNote = (octave * 12) + NOTE_FSHP;
        } else if (binButtons.equals("0000000001110")) {
            newNote = (octave * 12) + NOTE_G;
        } else if (binButtons.equals("0000000001111")) {
            newNote = (octave * 12) + NOTE_GSHP;
        } else if (binButtons.equals("0100000000000")) {
            newNote = -1;
        } else if (binButtons.equals("0100000000001")) {
            newNote = -1;
        } else if (binButtons.equals("0100000000010")) {
            newNote = ((octave + 1) * 12) + NOTE_A;
        // ... (the remaining "01..." combinations, with Up held,
        // repeat the pattern above one octave higher)
        } else if (binButtons.equals("0100000001111")) {
            newNote = ((octave + 1) * 12) + NOTE_GSHP;
        } else {
        }

        play();
    }

    public int getOctave() {
        return octave;
    }

    public int getOutType() {
        return outType;
    }

    public int getPlayingNote() {
        return playingNote;
    }

    public int getPlusNumberFromNoteNumber(int note) {
        if (note % 12 == 11) {
            return 2;
        }
        if (note % 12 == 4) {
            return 2;
        }
        return 3;
    }

    public void IRToMidi(int x, int y) {
        int Note = (x / 800 * 100);
        Note = Note * 127;
    }

    public void LedsToMidi(int x1, int y1, int x2, int y2) {
        x1 = 1023 - x1;
        x2 = 1023 - x2;

        int newvolume = 0;
        if (y2 >= 600) {
            newvolume = 100;
        } else if (y2 >= 500 && y2 < 600) {
            newvolume = 90;
        } else if (y2 >= 400 && y2 < 500) {
            newvolume = 75;
        } else if (y2 >= 300 && y2 < 400) {
            newvolume = 60;
        } else if (y2 >= 200 && y2 < 300) {
            newvolume = 45;
        } else if (y2 >= 100 && y2 < 200) {
            newvolume = 30;
        } else if (y2 >= 000 && y2 < 100) {
            newvolume = 0;
        }

        if (newvolume != volume) {
            volume = newvolume;
            try {
                ShortMessage vol = Wiinote.midiout.createShortMessage(
                        ShortMessage.CONTROL_CHANGE, 0x07, volume);
                Wiinote.midiout.sendMSG(vol);
            } catch (InvalidMidiDataException e) {
                e.printStackTrace();
            } catch (MidiPortNotSetException e) {
                e.printStackTrace();
            }
        }

        int dist = x2 - x1;
        newNote = -1;

        if (dist > 1000) {
        } else if (dist > 825) {
            newNote = (locOctave * 12) + NOTE_A;
        } else if (dist > 750) {
            newNote = (locOctave * 12) + NOTE_ASHP;
        } else if (dist > 675) {
            newNote = (locOctave * 12) + NOTE_B;
        } else if (dist > 600) {
            newNote = (locOctave * 12) + NOTE_C;
        } else if (dist > 525) {
            newNote = (locOctave * 12) + NOTE_CSHP;
        } else if (dist > 450) {
            newNote = (locOctave * 12) + NOTE_D;
        } else if (dist > 375) {
            newNote = (locOctave * 12) + NOTE_DSHP;
        } else if (dist > 300) {
            newNote = (locOctave * 12) + NOTE_E;
        } else if (dist > 225) {
            newNote = (locOctave * 12) + NOTE_F;
        } else if (dist > 150) {
            newNote = (locOctave * 12) + NOTE_FSHP;
        } else if (dist > 75) {
            newNote = (locOctave * 12) + NOTE_G;
        } else if (dist > 0) {
            newNote = (locOctave * 12) + NOTE_GSHP;
        }
        // ...
    }

    /**
     * Increases a binary number to the right length by adding zeros to
     * the front of it.
     *
     * @param binaryStr the binary str
     * @param Len the len
     *
     * @return the string
     */
    public String makeBinaryLength(String binaryStr, int Len) {
        int currLen = binaryStr.length();
        int toAdd = Len - currLen;
        String tempStr = "";
        for (int i = 0; i < toAdd; i++) {
            tempStr += "0";
        }
        tempStr = tempStr + binaryStr;
        return tempStr;
    }

    public void motionToMidi(float pitch, float roll) {
        newNote = -1; // silence
        if (pitch > -45 && pitch < 45) {
            if (roll < -90) {
                // Silence
            } else if (roll < -70 && roll > -90) {
                newNote = (octave * 12) + NOTE_G;
            } else if (roll < -50 && roll > -70) {
                newNote = (octave * 12) + NOTE_F;
            } else if (roll < -30 && roll > -50) {
                newNote = (octave * 12) + NOTE_E;
            } else if (roll < -10 && roll > -30) {
                newNote = (octave * 12) + NOTE_D;
            } else if (roll < 10 && roll > -10) {
                newNote = (octave * 12) + NOTE_C;
            } else if (roll < 30 && roll > 10) {
                newNote = (octave * 12) + NOTE_B;
            } else if (roll < 50 && roll > 30) {
                newNote = (octave * 12) + NOTE_A;
            } else if (roll < 70 && roll > 50) {
                // Silence
            } else if (roll < 90 && roll > 70) {
                // Silence
            } else if (roll > 90) {
                // Silence
            }
        }

        play();
    }

    public void play() {
        if (outType == 1) {
            playSingle();
        } else if (outType == 2) {
            playChord();
        }
    }

    public void playSingle() {
        // ...
        playingNote = newNote;
    }
}
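The pattern strings matched in ButtonstoMidi are produced by zero-padding the binary form of the buttons-held value to 13 characters, as the makeBinaryLength helper does. The self-contained sketch below illustrates that padding step; the bit layout (the "2" button in the lowest bit, per the "+UDRL??-AB12" comment) is an assumption for illustration, not a confirmed detail of WiiuseJ's encoding.

```java
// Sketch of the padding step used by the flute instrument: a
// buttons-held value becomes a 13-character binary string that can be
// matched against the "+UDRL??-AB12" patterns in MWProcess.
// The bit layout used in main() is an illustrative assumption.
public class ButtonMask {

    // mirror of MWProcess.makeBinaryLength: left-pad with zeros
    static String makeBinaryLength(String binaryStr, int len) {
        StringBuilder padded = new StringBuilder();
        for (int i = binaryStr.length(); i < len; i++) {
            padded.append('0');
        }
        return padded.append(binaryStr).toString();
    }

    static String toBinButtons(int buttonsHeld) {
        return makeBinaryLength(Integer.toBinaryString(buttonsHeld), 13);
    }

    public static void main(String[] args) {
        // a held "2" button, assumed to occupy the lowest bit
        System.out.println(toBinButtons(0b0000000000010));
    }
}
```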
D.6 wiinote.engine.Wiinote.java

/**
 * This class starts up the Wiinote system; it loads the objects that
 * comprise the 4 main sections of the system: The Processes, The Midi
 * System, The Gesture Recognition System and the GUI. This then starts
 * the system.
 * <p>
 * 05-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class Wiinote {
    // ...
}

D.7 wiinote.gesture.AccDirectionObject.java

/**
 * 05-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class AccDirectionObject {

    public static final float ACC_THRESHOLD_X = (float) 0.3;
    public static final float ACC_THRESHOLD_Z = (float) 0.3;

    public static final int DIR_HOLD = 0;
    public static final int DIR_N = 1;
    public static final int DIR_NE = 2;
    public static final int DIR_E = 3;
    public static final int DIR_SE = 4;
    public static final int DIR_S = 5;
    public static final int DIR_SW = 6;
    public static final int DIR_W = 7;
    public static final int DIR_NW = 8;

    public String getDirectionString() {
        return AccDirectionObject.intToDirectionStr(direction);
    }

    private int learnDirection(MotionSensingEvent m) {
        GForce gf = m.getGforce();
        float x = gf.getX(); // horizontal
        float z = gf.getZ(); // vertical
        z = (float) (z - 0.2);
        int returnInt = -1;

        if (x > -ACC_THRESHOLD_X && x < ACC_THRESHOLD_X
                && z > ACC_THRESHOLD_Z) { // North!
            returnInt = DIR_N;
        } else if (x > ACC_THRESHOLD_X && z > ACC_THRESHOLD_Z) { // NE
            returnInt = DIR_NE;
        } else if (x > ACC_THRESHOLD_X && z > -ACC_THRESHOLD_Z
                && z < ACC_THRESHOLD_Z) { // East!
            returnInt = DIR_E;
        } else if (x > ACC_THRESHOLD_X && z < -ACC_THRESHOLD_Z) { // SE
            returnInt = DIR_SE;
        } else if (x > -ACC_THRESHOLD_X && x < ACC_THRESHOLD_X
                && z < -ACC_THRESHOLD_Z) { // South
            returnInt = DIR_S;
        } else if (x < -ACC_THRESHOLD_X && z < -ACC_THRESHOLD_Z) { // SW
            returnInt = DIR_SW;
        } else if (x < -ACC_THRESHOLD_X && z > -ACC_THRESHOLD_Z
                && z < ACC_THRESHOLD_Z) { // West!
            returnInt = DIR_W;
        } else if (x < -ACC_THRESHOLD_X && z > ACC_THRESHOLD_Z) { // NW
            returnInt = DIR_NW;
        } else if (x > -ACC_THRESHOLD_X && x < ACC_THRESHOLD_X
                && z > -ACC_THRESHOLD_Z && z < ACC_THRESHOLD_Z) { // Hold!
            returnInt = DIR_HOLD;
        }
        return returnInt;
    }
}
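The condensing performed by removeLikeMotions in listing D.8 amounts to collapsing runs of repeated directions and discarding erroneous samples (direction -1), leaving the sequence of distinct direction vectors that make up a gesture. The same idea on plain ints, as a self-contained sketch:

```java
import java.util.ArrayList;
import java.util.List;

// Self-contained sketch of the condensing idea behind
// AccelerationArray.removeLikeMotions: collapse runs of repeated
// directions and drop erroneous samples (direction -1).
public class CondenseDirections {

    static List<Integer> condense(int[] samples) {
        List<Integer> result = new ArrayList<>();
        for (int dir : samples) {
            if (dir == -1) {
                continue; // erroneous sample, skip it
            }
            if (result.isEmpty() || result.get(result.size() - 1) != dir) {
                result.add(dir); // direction changed: start of a new vector
            }
        }
        return result;
    }

    public static void main(String[] args) {
        int[] raw = {1, 1, -1, 1, 3, 3, 3, -1, 5, 5};
        System.out.println(condense(raw));
    }
}
```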
D.8 wiinote.gesture.AccelerationArray.java d i r = l i . next ( ) ;
i f ( d i r . g e t D i r e c t i o n ( ) != r e p l a c e A r r a y . g e t ( l a s t I n d e x )
. getDirection () ) {
i f ( d i r . g e t D i r e c t i o n ( ) != −1) {
/∗ ∗ r e p l a c e A r r a y . add ( d i r ) ;
∗ The A c c e l l e r a t i o n A r r a y o b j e c t g a t h e r s t o g e t h e r a number o f }
∗ AccDirectionObjects to provide a sampling of a movement o f }
the wiimote . }
∗ There a r e a l s o f u n c t i o n s f o r m a n i p u l a t i n g t h i s data , such as
the motionArray = r e p l a c e A r r a y ;
∗ ’ removeLikeMotions ’ f u n c t i o n which i s r e q u i r e d as part of the }
∗ Gesture Recognition Process .
∗ <p> public i n t [ ] t o I n t A r r a y ( ) {
∗ @see A c c D i r e c t i o n O b j e c t i n t [ ] i n t s = new i n t [ m o t i o n A r r a y . s i z e ( ) ] ;
67
∗ <p>
APPENDIX D. CODE
int count = 0 ; ∗ Find g e s t u r e o b j e c t .
for ( A c c D i r e c t i o n O b j e c t a : motionArray ) { ∗
i n t s [ count ] = a . g e t D i r e c t i o n ( ) ; ∗ @param i d t h e i d
c o u n t ++; ∗
} ∗ @return t h e g e s t u r e o b j e c t
∗/
return i n t s ; public G e s t u r e O b j e c t f i n d G e s t u r e O b j e c t ( i n t id ) {
} f o r ( G e s t u r e O b j e c t ob : c on v A r r a y ) {
i f ( ob . g e t G e s t u r e I D ( ) == i d ) {
public S t r i n g t o S t r i n g ( ) { return ob ;
S t r i n g s t r = new S t r i n g ( ) ; }
}
for ( A c c D i r e c t i o n O b j e c t a : motionArray ) { return n u l l ;
str = str + a . getDirectionString () + ” ” ; }
}
/∗ ∗
return s t r ; ∗ Find g e s t u r e o b j e c t .
} ∗
} ∗ @param i d t h e i d
∗
∗ @return t h e g e s t u r e o b j e c t
∗/
D.9 wiinote.gesture.ConvArray.java public G e s t u r e O b j e c t f i n d G e s t u r e O b j e c t ( S t r i n g
f o r ( G e s t u r e O b j e c t ob : c on v A r r a y ) {
id ) {
i f ( ob . getGestureName ( ) == i d ) {
return ob ;
/∗ ∗ }
∗ H o l d s an a r r a y o f G e s t u r e O b j e c t s w i t h f u n c t i o n s to search }
t h r o u g h them . return n u l l ;
∗ <p> }
∗ 05−Apr −2009
∗ /∗ ∗
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class ConvArray implements Serializable {

    /** The Constant serialVersionUID. */
    private static final long serialVersionUID = 1849777868583837139L;

    /** The conv array. */
    public ArrayList<GestureObject> convArray;

    /**
     * Instantiates a new conv array.
     */
    public ConvArray() {
        convArray = new ArrayList<GestureObject>();
    }

    /**
     * Adds the.
     *
     * @param ob the ob
     */
    public void add(GestureObject ob) {
        convArray.add(ob);
    }

    /**
     * Gets the.
     *
     * @param pos the pos
     *
     * @return the gesture object
     */
    public GestureObject get(int pos) {
        return convArray.get(pos);
    }

    /**
     * Len conv array.
     *
     * @return the int
     */
    public int lenConvArray() {
        return convArray.size();
    }

    /**
     * List names.
     *
     * @return the string[]
     */
    public String[] listNames() {
        ArrayList<String> names = new ArrayList<String>();

        if (convArray.size() == 0) {
            String[] returnVals;
            returnVals = new String[0];
            return returnVals;
        }

        for (GestureObject go : convArray) {
            names.add(go.getGestureName());
        }
        return names.toArray(new String[0]);
    }

    /**
     * Sets the gesture response.
     *
     * @param str the str
     * @param resp the resp
     *
     * @return the int
     */
    public int setGestureResponse(String str, int resp) {
        for (GestureObject ob : convArray) {
            if (ob.getGestureName().equals(str)) {
                ob.setGestureResult(resp);
                return 1;
            }
        }
        return 0;
    }
}

D.10 wiinote.gesture.GestureObject.java

/**
 * A GestureObject holds a unique gesture ID along with the 'friendly'
 * name of the gesture and the action that should occur when the gesture
 * happens.
 * <p>
 * 05-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class GestureObject implements Serializable {

    private int gestureID;
    private String gestureName;
    private int gestureResult;

    public int getGestureID() {
        return gestureID;
    }

    public String getGestureName() {
        return gestureName;
    }

    public int getGestureResult() {
        return gestureResult;
    }

    public void setGestureID(int gestureID) {
        this.gestureID = gestureID;
    }

    public void setGestureName(String gestureName) {
        this.gestureName = gestureName;
    }

    public void setGestureResult(int gestureResult) {
        this.gestureResult = gestureResult;
    }
}

D.11 wiinote.gesture.GestureRecognisedEvent.java

/**
 * Thrown by the GestureRecogniser when a Gesture has occurred.
 * <p>
 * 05-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class GestureRecognisedEvent extends ActionEvent {

    private static final long serialVersionUID = -7617671388195703967L;

    private int gestureFound;

    public GestureRecognisedEvent(Object arg0, int arg1, String arg2,
            int Gesture) {
        super(arg0, arg1, arg2);
        gestureFound = Gesture;
    }

    public int getGestureFound() {
        return gestureFound;
    }
}
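The event dispatch above follows the standard Swing custom-event pattern: an event subclass carrying the recognised gesture ID, a listener interface, and a source object that fires events through an `EventListenerList`. The following is a minimal standalone sketch of that same pattern; the `Demo*` names are illustrative and not part of the Wiinote sources.

```java
import java.awt.event.ActionEvent;
import java.util.EventListener;
import javax.swing.event.EventListenerList;

// An event carrying the recognised gesture ID, as GestureRecognisedEvent does.
class DemoEvent extends ActionEvent {
    private final int gestureFound;

    DemoEvent(Object source, int id, String command, int gesture) {
        super(source, id, command);
        gestureFound = gesture;
    }

    int getGestureFound() {
        return gestureFound;
    }
}

// The listener interface a handler implements (cf. GestureListener).
interface DemoListener extends EventListener {
    void onGestureEvent(DemoEvent evt);
}

// The event source: registers listeners and dispatches to each of them.
class DemoSource {
    private final EventListenerList listeners = new EventListenerList();

    void addListener(DemoListener l) {
        listeners.add(DemoListener.class, l);
    }

    // Fire one event to every registered listener, as notifyEventListeners does.
    void fire(int gesture) {
        DemoEvent evt = new DemoEvent(this, gesture,
                Integer.toString(gesture), gesture);
        for (DemoListener l : listeners.getListeners(DemoListener.class)) {
            l.onGestureEvent(evt);
        }
    }
}
```

Registering a listener and calling `fire(5)` delivers a `DemoEvent` whose `getGestureFound()` returns 5 to every registered handler.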
D.12 wiinote.gesture.GestureRecogniser.java

/**
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class GestureRecogniser {

    public AccelerationArray AccArr;
    public int addEndNum;
    public boolean addNewEnd = false;
    public boolean cOn;
    public ConvArray convArray;
    public EventListenerList listeners = new EventListenerList();
    public int numAdds;
    public PathObject path;

    public GestureRecogniser() {
        loadPaths();
        AccArr = new AccelerationArray();
        cOn = false;
    }

    public void loadPaths() {
        OutputObject oo = new OutputObject();
        path = oo.inputPath();
        convArray = oo.inputConvArray();
    }

    public void newMotion(MotionSensingEvent e) {
        if (cOn == true) {
            AccArr.addMotionEvent(e);
        }
    }

    private void notifyEventListeners(GestureRecognisedEvent evt) {
        for (EventListener listener : getEventListeners()) {
            ((GestureListener) listener).onGestureEvent(evt);
        }
    }

    public void removeEventListeners(GestureListener listener) {
        listeners.remove(GestureListener.class, listener);
    }

    public void resetGestures(int level) {
        if (level == 1) {
            path = new PathObject(-2);
            savePaths();
            Wiinote.gui.msgWindow.newMessage("Gesture System Reset", 2);
        } else if (level == 2) {
            convArray = new ConvArray();
            path = new PathObject(-2);
            savePaths();
            Wiinote.gui.msgWindow.newMessage("Gesture Tree Reset", 2);
        }
    }

    public void savePaths() {
        OutputObject oo = new OutputObject();
        oo.outputPath(path);
        oo.outputConvArray(convArray);
    }

    private void searchGestures() {
        cOn = false;
        AccArr.removeLikeMotions();
        int motion = path.followPath(AccArr.toIntArray());

        notifyEventListeners(new GestureRecognisedEvent(this, motion,
                Integer.toString(motion), motion));
    }

    public int setToAdd(int nnumAdds, String newName) {
        GestureObject gest = convArray.findGestureObject(newName);

        if (gest == null) {
            addNewEnd = true;
            numAdds = nnumAdds;

            convArray.add(null);
            addEndNum = convArray.lenConvArray() - 1;

            GestureObject newGestOb = new GestureObject(newName,
                    addEndNum, 0);

            convArray.set(convArray.lenConvArray() - 1, newGestOb);

            return convArray.lenConvArray() - 1;
        } else {
            addNewEnd = true;
            numAdds = nnumAdds;
            addEndNum = gest.getGestureID();
            return addEndNum;
        }
    }
}

D.13 wiinote.gesture.ListenerGestureCapture.java

/**
 * Handles GestureRecognisedEvents thrown by a GestureRecogniser.
 * Takes the gesture that has been found and decides what action is to be
 * performed now that the gesture has been recognised.
 * <p>
 * @see GestureRecogniser
 * <p>
 * 05-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class ListenerGestureCapture implements GestureListener {

    public static final int ACTION_EXIT = 7;
    public static final int ACTION_INSTRUMENT_FLUTE = 3;
    public static final int ACTION_INSTRUMENT_IR = 5;
    public static final int ACTION_INSTRUMENT_PITCH_ROLL = 4;
    public static final int ACTION_OCTAVE_DOWN = 2;
    public static final int ACTION_OCTAVE_UP = 1;
    public static final int ACTION_PRINT_GESTURE_FOUND = 0;
    public static final int ACTION_SET_MIDI = 6;

    public ConvArray actions = Wiinote.gestureRecog.convArray;
    public void actionDB(int gesture) {
        int action;
        GestureObject caughtGesture = actions.get(gesture);

        if (caughtGesture == null) {
            msgWindow.newMessage("Gesture Not Recognised", 6);
            return;
        } else {
            action = caughtGesture.getGestureResult();
        }

        switch (action) {
        case (-1): // escape clause!!
            break;
        case (0):
            msgWindow.newMessage("Gesture "
                    + caughtGesture.getGestureName()
                    + " Recognised", 6);
            break;
        case (1):
            int newuOct = Wiinote.process.getOctave() + 1;
            if (newuOct > 7) {
                newuOct = 7;
            }
            Wiinote.process.setOctave(newuOct);
            msgWindow.newMessage("Gesture Recognised: Octave Up, Now "
                    + newuOct, 4);
            break;
        case (2):
            int newdOct = Wiinote.process.getOctave() - 1;
            if (newdOct < 0) {
                newdOct = 0;
            }
            Wiinote.process.setOctave(newdOct);
            msgWindow.newMessage("Gesture Recognised: Octave Down, Now "
                    + newdOct, 5);
            break;
        case (3):
            WiimoteListener[] wmL = wmote.getWiiMoteEventListeners();

            for (WiimoteListener l : wmL) {
                wmote.removeWiiMoteEventListeners(l);
            }

            wmote.addWiiMoteEventListeners(acPan);
            wmote.addWiiMoteEventListeners(orPan);
            wmote.addWiiMoteEventListeners(new ProcessWMListen());

            wmote.addWiiMoteEventListeners(new ListenerFlute());

            msgWindow.newMessage("Flute Instrument Selected", 1);
            break;
        case (4):
            WiimoteListener[] wmLi = wmote.getWiiMoteEventListeners();

            for (WiimoteListener l : wmLi) {
                wmote.removeWiiMoteEventListeners(l);
            }

            wmote.addWiiMoteEventListeners(acPan);
            wmote.addWiiMoteEventListeners(orPan);
            wmote.addWiiMoteEventListeners(new ProcessWMListen());

            wmote.addWiiMoteEventListeners(new ListenerPitchRoll());

            msgWindow.newMessage("Pitch/Roll Instrument Selected", 1);
            break;
        case (5):
            WiimoteListener[] wmLi2 = wmote.getWiiMoteEventListeners();

            for (WiimoteListener l : wmLi2) {
                wmote.removeWiiMoteEventListeners(l);
            }

            wmote.addWiiMoteEventListeners(acPan);
            wmote.addWiiMoteEventListeners(orPan);
            wmote.addWiiMoteEventListeners(new ProcessWMListen());

            wmote.addWiiMoteEventListeners(new ListenerLeds());

            msgWindow.newMessage("Hand Waving Instrument Selected", 1);
            break;
        case (6):
            Wiinote.gui.setMidiSettings();
            break;
        case (7):
            ImageIcon exitIcon = new ImageIcon(
                    "C:\\Documents and Settings\\Louis.METALMACHINE\\workspace\\MusicalWiimote\\src\\icons\\exit.png");

            int res = JOptionPane.showOptionDialog(Wiinote.gui.frame,
                    "Are You Sure You Want To Quit?", "Quit?",
                    JOptionPane.OK_CANCEL_OPTION,
                    JOptionPane.WARNING_MESSAGE,
                    exitIcon, null, null);
            if (res == JOptionPane.OK_OPTION) {
                System.exit(0);
            }
            break;
        }
    }

    @Override
    public void onGestureEvent(GestureRecognisedEvent evt) {
        actionDB(evt.getGestureFound());
    }
}
D.14 wiinote.gesture.PathObject.java

/**
 * Acts as a node in an n-tree which represents the possible gestures in
 * the system. The endInt value is the ID of the gesture that has been
 * performed to reach this node. If it is null then the gesture has not
 * been recognised.
 * <p>
 * This is currently a 9-tree, with one branch for each direction defined
 * in AccDirectionObject (@see AccDirectionObject). If AccDirectionObject
 * was to be updated to have more possible directions (say, by adding the
 * Y direction) then this class would have to be updated to build an n-tree
 * of the number of directions defined in the AccDirectionObject.
 * <p>
 * 05-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class PathObject implements Serializable {

    private static final long serialVersionUID = -2458132543035895739L;
    private int endInt;
    private ArrayList<PathObject> path;

    public PathObject(int endInt) {
        this.endInt = endInt;
        path = new ArrayList<PathObject>();
        path.add(0, null);
        path.add(1, null);
        path.add(2, null);
        path.add(3, null);
        path.add(4, null);
        path.add(5, null);
        path.add(6, null);
        path.add(7, null);
        path.add(8, null);
    }

    public void createPath(int endID, int[] dirs) {
        if (dirs.length == 0) {
            endInt = endID;
        } else {
            int hd = dirs[0];
            int[] tl = new int[dirs.length - 1];
            for (int i = 0; i < dirs.length - 1; i++) {
                tl[i] = dirs[i + 1];
            }

            if (path.get(hd) != null) {
                path.get(hd).createPath(endID, tl);
            } else {
                path.set(hd, new PathObject(-2));
                path.get(hd).createPath(endID, tl);
            }
        }
    }

    public int followPath(int[] dirs) {

        for (int i : dirs) {
        }

        if (dirs.length == 0) {
            return endInt;
        } else {
            int hd = dirs[0];
            int[] tl = new int[dirs.length - 1];
            for (int i = 0; i < dirs.length - 1; i++) {
                tl[i] = dirs[i + 1];
            }

            if (path.get(hd) != null) {
                return path.get(hd).followPath(tl);
            } else {
                return -1;
            }
        }
    }

    public PathObject getPath(int dir) {
        if (path.get(dir) != null) {
            return path.get(dir);
        }
        return null;
    }
}
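PathObject's javadoc describes an n-ary tree keyed by direction indices: followPath splits the direction array into a head and a tail and recurses down the branch named by the head. The sketch below reproduces the same technique in a standalone form; the names `PathNode`, `create` and `follow` are illustrative, and it walks by index rather than copying head/tail arrays, but the traversal is the same.

```java
import java.util.ArrayList;
import java.util.List;

// Standalone sketch of the n-tree technique PathObject implements: each node
// holds an optional gesture ID (endId) and one child slot per direction.
// follow() consumes one direction per level; -1 means "no such branch".
class PathNode {
    static final int DIRECTIONS = 9;   // 9 branches, as in the listing
    int endId;                         // gesture ID at this node (-2 = none)
    private final List<PathNode> children = new ArrayList<>();

    PathNode(int endId) {
        this.endId = endId;
        for (int i = 0; i < DIRECTIONS; i++) {
            children.add(null);        // all branches start empty
        }
    }

    // Insert a gesture: walk/extend the branch named by dirs, store endId
    // at the leaf (mirrors createPath).
    void create(int endId, int[] dirs, int from) {
        if (from == dirs.length) {
            this.endId = endId;
            return;
        }
        int hd = dirs[from];
        if (children.get(hd) == null) {
            children.set(hd, new PathNode(-2));
        }
        children.get(hd).create(endId, dirs, from + 1);
    }

    // Look a gesture up: follow dirs; -1 if the branch does not exist
    // (mirrors followPath).
    int follow(int[] dirs, int from) {
        if (from == dirs.length) {
            return endId;
        }
        PathNode child = children.get(dirs[from]);
        return (child == null) ? -1 : child.follow(dirs, from + 1);
    }
}
```

Inserting the direction sequence {1, 3, 1} with ID 4 and then following the same sequence returns 4; following a sequence with no branch returns -1, and stopping partway down an existing branch returns the placeholder -2.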
D.15 wiinote.ui.GestureGui.java

/**
 * Creates a frame with all the controls to the Gesture Capture System
 * Within it along with the code to control these functions.
 * <p>
 * 06-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class GestureGui extends JFrame {

    public JPanel gestureCapturePanel() {
        JPanel panel = new JPanel();
        JPanel topPanel = new JPanel();
        JPanel innerPanel = new JPanel();
        JPanel savePanel = new JPanel();

        innerPanel.setLayout(new BoxLayout(innerPanel,
                BoxLayout.X_AXIS));
        topPanel.setLayout(new BoxLayout(topPanel,
                BoxLayout.X_AXIS));
        savePanel.setLayout(new BoxLayout(savePanel,
                BoxLayout.X_AXIS));
        panel.setLayout(new BoxLayout(panel, BoxLayout.Y_AXIS));

        panel.setMaximumSize(new Dimension(400, 180));
        panel.setMinimumSize(new Dimension(400, 180));

        panel.setBorder(lineBdr);
        JLabel header = new JLabel("Capture Gesture");
        header.setFont(new Font("Arial", Font.BOLD, 15));
        topPanel.add(header);
        topPanel.add(Box.createGlue());

        JButton save = new JButton("Save");
        save.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent event) {
                Wiinote.gui.msgWindow.newMessage("Please Perform the Gesture "
                        + times + " Times!", 2);
                dispose();
            }
        });

        savePanel.add(Box.createGlue());
        savePanel.add(save);

        panel.add(Box.createRigidArea(new Dimension(0, 10)));
        panel.add(savePanel);
        return panel;
    }

    public JPanel gestureResetPanel() {
        JPanel panel = new JPanel();
        JPanel topPanel = new JPanel();
        JPanel innerPanel = new JPanel();
        JPanel savePanel = new JPanel();

        innerPanel.setLayout(new BoxLayout(innerPanel,
                BoxLayout.X_AXIS));
        topPanel.setLayout(new BoxLayout(topPanel,
                BoxLayout.X_AXIS));
        savePanel.setLayout(new BoxLayout(savePanel,
                BoxLayout.X_AXIS));
        panel.setLayout(new BoxLayout(panel, BoxLayout.Y_AXIS));

        panel.setMaximumSize(new Dimension(400, 180));
        panel.setMinimumSize(new Dimension(400, 180));

        panel.setBorder(lineBdr);
        JLabel header = new JLabel("System Resets");
        header.setFont(new Font("Arial", Font.BOLD, 15));
        topPanel.add(header);
        topPanel.add(Box.createGlue());

        header.setMinimumSize(new Dimension(400, 20));

        panel.add(topPanel);

        panel.add(Box.createRigidArea(new Dimension(5, 0)));

        JButton resetSystem = new JButton(
                "Reset Gesture System (Delete All Gestures)");
        resetSystem.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent event) {
                int n = JOptionPane.showConfirmDialog(GestureGui.this,
                        "Are You Sure?", "Really??",
                        JOptionPane.YES_NO_OPTION);
                if (n == JOptionPane.YES_OPTION) {
                    Wiinote.gestureRecog.resetGestures(2);
                }
            }
        });

        innerPanel.add(resetSystem);
        innerPanel.add(Box.createGlue());

        panel.add(innerPanel);

        JButton resetGestures = new JButton("Reset Gesture Tree");
        resetGestures.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent event) {
                int n = JOptionPane.showConfirmDialog(GestureGui.this,
                        "Are You Sure?", "Really??",
                        JOptionPane.YES_NO_OPTION);
                if (n == JOptionPane.YES_OPTION) {
                    Wiinote.gestureRecog.resetGestures(1);
                }
            }
        });

        savePanel.add(resetGestures);
        savePanel.add(Box.createGlue());

        panel.add(savePanel);
        return panel;
    }

    public JPanel gestureResponsePanel() {
        JPanel panel = new JPanel();
        JPanel topPanel = new JPanel();
        JPanel innerPanel = new JPanel();
        JPanel savePanel = new JPanel();

        innerPanel.setLayout(new BoxLayout(innerPanel,
                BoxLayout.X_AXIS));
        topPanel.setLayout(new BoxLayout(topPanel,
                BoxLayout.X_AXIS));
        savePanel.setLayout(new BoxLayout(savePanel,
                BoxLayout.X_AXIS));
        panel.setLayout(new BoxLayout(panel, BoxLayout.Y_AXIS));

        panel.setMaximumSize(new Dimension(400, 180));
        panel.setMinimumSize(new Dimension(400, 180));

        panel.setBorder(lineBdr);
        JLabel header = new JLabel("Gesture Responses");
        header.setFont(new Font("Arial", Font.BOLD, 15));
        topPanel.add(header);
        topPanel.add(Box.createGlue());

        header.setMinimumSize(new Dimension(400, 20));
        panel.add(topPanel);
        panel.add(Box.createRigidArea(new Dimension(0, 5)));

        JLabel title = new JLabel("Gesture: ");
        innerPanel.add(title);

        gests = new JComboBox(Wiinote.gestureRecog.convArray.listNames());
        gests.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent event) {
                GestureObject gOb = Wiinote.gestureRecog.convArray
                        .findGestureObject((String)
                                gests.getSelectedItem());
                resp.setSelectedIndex(gOb.getGestureResult());
                saveString.setText("");
            }
        });

        innerPanel.add(gests);

        innerPanel.add(new JLabel(" Responds With: "));
        String[] responses = { "Print 'Gesture Found'", "Octave Up",
                "Octave Down", "Flute Instrument", "Pitch/Roll Instrument",
                "IR Instrument", "Set MIDI", "Exit" };
        resp = new JComboBox(responses);
        innerPanel.add(resp);

        resp.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent event) {
                saveString.setText("Unsaved");
            }
        });

        panel.add(Box.createRigidArea(new Dimension(0, 20)));
        panel.add(innerPanel);

        JButton save = new JButton("Save");
        save.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent event) {
                Wiinote.gestureRecog.convArray.setGestureResponse(
                        (String) gests.getSelectedItem(), resp
                                .getSelectedIndex());
                saveString.setText("Saved");
                Wiinote.gestureRecog.savePaths();
            }
        });

        saveString = new JLabel("");

        savePanel.add(saveString);
        savePanel.add(Box.createGlue());
        savePanel.add(save);

        panel.add(Box.createRigidArea(new Dimension(0, 10)));
        panel.add(savePanel);
        return panel;
    }

    public JPanel gestureTitle() {
        JPanel panel = new JPanel();
        panel.setLayout(new BoxLayout(panel, BoxLayout.X_AXIS));
        panel.setMaximumSize(new Dimension(400, 20));
        panel.setAlignmentY(LEFT_ALIGNMENT);
        JLabel title = new JLabel("Gesture Capture Properties");
        panel.add(title);
        return panel;
    }

    public JPanel returnPanel() {
        JPanel panel = new JPanel();
        panel.setMaximumSize(new Dimension(400, 20));
        panel.setAlignmentY(RIGHT_ALIGNMENT);
        JButton returnButton = new JButton("Return");
        returnButton.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent event) {
                dispose();
            }
        });
        panel.add(returnButton);
        return panel;
    }
}

D.16 wiinote.ui.Gui.java

/**
 * The Main Gui Object, Contains all objects that build up the main
 * system gui and much of the code that controls the system.
 * <p>
 * 06-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class Gui {

    public AccelerationWiimoteEventPanel acPan;
    public JPanel eWindow;
    public JFrame frame;
    public MessagesWindow msgWindow;
    public JButton newGesture;
    public NoteWindow noteWindow;
    public OrientationWiimoteEventPanel orPan;
    public JLabel statusbar;
    public Toolkit toolkit;
    public Wiimote wmote;

    public Gui() {
        frame = new JFrame();

        ImageIcon wiimoteIcon = new
                ImageIcon(".\\src\\icons\\wiimote.png");
        frame.setIconImage(wiimoteIcon.getImage());

        frame.setSize(1024, 768);
        frame.setTitle("Wiinote");
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);

        BorderLayout lo = new BorderLayout();
        frame.setLayout(lo);

        JPanel toolbars = new JPanel();
        toolbars.setLayout(new BoxLayout(toolbars,
                BoxLayout.X_AXIS));
        toolbars.add(createGeneralToolbar());
        toolbars.add(createWiimoteToolbar());
        frame.add(toolbars, BorderLayout.NORTH);

        JPanel toolbarsWest = new JPanel();
        toolbarsWest.setLayout(new BoxLayout(toolbarsWest,
                BoxLayout.Y_AXIS));
        toolbarsWest.add(createMusicalToolbar());
        frame.add(toolbarsWest, BorderLayout.WEST);

        JPanel window = new JPanel();
        window.setBackground(new Color(0, 0, 0));
        window.setLayout(new BoxLayout(window, BoxLayout.Y_AXIS));
        frame.add(window, BorderLayout.CENTER);

        eWindow = new JPanel();
        eWindow.setLayout(new BoxLayout(eWindow, BoxLayout.Y_AXIS));

        frame.setVisible(true);
    }

    public Wiimote connectWiimote() {
        Wiimote[] wiimotes = WiiUseApiManager.getWiimotes(1, true);
        if (wiimotes.length == 0) {
            setStatusbar("No Connected Wiimotes Found", 2);
            return null;
        }
        Wiimote wiimote = wiimotes[0];
        wiimote.activateMotionSensing();
        wiimote.addWiiMoteEventListeners(orPan);
        wiimote.addWiiMoteEventListeners(acPan);
        wiimote.addWiiMoteEventListeners(new ProcessWMListen());

        setStatusbar("Wiimote Connected", 1);
        return wiimote;
    }

    private JToolBar createGeneralToolbar() {
        JToolBar toolbar = new JToolBar();
        toolbar.setFloatable(false);

        ImageIcon octUpIcon = new
                ImageIcon(".\\src\\icons\\octaveUp.png");
        ImageIcon octDnIcon = new
                ImageIcon(".\\src\\icons\\octaveDown.png");
        ImageIcon stave1Icon = new
                ImageIcon(".\\src\\icons\\stave1.png");
        ImageIcon staveCrdIcon = new
                ImageIcon(".\\src\\icons\\stavechord.png");
        ImageIcon PRIcon = new
                ImageIcon(".\\src\\icons\\pitchRollins.png");
        ImageIcon FLIcon = new
                ImageIcon(".\\src\\icons\\fluteins.png");
        ImageIcon IRHIcon = new
                ImageIcon(".\\src\\icons\\IRHandsIns.png");

        final JButton octUp = new JButton(octUpIcon);
        final JButton octDn = new JButton(octDnIcon);
        final JButton pitchRoll = new JButton(PRIcon);
        final JButton flute = new JButton(FLIcon);
        final JButton IRHands = new JButton(IRHIcon);
        final JButton stave1 = new JButton(stave1Icon);
        final JButton staveCrd = new JButton(staveCrdIcon);

        flute.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent event) {
                wmote.deactivateIRTracking();
                wmote.activateMotionSensing();
                wmote.addWiiMoteEventListeners(acPan);
                wmote.addWiiMoteEventListeners(orPan);
                wmote.addWiiMoteEventListeners(new ProcessWMListen());
                wmote.addWiiMoteEventListeners(new ListenerFlute());
                wmote.setLeds(false, false, true, false);
            }
        });

        newGesture.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent event) {
                String s = (String) JOptionPane.showInputDialog(frame,
                        "Please give this gesture a name!",
                        "Name New Gesture", JOptionPane.PLAIN_MESSAGE,
                        null, null, "New Gesture");
                boolean pass;
                do {
                    pass = (Wiinote.gestureRecog.convArray
                            .findGestureObject(s) == null);
                    if (pass == false) {
                        s = (String) JOptionPane.showInputDialog(frame,
                                "That name already exists, please give this"
                                        + " gesture a UNIQUE name!",
                                "Name New Gesture",
                                JOptionPane.PLAIN_MESSAGE,
                                null, null, "New Gesture");
                    }
                } while (pass == false);

                Wiinote.gestureRecog.setToAdd(10, s);
                msgWindow.newMessage("Adding New Gesture", 2);
                msgWindow.newMessage("Please Perform the Gesture 10 Times!", 2);
            }
        });

        GestureProp.setToolTipText("Gesture Properties");
        toolbar.add(GestureProp);
        GestureProp.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent event) {
                GestureGui gGui = new GestureGui();
            }
        });

        return toolbar;
    }

    public void nunchukActive(boolean active) {
        newGesture.setEnabled(active);
    }

    public void setMidiSettings() {
        String[] ports = Wiinote.midiout.outputPorts();
        ImageIcon midiIcon = new
                ImageIcon(".\\src\\icons\\midi.png");

        String s = (String) JOptionPane.showInputDialog(frame,
                "Please choose the name of the Midi port to use:",
                "Choose Midi Port", JOptionPane.PLAIN_MESSAGE,
                midiIcon, ports, null);

        if ((s != null) && (s.length() > 0)) {
            try {
                Wiinote.midiout.connectToPort(s);
            } catch (MidiUnavailableException e) {
                msgWindow.newMessage("Midi Unavaliable Exception", 3);
                return;
            }

            Wiinote.midiout.testSignal();
            setStatusbar("Midi Output Set", 1);
        }
    }

    public void setStatusbar(String status, int icon) {
        ImageIcon msgIcon = null;
        if (icon == 1) {
            msgIcon = new ImageIcon(".\\src\\icons\\message.png");
        } else if (icon == 2) {
            msgIcon = new ImageIcon(".\\src\\icons\\warning.png");
        } else if (icon == 3) {
            msgIcon = new ImageIcon(".\\src\\icons\\error.png");
        }
        statusbar.setIcon(msgIcon);
        statusbar.setText(status);
        msgWindow.newMessage(status, icon);
    }
}

D.17 wiinote.ui.MessagesWindow.java

/**
 * A Window which displays a list of messages from the system along with an
 * icon describing the message. Once the maximum number of messages is
 * reached the oldest message is removed and all messages move a space back
 * in the array.
 * <p>
 * 06-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class MessagesWindow extends JPanel {

    private static final long serialVersionUID = 3677227138328480862L;

    public ImageIcon mesgIcon, warnIcon, errIcon, upIcon,
            downIcon, wmIcon;
    public ArrayList<JLabel> messages;

    public MessagesWindow(int numberMessages) {
        init();

        for (int i = 0; i < numberMessages; i++) {
            JLabel message = new JLabel();
            message.setMaximumSize(new Dimension(400, 20));
            messages.add(message);
        }

        for (JLabel label : messages) {
            add(label);
        }
    }
    public MessagesWindow(int numberMessages, Color background,
            Color foreground) {
        init();

        for (int i = 0; i < numberMessages; i++) {
            JLabel message = new JLabel();
            message.setMaximumSize(new Dimension(400, 20));
            message.setForeground(foreground);
            messages.add(message);
        }

        for (JLabel label : messages) {
            add(label);
        }

        setBackground(background);
        setLayout(new GridLayout(numberMessages, 1));
        setMaximumSize(new Dimension(400, 20 * numberMessages));
    }

    public MessagesWindow(int height, int width) {
        init();

        int numberMessages = height / 20;

        for (int i = 0; i < numberMessages; i++) {
            JLabel message = new JLabel();
            message.setMaximumSize(new Dimension(400, 20));
            messages.add(message);
        }

        for (JLabel label : messages) {
            add(label);
        }

        setLayout(new GridLayout(numberMessages, 1));
        setMaximumSize(new Dimension(width, height));
    }

    public MessagesWindow(int height, int width, Color background,
            Color foreground) {
        init();

        int numberMessages = height / 20;

        for (int i = 0; i < numberMessages; i++) {
            JLabel message = new JLabel();
            message.setMaximumSize(new Dimension(400, 20));
            message.setForeground(foreground);
            messages.add(message);
        }

        for (JLabel label : messages) {
            add(label);
        }

        setBackground(background);
        setLayout(new GridLayout(numberMessages, 1));
        setMaximumSize(new Dimension(width, height));
    }

    public ImageIcon getIconByNumber(int number) {
        ImageIcon returnIcon = null;
        if (number == 1) {
            returnIcon = mesgIcon;
        } else if (number == 2) {
            returnIcon = warnIcon;
        } else if (number == 3) {
            returnIcon = errIcon;
        } else if (number == 4) {
            returnIcon = upIcon;
        } else if (number == 5) {
            returnIcon = downIcon;
        } else if (number == 6) {
            returnIcon = wmIcon;
        }
        return returnIcon;
    }

    private void init() {
        messages = new ArrayList<JLabel>();
        mesgIcon = new ImageIcon(".\\src\\icons\\message.png");
        warnIcon = new ImageIcon(".\\src\\icons\\warning.png");
        errIcon = new ImageIcon(".\\src\\icons\\error.png");
        upIcon = new ImageIcon(".\\src\\icons\\upsm.png");
        downIcon = new ImageIcon(".\\src\\icons\\downsm.png");
        wmIcon = new ImageIcon(".\\src\\icons\\wiimotesm.png");
    }

    public void newMessage(String msg, int icon) {
        ImageIcon msgIcon = getIconByNumber(icon);

        for (int i = messages.size() - 1; i > 0; i--) {
            messages.get(i).setIcon(messages.get(i - 1).getIcon());
            messages.get(i).setText(messages.get(i - 1).getText());
            messages.get(i)
                    .setToolTipText(messages.get(i - 1).getToolTipText());
        }

        messages.get(0).setText(msg);
        messages.get(0).setIcon(msgIcon);
        messages.get(0).setToolTipText(msg);
    }
}
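newMessage above keeps the newest message in the first label and pushes every older message one slot down, discarding whatever falls off the end. The following standalone sketch shows that shift with plain Strings standing in for the JLabels; `MessageShifter` and its method names are illustrative, not part of the Wiinote sources.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the shift-down scheme newMessage uses: the newest entry always
// occupies slot 0 and older entries move one place back until they drop off.
class MessageShifter {
    private final List<String> slots = new ArrayList<>();

    MessageShifter(int capacity) {
        for (int i = 0; i < capacity; i++) {
            slots.add("");             // start with empty message slots
        }
    }

    void newMessage(String msg) {
        // Walk from the last slot towards the front, copying each entry
        // from the slot above it; the last entry is overwritten (discarded).
        for (int i = slots.size() - 1; i > 0; i--) {
            slots.set(i, slots.get(i - 1));
        }
        slots.set(0, msg);             // newest message goes in front
    }

    String get(int i) {
        return slots.get(i);
    }
}
```

With a capacity of 3, posting "a", "b", "c", "d" in turn leaves the slots holding "d", "c", "b": the oldest message "a" has been discarded, exactly as the javadoc describes.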
D.18 wiinote.ui.NoteWindow.java

/**
 * A UI Object which displays information on the note currently being played
 * <p>
 * 06-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class NoteWindow extends JPanel {

        noteOctave.setFont(new Font("Arial", Font.PLAIN, 20));
        noteOctave.setBackground(backcol);
        noteOctave.setForeground(forecol);

        JPanel titleFrame = new JPanel();
        titleFrame.setBackground(backcol);
        titleFrame.setForeground(forecol);
        titleFrame.setLayout(new BoxLayout(titleFrame,
                BoxLayout.X_AXIS));
    }

    public void setNoteLetter(String newText) {
        noteLetter.setText(newText);
    }