
Wiinote - Musical Interface for the Wii Remote.

Louis Fellows

Bachelor of Science in Computer Science with Honours


The University of Bath
April 2009
This dissertation may be made available for consultation within the Uni-
versity Library and may be photocopied or lent to other libraries for the
purposes of consultation.

Signed:
Declaration

Submitted by: Louis Fellows

COPYRIGHT
Attention is drawn to the fact that copyright of this dissertation rests with its author. The
Intellectual Property Rights of the products produced as part of the project belong to the
University of Bath (see http://www.bath.ac.uk/ordinances/#intelprop).
This copy of the dissertation has been supplied on condition that anyone who consults it
is understood to recognise that its copyright rests with its author and that no quotation
from the dissertation and no information derived from it may be published without the
prior written consent of the author.

Declaration
This dissertation is submitted to the University of Bath in accordance with the requirements
of the degree of Batchelor of Science in the Department of Computer Science. No portion of
the work in this dissertation has been submitted in support of an application for any other
degree or qualification of this or any other university or institution of learning. Except
where specifically acknowledged, it is the work of the author.

Signed:
Abstract

The aim of this project is to create an interface whereby the Wiimote can be used as an instrument to create music. The idea allows for a brief exploration of the properties of the Wiimote whilst providing a system with multiple methods of creating sound using all the input methods available within the Wiimote, with an attempt to keep the Wiimote and the system as separate as possible by allowing the system to be controlled remotely.
Contents

1 Introduction
  1.1 Problem Description
  1.2 Aims
  1.3 Objectives
    1.3.1 Functional Requirements
    1.3.2 Non-Functional Requirements
  1.4 Project Plan
    1.4.1 The System
    1.4.2 Required Resources
    1.4.3 Gantt Chart

2 Literature Survey
  2.1 Gesture Capture Methods
    2.1.1 Hidden Markov Models (HMM)
    2.1.2 Conditional Random Fields (CRF)
  2.2 Wii Remote Connection
    2.2.1 Java
    2.2.2 C++
    2.2.3 C
    2.2.4 Decision
  2.3 Musical Interfaces
    2.3.1 Physical Instruments
    2.3.2 Synthetic Instruments

3 Design
  3.1 Features vs Playability
  3.2 UI Design
  3.3 Control
  3.4 Gestures
  3.5 Instruments

4 Requirements
  4.1 Functional Requirements
    4.1.1 System
    4.1.2 Instruments
    4.1.3 Gesture
    4.1.4 Sound
  4.2 Non-Functional Requirements

5 Implementation and Testing
  5.1 Gesture Recognition
    5.1.1 Recogniser Method
    5.1.2 Implementation in Java
    5.1.3 New Gestures
  5.2 Instruments
    5.2.1 Pitch/Roll Instrument
    5.2.2 Flute Instrument
    5.2.3 IR Instrument
  5.3 Testing
    5.3.1 Testing the Gesture Recogniser
    5.3.2 Testing the Instruments
  5.4 Results
  5.5 Gesture Recognition Results
    5.5.1 Results
    5.5.2 Results Analysis
  5.6 Instrument Results
    5.6.1 Results Analysis

6 Conclusions
  6.0.1 Further Developments

A Design Diagrams

B Raw results output
  B.1 User Questionnaires

C A Final View of the System Requirements

D Code
  D.1 wiinote.engine.ListenerFlute.java
  D.2 wiinote.engine.ListenerLeds.java
  D.3 wiinote.engine.ListenerPitchRoll.java
  D.4 wiinote.engine.MidiOut.java
  D.5 wiinote.engine.MWProcess.java
  D.6 wiinote.engine.Wiinote.java
  D.7 wiinote.gesture.AccDirectionObject.java
  D.8 wiinote.gesture.AccelerationArray.java
  D.9 wiinote.gesture.ConvArray.java
  D.10 wiinote.gesture.GestureObject.java
  D.11 wiinote.gesture.GestureRecognisedEvent.java
  D.12 wiinote.gesture.GestureRecogniser.java
  D.13 wiinote.gesture.ListenerGestureCapture.java
  D.14 wiinote.gesture.PathObject.java
  D.15 wiinote.ui.GestureGui.java
  D.16 wiinote.ui.Gui.java
  D.17 wiinote.ui.MessagesWindow.java
  D.18 wiinote.ui.NoteWindow.java
List of Figures

1.1 A Wii Remote
1.2 Gantt Chart Showing the Planned Timing of Work Throughout the Project

2.1 An example Hidden Markov Model

3.1 The Wiinote User Interface
3.2 The Message Window (left) and the Note Window (right)
3.3 The two proposed IR instruments. Idea 1 (left) and the chosen method, idea 2 (right)

5.1 A Visual Representation of the Array Condensing Procedure
5.2 A Visual Representation of the Gesture Recognition Tree
5.3 Diagram showing the 9 directions recognised in the system (left) and a possible set of 15 directions for use in 3-D space (right)
5.4 Diagram showing how the contents of an AccelerationArray object represent a gesture before and after calling the function removeLikeMotions()
5.5 Buttons used with the Flute Instrument
5.6 The Measurements Taken by the IR instrument
5.7 The IR Drumsticks (left), Along with a Circuit Diagram of their Design (right)
5.8 The 4 Gestures the Volunteers were asked to perform
5.9 Graph depicting recognition accuracy against gesture complexity after 10, 50 and 100 repetitions
5.10 Charts summarising the volunteers’ questionnaire choices
5.11 Graphs depicting the average times taken to hit 3 notes on each instrument (left), and the average time to play a three note tune on each instrument (right)

A.1 UML Class Diagram of the Gesture Recognition Package


List of Tables

4.1 Table detailing the requirements covered in the system section
4.2 Table detailing the requirements covered in the instruments section
4.3 Table detailing the requirements covered in the gesture section
4.4 Table detailing the requirements covered in the sound section
4.5 Table detailing the requirements covered in the non-functional requirements section
4.6 A Summary of all the requirements of the system

5.1 Flute instrument Button Combinations
5.2 Successful gestures by number of repeat teachings for a 1-sided gesture
5.3 Successful gestures by number of repeat teachings for a 2-sided gesture
5.4 Successful gestures by number of repeat teachings for a 3-sided gesture
5.5 Successful gestures by number of repeat teachings for a 4-sided gesture

B.1 Table Containing Chris’ Questionnaire Results
B.2 Table Containing Paul’s Questionnaire Results
B.3 Table Containing Dan’s Questionnaire Results
B.4 Table Containing Ellie’s Questionnaire Results
B.5 Table Containing Liam’s Questionnaire Results
B.6 Table combining the results of all Pitch/Roll tests
B.7 Table combining the results of all flute tests
B.8 Table combining the results of all IR tests

C.1 Table containing the final status of the system requirements
Acknowledgements

I’d like to acknowledge:

• Liz, my Mum, for all the support,

• Chris for all the advice,

• and Ellie for keeping me sane! (Well, as sane as I was to start with!).

Chapter 1

Introduction

1.1 Problem Description

Musical instruments have existed for many thousands of years and have defined cultures
since the dawn of time. New musical instruments are still being created to this day with
notable modern examples such as the electric guitar and the digital keyboard.
Most modern musical instruments are based heavily on classical examples (such as the aforementioned guitar and keyboard). However, advances in technology have allowed people to experiment with many weird and wonderful ways of creating music. One of the most prominent of these is the Theremin, one of the earliest electronic instruments, played by waving one's hands around two radio antennas, which in turn creates an often eerie, electronic sound.
Recent times have also brought a wave of new human computer interaction methods such
as touchscreens and voice recognition. One of the more novel examples of HCI is Nintendo’s
Wii Remote (known informally as the Wiimote), used to control the Wii games console. It
contains a three-axis linear accelerometer which can capture movements, allowing users to
interact with their games based on their movements.
The Wiimote connects to the Wii console using Bluetooth. This means that a Wiimote can
also be connected to any system with Bluetooth (such as a PC) to provide accelerometer
support.

1.2 Aims

By combining these two ideas, using accelerometers to interact with a computer and the creation of music in new and interesting ways, the aim of this project was born: to use a computer to interpret data coming from a Wiimote and output musical notes in accordance with the user's commands, thus turning the Wiimote into a musical instrument.


Figure 1.1: A Wii Remote

1.3 Objectives

The computer will need to take the raw data from the Wiimote, make the data useful to
the system and then use that data to decide what should be played. There should also be
some method of gesture recognition to control the system functionality.
For this to be viable as a musical instrument, this data refining and gesture recognition
must happen in real time with a sufficiently high success rate to be reliable whilst giving a
musical performance. A musical instrument that cannot be performed is not a particularly
useful instrument.
The gestures should also be customisable, to allow a user to perform whatever movement
seems the most comfortable to them to perform an action. The system should also output
in a format that can be used by many different systems and thus give the possibility of
extending the system at a later date.
Taking all this into account, below are the high level requirements of the system:

1.3.1 Functional Requirements

• The system must be able to take the input from a Wiimote connected by Bluetooth
in such a way that it is useful to the system.

• The system must be able to recognise a gesture made by the user with the Wiimote
and know how to react when that gesture is made.

• The user should be able to define new gestures and define an action that will be
performed when the gesture is recognised.

• The system must be able to output data in a musical format (e.g. MIDI) that is
widely used and understood by outside systems.

1.3.2 Non-Functional Requirements

• The system must work in real time to make giving a musical performance using the
Wiimote possible.
• The gesture recognition must be sufficiently accurate at judging gestures that the
system can be reliably used to give a performance.
• The system should be able to play any song, given that the user has had enough training/practice (i.e. the system should act as an actual instrument, rather than a ‘toy’).

1.4 Project Plan

1.4.1 The System

The completed system will consist of 4 main sections. These are:

Raw Data Gathering

This is the initial section of the system. Its purpose is to take a snapshot of the raw data from the Wiimote and hand it to the system. There are already libraries available that are designed for this task which can be used for this project, making this section the easiest to complete.

Music Generation

This section takes the raw input from the section above and uses it to create sound. This
would be done by taking the inputs and running an algorithm that generates a note and any
properties of the note that are needed. These details are then passed to the next section.

Midi Synthesis

This is the final section; it takes the information from the previous section and creates a MIDI message from it. These messages will then be sent to a MIDI controller to control an outside system which will play the music. There are libraries for controlling MIDI events already available, which should make this section relatively straightforward.

Gesture Recognition

This is by far the most complex section of the project. It will have to take the data from the
Wiimote, attempt to recognise any gestures made with it and perform the action related to it. My initial reading has identified a number of methods which attempt to decipher the gestures from the data. This will also have to be done reliably and in real time.

1.4.2 Required Resources

Below is a list of resources that I foresee I will need to successfully complete the project.

Software Resources

• Libraries to communicate with and retrieve raw data from the Wiimote. These are
available freely on the Internet.

• Libraries to communicate with MIDI Systems. These are also available freely on the
internet.

• A compiler for the C, C++ or Java language, whichever becomes the more obvious choice after the literature survey. Currently the most obvious language would seem to be Java. Compilers for all of these are available on the library computers.

Hardware Resources

• A Wiimote. I own two of these, so they are easily available.

• A Bluetooth Connection. My Computer has one of these, so it is already available.

• Computer for software creation and write-up of dissertation. I have one of these and
there are several available in the library.

Literature Resources

• Papers regarding gesture capture

• Papers regarding projects dealing with accelerometers



1.4.3 Gantt Chart

Figure 1.2: Gantt Chart Showing the Planned Timing of Work Throughout the Project
Chapter 2

Literature Survey

There are many areas of this project that need to be investigated before any major decisions
can be made. The title of the project leaves a large scope for creation of a system and whilst
simply ‘jumping in’ and seeing what happens could work well for building this system, it
leaves the possibility of suffering the same problems and making the same errors as found
by those who have attempted similar endeavours in the past. As Konrad Adenauer said, “History is the sum total of things that could have been avoided”.
This literature survey, then, is to take the ideas of the project proposal, explore the possibilities within them and discover the methods by which the project can progress in the most successful fashion possible.
I have split this document into three main parts: the first looks at various methods of capturing gestures made by a user; the second is a brief look at the various available libraries which exist to connect with a Wii Remote, in order to choose which to use for the project; and the third looks at musical instruments and how they can be represented in a computer system.

2.1 Gesture Capture Methods

The ability to recognise human movement with a computer system has been experimented
with for nearly 3 decades (Moeslund and Granum, 2001)(Moeslund, Hilton and Krüger, 2006)
and the field has recently started to accelerate with the expansion of physical interfaces by
many big name companies such as Microsoft (Microsoft Surface, 2008) and Apple (Apple
IPhone, 2009).
The initial stage of capturing a gesture is to choose some sort of input method, in the
above two examples (Microsoft Surface, 2008)(Apple IPhone, 2009) the input method has
been a multi-touch screen which has taken user input. But there are many other meth-
ods such as cameras (Wilson, 2004), IR cameras with IR emitting diodes (Kapoor, 2001), gyroscopes (Sakaguchi, Kanamori, Katayose, Sato and Inokuchi, 1996) and many more (Moeslund et al., 2001)(Moeslund et al., 2006).


A method receiving much attention is gesture capture using accelerometers (Pylvänäinen,
2005)(Hollar, Perng and Pister, 2000)(Sakaguchi et al., 1996). Accelerometers are instru-
ments that measure the rate at which the velocity of an object is changing (Accelerometer:
In Encyclopedia Britannica from Encyclopedia Britannica Online, 2009). This can be used
to measure an object's movement in 3D space. With the Wiimote, Nintendo has intro-
duced accelerometers into its controllers, allowing its users to control onscreen actions by
moving the controller. The development of these cheap, widely available accelerometer-based systems has allowed accelerometers to be used for far wider purposes than originally
possible.
There are several methods of recognising a gesture captured by a system. During my research, I learned of some of these methods, which are described below:

2.1.1 Hidden Markov Models (HMM)

Hidden Markov Models (HMMs) are a generative approach to gesture capture (Morency,
2007) and can be used when a system of states can be described as a Markov process [1] with unknown parameters. Given a set of observable states, an HMM can calculate the most likely sequence of hidden state transitions from a sequence of the observed states.

[1] A Markov process is a random process whose future probabilities are determined by its most recent values.

Figure 2.1: An example Hidden Markov Model

An HMM is constructed using a Markov model to model the hidden states and the probabilities of transition between each state, along with a start vector containing the probabilities of starting at each state. The status of these hidden states is unknown to the system.
In addition to these hidden states are a number of observable states, along with a matrix
containing the probability that the system is in a certain hidden state when an observed
state is seen.
If a system can be described with an HMM then three separate problems can be solved.

• Firstly, given a sequence of observed states, the most likely sequence of hidden states
can be calculated (known as decoding).

• Secondly, given the HMM, the probability of a sequence of observed states occurring can be calculated (known as evaluation).

• Finally, we can generate an HMM given a sequence of observations (known as learning).

To use this to recognise gestures, we would use the learning algorithm to create a Hidden Markov Model for each of the gestures that could be recognised. Then, when a gesture is performed, we can run an evaluation of the sequence of observed states against each of the HMMs in the system; the HMM returning the highest probability is the most likely to be the gesture the user was performing. Knowing this, we can perform the action related to the gesture.
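To make this selection step concrete, below is a minimal sketch in Java. It is illustrative only: the Hmm interface and its methods are hypothetical, and this is not the recogniser eventually built for this project.

import java.util.List;

// Hypothetical HMM interface: evaluate() solves the 'evaluation' problem
// above, returning the probability of an observation sequence under the model.
interface Hmm {
    double evaluate(int[] observedStates);
    String gestureName();
}

class HmmGestureSelector {
    // Return the name of the gesture whose model scores the observations
    // highest, or null if no model clears the rejection threshold.
    static String recognise(List<Hmm> models, int[] observed, double threshold) {
        Hmm best = null;
        double bestScore = Double.NEGATIVE_INFINITY;
        for (Hmm model : models) {
            double score = model.evaluate(observed);
            if (score > bestScore) {
                bestScore = score;
                best = model;
            }
        }
        return (best != null && bestScore >= threshold) ? best.gestureName() : null;
    }
}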
HMMs are popular because they are simple and flexible (Murphy, 2002). They have been
used extensively in speech recognition due to their ability to model speech in a mathemat-
ically tractable way (Warakagoda, 1996). As well as being used for speech recognition, they have also been used in gesture recognition quite successfully (Pylvänäinen, 2005).

2.1.2 Conditional Random Fields (CRF)

Conditional Random Fields (CRFs) are a discriminative (as opposed to generative, like the HMMs above) method of gesture recognition (Wang, Quattoni, Morency, Demirdjian and Darrell, 2006). It uses a ‘discriminative sequence model with a hidden state structure’ (Wang et al., 2006) and attempts to model the entire sequence with the given input sequence.
A CRF is an undirected graphical model (G), which contains a number of vertices (V), each of which represents a random variable (y ∈ Y), and a number of edges, each representing a dependency between two random variables. The distribution of each variable of the set Y is conditioned on an input sequence X.
A CRF also contains a number of ‘potential functions’ derived from the idea of conditional independence. Each potential function works on a subset of vertices, and each of these subsets must form a clique, so as to be sure that a function is not altering a conditional distribution over two vertices where no conditional distribution exists.

The graph can be laid out in any arbitrary manner; however, it is more often set in a chain of vertices with an edge between each pair of consecutive vertices Y(i-1) and Yi. This layout allows the use of efficient algorithms for solving the three following problems:

• Calculating the most probable label sequence Y given X (known as decoding).

• Generating conditional distributions between vertices and functions from training data (known as training).

• Calculating the probability of a given sequence Y occurring given X (known as inference).

Here it is useful to note the similarities of the solutions between a CRF and an HMM. The
same method of using these solutions can be applied to a CRF to determine the gesture
attempted by the user from the observed data.
According to Wallach (2004), the advantage of CRFs as opposed to HMMs is their conditional nature, which allows the relaxation of the independence assumptions required in HMMs.
The CRF method has been extended further into the two following methods:

Dynamic CRFs

Dynamic CRFs (DCRFs) are an extension of the CRF method whose structure and parameters are repeated over a sequence (Morency, 2007). When hidden variables are used with this approach, the system becomes difficult to optimise (Morency, 2007).

Latent Dynamic CRFs

Latent Dynamic CRFs (LDCRFs) are an extension of the CRF and the DCRF methods
which attempts to incorporate hidden fields and sub-structures to better recognise gestures
(Morency, 2007). The original testing was based on video input from a camera which at-
tempted to recognise the movements of a human subject. This method was seen to compare
favourably to other methods such as HMMs (Morency, 2007).

2.2 Wii Remote Connection

As I have already decided to use an existing library to carry data from the Wiimote into the programming language in which I will create my system, I will now take a brief look at the libraries for connecting to the Wii Remote which are currently available. Libraries exist for three languages: C, C++ and Java.

2.2.1 Java

Java would be my preferred language to attempt the project with; I have used it extensively in the past, and this experience would mean less work than learning a new language. There are three possible libraries for use with Java: WiiRemoteJ, Wiimote-Simple and WiiuseJ.

WiiRemoteJ

WiiRemoteJ is a library written in Java and is currently hovering between beta and full
release. There is very little documentation supporting it to be found on the net and no
apparent official home for the system, although many videos and images can be found
displaying some of its functionality (WiiRemoteJ Technical Demo, 2009). Attempting to
run the demo of WiiRemoteJ on my system caused numerous errors and did not work,
making it unfit for use within this system. It has been mentioned here, however, as it is
considered one of the more ‘feature-rich’ libraries for the Wiimote.

Wiimote-Simple

http://code.google.com/p/Wiimote-simple/
Wiimote-Simple is a library designed as an alternative to WiiRemoteJ. It is open source; however, as the designer mentions, it has less functionality than WiiRemoteJ but is offered as an alternative for people who could not get WiiRemoteJ to work.
It does provide the ability to read accelerometer data and IR data and to respond to button pushes, which should be enough to use the Wiimote to interface with the system. As with WiiRemoteJ, there is little documentation available for the implementation, and it is also not as well supported as WiiRemoteJ.

WiiuseJ

http://code.google.com/p/wiiusej/
WiiuseJ is another library written in Java. It provides a Java API for the wiiuse system for C (which provides C libraries for communicating with the Wiimote). The API is open source, although the Wiiuse libraries it uses are not.
It provides a great deal more functionality than Wiimote-Simple such as filters to normalise
the accelerometers and support for a number of extension controllers. It also has a wealth
of documentation and has been used in many other projects (which can be found from the
libraries website). It also seems to be well supported by its creator.

2.2.2 C++

I have never used C++ before, and choosing any of the following libraries would mean learning the language from scratch. This puts the library in this section at a disadvantage; however, I will still summarise what I learnt of the system for completeness.

WiiYourself

http://wiiyourself.gl.tter.org/
WiiYourself is a library for C++ which is quite well featured and has been used in the
development of several projects (of which details can be found on the developers site).
The system seems well supported and works with most commercially available Bluetooth
stacks, although there doesn’t seem to be a great deal of documentation to accompany the
system. WiiYourself currently only works on Windows, which may limit the abilities of any system created with it.

2.2.3 C

The final language on the list is C. My knowledge of C is basic but functional, making C
a viable choice for the development of the system. I discovered two C libraries during my
research, CWiid and WiimoteAPI.

CWiid

http://abstrakraft.org/cwiid/
CWiid is a library developed for use in C and is quite full of features (some unique to this
implementation, such as an IR Tracker and an interface for Python). Like WiiuseJ and
Wiimote-Simple it is Open Source and is well supported (including a roadmap of future
features to be implemented, as well as bugfixes).
It is currently at version 0.6.00 (as of April 2009) and as well as being well supported (with
a small community building around it) it is also fairly well documented. However, only
Linux support is available currently, with no plans to port the system to any other platform.
This would limit any system developed using it.

WiimoteAPI

http://code.google.com/p/Wiimote-api/
WiimoteAPI is a basic library for C; it contains functionality for the IR sensor and the buttons, but seemingly not the accelerometers. The lack of accelerometer support greatly limits the feasibility of using this library.
There is a little documentation available, but not a great deal, and support for the system seems to have ended about two years ago (as of April 2009).

2.2.4 Decision

From looking at the available options, it has become clear to me that, due to my proficiency with Java relative to the other available languages, I should use a Java-based library. This leaves WiiRemoteJ, Wiimote-Simple and WiiuseJ.
Of these, as WiiRemoteJ is not compatible with the computer I'll be coding on, and given the very basic nature of Wiimote-Simple, the best choice of library would seem to be WiiuseJ, and as such, I will be using this for the development of the system.

2.3 Musical Interfaces

As this project is the creation of a musical instrument, some time should be dedicated to the study of current musical instruments and how they create music. As such, I shall now briefly go through the ideas found in my reading on music.

2.3.1 Physical Instruments

With most physical instruments the player creates sound by manipulating a part of the instrument which vibrates (NH Fletcher, 1998). This could be a string on a guitar, the air in a trombone, the skin of a drum, etc. The note is altered by altering the vibrating part (holding a fret on a guitar, lengthening the tube of a trombone, tightening the skin on a drum).
This holds true for all but a few instruments; these few were created in the electronic age and are not reliant on the player creating the vibration themselves (Glinsky, 1992). A major example of this is the Theremin, the world's first electronic instrument and the only instrument in the world played without physically touching it (Glinsky, 1992). The Theremin was the first instrument where the player did not have to create the vibrations, as they were created by an oscillator which was manipulated by the player remotely.
This is, however, an exception to the rule; the majority of musical instruments are played by manipulating them physically (NH Fletcher, 1998).
A musical instrument created from a Wiimote will have to be perched somewhere between these two ideas. It will have to involve directly manipulating the interface (the Wiimote) to choose the usual elements of the waves to create (i.e. the frequency and the amplitude of the note). However, the actual notes will be created within the computer (which will act as the oscillator, and create the timbre of the note).

2.3.2 Synthetic Instruments

This means that the system will need some method of creating synthetic sounds. For this
project this will be handled by an external process. The system will take the information
from the Wii remote and create from it the information it needs (gestures, note frequency, volume etc.); this will be passed to a digital synthesizer which will use it to make sound which can be played out.
As I see it, there are two ways to achieve this: either have the external synthesizer written into the system such that it has the data passed directly into it, or have the system output to an intermediate representation which can be understood by the (or possibly several different) synthesizer(s).
Such an intermediary is MIDI. I feel that outputting via MIDI is the best idea for the system as it allows a far wider number of musical synthesis systems to be used at will. However, MIDI uses discrete codes for notes, which restricts the instrument I create to certain notes instead of being able to play all possible frequencies. This is a limitation on the instrument, but may well make it easier to play (and to code!)
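As a minimal sketch of how such MIDI output might look in Java (using the standard javax.sound.midi API; the class below is illustrative and is not the MidiOut class of the final system):

import javax.sound.midi.MidiSystem;
import javax.sound.midi.Receiver;
import javax.sound.midi.ShortMessage;

// Send a single note-on/note-off pair to the default MIDI receiver.
public class MidiSketch {
    public static void main(String[] args) throws Exception {
        Receiver receiver = MidiSystem.getReceiver();

        ShortMessage noteOn = new ShortMessage();
        noteOn.setMessage(ShortMessage.NOTE_ON, 0, 60, 93); // middle C, channel 0
        receiver.send(noteOn, -1); // timestamp -1 = deliver immediately

        Thread.sleep(500); // let the note sound for half a second

        ShortMessage noteOff = new ShortMessage();
        noteOff.setMessage(ShortMessage.NOTE_OFF, 0, 60, 0);
        receiver.send(noteOff, -1);

        receiver.close();
    }
}

The discrete note numbers (0-127) in these messages are exactly the limitation described above: the instrument is held to fixed pitches rather than a continuous frequency range.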
Chapter 3

Design

The design of the system created many challenges and in this section I will attempt to
discuss the decisions which led to the final system.

3.1 Features vs Playability

One of the biggest issues throughout the development was the balancing of the feature-set
of the system with its usability. From the outset, I wanted the system to behave as much
like a musical instrument as possible. By this I envisioned the ‘computer-system’ as being
as invisible as possible, thus making the journey between the Wiimote and the sounds
produced perceivable as a single step, without any intermediate factors.
To do this effectively, it seemed to me that the software between the Wiimote and the
MIDI output would have to have enough features within it that nothing could be seen
as ’missing’, however it needed to be lightweight enough that the user could focus on the
playing of the instrument and not on fiddling with options and settings within the system.
This led to one of the largest questions of the project: ‘What features need to be in this
system, and which should be removed?’.
The set of features that I had originally envisioned was cut down to a subset which could be
implemented and complement each other well without overburdening the user whilst they
were playing the instrument. This reduced set was more useful to the system in several
ways. It helped create a ‘cleaner’ interface where all the options could be presented to the
user on a single frame without over-complicating the design of the interface.

3.2 UI Design

To keep the software as simple as possible whilst playing the instruments, the UI (see Fig 3.1) was reduced to a single frame for all the musical systems (which was aided by the decision to limit the number of features; see section 3.1, ‘Features vs Playability’, above).

Figure 3.1: The Wiinote User Interface.
This decision, however, led to a further issue: how to display information from the system to the user without further complicating the interface, and in such a way that the user could ignore the messages if they needed to.
This led to the development of two new Java classes, the MessageWindow class and the
NoteWindow class (see Fig 3.2). I chose to add these windows to the system as a method
of presenting useful information to the user without interrupting the use of the system (for
example with dialog boxes).

Figure 3.2: The Message Window (left) and the Note Window (Right)

3.3 Control

As the system was intended to perform as a musical instrument, it seemed that the system should be controllable away from the computer. There were two ways this could be
approached. The first was to map the system commands to different buttons upon the
Wiimote and allow those options to be chosen by pressing the correct button. The second
was to introduce a method of gesture recognition such that each gesture would trigger
different options within the system.
The idea of using separate buttons to select options was the most straightforward method
to implement, as it would take very little to set up a system which reacted based on button presses on the Wiimote. However, this would have also removed the buttons from being used to create music, which would have limited the instrumental options. A second problem with this idea was that the Wiimote only has a limited number of buttons; after they had been exhausted, no new features could be added to the system, imposing a limitation on its function.
By comparison, using a gesture recognition system would leave the buttons free for use as an instrument, and a much larger set of gestures could be added than there are buttons on a Wiimote.
One issue would be the removal of the accelerometers from the musical half of the system. This can be avoided by using the ‘Nunchuk’ extension controller to interpret gestures. As such, the entire Wiimote was free to be used as an instrument, and, using their off-hand, the user could control system functions whilst still performing with the Wiimote.

3.4 Gestures

After the decision to use gesture recognition to issue system commands remotely, a new set of choices had to be made, the most significant of these being ‘Which method of gesture recognition should be implemented?’. After taking a look at the domain of gesture recognition (see section 2.1 above), the choices were narrowed down to using Hidden Markov Models, due to their proven abilities for gesture recognition, or to using a new method which was unproven.
Hidden Markov Models were the simpler choice; there were already implementations available in Java, along with documentation supporting both how to create and use them and their effectiveness in practice. They have also been used previously in Wiimote-based gesture recognition projects (Schlömer, Poppinga, Henze and Boll, 2008).
The new method was an idea I conceived whilst thinking about gesture capture. After researching the field, it became apparent that there were no previous implementations of the method. This option seemed more risky, as the implementation might not prove to be a useful method of recognition (there may well be a reason why it has not been implemented before!). Notes on the implementation of this method can be found in the next chapter.
In the end, the decision was taken to go with the unproven method. The thinking behind this decision was that re-implementing an HMM would provide less worth than attempting a new method and observing its usefulness. The chance to create something new outweighed the rewards of taking the ‘secure’ option.

3.5 Instruments

The central part of the system was always the instruments. For this project, the decision was to create an instrument for each of the input methods on the Wiimote (Requirement FI04, Table 4.2).
This led to three different instruments: one using the accelerometer and orientation functions, one using the buttons on the face of the Wiimote, and the final one using the IR camera functionality built into the front of the Wiimote.
The 2 instruments considered for the accelerometer were to use the orientations of the
Wiimote (its pitch and roll) to represent the note and the volume of the note, or to use the
acceleration of the Wiimote to determine its volume and the roll to determine the note.
The second instrument proved too inaccurate when choosing a volume and was also far
more tiring in practice, reducing the time it could be played. The first method was far
less tiring and seemed more intuitive to play. Therefore, the first style of instrument was
chosen to be implemented into the system.
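A minimal sketch of this style of mapping is given below; the base note, angle ranges and scaling are illustrative assumptions, not the values used in ListenerPitchRoll.java.

// Sketch: map Wiimote orientation to a MIDI note and velocity.
// Assumes pitch/roll arrive in degrees, roughly -90..90; all constants
// here are assumptions for illustration.
public class PitchRollMapping {
    static final int BASE_NOTE = 48; // an assumed starting note (C3)

    // Quantise pitch into one of 12 semitone steps above BASE_NOTE.
    static int noteFromPitch(float pitchDegrees) {
        float clamped = Math.max(-90f, Math.min(90f, pitchDegrees));
        int step = Math.round((clamped + 90f) / 180f * 11f); // 0..11
        return BASE_NOTE + step;
    }

    // Map roll onto the MIDI velocity range 0..127.
    static int velocityFromRoll(float rollDegrees) {
        float clamped = Math.max(-90f, Math.min(90f, rollDegrees));
        return Math.round((clamped + 90f) / 180f * 127f);
    }
}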
The button input had fewer choices available to it (as any way of playing it will involve pressing the buttons to play notes). The only choice here was how to play notes using the buttons. There were not enough buttons to assign one to each note in a MIDI octave (as required by requirement FI02, Table 4.2). This led to the idea of taking inspiration from wind instruments, which use a combination of different finger positions to reach different notes; thus a combination of different buttons is used to represent each note.
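This fingering idea can be sketched as a lookup from a bitmask of held buttons to a MIDI note. The bit values and combinations below are illustrative assumptions; the actual combinations used are given in Table 5.1.

import java.util.HashMap;
import java.util.Map;

// Sketch: treat the held buttons as a bitmask and look up the note that
// the combination represents, in the spirit of wind-instrument fingerings.
public class FluteFingerings {
    static final int BTN_A = 1, BTN_B = 2, BTN_1 = 4, BTN_2 = 8; // assumed bits

    static final Map<Integer, Integer> COMBO_TO_NOTE = new HashMap<>();
    static {
        COMBO_TO_NOTE.put(BTN_A, 60);                 // A alone   -> C
        COMBO_TO_NOTE.put(BTN_A | BTN_B, 62);         // A + B     -> D
        COMBO_TO_NOTE.put(BTN_A | BTN_1, 64);         // A + 1     -> E
        COMBO_TO_NOTE.put(BTN_A | BTN_B | BTN_1, 65); // A + B + 1 -> F
    }

    // Returns the MIDI note for the held combination, or -1 for silence.
    static int noteFor(int heldButtonsMask) {
        return COMBO_TO_NOTE.getOrDefault(heldButtonsMask, -1);
    }
}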
The third input method was the infra-red camera mounted on the front of the Wiimote. This picks up IR light sources and reports them as dots in an X-Y plane; these values are then passed to the system through the WiiuseJ library. There were many ways this could have been used to create music. The two foremost methods (see fig 3.3) were either to set up an array of IR LEDs at which the Wiimote could be pointed, used to measure the Wiimote's position in space, with this value then mapped to a note (much like how the position of a musician's hand near a theremin leads to a note being produced); or to have the Wiimote in a static position and move IR sources in front of it to create music, with the position of each LED controlling different aspects of the music (such as volume, pitch etc).
In the end, the second idea was chosen. Both would require some manner of hardware creation (i.e. building an array of LEDs); however, the hardware for the second idea was far less complex, requiring only a cell and an LED to work (see circuit diagram, fig 5.7), with no capacitors or resistors. This simplicity made it far easier (and cheaper) to build two LED ‘instruments’ than it would have been to build an array of LEDs for the Wiimote to ‘look at’.

Figure 3.3: The two proposed IR instruments. Idea 1 (left) and the chosen method, idea 2
(right)
Chapter 4

Requirements

The project contains a number of broad areas of work. As such it seems appropriate to
order the functional requirements of the system under these headings. The identified areas
are:

1. System Requirements

2. Instrumental Requirements

3. Gesture Requirements

4. Sound Requirements

The following section will look at each of these areas separately in order to devise a set of
requirements that the system must meet.

4.1 Functional Requirements

4.1.1 System

The system requirements are those that relate to the system as a whole and not to any of
the other areas noted above. As mentioned in the literature survey, the preferable method
of connection between Java and a Wiimote was chosen to be the WiiuseJ library. As such,
one of the more straightforward requirements will be to use this library to connect the
Wiimote to the system.
Another feature that the system will require is to present the user with a simple-to-use graphical interface. It would be beneficial for the user to be able to spend as little time
as possible interfacing with the computer in order to get the most out of playing the
instrument. Because of this, a graphical interface is preferable to a text interface, as it can present all options to the user without the need to learn a list of commands. It also opens the instruments up to use by less confident computer users.
A less important extension to the development of a user interface would be to display musical information to the user. A useful example of this would be to display the currently playing note (or MIDI note number etc.) to the user as they are playing, much like a digital tuner is used with ‘real’ instruments. This could then be used to help users create a tune or develop a ‘musical ear’, expanding the possibilities of the system.
Whilst talking about the user interface, it would also be a useful extension to provide a
window in which system events can be displayed to the user without interrupting their
musical session. This would not be a high-priority requirement; however, it would be a
valuable addition to the system.

ID Description
FSy01 The system should present the user with a graphical user interface
FSy02 The system should gather Wiimote data using WiiuseJ
FSy03 The system should inform the user of the note playing
FSy04 The system should inform the user of system developments

Table 4.1: Table detailing the requirements covered in the system section.

4.1.2 Instruments

The instruments section is a central section of the system and much of the rest of the
development is designed around it. In order to create a musical interface for the Wiimote, it
is a high priority that the system delivers a method capable of supporting multiple musical
instruments, by this it is meant that the system should contain some way of switching
between different methods of using the Wiimote as a tool to interface with the system.
It is also of high priority that any instruments developed for the purposes of the project are capable of playing the entire range of notes in 12-TET tuning. This would (theoretically) mean that any musical piece written in this tuning could be played using the Wiimote instruments, supporting the idea of them being used as ‘real’ instruments.
A feature which is of low priority but which would enhance the functionality of the system
would be to be able to customise and create new instruments within the system. This
would allow the system to be extended with new instruments and allow users to customise
their experience of the system.
In order to make sure that the instruments created within the frame of the project cover all aspects of the Wiimote's input interfaces, there should be an instrument created for each input method within the Wiimote. This is a requirement of the instruments created as part of the project and can be discarded once the system and the project are completed. However, in order to fully evaluate the abilities of the Wiimote and the interface, this should be a high priority for the project.

ID Description
FI01 The system must support multiple instruments
FI02 Each instrument must be able to play notes A-G#
FI03 Users should be able to add new instruments to the system
FI04 Should contain instruments based on all the Wiimote's input methods

Table 4.2: Table detailing the requirements covered in the instruments section.

4.1.3 Gesture

The gesture system is another large section of the system. As such there are a number
of requirements which govern how it fits in with and interacts with the system. The first
requirement that should be noted here is that the system MUST have some method of recognising gestures. This is of very high priority, as the gestures will be used to control the system remotely. It should also be a requirement that the system be controllable using the gesture recognition functionality.
One thing to note here is how much functionality the gestures should be able to control
within the system. Giving too much control could lead to users having too many gestures
to memorise and the system becoming overwhelming. Having too few undermines the
functionality of the feature. This collision of requirements can be solved by providing the
ability to control as many features of the system as possible, and at the same time allowing
the user to select which actions are performed by which gestures. This allows the user to
dictate what actions are useful to them, and how to perform them.
This connection of actions to gestures will also need some user-facing interface through which to perform these operations, which extends the requirements to providing such an interface. Also useful here would be the ability to give each gesture a ‘human-friendly’ name, to make it simpler to differentiate between gestures in the system.
Also, if it is possible to assign gestures to actions, it should be possible to define new gestures within the system. This would support both the preferences of the user and the possibility of future growth in the system's functionality.

ID Description
FG01 The system must have some method of identifying gestures.
FG02 The gesture system must be able to control system functionality.
FG03 The system must be able to learn new gestures.
FG04 Gestures should have a ’friendly’ name available to the user.
FG05 The user should be able to assign actions to gestures.
FG06 The system should provide some interface for managing gestures.

Table 4.3: Table detailing the requirements covered in the gesture section.

4.1.4 Sound

The final area of the system covers the creation of sound based on the inputs from the in-
struments created. As decided upon in previous sections, the most straightforward method
of output was to output MIDI messages to another system which could then take care
of the generation of sound (for example, a MIDI synth or CSound). As such, the major
requirement of this section is that the system is able to connect to a MIDI device and send
to it messages of what to play determined by the user input.
Further to this are two lower priority requirements. The first is the option to switch between
playing a single note and playing a chord. This functionality would allow a user to create
a range of different sounds and make it possible to form a lead/rhythm dynamic musically.
Leading from this, the system could be made into a more powerful tool by the addition of basic recording functionality. This could be achieved by noting the MIDI messages sent by the system and, at the same time as sending them, recording a copy into a MIDI sequence file. This recording could then be played as a backing track, allowing the user a richer musical experience. However, it is not core to the system's functionality and as such is still a low priority.

ID Description
FSo01 The system should support MIDI.
FSo02 The system should be able to record the instruments playing.
FSo03 The system should allow the playing of both chords and single notes

Table 4.4: Table detailing the requirements covered in the sound section.

4.2 Non-Functional Requirements

As always, alongside the functional requirements of the system are the non-functional
requirements. These are requirements used to judge the operation of the system.
A major requirement here is that the delay between the playing of a note on the Wiimote and the sounding of the note by the computer is perceived as ‘instantaneous’; by this it is meant that there is no noticeable delay between cause and effect. This should be viable under all usual circumstances.
The second requirement here is a requirement placed on the gesture recognition system.
For it to be found useful, it should be able to recognise gestures with a sufficiently high accuracy. For the purposes of this project, that accuracy has been placed at 75%, as this is seen as the minimum accuracy at which the gesture system can be deemed to work successfully.

ID Description
NF01 The system should be able to play sounds in real time
NF02 The system must recognise gestures with at least 75% accuracy

Table 4.5: Table detailing the requirements covered in the non-functional requirements
section.

Functional Requirements
System Requirements
ID Description
FSy01 The system should present the user with a graphical user interface
FSy02 The system should gather Wiimote data using WiiuseJ
FSy03 The system should inform the user of the note playing
FSy04 The system should inform the user of system developments
Instrument Requirements
ID Description
FI01 The system must support multiple instruments
FI02 Each instrument must be able to play notes A-G#
FI03 Users should be able to add new instruments to the system
FI04 Should contain instruments based on all the Wiimote's input methods
Gesture Requirements
ID Description
FG01 The system must have some method of identifying gestures.
FG02 The gesture system must be able to control system functionality.
FG03 The system must be able to learn new gestures.
FG04 Gestures should have a ’friendly’ name available to the user.
FG05 The user should be able to assign actions to gestures.
FG06 The system should provide some interface for managing gestures.
Sound Requirements
ID Description
FSo01 The system should support MIDI.
FSo02 The system should be able to record the instruments playing.
FSo03 The system should allow the playing of both chords and single notes
Non-Functional Requirements
ID Description
NF01 The system should be able to play sounds in real time
NF02 The system must recognise gestures with at least 75% accuracy

Table 4.6: A Summary of all the requirements of the system.


Chapter 5

Implementation and Testing

5.1 Gesture Recognition

As the gesture recognition system used is a new system, I will now describe how it works.
Following that, I will describe how it was implemented in Java.

5.1.1 Recogniser Method

The gesture recogniser is split into three distinct sections. The first captures the movements and organises them into a state that the next section can use. The second section takes the input captured by the first, then attempts to match what it has found to gestures it has learnt previously; the third section then acts upon what the second section has found.
In this project, the system knows a gesture is occurring as the user holds the ‘C’ button
on the nunchuk extension whilst they are performing the gesture. During this time all
readings coming from the accelerometers within the nunchuk are stored as ‘samples’ of the
whole gesture.
Each of these samples is assigned a direction based on the accelerometer measurements taken at that time. All these samples are stored in an array with their directions.
When the user releases the C button, the system traverses the array and removes any sample whose direction is the same as that of the previous sample. For a visual representation of this procedure, see fig 5.1.
This then completes the gesture capture section. The final array of directions is passed
to the recognition section. All the possible gestures are stored as an n-tree where each
branch represents a possible direction in the array. The directions are then used to move
through the tree until a final node is reached. This final node contains an ID of the gesture
performed to reach that node. This ID is returned from the tree and used to represent
the gesture that has been performed. If the final node has no registered ID, or the branch which needs to be followed is null, the tree returns a ‘Gesture Not Recognised’ value. A visual representation is found in figure 5.2.

Figure 5.1: A Visual Representation of the Array Condensing Procedure

Figure 5.2: A Visual Representation of the Gesture Recognition Tree
The third section takes the ID returned from the tree and uses it to determine the action
of the system. This section is different in every system as each system will have different
actions to perform.

5.1.2 Implementation in Java

The recogniser was implemented in its own Java package so as to allow its reuse in future
projects. This package contained several classes each covering separate parts of the imple-
mentation. A class diagram of the gesture recognition system can be found in the appendix
A.1.
The system recognises gestures in 2 dimensions, the X and Z directions [1]. Thus performing
a gesture is akin to drawing an image on a blackboard. This was done to reduce the number
of directions being used. The system recognises 9 directions (as seen in fig 5.3) which are
‘North’, ‘North-East’, ‘East’, ‘South-East’, ‘South’, ‘South-West’, ‘West’, ‘North-West’ and
a point representing no movement called ‘Hold’. Figure 5.3 also shows a possible expansion
to this to cover 3D space.

Figure 5.3: Diagram showing the 9 directions recognised in the system (left) and a possible
set of 15 directions for use in 3-D space (right)

When a gesture is being captured, each sample is stored in an AccDirectionObject which,
as part of its construction, takes the X and Z acceleration as input and decides upon the
direction that this represents. Each sample taken is stored in an AccDirectionObject,
and all of the AccDirectionObjects are stored, in order, in an AccelerationArray object.
This AccelerationArray object can then be thought of as the entire gesture.

[1] The X and Z directions are used as these represent ‘Up/Down’ and ‘Left/Right’ in the WiiuseJ library.
The Y axis is the ‘Forward/Backward’ direction and is not used in this implementation.

When the gesture has finished being captured, the AccelerationArray object calls its
removeLikeMotions() function. This removes every AccDirectionObject from the array whose
direction is the same as that of the previous sample (fig 5.1, above), condensing the
array down to the series of directions which defines the gesture.
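The following is a minimal sketch of this capture step. The 0.3 dead zone matches the ACC_THRESHOLD_X/Z constants in Appendix D, but the quantisation logic shown here is an illustrative assumption rather than the exact AccDirectionObject implementation (it uses java.util.List and ArrayList):

    // Quantise an (x, z) acceleration sample into one of the 9 directions.
    static String direction(float x, float z) {
        String ns = z > 0.3f ? "N" : (z < -0.3f ? "S" : "");
        String ew = x > 0.3f ? "E" : (x < -0.3f ? "W" : "");
        return (ns + ew).isEmpty() ? "Hold" : ns + ew;
    }

    // removeLikeMotions(): drop any sample whose direction repeats that of
    // the previous sample, leaving the condensed series of directions.
    static List<String> removeLikeMotions(List<String> samples) {
        List<String> condensed = new ArrayList<String>();
        for (String d : samples) {
            if (condensed.isEmpty() || !condensed.get(condensed.size() - 1).equals(d)) {
                condensed.add(d);
            }
        }
        return condensed;
    }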

Figure 5.4: Diagram showing how the contents of an AccelerationArray object represent a
gesture before and after calling the function removeLikeMotions()

The learnt gestures are all stored in a tree; in this implementation the tree is built using
PathObject objects as nodes. Within each PathObject there is an array of PathObjects
which represents all of the connected nodes beneath the current node. The PathObject tree is
traversed by passing in an array of integers[2] and following the nodes recursively
until one of three possibilities occurs (a sketch of this lookup follows the list):

1. The array is exhausted and the current PathObject contains a gesture ID. Return
the ID up the tree to the calling function.
2. The array is exhausted and the current PathObject contains no gesture ID. Return
a ‘Gesture Not Recognised’ ID to the calling function.
3. The array is not exhausted and the next required node doesn’t exist. Return a
‘Gesture Not Recognised’ ID to the calling function.
[2] Each integer represents a direction; these are defined in AccDirectionObject.java.
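The following is a minimal sketch of this lookup, assuming the nine integer direction values defined in AccDirectionObject.java; the field names here are illustrative rather than taken from the real PathObject class:

    class PathObject {
        static final int NOT_RECOGNISED = -1;
        int gestureId = NOT_RECOGNISED;            // set only on nodes that complete a gesture
        PathObject[] children = new PathObject[9]; // one branch per direction

        int lookup(int[] dirs, int pos) {
            if (pos == dirs.length) {
                return gestureId;                  // outcomes 1 and 2: the array is exhausted
            }
            PathObject next = children[dirs[pos]];
            if (next == null) {
                return NOT_RECOGNISED;             // outcome 3: the required branch is null
            }
            return next.lookup(dirs, pos + 1);     // follow the branch and recurse
        }
    }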

Once the tree has been traversed and a gesture ID returned (either the ID of the gesture
which has been performed or the ‘Gesture Not Recognised’ ID), a GestureRecognisedEvent
is created and thrown to all GestureListeners listening to the GestureRecogniser class.
The GestureListeners are where the system decides what to do when a gesture occurs. Each
system would create a different GestureListener from the GestureListener interface.
In this system, the ListenerGestureCapture object implements GestureListener; within this
object several system events are defined. When a GestureRecognisedEvent occurs, the
GestureListener receives the gesture ID and looks up the system event related to it in the
ConvArray (short for Conversion Array), which is stored in the GestureRecogniser class.
This array stores GestureObjects, which contain the gesture ID, the related system event
and a name for the gesture which can be displayed in the GUI. Once the ListenerGesture-
Capture object has received the ID of the system event, the event is performed and the
system returns to a dormant state to listen for further gestures.
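A minimal sketch of this dispatch is given below; the real interface and classes live in the gesture package shown in appendix A.1, so the exact signatures here are assumptions:

    interface GestureListener {
        void gestureRecognised(int gestureId);     // receives the ID returned by the tree
    }

    // Inside the GestureRecogniser: notify every registered listener. Each
    // listener then looks the ID up in the ConvArray and performs the
    // related system event.
    private final List<GestureListener> listeners = new ArrayList<GestureListener>();

    void fireGestureRecognised(int gestureId) {
        for (GestureListener l : listeners) {
            l.gestureRecognised(gestureId);
        }
    }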

5.1.3 New Gestures

New gestures are added by first capturing the gesture (as above, using the Acceleration-
Array object) and then reducing the array in the same manner as before. Now, instead of
traversing the tree till its end, we start at the root node and move through the tree following
the directions in the array. There are 4 outcomes that can occur at each step:

1. If the branch we need to follow has a PathObject node at its end, we move forward
to that node, remove the top object in the array and repeat.

2. If the branch we need to follow has a null value at its end, we create a new PathObject
object and place it at the end of the branch as a new node. Then we move forward
to that node, remove the top object in the array and repeat.

3. If we reach a node with no gesture ID and the array is exhausted then we set the
Gesture ID of that node to be the gesture being recorded.

4. If we reach a node with a gesture ID and the array is exhausted, there are two possible
outcomes:

(a) The gesture ID of the node and of the current gesture are equal; we have therefore
re-recorded a gesture and no action need occur.
(b) The gesture ID of the node and of the current gesture are not equal; we flag an
error message to the user and leave the current gesture ID intact.

For the system to work effectively, new gestures have to be taught to the system several
times, so that any slight differences are learnt as the same gesture. In this
implementation this is all handled in the GestureRecogniser class.
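The sketch below reuses the PathObject sketch from section 5.1.2 to illustrate the insertion; it is not the exact GestureRecogniser code:

    int learn(PathObject root, int[] dirs, int newId) {
        PathObject node = root;
        for (int d : dirs) {
            if (node.children[d] == null) {
                node.children[d] = new PathObject(); // outcome 2: grow a new node
            }
            node = node.children[d];                 // outcome 1: follow the branch
        }
        if (node.gestureId == PathObject.NOT_RECOGNISED) {
            node.gestureId = newId;                  // outcome 3: record the gesture here
        } else if (node.gestureId != newId) {
            // outcome 4(b): clash with an existing gesture - warn, keep the old ID
            System.err.println("Node already holds gesture " + node.gestureId);
        }
        return node.gestureId;                       // outcome 4(a) falls through unchanged
    }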

5.2 Instruments

The instruments in the system are all implemented in the MWProcess class. Each instrument
implements a different ActionListener, which receives the necessary data from the
Wiimote and calls the functions in the MWProcess class, with a different function for each
instrument.
The ActionListener classes (named ListenerPitchRoll, ListenerFlute and ListenerLeds) are
attached to WiiuseJ's Wiimote object when the instrument is being played; to switch
instruments, the listeners are swapped out for the listener of the new instrument. By using
this method it is simple to direct the correct data to the functions requiring it, without
making a single, complex listener. It also allows new instruments to be added easily
in the future.
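As a minimal sketch, switching instrument then amounts to something like the following, assuming WiiuseJ's addWiiMoteEventListeners() and removeWiiMoteEventListeners() methods on the Wiimote object:

    void switchInstrument(Wiimote wiimote, WiimoteListener current, WiimoteListener next) {
        wiimote.removeWiiMoteEventListeners(current); // detach the old instrument's listener
        wiimote.addWiiMoteEventListeners(next);       // attach the new instrument's listener
    }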
The MWProcess class contains the functions that turn the raw data input from the Wiimote
into a MIDI message output (which is sent using the MidiOut class). Each function takes
the raw data and determines a MIDI note number, which is stored in a global variable.
Each instrument then calls the function ‘play()’, which sends two MIDI messages to the
connected MIDI device: the first is the message to stop playing the current note, and the
second is to start playing the new note determined by the function[3]. An excerpt of the
play function can be found below:

[3] If the system is set to play chords then six messages are sent: three to stop the current three notes and
three to start the new three.
Listing 5.1: function play()

    // if the notes are the same, continue playing current note
    if (newNote != playingNote) {

        // if there is currently a note playing
        if (playingNote != -1) {
            ShortMessage off = null;
            try {
                off = Wiinote.midiout.createShortMessage(
                        ShortMessage.NOTE_OFF, 0, playingNote, 90);
                Wiinote.midiout.sendMSG(off);
                ...
            } catch (InvalidMidiDataException e1) {
                Wiinote.gui.msgWindow.newMessage(
                        "Midi Message contains Invalid Data", 3);
            } catch (MidiPortNotSetException e) {
                Wiinote.gui.msgWindow.newMessage("Midi Port Not Set", 3);
            }
        }

        // if there is a new note to play
        if (newNote != -1) {
            ShortMessage on = null;
            try {
                on = Wiinote.midiout.createShortMessage(
                        ShortMessage.NOTE_ON, 0, newNote, 90);
                Wiinote.midiout.sendMSG(on);
                ...
            } catch (InvalidMidiDataException e1) {
                e1.printStackTrace();
            } catch (MidiPortNotSetException e) {
                Wiinote.gui.msgWindow.newMessage("Midi Port Not Set", 3);
            }
        }
        // the new note is now the note playing
        playingNote = newNote;
    }

5.2.1 Pitch/Roll Instrument

The Pitch/Roll instrument takes the roll of the Wiimote as the note to play, so by leaning
the Wiimote to the left or right different notes can be chosen. The volume of the current
note is determined by the pitch of the Wiimote: raising the Wiimote to vertical creates
silence and lowering it to horizontal gives maximum volume.
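Condensed from the motionToMidi() listing in Appendix D, the note selection quantises the roll angle into 20 degree bands; this sketch shows only the note-selection half (the pitch gating and volume handling are omitted):

    int rollToNote(float roll, int octave) {
        if (roll < -90) return -1;               // past vertical: silence
        if (roll < -70) return octave * 12 + 19; // NOTE_G
        if (roll < -50) return octave * 12 + 17; // NOTE_F
        if (roll < -30) return octave * 12 + 16; // NOTE_E
        if (roll < -10) return octave * 12 + 14; // NOTE_D
        if (roll <  10) return octave * 12 + 12; // NOTE_C
        if (roll <  30) return octave * 12 + 11; // NOTE_B
        if (roll <  50) return octave * 12 + 9;  // NOTE_A
        return -1;                               // beyond A: silence
    }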

5.2.2 Flute Instrument

The flute instrument works by pressing a combination of buttons to select a note. The
buttons used are ‘Up’, ‘A’, ‘B’, ‘1’ and ‘2’ (as seen in figure 5.5).
The ‘Up’ button raises the selected note by an octave, and the ‘2’ button raises
the selected note by a semitone (thus playing the sharp of the note). Table 5.1 shows the
combinations used to play each note.
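Following the ButtonstoMidi() listing in Appendix D, the held-button word is padded to a 13-bit binary string and matched against the known combinations from Table 5.1 (below). A minimal sketch, showing only three of the mappings:

    int fluteNote(short buttonsHeld, int octave) {
        String bin = Integer.toBinaryString(buttonsHeld);
        while (bin.length() < 13) {
            bin = "0" + bin;                      // pad to the +UDRL??-AB12 layout
        }
        if (bin.equals("0000000001010")) return octave * 12 + 12; // 1+A -> C
        if (bin.equals("0000000000100")) return octave * 12 + 14; // B   -> D
        if (bin.equals("0000000000110")) return octave * 12 + 16; // 1+B -> E
        return -1;                                // unmapped combination: silence
    }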

5.2.3 IR Instrument

The IR instrument converts the (x, y) positions of two IR light sources in view of the Wiimote's
IR camera into MIDI notes. The note is selected based on the x distance between the two
sources, the octave is based on the y position of the rightmost point (i.e. the light source
in the user's right hand) and the volume is based on the y position of the leftmost point
(i.e. the light source in the user's left hand). Figure 5.6 displays how these are measured.
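Condensed from the LedsToMidi() listing in Appendix D, the note selection quantises the x distance into 75-unit bands and the right-hand point's height shifts the octave. This sketch omits the volume message, the octave clamping and the mirror-correction of the camera's x axis:

    int irNote(int x1, int x2, int yRight, int octave) {
        int dist = Math.abs(x2 - x1);          // x distance between the two IR points
        if (yRight < 400) octave = octave - 1; // low right hand: drop an octave
        if (yRight > 623) octave = octave + 1; // high right hand: climb an octave
        int band = Math.min(dist / 75, 11);    // 0 (closest) .. 11 (furthest apart)
        return octave * 12 + 9 + (11 - band);  // 9 = NOTE_A; closer points play higher notes
    }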

Figure 5.5: Buttons used with the Flute Instrument

Combination Note
1 A
A B
1+A C
B D
1+B E
A+B F
1+B+A G

Table 5.1: Flute instrument Button Combinations

Hardware

To play the IR instrument, a pair of handheld IR devices had to be created. These were
made from a simple circuit composed of an AA battery, a switch and an infra-red LED,
attached to a drumstick to make the circuit more robust and to aid in playing the
instrument. An image of the drumstick controllers, along with a circuit diagram of their
design, can be found in figure 5.7.

5.3 Testing

Due to the split nature of the system, between the gesture recogniser and the musical
interface, the testing was performed in two parts. The following sections (5.3.1 and 5.3.2)
describe the method used for testing each part.

Figure 5.6: The Measurements Taken by the IR instrument

5.3.1 Testing the Gesture Recogniser

The testing of the gesture recogniser was carried out by 6 volunteers. Each of the volunteers
was shown an image of a gesture to map into the system and asked to perform the gesture
a number of times so that the system could ‘learn’ it. The test went as follows:

1. Perform 10 teaching gestures to allow the system to learn the gesture.

2. Attempt the gesture 50 times, noting the number of successes/failures.

3. Perform 40 further teaching gestures (bringing the number to 50).

4. Attempt the gesture again 50 times, noting the number of successes/failures.

5. Perform 50 further teaching gestures (bringing the number to 100).

6. Attempt the gesture again 50 times, noting the number of successes/failures.

This was performed for all 4 gestures, giving a view of how well the gesture recogniser
coped with different levels of complexity and learning.

Figure 5.7: The IR Drumsticks (left), Along with a Circuit Diagram of their Design (right)

5.3.2 Testing the Instruments

Using the same group of volunteers, each instrument was given to the user in turn and the
user was given 5 minutes to ‘play’ with the instrument in order to learn how it worked. After
the 5 minutes of play time the experiment proceeded as follows:

1. Ask the user to play a specific note (C) and time their response

2. Ask the user to play another note (E) and time their response

3. Ask the user to play a third note (F#) and time their response

4. Ask the user to play a three note tune (C, E, F#) and time their response

Of the 6 volunteers in this test, 3 were musicians and 3 were non-musicians. This was
done purposely to see how the instruments handled for people skilled in
playing music compared with those less skilled. Each of the volunteers was also asked to
complete a brief questionnaire about themselves and their experiences of the instruments.
The questions asked were:

1. Name.

2. Instruments played.

3. Years playing each instrument.



Figure 5.8: The 4 Gestures the Volunteers were asked to perform

4. Favourite Wiinote instrument (+ Why?).

5. Easiest Wiinote instrument to play (+ Why?).

6. Least favourite Wiinote instrument (+ Why?).

7. Most difficult Wiinote instrument to play (+ Why?).

5.4 Results

The next sections display the results of the tests documented above. A critique of each
method is supplied with the added benefit of hindsight. We will begin with the gesture
recognition testing.

5.5 Gesture Recognition Results

5.5.1 Results

Tables 5.2, 5.3, 5.4 and 5.5 show the results of the gesture recognition testing (described in
5.3.1). These results display the number of successful recognitions as a count (out of 50
repetitions) and as a percentage; the final column is the mean of the six volunteers' results.

          Chris        Ellie        Paul         Liam         Liz          Dan          Mean
10 Reps   12/50 (24%)  14/50 (28%)  13/50 (26%)  10/50 (20%)  12/50 (24%)  14/50 (28%)  12.5/50 (25%)
50 Reps   37/50 (74%)  40/50 (80%)  37/50 (74%)  34/50 (68%)  38/50 (76%)  39/50 (78%)  37.5/50 (75%)
100 Reps  42/50 (84%)  45/50 (90%)  41/50 (82%)  40/50 (80%)  41/50 (82%)  43/50 (86%)  42/50 (84%)

Table 5.2: Successful gestures by number of repeat teachings for a 1 sided gesture

          Chris        Ellie        Paul         Liam         Liz          Dan          Mean
10 Reps   11/50 (22%)  13/50 (26%)  11/50 (22%)   9/50 (18%)  10/50 (20%)  12/50 (24%)  11/50 (22%)
50 Reps   37/50 (74%)  38/50 (76%)  35/50 (70%)  35/50 (70%)  34/50 (68%)  37/50 (74%)  36/50 (72%)
100 Reps  39/50 (78%)  41/50 (82%)  38/50 (76%)  41/50 (82%)  40/50 (80%)  41/50 (82%)  40/50 (80%)

Table 5.3: Successful gestures by number of repeat teachings for a 2 sided gesture

          Chris        Ellie        Paul         Liam         Liz          Dan          Mean
10 Reps    9/50 (18%)  10/50 (20%)   9/50 (18%)   8/50 (16%)   8/50 (16%)  10/50 (20%)   9/50 (18%)
50 Reps   32/50 (64%)  35/50 (70%)  34/50 (68%)  34/50 (68%)  33/50 (66%)  36/50 (72%)  34/50 (68%)
100 Reps  42/50 (84%)  35/50 (70%)  36/50 (72%)  35/50 (70%)  36/50 (72%)  38/50 (76%)  37/50 (74%)

Table 5.4: Successful gestures by number of repeat teachings for a 3 sided gesture

          Chris        Ellie        Paul         Liam         Liz          Dan          Mean
10 Reps    8/50 (16%)  10/50 (20%)   9/50 (18%)   5/50 (10%)   7/50 (14%)   9/50 (18%)   8/50 (16%)
50 Reps   30/50 (60%)  31/50 (62%)  32/50 (64%)  27/50 (54%)  28/50 (56%)  35/50 (70%)  30.5/50 (61%)
100 Reps  32/50 (64%)  36/50 (72%)  34/50 (68%)  31/50 (62%)  36/50 (72%)  35/50 (70%)  34/50 (68%)

Table 5.5: Successful gestures by number of repeat teachings for a 4 sided gesture

Figure 5.9: Graph depicting recognition accuracy against gesture complexity after 10, 50
and 100 repetitions

5.5.2 Results Analysis

It should be noted that with each volunteer, the four stages of tests (1, 2, 3 and 4 sides)
were performed within one session (usually lasting around an hour). Something noted by
almost all of the volunteers at some point was it was quite a tiring exercise physically, and
some noted that performing the same gesture 250 times was quite tedious. This issue with
the testing method should be taken into account if the test was ever to be re-performed.
There are two ways which this issue could be seen to have affected the results, first, the
physical fatigue could have subtly changed the motion of the user and lowered the average
number of successes. This would mean that we should look (increasingly with the later
tests) as the number being a low er than true count.
However, The constant repetition may also have ‘solidified’ the idea of the gesture in the
users mind, this would mean that the average number of successes would be higher than
CHAPTER 5. IMPLEMENTATION AND TESTING 38

usual, and that the numbers should be looked at as a higher than true count.
This ambiguous outcome means that without an in-depth look at the testing method we
cannot tell which outcome should be followed. Due to the time constraints on this project
it is not possible to perform testing on the testing of the project. However, the results taken
should be close enough to what would be found in ‘real-world’ use to take conclusions from
them.

5.6 Instrument Results

Figure 5.6 charts depicting the answers given in questionnaires by the volunteers to the
questions 4

1. Favourite Wiinote instrument.


2. Easiest Wiinote instrument to play.
3. Least favourite Wiinote instrument.
4. Most difficult Wiinote instrument to play.

Figure 5.11 displays two graphs. The first depicts the average time taken to hit each of the
three notes individually for each of the three instruments. The graph on the right shows
the average time taken to play the three note tune for each of the instruments.

5.6.1 Results Analysis

I believe these results give a good idea of how difficult each instrument was to play. The
only downside I can see to this method of quantifying each instrument was the uncertainty
in the timing (involving a human operator and a stopwatch). As such there may be a
margin of error on each side of the measurements taken. Most of this has been taken
care of by rounding the number to a single decimal place, the space of time between each
instruments results is sufficiently large to make the ordering of the instruments obvious
and as such, the remaining uncertainty can be ignored.
With hindsight, it may also have been a good idea to change the notes between the
individual-note and tune sections of the timing. The author noted while observing the
testing process that the users were better able to play the tune after learning where the
individual notes were a moment earlier. However, as this happened with all three
instruments and all 6 volunteers, it affects all the results equally, but it should be borne
in mind for any further testing using this method.
I believe this worked well as a method of testing the instruments against each other;
however, it can only show how easy the instruments are compared with one another. A
possible way to extend this would be to run the experiment again and include a number of
‘real’ musical instruments as a comparison of real instruments to their synthetic siblings.
This, however, is beyond the specifications of the project. Here it is enough to know that
they work.

Figure 5.10: Charts summarising the volunteers' questionnaire choices.



Figure 5.11: Graphs depicting the average times taken to hit 3 notes on each instrument
(left), and the average time to play a three note tune on each instrument (right)
Chapter 6

Conclusions

The gesture recognition system worked reasonably well. If we look at figure 5.9 we can
see that as the complexity of the gesture increased, the system's accuracy in recognising it
decreased. By adding a line at 75% it is possible to compare the gesture recognition
ability of the system to the requirement NF02 (section 4.2). We can see that the gesture
recognition abilities of the system are within the requirements only up to gestures of 2
sides. We can also assume that this trend continues, and that as the complexity of gestures
increases the accuracy in recognising them will continue to fall.
If we look at the relationship between the number of repetitions and accuracy in figure 5.9,
we can see that the accuracy of the system increases after each set of repetitions. We can
also see that the 2 sided gesture only reaches an average of more than 75% accuracy after
it has been trained 100 times. With another round of teaching the system, the 3 sided
gesture may also cross the 75% threshold.
It is worth noting that this may hold true for any gesture: after an ever-greater
amount of training, any more complex gesture might be recognisable with significant
accuracy.
Therefore, I believe it is possible to call the gesture recogniser a success, if only a ‘limited
success’: it works well, but for it to work well with a more complex range of gestures a
great deal of training must be performed.
The instrumental section of the project was, in my opinion, a greater success. Each of
the three instruments provided a useful interface with the system and all were able to
play simple musical pieces (and, with a little more practice from the operator, had the
potential to play more complex pieces).
We can see that the Pitch/Roll instrument needs to be improved to remove the ‘overhead’
of playing multiple notes (see fig 5.11, where it was fastest at playing individual notes but
slowed remarkably when attempting to chain notes together) and also to make it simpler
to play, as evidenced in figure 5.10, where most users regarded it as both the most difficult
to play and their least favourite overall. Many of the users regarded it as “too inaccurate”
or “too awkward” to be useful as a musical instrument.


The other instruments fared better, with the IR instrument being hailed as the favourite
of the volunteers and the flute being labelled the easiest to play. The flute raised further
questions for the author: the volunteers chose the flute as the easiest to play, yet the time
taken to play each note on the flute was by far the highest. After reviewing the volunteers'
answers, I believe this can be attributed to the learning curve of the flute instrument.
The positive comments about the flute instrument seem to be based on the fact that the
notes were playable with a discrete action (i.e. pressing the ‘A’ button) as opposed to
attempting to find an arbitrary angle.
The IR instrument was the favourite of the volunteers (and of the author) and also provided
the quickest timings for the three-note tune during testing. I believe that this makes the IR
instrument the most successful of the three instruments created.
The instruments are limited by the way each attempts to use only one input method of
the Wiimote. I believe that by making better use of the entire Wiimote and its functionality,
more creative instruments could be designed which provide the user with better methods
of creating and performing music; however, as a proof of concept this set of instruments
works well.

6.0.1 Further Developments

It could be beneficial to extend the gesture recogniser to support gestures in 3 dimensions.
This would be done by adding extra directions (as in figure 5.3). The PathObject tree
would also have to be extended to have a path for each added direction. This would allow
the system to recognise gestures in three dimensions; however, it may reduce the accuracy
of the system. The only way to know for sure would be to implement and test it.
There were also two initial requirements which were not met by the project: firstly
‘FI03 - Users should be able to add new instruments to the system’ and secondly ‘FSo02 - The
system should be able to record the instruments playing’. It was envisioned that users would
be able to write their own instruments within the system; however, due to time constraints the
instruments were hard coded into the system. It should still be simple to implement new
instruments given the source code and a knowledge of Java, but a further improvement
would be to provide functionality to create instruments within the system, without
altering the source.
The second requirement, ‘FSo02 - The system should be able to record the instruments
playing’, was also dropped due to time constraints. It should be possible to record the MIDI
messages coming from the system using an external MIDI sequencer; however, the ability
to control this within the system could be hugely beneficial to musicians and create an
‘all-in-one’ system.
A final improvement following from this would be the ability to record and play
backing tracks as an accompaniment to the main instrument being played. This backing
track could also be controlled using gestures to perform actions upon it (such as telling the
system to repeat a bar or skip forward a bar mid-song). These extensions of the project
would build on the possibilities of the system by providing further functionality useful to
musicians.
Bibliography

Accelerometer (2009), in Encyclopædia Britannica, Encyclopædia Britannica Online,
‘http://www.britannica.com/EBchecked/topic/2859/accelerometer’. Retrieved January 2009.

Apple iPhone (2009), ‘http://www.apple.com/iphone’. Retrieved January 2009.

Glinsky, A. V. (1992), ‘The theremin in the emergence of electronic music’.

Hollar, S., Perng, J. K. and Pister, K. S. J. (2000), ‘Wireless static hand gesture recognition
with accelerometers - the acceleration sensing glove’.

Kapoor, A. (2001), A real-time head nod and shake detector, in ‘Proceedings from the
Workshop on Perceptive User Interfaces’.

Morency, L.-P., Quattoni, A. and Darrell, T. (2007), ‘Latent-dynamic discriminative
models for continuous gesture recognition’.

Microsoft Surface (2008), ‘http://www.microsoft.com/surface/’. Retrieved December 2008.

Moeslund, T. B. and Granum, E. (2001), ‘A survey of computer vision-based human motion
capture’, Computer Vision and Image Understanding 81, 231-268.

Moeslund, T. B., Hilton, A. and Krüger, V. (2006), ‘A survey of advances in vision-based
human motion capture and analysis’, Computer Vision and Image Understanding
104(2-3), 90-126. Special Issue on Modeling People: Vision-based understanding of a
person's shape, appearance, movement and behaviour.

Murphy, K. P. (2002), Dynamic Bayesian Networks: Representation, Inference and
Learning, PhD thesis, University of California, Berkeley.

Fletcher, N. H. and Rossing, T. D. (1998), The Physics of Musical Instruments, Springer.

Pylvänäinen, T. (2005), Accelerometer based gesture recognition using continuous HMMs,
in B. Pfitzmann, M. Müller-Olm, R. Cipolla, J. M. Boyle, J. J. Dongarra and C. B.
Moler, eds, ‘Lecture Notes in Computer Science’, Springer Berlin / Heidelberg, pp. 639-646.

Sakaguchi, T., Kanamori, T., Katayose, H., Sato, K. and Inokuchi, S. (1996), Human
motion capture by integrating gyroscopes and accelerometers, in ‘IEEE/SICE/RSJ
International Conference on Multisensor Fusion and Integration for Intelligent
Systems, 1996’, IEEE/SICE/RSJ, Washington, DC, USA, pp. 470-475.

Schlömer, T., Poppinga, B., Henze, N. and Boll, S. (2008), Gesture recognition with a Wii
controller, in ‘TEI ’08: Proceedings of the 2nd international conference on Tangible
and embedded interaction’, ACM, New York, NY, USA, pp. 11-14.

Wallach, H. M. (2004), Conditional Random Fields: An Introduction, University of
Pennsylvania CIS Technical Report MS-CIS-04-21.

Wang, S. B., Quattoni, A., Morency, L.-P., Demirdjian, D. and Darrell, T. (2006), Hidden
conditional random fields for gesture recognition, in ‘Computer Vision and Pattern
Recognition, 2006 IEEE Computer Society Conference on’, Vol. 2, pp. 1521-1527.

Warakagoda, N. (1996), Narada Warakagoda's HMM tutorial,
‘http://jedlik.phy.bme.hu/~gerjanos/HMM/node3.html’. Retrieved January 2009.

WiiRemoteJ Technical Demo (beta version 0.6) (2009),
‘http://www.dailymotion.com/Ctta0s/video/x1hrfl_wiiremotej_0’. Accessed April 2009.

Wilson, A. D. (2004), Touchlight: an imaging touch screen and display for gesture-based
interaction, in ‘ICMI ’04: Proceedings of the 6th international conference on Multimodal
interfaces’, ACM, New York, NY, USA, pp. 69-76.
Appendix A

Design Diagrams


Figure A.1: UML Class Diagram of the Gesture Recognition Package


Appendix B

Raw results output

B.1 User Questionnaires


Name: Chris
Instruments Played Piano
Years Played 3 Years
Favourite Wiinote Instrument (+ Why)
IR Instrument - It was the most engaging to play, Felt more like a game than an instrument.
Easiest Wiinote Instrument to Play (+ Why)
Flute Instrument - The notes were more discrete, with the other instruments the thresholds between notes were ill-defined.
Least Favourite Wiinote Instrument (+ Why)
Pitch/Roll Instrument - It wasn’t particularly nice to play
Most Difficult Wiinote Instrument to Play (+ Why)
Pitch/Roll Instrument - the angles for each note were too difficult to get right.

Pitch/Roll Results
timed note (C): 1.3s
timed note (E): 1.3s
timed note (f#): 1.2s
tune (c, e, f#): 4.8s
Flute Results
timed note (C): 2.3s
timed note (E): 2.5s
timed note (f#): 2.0s
tune (c, e, f#): 5.9s
IR Results
timed note (C): 1.6s
timed note (E): 1.5s
timed note (f#): 1.7s
tune (c, e, f#): 5.5s

Table B.1: Table Containing Chris' Questionnaire Results

Name: Paul
Instruments Played Guitar/Keyboard
Years Played Guitar - 9 Years, Keys - 6 Months
Favourite Wiinote Instrument (+ Why)
Flute: it felt like a real instrument and the notes were easier to find.
Easiest Wiinote Instrument to Play (+ Why)
Flute: See Above
Least Favourite Wiinote Instrument (+ Why)
IR: It was difficult to get right, seemed more like a gimmick to have the IR sensor working.
Most Difficult Wiinote Instrument to Play (+ Why)
Pitch/Roll: Finding the notes with any accuracy was too difficult.

Pitch/Roll Results
timed note (C): 1.1s
timed note (E): 1.0s
timed note (f#): 1.1s
tune (c, e, f#): 4.5s
Flute Results
timed note (C): 2.1s
timed note (E): 2.3s
timed note (f#): 1.9s
tune (c, e, f#): 5.4s
IR Results
timed note (C): 1.3s
timed note (E): 1.5s
timed note (f#): 1.6s
tune (c, e, f#): 5.0s

Table B.2: Table Containing Paul's Questionnaire Results

Name: Daniel
Instruments Played Drums
Years Played 6 Years(On and Off)
Favourite Wiinote Instrument (+ Why)
The Flute - Nicest to play, less messing around!
Easiest Wiinote Instrument to Play (+ Why)
The Flute - it was easier to hit the correct notes once I’d got used to where they were
Least Favourite Wiinote Instrument (+ Why)
The Pitch Roll Instrument - Playing it too long made my wrist ache! and it was tough to find the notes.
Most Difficult Wiinote Instrument to Play (+ Why)
The Pitch Roll Instrument - It was tough to find the right notes.

Pitch/Roll Results
timed note (C): 1.3s
timed note (E): 1.2s
timed note (f#): 1.4s
tune (c, e, f#): 5.7s
Flute Results
timed note (C): 2.5s
timed note (E): 2.6s
timed note (f#): 2.0s
tune (c, e, f#): 5.8s
IR Results
timed note (C): 1.6s
timed note (E): 1.6s
timed note (f#): 1.7s
tune (c, e, f#): 5.7s

Table B.3: Table Containing Dan's Questionnaire Results

Name: Ellie
Instruments Played None
Years Played n/a
Favourite Wiinote Instrument (+ Why)
The IR Instrument was my favourite, it was fun to play around with!
Easiest Wiinote Instrument to Play (+ Why)
The IR Instrument was the easiest, after I found where the notes were it was quite easy to play them again!
Least Favourite Wiinote Instrument (+ Why)
The Pitch/Roll Instrument wasn’t nice to play, it was too awkward
Most Difficult Wiinote Instrument to Play (+ Why)
The Pitch/Roll Instrument as the notes were too close together, which made them difficult to get right.

Pitch/Roll Results
timed note (C): 2.0s
timed note (E): 1.8s
timed note (f#): 1.9s
tune (c, e, f#): 7.9s
Flute Results
timed note (C): 2.8s
timed note (E): 2.8s
timed note (f#): 2.2s
tune (c, e, f#): 6.0s
IR Results
timed note (C): 1.9s
timed note (E): 2.0s
timed note (f#): 2.1s
tune (c, e, f#): 6.4s

Table B.4: Table Containing Ellie's Questionnaire Results

Name: Liam
Instruments Played None
Years Played n/a
Favourite Wiinote Instrument (+ Why)
IR Instrument: It was quite fun using the drumsticks to play it.
Easiest Wiinote Instrument to Play (+ Why)
IR Instrument: It was easiest to figure out where different notes were
Least Favourite Wiinote Instrument (+ Why)
Flute Instrument: The Button Combinations made it too difficult to play
Most Difficult Wiinote Instrument to Play (+ Why)
Flute Instrument: Remembering all the button combinations was too much.

Pitch/Roll Results
timed note (C): 1.5s
timed note (E): 1.3s
timed note (f#): 1.5s
tune (c, e, f#): 6.2s
Flute Results
timed note (C): 3.0s
timed note (E): 3.4s
timed note (f#): 2.6s
tune (c, e, f#): 6.8s
IR Results
timed note (C): 1.7s
timed note (E): 1.8s
timed note (f#): 1.9s
tune (c, e, f#): 6.1s

Table B.5: Table Containing Liam's Questionnaire Results

Name: Liz
Instruments Played None
Years Played n/a
Favourite Wiinote Instrument (+ Why)
The IR Instrument was good! It was very different!
Easiest Wiinote Instrument to Play (+ Why)
The Flute Instrument was easiest to get the notes right with!
Least Favourite Wiinote Instrument (+ Why)
The Pitch/Roll Instrument wasn't much fun after the first couple of minutes!
Most Difficult Wiinote Instrument to Play (+ Why)
The Pitch/Roll Instrument had too many notes on it to make it easy to play.

Pitch/Roll Results
timed note (C): 2.1s
timed note (E): 1.9s
timed note (f#): 2.0s
tune (c, e, f#): 8.3s
Flute Results
timed note (C): 2.7s
timed note (E): 2.9s
timed note (f#): 2.1s
tune (c, e, f#): 6.4s
IR Results
timed note (C): 2.1s
timed note (E): 1.9s
timed note (f#): 2.0s
tune (c, e, f#): 6.5s


’C’ ’E’ ’F#’ Tune (C, E, F#)


Chris 1.5 1.3 1.2 5.9
Paul 1.3 1.2 1.1 5.5
Dan 1.3 1.2 1.4 6.1
Ellie 1.7 1.8 1.9 6.9
Liam 1.5 1.3 1.5 6.2
Liz 2.1 1.7 2.0 7.3
Total 9.4 8.5 9.1 37.9
Mean 1.6 1.4 1.5 6.3

Table B.6: Table combining the results of all Pitch/Roll tests.

’C’ ’E’ ’F#’ Tune (C, E, F#)


Chris 2.3 2.5 2.0 5.9
Paul 2.1 2.3 1.9 5.4
Dan 2.5 2.6 2.0 5.8
Ellie 2.8 2.8 2.2 6.0
Liam 3.0 3.4 2.6 6.8
Liz 2.7 2.9 2.1 6.4
Total 15.4 16.5 12.8 36.3
Mean 2.6 2.8 2.1 6.1

Table B.7: Table combining the results of all flute tests.

’C’ ’E’ ’F#’ Tune (C, E, F#)


Chris 1.6 1.5 1.7 5.5
Paul 1.3 1.5 1.6 5.0
Dan 1.6 1.6 1.7 5.7
Ellie 1.9 2.0 2.1 6.4
Liam 1.7 1.8 1.9 6.1
Liz 2.1 1.9 2.0 6.5
Total 10.2 10.3 11.0 35.2
Mean 1.7 1.7 1.8 5.9

Table B.8: Table combining the results of all IR tests.


Appendix C

A Final View of the System


Requirements


Functional Requirements

System Requirements
ID Description Complete?
FSy01 The system should present the user with a graphical user interface Yes
FSy02 The system should gather Wiimote data using WiiuseJ Yes
FSy03 The system should inform the user of the note playing Yes
FSy04 The system should inform the user of system developments Yes

Instrument Requirements
ID Description Complete?
FI01 The system must support multiple instruments Yes
FI02 Each instrument must be able to play notes A-G# Yes
FI03 Users should be able to add new instruments to the system No
FI04 The system should contain instruments based on all the Wiimote's input methods Yes

Gesture Requirements
ID Description Complete?
FG01 The system must have some method of identifying gestures Yes
FG02 The gesture system must be able to control system functionality Yes
FG03 The system must be able to learn new gestures Yes
FG04 Gestures should have a 'friendly' name available to the user Yes
FG05 The user should be able to assign actions to gestures Yes
FG06 The system should provide some interface for managing gestures Yes

Sound Requirements
ID Description Complete?
FSo01 The system should support MIDI Yes
FSo02 The system should be able to record the instruments playing No
FSo03 The system should allow the playing of both chords and single notes Yes

Non-Functional Requirements
ID Description Complete?
NF01 The system should be able to play sounds in real time Yes
NF02 The system must recognise gestures with at least 75% accuracy Partially

Table C.1: Table containing the final status of the system requirements.
Appendix D

Code

59
APPENDIX D. CODE
D.1 wiinote.engine.ListenerFlute.java ∗/
public c l a s s L i s t e n e r L e d s implements W i i m o t e L i s t e n e r {

@ O v er r ide
/∗ ∗ public void o n I r E v e n t ( IREvent a r g 0 ) {
∗ A L i s t e n e r f o r t h e LED b a s e d i n s t r u m e n t . T a k e s t h e B u t t o n IRSource [ ] pts = arg0 . getIRPoints ( ) ;
Data f r o m t h e
∗ Wiimote and p a s s e s i t t o t h e MWProcess o b j e c t . if ( p t s . l e n g t h == 2 ) {
∗ <p> W i i n o t e . p r o c e s s . LedsToMidi ( p t s [ 0 ] . getX ( ) , p t s [ 0 ] . getY ( ) ,
∗ Whilst the wiimote o b j e c t i s attached to t h i s the f l u t e pts [ 1 ]
instrument w i l l be . getX ( ) , p t s [ 1 ] . getY ( ) ) ;
∗ a c t i v e , Add t h i s t o t h e w i i m o t e s a c t i o n l i s t e n e r s t o u s e t h e }
flute }
∗ i n s t r u m e n t and r e m o v e i t t o s t o p u s i n g i t . }
∗ <p>
∗ 05−Apr −2009



@author
@version
Louis Fellows
1.0.0.0
D.3 wiinote.engine.ListenerPitchRoll.java
∗/
public c l a s s L i s t e n e r F l u t e implements W i i m o t e L i s t e n e r {
/∗ ∗
@ O v er r ide ∗ A L i s t e n e r f o r t h e LED b a s e d i n s t r u m e n t . T a k e s t h e
public void o n B u t t o n s E v e n t ( WiimoteButtonsEvent a r g 0 ) { Accellerometer
i f ( arg0 . isButtonAPressed ( ) ∗ Data f r o m t h e Wiimote and p a s s e s i t t o t h e MWProcess o b j e c t .
| | arg0 . isButtonBPressed ( ) ∗ <p>
| | arg0 . isButtonUpPressed ( ) ∗ Whilst the wiimote o b j e c t i s attached to t h i s the Pitch / Roll
| | arg0 . isButtonOnePressed ( ) instrument
| | arg0 . isButtonTwoPressed ( ) ∗ w i l l b e a c t i v e , Add t h i s t o t h e w i i m o t e s a c t i o n l i s t e n e r s t o
| | arg0 . isButtonAJustReleased ( ) use the
| | arg0 . isButtonBJustReleased ( ) ∗ P i t c h / R o l l i n s t r u m e n t and r e m o v e i t t o s t o p u s i n g i t .
| | arg0 . isButtonUpJustReleased ( ) ∗ <p>
| | arg0 . isButtonOneJustReleased ( ) ∗ 05−Apr −2009
| | arg0 . isButtonTwoJustReleased ( ) ) { ∗
∗ @author Louis Fellows
Wiinote . p r o c e s s . ButtonstoMidi ( arg0 . getButtonsHeld ( ) ) ; ∗ @version 1.0.0.0
} ∗/
}
} public c l a s s L i s t e n e r P i t c h R o l l implements W i i m o t e L i s t e n e r {

@ O v er r ide
public void o n M o t i o n S e n s i n g E v e n t ( M o t i o n S e n s i n g E v e n t a r g 0 ) {
D.2 wiinote.engine.ListenerLeds.java Orientation o = arg0 . g e t O r i e n t a t i o n ( ) ;
W i i n o t e . p r o c e s s . motionToMidi ( o . g e t P i t c h ( ) , o . g e t R o l l ( ) ) ;
}
}
/∗ ∗
∗ A L i s t e n e r f o r t h e LED b a s e d i n s t r u m e n t . T a k e s t h e LED Data


from t h e
Wiimote and p a s s e s i t t o t h e MWProcess o b j e c t .
D.4 wiinote.engine.MidiOut.java
∗ <p>
∗ W h i l s t t h e w i i m o t e o b j e c t i s a t t a c h e d t o t h i s t h e IR
instrument w i l l be /∗ ∗
∗ a c t i v e , Add t h i s t o t h e w i i m o t e s a c t i o n l i s t e n e r s t o u s e t h e ∗ C o n t a i n s a number o f MIDI h e l p e r f u n c t i o n s w h i c h a r e u s e d t o
IR i n s t r u m e n t connect
∗ and r e m o v e i t t o s t o p u s i n g i t . ∗ t o a MIDI D e v i c e and t o b u i l d and s e n d MIDI M e s s a g e s t o i t .
∗ <p> ∗ <p>
∗ 05−Apr −2009 ∗ The Aim o f t h i s c l a s s i s t o k e e p a l l MIDI f u n c t i o n s
∗ t o g e t h e r i n one
∗ @author Louis Fellows ∗ c l a s s t o i n c r e a s e r e−u s e and r e d u c e t h e w o r k i n a l t e r i n g t h e

60
∗ @version 1.0.0.0 MIDI
APPENDIX D. CODE
∗ functions of the Project }
∗ <p>
∗ 05−Apr −2009 public void h o l d ( i n t t i m e ) {
∗ try {
∗ @author Louis Fellows Thread . s l e e p ( t i m e ) ;
∗ @version 1.0.0.0 } catch ( I n t e r r u p t e d E x c e p t i o n e ) {
∗/
public c l a s s MidiOut { e . printStackTrace () ;
}
p r i v a t e M i d i D e v i c e md ; }

public MidiOut ( ) { public S t r i n g [ ] o u t p u t P o r t s ( ) {


md = n u l l ; M i d i D e v i c e . I n f o [ ] ms = MidiSystem . g e t M i d i D e v i c e I n f o ( ) ;
} A r r a y L i s t <S t r i n g > r e t u r n S t r i n g s = new A r r a y L i s t <S t r i n g >() ;

public void c o n n e c t T o P o r t ( S t r i n g portName ) throws f o r ( M i d i D e v i c e . I n f o i n f : ms ) {


MidiUnavailableException { r e t u r n S t r i n g s . add ( i n f . t o S t r i n g ( ) ) ;
i f (md != n u l l ) { }
i f (md . i s O p e n ( ) ) {
md . c l o s e ( ) ; String [ ] returnVals ;
} r e t u r n V a l s = new S t r i n g [ r e t u r n S t r i n g s . s i z e ( ) ] ;
}
for ( int i = 0 ; i < r e t u r n S t r i n g s . s i z e ( ) ; i ++) {
M i d i D e v i c e . I n f o [ ] ms = MidiSystem . g e t M i d i D e v i c e I n f o ( ) ; returnVals [ i ] = returnStrings . get ( i ) ;
md = n u l l ; }
f o r ( M i d i D e v i c e . I n f o i n f : ms ) {
i f ( i n f . t o S t r i n g ( ) . e q u a l s I g n o r e C a s e ( portName ) ) { return r e t u r n V a l s ;
try { }
md = MidiSystem . g e t M i d i D e v i c e ( i n f ) ;
md . open ( ) ; public void sendMSG ( S h o r t M e s s a g e myMsg) throws
return ; MidiPortNotSetException {
} catch ( M i d i U n a v a i l a b l e E x c e p t i o n e ) { Receiver rcvr = null ;
throw new M i d i U n a v a i l a b l e E x c e p t i o n ( ) ; i f (md != n u l l ) {
} try {
} r c v r = md . g e t R e c e i v e r ( ) ;
} } catch ( M i d i U n a v a i l a b l e E x c e p t i o n e ) {
throw new M i d i U n a v a i l a b l e E x c e p t i o n ( ) ; e . printStackTrace () ;
} }

public S h o r t M e s s a g e c r e a t e S h o r t M e s s a g e ( i n t command , i n t d1 , long timeStamp = −1;


i n t d2 )
throws I n v a l i d M i d i D a t a E x c e p t i o n { r c v r . s e n d ( myMsg , timeStamp ) ;
S h o r t M e s s a g e myMsg = new S h o r t M e s s a g e ( ) ; } else {
try { throw new M i d i P o r t N o t S e t E x c e p t i o n ( ” P o r t Not S e t ( S t i l l
myMsg . s e t M e s s a g e ( command , d1 , d2 ) ; Null ) ” ) ;
} catch ( I n v a l i d M i d i D a t a E x c e p t i o n e ) { }
throw e ; }
}
return myMsg ; public void t e s t S i g n a l ( ) {
} try {
sendMSG ( c r e a t e S h o r t M e s s a g e ( S h o r t M e s s a g e . NOTE ON, 0 , 60 ,
public S h o r t M e s s a g e c r e a t e S h o r t M e s s a g e ( i n t command , int 90) ) ;
c h a n n e l , i n t d1 , hold (200) ;
i n t d2 ) throws I n v a l i d M i d i D a t a E x c e p t i o n {
S h o r t M e s s a g e myMsg = new S h o r t M e s s a g e ( ) ; sendMSG ( c r e a t e S h o r t M e s s a g e ( S h o r t M e s s a g e . NOTE OFF, 0 , 6 0 ,
try { 90) ) ;
myMsg . s e t M e s s a g e ( command , c h a n n e l , d1 , d2 ) ; sendMSG ( c r e a t e S h o r t M e s s a g e ( S h o r t M e s s a g e . NOTE ON, 0 , 6 2 ,
} catch ( I n v a l i d M i d i D a t a E x c e p t i o n e ) { 90) ) ;
throw e ; hold (200) ;
}

61
return myMsg ; sendMSG ( c r e a t e S h o r t M e s s a g e ( S h o r t M e s s a g e . NOTE OFF, 0 , 62 ,
APPENDIX D. CODE
90) ) ; public int octave ;
sendMSG ( c r e a t e S h o r t M e s s a g e ( S h o r t M e s s a g e . NOTE ON, 0 , 64 , public int outType ;
90) ) ; public int playingNote ;
hold (200) ; public int volume = 9 0 ;

sendMSG ( c r e a t e S h o r t M e s s a g e ( S h o r t M e s s a g e . NOTE OFF, 0 , 64 , public MWProcess ( ) {


90) ) ; p l a y i n g N o t e = −1;
octave = 4;
} catch ( M i d i P o r t N o t S e t E x c e p t i o n e ) { newNote = −1;
e . printStackTrace () ; outType = 1 ;
} catch ( I n v a l i d M i d i D a t a E x c e p t i o n e ) { }
e . printStackTrace () ;
} /∗ ∗
∗ C o n v e r t s t h e B u t t o n s h e l d on t h e W i i r e m o t e i n t o a M i d i N o t e
} ∗
} ∗ @param b u t t o n s H e l d t h e b u t t o n s h e l d
∗/
public void B u t t o n s t o M i d i ( short b u t t o n s H e l d ) {
String binButtons = I n t e g e r . toBinaryString ( buttonsHeld ) ;
D.5 wiinote.engine.MWProcess.java b i n B u t t o n s = makeBinaryLength ( b i n B u t t o n s , 1 3 ) ;

/∗
∗ t he b i n a r y s t r i n g r e p r e s e n t i n g th e b u t t o n s i s as f o l l o w s :
/∗ ∗ ∗ +UDRL??−AB12 Where : U = Up D = Down L = L e f t R = R i g h t ?
∗ C o n t a i n s many f u n c t i o n s f o r c o n v e r t i n g d i f f e r e n t t y p e s i n = unknown
i n p u t i n t o MIDI ∗/
∗ Commands , A l l t h e F u n c t i o n s s e t t h e g l o b a l v a r i a b l e newNote
which i s the if ( binButtons . e q u a l s ( ” 0000000000000 ” ) ) {
∗ MIDI n o t e number t o p l a y and t h e n c a l l t h e f u n c t i o n p l a y ( ) newNote = −1;
which handles } else i f ( binButtons . e q u a l s ( ” 0000000000001 ” ) ) {
∗ t h e s e n d i n g o f MIDI m e s s a g e s u s i n g t h e M i d i O u t c l a s s . newNote = −1;
∗ <p> } else i f ( binButtons . e q u a l s ( ” 0000000000010 ” ) ) {
∗ Any new c o d e f o r i n s t r u m e n t s s h o u l d b e a d d e d h e r e and u s e t h e newNote = ( ( o c t a v e ) ∗ 1 2 ) + NOTE A ;
play c l a s s to } else i f ( binButtons . e q u a l s ( ” 0000000000011 ” ) ) {
∗ h a n d l e MIDI m e s s a g e s . The S e n d i n g o f a Volume M e s s a g e i n t h e newNote = ( ( o c t a v e ) ∗ 1 2 ) + NOTE ASHP ;
IR i n s t r u m e n t } else i f ( binButtons . e q u a l s ( ” 0000000001000 ” ) ) {
∗ s h o u l d b e moved t o p l a y ( ) e v e n t u a l l y t o o ! newNote = ( ( o c t a v e ) ∗ 1 2 ) + NOTE B ;
∗ <p> } else i f ( binButtons . e q u a l s ( ” 0000000001001 ” ) ) {
∗ @see M i d i O u t newNote = ( ( o c t a v e ) ∗ 1 2 ) + NOTE B ;
∗ <p> } else i f ( binButtons . e q u a l s ( ” 0000000001010 ” ) ) {
∗ 05−Apr −2009 newNote = ( ( o c t a v e ) ∗ 1 2 ) + NOTE C ;
∗ } else i f ( binButtons . e q u a l s ( ” 0000000001011 ” ) ) {
∗ @author Louis Fellows newNote = ( ( o c t a v e ) ∗ 1 2 ) + NOTE CSHP ;
∗ @version 1.0.0.0 } else i f ( binButtons . e q u a l s ( ” 0000000000100 ” ) ) {
∗/ newNote = ( ( o c t a v e ) ∗ 1 2 ) + NOTE D ;
public c l a s s MWProcess { } else i f ( binButtons . e q u a l s ( ” 0000000000101 ” ) ) {
newNote = ( ( o c t a v e ) ∗ 1 2 ) + NOTE DSHP ;
public static final int NOTE A = 9 ; } else i f ( binButtons . e q u a l s ( ” 0000000000110 ” ) ) {
public static final int NOTE ASHP = 10; newNote = ( ( o c t a v e ) ∗ 1 2 ) + NOTE E ;
public static final int NOTE B = 1 1 ; } else i f ( binButtons . e q u a l s ( ” 0000000000111 ” ) ) {
public static final int NOTE C = 1 2 ; newNote = ( ( o c t a v e ) ∗ 1 2 ) + NOTE E ;
public static final int NOTE CSHP = 13; } else i f ( binButtons . e q u a l s ( ” 0000000001100 ” ) ) {
public static final int NOTE D = 1 4 ; newNote = ( ( o c t a v e ) ∗ 1 2 ) + NOTE F ;
public static final int NOTE DSHP = 15; } else i f ( binButtons . e q u a l s ( ” 0000000001101 ” ) ) {
public static final int NOTE E = 1 6 ; newNote = ( ( o c t a v e ) ∗ 1 2 ) + NOTE FSHP ;
public static final int NOTE F = 1 7 ; } else i f ( binButtons . e q u a l s ( ” 0000000001110 ” ) ) {
public static final int NOTE FSHP = 18; newNote = ( ( o c t a v e ) ∗ 1 2 ) + NOTE G ;
public static final int NOTE G = 1 9 ; } else i f ( binButtons . e q u a l s ( ” 0000000001111 ” ) ) {
public static final int NOTE GSHP = 20; newNote = ( ( o c t a v e ) ∗ 1 2 ) + NOTE GSHP ;
} else i f ( binButtons . e q u a l s ( ” 0100000000000 ” ) ) {

62
public i n t newNote ; newNote = −1;
APPENDIX D. CODE
} else i f ( binButtons . e q u a l s ( ” 0100000000001 ” ) ) { return ”B” ;
newNote = −1; }
} else i f ( binButtons . e q u a l s ( ” 0100000000010 ” ) ) { return ” ” ;
newNote = ( ( o c t a v e + 1 ) ∗ 1 2 ) + NOTE A ; }
} else i f ( binButtons . e q u a l s ( ” 0100000000011 ” ) ) {
newNote = ( ( o c t a v e + 1 ) ∗ 1 2 ) + NOTE ASHP ; public i n t g e t O c t a v e ( ) {
} else i f ( binButtons . e q u a l s ( ” 0100000001000 ” ) ) { return o c t a v e ;
newNote = ( ( o c t a v e + 1 ) ∗ 1 2 ) + NOTE B ; }
} else i f ( binButtons . e q u a l s ( ” 0100000001001 ” ) ) {
newNote = ( ( o c t a v e + 1 ) ∗ 1 2 ) + NOTE B ; public i n t getOutType ( ) {
} else i f ( binButtons . e q u a l s ( ” 0100000001010 ” ) ) { return outType ;
newNote = ( ( o c t a v e + 1 ) ∗ 1 2 ) + NOTE C ; }
} else i f ( binButtons . e q u a l s ( ” 0100000001011 ” ) ) {
newNote = ( ( o c t a v e + 1 ) ∗ 1 2 ) + NOTE CSHP ; public i n t g e t P l a y i n g N o t e ( ) {
} else i f ( binButtons . e q u a l s ( ” 0100000000100 ” ) ) { return p l a y i n g N o t e ;
newNote = ( ( o c t a v e + 1 ) ∗ 1 2 ) + NOTE D ; }
} else i f ( binButtons . e q u a l s ( ” 0100000000101 ” ) ) {
newNote = ( ( o c t a v e + 1 ) ∗ 1 2 ) + NOTE DSHP ; public i n t getPlusNumberFromNoteNumber ( i n t n o t e ) {
} else i f ( binButtons . e q u a l s ( ” 0100000000110 ” ) ) { i f ( n o t e % 12 == 1 1 ) {
newNote = ( ( o c t a v e + 1 ) ∗ 1 2 ) + NOTE E ; return 2 ;
} else i f ( binButtons . e q u a l s ( ” 0100000000111 ” ) ) { }
newNote = ( ( o c t a v e + 1 ) ∗ 1 2 ) + NOTE E ; i f ( n o t e % 12 == 4 ) {
} else i f ( binButtons . e q u a l s ( ” 0100000001100 ” ) ) { return 2 ;
newNote = ( ( o c t a v e + 1 ) ∗ 1 2 ) + NOTE F ; }
} else i f ( binButtons . e q u a l s ( ” 0100000001101 ” ) ) { return 3 ;
newNote = ( ( o c t a v e + 1 ) ∗ 1 2 ) + NOTE FSHP ; }
} else i f ( binButtons . e q u a l s ( ” 0100000001110 ” ) ) {
newNote = ( ( o c t a v e + 1 ) ∗ 1 2 ) + NOTE G ; public void IRToMidi ( i n t x , i n t y ) {
} else i f ( binButtons . e q u a l s ( ” 0100000001111 ” ) ) { i n t Note = ( x / 800 ∗ 1 0 0 ) ;
newNote = ( ( o c t a v e + 1 ) ∗ 1 2 ) + NOTE GSHP ; Note = Note ∗ 1 2 7 ;
} else { }
}
public void LedsToMidi ( i n t x1 , i n t y1 , i n t x2 , i n t y2 ) {
play () ; x1 = 1023 − x1 ;
} x2 = 1023 − x2 ;

public S t r i n g g e t M i d i N o t e L e t t e r ( i n t n o t e ) { // make s u r e 1 i s the d o t on t h e left !


switch ( n o t e % 1 2 ) { i f ( x1 > x2 ) {
case ( 0 ) : i n t temp = 0 ;
return ”C” ; temp = x1 ;
case ( 1 ) : x1 = x2 ;
return ”C#” ; x2 = temp ;
case ( 2 ) :
return ”D” ; temp = y1 ;
case ( 3 ) : y1 = y2 ;
return ”D#” ; y2 = temp ;
case ( 4 ) : }
return ”E” ;
case ( 5 ) : int locOctave = octave ;
return ”F” ;
case ( 6 ) : if ( y1 < 4 0 0 ) {
return ”F#” ; locOctave = octave − 1;
case ( 7 ) : i f ( locOctave < 0) {
return ”G” ; locOctave = 0;
case ( 8 ) : }
return ”G#” ; } e l s e i f ( y1 > 6 2 3 ) {
case ( 9 ) : locOctave = octave + 1;
return ”A” ; i f ( locOctave > 7) {
case ( 1 0 ) : locOctave = 7;
return ”A#” ; }

63
case ( 1 1 ) : }
APPENDIX D. CODE
i n t newvolume = 0 ; play () ;
i f ( y2 >= 6 0 0 ) { }
newvolume = 1 0 0 ;
} e l s e i f ( y2 >= 500 && y2 < 6 0 0 ) { /∗ ∗
newvolume = 9 0 ; ∗ I n c r e a s e s a B i n a r y Number t o t h e r i g h t l e n g t h by adding
} e l s e i f ( y2 >= 400 && y2 < 5 0 0 ) { zeros to
newvolume = 7 5 ; ∗ the front of i t .
} e l s e i f ( y2 >= 300 && y2 < 4 0 0 ) { ∗
newvolume = 6 0 ; ∗ @param b i n a r y S t r t h e b i n a r y s t r
} e l s e i f ( y2 >= 200 && y2 < 3 0 0 ) { ∗ @param Len t h e l e n
newvolume = 4 5 ; ∗
} e l s e i f ( y2 >= 100 && y2 < 2 0 0 ) { ∗ @return t h e s t r i n g
newvolume = 3 0 ; ∗/
} e l s e i f ( y2 >= 000 && y2 < 1 0 0 ) { public S t r i n g makeBinaryLength ( S t r i n g b i n a r y S t r , i n t Len ) {
newvolume = 0 ; int currLen = binaryStr . length ( ) ;
} i n t toAdd = Len − c u r r L e n ;
S t r i n g tempStr = ” ” ;
if ( newvolume != volume ) { f o r ( i n t i = 0 ; i < toAdd ; i ++) {
volume = newvolume ; tempStr += ” 0 ” ;
try { }
ShortMessage v o l = Wiinote . midiout . createShortMessage ( tempStr = tempStr + b i n a r y S t r ;
S h o r t M e s s a g e .CONTROL CHANGE, 0 x07 , volume ) ; return tempStr ;
W i i n o t e . m i d i o u t . sendMSG ( v o l ) ; }
} catch ( I n v a l i d M i d i D a t a E x c e p t i o n e ) {
e . printStackTrace () ; public void motionToMidi ( f l o a t pitch , float roll ) {
} catch ( M i d i P o r t N o t S e t E x c e p t i o n e ) {
e . printStackTrace () ; newNote = −1; // s i l e n c e
} i f ( p i t c h > −45 && p i t c h < 4 5 ) {
} i f ( r o l l < −90) {
// S i l e n c e
i n t d i s t = x2 − x1 ; } e l s e i f ( r o l l < −70 && r o l l > −90) {
newNote = −1; newNote = ( o c t a v e ∗ 1 2 ) + NOTE G ;
} e l s e i f ( r o l l < −50 && r o l l > −70) {
if ( d i s t > 1000) { newNote = ( o c t a v e ∗ 1 2 ) + NOTE F ;
} e l s e i f ( r o l l < −30 && r o l l > −50) {
} else i f ( d i s t > 825) { newNote = ( o c t a v e ∗ 1 2 ) + NOTE E ;
newNote = ( l o c O c t a v e ∗ 1 2 ) + NOTE A ; } e l s e i f ( r o l l < −10 && r o l l > −30) {
} else i f ( d i s t > 750) { newNote = ( o c t a v e ∗ 1 2 ) + NOTE D ;
newNote = ( l o c O c t a v e ∗ 1 2 ) + NOTE ASHP ; } e l s e i f ( r o l l < 10 && r o l l > −10) {
} else i f ( d i s t > 675) { // S i l e n c e
newNote = ( l o c O c t a v e ∗ 1 2 ) + NOTE B ; newNote = ( o c t a v e ∗ 1 2 ) + NOTE C ;
} else i f ( d i s t > 600) { } e l s e i f ( r o l l < 30 && r o l l > 1 0 ) {
newNote = ( l o c O c t a v e ∗ 1 2 ) + NOTE C ; newNote = ( o c t a v e ∗ 1 2 ) + NOTE B ;
} else i f ( d i s t > 525) { } e l s e i f ( r o l l < 50 && r o l l > 3 0 ) {
newNote = ( l o c O c t a v e ∗ 1 2 ) + NOTE CSHP ; newNote = ( o c t a v e ∗ 1 2 ) + NOTE A ;
} else i f ( d i s t > 450) { } e l s e i f ( r o l l < 70 && r o l l > 5 0 ) {
newNote = ( l o c O c t a v e ∗ 1 2 ) + NOTE D ; // S i l e n c e
} else i f ( d i s t > 375) { } e l s e i f ( r o l l < 90 && r o l l > 7 0 ) {
newNote = ( l o c O c t a v e ∗ 1 2 ) + NOTE DSHP ; // S i l e n c e
        } else if (dist > 300) {
            newNote = (locOctave * 12) + NOTE_E;
        } else if (dist > 225) {
            newNote = (locOctave * 12) + NOTE_F;
        } else if (dist > 150) {
            newNote = (locOctave * 12) + NOTE_FSHP;
        } else if (dist > 75) {
            newNote = (locOctave * 12) + NOTE_G;
        } else if (dist > 0) {
            newNote = (locOctave * 12) + NOTE_GSHP;
        } else if (roll > 90) {
            // Silence
        }

        play();
    }

    public void play() {
        if (outType == 1) {
            playSingle();
        } else if (outType == 2) {
            playChord();
        }
    }

    public void playChord() {
        if (newNote != playingNote) {
            if (playingNote != -1) {
                ShortMessage off = null;
                try {
                    off = Wiinote.midiout.createShortMessage(
                            ShortMessage.NOTE_OFF, 0, playingNote, 90);
                    Wiinote.midiout.sendMSG(off);
                    off = Wiinote.midiout.createShortMessage(
                            ShortMessage.NOTE_OFF, 0, playingNote - 4, 90);
                    Wiinote.midiout.sendMSG(off);
                    off = Wiinote.midiout.createShortMessage(
                            ShortMessage.NOTE_OFF, 0, playingNote
                                    + getPlusNumberFromNoteNumber(playingNote), 90);
                    Wiinote.midiout.sendMSG(off);
                    Wiinote.gui.noteWindow.setNoteLetter("");
                    Wiinote.gui.noteWindow.setNoteOctave("");
                } catch (InvalidMidiDataException e1) {
                    Wiinote.gui.msgWindow.newMessage(
                            "Midi Message contains Invalid Data", 3);
                } catch (MidiPortNotSetException e) {
                    Wiinote.gui.msgWindow.newMessage("Midi Port Not Set", 3);
                }
            }
            if (newNote != -1) {
                ShortMessage on = null;
                try {
                    on = Wiinote.midiout.createShortMessage(
                            ShortMessage.NOTE_ON, 0, newNote, 90);
                    Wiinote.midiout.sendMSG(on);
                    on = Wiinote.midiout.createShortMessage(
                            ShortMessage.NOTE_ON, 0, newNote - 4, 90);
                    Wiinote.midiout.sendMSG(on);
                    on = Wiinote.midiout.createShortMessage(
                            ShortMessage.NOTE_ON, 0, newNote
                                    + getPlusNumberFromNoteNumber(newNote), 90);
                    Wiinote.midiout.sendMSG(on);
                    Wiinote.gui.noteWindow
                            .setNoteLetter(getMidiNoteLetter(newNote));
                    Wiinote.gui.noteWindow.setNoteOctave(Integer
                            .toString((newNote / 12)));
                } catch (InvalidMidiDataException e1) {
                    e1.printStackTrace();
                } catch (MidiPortNotSetException e) {
                    Wiinote.gui.msgWindow.newMessage("Midi Port Not Set", 3);
                }
            }
            playingNote = newNote;
        }
    }

    public void playSingle() {
        if (newNote != playingNote) {
            if (playingNote != -1) {
                ShortMessage off = null;
                try {
                    off = Wiinote.midiout.createShortMessage(
                            ShortMessage.NOTE_OFF, 0, playingNote, 90);
                    Wiinote.midiout.sendMSG(off);
                    Wiinote.gui.noteWindow.setNoteLetter("");
                    Wiinote.gui.noteWindow.setNoteOctave("");
                } catch (InvalidMidiDataException e1) {
                    Wiinote.gui.msgWindow.newMessage(
                            "Midi Message contains Invalid Data", 3);
                } catch (MidiPortNotSetException e) {
                    Wiinote.gui.msgWindow.newMessage("Midi Port Not Set", 3);
                }
            }
            if (newNote != -1) {
                ShortMessage on = null;
                try {
                    on = Wiinote.midiout.createShortMessage(
                            ShortMessage.NOTE_ON, 0, newNote, 90);
                    Wiinote.midiout.sendMSG(on);
                    Wiinote.gui.noteWindow
                            .setNoteLetter(getMidiNoteLetter(newNote));
                    Wiinote.gui.noteWindow.setNoteOctave(Integer
                            .toString((newNote / 12)));
                } catch (InvalidMidiDataException e1) {
                    e1.printStackTrace();
                } catch (MidiPortNotSetException e) {
                    Wiinote.gui.msgWindow.newMessage("Midi Port Not Set", 3);
                }
            }
            playingNote = newNote;
        }
    }

    public void setOctave(int octave) {
        this.octave = octave;
    }

    public void setOutType(int outType) {
        this.outType = outType;
    }

    public void setPlayingNote(int playingNote) {
        this.playingNote = playingNote;
    }
}
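The note arithmetic above relies on the standard MIDI convention that a note number is twelve times the octave plus a semitone offset (the NOTE_ constants). The following is a minimal illustrative sketch, not part of the original listing; the offset values used here are assumptions standing in for the NOTE_ constants:

public class NoteNumberDemo {
    // Assumed semitone offset standing in for the NOTE_FSHP constant above.
    static final int NOTE_FSHP = 6;
    static final String[] LETTERS = { "C", "C#", "D", "D#", "E", "F", "F#",
            "G", "G#", "A", "A#", "B" };

    public static void main(String[] args) {
        int locOctave = 5;
        int newNote = (locOctave * 12) + NOTE_FSHP; // 66
        // playSingle() above recovers the octave the same way: newNote / 12.
        System.out.println(LETTERS[newNote % 12] + newNote / 12); // prints F#5
    }
}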
D.6 wiinote.engine.Wiinote.java

/**
 * This class starts up the Wiinote system. It loads the objects that
 * comprise the four main sections of the system: the Processes, the Midi
 * system, the gesture recognition system and the GUI. This then starts
 * the system.
 * <p>
 * 05-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class Wiinote {

    public static GestureRecogniser gestureRecog;
    public static Gui gui;
    public static MidiOut midiout;
    public static MWProcess process;

    public static void main(String[] args) {
        gui = new Gui();
        midiout = new MidiOut();
        process = new MWProcess();
        gestureRecog = new GestureRecogniser();
        gestureRecog.addEventListeners(new ListenerGestureCapture());
    }
}

D.7 wiinote.gesture.AccDirectionObject.java

/**
 * Holds the X, Y acceleration of a sample from the Wiimote. From this it
 * decides upon the direction of the sample (as one of 9 directions, known
 * in the class as North, North-East, East, South-East, South, South-West,
 * West, North-West, and a state of no-direction known as 'HOLD').
 * <p>
 * Note that the Y direction is actually the Z direction from the Wiimote;
 * as such, the references in this class are to Z and not Y.
 * <p>
 * This class could be extended to include Y also, with more possible
 * directions (15 or 27?). The rest of the wiinote.gesture package would
 * have to be similarly updated.
 * <p>
 * 05-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class AccDirectionObject {

    public static final float ACC_THRESHOLD_X = (float) 0.3;
    public static final float ACC_THRESHOLD_Z = (float) 0.3;

    public static final int DIR_HOLD = 0;
    public static final int DIR_N = 1;
    public static final int DIR_NE = 2;
    public static final int DIR_E = 3;
    public static final int DIR_SE = 4;
    public static final int DIR_S = 5;
    public static final int DIR_SW = 6;
    public static final int DIR_W = 7;
    public static final int DIR_NW = 8;

    /**
     * Int to direction str.
     *
     * @param intDir the int dir
     * @return the string
     */
    public static String intToDirectionStr(int intDir) {
        if (intDir == DIR_HOLD) {
            return "O";
        } else if (intDir == DIR_N) {
            return "N";
        } else if (intDir == DIR_NE) {
            return "NE";
        } else if (intDir == DIR_E) {
            return "E";
        } else if (intDir == DIR_SE) {
            return "SE";
        } else if (intDir == DIR_S) {
            return "S";
        } else if (intDir == DIR_SW) {
            return "SW";
        } else if (intDir == DIR_W) {
            return "W";
        } else if (intDir == DIR_NW) {
            return "NW";
        }
        return "";
    }

    public int direction;
    public MotionSensingEvent motion;

    public AccDirectionObject(MotionSensingEvent motion) {
        this.motion = motion;
        direction = learnDirection(motion);
    }

    public int getDirection() {
        return direction;
    }

    public String getDirectionString() {
        return AccDirectionObject.intToDirectionStr(direction);
    }

    private int learnDirection(MotionSensingEvent m) {
        GForce gf = m.getGforce();
        float x = gf.getX(); // horizontal
        float z = gf.getZ(); // vertical
        z = (float) (z - 0.2);
        int returnInt = -1;

        if (x > -ACC_THRESHOLD_X && x < ACC_THRESHOLD_X
                && z > ACC_THRESHOLD_Z) { // North!
            returnInt = DIR_N;
        } else if (x > ACC_THRESHOLD_X && z > ACC_THRESHOLD_Z) { // NE
            returnInt = DIR_NE;
        } else if (x > ACC_THRESHOLD_X && z > -ACC_THRESHOLD_Z
                && z < ACC_THRESHOLD_Z) { // East!
            returnInt = DIR_E;
        } else if (x > ACC_THRESHOLD_X && z < -ACC_THRESHOLD_Z) { // SE
            returnInt = DIR_SE;
        } else if (x > -ACC_THRESHOLD_X && x < ACC_THRESHOLD_X
                && z < -ACC_THRESHOLD_Z) { // South
            returnInt = DIR_S;
        } else if (x < -ACC_THRESHOLD_X && z < -ACC_THRESHOLD_Z) { // SW
            returnInt = DIR_SW;
        } else if (x < -ACC_THRESHOLD_X && z > -ACC_THRESHOLD_Z
                && z < ACC_THRESHOLD_Z) { // West!
            returnInt = DIR_W;
        } else if (x < -ACC_THRESHOLD_X && z > ACC_THRESHOLD_Z) { // NW
            returnInt = DIR_NW;
        } else if (x > -ACC_THRESHOLD_X && x < ACC_THRESHOLD_X
                && z > -ACC_THRESHOLD_Z && z < ACC_THRESHOLD_Z) { // Hold!
            returnInt = DIR_HOLD;
        }
        return returnInt;
    }
}
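The nine-way if-chain in learnDirection() is equivalent to thresholding x and z independently and combining the two results. The sketch below is a condensed re-statement of that rule, not part of the original listing; boundary samples that learnDirection() reports as -1 are folded into the nearest region here:

public class DirectionDemo {
    static final float T = 0.3f; // ACC_THRESHOLD_X and ACC_THRESHOLD_Z above

    static String classify(float x, float z) {
        z = z - 0.2f; // the same gravity offset applied in learnDirection()
        String ns = (z > T) ? "N" : (z < -T) ? "S" : "";
        String ew = (x > T) ? "E" : (x < -T) ? "W" : "";
        return (ns + ew).isEmpty() ? "O" : ns + ew; // "O" = HOLD
    }

    public static void main(String[] args) {
        System.out.println(classify(0.0f, 1.0f));   // N
        System.out.println(classify(0.6f, 0.2f));   // E (0.2 - 0.2 = 0.0)
        System.out.println(classify(-0.6f, -0.6f)); // SW
    }
}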
D.8 wiinote.gesture.AccelerationArray.java

/**
 * The AccelerationArray object gathers together a number of
 * AccDirectionObjects to provide a sampling of a movement of the Wiimote.
 * There are also functions for manipulating this data, such as the
 * 'removeLikeMotions' function which is required as part of the
 * gesture recognition process.
 * <p>
 * @see AccDirectionObject
 * <p>
 * 05-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class AccelerationArray {

    public ArrayList<AccDirectionObject> motionArray;

    public AccelerationArray() {
        motionArray = new ArrayList<AccDirectionObject>();
    }

    public void addMotionEvent(MotionSensingEvent m) {
        motionArray.add(new AccDirectionObject(m));
    }

    public void emptyArray() {
        motionArray = new ArrayList<AccDirectionObject>();
    }

    /**
     * Moves through the array and removes any samples that are the same
     * direction as the previous sample, condensing the samples down into
     * a list of the different vectors of the gesture.
     *
     * It also removes any erroneous samples (recognised by the
     * direction -1).
     */
    public void removeLikeMotions() {
        ArrayList<AccDirectionObject> replaceArray =
                new ArrayList<AccDirectionObject>();
        ListIterator<AccDirectionObject> li = motionArray.listIterator();

        AccDirectionObject dir = li.next();
        while (dir.getDirection() == -1) {
            dir = li.next();
        }
        replaceArray.add(dir);

        while (li.hasNext()) {
            int lastIndex = replaceArray.size() - 1;
            dir = li.next();
            if (dir.getDirection() != replaceArray.get(lastIndex)
                    .getDirection()) {
                if (dir.getDirection() != -1) {
                    replaceArray.add(dir);
                }
            }
        }

        motionArray = replaceArray;
    }

    public int[] toIntArray() {
        int[] ints = new int[motionArray.size()];
        int count = 0;
        for (AccDirectionObject a : motionArray) {
            ints[count] = a.getDirection();
            count++;
        }
        return ints;
    }

    public String toString() {
        String str = new String();
        for (AccDirectionObject a : motionArray) {
            str = str + a.getDirectionString() + " ";
        }
        return str;
    }
}
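To make the effect of removeLikeMotions() concrete: a run of raw samples such as N N N E E E S S condenses to the vector sequence N E S, with any -1 samples dropped. The following is a minimal re-implementation over plain direction ints, not part of the original listing:

import java.util.ArrayList;
import java.util.List;

public class CondenseDemo {
    static List<Integer> condense(int[] samples) {
        List<Integer> out = new ArrayList<Integer>();
        for (int s : samples) {
            if (s == -1) {
                continue; // erroneous sample, as in removeLikeMotions()
            }
            if (out.isEmpty() || out.get(out.size() - 1) != s) {
                out.add(s); // keep only changes of direction
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // DIR_N = 1, DIR_E = 3, DIR_S = 5 (see AccDirectionObject above)
        int[] raw = { 1, 1, 1, 3, 3, -1, 3, 5, 5 };
        System.out.println(condense(raw)); // [1, 3, 5]
    }
}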
D.9 wiinote.gesture.ConvArray.java

/**
 * Holds an array of GestureObjects with functions to search through them.
 * <p>
 * 05-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class ConvArray implements Serializable {

    /** The Constant serialVersionUID. */
    private static final long serialVersionUID = 1849777868583837139L;

    /** The conv array. */
    public ArrayList<GestureObject> convArray;

    /**
     * Instantiates a new conv array.
     */
    public ConvArray() {
        convArray = new ArrayList<GestureObject>();
    }

    /**
     * Adds the given GestureObject.
     *
     * @param ob the GestureObject to add
     */
    public void add(GestureObject ob) {
        convArray.add(ob);
    }

    /**
     * Find gesture object.
     *
     * @param id the id
     * @return the gesture object
     */
    public GestureObject findGestureObject(int id) {
        for (GestureObject ob : convArray) {
            if (ob.getGestureID() == id) {
                return ob;
            }
        }
        return null;
    }

    /**
     * Find gesture object.
     *
     * @param id the id
     * @return the gesture object
     */
    public GestureObject findGestureObject(String id) {
        for (GestureObject ob : convArray) {
            if (ob.getGestureName().equals(id)) { // compare names by value
                return ob;
            }
        }
        return null;
    }

    /**
     * Gets the GestureObject at the given position.
     *
     * @param pos the pos
     * @return the gesture object
     */
    public GestureObject get(int pos) {
        return convArray.get(pos);
    }

    /**
     * Length of the conv array.
     *
     * @return the int
     */
    public int lenConvArray() {
        return convArray.size();
    }

    /**
     * List names.
     *
     * @return the string[]
     */
    public String[] listNames() {
        ArrayList<String> names = new ArrayList<String>();

        if (convArray.size() == 0) {
            String[] returnVals;
            returnVals = new String[0];
            return returnVals;
        }

        for (GestureObject go : convArray) {
            names.add(go.getGestureName());
        }

        String[] returnVals;
        returnVals = new String[names.size()];

        for (int i = 0; i < names.size(); i++) {
            returnVals[i] = names.get(i);
        }

        return returnVals;
    }

    /**
     * Sets the GestureObject at the given position.
     *
     * @param pos the pos
     * @param x the x
     */
    public void set(int pos, GestureObject x) {
        convArray.set(pos, x);
    }

    /**
     * Sets the gesture response.
     *
     * @param str the str
     * @param resp the resp
     * @return the int
     */
    public int setGestureResponse(String str, int resp) {
        for (GestureObject ob : convArray) {
            if (ob.getGestureName().equals(str)) { // compare names by value
                ob.setGestureResult(resp);
                return 1;
            }
        }
        return 0;
    }
}
D.10 wiinote.gesture.GestureObject.java

/**
 * A GestureObject holds a unique gesture ID along with the 'friendly'
 * name of the gesture and the action that should occur when the gesture
 * happens.
 * <p>
 * 05-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class GestureObject implements Serializable {

    private static final long
            serialVersionUID = -46905671347497801775L;
    private int gestureID;
    private String gestureName;
    private int gestureResult;

    /**
     * Instantiates a new gesture object.
     *
     * @param gestureName the gesture name
     * @param gestureID the gesture id
     * @param gestureResult the gesture result
     */
    public GestureObject(String gestureName, int gestureID,
            int gestureResult) {
        super();
        this.gestureName = gestureName;
        this.gestureID = gestureID;
        this.gestureResult = gestureResult;
    }

    public int getGestureID() {
        return gestureID;
    }

    public String getGestureName() {
        return gestureName;
    }

    public int getGestureResult() {
        return gestureResult;
    }

    public void setGestureID(int gestureID) {
        this.gestureID = gestureID;
    }

    public void setGestureName(String gestureName) {
        this.gestureName = gestureName;
    }

    public void setGestureResult(int gestureResult) {
        this.gestureResult = gestureResult;
    }
}

D.11 wiinote.gesture.GestureRecognisedEvent.java

/**
 * Thrown by the GestureRecogniser when a gesture has occurred.
 * <p>
 * 05-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class GestureRecognisedEvent extends ActionEvent {

    private static final long
            serialVersionUID = -7617671388195703967L;
    private int gestureFound;

    public GestureRecognisedEvent(Object arg0, int arg1, String arg2,
            int gesture) {
        super(arg0, arg1, arg2);
        gestureFound = gesture;
    }

    public int getGestureFound() {
        return gestureFound;
    }

    public void setGestureFound(int gestureFound) {
        this.gestureFound = gestureFound;
    }
}
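Taken together, ConvArray and GestureObject form the lookup table from a recognised gesture to its configured action. A minimal usage sketch with made-up values, not part of the original listings:

ConvArray actions = new ConvArray();
actions.add(new GestureObject("octave up", 0, 1)); // name, gesture ID, action
actions.add(new GestureObject("quit", 1, 7));

GestureObject hit = actions.findGestureObject("quit");
if (hit != null) {
    int action = hit.getGestureResult(); // 7, i.e. ACTION_EXIT in D.13 below
}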

D.12 wiinote.gesture.GestureRecogniser.java

/**
 * Handles the recognition of gestures from the Wiimote. Puts each of the
 * samples into an AccDirectionObject and stores them all in an
 * AccelerationArray. When all samples are collected it runs the directions
 * through the path tree and attempts to discover the gesture. It then
 * throws its results as a GestureRecognisedEvent to all GestureListeners
 * listening to this object.
 * <p>
 * It also contains the ConvArray object which contains the information
 * related to what to do when a gesture is recognised; this information
 * can be called by another class when it is needed.
 * <p>
 * @see AccDirectionObject
 * @see AccelerationArray
 * @see ConvArray
 * @see GestureListener
 * @see PathObject
 * <p>
 * 05-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class GestureRecogniser {

    public AccelerationArray AccArr;
    public int addEndNum;
    public boolean addNewEnd = false;
    public boolean cOn;
    public ConvArray convArray;
    public EventListenerList listeners = new EventListenerList();
    public int numAdds;
    public PathObject path;

    public GestureRecogniser() {
        loadPaths();
        AccArr = new AccelerationArray();
        cOn = false;
    }

    public void addEventListeners(GestureListener listener) {
        listeners.add(GestureListener.class, listener);
    }

    private void addGestures() {
        cOn = false;
        AccArr.removeLikeMotions();
        path.createPath(addEndNum, AccArr.toIntArray());

        AccArr.emptyArray();
        numAdds--;

        if (numAdds <= 0) {
            addNewEnd = false;
            Wiinote.gui.msgWindow.newMessage("New Gesture Capture"
                    + " Complete", 1);
            savePaths();
        } else {
            Wiinote.gui.msgWindow.newMessage(numAdds + " more to go!", 2);
        }
    }

    public void cPressed() {
        cOn = true;
    }

    public void cReleased() {
        if (addNewEnd == false) {
            searchGestures();
        } else {
            addGestures();
        }
    }

    public EventListener[] getEventListeners() {
        return listeners.getListeners(GestureListener.class);
    }

    public void loadPaths() {
        OutputObject oo = new OutputObject();
        path = oo.inputPath();
        convArray = oo.inputConvArray();
    }

    public void newMotion(MotionSensingEvent e) {
        if (cOn == true) {
            AccArr.addMotionEvent(e);
        }
    }

    private void notifyEventListeners(GestureRecognisedEvent evt) {
        for (EventListener listener : getEventListeners()) {
            ((GestureListener) listener).onGestureEvent(evt);
        }
    }

    public void removeEventListeners(GestureListener listener) {
        listeners.remove(GestureListener.class, listener);
    }

    public void resetGestures(int level) {
        if (level == 1) {
            path = new PathObject(-2);
            savePaths();
            Wiinote.gui.msgWindow.newMessage("Gesture System Reset", 2);
        } else if (level == 2) {
            convArray = new ConvArray();
            path = new PathObject(-2);
            savePaths();
            Wiinote.gui.msgWindow.newMessage("Gesture Tree Reset", 2);
        }
    }

    public void savePaths() {
        OutputObject oo = new OutputObject();
        oo.outputPath(path);
        oo.outputConvArray(convArray);
    }

    private void searchGestures() {
        cOn = false;
        AccArr.removeLikeMotions();
        int motion = path.followPath(AccArr.toIntArray());

        notifyEventListeners(new GestureRecognisedEvent(this, motion,
                Integer.toString(motion), motion));

        AccArr.emptyArray();
    }

    public int setToAdd(int nnumAdds, String newName) {
        GestureObject gest = convArray.findGestureObject(newName);
        if (gest == null) {
            addNewEnd = true;
            numAdds = nnumAdds;

            convArray.add(null);
            addEndNum = convArray.lenConvArray() - 1;

            GestureObject newGestOb = new GestureObject(newName,
                    addEndNum, 0);

            convArray.set(convArray.lenConvArray() - 1, newGestOb);

            return convArray.lenConvArray() - 1;
        } else {
            addNewEnd = true;
            numAdds = nnumAdds;
            addEndNum = gest.getGestureID();
            return addEndNum;
        }
    }
}

D.13 wiinote.gesture.ListenerGestureCapture.java

/**
 * Handles GestureRecognisedEvents thrown by a GestureRecogniser.
 * Takes the gesture that has been found and decides what action is to be
 * performed now that the gesture has been recognised.
 * <p>
 * @see GestureRecogniser
 * <p>
 * 05-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class ListenerGestureCapture implements GestureListener {

    public static final int ACTION_EXIT = 7;
    public static final int ACTION_INSTRUMENT_FLUTE = 3;
    public static final int ACTION_INSTRUMENT_IR = 5;
    public static final int ACTION_INSTRUMENT_PITCH_ROLL = 4;
    public static final int ACTION_OCTAVE_DOWN = 2;
    public static final int ACTION_OCTAVE_UP = 1;
    public static final int ACTION_PRINT_GESTURE_FOUND = 0;
    public static final int ACTION_SET_MIDI = 6;

    public ConvArray actions = Wiinote.gestureRecog.convArray;

    public void actionDB(int gesture) {

        Wiimote wmote = Wiinote.gui.wmote;
        OrientationWiimoteEventPanel orPan = Wiinote.gui.orPan;
        AccelerationWiimoteEventPanel acPan = Wiinote.gui.acPan;
        MessagesWindow msgWindow = Wiinote.gui.msgWindow;

        int action = -1;

        GestureObject caughtGesture =
                actions.findGestureObject(gesture);
        if (caughtGesture == null) {
            msgWindow.newMessage("Gesture Not Recognised", 6);
            return;
        } else {
            action = caughtGesture.getGestureResult();
        }

        switch (action) {
        case (-1): // escape clause!!
            break;
        case (0):
            msgWindow.newMessage("Gesture "
                    + caughtGesture.getGestureName()
                    + " Recognised", 6);
            break;
        case (1):
            int newuOct = Wiinote.process.getOctave() + 1;
            if (newuOct > 7) {
                newuOct = 7;
            }
            Wiinote.process.setOctave(newuOct);
            msgWindow.newMessage("Gesture Recognised: Octave Up, Now "
                    + newuOct, 4);
            break;
        case (2):
            int newdOct = Wiinote.process.getOctave() - 1;
            if (newdOct < 0) {
                newdOct = 0;
            }
            Wiinote.process.setOctave(newdOct);
            msgWindow.newMessage("Gesture Recognised: Octave Down, Now "
                    + newdOct, 5);
            break;
        case (3):
            WiimoteListener[] wmL = wmote.getWiiMoteEventListeners();

            for (WiimoteListener l : wmL) {
                wmote.removeWiiMoteEventListeners(l);
            }

            wmote.addWiiMoteEventListeners(acPan);
            wmote.addWiiMoteEventListeners(orPan);
            wmote.addWiiMoteEventListeners(new ProcessWMListen());

            wmote.addWiiMoteEventListeners(new ListenerFlute());

            msgWindow.newMessage("Flute Instrument Selected", 1);
            break;
        case (4):
            WiimoteListener[] wmLi =
                    wmote.getWiiMoteEventListeners();

            for (WiimoteListener l : wmLi) {
                wmote.removeWiiMoteEventListeners(l);
            }

            wmote.addWiiMoteEventListeners(acPan);
            wmote.addWiiMoteEventListeners(orPan);
            wmote.addWiiMoteEventListeners(new ProcessWMListen());

            wmote.addWiiMoteEventListeners(new ListenerPitchRoll());

            msgWindow.newMessage("Pitch/Roll Instrument Selected", 1);
            break;
        case (5):
            WiimoteListener[] wmLi2 =
                    wmote.getWiiMoteEventListeners();

            for (WiimoteListener l : wmLi2) {
                wmote.removeWiiMoteEventListeners(l);
            }

            wmote.addWiiMoteEventListeners(acPan);
            wmote.addWiiMoteEventListeners(orPan);
            wmote.addWiiMoteEventListeners(new ProcessWMListen());

            wmote.addWiiMoteEventListeners(new ListenerLeds());

            msgWindow.newMessage("Hand Waving Instrument Selected", 1);
            break;
        case (6):
            Wiinote.gui.setMidiSettings();
            break;
        case (7):
            ImageIcon exitIcon = new ImageIcon(
                    "C:\\Documents and Settings\\Louis.METALMACHINE\\workspace\\MusicalWiimote\\src\\icons\\exit.png");

            int res = JOptionPane.showOptionDialog(Wiinote.gui.frame,
                    "Are You Sure You Want To Quit?", "Quit?",
                    JOptionPane.OK_CANCEL_OPTION,
                    JOptionPane.WARNING_MESSAGE,
                    exitIcon, null, null);
            if (res == JOptionPane.OK_OPTION) {
                System.exit(0);
            }
            break;
        }
    }

    @Override
    public void onGestureEvent(GestureRecognisedEvent evt) {
        actionDB(evt.getGestureFound());
    }
}
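The capture-to-action pipeline of the three gesture classes above can be summarised as follows. This is an illustrative sketch of the calling sequence using only methods from the listings, not code from the system itself:

GestureRecogniser recog = new GestureRecogniser();
recog.addEventListeners(new ListenerGestureCapture());

recog.cPressed();          // 'C' held: start collecting samples
// recog.newMotion(event); // one call per MotionSensingEvent from the Wiimote
recog.cReleased();         // 'C' released: condense the samples, walk the
                           // path tree, and fire a GestureRecognisedEvent
                           // at every registered listener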
D.14 wiinote.gesture.PathObject.java

/**
 * Acts as a node in an n-tree which represents the possible gestures in
 * the system. The endInt value is the ID of the gesture that has been
 * performed to reach this node. If it is null then the gesture has not
 * been recognised.
 * <p>
 * This is currently a 9-tree, with one branch for each direction defined
 * in AccDirectionObject (@see AccDirectionObject). If AccDirectionObject
 * was to be updated to have more possible directions (say, by adding the
 * Y direction) then this class would have to be updated to build an
 * n-tree of the number of directions defined in the AccDirectionObject.
 * <p>
 * 05-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class PathObject implements Serializable {

    private static final long
            serialVersionUID = -2458132543035895739L;
    private int endInt;
    private ArrayList<PathObject> path;

    public PathObject(int endInt) {
        this.endInt = endInt;
        path = new ArrayList<PathObject>();
        path.add(0, null);
        path.add(1, null);
        path.add(2, null);
        path.add(3, null);
        path.add(4, null);
        path.add(5, null);
        path.add(6, null);
        path.add(7, null);
        path.add(8, null);
    }

    public void addPath(PathObject x, int dir) {
        if (path.get(dir) == null) {
            path.set(dir, x); // replace the placeholder null in place
        }
    }

    public void createPath(int endID, int[] dirs) {
        if (dirs.length == 0) {
            if (endInt == -2) {
                endInt = endID;
            } else if (endInt != endID) {
                // throw path already exists exception!
            }
        } else {
            int hd = dirs[0];
            int[] tl = new int[dirs.length - 1];
            for (int i = 0; i < dirs.length - 1; i++) {
                tl[i] = dirs[i + 1];
            }

            if (path.get(hd) != null) {
                path.get(hd).createPath(endID, tl);
            } else {
                path.set(hd, new PathObject(-2));
                path.get(hd).createPath(endID, tl);
            }
        }
    }

    public int followPath(int[] dirs) {
        if (dirs.length == 0) {
            return endInt;
        } else {
            int hd = dirs[0];
            int[] tl = new int[dirs.length - 1];
            for (int i = 0; i < dirs.length - 1; i++) {
                tl[i] = dirs[i + 1];
            }

            if (path.get(hd) != null) {
                return path.get(hd).followPath(tl);
            } else {
                return -1;
            }
        }
    }

    public PathObject getPath(int dir) {
        if (path.get(dir) != null) {
            return path.get(dir);
        }
        return null;
    }

    public String printTable(int indent) {
        String dent = new String();
        String returnString = new String();
        for (int in = 0; in <= indent; in++) {
            dent = dent + "  ";
        }

        for (int i = 0; i < 9; i++) {
            if (path.get(i) == null) {
                // String.concat returns a new string, so reassign
                returnString = returnString.concat(dent + "[" + i
                        + "] = null\n");
            } else {
                returnString = returnString.concat(dent + "[" + i + "] = "
                        + path.get(i).endInt + "\n");
                returnString = returnString.concat(
                        path.get(i).printTable(indent + 1));
            }
        }
        return returnString;
    }
}
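A brief sketch of the path tree in use, not part of the original listing: a gesture is stored under its direction sequence, and lookup simply walks the same branches.

PathObject root = new PathObject(-2); // -2: no gesture ends at this node
int[] flick = { AccDirectionObject.DIR_N, AccDirectionObject.DIR_S };

root.createPath(4, flick);          // gesture ID 4 ends after N then S
int found = root.followPath(flick); // returns 4
int miss = root.followPath(
        new int[] { AccDirectionObject.DIR_E }); // returns -1: unknown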
D.15 wiinote.ui.GestureGui.java

/**
 * Creates a frame with all the controls to the Gesture Capture System
 * within it, along with the code to control these functions.
 * <p>
 * 06-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class GestureGui extends JFrame {

    private static final long
            serialVersionUID = -3304226460527517729L;
    private JComboBox capGests;
    private JComboBox gests;
    private Border lineBdr;
    private JComboBox reps;
    private JComboBox resp;
    private JLabel saveString;

    public GestureGui() {
        setSize(400, 600);
        setTitle("Wiinote Gesture Capture");
        setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        getRootPane().setWindowDecorationStyle(0);
        ImageIcon wiimoteIcon = new
                ImageIcon(".\\src\\icons\\wiimote.png");
        setIconImage(wiimoteIcon.getImage());
        lineBdr = BorderFactory.createLineBorder(new Color(160,
                160, 160));

        setResizable(false);
        setLocationRelativeTo(getParent());

        setLayout(new BoxLayout(this.getContentPane(),
                BoxLayout.Y_AXIS));

        add(gestureResponsePanel());
        add(Box.createRigidArea(new Dimension(0, 5)));
        add(captureGesturePanel());
        add(Box.createRigidArea(new Dimension(0, 5)));
        add(gestureResetPanel());
        add(Box.createRigidArea(new Dimension(0, 5)));
        add(Box.createGlue());
        add(returnPanel());

        setVisible(true);
    }

    public JPanel captureGesturePanel() {
        JPanel panel = new JPanel();
        JPanel topPanel = new JPanel();
        JPanel innerPanel = new JPanel();
        JPanel savePanel = new JPanel();

        innerPanel.setLayout(new BoxLayout(innerPanel,
                BoxLayout.X_AXIS));
        topPanel.setLayout(new BoxLayout(topPanel,
                BoxLayout.X_AXIS));
        savePanel.setLayout(new BoxLayout(savePanel,
                BoxLayout.X_AXIS));
        panel.setLayout(new BoxLayout(panel, BoxLayout.Y_AXIS));

        panel.setMaximumSize(new Dimension(400, 180));
        panel.setMinimumSize(new Dimension(400, 180));

        panel.setBorder(lineBdr);
        JLabel header = new JLabel("Capture Gesture");
        header.setFont(new Font("Arial", Font.BOLD, 15));
        topPanel.add(header);
        topPanel.add(Box.createGlue());

        header.setMinimumSize(new Dimension(400, 20));

        panel.add(topPanel);
        panel.add(Box.createRigidArea(new Dimension(0, 5)));

        JLabel title = new JLabel("Gesture: ");
        innerPanel.add(title);

        capGests = new
                JComboBox(Wiinote.gestureRecog.convArray.listNames());
        capGests.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent event) {
            }
        });
        innerPanel.add(capGests);

        String[] responses = { "10 Times", "20 Times", "30 Times",
                "40 Times", "50 Times", "60 Times", "70 Times",
                "80 Times", "90 Times", "100 Times" };
        reps = new JComboBox(responses);

        innerPanel.add(Box.createRigidArea(new Dimension(0, 20)));
        innerPanel.add(reps);
        resp.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent event) {
                saveString.setText("Unsaved");
            }
        });
        innerPanel.add(Box.createGlue());

        panel.add(Box.createRigidArea(new Dimension(0, 20)));
        panel.add(innerPanel);

        JButton save = new JButton("Capture");

        save.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent event) {
                int times = (reps.getSelectedIndex() + 1) * 10;
                Wiinote.gestureRecog.setToAdd(times, (String) capGests
                        .getSelectedItem());
                Wiinote.gui.msgWindow.newMessage("Adding New"
                        + " Gesture", 2);
                Wiinote.gui.msgWindow.newMessage("Please Perform the"
                        + " Gesture " + times + " Times!", 2);

                dispose();
            }
        });

        savePanel.add(Box.createGlue());
        savePanel.add(save);

        panel.add(Box.createRigidArea(new Dimension(0, 10)));
        panel.add(savePanel);
        return panel;
    }

    public JPanel gestureResetPanel() {
        JPanel panel = new JPanel();
        JPanel topPanel = new JPanel();
        JPanel innerPanel = new JPanel();
        JPanel savePanel = new JPanel();

        innerPanel.setLayout(new BoxLayout(innerPanel,
                BoxLayout.X_AXIS));
        topPanel.setLayout(new BoxLayout(topPanel,
                BoxLayout.X_AXIS));
        savePanel.setLayout(new BoxLayout(savePanel,
                BoxLayout.X_AXIS));
        panel.setLayout(new BoxLayout(panel, BoxLayout.Y_AXIS));

        panel.setMaximumSize(new Dimension(400, 180));
        panel.setMinimumSize(new Dimension(400, 180));

        panel.setBorder(lineBdr);
        JLabel header = new JLabel("System Resets");
        header.setFont(new Font("Arial", Font.BOLD, 15));
        topPanel.add(header);
        topPanel.add(Box.createGlue());

        header.setMinimumSize(new Dimension(400, 20));

        panel.add(topPanel);

        panel.add(Box.createRigidArea(new Dimension(5, 0)));

        JButton resetSystem = new JButton(
                "Reset Gesture System (Delete All Gestures)");
        resetSystem.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent event) {
                int n = JOptionPane.showConfirmDialog(GestureGui.this,
                        "Are You Sure?", "Really??",
                        JOptionPane.YES_NO_OPTION);
                if (n == JOptionPane.YES_OPTION) {
                    Wiinote.gestureRecog.resetGestures(2);
                }
            }
        });

        innerPanel.add(resetSystem);
        innerPanel.add(Box.createGlue());

        panel.add(innerPanel);

        JButton resetGestures = new JButton("Reset Gesture Tree");
        resetGestures.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent event) {
                int n = JOptionPane.showConfirmDialog(GestureGui.this,
                        "Are You Sure?", "Really??",
                        JOptionPane.YES_NO_OPTION);
                if (n == JOptionPane.YES_OPTION) {
                    Wiinote.gestureRecog.resetGestures(1);
                }
            }
        });

        savePanel.add(resetGestures);
        savePanel.add(Box.createGlue());

        panel.add(savePanel);
        return panel;
    }

    public JPanel gestureResponsePanel() {
        JPanel panel = new JPanel();
        JPanel topPanel = new JPanel();
        JPanel innerPanel = new JPanel();
        JPanel savePanel = new JPanel();

        innerPanel.setLayout(new BoxLayout(innerPanel,
                BoxLayout.X_AXIS));
        topPanel.setLayout(new BoxLayout(topPanel,
                BoxLayout.X_AXIS));
        savePanel.setLayout(new BoxLayout(savePanel,
                BoxLayout.X_AXIS));
        panel.setLayout(new BoxLayout(panel, BoxLayout.Y_AXIS));

        panel.setMaximumSize(new Dimension(400, 180));
        panel.setMinimumSize(new Dimension(400, 180));

        panel.setBorder(lineBdr);
        JLabel header = new JLabel("Gesture Responses");
        header.setFont(new Font("Arial", Font.BOLD, 15));
        topPanel.add(header);
        topPanel.add(Box.createGlue());

        header.setMinimumSize(new Dimension(400, 20));
        panel.add(topPanel);
        panel.add(Box.createRigidArea(new Dimension(0, 5)));

        JLabel title = new JLabel("Gesture: ");
        innerPanel.add(title);

        gests = new
                JComboBox(Wiinote.gestureRecog.convArray.listNames());
        gests.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent event) {
                GestureObject gOb = Wiinote.gestureRecog.convArray
                        .findGestureObject((String)
                                gests.getSelectedItem());
                resp.setSelectedIndex(gOb.getGestureResult());
                saveString.setText("");
            }
        });
        innerPanel.add(gests);

        innerPanel.add(new JLabel(" Responds With: "));
        String[] responses = { "Print 'Gesture Found'", "Octave Up",
                "Octave Down", "Flute Instrument", "Pitch/Roll Instrument",
                "IR Instrument", "Set MIDI", "Exit" };
        resp = new JComboBox(responses);

        innerPanel.add(resp);
        resp.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent event) {
                saveString.setText("Unsaved");
            }
        });

        panel.add(Box.createRigidArea(new Dimension(0, 20)));
        panel.add(innerPanel);

        JButton save = new JButton("Save");
        save.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent event) {
                Wiinote.gestureRecog.convArray.setGestureResponse(
                        (String) gests.getSelectedItem(), resp
                                .getSelectedIndex());
                saveString.setText("Saved");
                Wiinote.gestureRecog.savePaths();
            }
        });

        saveString = new JLabel("");

        savePanel.add(saveString);
        savePanel.add(Box.createGlue());
        savePanel.add(save);

        panel.add(Box.createRigidArea(new Dimension(0, 10)));
        panel.add(savePanel);
        return panel;
    }

    public JPanel gestureTitle() {
        JPanel panel = new JPanel();
        panel.setLayout(new BoxLayout(panel, BoxLayout.X_AXIS));
        panel.setMaximumSize(new Dimension(400, 20));
        panel.setAlignmentY(LEFT_ALIGNMENT);
        JLabel title = new JLabel("Gesture Capture Properties");
        panel.add(title);
        return panel;
    }

    public JPanel returnPanel() {
        JPanel panel = new JPanel();
        panel.setMaximumSize(new Dimension(400, 20));
        panel.setAlignmentY(RIGHT_ALIGNMENT);
        JButton returnButton = new JButton("Return");
        returnButton.addActionListener(new ActionListener() {
            public void actionPerformed(ActionEvent event) {
                dispose();
            }
        });
        panel.add(returnButton);
        return panel;
    }
}

D.16 wiinote.ui.Gui.java

/**
 * The main Gui object. Contains all objects that build up the main
 * system gui and much of the code that controls the system.
 * <p>
 * 06-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class Gui {

    public AccelerationWiimoteEventPanel acPan;
    public JPanel eWindow;
    public JFrame frame;
    public MessagesWindow msgWindow;
    public JButton newGesture;
    public NoteWindow noteWindow;
    public OrientationWiimoteEventPanel orPan;
    public JLabel statusbar;
    public Toolkit toolkit;
    public Wiimote wmote;

    public Gui() {
        frame = new JFrame();

        ImageIcon wiimoteIcon = new
                ImageIcon(".\\src\\icons\\wiimote.png");
        frame.setIconImage(wiimoteIcon.getImage());

        frame.setSize(1024, 768);
        frame.setTitle("Wiinote");
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);

        BorderLayout lo = new BorderLayout();
        frame.setLayout(lo);

        JPanel toolbars = new JPanel();
        toolbars.setLayout(new BoxLayout(toolbars,
                BoxLayout.X_AXIS));
        toolbars.add(createGeneralToolbar());
        toolbars.add(createWiimoteToolbar());
        frame.add(toolbars, BorderLayout.NORTH);

        JPanel toolbarsWest = new JPanel();
        toolbarsWest.setLayout(new BoxLayout(toolbarsWest,
                BoxLayout.Y_AXIS));
        toolbarsWest.add(createMusicalToolbar());
        frame.add(toolbarsWest, BorderLayout.WEST);
APPENDIX D. CODE
window . s e t B a c k g r o u n d (new C o l o r ( 0 , 0 , 0 ) ) ; return w i i m o t e ;
window . s e t L a y o u t (new BoxLayout ( window , BoxLayout . Y AXIS ) ) ; }
f r a m e . add ( window , B o r d e r L a y o u t .CENTER) ;
p r i v a t e JToolBar c r e a t e G e n e r a l T o o l b a r ( ) {
eWindow = new J P a n e l ( ) ; JToolBar t o o l b a r = new JToolBar ( ) ;
eWindow . s e t L a y o u t (new BoxLayout ( eWindow , BoxLayout . Y AXIS ) ) ; toolbar . setFloatable ( false ) ;

eWindow . add (new J L a b e l ( ” O r i e n t a t i o n ” ) ) ; f i n a l I m a g e I c o n e x i t I c o n = new


orPan = new O r i e n t a t i o n W i i m o t e E v e n t P a n e l ( ) ; I m a g e I c o n ( ” . \ \ s r c \\ i c o n s \\ e x i t . png ” ) ;
eWindow . add ( orPan , B o r d e r L a y o u t . EAST) ; f i n a l I m a g e I c o n m i d i I c o n = new
I m a g e I c o n ( ” . \ \ s r c \\ i c o n s \\ m i d i . png ” ) ;
eWindow . add (new J L a b e l ( ” A c c e l l e r a t i o n ” ) ) ;
acPan = new A c c e l e r a t i o n W i i m o t e E v e n t P a n e l ( ) ; f i n a l JButton e x i t = new JButton ( e x i t I c o n ) ;
eWindow . add ( acPan , B o r d e r L a y o u t . EAST) ; f i n a l JButton m i d i o p s = new JButton ( m i d i I c o n ) ;

eWindow . s e t V i s i b l e ( f a l s e ) ; e x i t . setToolTipText ( ” Exit ” ) ;


t o o l b a r . add ( e x i t ) ;
f r a m e . add ( eWindow , B o r d e r L a y o u t . EAST) ; e x i t . a d d A c t i o n L i s t e n e r (new A c t i o n L i s t e n e r ( ) {
public void a c t i o n P e r f o r m e d ( A c t i o n E v e n t e v e n t ) {
s t a t u s b a r = new J L a b e l ( ” Wiinote , By L o u i s F e l l o w s 2009 ” ) ; i n t r e s = JOptionPane . s h o w O p t i o n D i a l o g ( frame ,
s t a t u s b a r . setMinimumSize (new Dimension ( 8 0 0 , 3 0 ) ) ; ” Are You S u r e You Want To Quit ? ” , ” Quit ? ” ,
s t a t u s b a r . setBorder ( BorderFactory JOptionPane . OK CANCEL OPTION,
. c r e a t e E t c h e d B o r d e r ( E t c h e d B o r d e r . RAISED) ) ; JOptionPane .WARNING MESSAGE, e x i t I c o n , n u l l ,
null ) ;
f r a m e . add ( s t a t u s b a r , B o r d e r L a y o u t .SOUTH) ; i f ( r e s == JOptionPane . OK OPTION) {
System . e x i t ( 0 ) ;
t o o l k i t = frame . g e t T o o l k i t ( ) ; }
Dimension s i z e = t o o l k i t . g e t S c r e e n S i z e ( ) ; }
f r a m e . s e t L o c a t i o n ( s i z e . w i d t h / 2 − f r a m e . getWidth ( ) / 2 , }) ;
s i z e . height
/ 2 − frame . ge t H e ig h t ( ) / 2) ; m i d i o p s . s e t T o o l T i p T e x t ( ” S e t Midi O p t i o n s ” ) ;
t o o l b a r . add ( m i d i o p s ) ;
msgWindow = new MessagesWindow ( 4 0 0 , 4 0 0 , new C o l o r ( 0 , 0, m i d i o p s . a d d A c t i o n L i s t e n e r (new A c t i o n L i s t e n e r ( ) {
0 ) , new C o l o r ( public void a c t i o n P e r f o r m e d ( A c t i o n E v e n t e v e n t ) {
225 , 225 , 225) ) ; setMidiSettings () ;
window . add ( msgWindow ) ; }
msgWindow . s e t A l i g n m e n t X ( Component . LEFT ALIGNMENT) ; }) ;

window . add ( Box . c r e a t e V e r t i c a l G l u e ( ) ) ; return t o o l b a r ;


}
noteWindow = new NoteWindow (new C o l o r ( 2 2 5 , 2 2 5 , 2 2 5 ) ,
new C o l o r ( 0 , 0 , 0 ) ) ; p r i v a t e JToolBar c r e a t e M u s i c a l T o o l b a r ( ) {
window . add ( noteWindow ) ; JToolBar t o o l b a r = new JToolBar ( S w i n g C o n s t a n t s . VERTICAL) ;
noteWindow . s e t A l i g n m e n t X ( Component . LEFT ALIGNMENT) ; toolbar . setFloatable ( false ) ;

f r a m e . s e t V i s i b l e ( true ) ; I m a g e I c o n o c t U p I c o n = new
I m a g e I c o n ( ” . \ \ s r c \\ i c o n s \\ octaveUp . png ” ) ;
} I m a g e I c o n o c t D n I c o n = new
I m a g e I c o n ( ” . \ \ s r c \\ i c o n s \\ octaveDown . png ” ) ;
public Wiimote c o n n e c t W i i m o t e ( ) { I m a g e I c o n s t a v e 1 I c o n = new
Wiimote [ ] w i i m o t e s = WiiUseApiManager . g e t W i i m o t e s ( 1 , true ) ; I m a g e I c o n ( ” . \ \ s r c \\ i c o n s \\ s t a v e 1 . png ” ) ;
i f ( w i i m o t e s . l e n g t h == 0 ) { I m a g e I c o n s t a v e C r d I c o n = new
s e t S t a t u s b a r ( ”No Connected Wiimotes Found ” , 2 ) ; I m a g e I c o n ( ” . \ \ s r c \\ i c o n s \\ s t a v e c h o r d . png ” ) ;
return n u l l ; I m a g e I c o n PRIcon = new
} I m a g e I c o n ( ” . \ \ s r c \\ i c o n s \\ p i t c h R o l l i n s . png ” ) ;
Wiimote w i i m o t e = w i i m o t e s [ 0 ] ; I m a g e I c o n FLIcon = new
wiimote . activat eMotionSensi ng ( ) ; I m a g e I c o n ( ” . \ \ s r c \\ i c o n s \\ f l u t e i n s . png ” ) ;
w i i m o t e . a d d W i i M o t e E v e n t L i s t e n e r s ( orPan ) ; I m a g e I c o n IRHIcon = new
w i i m o t e . a d d W i i M o t e E v e n t L i s t e n e r s ( acPan ) ; I m a g e I c o n ( ” . \ \ s r c \\ i c o n s \\ IRHandsIns . png ” ) ;
w i i m o t e . a d d W i i M o t e E v e n t L i s t e n e r s (new ProcessWMListen ( ) ) ;

77
s e t S t a t u s b a r ( ” Wiimote Connected ” , 1 ) ; f i n a l JButton octUp = new JButton ( o c t U p I c o n ) ;
APPENDIX D. CODE
final JButton octDn = new JButton ( o c t D n I c o n ) ; wmote . a d d W i i M o t e E v e n t L i s t e n e r s ( orPan ) ;
final JButton p i t c h R o l l = new JButton ( PRIcon ) ; wmote . a d d W i i M o t e E v e n t L i s t e n e r s (new ProcessWMListen ( ) ) ;
final JButton f l u t e = new JButton ( FLIcon ) ;
final JButton IRHands = new JButton ( IRHIcon ) ; wmote . a d d W i i M o t e E v e n t L i s t e n e r s (new L i s t e n e r F l u t e ( ) ) ;
final JButton s t a v e 1 = new JButton ( s t a v e 1 I c o n ) ; wmote . d e a c t i v a t e I R T R a c k i n g ( ) ;
final JButton s t a v e C r d = new JButton ( s t a v e C r d I c o n ) ; wmote . a c t i v a t e M o t i o n S e n s i n g ( ) ;

octUp . a d d A c t i o n L i s t e n e r (new A c t i o n L i s t e n e r ( ) { wmote . s e t L e d s ( true , false , false , false ) ;


public void a c t i o n P e r f o r m e d ( A c t i o n E v e n t e v e n t ) {
int i = Wiinote . p r o c e s s . getOctave ( ) ; msgWindow . newMessage ( ” F l u t e I n s t r u m e n t S e l e c t e d ” , 1) ;
i ++; }
Wiinote . p r o c e s s . setOctave ( i ) ; }) ;
i f ( i == 7 ) {
octUp . s e t E n a b l e d ( f a l s e ) ; p i t c h R o l l . a d d A c t i o n L i s t e n e r (new A c t i o n L i s t e n e r ( ) {
} public void a c t i o n P e r f o r m e d ( A c t i o n E v e n t e v e n t ) {
msgWindow . newMessage ( ” Octave i s now s e t t o ” i f ( wmote == n u l l ) {
+ Wiinote . p r o c e s s . getOctave ( ) , 4) ; return ;
octDn . s e t E n a b l e d ( true ) ; }
} W i i m o t e L i s t e n e r [ ] wmL =
}) ; wmote . g e t W i i M o t e E v e n t L i s t e n e r s ( ) ;
octDn . a d d A c t i o n L i s t e n e r (new A c t i o n L i s t e n e r ( ) {
public void a c t i o n P e r f o r m e d ( A c t i o n E v e n t e v e n t ) { f o r ( W i i m o t e L i s t e n e r l : wmL) {
int i = Wiinote . p r o c e s s . getOctave ( ) ; wmote . r e m o v e W i i M o t e E v e n t L i s t e n e r s ( l ) ;
i −−; }
Wiinote . p r o c e s s . setOctave ( i ) ;
i f ( i == −1) { wmote . a d d W i i M o t e E v e n t L i s t e n e r s ( acPan ) ;
octDn . s e t E n a b l e d ( f a l s e ) ; wmote . a d d W i i M o t e E v e n t L i s t e n e r s ( orPan ) ;
} wmote . a d d W i i M o t e E v e n t L i s t e n e r s (new ProcessWMListen ( ) ) ;
msgWindow . newMessage ( ” Octave i s now s e t t o ”
+ Wiinote . p r o c e s s . getOctave ( ) , 5) ; wmote . a d d W i i M o t e E v e n t L i s t e n e r s (new
octUp . s e t E n a b l e d ( true ) ; ListenerPitchRoll () ) ;
} wmote . d e a c t i v a t e I R T R a c k i n g ( ) ;
}) ; wmote . a c t i v a t e M o t i o n S e n s i n g ( ) ;

s t a v e 1 . a d d A c t i o n L i s t e n e r (new A c t i o n L i s t e n e r ( ) { wmote . s e t L e d s ( f a l s e , true , false , false ) ;


public void a c t i o n P e r f o r m e d ( A c t i o n E v e n t e v e n t ) {
msgWindow . newMessage ( ” P l a y i n g S i n g l e N o t e s ” , 1 ) ; msgWindow . newMessage ( ” P i t c h / R o l l Instrument
W i i n o t e . p r o c e s s . setOutType ( 1 ) ; S e l e c t e d ” , 1) ;
} }
}) ; }) ;

s t a v e C r d . a d d A c t i o n L i s t e n e r (new A c t i o n L i s t e n e r ( ) { IRHands . a d d A c t i o n L i s t e n e r (new A c t i o n L i s t e n e r ( ) {


public void a c t i o n P e r f o r m e d ( A c t i o n E v e n t e v e n t ) { public void a c t i o n P e r f o r m e d ( A c t i o n E v e n t e v e n t ) {
W i i n o t e . p r o c e s s . setOutType ( 2 ) ; i f ( wmote == n u l l ) {
msgWindow . newMessage ( ” P l a y i n g Chords ” , 1 ) ; return ;
} }
}) ; W i i m o t e L i s t e n e r [ ] wmL =
wmote . g e t W i i M o t e E v e n t L i s t e n e r s ( ) ;
f l u t e . a d d A c t i o n L i s t e n e r (new A c t i o n L i s t e n e r ( ) {
public void a c t i o n P e r f o r m e d ( A c t i o n E v e n t e v e n t ) { f o r ( W i i m o t e L i s t e n e r l : wmL) {
i f ( wmote == n u l l ) { wmote . r e m o v e W i i M o t e E v e n t L i s t e n e r s ( l ) ;
return ; }
}
W i i m o t e L i s t e n e r [ ] wmL = wmote . a d d W i i M o t e E v e n t L i s t e n e r s ( acPan ) ;
wmote . g e t W i i M o t e E v e n t L i s t e n e r s ( ) ; wmote . a d d W i i M o t e E v e n t L i s t e n e r s ( orPan ) ;
wmote . a d d W i i M o t e E v e n t L i s t e n e r s (new ProcessWMListen ( ) ) ;
f o r ( W i i m o t e L i s t e n e r l : wmL) {
wmote . r e m o v e W i i M o t e E v e n t L i s t e n e r s ( l ) ; wmote . a d d W i i M o t e E v e n t L i s t e n e r s (new L i s t e n e r L e d s ( ) ) ;
} wmote . a c t i v a t e I R T R a c k i n g ( ) ;
wmote . d e a c t i v a t e M o t i o n S e n s i n g ( ) ;

78
wmote . a d d W i i M o t e E v e n t L i s t e n e r s ( acPan ) ;
APPENDIX D. CODE
            wmote.setLeds(false, false, true, false);
            msgWindow.newMessage("IR Hand Waving Instrument Selected!!", 1);
        }
    });

    toolbar.add(octUp);
    toolbar.add(octDn);
    toolbar.add(new JSeparator());
    toolbar.add(flute);
    toolbar.add(pitchRoll);
    toolbar.add(IRHands);
    toolbar.add(new JSeparator());
    toolbar.add(stave1);
    toolbar.add(staveCrd);

    return toolbar;
}

private JToolBar createWiimoteToolbar() {
    JToolBar toolbar = new JToolBar();
    toolbar.setFloatable(false);

    ImageIcon disconnectIcon = new ImageIcon(".\\src\\icons\\disconnect.png");
    ImageIcon connectIcon = new ImageIcon(".\\src\\icons\\connect.png");
    ImageIcon rumble1Icon = new ImageIcon(".\\src\\icons\\rumble1.png");
    ImageIcon graphsIcon = new ImageIcon(".\\src\\icons\\graphs.png");
    ImageIcon newGestureIcon = new ImageIcon(".\\src\\icons\\newnunchuk.png");
    ImageIcon GesturePropIcon = new ImageIcon(".\\src\\icons\\GestureProperties.png");

    final JButton connect = new JButton(connectIcon);
    final JButton disconnect = new JButton(disconnectIcon);
    final JButton rumble1 = new JButton(rumble1Icon);
    final JButton showGraphs = new JButton(graphsIcon);
    final JButton GestureProp = new JButton(GesturePropIcon);
    newGesture = new JButton(newGestureIcon);

    connect.setToolTipText("Connect Wiimote");
    toolbar.add(connect);
    connect.addActionListener(new ActionListener() {
        public void actionPerformed(ActionEvent event) {
            wmote = connectWiimote();
            if (wmote != null) {
                connect.setEnabled(false);
                disconnect.setEnabled(true);
                rumble1.setEnabled(true);
            }
        }
    });

    disconnect.setToolTipText("disconnect Wiimote");
    disconnect.setEnabled(false);
    toolbar.add(disconnect);
    disconnect.addActionListener(new ActionListener() {
        public void actionPerformed(ActionEvent event) {
            wmote.disconnect();
            connect.setEnabled(true);
            disconnect.setEnabled(false);
            rumble1.setEnabled(false);
        }
    });

    rumble1.setEnabled(false);
    rumble1.setToolTipText("Rumble Wiimote 1");
    toolbar.add(rumble1);
    rumble1.addActionListener(new ActionListener() {
        public void actionPerformed(ActionEvent event) {
            msgWindow.newMessage("Rumble Wiimote 1", 6);
            wmote.activateRumble();
            try {
                Thread.sleep(500);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
            wmote.deactivateRumble();
        }
    });

    showGraphs.setToolTipText("Show/Hide Graphs");
    toolbar.add(showGraphs);
    showGraphs.addActionListener(new ActionListener() {
        public void actionPerformed(ActionEvent event) {
            eWindow.setVisible(!eWindow.isVisible());
        }
    });

    newGesture.setToolTipText("New Gesture");
    toolbar.add(newGesture);
    newGesture.setEnabled(false);
    newGesture.addActionListener(new ActionListener() {
        public void actionPerformed(ActionEvent event) {
            String s = (String) JOptionPane.showInputDialog(frame,
                    "Give the new gesture a name!", "Name New Gesture",
                    JOptionPane.PLAIN_MESSAGE, null, null, "New Gesture");

            String[] current = Wiinote.gestureRecog.convArray.listNames();
            Boolean pass = false;

            do {
                pass = true;
                for (String name : current) {
                    if (name.equals(s)) {
                        pass = false;
                    }
                }
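                // the chosen name clashed with an existing gesture; prompt
                // the user again until a unique name is given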
                if (pass == false) {
                    s = (String) JOptionPane.showInputDialog(frame,
                            "That name already exists, please give this"
                                    + " gesture a UNIQUE name!",
                            "Name New Gesture",
                            JOptionPane.PLAIN_MESSAGE,
                            null, null, "New Gesture");
                }
            } while (pass == false);

            Wiinote.gestureRecog.setToAdd(10, s);
            msgWindow.newMessage("Adding New Gesture", 2);
            msgWindow.newMessage("Please Perform the Gesture 10 Times!", 2);
        }
    });

    GestureProp.setToolTipText("Gesture Properties");
    toolbar.add(GestureProp);
    GestureProp.addActionListener(new ActionListener() {
        public void actionPerformed(ActionEvent event) {
            GestureGui gGui = new GestureGui();
        }
    });

    return toolbar;
}

public void nunchukActive(boolean active) {
    newGesture.setEnabled(active);
}

public void setMidiSettings() {
    String[] ports = Wiinote.midiout.outputPorts();
    ImageIcon midiIcon = new ImageIcon(".\\src\\icons\\midi.png");

    String s = (String) JOptionPane.showInputDialog(frame,
            "Please choose the name of the Midi port to use:",
            "Choose Midi Port", JOptionPane.PLAIN_MESSAGE,
            midiIcon, ports, null);

    if ((s != null) && (s.length() > 0)) {
        try {
            Wiinote.midiout.connectToPort(s);
        } catch (MidiUnavailableException e) {
            msgWindow.newMessage("Midi Unavailable Exception", 3);
            return;
        }

        Wiinote.midiout.testSignal();
        setStatusbar("Midi Output Set", 1);

        return;
    }
}

public void setStatusbar(String status, int icon) {
    ImageIcon msgIcon = null;
    if (icon == 1) {
        msgIcon = new ImageIcon(".\\src\\icons\\message.png");
    } else if (icon == 2) {
        msgIcon = new ImageIcon(".\\src\\icons\\warning.png");
    } else if (icon == 3) {
        msgIcon = new ImageIcon(".\\src\\icons\\error.png");
    }
    statusbar.setIcon(msgIcon);
    statusbar.setText(status);
    msgWindow.newMessage(status, icon);
}
}

D.17 wiinote.ui.MessagesWindow.java

/**
 * A window which displays a list of messages from the system, along with an
 * icon describing each message. Once the maximum number of messages is
 * reached, the oldest message is removed and all messages move a space back
 * in the array.
 * <p>
 * 06-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class MessagesWindow extends JPanel {

    private static final long serialVersionUID = 3677227138328480862L;

    public ImageIcon mesgIcon, warnIcon, errIcon, upIcon, downIcon, wmIcon;
    public ArrayList<JLabel> messages;

    public MessagesWindow(int numberMessages) {
        init();

        for (int i = 0; i < numberMessages; i++) {
            JLabel message = new JLabel();
            message.setMaximumSize(new Dimension(400, 20));
            messages.add(message);
        }

        for (JLabel label : messages) {
            add(label);
        }

        setLayout(new GridLayout(numberMessages, 1));
        setMaximumSize(new Dimension(400, 20 * numberMessages));
    }
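    // The remaining constructors additionally accept foreground/background
    // colours and/or an overall size in pixels; when a size is given, the
    // number of message rows is derived as height / 20.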
    public MessagesWindow(int numberMessages, Color background,
            Color foreground) {
        init();

        for (int i = 0; i < numberMessages; i++) {
            JLabel message = new JLabel();
            message.setMaximumSize(new Dimension(400, 20));
            message.setForeground(foreground);
            messages.add(message);
        }

        for (JLabel label : messages) {
            add(label);
        }

        setBackground(background);
        setLayout(new GridLayout(numberMessages, 1));
        setMaximumSize(new Dimension(400, 20 * numberMessages));
    }

    public MessagesWindow(int height, int width) {
        init();

        int numberMessages = height / 20;

        for (int i = 0; i < numberMessages; i++) {
            JLabel message = new JLabel();
            message.setMaximumSize(new Dimension(400, 20));
            messages.add(message);
        }

        for (JLabel label : messages) {
            add(label);
        }

        setLayout(new GridLayout(numberMessages, 1));
        setMaximumSize(new Dimension(width, height));
    }

    public MessagesWindow(int height, int width, Color background,
            Color foreground) {
        init();

        int numberMessages = height / 20;

        for (int i = 0; i < numberMessages; i++) {
            JLabel message = new JLabel();
            message.setMaximumSize(new Dimension(400, 20));
            message.setForeground(foreground);
            messages.add(message);
        }

        for (JLabel label : messages) {
            add(label);
        }

        setBackground(background);
        setLayout(new GridLayout(numberMessages, 1));
        setMaximumSize(new Dimension(width, height));
    }

    public ImageIcon getIconByNumber(int number) {
        ImageIcon returnIcon = null;
        if (number == 1) {
            returnIcon = mesgIcon;
        } else if (number == 2) {
            returnIcon = warnIcon;
        } else if (number == 3) {
            returnIcon = errIcon;
        } else if (number == 4) {
            returnIcon = upIcon;
        } else if (number == 5) {
            returnIcon = downIcon;
        } else if (number == 6) {
            returnIcon = wmIcon;
        }
        return returnIcon;
    }

    private void init() {
        messages = new ArrayList<JLabel>();
        mesgIcon = new ImageIcon(".\\src\\icons\\message.png");
        warnIcon = new ImageIcon(".\\src\\icons\\warning.png");
        errIcon = new ImageIcon(".\\src\\icons\\error.png");
        upIcon = new ImageIcon(".\\src\\icons\\upsm.png");
        downIcon = new ImageIcon(".\\src\\icons\\downsm.png");
        wmIcon = new ImageIcon(".\\src\\icons\\wiimotesm.png");
    }

    public void newMessage(String msg, int icon) {
        ImageIcon msgIcon = getIconByNumber(icon);

        for (int i = messages.size() - 1; i > 0; i--) {
            messages.get(i).setIcon(messages.get(i - 1).getIcon());
            messages.get(i).setText(messages.get(i - 1).getText());
            messages.get(i).setToolTipText(messages.get(i - 1).getToolTipText());
        }

        messages.get(0).setText(msg);
        messages.get(0).setIcon(msgIcon);
        messages.get(0).setToolTipText(msg);
    }
}
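As an illustration of how this window is driven (the row count and message
texts here are invented examples, not taken from the system's source):

    // A five-row message window; each new message pushes older ones down.
    MessagesWindow msgWindow = new MessagesWindow(5);
    msgWindow.newMessage("Wiimote Connected", 1);   // icon 1: information
    msgWindow.newMessage("Battery Low", 2);         // icon 2: warning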
D.18 wiinote.ui.NoteWindow.java

/**
 * A UI object which displays information on the note currently being played.
 * <p>
 * 06-Apr-2009
 *
 * @author Louis Fellows
 * @version 1.0.0.0
 */
public class NoteWindow extends JPanel {

    private static final long serialVersionUID = 4065069675658932780L;

    public JLabel noteLetter, noteOctave, staveimg;
    public ImageIcon stv, stvA, stvAs, stvB, stvC, stvCs, stvD, stvDs, stvE,
            stvF, stvFs, stvG, stvGs;

    public NoteWindow(Color forecol, Color backcol) {
        setBackground(backcol);
        setForeground(forecol);

        JPanel noteFrame = new JPanel();
        noteFrame.setBackground(backcol);
        noteFrame.setForeground(forecol);
        noteFrame.setLayout(new BoxLayout(noteFrame, BoxLayout.Y_AXIS));

        JPanel contentFrame = new JPanel();
        contentFrame.setBackground(backcol);
        contentFrame.setForeground(forecol);
        contentFrame.setLayout(new BoxLayout(contentFrame, BoxLayout.X_AXIS));

        JPanel titleFrame = new JPanel();
        titleFrame.setBackground(backcol);
        titleFrame.setForeground(forecol);
        titleFrame.setLayout(new BoxLayout(titleFrame, BoxLayout.X_AXIS));

        noteLetter = new JLabel();
        noteLetter.setFont(new Font("Arial", Font.PLAIN, 20));
        noteLetter.setBackground(backcol);
        noteLetter.setForeground(forecol);

        noteOctave = new JLabel();
        noteOctave.setFont(new Font("Arial", Font.PLAIN, 20));
        noteOctave.setBackground(backcol);
        noteOctave.setForeground(forecol);

        contentFrame.add(Box.createHorizontalGlue());
        contentFrame.add(noteLetter);
        contentFrame.add(Box.createRigidArea(new Dimension(20, 30)));
        contentFrame.add(noteOctave);
        contentFrame.add(Box.createHorizontalGlue());

        JLabel name = new JLabel("Note");
        name.setMinimumSize(new Dimension(20, 5));

        JLabel octv = new JLabel("Octave");
        octv.setMinimumSize(new Dimension(20, 5));

        titleFrame.add(name);
        titleFrame.add(Box.createHorizontalGlue());
        titleFrame.add(octv);

        noteFrame.add(contentFrame);
        noteFrame.add(titleFrame);

        setLayout(new BorderLayout());
        add(noteFrame, BorderLayout.WEST);

        setMaximumSize(new Dimension(400, 200));
    }

    public void setNoteLetter(String newText) {
        noteLetter.setText(newText);
    }

    public void setNoteOctave(String newText) {
        noteOctave.setText(newText);
    }
}
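As a usage illustration (the colours and the note shown are arbitrary
examples, not taken from the system's source):

    // Display middle C (octave 4) as black text on a white panel.
    NoteWindow noteWindow = new NoteWindow(Color.BLACK, Color.WHITE);
    noteWindow.setNoteLetter("C");
    noteWindow.setNoteOctave("4");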
