

Multi-point vibrotactile feedback
for an expressive musical interface

Stefano Papetti, Sébastien Schiesser, Martin Fröhlich


Institute for Computer Music and Sound Technology
Zurich University of the Arts
Pfingstweidstrasse 96, 8031 Zurich
{stefano.papetti, sebastien.schiesser, martin.froehlich}@zhdk.ch

ABSTRACT
This paper describes the design of a hardware/software system for rendering multi-point, localized vibrotactile feedback in a multi-touch musical interface. A prototype was developed, based on the Madrona Labs Soundplane, which was chosen because it provides easy access to multi-touch data, including force, and for its easily expandable layered construction. The proposed solution makes use of several piezo actuator discs, densely arranged in a honeycomb pattern on a thin PCB layer. Based on off-the-shelf components, custom amplifying and routing electronics were designed to drive each piezo element with standard audio signals. Features, as well as electronic and mechanical issues of the current prototype, are discussed.

Author Keywords
Haptic musical interface, Vibrotactile feedback, Multi-point, Piezo actuators

ACM Classification
H.5.2 [Information Interfaces and Presentation] User Interfaces — Haptic I/O, H.5.5 [Information Interfaces and Presentation] Sound and Music Computing — Systems

1. INTRODUCTION
Looking at current musical interfaces, tactile feedback seems like a minor issue compared to ergonomics or gesture mapping. Nevertheless, several recent studies (e.g. [25, 24, 4, 21, 8]) suggest that the development of musical skills strongly relies on tactile and kinesthetic cues: these would inform sophisticated control strategies that allow experienced musicians to achieve top performance levels (for example in terms of precise timing and accurate intonation), and enable expressivity and self-monitoring.
Indeed, while performing on acoustic or electro-acoustic musical instruments, an intense haptic experience is unavoidable, as well as highly relevant to the musical performance itself. Interaction with digital musical interfaces is also generally mediated by touch; however, while such interfaces can track input gestures, they generally provide haptic feedback only as a by-product of their built-in mechanics, if any. This lack of physical experience at the performer's side alters the action-perception loop [19] that is normally established in tactual interactions with traditional instruments.
Touch is the most intimate of the senses, and indeed a more intimate connection with digital musical interfaces should strongly rely on the haptic modality. Qualitative aspects [11], as well as musical performance indicators [13], also appear to be affected by the haptic response of an instrument.
Following the increasing availability of low-cost sensors, actuators and computing systems, several prototype musical interfaces offering haptic feedback have been developed in the last few decades. Some of them simulate the haptic behavior of an acoustic or electro-acoustic counterpart (e.g. [7, 12, 22]), while others aim at implementing new paradigms (e.g. [3, 20, 26, 1]), inspired to different extents by traditional musical instruments.
In [14] a wearable vibrotactile system is described, meant to render compositions made expressly for the sense of touch. Research also exists on haptic interfaces where feedback is used to teach, facilitate or enhance playing techniques [26, 16], or to help follow a score [15].
As for commercial products, only a few examples of haptic musical interfaces or instruments are currently found. The Yamaha AvantGrand digital pianos (http://europe.yamaha.com/en/products/musical-instruments/keyboards/hybridpianos/avantgrand/) offer vibrotactile feedback through transducers embedded in the instrument body, simulating the effect of strings and soundboard vibrating, and of pedal depression. Syntact (still unreleased as of January 2015; http://www.ultrasonic-audio.com/products/syntact.html) provides contactless tactile feedback through an array of ultrasonic transducers.
The use of multi-touch surfaces in music started some years ago with the JazzMutant Lemur touchscreen controller and the reacTable, and the trend is now exploding with iPads and other tablets. While the possibility to design custom GUIs has opened up great flexibility in live electronics and interactive installations, such devices still cannot convey a rich haptic experience to the performer.
Aiming at investigating how performance in basic musical gestures may be affected by audio-tactile feedback, in a previous experiment some of the present authors considered a finger-pressing task [17]: somewhat similarly to what happens when learning a musical instrument, by relying on kinesthetic memory, subjects had to memorize and reproduce different pressing force targets with the best possible accuracy. Results show that audio-tactile augmentation allowed subjects to achieve the target forces with improved accuracy. The psychophysics of active touch for broadband vibrotactile stimuli was also investigated in [27, 30]. In another study [10] the perceived quality of a digital piano augmented with vibrotactile actuators was tested according to different auditory and tactile feedback conditions. A partial preference for the combination of audio and vibrotactile feedback was found.
In this perspective, we argue that the addition of rich haptic feedback to future musical interfaces would enhance several aspects of musical practice, such as improved user experience, control and expressivity. For this reason, we decided to design an advanced vibrotactile feedback system to be used in a musical interface. Such an augmented interface would then serve as an open and versatile framework, allowing experimentation with different audio-tactile mappings, and testing the effectiveness of vibrotactile feedback in musical practice.

2. DESIGN
Vibrotactile feedback as found in the current generation of touch-screen devices is affected by several limitations. Such devices usually make use of either an Eccentric Rotating Mass (ERM) motor or a Linear Resonant Actuator (LRA) coupled with the device body, which therefore vibrates as a whole. Also, these actuator technologies can only produce simple vibratory signals: ERM motors produce vibrations whose frequency and amplitude cannot be set independently, and show a considerable lag in response time; LRAs conversely have improved response times, however they only vibrate at a fixed resonant frequency, which is not affected by the amplitude. Furthermore, with regard to touch input, current touch-screen systems cannot detect finger pressure (with the exception of the just announced Force Touch technology by Apple), and often do not offer response times suitable for real-time musical applications.
As opposed to what is described above, our goal was to implement distributed and localized vibrotactile feedback, with as few limitations as possible on the vibration signal in terms of temporal dynamics and spectral envelope. This being a very demanding objective, we started by evaluating existing multi-touch interfaces that could be adapted to our purpose by adding a newly developed haptic layer. After doing some research, our choice fell on the Madrona Labs Soundplane.

2.1 The Madrona Labs Soundplane
The Soundplane, pictured in Figure 1, is an elegant-looking wooden musical interface that was first described in [18], and is now a commercially available product (http://madronalabs.com/). It provides a large multi-touch and pressure-sensitive surface based on capacitive sensing, and it offers high tracking speed.

Figure 1: The Madrona Labs Soundplane.

The interface allows easy disassembly and is potentially open to hacking, which was required for our purpose. Moreover, Do-It-Yourself instructions are provided for building the original prototype version, with details on the used materials and construction solutions, as well as source code and patches for Max (http://cycling74.com/). An online forum is also available to exchange advice on hacking and fine tuning. Furthermore, when we contacted the Soundplane inventor and mentioned our goal, he showed interest in the idea of implementing a haptic layer.
The interface's patented capacitive sensing technology makes use of several carrier antennas, each sending a signal at a different fixed frequency. Separated by a dielectric layer, transversal pickup antennas catch these signals, which are modulated by changes of thickness in the dielectric layer due to pressure on the Soundplane surface.
In the commercial version, the generation of carrier signals and the decoding of the modulated signals are done internally by a DSP chip, while a USB connection sends three-dimensional touch data (x, y: surface coordinates; z: pressure intensity) to a host computer. Conversely, in the "analog" 8×8 DIY version all the digital signal processing is done in software by a host computer: eight carrier signals are generated in Max and sent from the analog outputs of an audio interface, while the eight modulated signals are acquired by its inputs and decoded by a Max external to extract three-dimensional touch data.
It is worth noting that, despite its wooden finish, the Soundplane surface does not feel nor behave like a stiff wooden panel. Indeed its tiled touch pads are engraved and independent of each other, and they rest on a natural rubber layer enabling a certain extent of compression when pressed.

2.2 Construction
In what follows, we will refer to our haptic Soundplane prototype as the "HSoundplane".
The original Soundplane multi-layered design consists of a top tiled surface – a sandwich construction made of wood veneer, stuck to a thin Plexiglas plate and a natural rubber foil – resting on top of the capacitive sensing layer described above (made of carrier antennas, dielectric and pickup antennas). Since these components are simply laid upon each other and kept in place with little pegs built into the wooden casing, it is quite simple to disassemble the structure and replace some of the elements.
Ideally, vibrotactile actuators should be placed as close as possible to the touch location, in that way maximizing the vibration energy conveyed to the fingers. In our case that would actually mean that actuators should be embedded in the top surface, just below the wood veneer. As a compromise we chose to place our haptic layer between the top surface and the sensing components. However, such a solution poses some serious challenges: the original flexibility, flatness and thickness of the layers above the sensing components have to be maintained as much as possible, so as to preserve the sensitivity and calibration uniformity of the Soundplane.
A review of actuator technologies suitable for musical applications is found in [20]. To implement a haptic layer for the Soundplane we chose a solution based on low-cost piezoelectric transducer elements, which have the following cumulative advantages: below their resonant frequency, they have a response suitable to the vibrotactile range (5–1000 Hz [29]); they offer fast response times; they can be driven by audio signals; they are thin (down to a few tenths of a millimeter); and their size and low price allow scaling up.

Figure 2: Arrangement of piezo actuators in a honeycomb pattern, matching the Soundplane tiled surface.

In order to provide differentiated haptic information at multiple touch points, a large number of densely distributed, individually driven piezo elements is necessary. To maximize the density of actuators, they have been arranged in a honeycomb pattern, which required cutting their original round shape into hexagons. As shown in Figure 2, the actuators were arranged so as to match the tiled pads on the Soundplane surface (see Figure 1): interleaved columns made of 5 or 4 piezo elements correspond respectively to a column of pads, or to the intersections between them. It is worth noticing that this solution even exceeds the number of pads at the Soundplane surface, thus allowing for continuous feedback in sliding gestures even when crossing different pads.
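The pad/intersection arrangement just described fixes the actuator counts. The following minimal Python sketch (our own illustration, not code from the prototype) enumerates the interleaved columns and checks that they add up to the 284 elements mentioned in Section 2.3:

    # Illustrative enumeration of the actuator grid described above: 32 pad
    # columns of 5 piezos interleaved with 31 intersection columns of 4 piezos.
    # Index choices are our own, not taken from the prototype.
    def build_layout():
        layout = []  # (column_index, number_of_piezos)
        for col in range(63):                 # 63 interleaved columns in total
            layout.append((col, 5 if col % 2 == 0 else 4))
        return layout

    layout = build_layout()
    assert sum(n for _, n in layout) == 32 * 5 + 31 * 4 == 284  # total cited in Section 2.3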
To retain as much as possible the flexibility of the original surface – in this way allowing the sensors below to detect individual finger pressures – the piezo elements were wired via an ad-hoc designed flexible PCB, using SMD soldering techniques (see Section 2.3).
To let the actuators vibrate, despite being sandwiched between the top surface and the sensing layer, the PCB with piezo elements was laid on top of an additional thin rubber foil, designed ad hoc with holes corresponding to each piezo element. This solution also guaranteed that the overall flexibility remained unaltered.
However thin, the addition of the actuators layer alters the overall thickness of the hardware. For this reason we had to re-design the original top surface (a sandwich made of
wood veneer, Plexiglas and natural rubber), replacing it with a thinner version. As a result, the thickness of the new top surface plus the actuators layer matches that of the original surface. In this way we could fit the new instrumentation into the original Soundplane wooden case.
Figure 3 shows an exploded view of the HSoundplane construction, consisting of nine layers.

Figure 3: Multi-layered construction of the HSoundplane: a) wooden case; b) new touch surface (wood veneer, 0.5 mm); c) new Plexiglas plate (1 mm); d) new natural rubber foil (1.3 mm); e) flexible PCB (0.3 mm); f) piezo elements (0.2 mm); g) natural rubber holed foil (1.3 mm); h) carrier antennas (original); i) dielectric (original); j) pickup antennas (original).
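The thickness constraint above is essentially an arithmetic one: the layers replacing the original top surface, plus the new actuator layers, must add up to the original stack height. A minimal sketch of that bookkeeping, using the nominal values from the Figure 3 caption (the original top-surface thickness is an assumed parameter, since it is not stated in the text):

    # Layer thicknesses in mm, taken from the Figure 3 caption.
    new_top_surface = {"wood veneer": 0.5, "Plexiglas": 1.0, "natural rubber": 1.3}
    actuator_layers = {"flexible PCB": 0.3, "piezo elements": 0.2, "holed rubber foil": 1.3}

    def fits_original(original_top_mm: float, tol_mm: float = 0.1) -> bool:
        """Check the new stack against an assumed original top-surface thickness."""
        new_stack = sum(new_top_surface.values()) + sum(actuator_layers.values())
        return abs(new_stack - original_top_mm) <= tol_mm

    # Example with a hypothetical original thickness of 4.6 mm:
    print(sum(new_top_surface.values()) + sum(actuator_layers.values()), fits_original(4.6))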
2.3 Electronics
In order to provide effective vibrotactile feedback at the
HSoundplane surface, some key considerations have to be made. On the one hand, the use of input audio signals gives great advantages in terms of versatility and richness of feedback. On the other hand, the voltage needed to drive piezo actuators – in our case up to 200 V – is not compatible with standard audio equipment. Moreover, as mentioned in Section 2.2, the piezo elements have been placed under each pad (32 × 5) and at each intersection (31 × 4), which makes for a total of 284 elements, thus posing a non-trivial challenge from an electrical standpoint.
These observations lead to the following requirements for driving all the piezo actuators. Being in the analog domain, having one separate audio signal per actuator would clearly be overkill; therefore we considered using a maximum of one channel per column of pads, that is, 32 separate audio channels, which can easily be provided by e.g. a MADI interface. This already results in a limitation, for it implies that each group of 5 + 4 piezos – corresponding to a column of 5 pads and 4 intersections – would be fed by a single signal (see Section 2.3.2 for more details). To reach all of the actuators, each of the 32 channels has to be multiplexed with a 1:9 ratio (in this way slightly exceeding the overall number of actuators). Moreover, to comply with the electrical specifications of the piezo elements, each audio signal has to be amplified by about a factor of 100.
Multiplexing continuous analog signals can be a delicate issue, since the end user must not notice any disturbance or delay in the feedback. Moreover, amplifying the already multiplexed signals would imply a huge number of amplifiers, while amplifying the original signals would require high-voltage multiplexers, which are available in a limited choice, are expensive and are more difficult to drive.

2.3.1 Complete setup
To satisfy the requirements and solve the issues pointed out above, we came up with a solution based on three key components: 1) Texas Instruments DRV2667 piezo drivers, which can be directly fed with an audio input and drive a signal of up to 200 V; 2) serial-to-parallel shift registers with output latches of the 74HC595 family; 3) high-voltage MOSFET relays.
For the sake of simplicity, the whole output stage of the HSoundplane was divided into four identical parts, represented in Figure 4, each consisting of a flexible PCB with 72 piezo actuators (a), connected by a flat cable to a driver PCB with 8 audio-to-haptic amplifiers and routing electronics (b). In order to address the right actuators and synchronize their switching with the audio signals, a master controller (c) parses the control data generated at the host computer and routes them to the appropriate slave drivers.
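As an illustration of the routing arithmetic above, the following Python sketch maps a pad column and a position within its 5 + 4 group to a slave board, local driver channel and relay index. The numbering scheme is an assumption for clarity, not the prototype's actual firmware convention:

    PIEZOS_PER_CHANNEL = 9     # 5 pad piezos + 4 intersection piezos per audio channel
    CHANNELS_PER_BOARD = 8     # each slave driver board hosts 8 DRV2667 channels

    def route(pad_column: int, piezo_in_group: int):
        """Map a pad column (0-31) and a position in its 9-piezo group (0-8)
        to (slave board, local channel, relay index)."""
        channel = pad_column                      # one audio channel per pad column
        board = channel // CHANNELS_PER_BOARD     # four boards: 0..3
        local_channel = channel % CHANNELS_PER_BOARD
        relay = piezo_in_group                    # 0..8 along the channel's relay row
        return board, local_channel, relay

    # 32 channels x 9 relay points = 288 switchable slots for the 284 piezos.
    assert 32 * PIEZOS_PER_CHANNEL == 288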
Figure 4: Overview of the complete actuators control electronics: a) piezo actuators on flexible PCBs (simplified view); b) slave PCBs with audio-to-haptic drivers and routing electronics; c) master controller.

Figure 5: Overview of a slave driver board: a) 8-channel audio input; b) 8 piezo drivers; c) 72-point matrix of relays individually connected to each piezo actuator; d) relay control; e) microcontroller for initialization and synchronization.

2.3.2 Driver board
Figure 5 shows a slave driver board, which works as follows: eight audio signals (a) are routed to the piezo drivers (b), where they are amplified to high voltage and sent to an 8 × 9 relay matrix (c) which connects to each piezo actuator. This 72-point matrix is addressed by a chain of serial-to-parallel shift registers (d), commanded by a microcontroller (e).
At startup, the microcontroller initializes the piezo drivers, setting among other things their amplification level. When in running mode, the slave microcontroller receives routing information from the master, sets a corresponding 72-bit word and sends it to the shift registers, which individually open or close the relays of the matrix.
As shown in Figure 5, each amplified audio signal feeds a whole row of 9 points in the relay matrix. Therefore, each signal path is hard-coded to 9 addresses, which can more appropriately be described as switching rather than multiplexing. Such fixed addressing is the main limitation of the current HSoundplane prototype: each column of 9 actuators can only be fed with a single vibrotactile signal.
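To make the shift-register addressing concrete, here is a hedged Python sketch of how a slave board's 72-bit relay word could be packed into the 9 bytes shifted out to the 74HC595 chain. The row-major bit layout and the byte order are assumptions; the actual firmware may differ:

    def relay_word(active: set) -> bytes:
        """Pack the closed relays of one slave board into a 72-bit word.

        `active` holds (local_channel, relay) pairs, with channel 0-7 and
        relay 0-8; bit and byte ordering are assumed, not documented here.
        """
        word = 0
        for channel, relay in active:
            word |= 1 << (channel * 9 + relay)    # 8 rows of 9 relay points
        return word.to_bytes(9, byteorder="big")  # 9 bytes for the 74HC595 chain

    # Example: close the 2nd and 5th relay of driver channel 3.
    print(relay_word({(3, 1), (3, 4)}).hex())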
2.4 Software
The original Soundplane comes with a client application for Mac OS, which receives the multi-touch data sensed by the interface and transmits them as Open Sound Control (http://opensoundcontrol.org/) messages according to an original format named "t3d" (for touch-3d). The t3d data represent touch information for each contacting finger, reporting absolute x and y coordinates, and the force along the z axis.
In our prototype, these data are not only used to drive some audio engine, but also to activate the piezo actuators located at the corresponding x and y coordinates, and to drive them with vibrotactile signals.
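A minimal host-side sketch of receiving such t3d messages with the python-osc package is given below. The address pattern /t3d/tch*, the argument order (x, y, z) and the port number are assumptions based on the description above; the Soundplane client application defines the exact scheme:

    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    def on_touch(address, *args):
        x, y, z = args[0], args[1], args[2]      # absolute coordinates and force
        print(f"{address}: x={x:.3f} y={y:.3f} force={z:.3f}")
        # ...here the prototype would select the actuators near (x, y)
        # and update their vibrotactile signal according to z.

    dispatcher = Dispatcher()
    dispatcher.map("/t3d/tch*", on_touch)        # assumed t3d address pattern

    # Port chosen arbitrarily; configure it to match the Soundplane client.
    BlockingOSCUDPServer(("127.0.0.1", 3123), dispatcher).serve_forever()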
2.4.1 Control of signal switching
In the current prototype, the synchronization between vibrotactile signals and the relay matrix happens at the host-computer level. While the vibrotactile signals are output by the audio interface, control messages are sent to the master controller via USB. The master controller parses the received messages and addresses the slave driver boards on a serial bus to set the state of the relay matrices.
The choice of using a master controller, rather than addressing each driver board directly, was due to two reasons: first, properly interfacing several external controllers with a host computer can be complex; second, the perspective of developing a self-contained musical interface would require getting rid of the controlling computer and working in closed loop. The presence of a main processing unit that receives touch data, processes them and generates vibrotactile information is therefore a requirement in a mid-term perspective (see Section 4).
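Purely as an illustration of this control path, the sketch below sends a compact routing message from the host to the master controller over a USB serial port (using pyserial). The four-byte message format is hypothetical; the actual protocol of the prototype is not documented here:

    import serial  # pyserial

    master = serial.Serial("/dev/ttyUSB0", 115200, timeout=0.01)

    def set_relay(board: int, channel: int, relay: int, close: bool) -> None:
        # Hypothetical 4-byte message: [board, channel, relay, state].
        master.write(bytes([board & 0x03, channel & 0x07, relay & 0x0F, int(close)]))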
2.4.2 Rendering of vibrotactile feedback
The sense of touch is generally regarded as capable of perceiving vibrations in the 5–1000 Hz frequency range. Four mechanoreceptive channels have been identified in the skin, which mediate the mechanical aspects of touch [5]. The Pacinian channel is the most important for vibrotactile perception, as it determines sensitivity thresholds in the 40–800 Hz range (a U-shaped contour, with peak sensitivity between 200 and 300 Hz), and is sensitive to spatial and temporal summation.
In general, touch has been studied mostly as a receptive sense, by measuring the perception of vibrations in passive settings. However, the everyday experience of touch – including that arising from the performance on musical instruments – clearly shows that active touch (manipulation, exploration) is of primary importance. So far, only a few studies have investigated the perception of vibration in active touch and musical tasks (e.g. [6, 2, 9]). Aiming at overcoming the lack of knowledge in this field, some of the present authors recently reported novel results concerning the psychophysics of active touch under different pressing force conditions [27, 17].
Based on the current knowledge of vibration perception, and in order to optimize the output from the piezo actuators, vibration signals are first filtered with a bandpass filter (40–400 Hz). This also minimizes any sound spill, while maximizing vibrotactile perception. A low-frequency boost is also available to compensate for the frequency response of the piezo actuators (see Section 3).
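A minimal sketch of such conditioning, using SciPy and assuming the 44.1 kHz sampling rate reported in Section 3, is shown below. The shape of the low-frequency boost (here a parallel low-passed branch with a 100 Hz corner) is an assumption, since the paper does not specify the compensation curve:

    import numpy as np
    from scipy.signal import butter, sosfilt

    FS = 44100  # sampling rate; 44.1 kHz is the rate reported in Section 3

    # Butterworth bandpass covering the 40-400 Hz band mentioned above.
    BAND_SOS = butter(2, [40, 400], btype="bandpass", fs=FS, output="sos")
    LOW_SOS = butter(2, 100, btype="lowpass", fs=FS, output="sos")  # assumed 100 Hz corner

    def tactile_filter(block: np.ndarray, lf_boost_db: float = 0.0) -> np.ndarray:
        """Band-limit a vibrotactile signal and optionally emphasize its low end."""
        band = sosfilt(BAND_SOS, block)
        if lf_boost_db != 0.0:
            # Crude boost: add a gain-scaled low-passed copy of the band-limited signal.
            gain = 10.0 ** (lf_boost_db / 20.0) - 1.0
            band = band + gain * sosfilt(LOW_SOS, band)
        return band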
As often happens with digital musical interfaces, the mapping possibilities between the users' gestures and the audio output are manifold. Moreover, as opposed to common musical interfaces, the HSoundplane provides vibrotactile feedback to the user, and this requires defining an additional mapping strategy. Since the actuators layer is part of the interface itself, we decided to provide the users with a couple of predefined vibrotactile feedback mapping strategies, while the sound mapping is left freely definable as in the original Soundplane. Two alternative mapping strategies are offered in the current prototype:

• One makes use of the sound output controlled by the HSoundplane: audio signals are first filtered as mentioned above, and fed back to the actuators layer. This approach is straightforward and ensures coherence between the musical output and the tactile feedback, resembling what is found in traditional musical instruments, where the source of vibration is also the acoustic source.

• A simpler mapping strategy relies on a pseudo-white noise signal, filtered as above, whose amplitude is set according to the force data values (a minimal sketch of this mapping is given below). This approach has the advantage of maximizing the performance of the actuators and the perception of vibrations. Also, in a mid-term perspective, relying on the waveform memory provided by the piezo drivers, this mapping can be made totally self-contained.
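A sketch of the second strategy, assuming the normalized force value z from the t3d data and a hypothetical shaping exponent (the exact force-to-amplitude curve is not specified), could look as follows; in practice the generated noise would then pass through the 40–400 Hz bandpass described above before reaching the selected actuator column:

    import numpy as np

    rng = np.random.default_rng()

    def noise_block(force_z: float, n_samples: int = 256, gamma: float = 1.0) -> np.ndarray:
        """Pseudo-white-noise block whose amplitude follows the touch force.

        force_z is assumed to be the normalized z value (0..1) from the t3d data;
        gamma is a hypothetical shaping exponent, as the prototype's curve is not given.
        """
        amplitude = np.clip(force_z, 0.0, 1.0) ** gamma
        return amplitude * rng.uniform(-1.0, 1.0, n_samples)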
Currently we still have to evaluate the effectiveness of the implemented mappings. For instance, in the second one, since the produced vibrations are independent from the sound synthesis, occasional perceptual mismatches between touch and audition may result.

3. MEASUREMENTS
Before the final implementation, the characteristics of different types of piezo actuators were measured and their performance evaluated, in this way guiding the choice of the piezo elements to be used in the HSoundplane prototype.
Measurements were made with a Wilcoxon Research 736T piezoelectric accelerometer, connected to a Wilcoxon Research iT100M Intelligent Transmitter. The AC-coupled output of the transmitter was recorded as audio signals at 44.1 kHz with 24-bit resolution via an RME Fireface 800 interface.
The frequency responses of four different types of piezo elements were measured in the 10–2000 Hz range. For this purpose, we realized a scaled-down version of the HSoundplane layers, and placed the four actuators in correspondence of four pads of the top surface. The accelerometer was stuck to the surface with soft double-sided tape. Figure 6 shows a detail of the measured frequency responses in the 40–500 Hz range.

Figure 6: Vibration frequency response, measured at four different types of piezo actuators (resonances at 3.6, 4, 6.3 and 6.5 kHz; amplitude in dBFS vs. frequency in Hz).

More measurements are still needed to assess the effective dynamic range of vibrations provided by the different piezos, and their frequency response under finger-load conditions.
In addition to such measurements, we informally evaluated the behavior of the different actuators by touch. Indeed the piezo elements having their resonance at 4 kHz felt like the best of the group, both in terms of conveyed energy and low-frequency response, thus confirming the measurement results.
Nevertheless, since each piezo driver has to feed 9 actuators in parallel, particular attention has to be paid to issues such as current consumption and heat dissipation. Among the analyzed actuators, the one resonating at 6.3 kHz has the smallest capacitance value, and therefore the smallest current needs, while the 4 kHz one has the highest. For this reason, as a starting point we chose to go for the piezo components resulting in fewer electrical issues, even though they are the least performing. However, in case their actual performance were not satisfying and new calculations showed that the piezo drivers can support a higher current load, they could easily be replaced.
Vibrotactile cross-talk was informally evaluated by the authors during the development: indeed we found that the HSoundplane is able to render localized vibrotactile feedback with imperceptible vibration spill at other locations, even when touching right next to the target feedback point. Quantitative characterization will be performed by measuring the vibration amplitude at non-feedback points.
Currently, time-domain measurements of the synchronization between audio signals and relay control are being carried out. The closed-loop latency from touch to the onset of vibrotactile feedback will also be evaluated.

4. PERSPECTIVES AND ISSUES
In a mid-term perspective we plan to develop an autonomous interface, able to generate vibrotactile feedback on-board while relying on an external computer for auditory feedback only. For this purpose, the touch information processed by the original Soundplane DSP chip could be directly sent to the master controller in our design, and used to route vibrotactile signals synthesized on-board or taken from pre-computed waveforms stored on the piezo drivers.
Currently, the development of new musical interfaces is mostly grounded on practice and intuition only, often leading to the production of one-of-a-kind prototypes. Conversely, we aim at a scientifically grounded design of haptic musical interfaces. Parallel investigations on vibrotactile perception in musical tasks [27, 9, 17] are being carried out in our lab. Results from these ongoing studies will help optimize the rendering of vibration and the touch-to-vibration mappings in the HSoundplane and other haptic interfaces. The designed solutions are currently being tested to isolate possible electronic and mechanical issues.
In addition, inspired by user experience studies in HCI, the HSoundplane will be tested in controlled experiments in a musical practice context, aimed at evaluating the musician's playing experience. To this end, we will rely both on existing and new methodologies for the evaluation of musical interfaces (e.g. [23, 28]). We envision that rich vibrotactile feedback would make musical interfaces more accessible for hearing- or visually-impaired musicians, and therefore we will invite a pool of such subjects to the planned experiments.

5. CONCLUSIONS
We presented the HSoundplane, a prototype of a musical interface based on the Madrona Labs Soundplane, which was
augmented with multi-point vibrotactile feedback. Several constructive and electronic issues have been addressed in the current design to reach this goal. Nevertheless, further optimization and evaluation are still needed.
The rendering of localized vibrotactile feedback is an open issue in current multi-touch interfaces, and we imagine it will become more and more relevant in the search for richer, more engaging interaction. We believe that at least some of the technological solutions described here can be of inspiration for the implementation of multi-point haptic feedback in future interfaces.

6. ACKNOWLEDGMENTS
The authors wish to thank Randy Jones, the inventor of the original Soundplane, for providing technical support during the development of the HSoundplane prototype.
This research is pursued as part of the project AHMI (Audio-Haptic modalities in Musical Interfaces), funded by the Swiss National Science Foundation (SNSF).

7. REFERENCES
[1] T. Ahmaniemi. Gesture controlled virtual instrument with dynamic vibrotactile feedback. In Proc. Conf. on New Interfaces for Musical Expression (NIME), pages 485–488, 2010.
[2] A. Askenfelt and E. V. Jansson. On vibration sensation and finger touch in stringed instrument playing. Music Perception: An Interdisciplinary Journal, 9(3):311–349, 1992.
[3] D. Birnbaum. The Touch Flute: Exploring roles of vibrotactile feedback in music performance. Technical report, McGill University, 2003.
[4] D. M. Birnbaum and M. M. Wanderley. A systematic approach to musical vibrotactile feedback. In Proc. Int. Computer Music Conf. (ICMC), 2007.
[5] S. J. Bolanowski, G. A. Gescheider, R. T. Verrillo, and C. M. Checkosky. Four channels mediate the mechanical aspects of touch. J. Acoust. Soc. Am., 84(5):1680–1694, Nov. 1988.
[6] A. J. Brisben, S. S. Hsiao, and K. O. Johnson. Detection of vibration transmitted through an object grasped in the hand. J. Neurophysiol., 81(4):1548–1558, Apr. 1999.
[7] C. Cadoz, L. Lisowski, and J.-L. Florens. A modular feedback keyboard design. Computer Music J., 14(2):47–51, 1990.
[8] C. Chafe. Tactile audio feedback. In Proc. Int. Computer Music Conf. (ICMC), 1993.
[9] F. Fontana, F. Avanzini, H. Järveläinen, S. Papetti, F. Zanini, and V. Zanini. Perception of interactive vibrotactile cues on the acoustic grand and upright piano. In Proc. Int. Conf. on Sound and Music Computing (SMC), 2014.
[10] F. Fontana, S. Papetti, M. Civolani, V. del Bello, and B. Bank. An exploration on the influence of vibrotactile cues during digital piano playing. In Proc. Int. Conf. on Sound and Music Computing (SMC), Padua, Italy, 2011.
[11] A. Galembo and A. Askenfelt. Quality assessment of musical instruments: effects of multimodality. In ESCOM Conf., pages 441–444, 2003.
[12] B. Gillespie. The Touchback Keyboard. In Proc. Int. Computer Music Conf. (ICMC), 1992.
[13] W. Goebl and C. Palmer. Tactile feedback and timing accuracy in piano performance. Exp. Brain Res., 186(3):471–479, Apr. 2008.
[14] E. Gunther and S. O'Modhrain. Cutaneous Grooves: Composing for the sense of touch. J. New Music Res., 31(1):369–381, 2003.
[15] L. Hayes. Vibrotactile feedback-assisted performance. In Proc. Conf. on New Interfaces for Musical Expression (NIME), pages 72–75, June 2011.
[16] K. Huang, E. Y.-L. Do, and T. Starner. PianoTouch: A wearable haptic piano instruction system for passive learning of piano skills. In IEEE Int. Symposium on Wearable Computers, pages 41–44. IEEE, 2008.
[17] H. Järveläinen, S. Papetti, S. Schiesser, and T. Grosshauser. Audio-tactile feedback in musical gesture primitives: finger pressing. In Proc. Sound and Music Computing Conf. (SMC), pages 109–114, Stockholm, Sweden, 2013.
[18] R. Jones, P. Driessen, A. Schloss, and G. Tzanetakis. A force-sensitive surface for intimate control. In Proc. Conf. on New Interfaces for Musical Expression (NIME), 2009.
[19] J. Lagarde and J. A. S. Kelso. Binding of movement, sound and touch: multimodal coordination dynamics. Experimental Brain Research, 173:673–688, 2006.
[20] M. T. Marshall and M. M. Wanderley. Vibrotactile feedback in digital musical instruments. In Proc. Conf. on New Interfaces for Musical Expression (NIME), pages 226–229, 2006.
[21] M. T. Marshall and M. M. Wanderley. Examining the effects of embedded vibrotactile feedback on the feel of a digital musical instrument. In Proc. Conf. on New Interfaces for Musical Expression (NIME), June 2011.
[22] C. Nichols. The vBow: Development of a virtual violin bow haptic human-computer interface, pages 1–4, 2002.
[23] S. O'Modhrain. A framework for the evaluation of digital musical instruments. Computer Music J., 35(1):28–42, Mar. 2011.
[24] S. O'Modhrain and C. Chafe. Incorporating haptic feedback into interfaces for music applications. In Proc. of ISORA, World Automation Conf., 2000.
[25] S. O'Modhrain, S. Serafin, C. Chafe, and J. O. Smith. Influence of attack parameters on the playability of a virtual bowed string instrument: tuning the model. In Proc. Int. Computer Music Conf. (ICMC), 2000.
[26] D. Overholt, E. Berdahl, and R. Hamilton. Advancements in actuated musical instruments. Organised Sound, 16(2):154–165, June 2011.
[27] S. Papetti, H. Järveläinen, and G.-M. Schmid. Vibrotactile sensitivity in active finger pressing. In Proc. IEEE World Haptics Conf., 2015.
[28] G.-M. Schmid. Measuring musicians' playing experience: Development of a questionnaire for the evaluation of musical interaction. In Practice-Based Research Workshop at NIME, London, UK, 2014.
[29] R. T. Verrillo. Vibration sensation in humans. Music Perception, 9(3):281–302, 1992.
[30] L. Wyse, S. Nanayakkara, P. Seekings, S. H. Ong, and E. A. Taylor. Palm-area sensitivity to vibrotactile stimuli above 1 kHz. In Proc. Conf. on New Interfaces for Musical Expression (NIME), 2012.