The design of tactile musical devices for the deaf

by ROBERT JACK, ANDREW MCPHERSON, TONY STOCKMAN

Background

There is a growing body of research pointing towards fundamental correspondences between the sense of hearing and the sense of touch. Both rely on receptors that are sensitive to pressure stimuli; both can process vibrations, although with different accuracy and within different perceptual ranges. Amplitude, frequency and waveform all have their equivalents in touch, and as such the tactile perception of musical parameters such as pitch (Rovan & Hayward, 2000), rhythm (Kosonen & Raisamo, 2006), loudness (Morley & Rowe, 1990) and timbre (Russo, Ammirante, & Fels, 2012) has been the focus of recent psychophysical research. Schürmann, Caetano, Hlushchuk, Jousmäki and Hari (2006), amongst others, have also demonstrated that tactile vibrations activate the auditory cortex in the congenitally deaf in a similar manner to sound.

Related work

Hearing impairment means that music must be experienced through a different or altered set of sensory channels, with audio information perceived in a variety of ways: through visual feedback (gesture and movement), through residual hearing, and through tactile sensation of sound vibrations. Fulford, Ginsborg and Goldbart (2011) have highlighted the multiplicity of experience and opinion regarding music within this group: their extensive study of hearing-impaired musicians presents the musicians' opinions on performing music as well as the specific demands that hearing impairment places on their practice. In our project the focus was on the experiences of profoundly deaf individuals, those for whom sound at 95 dB SPL cannot be heard.

Merchel and Altinsoy (2014) have identified ‘whole-body vibrations’, i.e. vibrations perceived on the surface of the body, as playing a significant role in the perception of music, and as having a connection to the perceived quality of a concert experience. Gaining access to whole-body vibrations is a key aspect of deaf musical experience: ‘speaker-listening’, where bodily contact is made with a vibrating speaker cone, is a common practice for gaining access to reproduced sound. Further, ‘Deaf raves’ are events especially designed for this kind of bodily listening, at which high-volume, bass-heavy music is played. The main collaborator on this project, the deaf arts charity Incloodu (www.incloodu.co.uk), is the major organiser of Deaf raves in the UK.

A number of devices have been built with the specific aim of translating sound into felt vibration, building upon speaker-listening. For example, early experiments in sensory substitution produced the tactile vocoder (Brooks & Frost, 1983), a device developed to aid lip-reading by providing a tactile sensation of the frequency content of a sound through vibration motors mounted on the forearm. The principle of this device has influenced further research aiming to represent music, such as Skinscape (Gunther & O’Modhrain, 2003), the Model Human Cochlea (Karam, Russo, & Fels, 2009) and the Haptic Chair (Nanayakkara, Taylor, Wyse, & Ong, 2009).

Aims

Our aim in this project was to create a device that could heighten access to the ‘whole-body vibrations’ (Merchel & Altinsoy, 2014) present in a concert situation. The cross-modal devices that we developed aim to convey a sense of musical experience by representing some of the musically salient features of an input signal in real time through tactile vibration. The intention was not simply to amplify the bass content of the music to the point where it is perceived by the skin, but to provide the tactile senses with additional musical information that is not usually available to them. Working in collaboration with Incloodu provided first-hand accounts from profoundly deaf individuals regarding their enjoyment of music. Our aim was to design a device informed by these accounts: a music-driven tactile experience built on a psychophysically informed mapping strategy.

Main Contribution

Development

Some general themes that informed our design emerged from conversations with profoundly deaf individuals before the development process began. These themes included: the importance of being able to physically feel sound (through the legs, torso and fingertips); a lack of awareness of anything that happens above the tactile frequency range (0 Hz to 1000 Hz) (Rovan & Hayward, 2000), and hence a preference for bass-heavy music; and the importance given to seeing musical gestures, particularly those of a rhythmic nature.

The prototypes

The tactile devices that we developed took the form of a suite of furniture: two armchairs and a sofa. Voice-coil actuators, similar to those found in speaker drivers, were used as the main transducers in the back and arms of the chairs. This decision was reached after experimentation with various actuators; for an extensive review of current vibrotactile technologies, see Choi and Kuchenbecker (2013). Paddles that make contact with the user’s back were created for the voice coils in the back panel; plates were affixed to the actuators used in the armrests for the fingertips; and a tactile subwoofer provided by Subpac was used in the seat of each chair, where the lowest frequencies were presented.

The signal processing was performed on a Mac Mini running a custom patch built in Max/MSP; an Alesis I/O26 was used to interface between the computer and the chair; four Class-T amplifiers were used to power the actuators. The structure of the chairs was designed by architect Martin Glover, who is profoundly deaf and specifically develops interiors for the deaf and hard-of-hearing. The cushions that house the electronics were built at Queen Mary, University of London. The conduction of vibration from actuator to the user’s back was optimised in a number of ways: the paddles on each actuator were shaped to vibrate effectively at the frequency they were delivering; the actuator housings were laser-cut in foam to hold them tightly in place, maximising vibration coupling; and the actuators were placed to maximise perceptibility, based on the spatial acuity of the skin to vibrotactile stimuli (Karam, Russo, & Fels, 2009).

Mapping strategy and representation of musical parameters

The main mapping model that we followed in the design of the chairs is similar to that found in the tactile vocoder (Brooks & Frost, 1983). A musical signal is passed through a series of bandpass filters; by envelope-following each of these bands, a series of control signals is generated, each reflecting the energy present in its frequency band. These control signals were used to set the amplitude of signals tailored to the tactile range of the skin: in a manner similar to the signal substitution model used by Merchel and Altinsoy (2014), they set the amplitudes of a bank of seven sine tones spaced at half-octave distances from 15 Hz to 120 Hz. Figure 1 shows the signal path and final layout of actuators in the single-seater.

Figure 1. Signal path and actuator placement used in the single chair.
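
As a concrete illustration of this vocoder-style mapping, the following offline Python sketch (our illustration, not the Max/MSP patch used in the chairs) renders an input signal onto the seven tactile carriers. The analysis band edges, filter orders and envelope cut-off are assumptions, as the text does not specify them; in the chairs each carrier drove its own actuator rather than being summed to a single channel.

import numpy as np
from scipy.signal import butter, sosfilt

FS = 44100  # sample rate in Hz

# Tactile carriers: seven sines at half-octave spacing, 15-120 Hz (as in the text).
CARRIERS = 15.0 * 2.0 ** (np.arange(7) / 2.0)  # 15.0, 21.2, 30.0, ..., 120.0 Hz

# Analysis bands for the musical input. These octave-spaced edges from
# 60 Hz upwards are an assumption; the text does not give them.
BAND_EDGES = 60.0 * 2.0 ** np.arange(8)  # 60, 120, ..., 7680 Hz

def envelope(x, fs, cutoff=20.0):
    """Envelope follower: rectify, then low-pass the result."""
    sos = butter(2, cutoff, btype='low', fs=fs, output='sos')
    return sosfilt(sos, np.abs(x))

def tactile_render(audio, fs=FS):
    """Map an audio signal onto seven amplitude-modulated tactile sine tones."""
    t = np.arange(len(audio)) / fs
    out = np.zeros(len(audio))
    for k, carrier in enumerate(CARRIERS):
        # Band-pass one analysis band of the input...
        sos = butter(2, [BAND_EDGES[k], BAND_EDGES[k + 1]], btype='band',
                     fs=fs, output='sos')
        band = sosfilt(sos, audio)
        # ...and let its energy envelope modulate the matching tactile carrier.
        out += envelope(band, fs) * np.sin(2 * np.pi * carrier * t)
    return out / len(CARRIERS)  # summed here only for illustration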

Pitch was spatialised, with pitch height represented from low to high from the seat to the top of the backrest. Equal-loudness curves for vibrotactile perception display a similar variation across the frequency range to the Fletcher-Munson equal-loudness contours for audition (Morley & Rowe, 1990). This was accounted for by equalising each actuation signal according to the frequency it was delivering, in a manner similar to Birnbaum and Wanderley’s (2007) FA/SA system. Timbre was taken into account by taking a noisiness measure of the input signal (a Bark-based spectral flatness measure) using Tristan Jehan’s Max/MSP external analyzer~. This measure was used to drive an interpolation between a 500 Hz sine tone and white noise, output by the panel actuators in the arms of the chair. This simplifies timbral quality to a contrast between smooth and rough stimuli, as seen in the vibrotactile perception of the transition from pure to complex waveforms (Rovan & Hayward, 2000). Finally, in order to accent transients, a high-shelf filter with a cut-off frequency of 3.5 Hz was applied to the amplitude control signals, highlighting any rhythmic onsets in the input signal.
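
The timbre and transient mappings can be sketched in the same spirit. The Python fragment below is again illustrative rather than the original patch: the FFT-based flatness measure stands in for the Bark-based measure from analyzer~, and the frame windowing and boost factor are our assumptions. Note that a high-shelf filter can be written as the input plus a scaled high-passed copy of itself, which is the form used here for the 3.5 Hz transient accent.

import numpy as np
from scipy.signal import butter, sosfilt

FS = 44100  # sample rate in Hz

def spectral_flatness(frame):
    """Noisiness in [0, 1]: geometric over arithmetic mean of the magnitude
    spectrum (the chairs used a Bark-based measure instead)."""
    mag = np.abs(np.fft.rfft(frame * np.hanning(len(frame)))) + 1e-12
    return np.exp(np.mean(np.log(mag))) / np.mean(mag)

def arm_panel_signal(flatness, n, fs=FS):
    """Crossfade between a 500 Hz sine (smooth) and white noise (rough)
    according to the noisiness of the input."""
    t = np.arange(n) / fs
    sine = np.sin(2 * np.pi * 500.0 * t)
    noise = np.random.uniform(-1.0, 1.0, n)
    return (1.0 - flatness) * sine + flatness * noise

def accent_transients(env, fs=FS, cutoff=3.5, boost=2.0):
    """High-shelf-style emphasis on an amplitude control signal: envelope
    changes faster than the cut-off (i.e. note onsets) are boosted.
    The boost factor of 2 is an assumption."""
    sos = butter(1, cutoff, btype='high', fs=fs, output='sos')
    return env + boost * sosfilt(sos, env)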

Conclusions

The final setup at the festival consisted of two single chairs set up as an installation, with a twenty-minute loop of music playing through them that included four distinct styles: dubstep, ambient electronic, classical church organ and contemporary percussion. The double-seater was set up as part of an open workshop with Barbican Drum Works, a 16-piece drum ensemble which provided live input. Responses to the devices were gathered via a paper survey, to which we received 14 responses from profoundly deaf users. The data showed a marked difference in response depending on the type of music used as input: highly rhythmic music elicited responses that were generally more positive than those to music in which harmonic motion was key. Responses such as ‘It feels like dancing while sitting down’ and ‘it’s like the drums are playing right on your back’ came in relation to highly rhythmic input. Music with fewer transients, like the ambient and dubstep pieces, seemed to encourage responses concerned more with the chairs’ value in a therapeutic context, for example ‘The vibrations would work well for diffusing stress in deaf adolescents’. This divergence may signal that devices such as these should be designed more specifically for the musical situation in which they will be used; it may also signal that making the device’s mapping strategy more explicit to the listener could yield a more satisfying musical experience.

Responses generally focused on the clarity of vibration and the sense of rhythm conveyed, rather than on the frequency spectrum distributed across the back of the chair. While this could be seen as reflecting the low frequency resolution of our implementation (seven channels), it also points to a hierarchy in the perception of tactile stimuli. Chafe (1993) has suggested that only certain musical events, such as timing, amplitude, and spectral weighting, are useful as tactile events. The responses we received seem to agree with this statement, while also highlighting the primacy of the rhythmic event in tactile perception.

Notes

This research project began during a placement at the Union Chapel as part of the Organ Project. The first prototypes were built to make the Henry Willis organ at the Chapel more accessible to a deaf audience and were exhibited at the concert The Organ of Corti. The design was then refined with Incloodu and showcased at its Deaf arts festival in January 2015. This research was conducted as part of the EPSRC and AHRC funded Media and Arts Technology Doctoral Training Centre at Queen Mary, University of London. The early stages of this project were greatly aided by Dr Chris Harte, whose guidance helped shape the project as a whole.

Address for correspondence: Robert Jack, School of Electronic Engineering and Computer Science, Queen Mary, University of London, Mile End Road, London, E1 4NS, UK.

Email: r.h.jack@qmul.ac.uk.

References

Birnbaum, D. M., & Wanderley, M. M. (2007). A systematic approach to musical vibrotactile feedback. In Proceedings of the International Computer Music Conference (ICMC) (Vol. 2, pp. 397-404).

Brooks, P. L., & Frost, B. J. (1983). Evaluation of a tactile vocoder for word recognition. The Journal of the Acoustical Society of America, 74(1), 34-39.

Chafe, C. (1993, September). Tactile audio feedback. In Proceedings of the International Computer Music Conference (p. 76). International Computer Music Association.

Choi, S., & Kuchenbecker, K. J. (2013). Vibrotactile display: Perception, technology, and applications. Proceedings of the IEEE, 101(9), 2093-2104.

Fulford, R., Ginsborg, J., & Goldbart, J. (2011). Learning not to listen: the experiences of musicians with hearing impairments. Music Education Research, 13(4), 447-464.

Gunther, E., & O’Modhrain, S. (2003). Cutaneous grooves: composing for the sense of touch. Journal of New Music Research, 32(4), 369-381.

Karam, M., Russo, F. A., & Fels, D. I. (2009). Designing the model human cochlea: An ambient crossmodal audio-tactile display. IEEE Transactions on Haptics, 2(3), 160-169.

Kosonen, K., & Raisamo, R. (2006). Rhythm perception through different modalities. In Proceedings of EuroHaptics (pp. 365-370).

Merchel, S., & Altinsoy, M. E. (2014). The influence of vibrations on musical experience. Journal of the Audio Engineering Society, 62(4), 220-234.

Morley, J. W., & Rowe, M. J. (1990). Perceived pitch of vibrotactile stimuli: effects of vibration amplitude, and implications for vibration frequency coding. The Journal of Physiology, 431(1), 403-416.

Nanayakkara, S., Taylor, E., Wyse, L., & Ong, S. H. (2009, April). An enhanced musical experience for the deaf: design and evaluation of a music display and a haptic chair. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 337-346). ACM.

Rovan, J., & Hayward, V. (2000). Typology of tactile sounds and their synthesis in gesture-driven computer music performance. In Trends in Gestural Control of Music (pp. 297-320).

Russo, F. A., Ammirante, P., & Fels, D. I. (2012). Vibrotactile discrimination of musical timbre. Journal of Experimental Psychology: Human Perception and Performance, 38(4), 822.

Schürmann, M., Caetano, G., Hlushchuk, Y., Jousmäki, V., & Hari, R. (2006). Touch activates human auditory cortex. NeuroImage, 30(4), 1325-1331.