Virtual violins and computer-based guitars need more than just the right sounds if they’re to work for deaf musicians. Richard Burn describes his research into accessible virtual instruments.


In general, musicians who play acoustic instruments enjoy a multi-sensory experience. They don’t just hear the sound of their instrument – they also feel its vibrations and see how it moves.

Virtual instruments, on the other hand, generally only produce sound. Players often describe them as cold or lifeless. For hearing musicians, this doesn’t necessarily affect the usability of the instrument, but for deaf musicians the lack of feedback through touch and sight can limit their enjoyment of virtual instruments – or even their ability to play them – to the same standard as acoustic ones.

The aim of my research is to discover how dependent deaf musicians are on the more physical aspects of how an acoustic instrument feels and moves when played. I hope to identify how touch and sight are experienced by the player and then develop systems that can provide similarly useful feedback for virtual software instruments. If successful, these systems will allow deaf musicians to overcome some of the problems normally associated with playing virtual instruments and open up new performance avenues.

My research is based in the field of human-computer interaction (HCI). I’ll be trying to answer a number of questions:

  • How do deaf musicians interact with acoustic instruments? It will be necessary to understand how deaf virtuoso players, such as Dame Evelyn Glennie, use sensory information.
  • What types of feedback through sight and touch are the most rewarding? This will involve surveying a large number of players of all abilities.
  • What physical or psychological limitations should be considered to ensure a good user experience? Comfort, ease of use, complexity, physical size and even affordability will all be assessed.

Recent research has investigated the way deaf people experience the act of listening to music, and other work has explored how deaf musicians interact, maintaining timing and relative pitch, when playing acoustic instruments in an ensemble. Evelyn Glennie describes using her whole body as a huge ear, accurately interpreting vibrations as she plays.

Delivering artificial sound patterns to the body through the skin will be challenging. The skin’s sensitivity is generally quite coarse, so simply converting sound to vibration and expecting it to convey basic qualities of pitch and volume will be difficult. The huge sound palette available in software synthesisers will be the most difficult to model, and it seems likely that simple frequency-to-vibration conversion will not be a suitable way of simulating these sounds.
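To make that limitation concrete, a naive frequency-to-vibration conversion of the kind described above might be sketched as follows. This is a minimal illustration only – the tactile band limits, the audible range and the function name are my own assumptions, not part of the research. The whole audible range has to be squeezed into the narrow band where the skin senses vibration well, so a great deal of pitch detail is inevitably lost:

```python
import math

# Assumed vibrotactile band: the skin responds to vibration in a
# much narrower range than hearing (values chosen for illustration).
TACTILE_MIN_HZ = 20.0
TACTILE_MAX_HZ = 400.0

# Assumed audible range to be mapped down onto the skin.
AUDIO_MIN_HZ = 20.0
AUDIO_MAX_HZ = 20000.0

def audio_to_tactile(freq_hz: float) -> float:
    """Naively compress an audible frequency into the tactile band.

    Both ranges are treated logarithmically, since perceived pitch
    is closer to log-scaled than linear.
    """
    freq_hz = min(max(freq_hz, AUDIO_MIN_HZ), AUDIO_MAX_HZ)
    # Relative position of the input frequency in the audible range, 0..1.
    t = (math.log(freq_hz) - math.log(AUDIO_MIN_HZ)) / (
        math.log(AUDIO_MAX_HZ) - math.log(AUDIO_MIN_HZ))
    # Same relative position inside the tactile band.
    return TACTILE_MIN_HZ * (TACTILE_MAX_HZ / TACTILE_MIN_HZ) ** t

# A2 on a guitar (110 Hz) and A5 three octaves higher (880 Hz) end up
# far less than three octaves apart on the skin.
low = audio_to_tactile(110.0)
high = audio_to_tactile(880.0)
```

Notes separated by several octaves land within little more than one octave of vibration, which illustrates why a richer encoding than direct frequency conversion is likely to be needed.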

I have been experimenting with visually representing sound in a meaningful way, and it is becoming clear that the sound needs to be represented in real time and as accurately as possible. However, the representation needs to be simple enough to be interpreted quickly, so I plan to avoid overly complex displays and am looking instead at simple animated line figures that respond to the sound being generated.
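As a rough illustration of how such a line figure could be driven in real time (a hedged sketch under my own assumptions – the frame size, parameter ranges and crude zero-crossing pitch estimate are not from the research), each short audio frame can be reduced to a couple of display parameters:

```python
import math

def frame_to_line(samples: list, sample_rate: int = 44100) -> dict:
    """Reduce one short audio frame to parameters for an animated line.

    Loudness (RMS) drives the line's thickness, and a crude pitch
    estimate from the zero-crossing rate drives its vertical position.
    """
    n = len(samples)
    rms = math.sqrt(sum(s * s for s in samples) / n)
    # Zero crossings give a rough dominant frequency for simple tones.
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0))
    pitch_hz = crossings * sample_rate / (2 * n)
    return {
        "thickness": min(1.0, rms * 4),       # 0..1 line weight
        "height": min(1.0, pitch_hz / 2000),  # 0..1 vertical position
    }

# A quiet 440 Hz sine frame, as a simple check: the figure should sit
# low on the screen with a medium-weight line.
frame = [0.25 * math.sin(2 * math.pi * 440 * t / 44100)
         for t in range(1024)]
params = frame_to_line(frame)
```

A real display would update these parameters many times per second, and the article’s point about accuracy suggests a proper pitch tracker would replace the zero-crossing estimate; the sketch only shows the shape of the sound-to-display mapping.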

Throughout the whole process, I’ll be maintaining a solid user-based focus and ensuring that the design conforms to best practice in terms of user experience and inclusive design. 

The combination of both vibration and visual feedback will help to ‘fill in the gaps’ in hearing for deaf musicians. I’ll be working very closely with users of the system throughout the project to make sure their needs and desires are catered for.

Richard Burn is a Research Student at Birmingham City University. The research is funded by the Arts and Humanities Research Council and the Midlands3Cities Doctoral Training Partnership.
