Instrument interface and control

I knew early on that I needed an intuitive way of controlling the complex instrument I was developing in order to achieve any meaningful performance results. This is a common issue in the design of digital musical instruments (DMIs), and depends on what Moore first conceptualized as “control intimacy” (Moore, 1988).
The standard personal computer interfaces of screens, keyboards and pointing devices are far too narrow, slow and single-task oriented. Even when using external controllers with physical knobs and switches, there is an unsatisfactory spatial and cognitive split of attention between the physical control input and the “virtual” reality of the screen as visual feedback. To overcome this I had to build a setup of physical interfaces giving complete access to all parameters needed during a performance, so that I could put the screen away and achieve a degree of embodiment of the instrument (Fels, 2004).

As I have described earlier, for many parts of my software patch this could be achieved with a combination of general-purpose MIDI controllers. But after experimenting with many different ways of controlling the selection of large collections of recorded speech segments, including keyboards, tablet computers and touch screens, I found that the best solution was a purpose-built physical controller centred around an XY-plot touch interface, where segments are selected in a two-dimensional space of mean pitch and tempo. This arrangement allows me to quickly recall segments of roughly the speech genre and musical quality needed in a given musical situation. In addition I needed a logical way to control the pan position of signals between my system's 16 output channels. These are grouped like sections of similar instruments, in four groups of four outputs each: four exciters mounted on string instruments (piano, guitar, mbira, zither), four drums (bass drum, snare drum, tabla, tambourine), four cymbals (wind gong, cymbal, toy cymbal, larger gong) and four loudspeakers (stereo monitors and two old radios). I found that with two joysticks, one panning between the groups and the other panning within a group, I could easily control where sound was going. This could have been achieved with two potentiometers as well, but only the joysticks enable gradual transitions between any two outputs in the system.
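The logic of the two-joystick panning can be illustrated with a minimal sketch. This is not the actual patch code: it reduces each joystick to a single 0–1 axis (the real controller may use both axes of each stick), crossfades linearly between adjacent positions, and multiplies the group gain by the within-group gain to obtain a gain for each of the 16 outputs. All names are hypothetical.

```cpp
// Crossfade a 0..1 axis value across four adjacent positions.
// Only two neighbouring positions are ever non-zero, so moving the
// stick glides gradually from one output to the next.
void axisGains(float v, float g[4]) {
    float pos = v * 3.0f;        // map 0..1 onto positions 0..3
    int lo = (int)pos;
    if (lo > 2) lo = 2;          // clamp so lo+1 stays in range
    float frac = pos - lo;
    for (int i = 0; i < 4; ++i) g[i] = 0.0f;
    g[lo] = 1.0f - frac;         // fade out the lower position
    g[lo + 1] = frac;            // fade in the upper position
}

// Combine the two joysticks: one pans between the four groups, the
// other within a group. The product of the two gains gives a level
// for each of the 16 outputs.
void panGains(float groupAxis, float innerAxis, float out[16]) {
    float gGroup[4], gInner[4];
    axisGains(groupAxis, gGroup);
    axisGains(innerAxis, gInner);
    for (int grp = 0; grp < 4; ++grp)
        for (int ch = 0; ch < 4; ++ch)
            out[grp * 4 + ch] = gGroup[grp] * gInner[ch];
}
```

With both sticks at rest, all signal goes to the first output; pushing the group stick fully over while leaving the inner stick alone glides the signal to the first output of the last group, passing through the intermediate groups on the way.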

I built two such controllers using Teensy microcontrollers, tiny computers based on the Arduino platform that, among other things, can act as a generic USB MIDI device. The case was laser-cut from 3 mm plywood. Arduino microcontrollers have been widely used for musical instruments and controllers, and relevant documentation is easily available, for instance in Brent Edstrom's Arduino for Musicians (Edstrom, 2016). The source code I wrote for these controllers can be seen below as well as downloaded here: controller_v2_2.ino
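The core task of such firmware can be sketched in a few lines. This is a simplified, hypothetical fragment, not the controller_v2_2.ino source itself: the Teensy reads 10-bit analogue values from the joystick potentiometers, scales them to 7-bit MIDI controller values, and only sends a Control Change message when the value has actually changed, so the MIDI stream is not flooded with duplicates.

```cpp
// Scale a 10-bit ADC reading (0..1023), as returned by analogRead()
// on a Teensy, down to a 7-bit MIDI controller value (0..127).
// Same integer arithmetic as Arduino's map(value, 0, 1023, 0, 127).
int adcToCC(int adc) {
    return adc * 127 / 1023;
}

// Report whether a new Control Change should be sent, updating the
// stored last value. In the real firmware the send would be a call
// to Teensy's usbMIDI.sendControlChange(control, value, channel).
bool updateCC(int adc, int &lastValue) {
    int value = adcToCC(adc);
    if (value == lastValue) return false;
    lastValue = value;
    return true;
}
```

Because neighbouring ADC readings often collapse to the same 7-bit value, the change check also filters out most of the jitter from the analogue potentiometers.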


When placing a combined setup of these controllers on the music stand of a piano, my experience is that I manage to integrate this new complex instrument into my existing musical relationship with keyboard instruments, and can seamlessly switch between playing the piano and the digital instrument even within the same musical gesture. The additional way of interacting with the instrument through sound input, either voice or piano, makes this integration even tighter.
In this way the instrument is embodied and with some practice can function as an extension of my musicality, in accordance with Moore’s and Fels’ measure of intimacy.

This is not to say that this is a general musical instrument with an interface that will be intuitive and easy to use for any untrained person. Like any instrument it needs practice to master, and like most complex digital systems this instrument is based on certain preconceived notions about what kind of things will be interesting to do and what kind of music will result, with aesthetic choices embedded in every step of the design and construction of the instrument. There is an obvious overlap between the way I already approach improvisation and think musically on the keyboard, and the musical output it is possible to create with the digital instrument. The idea was never to design a general-purpose musical instrument, but to realise a personal artistic vision, an extension of musical ideas that I was unable to realise on keyboard instruments alone.

References

Edstrom, B. (2016). Arduino for Musicians: A Complete Guide to Arduino and Teensy Microcontrollers. Oxford University Press.

Fels, S. (2004). Designing for Intimacy: Creating New Interfaces for Musical Expression. Proceedings of the IEEE, 92(4), 672–685. https://doi.org/10.1109/JPROC.2004.825887

Moore, F. R. (1988). The dysfunctions of MIDI. Computer Music Journal, 12(1), 19–28.