Interface design

System and interface design

The methods for interfacing with and controlling my software became really important once I had a stable minimum of functionality and started using it for improvisation. It then became clear that the software needed a very clean layout with few unnecessary functions, so that I could navigate and manipulate it quickly. This led to a process of simplification, which was very helpful, as I had to consider what I really needed access to during performance and how that would best be controlled.

From the beginning I had programmed and tested the different stages of analysis, transformation and sound output inside the Max/MSP programming environment, relying on IRCAM's excellent FTM/Gabor and MuBu/PiPo libraries. Since I thought it would be a good idea to use an existing framework for routing audio, playing files and doing standard audio processing, I went on to put these Max patches inside Ableton Live, trying to keep a modular approach. The problem with Live is that it only transmits audio or MIDI signals between modules/devices, so to keep the modular approach I had to combine the modules in Max and load them together as one Live device. For a while I kept developing the software this way, programming in Max and testing it in Live. However, due to the routing limitations, frequent crashes, and the fact that I constantly had to switch between tracks and views in Live to see what I was doing, I decided to make this work as a standalone patch in Max instead, even though that meant programming the whole host environment myself, with a system for navigating and playing files, mapping controllers and routing audio.
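The controller-mapping layer such a standalone host needs can be sketched as a small routing table: incoming MIDI control-change messages are looked up and scaled into the ranges of named module parameters. This is only an illustrative sketch in Python; the class and parameter names (`ControlMap`, `"granulator"`, `"grain_size"`) are invented here and do not come from the actual patch.

```python
class ControlMap:
    """Routes (MIDI channel, CC number) pairs to named module parameters."""

    def __init__(self):
        self.bindings = {}   # (channel, cc) -> (module, param, lo, hi)
        self.state = {}      # (module, param) -> current value

    def bind(self, channel, cc, module, param, lo=0.0, hi=1.0):
        """Register a controller dial/fader as the source for one parameter."""
        self.bindings[(channel, cc)] = (module, param, lo, hi)

    def handle_cc(self, channel, cc, value):
        """Scale a 0-127 CC value into the bound parameter's range."""
        if (channel, cc) not in self.bindings:
            return None
        module, param, lo, hi = self.bindings[(channel, cc)]
        scaled = lo + (hi - lo) * value / 127.0
        self.state[(module, param)] = scaled
        return module, param, scaled

# Hypothetical binding: dial CC 20 on channel 1 sets a grain size in ms.
cmap = ControlMap()
cmap.bind(1, 20, "granulator", "grain_size", lo=5.0, hi=500.0)
module, param, val = cmap.handle_cc(1, 20, 64)
```

The point of keeping the table in one place is that remapping a physical controller to a different instrument only touches the bindings, not the audio modules themselves.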

Below are some screenshots showing the evolution of a synthesis device.

[Screenshot: Simple formant analysis/synthesis]
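The formant analysis/synthesis idea can be sketched as a classic source-filter model: a pulse-train source filtered through a few resonators placed at formant frequencies (similar in spirit to MSP's reson~ object). This is a minimal Python illustration of the general technique, not the actual device; the formant values are rough textbook estimates for the vowel /a/, not taken from the software.

```python
import math

SR = 44100  # sample rate in Hz

def resonator(signal, freq, bw):
    """Two-pole resonant filter centred at freq with bandwidth bw (Hz)."""
    r = math.exp(-math.pi * bw / SR)
    a1 = 2.0 * r * math.cos(2.0 * math.pi * freq / SR)
    a2 = -r * r
    y1 = y2 = 0.0
    out = []
    for x in signal:
        y = x + a1 * y1 + a2 * y2
        out.append(y)
        y2, y1 = y1, y
    return out

def pulse_train(f0, n_samples):
    """Crude glottal source: one unit impulse per pitch period."""
    period = int(SR / f0)
    return [1.0 if i % period == 0 else 0.0 for i in range(n_samples)]

# Cascade three resonators at rough /a/ formants over a 110 Hz source.
voice = pulse_train(110.0, SR // 10)
for freq, bw in [(700, 80), (1220, 90), (2600, 120)]:
    voice = resonator(voice, freq, bw)
```

In an analysis/synthesis setting, the fixed formant values above would instead be estimated continuously from the input signal and fed to the resonators.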

 

[Screenshot: Adding envelope, voiced/unvoiced and transformation controls]

 

[Screenshot: A combination of modules for segmentation, analysis, transformation and output]

[Screenshot: Simplification of controls; modules centered around xy-squares for easy touch-screen control]
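The xy-square idea is simply that one touch position drives two parameters at once, which suits touch screens far better than rows of tiny dials. A sketch of the mapping, with parameter ranges invented here for illustration:

```python
def xy_to_params(x, y, x_range, y_range):
    """Map normalized touch coordinates (0..1) onto two parameter ranges."""
    (x_lo, x_hi), (y_lo, y_hi) = x_range, y_range
    return (x_lo + (x_hi - x_lo) * x,
            y_lo + (y_hi - y_lo) * y)

# Hypothetical assignment: x is transposition in semitones, y is dry/wet mix.
transpose, mix = xy_to_params(0.5, 0.25, (-12.0, 12.0), (0.0, 1.0))
```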

[Screenshot: Controls laid out as color-coded rows of dials and faders]

 

Physical Control

It also became evident very soon that a screen and mouse are not a good enough interface for improvising music on software instruments. I tried touchscreens such as the iPad, but still felt the need for something more physical. One could of course build custom controllers with loads of dials and buttons, but for now I have decided to use small, ready-made controllers with a limited set of functions in order to retain an overview, and to dedicate one controller to each "instrument", such as the MIDI-controlled mechanical piano, the transducer-powered acoustic drums and so on. Linking each physical controller to a separate instrument, combined with colour labels that group individual controls into subgroups, seems to work quite well perceptually.
The system will no doubt continue to evolve, both in terms of functions and interface, but as of now this is a baseline that works for improvising, with little time wasted trying to navigate hundreds of parameters and settings.

[Photo: labeled MIDI controllers]

[Screenshot: the whole system in a Max patch]