Computer interfaces have been driven by the mouse/keyboard paradigm for the last 20 years, and music software naturally follows this convention, yet musical performance demands many and varied types of control. The Musical Instrument Digital Interface (MIDI) is a protocol developed in 1983 to standardize the control of digital musical instruments, and it has since been integrated into personal computer music software. Most software packages, however, allow only simple uses of MIDI data, such as note data from a MIDI keyboard, but not control of more complex operations such as cueing different songs or tracks. These higher-level operations were developed well after the MIDI specification was designed, and their control resides largely in the typical interface of the computing platform: the mouse and keyboard.
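To make the limitation concrete, the following is a minimal sketch of how incoming MIDI messages might be routed either to ordinary note handling or to a higher-level song-cueing operation. It assumes the Python mido library; the SONGS table and cue_song function are hypothetical stand-ins for whatever cueing mechanism a given performance system provides.

    import mido

    # Hypothetical mapping from MIDI program numbers to song files.
    SONGS = {0: "song_a.wav", 1: "song_b.wav"}

    def cue_song(program):
        # Placeholder for a higher-level operation: loading and
        # arming the computer accompaniment for the next song.
        track = SONGS.get(program)
        if track is not None:
            print("cueing", track)

    # Route incoming MIDI by message type: note data is handled in the
    # usual simple way, while a program-change message triggers the
    # higher-level cueing operation instead of the mouse and keyboard.
    with mido.open_input() as port:
        for msg in port:
            if msg.type == 'note_on':
                print("note", msg.note, "velocity", msg.velocity)
            elif msg.type == 'program_change':
                cue_song(msg.program)

Nothing in the MIDI protocol itself prevents such a mapping; the limitation lies in the software packages, which rarely expose these operations to MIDI control.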
As a result of this interaction model, performers using live instruments with computer accompaniment are forced to transition between playing their instrument and using the computer to set up the next song. There is a distinct rift between the audience's perceptions of these two actions. The physical action of the live instrument is easy to grasp; its effect is tangible. The action on the computer is abstract, and its result is harder to gauge. The net result is that the impact of the performance is limited, as the energy built by the live instrument drains away whenever the performer is forced to interact directly with the computer.