The design of interactive computer systems allows sonic artists to extend the limits of a musical instrument. This may mean expanding the timbral palette with processing, extending the tessitura with pitch shifting, extending polyphony with sampling and delay lines, and much more.
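As an illustrative sketch of one such extension, a feedback delay line can be written as a few lines of buffer arithmetic. This is a hypothetical example for clarity, not code from any of the tools listed below; the function name and parameters are invented for illustration:

```javascript
// Minimal feedback delay line over a mono sample buffer.
// Hypothetical sketch: applyDelay and its parameters are illustrative,
// not taken from any specific tool mentioned in this document.
function applyDelay(input, delaySamples, feedback, mix) {
  const out = new Float32Array(input.length);
  const buf = new Float32Array(delaySamples); // circular delay buffer
  let pos = 0;
  for (let i = 0; i < input.length; i++) {
    const delayed = buf[pos];                 // read the delayed sample
    out[i] = input[i] + mix * delayed;        // blend dry and wet signals
    buf[pos] = input[i] + feedback * delayed; // write back with feedback
    pos = (pos + 1) % delaySamples;           // advance the circular pointer
  }
  return out;
}
```

Fed an impulse, the line returns echoes every `delaySamples` samples, each scaled by `feedback`, which is the basic behavior a performer of an augmented instrument has to internalize.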
When extending an acoustic instrument, the performer must learn the augmented instrument's abilities and behaviors before a composition can be performed. Some augmented instruments lend themselves to improvisation; others demand very specific performance instructions to achieve the desired results. In both cases, an extension to the written language of musical notation has to be devised and communicated.
In some cases the computer functions as a meta-instrument: a collection of physical, virtual, or metaphorical inputs mapped onto a synthesis engine, creating an instrument that can be performed. With changes in software, peripherals, and synthesis, every piece can be a different instrument. When composing for a Laptop Orchestra or Mobile App Orchestra, the first hurdle is teaching the new instrument to the performers. Only after the instrument is understood and practiced can a composition be learned and rehearsed.
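The mapping step can be illustrated in general terms. The sketch below is hypothetical (not NexusUI's or Max's actual API) and maps a normalized controller value, such as a touch position, onto a pitch range exponentially, so equal gestures produce equal musical intervals:

```javascript
// Hypothetical mapping sketch: scale a normalized controller value (0..1)
// onto a frequency range. The exponential curve means equal steps in the
// gesture give equal pitch intervals. Names are illustrative only.
function mapToFrequency(value, lowHz = 110, highHz = 880) {
  // Exponential interpolation between the low and high frequencies.
  return lowHz * Math.pow(highHz / lowHz, value);
}
```

Swapping this mapping function, the input device, or the synthesis engine it feeds is exactly what makes each piece a different instrument.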
### Tools Developed for Interactive Composition:
- Bio
- NexusUI
- MaxMSP/Jitter work
### Interactive Composition Examples:
- Mouth Harp
- [[Preparing the Land]]
- [[Critical Mass]]
- Krumpus