People:
Kristen Loree
Peter Gilbert
Karola Obermüller
Jane daPain
Artists' Statements:
Loree
Ox
Gilbert
Obermüller
daPain
Ursonate links:
Scores from Schwitters' performance:
Ox scores for painted visualization:
Theme images for Visualization of Ursonate:
Movement I
Movement II
Movement III
Our current grant is to develop a powerful tool for interactive performance: a newly designed iPad application that will use the tablet device as an interface for controlling and transforming sound, image, and video. To have a fully customized tablet controller tailor-made for our performance, we will employ a programmer to address the specific challenges of writing iPad application code. We will then go back and forth with the programmer through several phases of testing and development as we identify all of the specific requirements our vision entails, and we can then rehearse with and perfect the interface for performance.
The interface will serve a number of purposes and will therefore offer a variety of performance screens, each fitted to the needs of a particular collaborator for a particular part of the performance.
The interface screen mock-up shows the score in the large box on the left and the current projection in the upper-right box, giving Loree an immediate connection to what is being projected. Across the bottom of her interface screen are a number of the syllabic materials she may want to use for the cadenza, available for her to select at a touch.
Her selection of thematic material will then be communicated over the network to the interface for Jack Ox, who can then synchronize her projections with Loree's improvisational choices in real time. In fact, for the projection of Ox's images throughout the entire piece, the interface will allow for the spontaneous arrangement and movement of images within the projection, something heretofore predetermined and inflexible.
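As a concrete illustration of this link, the following is a minimal sketch (not the finished iPad application) of how one interface could announce a cadenza selection to the other over the local network. It assumes the messages travel as OSC, the same protocol named below for the audio link; the host, port, address pattern, and syllable identifier shown here are hypothetical placeholders.
```python
# Minimal sketch: the vocalist's interface announcing a cadenza selection to
# the projection interface over the local network. Assumes the message travels
# as OSC over UDP; host, port, address pattern, and syllable identifier are
# hypothetical placeholders, not the project's actual message scheme.
from pythonosc.udp_client import SimpleUDPClient

PROJECTION_HOST = "192.168.1.20"   # hypothetical address of Ox's interface
PROJECTION_PORT = 9000             # hypothetical listening port

client = SimpleUDPClient(PROJECTION_HOST, PROJECTION_PORT)

def send_selection(syllable_id: str, movement: int) -> None:
    """Tell the projection interface which syllabic theme was just selected."""
    # One message per touch: the projection side can then switch to the
    # matching image material in real time.
    client.send_message("/ursonate/cadenza/selection", [syllable_id, movement])

# Example: the performer touches a "fumms" theme button during the cadenza.
send_selection("fumms", 1)
```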
Using the motion capture system at the University of New Mexico's Art, Research, Technology, Science Laboratory (ARTS Lab), we isolated the movements of Loree's body as she performed Ursonate. A 2010 RAC grant provided funds to apply this motion-capture data to a section of the third movement of Ursonate; however, to date we have only had the resources to apply it to 4 minutes of the 35-minute piece. Our new software will incorporate manipulation of the visual image and should be able to use this data far more extensively, to great advantage.
An aspect of the application will also be a control surface for manipulating audio. Gilbert and Obermüller will do the programming for audio manipulation in Max/MSP,7 a software environment for programming live audio processing, and the interface will communicate with Max/MSP via OSC (Open Sound Control). This will allow Gilbert and Obermüller to take the audio of Loree's live performance and diffuse it throughout the space using Zirkonium, a cutting-edge tool for controlling a surround-sound space.8 The program can be set up to adapt to different rooms, so that it will be responsive to whatever speaker configuration we find ourselves with. The tool will also provide a setup for more freewheeling shaping of, and improvisation with, sound in the newly created work that follows the Ursonate performance.
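The tablet-to-Max/MSP link can be sketched in a similar, equally provisional way. The example below assumes OSC messages sent over UDP to a port on which the Max patch is listening (for instance via a [udpreceive] object feeding the spatialization logic and Zirkonium); the port number and address names are hypothetical placeholders rather than the project's final message scheme.
```python
# Minimal sketch of the control-surface-to-Max/MSP link, assuming OSC over UDP.
# In the Max patch, a [udpreceive 7400] object could decode these messages and
# a [route /diffuse/azimuth /diffuse/gain] object could pass the values on to
# the spatialization logic driving Zirkonium. Port and addresses are
# hypothetical placeholders.
from pythonosc.udp_client import SimpleUDPClient

max_msp = SimpleUDPClient("127.0.0.1", 7400)  # Max/MSP listening on this machine

def diffuse(azimuth_degrees: float, gain: float) -> None:
    """Send one spatialization gesture from the tablet control surface."""
    max_msp.send_message("/diffuse/azimuth", azimuth_degrees)
    max_msp.send_message("/diffuse/gain", gain)

# Example gesture: sweep the live vocal around the room while fading it up.
for step in range(0, 360, 10):
    diffuse(float(step), min(1.0, 0.5 + step / 720.0))
```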
Notes: