Props, Costumes and Instruments (Performance)

Having considered the use of robotics (see The Cast (Performance)), it is logical to look next at areas of performance practice that make use of extensions or additions to the human body, a sort of ‘augmented reality’ in contrast to the full-on ‘virtual reality’ of the robot as human surrogate. Neither prop, costume nor instrument (or perhaps all three at once), Stelarc’s ‘Extra Ear’ project is an attempt to graft a prosthetic ear made of soft tissue and cartilage onto his own forearm. Though audacious in its vision of anatomical transformation, it is the second phase of the process that is more relevant to the focus of this paper, dealing as it does with the embedding of technological devices into the prosthesis (and elsewhere in the body) to enable the artist to demonstrate a new mode of channelling digital communication signals. When complete, the artist’s new prosthetic ear and one of his teeth will contain equipment that acts in the manner of a distributed Bluetooth-enabled headset. The artist will be able to talk into the ear positioned on his forearm and will receive messages back via a speaker positioned in a tooth cavity. The expectation is that witnesses standing near the artist will be able to hear the voices of other people coming out of his open mouth when he is engaged in dialogue using the equipment.

An organism enhanced by the addition of mechanical or electrical equipment is referred to as a cyborg. As well as being a recurring theme in science-fiction writing, the cyborg has been explored in the realm of art/science, an area of practice where it is sometimes difficult to perceive the precise point at which ‘valid’ science stops and ‘art’ and ‘drama’ begin. Kevin Warwick, professor of cybernetics at the University of Reading, implanted an RFID (radio frequency identification) tag under his skin in 1998, which enabled him to influence mundane computer-controlled objects such as doors, lights and heaters by his mere proximity. This was followed in 2002 by the insertion of a more complex electronic device that interfaced with Warwick’s nervous system, to the extent that a remote robotic arm could be made to mimic the movements of his own arm in real time.

Other figures who have adopted cyborg attributes, and who are more explicitly related to the realm of performance art, include Eduardo Kac and Marcel·lí Antúnez Roca, the latter known for a work entitled Afasia (1998) in which he appears onstage wearing an exoskeleton of various body plates and arm and leg appendages that pass sensory signals to an offstage computer via an ‘umbilical’ cable.

Fig. 5 Antúnez Roca performing Afasia (1998)

Roca shares the stage with a group of robots and a 20’ x 15’ screen, both of which respond to signals produced by his every movement, provoking kinetic responses from the robots and a dazzling variety of graphic and unnerving images on the screen positioned above and behind him.

The use of sensors to gather and transmit movement data is of central importance to the sort of wearable technology that enables participation in 3D virtual reality environments, including items such as datagloves and eye- and head-tracking systems. The force feedback systems available in items like electronic gloves are designed to provide a level of haptic interactivity that demands active engagement on the part of the viewer of the work, thereby bringing a sense of performance to what might otherwise be defined as a purely passive process of viewing.

In the field of musical performance, the use of gesture alone to control the signal output of an instrument has a long history in the form of the Theremin, an instrument created in 1919 by Lev Sergeyevich Termen and introduced to Europe and America in the course of the 1920s. The sounds emitted by the fully electronic system depend on the distance of the performer’s hands from an antenna connected to two radio frequency oscillators. Typically, the position of the right hand (in space) controls the pitch of the signal and the left hand similarly controls the volume, resulting in a uniquely expressive and microtonal instrument that is played without being touched.
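The principle at work here is heterodyning: the hand’s capacitance detunes one of the two radio frequency oscillators, and the audible pitch is the difference between their frequencies. The sketch below illustrates this relationship in minimal form; the oscillator frequency and the inverse-distance detuning model are invented for illustration and are not measurements of any real instrument.

```python
def pitch_from_hand_distance(distance_cm: float) -> float:
    """Map hand-to-antenna distance to an audible beat frequency (Hz).

    One oscillator runs at a fixed reference frequency; a second is
    detuned by the capacitance the player's hand adds to the antenna
    circuit. The audible pitch is the difference between the two.
    All parameter values here are illustrative assumptions.
    """
    f_ref = 170_000.0  # fixed RF oscillator (Hz) -- assumed value
    # Hand capacitance falls off with distance; model the detuning as
    # a simple inverse relationship (a crude illustrative assumption).
    detune = 2_000.0 / (1.0 + distance_cm / 10.0)  # Hz of detuning
    f_variable = f_ref - detune
    return f_ref - f_variable  # audible beat frequency

# Moving the hand closer to the antenna raises the pitch:
far = pitch_from_hand_distance(60.0)   # hand far from antenna
near = pitch_from_hand_distance(5.0)   # hand close to antenna
assert near > far
```

Because the beat frequency varies continuously with hand position rather than in discrete steps, the instrument is inherently microtonal, which accounts for its characteristic gliding sound.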

More recent innovations in electronics-enabled musical performance include Laurie Anderson’s Talking Stick and the Hyperbow, developed by Diana Young at MIT Media Lab. The first of these was used by Anderson in the ‘Song of Moby Dick’ (1999) and was (in the context of this work) consciously meant to represent a harpoon-like object as well as being an instrument in its own right. About six feet in length, the ‘stick’ is sensitive to movement and touch and sends MIDI (Musical Instrument Digital Interface) commands to an audio processing system, allowing the performer to produce a dynamic range of manipulated sounds using unorthodox performative gestures. The Hyperbow looks like a standard carbon fibre bow of the kind one might use with a violin or cello, but includes accelerometers, gyroscopes and force sensors integrated into the frog that measure spatial position, force, speed and bow-bridge distance. Designed originally as an expert analysis aid, its potential to augment techniques for playing traditional bowed instruments is now being explored.
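The common thread in these instruments is the translation of a raw sensor reading into a standard MIDI message that downstream audio software can interpret. The sketch below shows that mapping in its simplest form, scaling a reading into a three-byte MIDI control-change message; it is a generic illustration of the technique, not Anderson’s or Young’s actual implementation, and the controller number and sensor range are invented.

```python
def sensor_to_cc(reading: float, lo: float, hi: float,
                 controller: int = 1, channel: int = 0) -> bytes:
    """Scale a raw sensor reading into a 3-byte MIDI control-change
    message (status byte, controller number, value).

    The controller number and the lo/hi sensor range are hypothetical
    parameters chosen for illustration.
    """
    # Clamp the reading, then scale it to the 0-127 range that a
    # 7-bit MIDI data byte allows.
    clamped = max(lo, min(hi, reading))
    value = round((clamped - lo) / (hi - lo) * 127)
    status = 0xB0 | (channel & 0x0F)  # control change on this channel
    return bytes([status, controller & 0x7F, value & 0x7F])

# A touch halfway along a normalised 0.0-1.0 sensor becomes a
# mid-range controller value; an out-of-range reading is clamped:
mid_msg = sensor_to_cc(0.5, lo=0.0, hi=1.0, controller=1)
top_msg = sensor_to_cc(1.5, lo=0.0, hi=1.0, controller=1)
```

Mappings of this kind are what let a gesture as unorthodox as swinging a six-foot stick, or varying bow pressure at the frog, drive conventional audio processing equipment.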
