Music, Audio, Graphics, Lights and Effects (Performance)

One of the most widely used tools across the performing arts is Max/MSP, a graphical programming environment for music, audio and multimedia developed by Cycling ’74. Users build programs by connecting ‘objects’ that represent actions and entities; when objects are rearranged into different sequences, the code embedded in them moves with them. Max is the basic environment, while MSP adds a set of audio processing objects that enable the user to carry out a wide range of signal manipulation tasks. Cycling ’74 also produces Jitter, which extends the Max/MSP environment to support the real-time manipulation of video and 3D graphics. Though optimized for very fast graphics handling, Jitter can in fact process any kind of numerical data that can be input into a computer, because it abstracts all data as multidimensional matrices. Taken together, Max/MSP/Jitter is a very powerful and wide-ranging package, and a large and active international community uses it for live music events and for performance and installation artworks.

In a recent joint announcement, Cycling ’74 and Ableton, makers of the powerful Ableton Live software, stated that the two companies have formed a strategic partnership and will be looking for ways to combine their respective strengths in future software releases. Ableton Live is a music creation, production and performance platform used by a roster of very prestigious international performers for all phases of musical creativity, from rough audio ‘sketching’ to complex finished compositions. In contrast to these commercial offerings, SuperCollider, an open source system authored by James McCartney, is another object-oriented programming tool with potential applications across audio and video, though it is particularly suited to the creation and manipulation of music. Its adherents also form an active and enthusiastic user community. (There is also a Windows-compatible version called Psycollider.)

The use of MIDI Show Control (MSC) for handling complex sequences of lighting, audio, pyrotechnics, scenery movements, atmospheric effects and so on has been commonplace in performance venues for many years. Like the MIDI music protocol, MSC interfaces between devices that recognise commands to cue or end specific encoded instructions, and can therefore potentially work with any piece of equipment that accepts MIDI input. Given this long history of use, it is perhaps more interesting in this context to focus on the use of lighting and special effects in virtual environments and the types of tools designers can use to enhance these spaces. Radiance is an example of an open source product that handles sophisticated virtual lighting schemes by calculating the amount of light passing through a specific point in a specific direction, a technique known as ‘ray tracing’. Fig. 6 shows an example of the type of lighting simulation that can be produced: a corner of a room in the House of the Vettii in Pompeii as it might have appeared before 79 A.D., when illumination would most likely have been supplied by olive oil lamps, compared with the same room under modern lighting conditions. The resulting effects on the visibility of the frescoes are discussed by Devlin et al. in their report.
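To make the protocol concrete, the sketch below builds a raw MSC "GO" message as SysEx bytes, following the published MIDI Show Control byte layout (0xF0 0x7F = universal real-time SysEx, sub-ID 0x02 = MSC, cue numbers carried as ASCII). The helper name `msc_go` and the sample device ID and cue number are illustrative, not from the source.

```python
# Sketch: building a MIDI Show Control (MSC) "GO" message by hand.
# Byte values follow the MSC specification; the helper itself is hypothetical.

def msc_go(device_id: int, cue: str, command_format: int = 0x01) -> bytes:
    """Build an MSC GO message (command_format 0x01 = Lighting, General)."""
    if not 0 <= device_id <= 0x7F:
        raise ValueError("device_id must fit in 7 bits")
    return bytes(
        [0xF0, 0x7F, device_id, 0x02, command_format, 0x01]  # 0x01 = GO command
        + [ord(c) for c in cue]                              # cue number as ASCII
        + [0xF7]                                             # end of SysEx
    )

msg = msc_go(device_id=0x01, cue="23.5")
print(msg.hex(" "))  # f0 7f 01 02 01 01 32 33 2e 35 f7
```

Any lighting desk or playback device that accepts MIDI input and implements MSC could, in principle, act on bytes like these, which is what makes the protocol so broadly applicable in show control.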

Fig. 6 House of the Vettii with simulated modern lighting (left) and olive oil lamp (right) – Kate Devlin et al

The software is complex and demands a high level of user knowledge, but it can produce stunning results. A commercial alternative comes in the form of EIAS Animator, developed by the Electric Image Technology Group. Used in conjunction with EIAS Camera, these products claim to significantly reduce the time needed to render animations whilst still producing impeccable results. An interesting account of a project from 2001 that produced a computer model of the Crystal Palace as it would have appeared in 1851 refers both to Radiance and to an Electric Image software product called Universe. The summary usefully highlights some of the challenges facing VR simulation projects and indicates the resources required to illuminate and render complex 3D models effectively.
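The ray-tracing idea described above can be illustrated in miniature: follow a ray from the eye into the scene, find what it hits, and estimate how much light arrives at that point. This is a toy sketch of the general technique, not Radiance's actual algorithm; all names and scene values are invented for illustration.

```python
# Toy ray tracing: one ray, one sphere, one light, Lambertian shading.
import math

def ray_sphere(origin, direction, center, radius):
    """Distance along a unit-length ray to the sphere, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c  # quadratic discriminant (a = 1 for unit direction)
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def shade(point, center, light):
    """Lambertian shading: brightness ~ cosine of the light's angle of incidence."""
    n = [p - c for p, c in zip(point, center)]      # surface normal direction
    l = [lt - p for lt, p in zip(light, point)]     # direction towards the light
    nn = math.sqrt(sum(x * x for x in n))
    ll = math.sqrt(sum(x * x for x in l))
    return max(0.0, sum(a * b for a, b in zip(n, l)) / (nn * ll))

# One ray straight down the z-axis at a unit sphere five units away.
t = ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
hit = (0, 0, t)
print(round(shade(hit, (0, 0, 5), (0, 5, 0)), 3))  # prints 0.625
```

A production renderer such as Radiance traces vast numbers of such rays, including reflected and scattered ones, per image, which is why illuminating and rendering complex 3D models demands the resources the Crystal Palace account describes.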
