Real-Time and Real-Space Interface

The Aesthetic Lab's technological environment provides a real-time and real-space interface that enables the researchers involved in the project (composers, musicologists, philosophers, choreographers, and dancers) to carry out the kind of intermedial explorations particular to EGM. The real-space interface tracks the dancers' movements by means of a Vicon motion capture system consisting of an array of 15 infrared cameras and covering a tracking volume of about 100 m³. Within this volume, the location and orientation of body joints can be traced with very high temporal and spatial resolution (less than 1 mm in position, at a rate of 120 Hz) using a special suit (see photo to the right). The tracking precision thus achievable with respect to movements in real space is an important prerequisite for intermedial embodiment to occur, as – depending on the kind of intermedial exploration – even the most subtle movements of the dancers may need to be transposed consistently into sound.
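To illustrate why the stated 120 Hz update rate and millimetre precision matter for transposing subtle movement into sound, the following is a minimal sketch (in Python, for illustration only; the actual processing happens in SuperCollider) of deriving a joint's speed from two consecutive tracked positions. The function name and the example coordinates are hypothetical, not part of the lab's software.

```python
import math

SAMPLE_RATE_HZ = 120.0        # Vicon update rate stated in the text
DT = 1.0 / SAMPLE_RATE_HZ     # time between consecutive frames (~8.3 ms)

def joint_speed(p_prev, p_curr, dt=DT):
    """Finite-difference speed (m/s) between two 3-D joint positions in metres."""
    return math.dist(p_prev, p_curr) / dt

# A hand moving only 2 mm between frames still yields a usable control value:
speed = joint_speed((0.0, 0.0, 1.0), (0.002, 0.0, 1.0))  # 0.24 m/s
```

At sub-millimetre spatial resolution, even such a 2 mm inter-frame displacement sits well above the noise floor, which is what makes a consistent movement-to-sound mapping feasible.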

Room to Move

Another particularity of the Aesthetic Lab technology is the extent of the tracking volume, which allows a stage space of more than 40 m² to be tracked reliably, leaving enough room for choreographic work and the unfolding of dance improvisation (see photo below).

Software Environment

A specially designed program (QVicon2OSC) translates the Vicon tracking data into the Open Sound Control (OSC) format so that it can be used in the real-time motion and audio processing component developed in the SuperCollider sound synthesis language. The resulting sound is projected through hemispherical loudspeaker arrays (of 24 or 29 loudspeakers, depending on where the lab is installed) arranged around the tracking volume, using either Higher Order Ambisonics (HOA) spatialisation approaches (CUBEmixer software) or other methods that drive the speaker array directly.
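To make the Vicon-to-OSC translation step concrete, here is a minimal Python sketch of encoding a joint position as an OSC message, following the OSC 1.0 binary format (NUL-terminated strings padded to 4-byte boundaries, a comma-prefixed type tag string, big-endian 32-bit floats). The address `/joint` is a hypothetical example; QVicon2OSC's actual address scheme is not documented here.

```python
import struct

def osc_string(s: str) -> bytes:
    """Encode an OSC string: ASCII bytes, NUL-terminated, padded to 4 bytes."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * ((4 - len(b) % 4) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Encode an OSC message whose arguments are all 32-bit floats."""
    typetag = "," + "f" * len(floats)          # e.g. ",fff" for an x/y/z triple
    args = struct.pack(">%df" % len(floats), *floats)
    return osc_string(address) + osc_string(typetag) + args

# Hypothetical example: send one tracked joint position (metres) per frame.
msg = osc_message("/joint", 0.12, 1.55, 0.98)
```

A message built this way can be handed to any OSC-aware receiver, such as a SuperCollider `OSCdef` listening on the corresponding address.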