Software

The basic infrastructure of the Aesthetic Lab consists of 15 Vicon M2 tracking cameras, a V624 data station, a hemispherical loudspeaker array with 24 or 29 speakers, a planar loudspeaker array with 48 or 64 speakers, the Vicon iQ 2.5 tracking application, and IEM's CUBEmixer sound projection software.

These elements have been extended and connected through software components developed in the context of the project. The components implemented in SuperCollider form the EGM Toolkit.

EGM Toolkit

QVicon2OSC

This is a relay application that translates the Vicon tracking data into an Open Sound Control (OSC) data stream in order to connect the tracking system to a motion data processing and sound synthesis system implemented in SuperCollider. QVicon2OSC was originally developed in the SonEnvir project and has been extended substantially in the context of EGM.
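On the SuperCollider side, such an OSC stream can be picked up with a standard OSC responder. The following minimal sketch illustrates the principle; the address pattern /qvicon/marker and the message layout (marker name followed by x, y, z coordinates) are assumptions made for illustration and do not reproduce the actual QVicon2OSC protocol.

(
OSCdef(\qviconMarkers, { |msg, time, addr, recvPort|
    // msg[0] is the address; the remaining elements are assumed to be
    // a marker name followed by its x, y and z coordinates
    var name = msg[1], x = msg[2], y = msg[3], z = msg[4];
    [name, x, y, z].postln;
}, '/qvicon/marker');
)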

EGMParameter

This SuperCollider class allows parameters in the EGM Toolkit to be controlled and monitored via a Behringer BCF2000 motor fader box. This lets the researchers perform certain aspects of EGM scenarios manually, in concert with the tracked movements of the dancers, and supports a more improvisatory exploration of scenarios, as they can be reconfigured interactively.
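The BCF2000 sends ordinary MIDI control change messages, which SuperCollider can map onto a control value as in the sketch below. The controller number 81, the example parameter and its range are illustrative assumptions, not the actual EGMParameter implementation.

(
MIDIClient.init;
MIDIIn.connectAll;
~cutoff = 1000;  // example parameter, assumed for illustration
MIDIdef.cc(\bcfFader1, { |val, num, chan, src|
    // map the 0..127 fader range exponentially to 200..8000 Hz
    ~cutoff = val.linexp(0, 127, 200, 8000);
    ("cutoff: " ++ ~cutoff.round(1)).postln;
}, ccNum: 81);
)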

EGMTrackingInput

This SuperCollider class receives OSC tracking data from the QVicon2OSC application, which can also be controlled through this class via OSC (network configuration, selection of data channels, etc.). Callbacks can be registered with EGMTrackingInput to select the data channels to be received. Grouping allows data channels to be structured and treated as aggregates. Data can be received through callbacks or by explicit polling. A signature describing the format of the data channels selected in the QVicon2OSC application is analysed by EGMTrackingInput and used to decode the OSC data stream.
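The callback registration and polling described above follow a common pattern, sketched here in plain SuperCollider; the environment variable names and the example channel are hypothetical and do not reproduce the EGMTrackingInput API.

(
~callbacks = Dictionary.new;
~latest = Dictionary.new;

// register a callback for a named data channel
~register = { |channel, func| ~callbacks[channel] = func };

// called once per incoming frame: store the value and fire the callback
~dispatchFrame = { |channel, data|
    ~latest[channel] = data;
    ~callbacks[channel].value(data);
};

~register.(\rightHand, { |pos| ["right hand at", pos].postln });
~dispatchFrame.(\rightHand, [0.3, 1.2, 0.9]);   // simulated incoming frame
~latest[\rightHand].postln;                     // explicit polling of the last value
)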

EGMTrackingPlayer

This SuperCollider class plays back tracking data that has been received with EGMTrackingInput and recorded in a SuperCollider data structure. Upon playback, EGMTrackingPlayer connects to EGMTrackingInput and simulates the protocol of the QVicon2OSC application, so that EGM scenarios can be run with recorded tracking data while the tracking system is offline.
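The playback principle can be sketched as a routine that steps through recorded frames at the original frame rate and feeds them to the same consumer as live data. The synthetic frames and the placeholder consumer function below are assumptions for illustration, not the EGMTrackingPlayer implementation.

(
// ~playFrame stands in for whatever consumes live tracking frames (hypothetical)
~playFrame = { |channel, pos| [channel, pos].postln };

// two seconds of synthetic frames at 120 Hz
~recorded = Array.fill(240, { |i| [\rightHand, [i * 0.01, 1.0, 0.5]] });

Routine({
    ~recorded.do { |frame|
        ~playFrame.(frame[0], frame[1]);
        (1/120).wait;   // reproduce the original tracking frame rate
    };
}).play;
)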

EGMOperators

EGMOperators are SuperCollider classes defining processing nodes in a motion data flow network. Motion data flow networks are used in EGM to condition and analyse tracking data before it is mapped to signal processing parameters. Typical nodes are format converters (e.g. between different rotation formats), filters (e.g. median, lowpass), analysers (speed, acceleration, FFT, etc.) and statistical operators (mean, median, stdev, skew, etc.). Networks of EGMOperators are composed as function call chains which are evaluated at each tracking data frame (typically 120 times per second).
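As a rough illustration of such a per-frame chain, the sketch below combines a one-pole lowpass smoother with a finite-difference speed estimate; the function names, the coefficient and the frame-rate handling are simplified assumptions, not the EGMOperators classes themselves.

(
// one-pole lowpass: blend the previous output with the new input
~smooth = { |prev, x, coef = 0.9| (prev * coef) + (x * (1 - coef)) };

~prev = 0;
~processFrame = { |x|
    var dt = 1/120;   // assumed tracking frame interval
    var smoothed = ~smooth.(~prev, x);
    var speed = (smoothed - ~prev).abs / dt;   // finite-difference speed estimate
    ~prev = smoothed;
    [smoothed, speed]
};

// feed a few synthetic position samples through the chain
[0, 0.1, 0.3, 0.6, 1.0].do { |x| ~processFrame.(x).postln };
)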

EGMDoc

This SuperCollider class controls the EGM integrated documentation system, which records tracking data in sync with the video and audio signals acquired in exploration sessions in the Aesthetic Lab. The documentation system uses a Max/MSP/Jitter patch to store the data in a synchronized manner; the data can be played back with the Max/MSP/Jitter patch EGMPlayer. Because of its dependency on a particular hardware configuration, EGMDoc is not released as part of the EGM Toolkit. The EGMDoc system has been used to create the EGM Archive.