Sound Programming 2 - Semester assignment

Presentation

For my semester assignment I am investigating the musical potential of a Polhemus Patriot. The Polhemus Patriot is a device that tracks the position and orientation of two markers within an electromagnetic field.

Polhemus Patriot

I have chosen to approach this by making a "Polhemus Musical Instrument". The instrument is programmed in Max/MSP. I have used a GDIF approach to communicate data between the Polhemus Patriot and the sound generator. The sound is generated as a sonification of the movement data from the Polhemus. I made two simple "handles" as an interface; the Polhemus markers may be placed inside these handles.
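
As a rough illustration of the data flow (not the actual Max/MSP patch), the sketch below shows how Polhemus frames could be streamed to a sound generator as OSC messages using GDIF-style addresses (the namespace is listed under Method below). It assumes the Python package python-osc, a receiver listening on port 9000 (e.g. [udpreceive] in Max/MSP), and a hypothetical read_polhemus_frame() standing in for the actual Polhemus driver.

    import time
    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 9000)  # host/port of the sound generator

    def read_polhemus_frame():
        """Hypothetical stand-in for the Polhemus driver: returns one frame
        per marker as (x, y, z, azimuth, elevation, roll)."""
        return [(0.0, 0.0, 0.0, 0.0, 0.0, 0.0),
                (10.0, 0.0, 0.0, 0.0, 0.0, 0.0)]

    while True:
        for i, marker in enumerate(read_polhemus_frame(), start=1):
            client.send_message(f"/raw/{i}", list(marker))
            client.send_message(f"/cooked/marker/{i}/position", list(marker[:3]))
            client.send_message(f"/cooked/marker/{i}/orientation", list(marker[3:]))
        time.sleep(1 / 60)  # roughly the Patriot update rate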

Interface for the Polhemus instrument

Example video of the Polhemus instrument

Method

Using the methods described in Jensenius et al. (2008) and Nymoen (2008), I have recorded data from the Polhemus to SDIF files. I have arranged this data into different GDIF layers: raw data, pre-processed data, body-related data, and instrument-related data. The latter is an abstract way of describing the relationship between the hands (e.g. their spacing and mean position).

The Polhemus data is organized and sorted into the following GDIF namespace:

/raw/1
/raw/2
/cooked/marker/1/position
/cooked/marker/1/orientation
/cooked/marker/1/velocity
/cooked/marker/2/position
/cooked/marker/2/orientation
/cooked/marker/2/velocity
/body/hand/left/position
/body/hand/left/orientation
/body/hand/left/velocity/total
/body/hand/left/velocity/up
/body/hand/left/velocity/outwards
/body/hand/left/velocity/forward
/body/hand/right/position
/body/hand/right/orientation
/body/hand/right/velocity/total
/body/hand/right/velocity/up
/body/hand/right/velocity/outwards
/body/hand/right/velocity/forward
/instrument/position/x
/instrument/position/y
/instrument/position/z
/instrument/expansion
/instrument/growth
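
The body and instrument layers are derived from the cooked marker data. The sketch below (Python/NumPy, not the actual Max patch) shows one way such quantities could be computed frame by frame; the axis convention (x = forward, y = outwards, z = up), the 60 Hz sample period, and the exact formulas are assumptions.

    import numpy as np

    DT = 1 / 60  # assumed sample period

    def derived_layers(left_prev, left, right_prev, right, expansion_prev):
        """left/right are 3-element position arrays for the current frame,
        *_prev for the previous frame. Returns GDIF-style addresses to values."""
        out = {}
        for name, prev, pos in (("left", left_prev, left), ("right", right_prev, right)):
            vel = (pos - prev) / DT
            out[f"/body/hand/{name}/position"] = pos
            out[f"/body/hand/{name}/velocity/total"] = float(np.linalg.norm(vel))
            out[f"/body/hand/{name}/velocity/forward"] = float(vel[0])
            out[f"/body/hand/{name}/velocity/outwards"] = float(vel[1])
            out[f"/body/hand/{name}/velocity/up"] = float(vel[2])

        mean_pos = (left + right) / 2                    # instrument position: mean of the hands
        expansion = float(np.linalg.norm(right - left))  # spacing between the hands
        growth = (expansion - expansion_prev) / DT       # rate of change of the spacing
        out["/instrument/position/x"] = float(mean_pos[0])
        out["/instrument/position/y"] = float(mean_pos[1])
        out["/instrument/position/z"] = float(mean_pos[2])
        out["/instrument/expansion"] = expansion
        out["/instrument/growth"] = growth
        return out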

This namespace is used to map movement parameters to sound. For instance, the z position of the instrument is linked (though not mapped directly) to pitch.
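
As an illustration only (the instrument's actual mapping is less direct than this), a simple scaled and smoothed z-to-pitch mapping could look like the sketch below. The z range, pitch range, and smoothing coefficient are assumed values.

    Z_MIN, Z_MAX = 0.0, 100.0        # assumed z range in cm
    NOTE_MIN, NOTE_MAX = 48.0, 84.0  # assumed pitch range (MIDI note numbers)
    SMOOTH = 0.9                     # one-pole smoothing against zipper noise
    _state = NOTE_MIN

    def z_to_frequency(z):
        """Map a z position (cm) to a smoothed frequency in Hz."""
        global _state
        t = min(max((z - Z_MIN) / (Z_MAX - Z_MIN), 0.0), 1.0)  # clamp to 0..1
        note = NOTE_MIN + t * (NOTE_MAX - NOTE_MIN)            # linear in pitch
        _state = SMOOTH * _state + (1 - SMOOTH) * note         # low-pass the control signal
        return 440.0 * 2 ** ((_state - 69) / 12)               # MIDI note -> Hz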

The sound and the mappings in the instrument have been developed through trial and error. As a point of departure, I used the Polhemus data to write sample values to an audio buffer, which is used for the audio synthesis. The other DSP processes in use are amplitude modulation, filtering, simple delay lines with feedback, and real-time convolution. The movement parameters from the body and instrument layers in the GDIF setup control the parameters of these processes.
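
The buffer-writing idea can be sketched outside Max/MSP as a simple wavetable approach: normalize a stretch of movement samples, write them into a buffer, and read the buffer back as an oscillator. The sketch below (Python/NumPy) illustrates only that core idea; the buffer size, sample rate, and normalization are assumptions, and the amplitude modulation, filtering, delay, and convolution stages are not shown.

    import numpy as np

    SR = 44100        # audio sample rate
    TABLE_SIZE = 512  # length of the wavetable buffer

    def movement_to_table(positions):
        """Normalize a sequence of position values to -1..1 and resample it
        to the buffer length, giving a waveform derived from the movement."""
        p = np.asarray(positions, dtype=float)
        p = 2 * (p - p.min()) / (p.max() - p.min() + 1e-12) - 1
        idx = np.linspace(0, len(p) - 1, TABLE_SIZE)
        return np.interp(idx, np.arange(len(p)), p)

    def render(table, freq, seconds=1.0):
        """Read the wavetable as an oscillator at the given frequency."""
        n = int(SR * seconds)
        phase = np.cumsum(np.full(n, freq * TABLE_SIZE / SR)) % TABLE_SIZE
        wrapped = np.append(table, table[0])  # wrap the table for interpolation
        return np.interp(phase, np.arange(TABLE_SIZE + 1), wrapped)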

Max patch for the Polhemus instrument

References

Jensenius, A. R., K. Nymoen and R. I. Godøy. "A Multilayered GDIF-Based Setup for Studying Coarticulation in the Movements of Musicians". International Computer Music Conference (ICMC 2008), Belfast. Submitted 2008.

Nymoen, K. "A Setup for Synchronizing GDIF Data Using SDIF-files and FTM for Max". COST-SID Short-Term Scientific Mission report, McGill University, Montreal. March 2008.

Weekly assignments

Assignment 1 - Interactive video

Assignment 2 - Jamoma

Assignment 3 - Javascript in Max


Last update: 2008-05-25 by kristian.nymoen@imv.uio.no