The Musical Gestures Toolbox (MGT) is a Matlab toolbox for analysing music-related body motion, using sets of audio, video and motion capture data as source material.
It was primarily developed for music research, with a particular focus on studying the motion of musicians and dancers, but it can be used for any type of motion analysis based on video recordings.
## Functions
The Musical Gestures Toolbox contains a set of functions for the analysis and visualization of video, audio, and mocap data. There are four categories of functions:
- Data input and edit functions
- Data preprocessing functions
- Visualization functions
- Mid- and higher-level feature extraction functions
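As a rough illustration of how these categories combine, a typical workflow reads a video, preprocesses it, and then extracts and visualizes motion features. The sketch below assumes function names along the lines of `mgvideoreader` and `mgmotion`; treat the exact names, signatures, and options as assumptions and check the wiki pages for the actual API:

```matlab
% Hypothetical sketch of a typical MGT workflow -- verify names
% and arguments against the toolbox wiki before use.

% Data input: read a video recording into an MGT structure
% ('dance.mp4' is a placeholder for your own recording)
mg = mgvideoreader('dance.mp4');

% Feature extraction: compute the motion video and basic
% motion features (e.g. quantity of motion)
mg = mgmotion(mg);
```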
## Tutorial
Please refer to the wiki pages for more information about installation and usage. Much of the same information is also available in the material for the Software Carpentry workshop Quantitative Video Analysis for Qualitative Research.
## Dependencies

Most basic functionality works with a standard Matlab installation, but to use all features you will also need these additional Matlab toolboxes:
- Computer Vision System Toolbox
- Image Processing Toolbox
Some of the functions build on code from these third-party toolboxes:
## History
The toolbox builds on the Musical Gestures Toolbox for Max, which has been developed by alexarje since 2004; parts of it are currently embedded in the Jamoma project.
A large chunk of the code was written by benlyyan as part of his M.Sc. thesis at the University of Oslo.
There is now also a Musical Gestures Toolbox for Python with largely the same functionality.
The software is maintained by the fourMs lab at RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion at the University of Oslo.
## Reference
If you use this toolbox for research purposes, please reference this publication:
- Jensenius, Alexander Refsum (2018). The Musical Gestures Toolbox for Matlab. Proceedings of the 19th International Society for Music Information Retrieval Conference, Late Breaking Demos Session. Paris, France.