Video recordings can be used to develop new visualizations for analysis. The aim of creating such alternative displays is to uncover features, structures, and similarities within the material itself, and in relation to other material, such as a musical score.
## About MGT for Python
The Musical Gestures Toolbox for Python includes video visualization techniques such as motion videos, motion history images, and motiongrams. These visualizations make it possible to study video recordings from different temporal and spatial perspectives. The toolbox also includes basic computer vision methods and is designed to integrate well with audio analysis toolboxes.
## Usage
The toolbox can be run directly from the terminal:
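For example (a minimal sketch, assuming the `musicalgestures` package has been installed from PyPI, e.g. with `pip install musicalgestures`, and that `dance.avi` stands in for your own video file):

```python
import musicalgestures

# Load a video recording ('dance.avi' is a placeholder for your own file)
mg = musicalgestures.MgVideo('dance.avi')

# Run the basic motion analysis, which renders a motion video
mg.motion()
```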
Many will probably prefer to run it in a Jupyter notebook:
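A notebook session might look like the sketch below. The `starttime`/`endtime` trimming and inline display via `show()` follow the toolbox's documentation, but exact argument names and defaults should be checked against the current release:

```python
import musicalgestures

# Load only an excerpt of the recording (times in seconds)
mg = musicalgestures.MgVideo('dance.avi', starttime=5, endtime=15)

# Display the trimmed source video inline in the notebook
mg.show()

# Create a motion history video from the excerpt
mg.history()
```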
The MGT was initially developed to analyze music-related body motion (of musicians, dancers, and perceivers) but is equally helpful for other disciplines working with video recordings of humans, such as linguistics, pedagogy, psychology, and medicine.
## Features
MGT can generate both dynamic and static visualizations, as well as some quantitative data (see the code sketch after the list):
- dynamic visualizations (video files)
  - motion video
  - motion history video
- static visualizations (images)
  - motion average image
  - motiongrams
  - videograms
- motion data (CSV files)
  - quantity of motion
  - centroid of motion
  - area of motion
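As a rough sketch of how these outputs can be generated (the method names mirror the feature list and the toolbox's documentation, but they are assumptions here; verify names and defaults against the current release):

```python
import musicalgestures

mg = musicalgestures.MgVideo('dance.avi')  # placeholder filename

# Dynamic visualizations (video files)
mg.motion()       # motion video; also exports motion data as CSV
mg.history()      # motion history video

# Static visualizations (images)
mg.average()      # motion average image
mg.motiongrams()  # horizontal and vertical motiongrams
mg.videograms()   # horizontal and vertical videograms
```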
## History
This toolbox builds on the Musical Gestures Toolbox for Matlab, which in turn builds on the Musical Gestures Toolbox for Max. The latest version was primarily developed by Bálint Laczkó, Frida Furmyr, and Marcus Widmer.
The software is currently maintained by the fourMs lab at RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion at the University of Oslo.
## Citation
If you use the toolbox in your research, please cite:
- Laczkó, B., & Jensenius, A. R. (2021). Reflections on the Development of the Musical Gestures Toolbox for Python. *Proceedings of the Nordic Sound and Music Computing Conference*.