Web pages tagged "Motion capture"
A workshop for exploring advanced methods for capturing and analysing human music-related motion.
RITPART is a partnership project with the aim of connecting RITMO to world-leading research groups in USA, Canada and Japan.
By Finn Upham (McGill University), visiting researcher at RITMO during February 2020
The Musical Motion Capture Database is being developed for storing all sorts of musical motion capture data.
The goal of the innovation project SoundTracer is to develop an app for searching in large music libraries through moving a mobile phone in the air.
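As an illustrative sketch only (not the actual SoundTracer implementation): query-by-gesture search can be reduced to resampling a motion trace into a fixed-length, normalised contour and ranking library tracks by contour distance. The function names and the simple Euclidean matching below are assumptions for illustration.

```python
import numpy as np

def motion_to_contour(trace, n_points=32):
    """Resample a 1-D motion trace (e.g. vertical phone position over
    time) to a fixed-length query contour, normalised so matching is
    invariant to gesture size and offset."""
    idx = np.linspace(0, len(trace) - 1, n_points)
    contour = np.interp(idx, np.arange(len(trace)), np.asarray(trace, float))
    contour -= contour.mean()
    norm = np.linalg.norm(contour)
    return contour / norm if norm > 0 else contour

def rank_library(query, library_contours):
    """Return library indices sorted by Euclidean distance to the query."""
    dists = [np.linalg.norm(query - c) for c in library_contours]
    return sorted(range(len(dists)), key=dists.__getitem__)
```

A gesture traced in the air then becomes a short contour that can be compared against pre-computed contours for every track in the library.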
An intensive PhD-level training course on sound and motion analysis with experts in sound and music computing from the Nordic countries.
Postdoctoral researcher Kyrre Glette participated in (and won!) the 64kB intro competition at the Assembly computer festival in Helsinki. A 64kB intro is an executable program in 64kB which includes realtime generation of graphics and music.
The animation includes a dancing robot, where the motion is based on data recorded with our new Qualisys infrared motion capture system.
Graphics programming done by Kim Kalland, Thomas Kristensen and Kyrre Glette. Sound programming and music by Gergely Szelei-Kis.
This week's issue of the student newspaper Universitas has an article about music and movement, featuring visiting researcher Yago de Quay.
fourMs researchers were strongly represented at the annual VERDIKT conference yesterday. In addition to a lecture and poster presentations, we contributed a performance with the iPhone ensemble and a motion capture performance. More info below.
We have uploaded a few pictures from the Open Lab last week, and research fellow Kristian Nymoen has made a few short videos presenting the use of the Qualisys motion capture system.
Videos from the Semester opening concert are now available on YouTube: iPhone trio and Dance jockey. See below for more information.
Our new motion capture system is presented in the Qualisys newsletter from May. We have been working with Qualisys to create an integrated solution for handling recording and streaming of music-related body movement data, and look forward to working with the new system in the coming years!
Motion Capture recording of William Westney performing excerpts from piano pieces by Franz Liszt and Harold Arlen.
This software enables real-time streaming of mocap data from the MoCap toolbox for Matlab, synchronised with audio.
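A minimal sketch of the pacing problem such streaming has to solve: frames recorded at a known rate must be sent on a shared clock so the receiver stays aligned with concurrently playing audio. The helper below is a hypothetical illustration, not the software's actual API; `send` stands in for whatever transport is used (e.g. an OSC sender).

```python
import time

def stream_frames(frames, fps, send, clock=time.perf_counter):
    """Pace pre-recorded mocap frames at the original capture rate.

    Each frame's target send time is computed from the stream start,
    so timing errors do not accumulate across frames."""
    start = clock()
    for i, frame in enumerate(frames):
        target = start + i / fps
        delay = target - clock()
        if delay > 0:
            time.sleep(delay)
        send(i, frame)
```

Because each target time is derived from the stream start rather than the previous frame, small scheduling jitters do not drift the stream away from the audio clock.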
The MoCap Synthesiser is a set of generic tools for making real-time motion tracking devices into musical instruments, including the SoundSaber. One feature extraction module and two sound modules are included.
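The kind of low-level feature extraction such a module performs can be sketched as mapping a marker's frame-to-frame speed to an amplitude control value. The function names and the normalisation constant below are illustrative assumptions, not the MoCap Synthesiser's actual modules.

```python
import math

def velocity_feature(positions, fps):
    """Frame-to-frame speed (units per second) of one tracked marker,
    given a list of (x, y, z) positions sampled at `fps` frames/second."""
    return [math.dist(p0, p1) * fps for p0, p1 in zip(positions, positions[1:])]

def speed_to_amplitude(speed, max_speed=5.0):
    """Clip and normalise a speed value to a 0..1 amplitude control."""
    return min(speed / max_speed, 1.0)
```

In a mapping like the SoundSaber's, a control stream such as this could drive the gain of a sound module in real time.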
Research fellow Kristian Nymoen will defend his dissertation on Friday 25 January 2013.
Methods and Technologies for Analysing Links Between Musical Sound and Body Motion
As preparations for the Motion Capture Workshop next week, we will be giving an introduction to using the NaturalPoint Optitrack system that is currently set up in the fourMs Lab.
fourMs researchers will participate in the Department of Musicology's semester opening concert.
Kristian Nymoen, Anders Tveit, Alexander Refsum Jensenius: Bloom and Scrambler (for iPhone and small speakers)
Yago de Quay, Ståle Skogstad: Posture (with Xsens motion capture)
fourMs researchers will perform at the VERDIKT conference:
iPhone ensemble playing Bloom and Scrambler (for iPhone and small speakers). Featuring Alexander Refsum Jensenius, Kristian Nymoen, Anders Tveit, Arve Voldsund and Viet Phi Uy Hoang.
Dance Jockey by Yago de Quay and Ståle Skogstad (using Xsens inertial motion capture)
In the Dance Jockey project, we have used a full-body inertial motion capture system, the Xsens MVN suit, for musical interaction.
FourMs will host an international workshop on motion capture in music 12-16 October 2009, with guests from McGill University and the University of Jyväskylä. There will be three public lectures: