Thematic Session 5: Mapping and Control (Tuesday, 15:30)
Abstract
We present initial efforts in a pilot project combining machine learning and wearable sensing devices to develop intuitive audiovisual displays that accurately reflect physical activity and the felt experience of human movement. Tracking human motion is important for a range of activities and applications, from dance and music performance to rehabilitation and human-robot interaction. We utilise wearable arm and leg bands from the company BioX that capture real-time physiological sensor data. These provide precise readings from embedded force-sensing resistors and inertial measurement units, but reveal very little about how a person feels during the activity. This project combines a real-time interactive audiovisual system with movement analysis and machine learning techniques to develop algorithms that impart additional high-level information about the mover’s emotional and affective states. The goal is to improve algorithms for movement and effort tracking by incorporating people’s felt experience of movement.
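As a rough illustration of the kind of processing the abstract describes (not the project's actual pipeline), the sketch below reduces a window of 3-axis accelerometer samples, such as an IMU in an arm band might produce, to simple movement-effort descriptors that a learned model could consume. All function names, window sizes, and feature choices here are hypothetical.

```python
import numpy as np

def effort_features(accel_window):
    """Reduce a window of 3-axis accelerometer samples (N x 3)
    to simple movement-effort descriptors: mean magnitude,
    variability, and mean jerk (rate of change of acceleration)."""
    mag = np.linalg.norm(accel_window, axis=1)   # per-sample acceleration magnitude
    jerk = np.diff(accel_window, axis=0)         # finite-difference approximation of jerk
    return {
        "mean_magnitude": float(mag.mean()),
        "variability": float(mag.std()),
        "mean_jerk": float(np.linalg.norm(jerk, axis=1).mean()),
    }

# Example: a still sensor (constant gravity vector) vs. noisy, vigorous movement
still = np.tile([0.0, 0.0, 9.81], (100, 1))
rng = np.random.default_rng(0)
active = still + rng.normal(0.0, 3.0, size=(100, 3))

print(effort_features(still))   # variability and jerk near zero
print(effort_features(active))  # clearly nonzero variability and jerk
```

Features like these capture intensity of movement but, as the abstract notes, say nothing about how the mover feels; bridging that gap is the point of combining them with machine learning and the mover's self-reported experience.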
PROJECT PARTICIPANTS: Cumhur Erkut, Elizabeth Jochum, Dan Overholt, Robin Otterbein, Sofia Dahl, George Palamas, and Shaoping Bai
Bio
Dan Overholt is an Associate Professor at Aalborg University Copenhagen. His research interests include advanced technologies for interactive interfaces and novel audio signal processing algorithms, with a focus on new techniques for creating music and interactive sound. He is involved in the development of tangible interfaces and control strategies for processing human gestural inputs that allow interaction with a variety of audiovisual systems. Dan is also a composer, improviser, inventor, and instrument builder who performs internationally with his new musical instruments and custom sound synthesis and processing algorithms. Dr. Overholt received a PhD in Media Arts and Technology from the University of California, Santa Barbara, and an M.S. in Media Arts and Sciences from the Massachusetts Institute of Technology. He has about 70 peer-reviewed publications and two patents (one provisional).
Robin Otterbein is a composer, performer, developer, dancer, digital artist, and researcher based in Copenhagen, Denmark. As a recent graduate in Sound and Music Computing, he is currently involved in interdisciplinary research at Aalborg University Copenhagen, focused on sound and movement interaction, neural audio synthesis and movement analysis using machine learning, carried out in collaboration with dancers and movement experts. In 2022, he conducted his master’s thesis in collaboration with the American software development company Cycling ’74 in New York City, on neural audio synthesis using differentiable models in the visual programming environment Max/MSP.