Abstract
While audio-motor synchronization has been researched intensively, visuo-motor synchronization and cross-modal (multisensory) interaction in audiovisual synchronization are still emergent fields of research. Specifically, visuo-motor synchronization research has mostly used either modality-inappropriate stimuli (stationary flashing lights lacking movement) or non-biological motion (bouncing balls). Given that human rhythm is deeply rooted in biological movement, this may limit the ecological validity of findings. Using motion capture data, however, we can curate naturalistic audiovisual stimuli representing relevant types of human movement, such as hand-clapping, locomotion, and musicking/dancing, to explore audiovisual perception and multisensory interaction/integration in more ecologically valid ways.
In this presentation, I will outline the research plan for my internship at RITMO and my master's thesis project at the University of Padua. I will use sets of motion and audio recordings curated as part of the DjembeDance project to create a set of stimuli that systematically vary several parameters, notably movement complexity, tempo, and cultural specificity. Two experiments will serve to validate the stimuli: Experiment 1 contrasts auditory, visual, and audiovisual conditions of stimulus presentation, while Experiment 2 tests how participants resolve artificially introduced asynchronies between the auditory and visual sensory channels in AV signals.
Bio
Vito Piccione is a master’s student in Applied Cognitive Psychology from Padua, Italy. He is deeply interested in improvisation, reactive music systems, and the intersection between music and narrative. With a passion for piano, synths, and everything that can make sound, he is currently writing his thesis at RITMO in an attempt to bridge his musical practice with an academic path.