Abstract
Computational models of musical similarity are an essential enabling factor for many intelligent music production systems, such as recommendation systems, intelligent search interfaces and automatic music generation tools. For drum loops in particular, much work to date has modelled perceptually significant factors of similarity, such as syncopation, density and low-level rhythm similarity, with the aim of enabling tools that could be integrated within a digital audio workstation (DAW) to make the process of writing drum tracks quicker, easier and more enjoyable. Currently, however, most state-of-the-art features and models for estimating drum loop similarity operate on quantized drum patterns, neglecting any microtiming information. To model expressive drum loops successfully, we must therefore go beyond this approach and develop new features that can account for expressive timing.
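To make the limitation concrete, below is a minimal Python sketch (purely illustrative, not drawn from the models discussed in the talk) showing how snapping onsets to a 16th-note grid yields a binary pattern on which features like density and Hamming-distance rhythm similarity can be computed, while the per-onset microtiming deviations are discarded. The function names and grid resolution are assumptions made for illustration.

import numpy as np

GRID = 16           # 16th-note grid over one bar (an assumed resolution)
BEATS_PER_BAR = 4

def quantize(onsets_beats):
    """Snap onset times (in beats, 0 <= t < 4) to a binary 16-step pattern."""
    pattern = np.zeros(GRID, dtype=int)
    steps = np.round(np.asarray(onsets_beats) * GRID / BEATS_PER_BAR).astype(int) % GRID
    pattern[steps] = 1
    return pattern

def density(pattern):
    """Onset density: the fraction of grid steps containing an onset."""
    return pattern.mean()

def rhythm_similarity(a, b):
    """Low-level rhythm similarity as 1 minus the normalized Hamming distance."""
    return 1.0 - np.mean(a != b)

def microtiming(onsets_beats):
    """Per-onset deviation (in beats) from the nearest grid step --
    exactly the information a quantized representation throws away."""
    t = np.asarray(onsets_beats)
    return t - np.round(t * GRID / BEATS_PER_BAR) * BEATS_PER_BAR / GRID

# Two eighth-note hi-hat loops: identical grid pattern, but the second
# is played consistently ~0.04 beats "behind" the grid.
straight  = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5]
laid_back = [t + 0.04 for t in straight]

print(rhythm_similarity(quantize(straight), quantize(laid_back)))  # 1.0: identical once quantized
print(microtiming(laid_back))  # the ~0.04-beat delays a quantized model never sees

On this toy example, the two loops are indistinguishable after quantization even though the second has an audibly laid-back feel; capturing that difference is precisely what a microtiming-aware feature set would need to do.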
Bio
Fred Bruford is a PhD student at Queen Mary University of London, supervised by Mark Sandler and working in collaboration with the music technology startup ROLI. The overall aim of his work is to apply techniques from Music Information Retrieval (MIR) within the music production workflow to make the process of making music faster, easier and more enjoyable. His current project primarily concerns the development of computational models of similarity for drum loops, to power intelligent tools for searching and navigating large drum loop libraries. Fred will be at RITMO for the next five weeks, working with Olivier Lartillot and the TIME project.
In this talk, I will give a short summary of my PhD work so far, along with a discussion of some of the open challenges and questions we will be addressing over the course of my stay here at RITMO.