We will work in small teams on selected interactive sonification designs according to the teams' preferences, for instance (a) in contexts such as online sonification of body motions, or (b) interactive navigation of parameter mapping sonifications so that interesting auditory views can be selected quickly. For the implementation of such systems, the BRIX2 physical computing platform for rapid prototyping will be provided, together with software templates to sonify sensor data in real time. Likewise, sonification software templates in SuperCollider and Python, as well as some data sets, will be provided for experimenting with data analysis and data review sonifications. This will allow participants to practice sonification design procedures. Finally, after the project team presentations, we will discuss the results and revisit design guidelines for interactive sonification.
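To illustrate the parameter mapping approach mentioned above, here is a minimal, hypothetical Python sketch (not one of the workshop templates) that maps data values linearly to pitch and renders the result as a sequence of sine tones with NumPy; the function name and mapping range are illustrative choices:

```python
import numpy as np

def parameter_mapping_sonification(data, duration=0.2, sr=44100,
                                   f_min=220.0, f_max=880.0):
    """Map each data value to a pitch and render a tone per value.

    data     : sequence of numeric data values
    duration : length of each tone in seconds
    sr       : sample rate in Hz
    f_min/f_max : frequency range the data is mapped onto
    """
    data = np.asarray(data, dtype=float)
    # Normalize the data to [0, 1] so the mapping is scale-independent
    span = data.max() - data.min()
    norm = (data - data.min()) / span if span else np.zeros_like(data)
    # Linear parameter mapping: data value -> tone frequency
    freqs = f_min + norm * (f_max - f_min)
    # Synthesize one sine tone per data point and concatenate them
    t = np.arange(int(duration * sr)) / sr
    tones = [np.sin(2 * np.pi * f * t) for f in freqs]
    return np.concatenate(tones), freqs

# Example: three values spanning the data range map to 220, 550, 880 Hz
samples, freqs = parameter_mapping_sonification([0.0, 0.5, 1.0])
```

The resulting sample array could be written to a WAV file or streamed to the sound card; richer designs would additionally map data features to loudness, timbre, or spatial position.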
Program (preliminary)
Time | Content | Location |
---|---|---|
Thursday | ||
10:15 | Workshop Part 1: Practical Session / Preparations for Sonification experiments | 103 |
12:00 | Lunch break | |
13:00 | Workshop Part 2: Sonification Techniques in a nutshell | 103 |
14:15 | Lecture: Interactive Sonification for Biofeedback Applications and Exploratory Data Analysis | Sem 2 |
16:00 | Workshop Part 3: Hands-on Session in project teams | 103 |
17:30 | Team status presentations & Discussion | |
18:00 | End of day | 103 |
Friday | ||
10:15 | Workshop Part 4: Model-based Sonification, Interactive Sonification, Blended Sonification | 103 |
12:00 | Lunch break | |
13:00 | Discussion and Conclusion | 103 |
14:00 | End of workshop | |
Bring a laptop + headphones!
Lecture: Interactive Sonification for Biofeedback Applications and Exploratory Data Analysis
The lecture will start with an overview of the research areas of auditory display and sonification and the available sonification techniques, illustrated by a number of example applications that showcase how sound can be used productively to facilitate understanding and interaction in various contexts. Subsequently, we will focus on interactive sonification, the research field concerned with how users can benefit from being embedded in a tightly closed interaction loop. Interaction allows users to query multiple 'auditory views' of the underlying data. Selected examples – from sports to exploratory data analysis – illustrate how interactive sonification provides task-oriented information. Throughout the workshop, guidelines for the design of interactive sonification systems will be given.
Bio
Dr. Thomas Hermann studied physics at Bielefeld University. From 1998 to 2001 he was a member of the interdisciplinary Graduate Program “Task-oriented Communication”. He began his research on sonification and auditory display in the Neuroinformatics Group and received a Ph.D. in Computer Science from Bielefeld University in 2002 (thesis: Sonification for Exploratory Data Analysis). Following research stays at Bell Labs (NJ, USA, 2000) and GIST (Glasgow University, UK, 2004), he is currently assistant professor and head of the Ambient Intelligence Group within CITEC, the Center of Excellence in Cognitive Interaction Technology at Bielefeld University. His research focuses on sonification, data mining, human-computer interaction, and cognitive interaction technology.