Professor Plucky

Expressive body movement in human-robot musical ensembles

Illustration of a human and robot playing guitars together

Illustration: Michael Krzyzaniak

Robots, including those that play music, are stereotypically stiff in their movement. Many of the world's most sophisticated musical robots scarcely move at all. Others display some ancillary movement, but the primary focus is on the mechanics of playing, not on the visual appeal of the movement itself. Human musicians, by contrast, use a wide variety of expressive movements, gestures, and signals when they play together. From this perspective, many musical robots seem to be missing an entire layer of important musical behaviour. This project therefore asks:

How does the way a musical robot moves affect how its human musical partners interact with it?

To explore this, we built a prototype guitar-plucking robot called Professor Plucky, which is designed to have visually appealing movement, even at the expense of mechanics. Professor Plucky is shown in Figure 1.

Image of a classical guitar equipped with some fancy-looking plucking mechanisms. Figure 1: Professor Plucky, a prototype guitar-plucking robot, with six kinematic plucking mechanisms mounted between the sound-hole and bridge.


It consists of a guitar equipped with kinematic plucking mechanisms that move in a visually appealing way, and control plucking mechanisms (not shown in Figure 1) whose movement is barely visible. Professor Plucky is not a complete guitar robot: it has no way of fretting or damping the strings. Our question was whether people interact, move, or play differently when Professor Plucky uses the kinematic pluckers instead of the control pluckers.

Each string of the guitar is equipped with its own kinematic plucking mechanism. One of these pluckers is depicted in Figure 2.

Left: A strange and beautiful crank and rocker linkage with a guitar pick on the end of it. Right: Several superimposed images of the same, showing the crank at many different angles. Figure 2: Left -- One of the kinematic plucking mechanisms. Right -- A composite image showing how the mechanism moves. In the forward direction (shown), the guitar pick (1) slowly plunges down next to the string and then (2) rapidly plucks it with an upward flicking motion. The crank moves clockwise.


The pluckers' visual design is based loosely on the animation seen in the video Resonant Chamber by Animusic. The mechanical design is based loosely on the characters presented in this video by Disney. Each plucker consists of a four-bar Grashof linkage in a crank-rocker configuration. The distal end of the rocker arm holds a guitar plectrum. The crank is driven by a small DC servomotor through a series of gears. When the crank turns at a constant angular velocity in the intended forward direction, the plectrum plunges slowly down next to the string and then plucks it with a rapid upward flicking motion. If the crank is driven in reverse, the plectrum slaps the string and then slowly pulls up away from it.
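The crank-rocker geometry can be sketched with standard four-bar position analysis. The snippet below is a minimal illustration, not Professor Plucky's actual firmware, and the link lengths used in the example are hypothetical, not the robot's real dimensions: it checks the Grashof crank-rocker condition and computes where the rocker joint (the plectrum side) sits for a given crank angle.

```python
import math

def is_grashof_crank_rocker(a, b, c, d):
    """True if shortest + longest <= sum of the other two (Grashof)
    AND the crank is the shortest link, so the crank can fully rotate
    while the rocker oscillates -- the configuration the pluckers use."""
    s, p, q, l = sorted([a, b, c, d])
    return s + l <= p + q and a == s

def rocker_joint_position(a, b, c, d, theta2):
    """Position of the coupler/rocker joint B for crank angle theta2.

    Frame: crank pivot O2 at the origin, rocker pivot O4 at (d, 0).
    a = crank, b = coupler, c = rocker, d = ground link.
    Returns one of the two circle-intersection branches."""
    # Crank tip A
    ax, ay = a * math.cos(theta2), a * math.sin(theta2)
    # Vector and distance from A to the rocker pivot O4
    dx, dy = d - ax, -ay
    r = math.hypot(dx, dy)
    # B lies on circle(A, b) and circle(O4, c); intersect the circles.
    m = (b * b + r * r - c * c) / (2.0 * r)  # offset from A along A->O4
    h = math.sqrt(max(b * b - m * m, 0.0))   # perpendicular offset
    ux, uy = dx / r, dy / r                  # unit vector A -> O4
    return (ax + m * ux - h * uy, ay + m * uy + h * ux)

# Hypothetical link lengths satisfying the crank-rocker condition
a, b, c, d = 1.0, 4.0, 3.0, 4.0
assert is_grashof_crank_rocker(a, b, c, d)
bx, by = rocker_joint_position(a, b, c, d, 0.0)
```

Sweeping `theta2` through a full revolution traces the rocker's oscillation; the plectrum, mounted on the distal end of the rocker, inherits that asymmetric slow-down/fast-up motion from the linkage geometry.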

Along with the kinematic pluckers, the guitar was also equipped with a second set of plucking mechanisms to serve as an experimental control. These have little visual appeal in their motion, and are depicted in Figure 3.

A boring and not at all kinetic plucking mechanism mostly enclosed inside a black plywood housing. Figure 3: The control plucking mechanism, showing guitar picks mounted directly onto the horns of servomotors.

They are based loosely on the pluckers used in the LEMUR GuitarBots and MechBass. They consist of guitar picks mounted directly on the horns of standard hobby servomotors, one per string. Each plucks its string by moving back and forth. The movement is small and mostly obscured by the plywood that houses these pluckers.
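The back-and-forth logic of a control plucker can be sketched as a two-state toggle. This is a toy model, not the robot's actual control code, and the servo angles are hypothetical: each pluck commands the pick to the opposite side of the string, sounding the note as it crosses.

```python
class ControlPlucker:
    """Toy model of one control plucker: a pick on a servo horn.
    The pick rests on one side of the string; commanding it to the
    other side sounds the note as the pick crosses the string.
    The angles below are hypothetical, not the robot's real values."""

    def __init__(self, side_a_deg=80.0, side_b_deg=100.0):
        self.side_a, self.side_b = side_a_deg, side_b_deg
        self.angle = side_a_deg  # start resting on side A

    def pluck(self):
        # Toggle to the opposite side of the string.
        self.angle = self.side_b if self.angle == self.side_a else self.side_a
        return self.angle
```

Alternating sides means consecutive notes are down- and up-strokes, which keeps the stroke short; the visible travel is only the small arc between the two angles.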

Professor Plucky can be seen in the video below:

We wanted to know whether people interact differently with the robot when it plays with the kinematic pluckers versus the control ones. So we invited guitarists to improvise duets with the robot while we tracked their eye and body movements with eye-tracking glasses and motion capture, respectively. Figure 4 shows the experimental setup.

A human guitarist with eye-tracking glasses and motion-capture suit improvising a duet with Professor Plucky as part of a user study. Figure 4: A human guitarist with eye-tracking glasses and motion-capture suit improvising a duet with Professor Plucky as part of a user study.

Overall, we found that guitarists tended not to like the movement of the kinematic pluckers. The less-experienced players found it distracting, while the more-experienced players felt neutral about it but willfully ignored it. At the same time, everyone had anecdotes about purposely using movement when playing with another human guitarist in the past, although these were usually intermittent cueing gestures. So the continual movement of the pluckers seems to be the issue. It would be interesting to repeat the experiment with variable-length pauses in the music, so that the guitarist is forced to watch the robot for cues. On the other hand, there is theory suggesting that some amount of distraction can be good for motor-learning tasks, so it would also be interesting to repeat the experiment with participants learning a short passage instead of improvising.

Interestingly, some of the guitarists did not notice that the kinematic pluckers were used in one condition and the control pluckers in the other. One participant in particular looked directly at the respective plucking mechanisms while playing and moved significantly more with the kinematic pluckers, as the eye-tracking glasses and motion capture revealed. This suggests that part of how musicians respond to movement may be subconscious, so their self-reports of how much they liked it might not be the whole story.

Below is a video of someone improvising with the robot.

Published Dec. 18, 2021 5:38 AM - Last modified Nov. 18, 2024 10:42 AM

Participants

  • Michael Joseph Krzyzaniak University of Oslo
  • Laura Bishop University of Oslo