In recent years, artificial intelligence (AI) has influenced music to an increasing extent, for instance in music production: while a human writes the main melody, AI may produce the background arrangement.
“However, music produced by AI today may not often be very surprising, as surprise is not AI’s priority. It is more about giving you what you order,” Çağrı Erdem says.
As a researcher at RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, University of Oslo, he wanted the machine to become a partner.
“I wanted us to make music together. Then the machine needs to have some kind of agency, a capability to act on its own,” he explains.
The machine as a partner
He recently completed a PhD in which he investigated what he calls “shared control,” developing several interactive systems along the way.
One of these systems he named CAVI: an instrument controlled by both a musician and a machine, where both actors are able to make choices.
Çağrı Erdem and six self-playing guitars perform, assisted by CAVI
“In this link six ‘self-playing guitars’ can listen and react based on what they hear. It becomes a kind of guitar choir, where the guitars give their own contribution to the piece,” Erdem says.
In November, Erdem will be organizing a workshop on musical AI.
The movements of the body
He also investigated how our bodies can collaborate with AI.
“If you look at the AI systems in music today, they do not collaborate with the human body. In general, they base their actions primarily on sound. However, when humans play together, they communicate through both sound and movement.”
Çağrı Erdem invited 36 guitar players to his lab. He collected a dataset that gave information about how their movements were linked to the music being played.
“I found that there is a close connection between sound and movement, particularly the movements of the right hand. The muscular force we exert when hitting the guitar strings reflects the sound almost perfectly,” he says.
Erdem used machine learning algorithms to code movement and sound. Later he could give the machine information about movement only and the machine would produce sound based on that input.
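As a rough illustration (not Erdem’s actual system, and with entirely hypothetical data), the idea of learning a mapping from movement to sound can be sketched as a simple regression: train on paired movement and sound measurements, then predict sound from movement alone, as in the “air guitar” scenario.

```python
import numpy as np

# Toy sketch: learn a mapping from a movement feature (e.g. a
# muscle-force amplitude, as from an EMG sensor) to a sound
# parameter (e.g. loudness). All data here is simulated.

rng = np.random.default_rng(0)

# Simulated training pairs: movement feature vs. resulting loudness.
movement = rng.uniform(0.0, 1.0, size=(200, 1))
loudness = 0.8 * movement[:, 0] + 0.1 + rng.normal(0.0, 0.02, 200)

# Fit a least-squares linear map: loudness ≈ w * movement + b.
X = np.hstack([movement, np.ones((200, 1))])
w, b = np.linalg.lstsq(X, loudness, rcond=None)[0]

# At "performance" time only movement is observed; predict the sound.
new_movement = 0.5
predicted_loudness = w * new_movement + b
print(predicted_loudness)
```

A real system would of course use richer movement features and a more expressive model, but the principle is the same: once the movement-to-sound mapping is learned, sound can be generated from movement alone.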
His system, “Playing in the ‘Air’,” is now able to make music while he plays air guitar.
In the longer term, he believes that this kind of technology can make it easier for humans and machines to collaborate.
Instruments for the future
Erdem works in a niche field, but he believes his research to be important.
“In music history, there are many examples of new instruments influencing the music being made,” he says.
When the piano was invented, it was originally named “pianoforte” since it allowed you to play both softly (Italian: piano) and loudly (Italian: forte).
“You see the effect of it on pieces written afterwards. Tools affect the music we make. New instruments may build a foundation for the music of the future,” Erdem says.
A more creative AI
The tech field is led by engineers, not artists, and this is highly visible in today’s AI systems, he adds.
“If you write the word ‘cat’ in a search engine, you get pictures based on other cat pictures. This makes sense, but it also means that the algorithm often does not show you pictures of rare cats.”
In order for AI to produce broader musical expressions, more artists must get their hands dirty with AI technologies, he states.
“People working at the crossroads of art, technology and science, like myself, need to investigate how algorithms may contribute to broadening the art and music of the future.”
AI in all phases of music production
As a tool, AI already contributes immensely to people’s everyday musical experiences. For instance, when your streaming platform suggests artists for you, it is AI. When you listen to film music with big orchestra arrangements, they may be made by AI, perhaps based on a simple melody.
Erdem has no doubt that in 50 years, AI will be strongly present in music production.
“I think it will be indispensable in all phases of music production, such as sound synthesis, songwriting/composition, arranging, recording, mixing, mastering, distribution/streaming, promotion, as well as live performances,” he says.
He also believes that we will find AI musicians with huge fanbases. The robot Shimon is already out there. It doesn’t have a fanbase yet, but you can book it for your event.
“Then, what will happen with copyrights? We don’t know yet. For example, after CAVI’s premiere, we could not figure out how to deal with that in the Norwegian system,” Çağrı Erdem says.