When Ludwig van Beethoven died in 1827, he was three years removed from the completion of his Ninth Symphony, a work heralded by many as his magnum opus. He had started work on his 10th Symphony but, due to deteriorating health, wasn’t able to make much headway: All he left behind were some musical sketches.
Ever since then, Beethoven fans and musicologists have puzzled and lamented over what could have been. His notes teased at some magnificent reward, albeit one that seemed forever out of reach.
Now, thanks to the work of a team of music historians, musicologists, composers and computer scientists, Beethoven’s vision will come to life.
I presided over the artificial intelligence side of the project, leading a group of scientists at the creative AI startup Playform AI that taught a machine both Beethoven’s entire body of work and his creative process.
A full recording of Beethoven’s 10th Symphony is set to be released on Oct. 9, 2021, the same day as the world premiere performance, scheduled to take place in Bonn, Germany — the culmination of a two-year-plus effort.
Around 1817, the Royal Philharmonic Society in London commissioned Beethoven to write his Ninth and 10th symphonies. Written for an orchestra, symphonies often contain four movements: the first is performed at a fast tempo, the second at a slower one, the third at a medium or fast tempo, and the last at a fast tempo.
In 1824, Beethoven completed his Ninth Symphony, which concludes with the timeless “Ode to Joy.”
But when it came to the 10th Symphony, Beethoven didn’t leave much behind, other than some musical notes and a handful of ideas he had jotted down.
There have been some past attempts to reconstruct parts of Beethoven’s 10th Symphony. Most famously, in 1988, musicologist Barry Cooper ventured to complete the first and second movements. He wove together 250 bars of music from the sketches to create what was, in his view, a version of the first movement that was faithful to Beethoven’s vision.
Yet the sparseness of Beethoven’s sketches made it impossible for symphony experts to go beyond that first movement.
In early 2019, Dr. Matthias Röder, the director of the Karajan Institute, an organization in Salzburg, Austria, that promotes music technology, contacted me. He explained that he was putting together a team to complete Beethoven’s 10th Symphony in celebration of the composer’s 250th birthday. Aware of my work on AI-generated art, he wanted to know if AI would be able to help fill in the blanks left by Beethoven.
The challenge seemed daunting. To pull it off, AI would need to do something it had never done before. But I said I would give it a shot.
Röder then compiled a team that included Austrian composer Walter Werzowa. Famous for writing Intel’s signature “bong” jingle, Werzowa was tasked with putting together a new kind of composition that would integrate what Beethoven left behind with what the AI would generate. Mark Gotham, a computational music expert, led the effort to transcribe Beethoven’s sketches and process his entire body of work so the AI could be properly trained.
The team also included Robert Levin, a musicologist at Harvard University who also happens to be an incredible pianist. Levin had previously finished a number of incomplete 18th-century works by Mozart and Johann Sebastian Bach.
In June 2019, the group gathered for a two-day workshop at Harvard’s music library. In a large room with a piano, a blackboard and a stack of Beethoven’s sketchbooks spanning most of his known works, we talked about how fragments could be turned into a complete piece of music and how AI could help solve this puzzle, while still remaining faithful to Beethoven’s process and vision.
The music experts in the room were eager to learn more about the sort of music AI had created in the past. I told them how AI had successfully generated music in the style of Bach. However, this was only a harmonization of an input melody made to sound like Bach. It didn’t come close to what we needed to do: construct an entire symphony from a handful of phrases.
Meanwhile, the scientists in the room — myself included — wanted to learn about what sort of materials were available, and how the experts envisioned using them to complete the symphony.
The task at hand eventually crystallized. We would need to use notes and completed compositions from Beethoven’s entire body of work — along with the available sketches from the 10th Symphony — to create something that Beethoven himself might have written.
This was a tremendous challenge. We didn’t have a machine that we could feed sketches to, push a button and have it spit out a symphony. Most AI available at the time couldn’t continue an uncompleted piece of music beyond a few additional seconds.
We would need to push the boundaries of what creative AI could do by teaching the machine Beethoven’s creative process — how he would take a few bars of music and painstakingly develop them into stirring symphonies, quartets and sonatas.
As the project progressed, the human side and the machine side of the collaboration evolved. Werzowa, Gotham, Levin, and Röder deciphered and transcribed the sketches from the 10th Symphony, trying to understand Beethoven’s intentions. Using his completed symphonies as a template, they attempted to piece together the puzzle of where the fragments of sketches should go — which movement, which part of the movement.
They had to make decisions, like determining whether a sketch indicated the starting point of a scherzo, which is a very lively part of the symphony, typically in the third movement. Or they might determine that a line of music was likely the basis of a fugue, which is a melody created by interweaving parts that all echo a central theme.
The AI side of the project — my side — found itself grappling with a range of challenging tasks.
First, and most fundamentally, we needed to figure out how to take a short phrase, or even just a motif, and use it to develop a longer, more complicated musical structure, just as Beethoven would have done. For example, the machine had to learn how Beethoven constructed the Fifth Symphony out of a basic four-note motif.
Next, because the continuation of a phrase also needs to follow a certain musical form, whether it’s a scherzo, trio or fugue, the AI needed to learn Beethoven’s process for developing these forms.
The to-do list grew: We had to teach the AI how to take a melodic line and harmonize it. The AI needed to learn how to bridge two sections of music together. And we realized the AI had to be able to compose a coda, which is a segment that brings a section of a piece of music to its conclusion.
Finally, once we had a full composition, the AI was going to have to figure out how to orchestrate it, which involves assigning different instruments for different parts.
And it had to pull off all these tasks the way Beethoven might have done.
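To make the first of those tasks more concrete, here is a deliberately simplified sketch — far removed from the models we actually built — of what “continuing a phrase” means computationally: a toy Markov chain that learns which notes tend to follow which contexts in a corpus of melodies, then uses those statistics to extend a short motif. The corpus and the motif below are hypothetical placeholders, written as MIDI pitch numbers, not material from the project.

```python
# Toy sketch only -- not the models used in the project. A second-order
# Markov chain learns which note tends to follow each two-note context
# in a corpus of melodies, then extends a short motif one note at a time.
import random
from collections import defaultdict


def train_transitions(melodies, order=2):
    """Count the notes that follow each `order`-note context in the corpus."""
    transitions = defaultdict(list)
    for melody in melodies:
        for i in range(len(melody) - order):
            context = tuple(melody[i:i + order])
            transitions[context].append(melody[i + order])
    return transitions


def continue_phrase(motif, transitions, length=12, order=2, seed=0):
    """Extend a motif by repeatedly sampling a plausible next note."""
    rng = random.Random(seed)
    phrase = list(motif)
    for _ in range(length):
        context = tuple(phrase[-order:])
        # Fall back to repeating the last note if the context was never seen.
        candidates = transitions.get(context) or [phrase[-1]]
        phrase.append(rng.choice(candidates))
    return phrase


if __name__ == "__main__":
    # Hypothetical training melodies (MIDI pitch numbers) standing in for a real corpus.
    corpus = [
        [67, 67, 67, 63, 65, 65, 65, 62],       # shape of the Fifth Symphony's opening motif
        [60, 62, 64, 65, 67, 65, 64, 62, 60],
        [67, 65, 64, 62, 60, 62, 64, 67],
    ]
    model = train_transitions(corpus)
    print(continue_phrase([67, 67], model))
```

A model this simple only echoes local patterns from its training data. The point is the general shape of the problem — learning continuation statistics from a body of work and sampling from them — which our actual systems had to tackle with far richer models, far more context and the constant judgment of the human experts on the team.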
In November 2019, the team met in person again — this time, in Bonn, at the Beethoven House Museum, where the composer was born and raised.
This meeting was the litmus test for determining whether AI could complete this project. We printed musical scores that had been developed by AI and built off the sketches from Beethoven’s 10th. A pianist performed in a small concert hall in the museum before a group of journalists, music scholars and Beethoven experts.
We challenged the audience to determine where Beethoven’s phrases ended and where the AI extrapolation began. They couldn’t.
A few days later, one of these AI-generated scores was played by a string quartet in a news conference. Only those who intimately knew Beethoven’s sketches for the 10th Symphony could determine when the AI-generated parts came in.
The success of these tests told us we were on the right track. But these were just a couple of minutes of music. There was still much more work to do.
At every point, Beethoven’s genius loomed, challenging us to do better. As the project evolved, the AI did as well. Over the ensuing 18 months, we constructed and orchestrated two entire movements of more than 20 minutes apiece.
We anticipate some pushback to this work — those who will say that the arts should be off-limits from AI, and that AI has no business trying to replicate the human creative process. Yet when it comes to the arts, I see AI not as a replacement, but as a tool — one that opens doors for artists to express themselves in new ways.
This project would not have been possible without the expertise of human historians and musicians. It took an immense amount of work — and, yes, creative thinking — to accomplish this goal.
At one point, one of the music experts on the team said that the AI reminded him of an eager music student who practices every day, learns, and becomes better and better.
Now that student, having taken the baton from Beethoven, is ready to present the 10th Symphony to the world.
Ahmed Elgammal is the director of the Art & AI Lab at Rutgers University. This article is republished from The Conversation, a nonprofit news source unlocking ideas from academia, under a Creative Commons license.