We tend to think of robots as suited to repetitive, mundane mechanical tasks, and of creative endeavors like art and music as uniquely human. Well, think again: with advancements in artificial intelligence, computers can now make music that is hard to tell apart from music made by people. And the latest example of this is Coditany of Timeness, a black metal album made entirely by an artificial neural network.
As first reported by The Outline, Coditany of Timeness was created using deep learning software that, over a short period, was trained to analyze and reproduce the style of the music it was fed. In this case, it was an album by the black metal band Krallice. This isn’t the first experiment in A.I. music creation, but the project is particularly noteworthy because, unlike previous experiments that replicated classical music, black metal is “characterized by its ultra-long progressive sections, textural rhythms, deep screams, and melodic weaving over a grid of steady, aggressive rhythmic attacks,” and has “extreme characteristics [that] make it an outlier in human music,” wrote the project’s creators, Zack Zukowski and CJ Carr, who go by the name Dadabots.
In short: music that isn’t easy to recreate, yet the computer was able to make something that sounds like it came from the band. According to The Outline’s author, Jon Christian, “If I didn’t know it was generated by an algorithm, I’m not sure I’d be able to tell the difference.”
Here’s how the album was created. The Krallice album Diotima was first separated into 3,200 eight-second segments of raw audio data. Most of the audio was used to teach the algorithm what the music sounded like, while other segments were used to test the software by making it guess what came next. Successful guesses strengthened the A.I.’s neural network, which is loosely modeled on the human brain. If the training merely produced unintelligible noise, it was restarted. After three days and millions of repetitions, they ended up with 20 sequences, each four minutes long.
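To make the segment-and-predict idea concrete, here is a minimal Python sketch of that workflow. It is not Dadabots’ actual code (the article doesn’t name their tools), and the sample rate, the train/test split, and the naive “repeat the previous segment” predictor are assumptions standing in for a real trained neural network.

```python
import numpy as np

# Assumed parameters: the article gives eight-second segments and roughly
# 3,200 of them, but not the sample rate or the model details.
SAMPLE_RATE = 16_000                      # samples per second (assumption)
SEGMENT_LEN = SAMPLE_RATE * 8             # eight seconds of raw audio

def segment_audio(waveform: np.ndarray) -> np.ndarray:
    """Chop a mono waveform into fixed-length eight-second segments."""
    n_segments = len(waveform) // SEGMENT_LEN
    return waveform[: n_segments * SEGMENT_LEN].reshape(n_segments, SEGMENT_LEN)

# Stand-in for the album audio: synthetic noise so the sketch runs anywhere.
rng = np.random.default_rng(0)
fake_album = rng.standard_normal(SAMPLE_RATE * 60 * 5)   # "five minutes" of audio

segments = segment_audio(fake_album)

# Most segments train the model; a held-out slice tests it by asking it to
# guess what comes next, as the article describes.
split = int(0.9 * len(segments))
train_segments, test_segments = segments[:split], segments[split:]

def predict_next(previous_segment: np.ndarray) -> np.ndarray:
    # Toy baseline: just repeat the previous segment. A real system would
    # generate the continuation with a trained neural network instead.
    return previous_segment

errors = [
    np.mean((predict_next(test_segments[i - 1]) - test_segments[i]) ** 2)
    for i in range(1, len(test_segments))
]
print(f"{len(train_segments)} training segments, {len(test_segments)} test segments, "
      f"mean next-segment error: {np.mean(errors):.3f}")
```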
“Early in its training, the kinds of sounds it produces are very noisy and grotesque and textural,” Carr told The Outline. “As it improves its training, you start hearing elements of the original music it was trained on come through more and more.”
As if that weren’t enough, the names of the songs and the title of the album were generated by a probabilistic model known as a Markov chain. Even the album cover artwork was created by an A.I. program.
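For readers curious what a Markov chain looks like in practice, here is a tiny, hypothetical example of generating title-like phrases. The seed phrases are invented for illustration; the article doesn’t describe what text Dadabots actually trained their chain on.

```python
import random
from collections import defaultdict

# Hypothetical seed phrases standing in for whatever corpus Dadabots used.
corpus = [
    "wisdom trip through the starlit abyss",
    "inhumanity of the hidden choir",
    "timeness beyond the iron veil",
]

# First-order transition table: each word maps to the words observed after it.
transitions = defaultdict(list)
for phrase in corpus:
    words = phrase.split()
    for current_word, next_word in zip(words, words[1:]):
        transitions[current_word].append(next_word)

def generate_title(start_word: str, max_words: int = 6) -> str:
    """Walk the chain, picking each next word according to observed frequencies."""
    words = [start_word]
    while len(words) < max_words and transitions[words[-1]]:
        words.append(random.choice(transitions[words[-1]]))
    return " ".join(words).title()

random.seed(7)
print(generate_title("wisdom"))
```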
This project is just the latest in A.I. research into music generation. Researchers at Birmingham City University in the U.K. are developing a neural network project that could predict what a piece of music might sound like if it had been created by an earlier artist, say Pink Floyd covering a Jay-Z tune. Sony’s Computer Science Laboratory division created two Beatles-esque pop songs after its A.I. project learned various musical styles from a massive database. And the A.I. from Google’s Magenta team created a 90-second musical piece all by itself, thanks to machine learning.
Coditany of Timeness is Dadabots’ first album (you can listen to the album on Bandcamp), and the results will be included in their research paper, “Generating Black Metal and Math Rock: Beyond Bach, Beethoven, and Beatles,” which will be presented at the Neural Information Processing Systems conference.