The First Track of the First Album Composed and Produced by AI

The first pop album composed and produced by AI, and Taryn Southern.


At Big Think, we’ve been intrigued for a while by the intersection of artificial intelligence/machine learning and creativity. There was a crazy AI-written movie script a while back, and a couple of attempts by AI at producing music. In one case, it generated a melody and basic chords (hidden, admittedly, somewhere inside a splashy arrangement contributed by a human arranger and performers). In another case, Google AI produced some, shall we say, abstruse piano fragments. But we haven’t seen anything like this before. Singer/songwriter — and YouTube star — Taryn Southern is working on a collection of songs whose instrumental backings were created entirely by an AI program (Southern writes the melodies and lyrics).

The appropriately futuristic, palindromic title for the album is: I Am AI. The first single, “Break Free,” came out a couple of weeks ago.

The video effects at the end are also AI-generated, by Google’s Deep Dream. (TARYN SOUTHERN)

One of the things that’s immediately striking about “Break Free” is that it sounds so normal. It’s not bloops and bleeps, nor is it... well... as completely insane as previous attempts have seemed.

The AI app Southern is using is called Amper. It’s a web-based tool in which you set some basic parameters for the music you want, and Amper creates, or renders, it for you. If you don’t like what Amper comes up with, you can render again. Its creators promise the app never repeats itself, presenting you with a new chunk of music with each render. Southern tells Big Think that she’s done this as many as 134 times to get what she was looking for in a particular piece of music. 

The few criteria you provide Amper are pretty simple. You can choose from among four styles.

(AMPER MUSIC)

You can select your instrumentation.

(AMPER MUSIC)

You can also set the length of your piece — in minutes and seconds, though unfortunately not bars — and you can select the key you want.

And that’s it. Amper is not really designed for writing songs, per se. It’s more about generating music backgrounds for videos, games and augmented reality projects. Southern says the company told her she was the first artist using it for songwriting as far as they knew, and they’ve been learning about their own program from what she’s doing with it. One obvious question that comes to mind is who exactly owns the Amper-created songs, Southern or Amper? How about Southern and the person/people who wrote the Amper algorithms? When asked about this, Southern laughs, saying, “We are working through those issues.”
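Amper’s actual interface is a web tool, not a programming API, but the workflow the article describes — set a handful of parameters, render, and re-render until something clicks (up to 134 times, in Southern’s case) — is easy to picture in code. The sketch below is purely illustrative: the function and parameter names are our own inventions, not anything Amper exposes.

```python
import random

# Hypothetical sketch of the set-parameters-then-re-render loop described
# above. All names here are illustrative; Amper's real product is a web tool.

def render(style, instruments, length_seconds, key, seed):
    """Stand-in for Amper's renderer: each seed yields a new piece,
    mimicking the promise that the app never repeats itself."""
    rng = random.Random(seed)
    return {
        "style": style,
        "instruments": instruments,
        "length_seconds": length_seconds,
        "key": key,
        "id": rng.getrandbits(32),  # a fresh result on every render
    }

def render_until_satisfied(accept, max_renders=134, **params):
    """Keep re-rendering until the artist accepts a piece, or give up
    after max_renders attempts (134 being Southern's reported record)."""
    piece = None
    for attempt in range(1, max_renders + 1):
        piece = render(seed=attempt, **params)
        if accept(piece):
            return attempt, piece
    return max_renders, piece

# Example run: the accept() predicate stands in for human taste.
attempts, piece = render_until_satisfied(
    accept=lambda p: p["id"] % 10 == 0,
    style="electronic",                 # one of a few broad style choices
    instruments=["synth", "drums"],     # your chosen instrumentation
    length_seconds=210,                 # length in minutes/seconds, not bars
    key="C minor",
)
print(attempts, piece["key"])
```

The point of the loop is the design trade-off the article circles around: the human contributes no notes, only parameters and judgment, and the machine supplies unlimited raw material.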

Southern is a self-described “music hacker” who settled on Amper for its high quality after investigating a few other AI-music platforms. She says she loves expressing herself through words and melodies, but after dealing with the expense and logistics involved in recording her music with other humans, she became intrigued with the possibilities of AI: “I got excited that I could do this on my own.”

While the process is cumbersome — especially looking for the right music to fit around a pre-conceived melody, as with “Break Free” — it holds obvious promise. She’s enjoying working with Amper as a creative challenge, thinking of it as a “fun box to play in for an album.” An easier approach would be to let it devise instrumental backings on top of which melodies would be crafted, and that’s a method Southern’s currently exploring. “Right now, I’m writing more to the AI.” She’s also considering layering more human performances on top of the Amper material to add more “interesting textures,” making the AI more “just an inspiration point.”

Amper rendering music (AMPER MUSIC)

There’s a video at the bottom of Amper’s home page that reflects just how new all of this is, and its somewhat hyperbolic claims reveal the uncertainty many of us share in trying to envision AI’s rightful place in our lives and in the creative process. Amper’s co-founder and CEO Drew Silverstein says that the creators of Amper believe “the future of music will be created by the collaboration between humans and AI.” It’s hard to know what to make of this.

On one hand, music, art, and other media may be the ultimate forms of human expression, communicating feelings and ideas, as they do, on a unique level. What would AI have to do with that — does AI have a need to express itself?

(ADVENT via SHUTTERSTOCK)

On the other hand, using it as Southern does unquestionably makes sense. You’ll have to decide for yourself, though, if AI is really her collaborator, or simply a tool her own talent and musical taste allow her to effectively employ.

It should also be said that Amper’s primarily aimed at people who express their creativity visually, and simply need original (and free) music that complements their vision, “instantly and with no experience required.” This may be why you set the length of Amper music in minutes and seconds, to match a video’s length. Statements in the video like “We are enabling millions and millions of people to express themselves, and to express their creativity…” and promoting the benefits of “lowering the bar” only really make sense if Silverstein’s addressing visually oriented people.

It would seem that AI music generation would be of the least use to a musician who specializes in composing and wouldn’t want to hand that responsibility off to anyone else. For that person, lyric-writing AI could be a godsend. Still, in 2017, it’s likely to be just about as ridiculous as the movie script we mentioned earlier.

Southern points out an issue that comes from making the creative process so frictionless. “There’s a rub in using something like AI to fuel a creative project,” she says, because, “there’s no need to learn how instrumentation works. We’re seeing this across every creative field. Does it make the person less creative? Does it bring the art form down? I don’t know.” Still, she says, “I’d rather see people have tools to express themselves than not, because that’s an entry point into a different way of thinking and a different way of expressing.”

Ready for the future (TARYN SOUTHERN)

When AI is fully capable of creating entire songs and recordings — with music, lyrics, and performances — it’s likely we’ll all be at least initially curious to hear what it sounds like. And the field is still so young. As AI becomes more and more alive and individuated — we’re thinking Data on Star Trek TNG — it may well have something very much like feelings and a soul. And it may have something of its own that it absolutely has to express.
