AI and brain interfaces may be about to change how we make music

AI and brain-computer interfaces are being introduced to the art of composing.

When Yamaha demonstrated an AI that let a dancer play the piano with his movements in a Tokyo concert hall in November 2017, it was the latest example of the ways computers are becoming involved in music-making. We’re not talking about the synthesizers and other processor-based instruments common in contemporary music. We’re talking about computers’ potential as composers’ tools, or as composers themselves.


Yamaha’s use of AI is something quite different. In the performance, world-renowned dancer Kaiji Moriyama was outfitted with electrodes on his back, wrists, and ankles and set free to express himself as AI algorithms converted his movements into musical phrases and transmitted them to Yamaha’s Disklavier piano via MIDI messages. (MIDI is a communications protocol through which electronic musical instruments can be controlled.)

(Yamaha Corporation)

Yamaha’s AI, which is still under development, worked from a database of linked musical phrases, selecting and stringing together melodies to send to the instrument based on Moriyama’s motions.
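To make the pipeline concrete, here is a rough sketch of the idea in Python. It is purely illustrative, not Yamaha’s system: the motion classifier, its thresholds, and the tiny phrase database are invented for the example, and it assumes the commonly used mido library for sending MIDI.

```python
# Purely illustrative sketch of a movement-to-music pipeline: classify a
# motion reading, pick a phrase from a small database, and send it to an
# instrument over MIDI. The classifier, thresholds, and phrase database are
# made up; "mido" is a common Python MIDI library (not what Yamaha used).
import mido

# A toy "database of linked musical phrases" keyed by motion type (MIDI note numbers).
PHRASES = {
    "strike": [55, 59, 62],        # G major triad for sharp movements
    "sweep":  [60, 64, 67, 72],    # C major arpeggio for flowing movements
    "still":  [48],                # a single low C for near-stillness
}

def classify_motion(accel_magnitude: float) -> str:
    """Bucket a movement by intensity (illustrative thresholds, not Yamaha's)."""
    if accel_magnitude > 2.0:
        return "strike"
    if accel_magnitude > 0.5:
        return "sweep"
    return "still"

def play_phrase(port, notes, velocity=80):
    """Send a phrase to the instrument as MIDI note messages (timing omitted)."""
    for note in notes:
        port.send(mido.Message("note_on", note=note, velocity=velocity))
        port.send(mido.Message("note_off", note=note, velocity=0))

if __name__ == "__main__":
    with mido.open_output() as port:       # default MIDI output, e.g. a player piano
        for reading in (0.3, 1.2, 2.5):    # stand-in sensor readings
            play_phrase(port, PHRASES[classify_motion(reading)])
```

A real system would replace the hand-written thresholds with a trained model and a far richer, musically linked phrase database, but the overall shape (sensors in, phrase selection, MIDI out) is the same.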

Moriyama was accompanied onstage by the Berlin Philharmonic’s Scharoun Ensemble.

(Yamaha Corporation)

We’ve written previously about another musical AI: the web-based platform Amper, which composes passages based on descriptors of style, instrumentation, and mood. Singer/songwriter Taryn Southern has used Amper as her primary collaborator in writing an album.

Another fascinating avenue being explored is the use of brain-computer interfaces (BCIs) that allow wearers to think music into existence. It’s an intriguing way for anyone to play music, but it’s especially promising for people whose physical limitations make creating music difficult or even impossible otherwise.

Certain electroencephalogram (EEG) signals correspond to known brain activities, such as the P300 ERP (for “event-related potential”), which signifies a person’s reaction to a stimulus. The P300 has previously been used by BCI applications for spelling, operating environmental controls, browsing the web, and painting. In September 2017, researchers led by BCI expert Gernot Müller-Putz of TU Graz’s Institute of Neural Engineering published research in PLOS ONE describing their “Brain Composer” project, which leverages the P300 to bring musical ideas directly from a composer’s mind onto notated sheet music. The group works in collaboration with the MoreGrasp and “Feel Your Reach” projects.
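The basic P300 selection principle is simple to sketch: candidate symbols flash one at a time, EEG epochs time-locked to each flash are averaged, and the symbol whose averaged response is strongest roughly 300 ms after its flashes is taken as the one the user was attending to. The snippet below is an assumption-laden illustration of that principle using NumPy, not the Graz group’s actual code; the sample rate, scoring window, and scoring rule are placeholders.

```python
# Simplified illustration of P300-based selection (as in spellers or a
# note picker): average the EEG epochs recorded after each symbol's flashes
# and choose the symbol with the largest response in the P300 time window.
# Shapes, sample rate, and the scoring window are illustrative assumptions.
import numpy as np

SAMPLE_RATE = 256                                          # Hz (assumed)
P300_WINDOW = slice(int(0.25 * SAMPLE_RATE), int(0.45 * SAMPLE_RATE))

def select_symbol(epochs_by_symbol: dict) -> str:
    """epochs_by_symbol maps a symbol (e.g. a note name) to an array of
    shape (n_flashes, n_samples): one EEG epoch per flash of that symbol."""
    scores = {}
    for symbol, epochs in epochs_by_symbol.items():
        average = epochs.mean(axis=0)                      # average out background EEG
        scores[symbol] = average[P300_WINDOW].mean()       # mean amplitude in P300 window
    return max(scores, key=scores.get)                     # attended symbol stands out

# Toy usage with random data standing in for recorded EEG:
rng = np.random.default_rng(0)
fake_epochs = {note: rng.normal(size=(10, SAMPLE_RATE)) for note in ("C4", "E4", "G4")}
fake_epochs["E4"][:, P300_WINDOW] += 2.0                   # simulate a P300 for the attended note
print(select_symbol(fake_epochs))                          # -> "E4"
```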

The researchers’ first step was training the BCI application to recognize alphabetical letters before moving on to note length and pitch, as well as notation elements such as rests and ties. In this video, the researchers demonstrate the successful outcome of just 90 minutes of work with a motor-impaired subject.

(Video: i-KNOW Conference)

This is all exciting stuff, and a godsend for musical souls with physical limitations, even if the results are a little odd, as in Yamaha’s case. (The BCI example sounds remarkably like the theme from Fringe.)

We’ve been augmenting our natural musical capabilities with technology ever since we picked up our first rock, and certainly by the time we were honking on steampunk-looking saxophones. We should have no issue with adding AI and BCIs to our toolbox.

If AI comes up with music we might not, that’s fine. The workings of music remain pretty mysterious in any event, so here’s an intriguing thought: though many of us basically prefer music that’s catchy and sonorous, the stuff that really gets us in the gut tends to have something unexpected about it, a surprising dissonance or rhythm, an odd “hair” out of place, that makes the moment leap from our speakers or the stage and into our lives as more of an experience than a piece of art, leaving us a little startled and even moved. So if AI can beat a flesh-and-blood chess player by not thinking like a human, imagine what we’re about to hear.

 
