The Next Hit Song Might Be Created By Google


In an age where cars drive themselves and virtual reality is becoming the reality, it’s no surprise that robots are accelerating towards doing our work for us. Thanks to a music-creating project built on Google’s TensorFlow machine-learning framework, our beloved musicians across all genres might be out of their jobs pretty soon.

Before you panic, you should learn a little more about why this technology is actually important; although selfishly, we hope that Drake will jump at the chance to redesign the music industry and come out with a new “One Dance” to blow us away. Researchers have experimented with AI-generated (artificial-intelligence-generated) music for years, but backed by a tech giant as powerful as Google, Project Magenta has grown from an interesting idea to a functioning program in a matter of months.

Project Magenta, the official name for Google’s TensorFlow-based music initiative, was first demonstrated last May at Moogfest, a four-day music and technology festival in North Carolina. Douglas Eck and Adam Roberts, the primary researchers for the project, showed off a digital synthesizer demo in which a computer running the program listened to notes played, analyzed them, and then played back a more complete melody built from those notes.
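A toy way to picture that “listen, analyze, play back” loop is a Markov chain over pitches: count which note tends to follow which, then sample from those counts to extend a melody. This is far simpler than Magenta’s actual neural-network models, and every detail below (the function names, the training melody) is invented purely for illustration:

```python
from collections import defaultdict
import random

def train_transitions(melody):
    """Count which pitches follow each pitch in the input melody."""
    transitions = defaultdict(list)
    for prev, nxt in zip(melody, melody[1:]):
        transitions[prev].append(nxt)
    return transitions

def continue_melody(melody, transitions, extra_notes=4, seed=0):
    """Extend a melody by sampling likely next pitches from the learned counts."""
    rng = random.Random(seed)
    result = list(melody)
    for _ in range(extra_notes):
        choices = transitions.get(result[-1])
        if not choices:  # no data for this pitch; stop extending
            break
        result.append(rng.choice(choices))
    return result

# A few bars of an invented melody (MIDI pitch numbers: 60 = middle C).
training = [60, 62, 64, 62, 60, 62, 64, 65, 64, 62, 60]
model = train_transitions(training)
print(continue_melody([60, 62], model, extra_notes=4))
```

The sketch only ever emits pitches it has heard, which is why real systems like Magenta reach for recurrent neural networks that can generalize beyond exact note-to-note counts.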

Soon after, Google launched its first effort to make this technology accessible: a program that allows users to upload music data from MIDI files into TensorFlow. Ideally, the technology then learns basic music theory from patterns present across many of the files. The “lessons” it picks up from thousands of files include music notation, pitch, velocity, volume, and the cues and clock signals that establish tempo.
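To make “learning from MIDI data” a bit more concrete: a MIDI file is essentially a timed list of note events, each carrying a pitch, a velocity (how hard the note was struck), and timing. A minimal sketch of pulling out the kinds of statistics mentioned above — the events here are invented, and real parsing would use a library such as mido rather than hand-written tuples:

```python
from collections import Counter

# Invented note events: (pitch, velocity, start_beat, duration_beats).
events = [
    (60, 80, 0.0, 1.0),
    (64, 90, 1.0, 1.0),
    (67, 85, 2.0, 0.5),
    (64, 70, 2.5, 0.5),
    (60, 95, 3.0, 1.0),
]

# Which pitches occur most often (a crude stand-in for "learning notation").
pitch_counts = Counter(pitch for pitch, _, _, _ in events)

# Average velocity gives a rough sense of dynamics/volume.
avg_velocity = sum(vel for _, vel, _, _ in events) / len(events)

# Total length in beats, from which tempo-related timing can be derived.
total_beats = max(start + dur for _, _, start, dur in events)

print(pitch_counts.most_common(1))
print(avg_velocity)
print(total_beats)
```

A system like Magenta aggregates exactly this kind of event data, but over thousands of files, feeding it to models that learn the patterns rather than just tallying them.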

In a greater sense, the TensorFlow technology uses this collected data to help computer systems chase the holy grail of artificial intelligence: cognition — can someone say robots taking over?

Now while all you tech geeks out there are fangirling over this huge innovation, some of us are understandably worried about the dire effect of robots changing music as we know it. We adore the (human) artists behind our favorite songs, and we can’t imagine a world where we buy tickets to see machines rather than people in concert (though EDM essentially does this already). But this music/robot technology isn’t all techno; check out the song Google already made using TensorFlow:

As you can hear, it’s still very basic, and it relies on a considerable amount of effort beforehand: it takes time just to feed the computer enough files for it to work at all. However, in June, the software began supporting Apple’s iOS mobile platform, formally putting it in the public’s hands.

Music is supposed to reflect our era, and plainly said, computers are an integral part of modern society. Artists who recoil from progressing technology will only fade into irrelevance faster, but most musicians probably already know that. It’ll be interesting to see whether any of them attempt to team up with Project Magenta to produce tracks in the near future, or whether they declare musical war.

Likewise, it might be a good move on Google’s part to seek out a big-name artist to officially launch Magenta and TensorFlow into the spotlight. And, if nothing else, it’s a ~mellow~ way of introducing information technology to the world of art.

Music | Kate Kozuch