Drake to Ariana Grande: How TikTokers are making those viral AI cover songs
Who knew Ari singing Passionfruit would be such a bop
It’s not uncommon to discover new songs on TikTok. But, scrolling through your For You Page lately, you might have noticed many viral hits produced by one new artist alone: artificial intelligence.
Yup, AI song covers are flooding the app and basically no major artist has been left out of the trend: Ariana Grande covered Drake’s Passionfruit, Rihanna covered Tems’ Free Mind and Beyoncé’s Cuff It, Kanye West covered Justin Bieber’s Love Yourself, and Drake even covered Ice Spice’s Munch, a violation he described as “the final straw”.
Why a violation? Because none of these covers has the consent of the artist who wrote the song or of the pop star whose voice is artificially singing it. A robot has stolen their voice and run with it. And, especially with artists like Rihanna who haven’t released a new album since 2016, this practice is becoming insanely popular. So much so that music labels are seeking intervention.
So, how do you actually create an AI cover of a song?
Last month, AI software developed at The Chinese University of Hong Kong, called DiffSVC, made headlines after it was used to cover songs in Ariana Grande’s voice. To do this, a fan simply fed the model authentic Ariana songs from her back catalogue, clips of her singing a cappella, and other audio from YouTube until DiffSVC could sing (almost) exactly like Ariana can.
But DiffSVC can replicate literally any human voice. So, obviously, fans’ experimentation hasn’t stopped with Ari. Everyone from Britney Spears to Dua Lipa – and Michael Jackson posthumously – has now been fed into DiffSVC, and other software like it, to perform other artists’ music and entertain us on TikTok and Twitter, which is a bit problematic.
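For readers curious what “feeding the model” involves in practice, voice-conversion tools generally train on many short, uniform clips rather than whole songs, so a fan’s first job is slicing their collected audio into fixed-length segments. Below is a minimal, hypothetical sketch of that dataset-preparation step; it is not DiffSVC’s actual API, and the function name and parameters are illustrative only.

```python
# Hypothetical sketch of the dataset-preparation step for a
# singing-voice-conversion model: a long vocal recording is split
# into equal-length training segments. Not DiffSVC's real interface.

def slice_clips(samples, sample_rate=44100, segment_seconds=5.0):
    """Split one long recording (a list of audio samples) into
    fixed-length segments, dropping any trailing remainder that is
    shorter than a full segment."""
    segment_len = int(sample_rate * segment_seconds)
    return [
        samples[i : i + segment_len]
        for i in range(0, len(samples) - segment_len + 1, segment_len)
    ]

# Example: 23 seconds of silent dummy audio yields four 5-second segments.
dummy_audio = [0.0] * (44100 * 23)
segments = slice_clips(dummy_audio)
print(len(segments))  # → 4
```

In a real pipeline these segments would then be converted to acoustic features and used to fine-tune the model until it can map any input vocal onto the target singer’s voice.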
Are artists and record labels angry about AI song covers?
Obviously, to have your whole voice reproduced by a robot is scary – and it also raises the question of money. Who gets paid when a robot is creating song covers, a fan is producing them, and the artist’s back catalogue has made the whole thing possible? At the moment, it’s a legal mess for musicians.
And, according to a report in the Financial Times, Universal Music Group has asked major streaming platforms – including Spotify and Apple – to block AI companies from using UMG artists’ music to train the technology.
“We have become aware that certain AI systems might have been trained on copyrighted content without obtaining the required consents from, or paying compensation to, the rights holders who own or produce the content,” UMG said in an email.
“We will not hesitate to take steps to protect our rights and those of our artists,” they added to the FT. “We have a moral and commercial responsibility to our artists to work to prevent the unauthorized use of their music and to stop platforms from ingesting content that violates the rights of artists and other creators.
“We expect our platform partners will want to prevent their services from being used in ways that harm artists.”