Drake’s AI clone is here — and Drake might not be able to stop him

Major record labels are going after AI-generated songs, arguing copyright infringement. Legal experts say the approach is far from straightforward.

Illustration by Hugo Herrera for The Verge

A certain type of music has been inescapable on TikTok in recent weeks: clips of famous musicians covering other artists’ songs, with combinations that read like someone hit the randomizer button. There’s Drake covering singer-songwriter Colbie Caillat, Michael Jackson covering The Weeknd, and Pop Smoke covering Ice Spice’s “In Ha Mood.” The artists don’t actually perform the songs — they’re all generated using artificial intelligence tools. And the resulting videos have racked up tens of millions of views.

For Jered Chavez, a college student in Florida, the jump from messing around with AI tools one night to having a wildly viral hit came in late March. He posted a video featuring Drake, Kendrick Lamar, and Ye (formerly Kanye West) singing “Fukashigi no Karte,” a theme song of an anime series. It’s collected more than 12 million views in the month since.

Chavez has been generating new clips at a steady rate since then, getting millions more views across dozens of videos by running a cappella versions of songs through AI models that are trained to sound like the most recognizable musicians in the world. TikTok loves them, and they are cheap, quick, and simple to make.

“I was very surprised [at] how easy it was. Right out of the AI, it sounds pretty good. It sounds real,” Chavez says of the process. “It’s honestly kind of scary how easy these things are to do.”

So far, platforms haven’t removed Chavez’s videos, but the threats could be coming soon — if big artists and labels can figure out how to stop him. 

Music industry power players are already getting other AI-generated music pulled from streaming services by citing copyright infringement. But that argument is far from straightforward, legal experts say. There’s no precedent for whether real Drake can stop robot Drake on the basis of copyright — and yet copyright has once again become the most effective way to yank something off the internet that someone doesn’t like.

“It’s easy to use copyright as a cudgel in this kind of circumstance to go after new creative content that you feel like crosses some kind of line, even if you don’t have a really strong legal basis for it, because of how strong the copyright system is,” Nick Garcia, policy counsel at Public Knowledge, says.

That’s the case with perhaps the most notable AI-generated song so far, “Heart on My Sleeve,” which went viral earlier this month for its somewhat convincing pantomime of a Drake and The Weeknd song. The song, posted by an anonymous TikToker going by the name of Ghostwriter, amassed millions of streams before Spotify, Apple Music, TikTok, and YouTube removed it. In the case of YouTube, the culprit for removal was what felt like an unforced error: the otherwise original song inexplicably included a Metro Boomin production tag at the beginning. Universal Music Group claimed it was an unauthorized sample and successfully got the song pulled. In this case, a copyright claim worked — but just barely. Other original songs, like an AI Drake song called “Winter’s Cold,” have been pulled from streaming platforms, too, based on alleged copyright infringement.

Original songs using voice cloning aren’t copying anything concretely protected by the law

“Heart on My Sleeve” is exactly the kind of thing UMG wants streaming platforms to crack down on, saying AI companies are violating copyright law by training their models on artists’ songs (both Drake and The Weeknd have deals with UMG). That’s the same argument being put forward in other creative industries: Getty Images, for example, is suing the makers of the art generator Stable Diffusion, saying Stability AI “unlawfully copied and processed millions of images protected by copyright” when it trained its AI system. Online publishers are also heading down that path, saying they should be compensated for their content that is used to train chatbots. 

The problem with going down the copyright path to remove songs like “Heart on My Sleeve” and “Winter’s Cold” is that the tracks aren’t copying anything concretely protected by the law. Both songs appear to be written by a human who isn’t Drake and fed into voice cloning software, so the compositions are new, original works. An artist’s voice, style, or flow is not protected by copyright (for the most part). If an up-and-coming artist wrote their own lyrics, made a simple beat, recorded the vocals, and put the track through The Weeknd machine, there’s no individual existing work being copied. Promoting the new track as a song by The Weeknd would get dicey, but that would be closer to a trademark issue than a copyright one.

Advancements in the AI voice technology currently being used also make the sampling issue stickier. Unlike older technology that chopped up and rearranged pre-existing recordings, many AI systems are creating new sounds that resemble a target voice. Even if tiny pieces of a recording were somewhere in the new song, it would likely be so small a portion that it would fail to rise to the level of copyright infringement, Garcia says.

Big-picture issues in artistic industries tend to get filtered through a copyright lens because the law can act as a “very big hammer that can hit many, many nails,” Meredith Rose, senior policy counsel for Public Knowledge, says. Making the argument that an AI Drake song is infringing on real Drake’s copyright isn’t clear cut, but it has become the primary way labels — and thus the public — think through the potential problems with AI songs.

“Copyright is a concern, but it’s very much a second-tier concern over some of the bigger, more existential questions about economic displacement, and upending business models, and deep fakes,” Rose says.

What if AI-generated “unreleased” Drake tracks surfaced and diverted revenue from actual Drake? What if the AI songs are just bad, and people were convinced Drake lost his magic touch for hits? What if a creator made AI Drake sing a white nationalist anthem? The problem quickly expands beyond the scope of copyright and into Drake’s personhood and identity — and unlike the wave of unknowns with AI, there is some precedent with how a person’s likeness is used. 

A person’s right of publicity allows them to control how their name or likeness is used to make money. But even before wading into how AI tools change things, there are underlying discrepancies in what course of action individuals have. Modern-day copyright law is at the federal level, and as part of that, DMCA takedowns offer a relatively quick and easy avenue to get material pulled without involving a lawyer or filing a lawsuit. The right of publicity (which is, in a confusing twist, also sometimes called the right of privacy) is more complicated and only exists at the state level. 

Only some states have this kind of law on the books, but importantly, California and New York — the two with the most sizable entertainment industries — both do. After all, people love to riff on celebrities’ images, almost as much as celebrities hate to have their images riffed on. Real Drake might very well sue over robot Drake using the same law that real Vanna White used to sue over robot Vanna White in 1992. (In the 1992 case over a Samsung advertisement, robot Vanna was a metallic android in a brightly colored gown, a mid-length blonde wig, and jewelry, standing next to a game show board with letters, rather than the output of an AI generator. The Wheel of Fortune co-host’s actual name never appears in the ad in question.) 

Human Vanna White, as shown in the court filings. Robot “Vanna White” creator Samsung lost the case.
Image: United States Court of Appeals, Ninth Circuit

Many of the AI clone songs are being marketed as “Drake” or “Kanye West” AI tracks, but Garcia says their right of publicity could still apply even if the creator doesn’t explicitly name them in the promotion. After all, listeners likely recognize the voice singing “Paparazzi” with or without a picture of Ariana Grande to prompt them. It makes sense that musicians known for their voices are the first test cases for advanced voice cloning technology. This prompts a lot of questions about art, fair use, celebrity, and pop culture — and as complicated as those questions are, at least some version of this debate has been going on for decades. But what about the inevitable future instances where AI voice clones target people who aren’t known for their vocal talents at all?

“We’re already in very murky waters, that’s when we go into the swamp.”

“[Let’s say] you have a Ron DeSantis hip-hop track that flies out there, God help us. Would the arguments look the same? Maybe, maybe not,” Rose says. “That’s when we get into even murkier territory. We’re already in very murky waters, that’s when we go into the swamp.”

More than copyright, Rose believes the fight involving AI tools will veer toward reexamining things like right of publicity laws within the next five to 10 years. Currently, if someone is the victim of a fabricated voice recording or deepfake, their experience getting it taken down will depend largely on where they live. The rapid accessibility of powerful AI tools could force the legal system to fill in gaps that already exist with or without something like voice cloning software.

“Do we start taking things like the right to publicity and making them a federal law so that everybody within the United States has access to whatever tools we decided to bake into this?” Rose says. “Right now, for better or for worse, it’s just luck of the draw where you happen to live.”


One problem with AI voice clone songs is that, unfortunately for the subject, they are funny. Nobody asked to hear AI Joe Biden say, “He say that I’m good enough, grabbin’ my duh-duh-duh / Thinkin’ ‘bout shit that I shouldn’t’ve,” but it’s become part of internet culture — the audio has been added to TikToks of people cleaning their bathrooms, making salads, and dancing.

Most of the viral AI covers and original songs are being created without the subject’s consent, a tension many listeners are picking up on. Comment sections regularly include some version of, “This has to be illegal,” or “Waiting for this to get taken down” — there’s an undercurrent of grossness to the voice clone content that is inescapable even as the songs become more and more absurd.

Experts worry that AI voice clones will become a problem sooner rather than later

Experts who have worked on other types of nonconsensual sharing of material online worry that AI voice clones will become a problem sooner rather than later. And though the focus right now is on AI tools spoofing famous, wealthy individuals, it could quickly become a nightmare for the average person experiencing things like domestic abuse.

“I’m anticipating we’re going to start seeing voice cloning used to trick schools to get access to the kids or to say an ex-boyfriend tried to reach out to you,” says Adam Massey, a partner at C.A. Goldberg who specializes in technology-facilitated abuse like nonconsensual distribution of intimate images (often called “revenge porn”). To get fabricated content removed, Massey says, victims might start with a cease-and-desist letter alleging infringement of their right of publicity, impersonation, or fraud — but success will depend on whether the entity disseminating the material is responsive. If the unauthorized material is a synthetic product of an AI tool, the subject won’t necessarily hold the copyright over the deepfake.

Like the right of publicity, laws against sharing intimate images are on a state-by-state basis and are only beginning to address fabricated content, Massey says — just a handful of states have laws specifically around deepfake porn, for example. Websites that host nonconsensual explicit deepfakes operate openly, an NBC News report last month found. Though there’s no federal law against nonconsensual intimate images, deepfake porn has gotten so prevalent that Google has a DMCA-like system that victims can use to issue takedown requests.

Without the backing and legal support that big artists have, unsigned independent musicians will likely be forced to deal on their own with any voice clones that pop up. There’s no standardized way to report unauthorized AI-generated material, and artists would have to make the case that the clones were damaging their ability to make money from their right of publicity, Massey says.

Chavez doesn’t just make AI-assisted mashups of popular artists; he’s also a musician himself, recording and sharing songs in the “aesthetic rap” genre. He isn’t worried about someone cloning his voice — he’s more concerned about labels releasing posthumous albums without the input of the deceased artist.

Since his TikToks started blowing up, streams of his own music have ballooned, too, and his YouTube subscriber count has doubled. 

“So far, a lot of the responses are pretty positive, which I’m very happy about,” he says. 

The “AI stuff” isn’t what he’s actually passionate about, but people keep asking for new songs. Chavez plans to keep making some here and there — but truthfully, he is starting to get pretty bored.