
Be wary of AI celebrity clones peddling phoney 'free money' on YouTube.

Scammers on YouTube are using AI voice-cloning technology to make it appear as if celebrities such as
Steve Harvey and Taylor Swift are encouraging fans to fall for medical benefits scams. 404 Media was the
first to report on the trend this week. These videos are just some of the latest examples of scammers
using increasingly accessible generative AI tools to impersonate famous people and target often
economically vulnerable communities in pursuit of quick money.

A tipster contacted 404 Media and directed the publication to more than 1,600 YouTube videos featuring
deepfaked celebrity voices, as well as non-celebrities, promoting the scams. Those videos, many of which
were still available at the time of writing, have reportedly received 195 million views. The videos appear
to violate several YouTube policies, including those concerning misrepresentation, spam, and deceptive
practices. YouTube did not immediately respond to PopSci's request for comment.

How does the ruse operate?

The scammers attempt to fool viewers by using cut-up clips of celebrities paired with voiceovers created
by AI tools that mimic the celebrities' own voices. Deepfake versions of Steve Harvey, Oprah, Taylor
Swift, podcaster Joe Rogan, and comedian Kevin Hart can all be heard promoting the scam. Rather than
celebrity deepfakes, some of the videos appear to use a recurring cast of real humans pitching variations
of a similar story. The videos are frequently posted by YouTube accounts with misleading names like
"USReliefGuide," "ReliefConnection," and "Health Market Navigators."

"I've been telling you guys for months to claim this $6,400," says one of the deepfake clones
impersonating Family Feud host Steve Harvey. "Anyone can get this even if you don't have a job!" That
video, which was still available on YouTube at the time of writing, had received over 18 million views.

Though the exact wording of the scams varies from video to video, they all follow a basic template. First,
the deepfaked celebrity or actor addresses the audience, informing them of a $6,400 end-of-year holiday
stimulus check provided by the US government, which will be delivered via a 'health spending card'. The
celebrity voice then says that anyone who is not already enrolled in Medicare or Medicaid can apply for
the stimulus. Viewers are then usually directed to a link where they can apply for the benefits. The video,
like many effective scams, creates a sense of urgency by convincing viewers that the bogus deal will not
last long.

In reality, victims who click on those links are frequently redirected to URLs with names like
"secretsavingsusa.com" that are not affiliated with the US government. PolitiFact reporters called a
signup number listed on one of those websites and spoke with an "unidentified agent" who asked for
their income, tax filing status, and birth date: all sensitive personal information that could be used to
commit identity fraud. In some cases, the scammers allegedly request credit card information as well.
The scam appears to exploit confusion over legitimate government health tax credits to entice victims.

There are numerous government programmes and subsidies to help people in need, but generic claims
of "free money" from the US government are generally a red flag. The falling cost of generative AI tools
capable of producing convincing mimics of celebrities' voices stands to make these scams even more
persuasive. The Federal Trade Commission (FTC) warned of this possibility in a blog post last year, citing
examples of fraudsters using deepfakes and voice clones to engage in extortion and financial fraud,
among other illegal activities. Deepfake audio can already fool human listeners nearly 25% of the time,
according to a study published in PLOS One last year.

This isn't the first instance of deepfake celebrity scams, and it's unlikely to be the last. Tom Hanks
recently apologised to his fans on Instagram after a deepfake clone of him was spotted promoting a
dental plan scam. Soon after, CBS anchor Gayle King said scammers were using similar deepfake
techniques to make it appear as if she were endorsing a weight-loss product. Scammers reportedly used
an AI clone of pop star Taylor Swift's voice alongside real images of her using Le Creuset cookware to
persuade viewers to sign up for a kitchenware giveaway. The gleaming pots and pans were never
delivered to fans.

To address the growing problem, lawmakers are scrambling to draft new laws and clarify existing
legislation. Several proposed bills, such as the Deepfakes Accountability Act and the No Fakes Act, would
grant individuals greater control over digital representations of their likeness. Recently, a bipartisan
group of five House lawmakers introduced the No AI FRAUD Act, which seeks to establish a federal
framework protecting individuals' rights to their digital likeness, with a focus on artists and performers.
Still, it's unclear how likely any of those bills are to pass amid the flurry of new, hastily drafted AI
legislation entering Congress.
