On February 16, the US Federal Trade Commission (FTC) proposed updates to a rule targeting artificial intelligence (AI) deepfakes. The agency said the proposed changes would protect consumers from AI impersonations.

According to the ‘Rule on Impersonation of Government and Businesses’ document, AI deepfakes that impersonate businesses and governments could face legal action.

No AI Deepfakes Allowed for Businesses and Government Agencies

The FTC said the changes are necessary because impersonations of businesses, government officials, and government-affiliated organizations have become widespread.

The goal is to protect consumers from harm caused by the misuse of generative AI platforms.

The updated rule will come into effect 30 days following its publication in the Federal Register.

For now, public comments are welcome for the next 60 days. Once the rule is enacted, the FTC will be empowered to go after scammers who defraud users by impersonating legitimate businesses or government agencies.

The AI industry has come a long way since OpenAI's high-profile launch of ChatGPT in November 2022. The company, led by Sam Altman, recently unveiled a new product called Sora.

Sora uses AI prompts to generate realistic videos with highly detailed scenes, complex camera motion, and vivid emotion.

Powerful AI tools like those offered by OpenAI and Google have increased productivity for many people and businesses.

However, they have also become effective tools in the hands of cybercriminals, who can use them to alter a person's appearance or voice and deceive a target audience.

The FTC rule change will come down hard on these criminals to ensure they face the full weight of the law.

While there is no concrete rule making AI-generated recreations illegal, US Senators Chris Coons, Marsha Blackburn, and Thom Tillis have taken steps to address the issue.

Impersonator Scams Stole $2.7 Billion in 2023

Impersonator scams, though they rarely make headlines, pose a major threat to the US.

Speaking on the issue, FTC Chair Lina Khan noted that voice cloning and other AI-driven scams are on the rise.

Khan said the updated rules would strengthen the agency's ability to address AI-enabled scams that impersonate individuals.

Putting a figure on the damage, Khan noted that US citizens lost upwards of $2.7 billion to impersonator scams in 2023.

The new rules would also enable the agency to return stolen funds to the affected victims.

Meanwhile, the head of the Federal Communications Commission (FCC), Jessica Rosenworcel, has proposed classifying all calls that use AI-generated voices as illegal.

The announcement came after reports surfaced that US citizens were receiving robocalls imitating President Joe Biden.

In the calls, US voters were advised not to vote in the US Presidential elections.

Meanwhile, in the crypto industry, AI deepfakes are a menace.

According to Michael Saylor, about 80 deepfake videos of him are taken down every day. Most show him asking users to send their Bitcoin to a posted wallet address.

New ones emerge daily, however. Saylor, who serves as Chairman of MicroStrategy, has warned crypto investors about the trend.

By ailf
