New Rule Can Come Into Effect as Early as October
On 14 August, the United States (US) Federal Trade Commission (FTC) announced a final rule banning marketers from using fake reviews, including those generated by artificial intelligence (AI), as well as deceptive practices like paying for bots to inflate follower counts.
This rule strengthens the FTC's enforcement powers, allowing it to impose civil penalties on violators.
Notably, the rule could impact crypto influencers and others who boost their social media presence with fake followers, views, and likes.
The rule addresses six key areas of fraudulent marketing:
Fake or False Consumer Reviews: The rule prohibits reviews and testimonials attributed to people who do not exist, including AI-generated reviews, or to people who have no actual experience with the product. Businesses are also banned from purchasing, soliciting, or disseminating such reviews if they know the reviews are false.
Compensated Reviews: Businesses can no longer offer compensation or incentives in exchange for reviews that express a specific sentiment, whether positive or negative.
Insider Reviews: The rule bans reviews from company insiders that do not clearly disclose their connection to the business, including those solicited from employees or relatives.
Misleading Review Sites: Businesses are prohibited from creating or controlling review websites that falsely appear to be independent, a tactic some companies in the tech industry have used to promote their own products.
Fake Social Media Influence: The rule also prohibits anyone, not just companies, from selling or buying fake indicators of social media influence, such as bot-generated followers or views.
Review Suppression: The rule bars businesses from suppressing negative reviews through intimidation, legal threats, or false accusations. It also prohibits businesses from misrepresenting that the reviews displayed on their websites represent all or most of the reviews submitted when negative reviews have been hidden.
The FTC approved the rule in a unanimous 5-0 vote, and it will take effect 60 days after its publication in the Federal Register, likely in mid- to late October.
Violations could result in significant fines, especially for businesses with large numbers of reviews.
FTC Chair Lina M. Khan said in a statement:
“Fake reviews not only waste people's time and money but also pollute the marketplace and divert business away from honest competitors. By strengthening the FTC's toolkit to fight deceptive advertising, the final rule will protect Americans from getting cheated, put businesses that unlawfully game the system on notice, and promote markets that are fair, honest, and competitive.”
The FTC has been pursuing regulations against fake reviews and inflated social metrics for years, and this rule marks a decisive step in that effort.
Are Automated Chatbots like ChatGPT to Blame?
As e-commerce, influencer marketing, and generative AI continue to grow, more advertisers are relying on automated chatbots like ChatGPT to generate user reviews quickly.
This trend has led to consumers sometimes being misled by false praise or deceptive claims.
While fake reviews were already illegal under existing law, some companies have taken their own steps to combat the problem.
For instance, in July 2022, Amazon sued the administrators of more than 10,000 Facebook groups for allegedly facilitating fake reviews.
The new rule is supported by some of the largest online review platforms and marks a significant shift from self-regulation to stricter government oversight.
Aaron Schur, General Counsel of the major online review platform Yelp, noted:
“While Yelp's policies have long prohibited practices outlined in the FTC's final rule, we believe the enforcement of this new rule will improve the review landscape for consumers and help level the playing field for businesses.”
Instead of relying on the Department of Justice to prosecute individual cases, the FTC will now have streamlined authority to enforce the ban directly.
However, the question remains: will this rule effectively reduce fraudulent online engagement, or will it merely be a small step in addressing a much larger problem?