Fake product reviews waste people’s time and money. They also dilute and distort competition among sellers.
Regulators around the world are finally taking action after years of malpractice and fraud. In the U.S., fake reviews are now illegal under federal law and the Federal Trade Commission (FTC) can seek civil penalties against violators.
In the UK, the Competition and Markets Authority (CMA) is also trying to clamp down on the practice, calling on platforms - including Amazon - to do more and banning fake reviews under the Digital Markets, Competition and Consumers Act 2024.
In response, Amazon recently promised it will enhance its existing systems to spot and take down fake reviews, including banning users who post fake reviews and punishing sellers who cheat the system.
It may seem that both countries are taking a similar approach, but there is one very important difference. The FTC is clear that AI-generated reviews come under its definition of ‘fake reviews’. The CMA, on the other hand, doesn’t mention the words ‘AI’, ‘Artificial Intelligence’ or ‘LLMs’ anywhere in its 109-page draft guidance. Neither does the DMCC Act, which aims to protect consumer rights and competition in digital markets.
Do UK regulators count reviews with a high probability of being AI-written as fake? It’s hard to tell. If they don’t, this could be a huge missed opportunity to tackle what I believe is going to be a massive problem, very soon. If they do, further clarification is urgently needed.
Co-founder and CEO of Pangram Labs.
AI may break customer trust
The proliferation of AI-generated reviews has the potential to break trust in the customer review system once and for all. Amazon was the first online retailer to introduce a review feature, but the system is now threatened by large language models (LLMs) like ChatGPT, Claude and Gemini. And this could be deeply damaging for the e-commerce giant and the wider industry.
Our mission at Pangram Labs is to mitigate new issues caused by the coming wave of powerful generative AI models and ensure they are a net positive for the world. So, we tasked our researchers with investigating the scale at which AI-generated reviews are appearing on Amazon.com, and the results surprised us.
We analysed nearly 30,000 customer reviews for 500 best-selling products across 10 categories including beauty, baby, health, appliances and furniture. And we found widespread evidence of AI-generated customer reviews.
Baby products are the most likely to have AI-generated reviews – 5.2% of the 3,037 reviews analysed contained text produced by LLMs. This has implications for parents using “Earth’s biggest baby wish list”, with many relying on the platform’s Prime service for quick delivery of vital supplies like nappies, wipes, and formula.
‘Beauty’ and ‘Wellness & Relaxation’ categories closely follow in second and third place - 5% and 4.4%, respectively. In these categories the power of influence is strong, with established and challenger brands competing for buyers’ attention in a very saturated market.
If you’re shopping for the garden, you’re least likely to encounter an AI-generated review on the top 50 best-selling products – just 1.7% of reviews are currently AI-generated.
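The per-category percentages above come from scoring each review with an AI-text detector and aggregating the flags. As a rough illustration of that aggregation step (not our actual pipeline - the detector here is a stand-in placeholder, and a real one is a trained classifier, not a keyword check):

```python
from collections import defaultdict

def ai_probability(review_text: str) -> float:
    """Placeholder for an AI-text detector. A real detector is a trained
    model returning a probability that the text is machine-written; this
    stub exists only so the aggregation below is runnable."""
    return 0.9 if "as a language model" in review_text.lower() else 0.1

def percent_ai_by_category(reviews, threshold=0.5):
    """reviews: iterable of (category, review_text) pairs.
    Returns {category: percent of reviews flagged as likely AI-written}."""
    counts = defaultdict(lambda: [0, 0])  # category -> [flagged, total]
    for category, text in reviews:
        counts[category][0] += int(ai_probability(text) >= threshold)
        counts[category][1] += 1
    return {cat: round(100.0 * flagged / total, 1)
            for cat, (flagged, total) in counts.items()}
```

With a stronger detector plugged in, the same aggregation yields figures like the 5.2% for baby products quoted above.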
AI-generated reviews are likely to increase
While the overall percentage of AI-written reviews was quite low (though we’d argue still concerning) at just over 5% in some product categories, we discovered that positive reviews are the most likely to be written using AI, while negative reviews are more likely to be human-written. 74% of the AI-generated reviews identified in the study were 5-star, compared with 56% of non-AI reviews. Only 10% of AI-generated reviews were 1-star, versus 21% of non-AI reviews.
What’s more, 93% of the reviews suspected of being fully or partially AI-generated were ‘verified purchases’. This means they have a stamp of approval from Amazon as being published by legitimate buyers, increasing their sense of authenticity and trustworthiness.
As LLMs become more widely available and cheaper to use, the issue of AI-generated reviews is only going to get bigger, so it’s important that regulators address this early on, and retailers start to take action now.
AI tools analyse existing review data to produce new, human-like content that mimics genuine customer feedback in seconds. In some cases, sellers use them to boost their product’s star rating and influence buyers’ purchasing decisions, especially if authentic customer feedback is sparse or overly negative. A seller can feed an LLM existing positive feedback as an example to replicate, or prompt it to generate a review from information the seller supplies, such as the product’s benefits and unique features.
In other cases, customers may use LLMs to write reviews for them if they are short of time or unsure what to say. But this can misrepresent their real experience of a product and mislead others. While this practice isn’t malicious in intent, it shouldn’t be encouraged either.
Protecting the online buying experience
The findings of our study also call into question the usefulness of Amazon’s AI-generated customer review highlights, introduced in 2023. This latest innovation provides a short paragraph on the product detail page for customers to understand at a glance the common themes across hundreds or even thousands of published reviews, including product features and buyer sentiment. If the percentage of phony AI reviews increases on product pages, the AI overview becomes essentially useless.
To help protect the online buying experience and the wider internet from inauthentic and inaccurate information, I urge regulators and platforms including Amazon to take bold action to stop AI-generated reviews being published. The first step is to acknowledge this is a problem that is only going to get bigger. The second is to make sure the definition of a fake review includes those that are AI-generated. And if you’re an Amazon customer leaving a product review, please don’t outsource the task to an LLM.
This article was produced as part of TechRadarPro's Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro