Free ChatGPT is surprisingly expensive: here's why


  • ChatGPT’s enormous global user base has created infrastructure demands that far exceed subscription income
  • OpenAI’s rising compute and energy expenses are pushing the company to explore ads and new revenue models
  • OpenAI is reshaping its business model and strategy to sustain AI access as usage continues to grow

Running the world’s most widely used AI system is staggeringly expensive. ChatGPT's popular "free" tier is anything but free for OpenAI, which has been reshaping its business to keep pace with its own success.

ChatGPT is used by hundreds of millions of people, and each request costs computational time, electricity, water, and other resources across data centers. That means there is no such thing as a “free” request on the receiving end: the servers must work relentlessly to keep up with, and stay ahead of, demand.

Even with multiple subscription tiers and enterprise deals, the cost of sustaining global AI access at this scale has ballooned to around $17 billion a year, a figure that inevitably shapes nearly every one of OpenAI’s decisions.

A Washington Post analysis once calculated that generating a single 100-word AI email every week for a year would consume about 7.5 kilowatt-hours. Multiply that by hundreds of millions of weekly users, and the numbers quickly explode. Much of what people do with ChatGPT feels lightweight at the interface level, but the backend operations require powerful chips consuming large volumes of electricity.
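To see how quickly that single-user estimate scales, here is a minimal back-of-envelope sketch. The 7.5 kWh figure comes from the Washington Post analysis above; the 300 million weekly user count is a hypothetical round number standing in for "hundreds of millions", not an OpenAI disclosure.

```python
# Back-of-envelope scaling of the Washington Post energy estimate.
# Assumptions (illustrative only): 300M weekly users, each matching the
# one-100-word-email-per-week pattern at ~7.5 kWh per user per year.

KWH_PER_USER_PER_YEAR = 7.5           # per the Washington Post analysis
WEEKLY_USERS = 300_000_000            # hypothetical round figure

total_kwh = KWH_PER_USER_PER_YEAR * WEEKLY_USERS
total_gwh = total_kwh / 1_000_000     # 1 GWh = 1,000,000 kWh

print(f"~{total_gwh:,.0f} GWh per year")  # ~2,250 GWh
```

Even under these simplified assumptions, the toy total lands in the territory of a sizable power plant's annual output, which is why the energy bill dominates the economics.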


To manage that scale, OpenAI has undergone several structural transformations led by CEO Sam Altman. Founded in 2015 as a nonprofit designed to steward safe and beneficial AI, the organization eventually recognized that philanthropic funding alone could not sustain frontier-level research.

In 2019, OpenAI adopted a capped-profit model, leading to major backing from Microsoft, which now owns approximately 27% of the company, along with investments of billions of dollars from the likes of SoftBank and Nvidia. OpenAI is worth around $500 billion now, with speculation building that an initial public offering could come as early as later this year.

Ads for AI

Even with that backing, OpenAI must continue to generate substantial revenue from its consumer and business products. Subscription tiers bring in some money, but the $20-a-month ChatGPT Plus, $200-a-month ChatGPT Pro, and the Team and Enterprise tiers cover only a small slice of ChatGPT's total usage. Per-token API fees paid by companies added more than $20 billion in annualized revenue by 2025.


But that's not enough to keep up with infrastructure demands. Hence, the move into advertising in ChatGPT. The ads have begun rolling out for free users and those on the $8-a-month ChatGPT Go tier. These ads are labeled and separated from chat responses, but their presence signals OpenAI’s need to diversify revenue as compute expenses continue to soar.

For everyday users, the introduction of ads raises questions about how the product may evolve. Free access may be constrained over time, with more features placed behind subscription tiers. Ads could become more common for non-paying users. Businesses relying heavily on the API may see price changes as the company balances cost recovery with market competition.

The challenge OpenAI faces is not unique: the economics of generative AI differ from those of traditional consumer technology. When a social network grows, each additional user usually costs very little. Here, each new user can generate dozens or hundreds of costly computations per day.
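The marginal-cost contrast can be made concrete with a toy calculation. Both numbers below are hypothetical illustrations, not disclosed OpenAI figures: 50 queries per user per day sits within the "dozens or hundreds" range above, and the per-query inference cost is a placeholder.

```python
# Illustrative marginal serving cost for one free user.
# All figures are assumptions for the sketch, not real OpenAI numbers.

QUERIES_PER_USER_PER_DAY = 50     # within the "dozens or hundreds" range
COST_PER_QUERY_USD = 0.003        # hypothetical blended inference cost

annual_cost_per_user = QUERIES_PER_USER_PER_DAY * COST_PER_QUERY_USD * 365
print(f"${annual_cost_per_user:,.2f} per free user per year")  # $54.75
```

Under these toy assumptions, a single free user costs tens of dollars a year to serve while generating no subscription revenue, whereas an extra social-network user costs fractions of a cent. That asymmetry is the core of the economics described here.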

As AI becomes more embedded in everyday life, the cost of providing these capabilities will shape how companies design them. Users may see shifts in pricing, limits on free access, or incentives to upgrade. ChatGPT's move from research project to a global phenomenon offers a lens into how cutting-edge technology evolves from novelty to infrastructure. But it's worth remembering that behind every clever answer and helpful suggestion is a network of data centers humming away, consuming power, and costing someone, no matter what it says about being free.





Eric Hal Schwartz is a freelance writer for TechRadar with more than 15 years of experience covering the intersection of the world and technology. For the last five years, he served as head writer for Voicebot.ai and was on the leading edge of reporting on generative AI and large language models. He's since become an expert on the products of generative AI models, such as OpenAI’s ChatGPT, Anthropic’s Claude, Google Gemini, and every other synthetic media tool. His experience runs the gamut of media, including print, digital, broadcast, and live events. Now, he's continuing to tell the stories people want and need to hear about the rapidly evolving AI space and its impact on their lives. Eric is based in New York City.
