Google Chrome 'silently' downloads 4GB AI model to your device without permission, report claims — researcher says practice may violate EU law and waste gigawatt-hours of energy

Chrome (Image credit: Getty / NurPhoto)

Security researcher Alexander Hanff, also known as "That Privacy Guy," has published a new analysis claiming that Google Chrome is silently downloading a roughly 4GB on-device AI model (Gemini Nano) to users' machines without notice or consent. According to Hanff, the behavior mirrors a separate issue he recently identified involving Anthropic's desktop software, and together the two cases point to a broader pattern in how large tech companies deploy AI features.

Hanff's earlier report focused on Anthropic's Claude Desktop app, which he says quietly installed a browser integration bridge across multiple Chromium-based browsers on a system, including five browsers he did not even have installed. According to the researcher, this happened without any user prompt or meaningful disclosure, and the integration would reinstall itself if removed. He argues that this kind of silent modification of a user's environment violates both user expectations and, in his view, European privacy law.


Environmental cost of Gemini Nano deployment in Chrome

| Devices receiving the push | Total bytes pushed | Total energy | Total CO2e |
| --- | --- | --- | --- |
| 100 million (~3% of Chrome users) | 400 petabytes | 24 GWh | 6,000 tons CO2e |
| 500 million (~15% of Chrome users) | 2 exabytes | 120 GWh | 30,000 tons CO2e |
| 1 billion (~30% of Chrome users) | 4 exabytes | 240 GWh | 60,000 tons CO2e |

(Data above calculated by Alexander Hanff)

A key focus of Hanff's post is the environmental cost of silently distributing a 4GB AI model at global scale. If the file were pushed to hundreds of millions or billions of devices, Hanff estimates the emissions from distribution alone, before the model is ever used, could reach tens of thousands of tons of CO2 equivalent, comparable to the annual output of tens of thousands of cars. The estimate rests on debatable assumptions about scale and energy mix, but his broader point stands regardless of the exact math: pushing large binaries to user devices is not free, and the cost is externalized onto users and infrastructure.
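Hanff's figures follow from simple arithmetic. Backing out his implied assumptions, distribution costs roughly 0.06 kWh per gigabyte transferred and the grid emits about 250 g CO2e per kWh; both values are inferred here from his published numbers, not stated independently. A minimal sketch under those assumptions reproduces the table:

```python
# Reproduce Hanff's distribution-cost estimates for a ~4 GB model push.
# The per-GB energy (0.06 kWh/GB) and grid carbon intensity (250 g CO2e/kWh)
# are assumptions backed out of his published figures, not independent data.

MODEL_SIZE_GB = 4
KWH_PER_GB = 0.06        # assumed transfer energy intensity
G_CO2E_PER_KWH = 250     # assumed grid carbon intensity

def distribution_cost(devices: int) -> dict:
    """Return total data, energy, and emissions for pushing the model."""
    total_gb = devices * MODEL_SIZE_GB
    total_kwh = total_gb * KWH_PER_GB
    return {
        "petabytes": total_gb / 1e6,                    # 1 PB = 1e6 GB
        "gwh": total_kwh / 1e6,                         # 1 GWh = 1e6 kWh
        "tons_co2e": total_kwh * G_CO2E_PER_KWH / 1e6,  # 1 t = 1e6 g
    }

for devices in (100_000_000, 500_000_000, 1_000_000_000):
    c = distribution_cost(devices)
    print(f"{devices:>13,} devices: {c['petabytes']:,.0f} PB, "
          f"{c['gwh']:,.0f} GWh, {c['tons_co2e']:,.0f} t CO2e")
```

At 100 million devices this yields 400 PB, 24 GWh, and 6,000 tons CO2e, matching the first row of Hanff's table; the other rows scale linearly with device count.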


For many users, the more immediate concern is bandwidth. A 4GB download is trivial on an unlimited fiber connection, but that is far from the global norm, and not even universal in the United States. For users whose data is capped, metered, or expensive, including much of the developing world, silently transferring gigabytes of data can have real financial consequences. Even in developed markets, users on mobile hotspots or rural connections may feel the impact acutely. Hanff argues that downloading files of this size without clear notice or consent crosses a clear line, regardless of the feature being delivered.

Taken together, the two cases reinforce a familiar criticism of large technology platforms. According to Hanff, both Anthropic and Google acted first and left users to discover the consequences later. Whether it is silently registering deep system integrations (in the case of Claude Desktop) or downloading multi-gigabyte AI models in the background, the pattern is the same: the user's device is being treated as a deployment target rather than something the user actively controls. That framing may sound harsh, but it aligns with long-standing complaints about "dark patterns" in software design. Features that benefit the platform at the user's cost are enabled by default, buried behind obscure settings, or implemented in ways that make them difficult to remove. Hanff's reporting suggests that the shift toward on-device AI is not changing that dynamic, and in fact may be accelerating it.

Google has not publicly responded in detail to Hanff's findings at the time of writing, and the company may argue that these downloads are tied to legitimate product features and improve privacy by keeping AI processing local. Even so, the core question remains unresolved. If a browser is going to download gigabytes of data onto a user's machine, should that require an explicit opt-in? Hanff's answer is clearly yes. Whether regulators or users ultimately agree may determine how far companies can push this kind of behavior in the future.


Zak is a freelance contributor to Tom's Hardware with decades of PC benchmarking experience who has also written for HotHardware and The Tech Report. A modern-day Renaissance man, he may not be an expert on anything, but he knows just a little about nearly everything.
