Micron CEO Sanjay Mehrotra said that cars will eventually require more than 300GB of RAM as automakers introduce vehicles with L4 autonomy. According to The Register, Mehrotra made the remark after Micron released its quarterly earnings report, in which the company posted $23.86 billion in revenue for the second quarter of this year — a nearly 200% jump from the $8.03 billion it posted in 2Q25. This massive jump is still driven by the incredible demand for premium HBM chips from AI hyperscalers, combined with “structural supply constraints and Micron’s strong execution across the board.”
As the company rakes in cash from the AI infrastructure build-out, it’s also expanding its output with several planned fabs in Japan and Singapore, and even a “megafab” in New York. These projects are expected to come online between 2028 and 2029, and Mehrotra said the company is looking to boost output by 20% in 2026, which could help alleviate some of the pressure on the supply side. However, even as these new factories start production, Mehrotra predicts that a new market will demand massive amounts of high-speed memory: self-driving cars.
There are six levels of vehicle autonomy, starting at L0 for cars with no driving automation whatsoever. A vehicle with a single automated system (such as cruise control) counts as L1, while those equipped with advanced driver assistance systems (ADAS) that control both steering and acceleration, such as Tesla’s Autopilot and Cadillac’s Super Cruise, are considered L2. Vehicles with L4 autonomy, on the other hand, can handle virtually any driving task without human intervention, from overtaking to deciding when to cross a busy intersection, though the driver still has the option to take control and drive the vehicle manually.
Nvidia announced that it’s working with Chinese carmakers BYD and Geely and Japanese marques Isuzu and Nissan to adopt the Nvidia Drive Hyperion platform, the AI chip maker’s end-to-end autonomous vehicle platform meant to deliver an L4 system to car manufacturers. Since this is an AI system, it will likely demand a lot of high-speed memory to run effectively.
Most modern vehicles require at least 16GB of memory, but if carmakers introduce L4 autonomy, those vehicles will need far more RAM. We’ve seen this dynamic before with the shortage of high-end Macs with up to 512GB of Unified Memory, as many users have become interested in running the likes of OpenClaw on their own systems. It has even gotten to the point that Apple pulled the $4,000 512GB Mac Studio from its online store and raised the price of the 256GB version to $2,000. So, if carmakers start churning out hundreds of thousands, if not millions, of vehicles with AI-powered driverless features, Micron expects demand for automotive memory to pick up as well.
This might take some time to happen, though, especially as vehicles with these features are quite expensive and regulations haven’t quite caught up with L4 autonomy just yet. But if and when people start buying cars like these, we hope memory chip makers have excess capacity to absorb the increase in demand. Otherwise, we might see another round of memory chip shortages, this time driven by AI supercomputers on wheels.