Musk insists that 'the lowest cost way to generate AI compute will be in space' within three years after SpaceX acquired xAI, but that timeline is more science fiction than strategy

(Image: A SpaceX rocket over Earth next to Elon Musk at the 2025 U.S.-Saudi Investment Forum. Credit: Getty Images)

Elon Musk has added another line to his history of technological predictions that sail far beyond optimistic and into the delusional. As part of announcing the acquisition of his xAI company by (the also Musk-run) SpaceX, he declared that not only was space ideal as a cheap location for running AI servers, but that it would happen faster than most kitchen renovations on Earth.

“My estimate is that within 2 to 3 years, the lowest cost way to generate AI compute will be in space,” Musk wrote in the announcement. “This cost-efficiency alone will enable innovative companies to forge ahead in training their AI models and processing data at unprecedented speeds and scales, accelerating breakthroughs in our understanding of physics and the invention of technologies to benefit humanity.”

But a grand, interplanetary vision isn’t the same thing as a realistic business plan – especially not one that delivers within 36 months. The infrastructure isn’t ready. Merging an AI company with a rocket company doesn’t fast-forward the Earth’s rotation. If you believe Musk will have AI data centers in orbit before 2030, I've got a used Tesla humanoid robot to sell you.

Imaginary booster rockets

Musk’s point isn’t entirely unfounded. Space offers near-constant solar power, a cold sky to radiate waste heat into, and the ultimate perk for remote work: zero zoning restrictions. Data centers are energy-devouring creatures, sucking up power, land, and water, and sparking political battles.

Meanwhile, in orbit, you’re above the clouds and below the radar. No utility bills. No water rights battles. There are many reasons to be intrigued by orbital compute. But there are many more to be skeptical of its imminent arrival.

Even assuming record-setting launch schedules in which every flight succeeds, getting mass to orbit still isn’t cheap. Launching a full data center’s worth of equipment into space, complete with radiation shielding, thermal management, fault tolerance, and redundancy, is not something that can be done affordably on any timeline under a decade. And that assumes zero maintenance or upgrades. Terrestrial centers swap out dead GPUs like old lightbulbs. Up there, your only hope is robotic servicing or tons of redundancy.
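To put rough numbers on the launch problem, here is a back-of-the-envelope sketch. Every figure in it is an illustrative assumption, not a number from SpaceX, xAI, or the announcement: suppose a GPU rack plus its power, cooling, and shielding hardware masses about 1,500 kg, a modest training cluster needs about 1,000 racks, and launch to low-Earth orbit runs roughly $1,500 per kilogram.

```python
# Back-of-the-envelope launch-cost sketch. Every number below is an
# illustrative assumption, not a figure from SpaceX, xAI, or the announcement.

cost_per_kg_usd = 1_500   # assumed launch cost to low-Earth orbit, in $/kg
rack_mass_kg = 1_500      # assumed mass of one GPU rack with power, cooling, and shielding
rack_count = 1_000        # assumed size of a modest AI training cluster

launch_cost_usd = cost_per_kg_usd * rack_mass_kg * rack_count
print(f"Launch cost alone: ${launch_cost_usd / 1e9:.2f} billion")
# -> Launch cost alone: $2.25 billion
```

And that covers only the ride to orbit; radiation hardening, on-orbit assembly, thermal radiators, ground links, and replacement launches for failed hardware all sit on top of it. Even if future launch vehicles cut per-kilogram costs by an order of magnitude, the bill only begins to approach the cost of pouring concrete and laying fiber on Earth.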


And all that solar energy comes with plenty of less enticing radiation. Cosmic rays, solar flares, and the general hostility of space are not side issues. They’re central to why radiation-hardened satellite electronics are expensive and generations behind commercial chip design. GPUs built for inference and training are fragile, and they aren’t designed to operate outside the protection of Earth’s atmosphere and magnetic field.

Not to mention the space trash. Putting thousands of compute satellites into low-Earth orbit risks a cascade of collisions. SpaceX is already dominant in orbital traffic, and layering a second orbital network of AI computers on top could draw significant regulatory and environmental backlash, even without the constant danger of crashes.

Decades, not years

As a long-term plan, space data centers could be a great option. They could offload pressure from power grids, avoid zoning fights, and scale globally without boiling local lakes. The physics aren’t impossible, but the equations translate into complex, difficult, expensive engineering. Three years for a functioning AI data center in orbit is not a serious timeline, and claims that it will happen on that schedule shouldn’t be taken seriously.

Not because people don’t want orbiting AI data centers to happen, but because large-scale infrastructure, especially in space, requires patience, iteration, and a willingness to admit when Earth is still the better option. Admitting mistakes and backing down from grandiose fever dreams are not among Musk’s habits. But, like his robots, his fleet of self-driving cars, and his video game prowess, orbiting AI centers on this timeline are laughable nonsense. Give the project to real engineers, ask them for a real timeline, and we’ll see how the first satellites are doing in a decade or so.





Eric Hal Schwartz is a freelance writer for TechRadar with more than 15 years of experience covering the intersection of the world and technology. For the last five years, he served as head writer for Voicebot.ai and was on the leading edge of reporting on generative AI and large language models. He's since become an expert on generative AI products such as OpenAI’s ChatGPT, Anthropic’s Claude, Google Gemini, and other synthetic media tools. His experience runs the gamut of media, including print, digital, broadcast, and live events. Now, he's continuing to tell the stories people want and need to hear about the rapidly evolving AI space and its impact on their lives. Eric is based in New York City.

