The fact that two of Elon Musk’s companies are merging has little to do with accounting, resource allocation, efficiency or any of the typical corporate speak one often hears. The reason for bringing together SpaceX and xAI, Musk says, is to build orbital data centres, or simply put, data centres in space. The cornerstone of success would be effectively harnessing the sun’s energy to power these space-based data centres. As Musk admits in a note released this week, global electricity demand on Earth for artificial intelligence (AI) simply cannot be met, even in the near term, without imposing hardship on communities and the environment.

Musk believes the combined strengths of SpaceX and xAI can launch a constellation of a million satellites that come together as orbital data centres, beginning the process of becoming a Kardashev II-level civilisation, one that can harness the Sun’s full power. At its core, this remains a pitch for AI, of course, which, as the official statement puts it, means “supporting AI-driven applications for billions of people today and ensuring humanity’s multi-planetary future.”
The Space Bureau of the US Federal Communications Commission (FCC) has accepted SpaceX’s Orbital Data Center application. In a post on X earlier today, FCC chairman Brendan Carr wrote, “The proposed system would serve as a first step towards becoming a Kardashev II-level civilisation and serve other purposes, according to the applicant.”
A Kardashev II-level civilisation, simply put, is an idea that emerges from the Kardashev Scale, proposed in 1964 by Soviet astrophysicist Nikolai Kardashev as a way to classify civilisations by energy consumption, not culture or intelligence. Musk, and indeed the FCC chairman, are referring to a hypothetical civilisation capable of capturing and using most or all of the energy output of its parent star.
“In the long term, space-based AI is obviously the only way to scale. To harness even a millionth of our Sun’s energy would require over a million times more energy than our civilisation currently uses! The only logical solution therefore is to transport these resource-intensive efforts to a location with vast power and space,” he explains. For SpaceX, volume isn’t a problem; launching thousands of Starlink satellites into Low Earth Orbit (LEO) has become routine.
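Musk’s energy comparison can be sanity-checked with rough public figures. A minimal sketch, where the solar luminosity and global consumption values are standard approximate estimates, not figures from his note:

```python
# Rough sanity check of the "millionth of the Sun" comparison.
# Both constants are approximate public estimates, not from Musk's note.
SUN_LUMINOSITY_W = 3.8e26      # total power output of the Sun, in watts
WORLD_AVG_POWER_W = 1.9e13     # ~600 exajoules/year of primary energy, averaged

one_millionth_of_sun_w = SUN_LUMINOSITY_W / 1e6
ratio = one_millionth_of_sun_w / WORLD_AVG_POWER_W

print(f"One millionth of solar output: {one_millionth_of_sun_w:.1e} W")
print(f"Roughly {ratio:.1e}x humanity's current average power use")
```

At roughly 2 × 10⁷, even a millionth of the Sun’s output is indeed millions of times our current consumption, which is the gap the Kardashev framing gestures at.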
To that point, SpaceX expects Starship to begin delivering the more powerful V3 Starlink satellites in 2026, along with the next generation of direct-to-mobile satellites, which promise complete cellular coverage everywhere on Earth.
Elon Musk estimates that within two to three years, the lowest-cost way to generate AI compute will be in space, and that cost efficiency alone will push companies to train their AI models and process data at speeds and scale not seen previously. “The basic math is that launching a million tons per year of satellites generating 100 kW of compute power per ton would add 100 gigawatts of AI compute capacity annually, with no ongoing operational or maintenance needs. Ultimately, there is a path to launching 1 TW/year from Earth,” he says.
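The arithmetic in Musk’s quote is straightforward to verify. A quick sketch using only the figures he cites:

```python
# Verify the compute-capacity math from Musk's statement.
TONS_PER_YEAR = 1_000_000        # satellites launched per year, in tons
COMPUTE_KW_PER_TON = 100         # compute power per ton, as quoted

added_kw = TONS_PER_YEAR * COMPUTE_KW_PER_TON   # 1e8 kW added annually
added_gw = added_kw / 1e6                       # convert kW to GW

# The "1 TW/year" ambition at the same power density:
tons_for_1_tw = 1e9 / COMPUTE_KW_PER_TON        # 1 TW = 1e9 kW

print(f"Annual added capacity: {added_gw:.0f} GW")
print(f"Tons/year needed for 1 TW: {tons_for_1_tw:,.0f}")
```

So the quoted 100 GW per year checks out, and hitting 1 TW per year would mean launching roughly ten million tons annually, ten times the stated figure, assuming the same 100 kW-per-ton density.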
Speaking of numbers, a small enterprise data centre requires, on average, up to 2 megawatts (MW) of power, while a mid-sized data centre, usually used for shared infrastructure and regional hub setups, can use up to 20MW. Hyperscale data centres, which cloud services and AI compute require, can consume between 150MW and 500MW. As a rule of thumb, 1MW is enough to power around 1,000 average homes in any country. In terms of weight, something the Orbital Data Center mission will have to contend with at the very foundation, hyperscale data centres can weigh 300,000 tonnes or more.
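Setting those terrestrial figures alongside Musk’s orbital number gives a sense of scale. The sketch below is purely illustrative arithmetic using the ranges quoted above:

```python
# Compare terrestrial data-centre figures (from the article) with the
# 100 GW/year orbital figure Musk cites. Illustrative arithmetic only.
HYPERSCALE_MW_MAX = 500          # top of the quoted hyperscale range
HOMES_PER_MW = 1000              # rule-of-thumb from the article
ORBITAL_GW_PER_YEAR = 100

orbital_mw = ORBITAL_GW_PER_YEAR * 1000          # convert GW to MW

hyperscale_equivalents = orbital_mw / HYPERSCALE_MW_MAX
homes_equivalent = orbital_mw * HOMES_PER_MW

print(f"100 GW/year is the power draw of ~{hyperscale_equivalents:.0f} large hyperscale centres")
print(f"...or enough to power ~{homes_equivalent:,.0f} average homes")
```

By those rough measures, one year of launches at Musk’s stated rate would match the power draw of around 200 of the largest hyperscale data centres.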
In fact, recent trends suggest AI workloads are pushing the need for larger power infrastructure: bigger substations, heavier and more capable cooling systems that consume a lot of water, and a premium on land. On the face of it, Musk’s idea bypasses the challenges data centres have faced on Earth.
AI, the costs and returns conundrum
Nevertheless, there has been a lot of conversation around data centre investments, particularly in the US. Google plans a new $15 billion AI-focused data centre campus in Andhra Pradesh, billed as its largest in India. In the US, however, there have been multiple protests and instances of local resistance against data centres, including in Arizona, Pennsylvania and Michigan, with as many as 25 projects cancelled in 2025 due to this resistance. Citizens are concerned that these data centres will drive up energy and water use in their regions, straining local supply, lowering standards of living and raising environmental concerns.
A statistic that shouldn’t be ignored: while SpaceX is profitable, having generated $8 billion in profit on an estimated $16 billion in revenue last year, xAI is reportedly spending as much as $1 billion a month to compete in an ecosystem that also has heavy investments from Google, Anthropic and OpenAI. The latter often finds itself in ‘circular financing’ conversations, given recent investments by companies including Nvidia and Oracle, with which OpenAI does business in return. ChatGPT will soon include ads for free users and those on the Go subscription tier.
It remains to be seen how investors feel about these investment plans, and what the larger taxpayer base thinks of these big numbers. There are fears of a bubble that may be about to deflate, and the sheer amount of money changing hands is the reason for economic concern in such a scenario. Last month, US Senator Elizabeth Warren (D-MA), in a letter to OpenAI CEO Sam Altman, sought assurances that OpenAI won’t seek a government bailout if it doesn’t turn a profit. The concern, as she wrote, is that OpenAI “has committed to more than a trillion dollars in spending despite not yet turning a profit”. In November last year, CFO Sarah Friar had suggested taxpayers would “backstop” the massive infrastructure spending.
Microsoft CEO Satya Nadella recently said that AI must “do something useful” if it is to continue enjoying social permission for its heavy electricity consumption and infrastructure requirements. In a recent quarterly earnings call, Microsoft insisted its Copilot AI user base has grown “nearly 3x year-on-year”, but The Register reports that only 3.3% of Microsoft 365 and Office 365 users actually pay for the AI subscription. That sits at odds with Microsoft’s infrastructure expenditure through 2025, which touched $88.7 billion, up from an $80 billion projection.