SpaceX’s orbital AI vision turns “basic math” into a supply-chain marathon

Is it possible to put a data center in orbit? Elon Musk contends that it is, and that the arithmetic is simple. In a memo to his staff, he framed the "basic math" as SpaceX building 100 gigawatts of AI processing power per year in orbit, a goal tied to the acquisition of xAI and the broader push to move large-scale compute out of the atmosphere.

Image credit: wikipedia.org

Analysts who tried to convert that ambition into hardware, launches, and capital arrived at figures that read less like a business plan than a national infrastructure program. A MoffettNathanson model of a 1 million-satellite constellation sketched an upper bound of $4 trillion to $5 trillion in capital expenditure per year if the orbital compute used off-the-shelf accelerators. The same framework implied a replacement cycle of roughly 200,000 satellites per year, which translates to an estimated 3,300 launches per year (almost nine a day) under publicly discussed mass and Starship lift assumptions for next-generation Starlink. Even if each ship flew 100 times before retiring, the model required at least 30 new rockets to be built annually, on top of the vehicles serving other missions. The friction point was not just rockets but factories, materials, test capacity, and what the author calls the "radically expanded" supplier ecosystem needed to keep the machine fed.
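The replacement-cadence arithmetic above can be sketched in a few lines. The satellite lifetime and per-launch capacity below are illustrative assumptions chosen to reproduce the model's reported outputs, not published inputs:

```python
# Back-of-envelope launch cadence for a 1,000,000-satellite constellation,
# in the spirit of the MoffettNathanson-style framing described above.
CONSTELLATION_SIZE = 1_000_000      # satellites on orbit
SATELLITE_LIFETIME_YEARS = 5        # assumed ~5-year replacement cycle
SATS_PER_LAUNCH = 60                # assumed next-gen Starlink-class payload per Starship
FLIGHTS_PER_ROCKET = 100            # reuse before retirement, per the model

replacements_per_year = CONSTELLATION_SIZE / SATELLITE_LIFETIME_YEARS
launches_per_year = replacements_per_year / SATS_PER_LAUNCH
launches_per_day = launches_per_year / 365
new_rockets_per_year = launches_per_year / FLIGHTS_PER_ROCKET

print(f"Replacements per year: {replacements_per_year:,.0f}")
print(f"Launches per year:     {launches_per_year:,.0f}")
print(f"Launches per day:      {launches_per_day:.1f}")
print(f"New rockets per year:  {new_rockets_per_year:.0f}")
```

With these assumptions the script recovers the article's figures: 200,000 replacement satellites per year, roughly 3,300 launches per year, about nine launches a day, and on the order of 30 new rockets built annually.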

This scale is easier to appreciate when contrasted with the alternative on the ground. McKinsey has estimated that, by 2030, global data centers will need a total of $6.7 trillion in investment to meet compute demand, with $5.2 trillion of that going to AI-capable facilities. That forecast assumes the industry can keep building power delivery, cooling, and chips quickly enough on Earth, an assumption already strained by grid limits and the sheer power density of current accelerators.

Supporters of space-based compute position orbit as the solution to congestion on land: fewer local permitting battles, no reliance on water cooling, and solar energy that doesn't have to be negotiated one substation at a time. But orbit swaps those constraints for others: radiation, thermal control, and reliability in an environment where a repair is not a routine undertaking.

Radiation, in particular, has been the silent tax on any "cloud in space" concept. The small but significant development is the emergence of shielding methods that allow newer commercial processors to survive long-duration missions. Cosmic Shielding, for example, claims that a nanocomposite enclosure has allowed an Nvidia Jetson Orin NX GPU to operate in orbit indefinitely and error-free in a cubesat demonstration with Aethero. If that result extrapolates to more capable hardware, the performance gap between radiation-hardened space parts and state-of-the-art commercial silicon narrows, one precondition for anything approaching data-center-scale AI in space.

Other players are not heading toward a million satellites. Project Suncatcher, a collaboration between Google and Planet, is designed as a two-spacecraft demonstration launching by early 2027, carrying Google TPUs and demonstrating high-bandwidth inter-satellite links and heat rejection. One architecture Google has proposed relies on closely spaced satellites, with one example using an 81-satellite cluster separated by 100-200 meters to achieve optical links fast enough for distributed machine-learning workloads.

The argument in the industry is thus no longer whether orbital compute is physically possible, but whether the economics will ever work. A back-of-the-envelope analysis by one analyst pegged orbital data centers at roughly three times the cost per watt of terrestrial ones in a base case, summing up the problem in a nutshell: "If you run the numbers honestly, the physics doesn't immediately kill it, but the economics are savage."

There is a second layer to the SpaceX story: orbit as a manufacturing bridge. Musk has also suggested that Starship will be able to land large volumes of cargo on the Moon, including factories capable of building satellites and launching them deeper into space. In that framing, orbital data centers are not the end; they are the cash engine and backbone of a space industrial base that requires abundant power, robust compute, and a supply chain that can function off Earth.
