Altman’s Abandoned Bid to Fuse AI and Space Rocketry

What’s harder: creating a fully reusable orbital rocket, or getting a rocket startup to relinquish control? For Sam Altman, the answer in 2025 was both. The OpenAI CEO quietly pursued a multibillion-dollar equity deal to acquire or dominate Stoke Space, a Kent, Wash.-based company founded by ex-Blue Origin engineers. The plan was audacious: make Stoke’s Nova rocket the launchpad for an entirely new frontier, AI computing infrastructure in orbit, while opening a fresh front in his escalating rivalry with Elon Musk.


Discussions began over the summer and accelerated through the fall, with a proposed structure of successive investments eventually reaching into the billions. Controlling ownership of Stoke would have given Altman a shortcut into the launch market, sparing him the decade-long grind most rocket startups endure before reaching flight readiness. Nova, slated for an inaugural launch in 2026, is designed for full reusability, a feat that so far only SpaceX has achieved at orbital scale. The engineering challenge is to survive the brutal thermal and mechanical stresses of re-entry while keeping turnaround times short enough to make launches economically viable.

Altman’s interest in rockets is inseparable from his vision of space-based AI computing. He has warned that Earth-bound data centers will sooner or later hit scaling and energy limits. In orbit, solar energy is ample and uninterrupted, especially in dawn-dusk Sun-synchronous orbits, where arrays remain in near-constant sunlight. Without atmospheric losses or night cycles, panels can generate far more power per square meter than on Earth. The vacuum complicates thermal management, however: with no air for convection, radiators must handle all heat removal, and NASA studies show they can account for more than 40% of a spacecraft’s total power system mass at high output levels.
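A rough back-of-envelope calculation illustrates the solar advantage described above. The figures below are illustrative assumptions, not numbers from the article: the solar constant above the atmosphere, a near-continuous sunlight fraction for a dawn-dusk orbit, and a typical capacity factor for a good terrestrial site.

```python
# Sketch: average solar power per square meter in a dawn-dusk
# Sun-synchronous orbit versus a good ground site.
# All inputs are rough illustrative assumptions.

SOLAR_CONSTANT = 1361          # W/m^2, irradiance above the atmosphere
ORBIT_SUN_FRACTION = 1.0       # dawn-dusk SSO: near-continuous sunlight

GROUND_PEAK = 1000             # W/m^2, clear-sky surface irradiance
GROUND_CAPACITY_FACTOR = 0.25  # night, weather, sun angle (good site)

orbit_avg = SOLAR_CONSTANT * ORBIT_SUN_FRACTION    # ~1361 W/m^2
ground_avg = GROUND_PEAK * GROUND_CAPACITY_FACTOR  # ~250 W/m^2

print(f"Orbit average:  {orbit_avg:.0f} W/m^2")
print(f"Ground average: {ground_avg:.0f} W/m^2")
print(f"Advantage: {orbit_avg / ground_avg:.1f}x")
```

Under these assumptions an orbital array collects roughly five times the energy per square meter of a terrestrial one, before accounting for radiator mass and launch cost.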

The idea is gaining traction among tech titans. Musk has hinted that future Starlink satellites could host AI data centers, and Jeff Bezos has expressed similar ambitions. Google’s Project Suncatcher would deploy an 81‑satellite solar‑powered AI cluster in low Earth orbit by 2027, using laser links to split workloads across nodes. Each satellite will carry TPU chips that, in lab tests, survived radiation doses nearly three times what they will face in orbit. But scaling such systems requires solving orbital networking, maintenance, and debris avoidance, especially in congested orbital shells where collision risk is edging toward Kessler syndrome.

To Altman, Nova was more than just a rocket: it was a delivery mechanism for off‑world compute clusters. According to the startup Starcloud, launching AI data centers into orbit could cut carbon emissions roughly tenfold compared with terrestrial equivalents, even factoring in launch emissions. This matters as AI’s energy appetite surges: US data centers consumed over 4% of national electricity in 2023 and may reach 12% by 2028. Google alone used 30.8 million megawatt-hours for data centers last year, more than double its 2020 figure. Yet the path to orbital AI infrastructure has its pitfalls. Space debris travels at roughly 17,500 mph, and even a blueberry‑sized fragment can destroy a satellite.
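The growth rate implied by Google’s doubling is easy to check. The four-year window below is an assumption (2020 to 2024, inferred from “last year”), and “more than double” is treated as a lower bound:

```python
# Implied annual growth if Google's data-center electricity use at
# least doubled between 2020 and 2024 (window assumed from context).

use_latest_mwh = 30.8e6  # reported: 30.8 million MWh
growth_factor = 2.0      # "more than double" -> at least 2x
years = 4                # assumed 2020 -> 2024

cagr = growth_factor ** (1 / years) - 1
print(f"Implied growth: at least {cagr:.1%} per year")
```

That works out to at least roughly 19% compound annual growth, which is the trajectory driving interest in off-world power.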

Dense constellations like Suncatcher’s, with nodes spaced under 200 meters apart, would require active collision‑avoidance systems capable of synchronized maneuvers, a capability not yet standard. In the first half of 2025, SpaceX’s Starlink fleet performed 144,404 avoidance maneuvers, a number that underlines the operational burden. Without autonomous “reflexes” to dodge debris, a single impact can cascade into catastrophic losses. Thermal management, radiation hardening, and high‑bandwidth laser communication are similarly formidable: optical links must hold precise alignment between fast‑moving satellites despite orbital drift, while avoiding the atmospheric interference that degrades ground links.
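The Starlink figure above can be converted into a per-satellite rate. The fleet size is an assumption (roughly 7,500 active satellites in that period); only the maneuver count comes from the article:

```python
# Per-satellite maneuver rate implied by the reported Starlink figure.

maneuvers = 144_404  # reported avoidance maneuvers, H1 2025
fleet = 7_500        # assumed active Starlink satellites (not from article)
days = 181           # January through June 2025

per_sat_per_day = maneuvers / fleet / days
interval_days = 1 / per_sat_per_day

print(f"{per_sat_per_day:.3f} maneuvers per satellite per day")
print(f"About one maneuver per satellite every {interval_days:.0f} days")
```

Under these assumptions, each satellite dodges something roughly every nine to ten days, a cadence that only autonomous systems can sustain at constellation scale.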

In‑orbit maintenance requires robotic servicing or replacement missions, each adding significant cost and complexity. Economic viability depends on launch costs falling below US$200 per kilogram by the mid‑2030s, roughly seven to eight times lower than today’s rates. Altman’s infrastructure push on Earth is already staggering: OpenAI’s cloud commitments approach $600 billion, spread across AWS, Microsoft, Oracle, CoreWeave, and even Google Cloud. These deals lock in hundreds of thousands of Nvidia GPUs and tens of millions of CPUs, forming a multi‑cloud lattice designed to keep AI training uninterrupted.

Owning a rocket company would have extended that control beyond the atmosphere, integrating launch capability directly into OpenAI’s compute supply chain. The talks with Stoke Space have ended, but the strategic logic remains. In the race to dominate the physical infrastructure of AI, whether on Earth or in orbit, the ability to deploy compute anywhere may define this decade’s victors. For now, Altman’s bid to fuse AI and rocketry is on hold, leaving Musk, Bezos, and Pichai to advance their own orbital ambitions.
