Data Centers in Space
Think Off Planet
“How can the sky be the limit when there’s footprints on the moon?”
The last decade taught us that data centers are the new power plants. They turn electrons into intelligence. They are the silent engines behind every model, simulation, and transaction. Yet as their energy demands grow, the limits of Earth’s infrastructure become obvious.
Perception is reality. The current perception is that electricity prices will rise to feed data centers, and that those data centers will put Americans out of work. Even though this isn't accurate, the perception itself will stand in the way of meeting our energy and compute needs.
Space changes these constraints.
This conversation isn't just speculative. It's commercial, test-flown, and funded. The movement gained speed with Google's Project Suncatcher, a collaboration whose aim is to tap the Sun's output, more than one hundred trillion times humanity's total electricity production, to power data centers in orbit.
In partnership with Planet, they plan to launch prototype satellites by early 2027, carrying TPU AI chips designed for high-density compute in constant sunlight.
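As a rough sanity check on that "one hundred trillion times" figure, the arithmetic holds up. The sketch below uses commonly cited approximations for the Sun's total power output and humanity's annual electricity production (both are my assumptions, not numbers from the announcement):

```python
# Back-of-envelope check of the "one hundred trillion times" figure.
# Assumed, commonly cited approximations (not figures from the announcement):
#   - total power radiated by the Sun: ~3.8e26 W
#   - humanity's electricity production: ~30,000 TWh per year

SOLAR_OUTPUT_W = 3.8e26          # watts
ANNUAL_ELECTRICITY_TWH = 30_000  # terawatt-hours per year
HOURS_PER_YEAR = 8_766           # 365.25 days * 24 h

avg_electric_power_w = ANNUAL_ELECTRICITY_TWH * 1e12 / HOURS_PER_YEAR  # ~3.4e12 W
ratio = SOLAR_OUTPUT_W / avg_electric_power_w

print(f"Humanity's average electric power: {avg_electric_power_w:.2e} W")
print(f"Sun-to-humanity ratio:             {ratio:.1e}")  # ~1.1e14, i.e. ~100 trillion
```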
As of this writing, that announcement marks the point where off-planet computing stops being a curiosity and starts resembling an industry roadmap.
Starcloud, NVIDIA, and Good Will Hunting
Suncatcher isn’t alone. Starcloud, after its partnership announcement with NVIDIA and the successful launch of its own prototype, ignited a global debate about whether off-planet compute makes sense at all.
Critics mocked it as a vanity stunt, an expensive way to move heat into the void. Supporters countered that this is how every technological inflection starts: by overreaching, failing, learning, and iterating until the economics align.
Andrew McCalip of Varda posted a goated Good Will Hunting breakdown that still manages to be technically sharp. He points out that even if orbital compute seems inefficient today, the energy surplus in space and the falling cost of launch mean the crossover point might come sooner than expected.
A New Kind of Factory
Every industrial era builds its own kind of factory. The 19th century had steam and steel. The early 20th century had the Ford assembly line, where energy, labor, and mechanical precision converged into mass production.
The 21st century's factories are data centers. They do not stamp metal or refine oil. They refine data into cognition. Each rack of GPUs runs its own assembly line of inference, shaping synthetic neurons instead of steel.
The expansion of compute into orbit resembles the geographic expansion of industry in the early modern period. Just as proximity to rivers or coal once defined economic power, proximity to abundant energy and cooling capacity now defines it.
The implications are staggering. Control over orbital compute could soon mirror control over oil reserves in the last century. The same dynamics of power, regulation, and scarcity that shaped the 20th century’s geopolitics may reappear, only now centered around compute density, orbital slots, and energy transmission bandwidth.
Musk, the Pentagon, and the Expanding LEO Economy
It didn’t take long for Elon Musk to join the fray. SpaceX will also be launching orbital data centers, he confirmed.
The company's experience with reusable launch, Starlink's communication backbone, and Starfactory's manufacturing scale add up to a potential end-to-end infrastructure for off-world compute.
Coupled with xAI's Grok foundation model, it's easy to see a vertically integrated power and data giant emerging onto the field.
Meanwhile, traditional satellite projects are pulling massive government capital into the same orbit. The Wall Street Journal reports that SpaceX is set to win a $2 billion Pentagon contract as part of the Golden Dome project.
The military sees in these systems the same things cloud companies see: resilience, redundancy, and energy independence. Space is no longer the margin. It's a parallel grid, one that does not have to compete with irrational NIMBYism.
Astranis is moving in the same direction with its new Vanguard platform, which CEO John Gedmark describes as a self-forming broadband network for defense and disaster zones. The same logic applies here. Compute and connectivity move where infrastructure fails, and space offers the ultimate disaster-resilient tier.
Vast Space, founded by Jed McCaleb, adds the habitation and research layer. Its Haven demo succeeded, and the first crew-ready Haven-1 station is on track for 2026. With Starlink integration and a built-in lab, it’s easy to imagine data modules piggybacking on such stations to handle research workloads on-site rather than uplinking everything back to Earth.
Space Power and the Star Catcher Breakthrough
If compute is the brain, power is the bloodstream.
That's why the recent Star Catcher announcement matters so much. They demonstrated that power can be beamed as light across a distance, captured efficiently, and converted back to electricity without wires.
That’s the missing piece space-based solar power has been seeking, and by extension, what will enable orbital data centers. A cluster that can share power through beamed light becomes modular and self-balancing. It can move excess power where it’s needed, decoupling compute density from solar geometry.
(See Bloomberg's coverage of the announcement.)
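To make the "modular and self-balancing" idea concrete, here is a minimal toy sketch of how a cluster might rebalance beamed power. The node numbers and the 60 percent end-to-end link efficiency are illustrative assumptions, not figures from Star Catcher or anyone else:

```python
# Minimal sketch of power balancing across an orbital cluster via beamed links.
# All numbers are illustrative assumptions, including the 60% end-to-end
# efficiency of the optical power link.

LINK_EFFICIENCY = 0.60  # assumed fraction of beamed power recovered at the receiver

# (generation_kW, demand_kW) per node: sunlit nodes generate more than they use,
# shadowed or compute-heavy nodes run a deficit.
nodes = {
    "sat-A": (200.0, 40.0),
    "sat-B": (20.0, 90.0),
    "sat-C": (150.0, 60.0),
    "sat-D": (10.0, 70.0),
}

surplus = sum(g - d for g, d in nodes.values() if g > d)   # kW available to beam
deficit = sum(d - g for g, d in nodes.values() if g < d)   # kW that must be imported

deliverable = surplus * LINK_EFFICIENCY
print(f"Beamable surplus after link losses: {deliverable:.0f} kW")
print(f"Total deficit to cover:             {deficit:.0f} kW")
print("Cluster balances itself." if deliverable >= deficit else "Cluster must shed load.")
```

The point of the toy model is the decoupling: the nodes doing the compute don't have to be the nodes catching the sunlight.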
Mars, Inversion, and the Broader Arc
It's not only about orbit. NASA's ESCAPADE mission, with spacecraft built by Rocket Lab, will study how Mars lost its atmosphere, an understanding that is a prerequisite for any long-term terraforming or colonization.
Energy and compute ecosystems in deep space will face the same thermodynamic questions being solved now in orbit. Namely, how to harvest, store, and deploy power efficiently.
And companies like Inversion Space, showing off its new ARC reentry vehicle, are closing the logistics loop. If compute, materials, and samples can come back cheaply, orbital industry becomes economically circular rather than purely extractive.
All of this, from data centers and power beaming to modular stations and return vehicles, now fits under a coherent industrial strategy for orbit.
Why It Matters and What Comes Next
Every modern AI model consumes power at a scale that ties it directly to national grids. Building more terrestrial capacity means water disputes, permitting fights, infrastructure stress.
Moving compute to orbit sidesteps that bottleneck while unlocking new energy economics. Constant sunlight, no weather, no night cycle, and radiative cooling into deep space make for a near-perfect thermodynamic setup.
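To put a rough number on the cooling side, here is a radiator-sizing sketch using the Stefan-Boltzmann law. It assumes an ideal two-sided radiator at 300 K rejecting heat to deep space and ignores solar loading, Earthshine, and structural mass, so treat the outputs as order-of-magnitude illustrations rather than engineering figures:

```python
# Rough radiator sizing for an orbital data center via the Stefan-Boltzmann law.
# Simplifying assumptions: ideal two-sided radiator at 300 K, deep-space sink,
# no solar or Earth heat loading. Outputs are order-of-magnitude only.

SIGMA = 5.67e-8       # Stefan-Boltzmann constant, W / (m^2 * K^4)
EMISSIVITY = 0.9      # assumed high-emissivity radiator coating
T_RADIATOR_K = 300.0  # assumed radiator surface temperature (~27 C)
SIDES = 2             # both faces radiate

def radiator_area_m2(heat_load_w: float) -> float:
    """Radiator area needed to reject heat_load_w watts at the assumed temperature."""
    flux_w_per_m2 = SIDES * EMISSIVITY * SIGMA * T_RADIATOR_K**4  # ~830 W/m^2
    return heat_load_w / flux_w_per_m2

for load_mw in (1, 10, 100):
    print(f"{load_mw:>4} MW of waste heat -> ~{radiator_area_m2(load_mw * 1e6):,.0f} m^2 of radiator")
```

Even under these generous assumptions, a 100 MW facility needs radiators on the order of a hundred thousand square meters, which is why thermal design features so prominently in every serious orbital compute proposal.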
Skeptics will argue that space data centers are a distraction from terrestrial renewable build-out. But that misses the point. This isn’t an either-or scenario. Orbital infrastructure complements Earth-based systems by providing uninterrupted clean energy generation and overflow compute capacity for surges that Earth’s grids can’t handle.
It’s the same logic that made undersea cables inevitable in telecommunications. Once latency and reliability hit acceptable levels, the gravitational pull of efficiency does the rest.
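On the latency half of that analogy, the physics is already in the acceptable range for many workloads. A quick back-of-envelope, assuming a 550 km orbit (a typical Starlink-class altitude, used here purely for illustration) and counting only light travel time:

```python
# Light-time to a hypothetical low-Earth-orbit data center.
# Assumes a 550 km altitude and a ground station directly below the satellite;
# ignores routing, queuing, and processing delays.

C_KM_PER_S = 299_792   # speed of light in vacuum, km/s
ALTITUDE_KM = 550      # assumed LEO altitude

one_way_ms = ALTITUDE_KM / C_KM_PER_S * 1_000
print(f"One-way light time:    {one_way_ms:.2f} ms")      # ~1.8 ms
print(f"Round-trip light time: {2 * one_way_ms:.2f} ms")  # ~3.7 ms before any overhead
```

For batch training and asynchronous workloads, single-digit milliseconds of light time is unlikely to be the binding constraint; link capacity and ground-station availability matter more.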
The End of the Precautionary Era
The reflex to delay until certainty arrives has become its own drag coefficient. Every time someone says “let’s wait until the technology matures,” the technology matures somewhere else.
Space compute doesn’t need perfect economics on day one. It needs iterative launches, transparent data, and the humility to learn in orbit. That’s how aviation evolved, how satellites became mainstream, and how AI hardware will eventually transcend its terrestrial cage.
The shift to orbital compute is not utopian. It’s practical. It represents a simple truth. There’s more energy above our heads than we’ll ever produce on the ground. Capturing even a fraction of it would reshape the energy-AI equation.
All of this for less than the $30 billion NASA has spent on Orion. The irony is poetic. The frontier that once belonged only to exploration is now being rebuilt as infrastructure. The next generation of data centers won’t rise out of deserts. They’ll orbit quietly, bathed in eternal sunlight, beaming energy back to Earth.