To hear Silicon Valley tell it, artificial intelligence is outgrowing the planet that gave birth to it. Data centers will account for nearly half of U.S. electricity demand growth between now and 2030, and their global power requirements could double by the end of this decade as companies train larger AI models. Local officials have begun to balk at approving new server farms that swallow land, strain power grids and gulp cooling water. Some tech executives now talk about putting servers in space as a way to escape those permitting fights.
Orbital data centers could run on practically unlimited solar energy without interruption from cloudy skies or nighttime darkness. If it is getting harder to keep building bigger server farms on Earth, the idea goes, maybe the solution is to loft some of the most power-hungry computing into space. But such facilities will not become cost-effective unless rocket launch costs decline substantially, and independent experts warn they could end up with even bigger environmental and climate effects than their earthly counterparts.
In early November Google announced Project Suncatcher, which aims to launch solar-powered satellite constellations carrying its specialty AI chips, with a demonstration mission planned for 2027. Around the same time, the start-up Starcloud celebrated the launch of a 60-kilogram satellite with an NVIDIA H100 GPU as a prelude to an orbital data center that is expected to require five gigawatts of electric power by 2035.
Those two efforts are part of a broader wave of concepts that move some computing off-planet. China has begun launching spacecraft for a Xingshidai “space data center” constellation, and the European Union is studying similar ideas under a project known as ASCEND.
“Orbital data centers would benefit from continuous solar energy, generated by arrays of photovoltaic cells,” says Benjamin Lee, a computer architect and engineer at the University of Pennsylvania. “This could resolve long-standing challenges around powering data center computation in a carbon-efficient manner.” Most proposals envision placing the satellites in a dawn-dusk, sun-synchronous orbit that rides the boundary between day and night on Earth, so their solar panels would receive nearly constant sunlight undimmed by the atmosphere.
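For a sense of scale, here is a rough, back-of-envelope comparison of how much energy a square meter of solar panel might harvest in such an orbit versus at a good site on the ground. Every efficiency and capacity-factor number below is an illustrative assumption, not a figure from Google, Starcloud or Lee.

```python
# Back-of-envelope comparison of annual energy yield for 1 m^2 of solar panel
# in a dawn-dusk, sun-synchronous orbit versus a decent terrestrial site.
# All parameter values are assumptions chosen for illustration.

SOLAR_CONSTANT_W_M2 = 1361      # sunlight intensity above the atmosphere
ORBIT_SUNLIT_FRACTION = 0.99    # dawn-dusk orbit: near-continuous illumination
ORBIT_PANEL_EFFICIENCY = 0.30   # assumed space-grade cell efficiency

GROUND_PEAK_W_M2 = 1000         # standard surface irradiance at noon, clear sky
GROUND_CAPACITY_FACTOR = 0.22   # assumed utilization after night, weather, seasons
GROUND_PANEL_EFFICIENCY = 0.22  # assumed commodity cell efficiency

HOURS_PER_YEAR = 8760

orbit_kwh = (SOLAR_CONSTANT_W_M2 * ORBIT_SUNLIT_FRACTION
             * ORBIT_PANEL_EFFICIENCY * HOURS_PER_YEAR / 1000)
ground_kwh = (GROUND_PEAK_W_M2 * GROUND_CAPACITY_FACTOR
              * GROUND_PANEL_EFFICIENCY * HOURS_PER_YEAR / 1000)

print(f"Orbital panel:     ~{orbit_kwh:,.0f} kWh per m^2 per year")
print(f"Terrestrial panel: ~{ground_kwh:,.0f} kWh per m^2 per year")
print(f"Advantage:         ~{orbit_kwh / ground_kwh:.1f}x")
```

Under these assumed numbers the orbital panel comes out several times more productive, which is the basic appeal driving the proposals.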
But the same physics that make orbital data centers appealing also impose new engineering headaches, Lee says. Their computing hardware must be protected from intense radiation, through either shielding or error-correcting software. And because there is no air or water in orbit to carry heat away, the platforms must shed it by radiation alone, using large radiators that dump heat into the vacuum of space and add significant mass that has to be launched on rockets.
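The radiator problem can be sketched with the Stefan-Boltzmann law, which sets how much heat a surface can shed by radiation alone. The emissivity, operating temperature and heat load below are assumed values; a real design would also have to account for the sunlight and Earth-glow the radiator absorbs, so actual panels would need to be larger.

```python
# Rough radiator sizing for rejecting waste heat in vacuum, using the
# Stefan-Boltzmann law: P = emissivity * sigma * area * T^4.
# Ignores absorbed sunlight and Earth's infrared, so this is a lower bound.

SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W / (m^2 K^4)
EMISSIVITY = 0.90         # assumed radiator coating emissivity
RADIATOR_TEMP_K = 300.0   # assumed radiating temperature (about 27 degrees C)
WASTE_HEAT_W = 1_000_000  # assumed 1 MW of server waste heat to reject

flux_w_m2 = EMISSIVITY * SIGMA * RADIATOR_TEMP_K ** 4
area_m2 = WASTE_HEAT_W / flux_w_m2

print(f"Radiating flux: ~{flux_w_m2:.0f} W per m^2")
print(f"Radiator area:  ~{area_m2:,.0f} m^2 for {WASTE_HEAT_W/1e6:.0f} MW of waste heat")
```

At these assumed values, a single megawatt of computing already demands a couple of thousand square meters of radiating surface, all of which has to ride to orbit on a rocket.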
All these plans ultimately collide with one stubborn constraint: getting hardware into space. Rocket launch costs alone pose a significant challenge to building large orbital data centers, not to mention the need to replace onboard chips every five to six years. “Launch costs are dropping with reusable rockets, but we would still require a very large number of launches to build orbital data centers that are competitive with those on Earth,” Lee says. Google’s Suncatcher team estimates that liftoff costs would need to fall to under $200 per kilogram by 2035 for their vision to make sense.
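The arithmetic behind that constraint is straightforward, again with assumed numbers: the only figure taken from the reporting above is the $200-per-kilogram target, while the satellite mass per kilowatt of computing and today's rough launch price are hypothetical placeholders.

```python
# Illustrative launch-cost math for a 1-gigawatt orbital data center.
# KG_PER_KW and the higher launch prices are assumptions; only the
# $200/kg figure is the target cited by Google's Suncatcher team.

KG_PER_KW = 10.0             # assumed satellite mass per kW of computing (chips, solar, radiators, structure)
TARGET_POWER_KW = 1_000_000  # a 1-gigawatt orbital data center

mass_kg = TARGET_POWER_KW * KG_PER_KW
for price_per_kg in (1500, 500, 200):   # rough current price down to the $200/kg target
    launch_cost = mass_kg * price_per_kg
    print(f"${price_per_kg:>5}/kg -> {mass_kg/1e6:.0f} million kg, "
          f"~${launch_cost/1e9:.1f} billion just for launch")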
