According to Ars Technica, Google has confirmed it’s working on Project Suncatcher, a “moonshot” initiative to put AI data centers in space using orbiting TPUs. The company plans to launch prototype satellites carrying TPUs by early 2027 and is targeting the mid-2030s for full deployment, when launch costs could drop to as little as $200 per kilogram. Google’s research shows solar panels in a dawn-dusk sun-synchronous low-Earth orbit can be up to eight times more productive than Earth-based systems, thanks to near-constant sunlight. The company is already testing its latest v6e Cloud TPU (Trillium) in proton beams and has found the chips can withstand almost 2 krad of radiation – nearly three times the dose expected over a five-year mission. Early wireless testing has achieved 1.6 Tbps links between nodes, though the satellites would need to hold formation within about a kilometer of each other.
<h2 id="the-space-gold-rush">The space gold rush</h2>
Here’s the thing – this isn’t just Google thinking outside the box. The entire tech industry is hitting physical limits on Earth. We’re running out of places to put power-hungry AI data centers that no community actually wants nearby. And when you’ve got both Bezos and Musk talking about space-based computing, you know there’s serious money looking for solutions.
But let’s be real – we’ve heard these grand space visions before. Remember when everyone was going to mine asteroids? Or build hotels in orbit? The gap between concept and reality in space projects is… astronomical. Google’s own comparison to their self-driving car project actually highlights the problem – it took them 15 years to get from moonshot to almost-autonomous vehicles, and they’re still not everywhere.
<h2 id="physics-is-still-a-thing">Physics is still a thing</h2>
So Google says its satellites can maintain formations “several hundred meters apart” with “modest station-keeping maneuvers.” That sounds neat until you remember we’re talking about objects moving at roughly 17,000 miles per hour. Satellites in current constellations like Starlink fly much farther apart – holding a tight formation at orbital speed is like choreographing a ballet between cars on a highway.
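To see why station-keeping isn’t optional, here’s a back-of-the-envelope sketch using a standard result from relative orbital mechanics. The altitude and error figures are my illustrative assumptions, not numbers from Google’s paper: even a small altitude mismatch between two satellites turns into steady along-track drift.

```python
import math

# Back-of-envelope: how fast do two satellites drift apart in low-Earth orbit?
# Standard result from relative orbital mechanics: a semi-major-axis
# difference of delta_a produces roughly 3*pi*delta_a of along-track
# drift per orbit. All numbers below are illustrative assumptions.

MU = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R  = 6371e3 + 650e3   # assumed ~650 km dawn-dusk sun-synchronous altitude, m

v_orbit = math.sqrt(MU / R)            # circular orbital speed, m/s
period  = 2 * math.pi * R / v_orbit    # orbital period, s

delta_a = 10.0                           # a mere 10 m altitude mismatch
drift_per_orbit = 3 * math.pi * delta_a  # along-track drift per orbit, m

print(f"orbital speed: {v_orbit / 1000:.1f} km/s")
print(f"orbital period: {period / 60:.1f} min")
print(f"drift per orbit from a {delta_a:.0f} m altitude error: {drift_per_orbit:.0f} m")
```

A 10-meter altitude error produces on the order of 100 meters of drift every ~98-minute orbit – so a kilometer-scale formation falls apart within a day unless every satellite keeps correcting itself.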
And about that radiation testing? Sure, TPUs might handle more radiation than expected. But we’re talking about running mission-critical AI workloads on hardware that wasn’t designed for space. One solar flare at the wrong time could turn your billion-dollar AI cluster into very expensive space junk.
<h2 id="the-economics-dont-add-up-yet">The economics don’t add up yet</h2>
Google’s banking on launch costs dropping to $200/kg by the 2030s. That’s optimistic, to put it mildly. Even if SpaceX and others drive prices down, we’re still talking about launching and maintaining infrastructure in the most hostile environment imaginable. The repair bill for a single failed TPU module in orbit would make your eyes water.
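To put that cost sensitivity in numbers, here’s a quick sketch. The cluster mass and today’s price per kilogram are my own illustrative assumptions, not figures from Google:

```python
# Why the $200/kg target matters: launch cost for a hypothetical
# 100-tonne orbital compute cluster at a rough current price vs.
# the assumed mid-2030s target. Illustrative numbers only.

cluster_mass_kg = 100_000   # hypothetical cluster mass, kg
price_today     = 1_500     # rough current $/kg to low-Earth orbit
price_target    = 200       # the $/kg target the plan depends on

cost_today  = cluster_mass_kg * price_today
cost_target = cluster_mass_kg * price_target

print(f"today:  ${cost_today / 1e6:,.0f}M")
print(f"target: ${cost_target / 1e6:,.0f}M")
print(f"cost reduction needed: {price_today / price_target:.1f}x")
```

Even with these generous assumptions, the whole business case hinges on launch prices falling by most of an order of magnitude – and that’s before you price in replacement launches for failed hardware.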
Think about it – on Earth, if a data center goes down, you send a technician. In space? You’re looking at months of planning and millions in costs for any physical intervention. And what about the energy cost of beaming data back to Earth? That’s not nothing.
<h2 id="but-maybe-theyre-not-crazy">But maybe they’re not crazy</h2>
Look, I’ll admit the solar power argument is compelling. Eight times more efficient? Constant sunlight? That solves a huge problem for AI’s energy appetite. And if you read their research paper, they’ve clearly done more homework than I expected.
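For what it’s worth, a rough sanity check with generic solar numbers – my assumptions, not Google’s – lands in the same ballpark as that “eight times” figure:

```python
# Rough sanity check on the "up to eight times" solar claim.
# All figures below are generic illustrative assumptions.

SOLAR_CONSTANT = 1361   # W/m^2 above the atmosphere
GROUND_PEAK    = 1000   # W/m^2 at the surface, clear sky, sun overhead

orbit_duty_cycle = 0.99  # dawn-dusk sun-synchronous: almost always in sunlight
ground_capacity  = 0.17  # typical terrestrial fixed-panel capacity factor

orbit_yield  = SOLAR_CONSTANT * orbit_duty_cycle   # average W/m^2 in orbit
ground_yield = GROUND_PEAK * ground_capacity       # average W/m^2 on Earth

print(f"orbit:  {orbit_yield:.0f} W/m^2 average")
print(f"ground: {ground_yield:.0f} W/m^2 average")
print(f"ratio:  {orbit_yield / ground_yield:.1f}x")
```

Notably, most of the gain comes not from stronger sunlight above the atmosphere but from the duty cycle: a dawn-dusk orbit is in sunlight nearly all the time, while a ground panel averages a small fraction of its peak output across nights, weather, and seasons.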
The pre-print study shows they’re thinking through the real engineering challenges, not just dreaming. And honestly, with Earth-based data centers facing growing opposition and environmental concerns, someone needs to try the crazy ideas.
So is Project Suncatcher realistic? Probably not in the timeframe they’re suggesting. But is it the kind of ambitious thinking we need to solve AI’s scaling problems? Absolutely. Just don’t bet your business on orbital TPUs showing up in the next decade.
