Get ready for an ambitious venture into the cosmos! Google's latest moonshot, Project Suncatcher, aims to scale AI by harnessing the power of space. And here's the audacious part: the plan is to fly Google's AI chips, TPUs (Tensor Processing Units), on a constellation of solar-powered satellites, tapping the Sun's energy for near-continuous operation.
The concept is simple: in the right orbit, a solar panel can be up to eight times more productive than on Earth, and it can generate power almost continuously, slashing the need for batteries and backup power. That could make space the ultimate arena for scaling AI compute.
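A back-of-envelope calculation shows roughly where a figure like "eight times" can come from. The numbers below are illustrative assumptions for the arithmetic (typical textbook values), not Google's figures:

```python
# Back-of-envelope check of the "up to eight times" claim.
# All numbers are illustrative assumptions, not Google's figures.

SOLAR_CONSTANT = 1361          # W/m^2 above the atmosphere
SPACE_ILLUMINATION = 0.99      # near-continuous sunlight in a dawn-dusk orbit (assumed)

GROUND_PEAK = 1000             # W/m^2 at noon after atmospheric losses (typical)
GROUND_CAPACITY_FACTOR = 0.17  # day/night + weather + sun angle, mid-latitude (assumed)

space_avg = SOLAR_CONSTANT * SPACE_ILLUMINATION       # average W/m^2 in orbit
ground_avg = GROUND_PEAK * GROUND_CAPACITY_FACTOR     # average W/m^2 on the ground

print(f"Space panel average:  {space_avg:.0f} W/m^2")
print(f"Ground panel average: {ground_avg:.0f} W/m^2")
print(f"Ratio: {space_avg / ground_avg:.1f}x")
```

With these assumptions the ratio lands right around 8x; a sunnier ground site or a less favorable orbit moves it in either direction.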
To achieve this, Google envisions linking the satellites with free-space optical links, distributing ML workloads across accelerators over high-bandwidth, low-latency connections. The challenge? Maintaining a stable constellation in a dawn-dusk sun-synchronous orbit, with satellites flying in tight formation just hundreds of meters apart.
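Why does tight formation matter for those optical links? A toy geometric link budget makes it concrete. This is a simplified sketch (uniform beam, receiver aperture much smaller than the beam footprint), and the aperture and divergence values are assumptions, not figures from Google's paper:

```python
import math

# Toy geometric link budget for a free-space optical inter-satellite link.
# Simplification: uniform beam, receiver aperture well inside the footprint.

def received_fraction(aperture_m: float, divergence_rad: float, range_m: float) -> float:
    """Fraction of transmitted light captured by the receiver aperture."""
    spot_diameter = divergence_rad * range_m          # beam footprint at the receiver
    return min(1.0, (aperture_m / spot_diameter) ** 2)

# Assumed: 5 cm receiver aperture, 1 mrad full-angle beam divergence.
for d in (200, 1_000, 10_000):                        # range in meters
    frac = received_fraction(0.05, 1e-3, d)
    print(f"{d:>6} m: {10 * math.log10(frac):6.1f} dB geometric loss")
```

Because the spot area grows with the square of range, loss climbs 20 dB for every tenfold increase in separation, which is one reason flying satellites close together makes high-bandwidth links so much easier.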
Google has already tested the radiation resilience of its TPUs, with promising results. Its Trillium (v6e) chips showed remarkable durability under proton-beam testing, with no hard failures attributed to radiation up to the maximum dose tested.
And this is the part most people miss: Google's analysis of launch-price trends suggests costs could plummet to less than $200/kg by the mid-2030s, the point at which space-based data centers start to look economically viable.
But is this truly feasible? Google's initial analysis suggests it's not precluded by physics or economics, but engineering challenges remain. Thermal management, high-bandwidth ground communications, and on-orbit system reliability are all hurdles to overcome.
To tackle these, Google is partnering with Planet to launch two prototype satellites by early 2027, testing how its models and TPU hardware hold up in orbit.
So, is Project Suncatcher a moonshot worth pursuing? Or is it a far-fetched idea that will never see the light of day? What do you think? Share your thoughts in the comments and let's discuss the future of AI in space!