Interesting things happen to stars after they’ve left the main sequence. So-called Asymptotic Giant Branch (AGB) stars are those less than nine times the mass of the Sun that have already moved through their red giant phase. They’re burning an inner layer of helium and an outer layer of hydrogen, multiple zones surrounding an inert carbon-oxygen core. Some of these stars, cooling and expanding, begin to condense dust in their outer envelopes and to pulsate, producing a ‘wind’ off the surface of the star that effectively brings an end to hydrogen burning.

Image: Hubble image of the asymptotic giant branch star U Camelopardalis. This star, nearing the end of its life, is losing mass as it coughs out shells of gas. Credit: ESA/Hubble, NASA and H. Olofsson (Onsala Space Observatory).

We’re on the way to a planetary nebula centered on a white dwarf now, but along the way, in this short pre-planetary nebula phase, we have the potential for interesting things to happen. It’s a potential that depends upon the development of intelligence and technologies that can exploit the situation, just the sort of scenario that would attract Greg Laughlin (Yale University) and Fred Adams (University of Michigan). Working with Darryl Seligman and Khaya Klanot (Yale), the authors of the brilliant The Five Ages of the Universe (Free Press, 2000) are used to thinking long-term, and here they ponder technologies far beyond our own.

A sufficiently advanced civilization might want to put that dusty wind from a late Asymptotic Giant Branch star to work, shaping it into what the authors call “a dynamically evolving wind-like structure that carries out computation.” It’s a fascinating idea because it forces us to reflect on what might motivate such a project, and as we learn from the paper, the energetics of computation must inevitably become a problem for any technological civilization undergoing rapid growth. A civilization going the AGB route would also produce an observable signature, a kind of megastructure that is, to my knowledge, considered here for the first time.

Laughlin refers to the document that grows out of these calculations (link below) as a ‘working paper,’ a place for throwing off ideas and, I assume, a work in progress as those ideas are explored further. It operates on two levels. The first is to describe the computational crisis facing our own civilization, assuming that exponential growth in computing continues. Here the numbers are stark, as we’ll see below, and the options problematic. The second level is the speculation about cloud computing at the astronomical level, which takes us far beyond any solution that would resolve our near-term problem, but offers fodder for thought about the possible behaviors of civilizations other than our own.

Powering Up Advanced Computation

The kind of speculation that puts megastructures on the table is productive because we are the only example of technological civilization that we have to study. We have to ask ourselves what we might do to cope with problems as they scale up to planetary size and beyond. Solar energy is one thing when considered in terms of small-scale panels on buildings, but Freeman Dyson wanted to know how to exploit not just our allotted sliver of the energy being put out by the Sun but all of it. Thus the concept of enclosing a star with a shell of technologies, with the observable result of gradually dimming the star while producing a signature in the infrared.

Laughlin, Adams and colleagues have a new power source that homes in on the drive for computation. It exploits the carbon-rich materials that condense over periods of thousands of years and are pushed outward by AGB winds, offering the potential for computers at astronomical scales. It’s wonderful to see them described here as ‘black clouds,’ a nod the authors acknowledge to Fred Hoyle’s engaging 1957 novel The Black Cloud, in which a huge cloud of gas and dust approaches the Solar System. Like the authors’ cloud, Hoyle’s possesses intelligence, and learning how to deal with it powers the plot of the novel.

Hoyle wasn’t thinking in terms of computation in 1957, but the problem is increasingly apparent today. We can measure the capabilities of our computers and project the resource requirements they will demand if exponential rates of growth continue. The authors work through calculations to a straightforward conclusion: The power we need to support computation will exceed that of the biosphere in roughly a century. This resource bottleneck likewise applies to data storage capabilities.

Just how much computation does the biosphere support? The answer merges artificial computation with what Laughlin and team call “Earth’s 4 Gyr-old information economy.” According to the paper’s calculations, today’s artificial computing uses a fraction 2 x 10^-7 of the Earth’s insolation energy budget. The biosphere in these terms can be considered an infrastructure for copying the digital information encoded in strands of DNA. Interestingly, the authors come up with a biological computational efficiency that is a factor of 10 better than today’s artificial computation. They also provide the mathematical framework for the idea that artificial computing efficiencies can be improved by a factor of more than 10^7.
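Taking the figures quoted above at face value, we can check how the ratios combine. This is a hedged back-of-the-envelope sketch, not a calculation from the paper itself:

```python
# Sanity check using the figures quoted above (all inputs are the article's
# round numbers, not values taken directly from the paper's derivations).

artificial_power_fraction = 2e-7  # share of Earth's insolation used by artificial computing
bio_to_artificial_ratio = 1e6     # biosphere computation exceeds artificial by ~10^6
bio_efficiency_advantage = 10     # biosphere is ~10x more efficient per unit energy

# Doing 10^6 times the computation at 10x the efficiency costs 10^5 times
# the power, so the biosphere's implied power draw as a fraction of insolation:
bio_power_fraction = (artificial_power_fraction * bio_to_artificial_ratio
                      / bio_efficiency_advantage)

print(f"implied biosphere power fraction ~ {bio_power_fraction:.0e}")
# prints: implied biosphere power fraction ~ 2e-02
```

In other words, the quoted ratios imply the biosphere runs its “computation” on a few percent of the insolation budget, which gives a feel for how much headroom exponential artificial computing has left.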

Artificial computing is, of course, a rapidly moving target. From the paper:

At present, the “computation” performed by Earth’s biosphere exceeds the current burden of artificial computation by a factor of order 10^6. The biosphere, however, has carried out its computation in a relatively stable manner over geologic time, whereas artificial computation by devices, as well as, separately, the computation stemming from human neural processing, are both increasing exponentially – the former through Moore’s Law-driven improvement in devices and increases in the installed base of hardware, and the latter through world population growth. Environmental change due to human activity can, in a sense, be interpreted as a computational “crisis,” a situation that will be increasingly augmented by the energy demands of computation.

We get to the computational energy crisis as we approach the point where the exponential growth in the need for power exceeds the total power input to the planet, which should occur in roughly a century. We’re getting 10^17 watts from the Sun here on the surface, and in something on the order of 100 years we will need to use every bit of that to support our computational infrastructure. And as mentioned above, the same applies to data storage, even conceding the vast improvements in storage efficiency that continue to occur.
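The “roughly a century” figure follows from simple exponential bookkeeping. Here is a minimal sketch of that arithmetic; the doubling time is my own illustrative assumption, chosen to reproduce the article’s timescale, not a number taken from the paper:

```python
import math

# Article's figure: artificial computing now uses ~2e-7 of Earth's insolation
# (~1e17 W at the surface). The crisis point is when demand reaches all of it.
current_fraction = 2e-7
crisis_fraction = 1.0

# Doublings of power demand needed to close the gap.
doublings = math.log2(crisis_fraction / current_fraction)

# Assumed doubling time for computational power demand (illustrative only;
# ~4.5 years reproduces the "roughly a century" estimate).
doubling_time_yr = 4.5

years_to_crisis = doublings * doubling_time_yr
print(f"{doublings:.1f} doublings -> ~{years_to_crisis:.0f} years")
# prints: 22.3 doublings -> ~100 years
```

The exact timescale shifts with the assumed doubling time, but because the dependence is logarithmic in the starting fraction, even generous changes to the inputs keep the crisis within a few human lifetimes.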

Civilizations don’t necessarily have to build megastructures of one form or another to meet such challenges, but we face the problem that the growth of the economy is tied to the growth of computation. Reversing the trend of exponential growth in computing would therefore take a transition to a different economic model, one that abandons information-driven energy growth.

In any case, it’s exceedingly unlikely that we’ll be able to build megastructures within a century. But can we look toward the so-called ‘singularity’ for a solution, as artificial intelligence suddenly eclipses its biological precursor? The paper’s take here is likewise intriguing:

The hypothesis of a future technological singularity is that continued growth in computing capabilities will lead to corresponding progress in the development of artificial intelligence. At some point, after the capabilities of AI far exceed those of humanity, the AI system could drive some type of runaway technological growth, which would in turn lead to drastic changes in civilization… This scenario is not without its critics, but the considerations of this paper highlight one additional difficulty. In order to develop any type of technological singularity, the AI must reach the level of superhuman intelligence, and implement its civilization-changing effects, before the onset of the computational energy crisis discussed herein. In other words, limits on the energy available for computation will place significant limits on the development of AI and its ability to instigate a singularity.

So the issues of computational growth and the energy to supply it are thorny. Let’s look in the next post at the ‘black cloud’ option that a civilization far more advanced than ours might deploy, assuming it figured out a way to get past the computing bottleneck our own civilization faces in the not all that distant future. There we’ll be in an entirely speculative realm that doesn’t offer solutions to the immediate crisis, but makes a leap past that point to consider computational solutions at the scale of the solar system that can sustain a powerful advanced culture.

The paper is Laughlin et al., “On the Energetics of Large-Scale Computation using Astronomical Resources.” Full text. Laughlin also writes about the concept on his oklo.org site.
