Moore’s Law, first stated all the way back in 1965, came out of Gordon Moore’s observation that the number of transistors per silicon chip was doubling every year (he would later revise this to doubling every 18-24 months). While it’s been cited countless times to explain our exponential growth in computation, Greg Laughlin, Fred Adams and team, whose work we discussed in the last post, focus not on Moore’s Law but on a less publicly visible statement known as Landauer’s Principle. Drawing on Rolf Landauer’s 1961 work at IBM, the principle defines the lower limit for energy consumption in computation.

You can find the equation here, or in the Laughlin/Adams paper cited below, where the authors note that for an operating temperature of 300 K (a fine summer day on Earth), the maximum efficiency of bit operations per erg is 3.5 x 10^13. As we saw in the last post, a computational energy crisis emerges when exponentially increasing power requirements for computing exceed the total power input to our planet. Given current computational growth, the saturation point is on the order of a century away.
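For anyone who wants to check that figure, the arithmetic is straightforward: Landauer’s Principle puts the minimum energy needed to erase one bit at kT ln 2, where k is the Boltzmann constant and T the operating temperature. Here is a minimal sketch in Python (my own illustration, not from the paper) that recovers the 3.5 x 10^13 figure at 300 K:

```python
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K (exact SI value)
T = 300.0               # operating temperature, kelvin
ERG_PER_JOULE = 1.0e7   # 1 joule = 10^7 erg

# Landauer's Principle: minimum energy to erase one bit
e_bit_joules = k_B * T * math.log(2)        # ~2.87e-21 J
e_bit_ergs = e_bit_joules * ERG_PER_JOULE   # ~2.87e-14 erg

bits_per_erg = 1.0 / e_bit_ergs
print(f"Energy per bit at {T:.0f} K: {e_bit_ergs:.2e} erg")
print(f"Maximum efficiency: {bits_per_erg:.1e} bit operations per erg")
# -> ~3.5e13, matching the Laughlin/Adams figure
```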

Thus Landauer’s limit becomes a tool for predicting a problem ahead, given the linkage between computation and economic and technological growth. The working paper that Laughlin and Adams produced looks at the numbers in terms of current computational throughput and sketches out a problem that a culture deeply reliant on computation must overcome. How might civilizations far more advanced than our own go about satisfying their own energy needs?

Into the Clouds

We’re familiar with Freeman Dyson’s interest in enclosing stars with technologies that can exploit the great bulk of their energy output, with the result that there is little to mark their location to distant astronomers other than an infrared signature. Searches for such megastructures have already been made, but thus far with no detections. Laughlin and Adams ponder exploiting the winds generated by Asymptotic Giant Branch stars, which might be tapped to produce what they call a ‘dynamical computer.’ Here again there is an infrared signature.

Let’s see what they have in mind:

In this scenario, the central AGB star provides the energy, the raw material (in the form of carbon-rich macromolecules and silicate-rich dust), and places the material in the proper location. The dust grains condense within the outflow from the AGB star and are composed of both graphite and silicates (Draine and Lee 1984), and are thus useful materials for the catalyzed assembly of computational components (in the form of nanomolecular devices communicating wirelessly at frequencies (e.g. sub-mm) where absorption is negligible in comparison to required path lengths).

What we get is a computational device surrounding the AGB star that is roughly the size of our Solar System. In terms of observational signatures, it would be detectable as a blackbody with a temperature in the range of 100 K. It’s important to realize that natural astrophysical objects at these temperatures show a spectral energy distribution that, the authors note, is much broader than a blackbody’s. The paper cites molecular clouds and protostellar envelopes as examples; these should be readily distinguishable from what the authors call Black Clouds of computation.
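Wien’s displacement law makes it easy to see why a structure at these temperatures falls squarely into infrared survey territory. A quick sketch (my illustration, not the paper’s; the WISE band wavelengths are the published values):

```python
# Wien's displacement law: peak wavelength of a blackbody SED
WIEN_B = 2.898e-3  # displacement constant, m*K

for T in (100.0, 150.0, 200.0):
    peak_um = WIEN_B / T * 1e6  # convert meters to micrometers
    print(f"T = {T:5.0f} K -> SED peaks near {peak_um:4.1f} um")

# ~29 um at 100 K and ~14-19 um at 150-200 K, neatly bracketed
# by the WISE W3 (12 um) and W4 (22 um) bands.
```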

It seems odd to call this structure a ‘device,’ but that is how Laughlin and Adams envision it. We’re dealing with computational layers in the form of radial shells within the cloud of dust being produced by the AGB star in its relatively short lifetime. It is a cloud subject to the laws of hydrodynamics, which the paper uses to characterize its operations. The computer, in order to function, has to be able to communicate with itself via signals that the authors assume travel at the speed of light. Balancing its calculated minimum temperature against that constraint yields an optimal radial size of 220 AU: an astronomical computing engine.
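It’s worth making the light-travel constraint concrete. A back-of-envelope calculation using the paper’s 220 AU figure (the framing below is my own):

```python
AU_M = 1.495978707e11   # meters per astronomical unit
C = 2.99792458e8        # speed of light, m/s

radius_au = 220.0
crossing_s = (2 * radius_au * AU_M) / C  # time to cross the diameter

print(f"Light-crossing time for a {radius_au:.0f} AU-radius cloud: "
      f"{crossing_s:.2e} s (~{crossing_s/3600:.0f} hours)")
# ~61 hours: any globally synchronized operation cycles at most
# once every couple of days, which favors local, massively parallel
# computation within the radial shells.
```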

And what a device it is. The maximum computational rate works out to 3 x 10^50 bits per second for a single AGB star. That rate is slowed by considerations of entropy and rate of communication, but we can optimize the structure at the above size constraint and a temperature between 150 and 200 K, with a mass roughly comparable to that of the Earth. This is a device that is in need of refurbishment on a regular timescale because it is dependent upon the outflow from the star. The authors calculate that the computational structure would need to be rebuilt on a timescale of 300 years, comparable to infrastructure timescales on Earth.
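Where does a number like 3 x 10^50 come from? Divide the star’s power output by the Landauer cost per bit at the cloud’s operating temperature. The sketch below assumes an AGB luminosity of 10^4 times the Sun’s, a round illustrative value rather than the paper’s input, and lands within an order of magnitude of the quoted rate:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
L_SUN = 3.828e26     # solar luminosity, W

L_star = 1.0e4 * L_SUN  # assumed AGB luminosity (illustrative)
T = 175.0               # mid-range operating temperature, K

e_bit = k_B * T * math.log(2)   # Landauer cost per bit, J
max_rate = L_star / e_bit       # ideal bits per second

print(f"Ideal Landauer-limited rate: {max_rate:.1e} bits/s")
# ~2e51 bits/s before the entropy and communication penalties,
# consistent in order of magnitude with the paper's 3e50.
```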

Thus we have what Laughlin, in a related blog post, describes as “a dynamically evolving wind-like structure that carries out computation.” And as he goes on to note, AGB stars in their pre-planetary nebula phase have lifetimes on the order of 10,000 years, during which time they produce vast amounts of graphene suitable for use in computation, with photospheres not far off room temperature on Earth. Finding such a renewable megastructure in astronomical data could be approached by consulting the WISE source catalog with its 563,921,584 objects. A number of candidates are identified in the paper, along with metrics for their analysis.

These types of structures would appear from the outside as luminous astrophysical sources, where the spectral energy distributions have a nearly blackbody form with effective temperature T ≈ 150-200 K. Astronomical objects with these properties are readily observable within the Galaxy. Current infrared surveys (the WISE Mission) include about 200 candidate objects with these basic characteristics…
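How might one sift such candidates from half a billion catalog entries? One first-cut approach, sketched below purely as my own illustration (the paper’s actual selection metrics are more involved), is to compare Planck-function fluxes at the WISE W3 and W4 wavelengths; a 150-200 K blackbody is markedly redder there than a hotter source:

```python
import math

H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
K_B = 1.380649e-23   # Boltzmann constant, J/K

def planck(wavelength_m, T):
    """Planck spectral radiance B_lambda in W / (m^2 * m * sr)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    x = H * C / (wavelength_m * K_B * T)
    return a / (math.exp(x) - 1.0)

W3, W4 = 12e-6, 22e-6  # WISE band centers, meters

for T in (150.0, 200.0, 1000.0):  # candidate range vs. a hotter source
    ratio = planck(W4, T) / planck(W3, T)
    print(f"T = {T:6.0f} K: F(22um)/F(12um) ~ {ratio:.2f}")
# Cooler blackbodies have a much larger W4/W3 flux ratio (they are
# 'redder'), giving a crude first-cut color selection.
```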

And a second method of detection, looking for nano-scale hardware in meteorites, is rather fascinating:

Carbonaceous chondrites (Mason 1963) preserve unaltered source material that predates the solar system, much of which was ejected by carbon stars (Ott 1993). Many unusual materials have been identified within carbonaceous chondrites, including, for example, nucleobases, the informational sub-units of RNA and DNA (see Nuevo et al. 2014). Most carbonaceous chondrites have been subject to processing, including thermal metamorphism and aqueous alteration (McSween 1979). Graphite and highly aromatic material survives to higher temperatures, however, maintaining structure when heated transiently to temperatures of order T ≈ 700 K (Pearson et al. 2006). It would thus potentially be of interest to analyze carbonaceous chondrites to check for the presence of (for example) devices resembling carbon nanotube field-effect transistors (Shulaker et al. 2013).

Meanwhile, Back in 2021

But back to the opening issue, the crisis posited by the rate of increase in computation vs. the energy available to our society. Should we tie Earth’s future economic growth to computation? Will a culture invariably find ways to produce the needed computational energies, or are other growth paradigms possible? Or is growth itself a problem that has to be surmounted?

At present, the growth of computation is fundamentally tied to the growth of the economy as a whole. Barring the near-term development of practical reversible computing (see, e.g., Frank 2018), the forthcoming computational energy crisis can be avoided in two ways. One alternative involves a transition to another economic model, in contrast to the current regime of information-driven growth, so that computational demand need not grow exponentially in order to support the economy. The other option is for the economy as a whole to cease its exponential growth. Both alternatives involve a profound departure from the current economic paradigm.

We can wonder as well whether what many are already seeing as the slowdown of Moore’s Law will lead to new forms of exponential growth via quantum computing, carbon nanotube transistors or other emerging technologies. One thing is for sure: Our planet is not at the technological level to exploit the kind of megastructures that Freeman Dyson and Greg Laughlin have been writing about, so whatever computational crisis we face is one we’ll have to surmount without astronomical clouds. Is this an aspect of the L term in Drake’s famous equation? It referred to the lifetime of technological civilizations, and on this matter we have no data at all.

The working paper is Laughlin et al., “On the Energetics of Large-Scale Computation using Astronomical Resources.” Full text. Laughlin also writes about the concept on his oklo.org site.
