
The Ultimate SETI Signal

Robert Carrigan (Fermi National Accelerator Laboratory) drew quite a bit of attention last summer when he suggested that SETI signals could contain harmful information, perhaps created by a so-called ‘SETI hacker.’ Carrigan’s article has now appeared in Acta Astronautica, and it’s stuffed with beguiling ideas even if you find the premise unlikely.

“…will a SETI signal be altruistic, benign or malevolent?” Carrigan asks. “It would help to understand the motivations of a message before reading too much of it. Like Odysseus, we may have to stuff wax in the ears of our programmers and strap the chief astronomer to the receiving tower before she is allowed to listen to the song of the siren star.”

That’s fascinating stuff, recalling Fred Hoyle’s A for Andromeda and Carrigan’s own The Siren Stars, written with Nancy Carrigan and serialized in Analog in 1970. But this new paper is worth reading for reasons other than the hacker hypothesis; its author speculates widely on SETI itself. Ponder, for example, how the message carried by a SETI signal might vary depending on its originators. What information would an intelligent plant think to convey? A dog-like species might treat scent and odor as more significant than the pictorial images so important to humans. And what would we make today of a signal carrying data only a quantum computer could decipher?

As to message size, look at today’s information equivalents. A desktop computer operating system is perhaps 1 gigabyte. Although the human genome contains three billion DNA base pairs, Carrigan estimates its effective information content at 0.05 gigabytes. An education through graduate school could be contained in 1-10 GB, while a lifetime of images stored once a minute might total 1000 GB.
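These figures are easy to sanity-check. The sketch below uses the standard 2-bits-per-base-pair encoding; the ~15× redundancy factor is inferred from Carrigan's 0.05 GB estimate rather than stated in the paper:

```python
# Back-of-envelope check on the genome figures quoted above.
BASE_PAIRS = 3e9     # human genome, DNA base pairs
BITS_PER_BP = 2      # log2(4) for the four bases A/C/G/T
GB = 1e9             # bytes per gigabyte (decimal)

raw_gb = BASE_PAIRS * BITS_PER_BP / 8 / GB
print(f"raw genome: {raw_gb:.2f} GB")   # 0.75 GB uncompressed

# Carrigan's 0.05 GB "effective content" figure implies the raw
# sequence is highly redundant -- roughly a 15x compression.
effective_gb = 0.05
print(f"implied redundancy: ~{raw_gb / effective_gb:.0f}x")
```

So the uncompressed genome is under a gigabyte, and Carrigan's effective-content estimate treats most of that as repetition.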

Suppose we wanted to send the ultimate signal from Earth to the stars. Michael Lesk speculated in 1997 that all the information in the world would require 12 exabytes of storage; Carrigan takes this further by adding profiles for all the world’s inhabitants, including their DNA information, bringing the total to 25 exabytes. The ultimate transmission might be one that includes DNA profiles not just for people but for all creatures and plants on the Earth.
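One way the exabyte arithmetic could work out, purely as an illustrative sketch: the 12 EB figure is Lesk's, but the per-person profile size below is a hypothetical value chosen to reproduce Carrigan's ~25 EB total, not a number from either paper:

```python
# Illustrative budget for the 'ultimate signal'.
EB = 1e18                  # bytes per exabyte
world_info = 12 * EB       # Lesk (1997): all recorded information
population = 6.5e9         # rough mid-2000s world population
profile_per_person = 2e9   # ~2 GB each: DNA plus records (assumed)

total = world_info + population * profile_per_person
print(f"total: {total / EB:.0f} EB")   # ~25 EB
```

A couple of gigabytes per person, across billions of people, roughly doubles Lesk's worldwide total.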

Could a civilization attempt a transmission this comprehensive? If so, the signature of the signal would include short, bursty traffic to catch the attention, with long periods of electromagnetic transmission to carry the background data. Or would the message come in physical form? Carrigan notes that a 5 mm sphere of DNA could store on the order of 25 exabytes of data. This seems an unlikely delivery mechanism, but the author argues that energy costs per bit for electromagnetic processes vs. delivery via what might be called ‘directed panspermia’ are similar for matter velocities on the order of 0.00001 of the speed of light. Future technologies, of course, could change this drastically.
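Carrigan's 5 mm figure holds up to a rough sanity check. The physical inputs below (dry DNA density, ~650 daltons per base pair, 2 bits per pair) are standard textbook values I've assumed, not numbers taken from the paper:

```python
import math

# Sanity check: how much data fits in a 5 mm sphere of DNA?
DENSITY = 1.7             # g/cm^3, dry DNA (assumed)
BP_MASS = 650 / 6.022e23  # grams per base pair (~650 daltons)
BITS_PER_BP = 2           # log2(4) bases

radius_cm = 0.25                         # 5 mm diameter sphere
volume = 4 / 3 * math.pi * radius_cm**3  # ~0.065 cm^3
mass = DENSITY * volume                  # ~0.11 g of DNA

base_pairs = mass / BP_MASS
exabytes = base_pairs * BITS_PER_BP / 8 / 1e18
print(f"{exabytes:.0f} EB")              # ~26 EB, order of 25 exabytes
```

A tenth of a gram of DNA really does carry tens of exabytes, which is what makes the ‘directed panspermia’ comparison worth taking seriously.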

But back to the paper’s central thesis: “The most important point is that large amounts of information can be transferred inexpensively at the speed of light even with current technologies. In addition, the message size can easily be so large that the underlying intent of the message would not be apparent.”

The paper, which belongs on the shelf (or hard disk) of anyone following the SETI search, is “Do potential SETI signals need to be decontaminated?” in Acta Astronautica 58 (2006), pp. 112-117. Carrigan’s own site contains earlier work on the theory.


  • Gregory Benford January 27, 2006, 12:51

    This is a good piece, based on ideas that have been in science fiction for decades. I added a twist in a story of several years back, “The Hydrogen Wall”–that SETI messages of high complexity could be AIs. Then they could further the goals of their designers, which might well not be our aims. Perhaps they would have religious propagation in mind, or furthering other memes we cannot yet grasp. Biology is a useful guide here, for species like ours certainly strive to expand their range.

  • Administrator January 27, 2006, 13:30

    Re their aims not being our aims, Carrigan talks about the temptation to apply moral arguments to SETI. It’s worth another quote: “Attempting to attribute human legal or ethical values to our explorer ant is a dangerous stretch. The one ‘ethical’ framework we could have some confidence in is a Darwinian ‘survival of the fittest.’ This behavior standard is about as far from altruistic as one can get.” And as you say, who knows what other memes might fit their agenda?

  • ljk January 27, 2006, 22:58

    Perhaps advanced ETI may not be actively hostile towards us or bent on conquest so much as simply indifferent to beings less evolved than themselves.

    How concerned would a Kardashev Type 3 civilization be if they were taking resources or reconstructing the galaxy and we were in their construction path? Would they go around us? Would they move us out of the way? Or would we be treated just as an ant colony would at the site of a new building under construction? The construction workers aren’t going out of their way to destroy the ant colony, but neither is it big or advanced enough for them to notice.

  • Dan January 28, 2006, 2:31

    It’s unfortunate that Carrigan frames his SETI hacker theory primarily in terms of a computer virus, since viruses by definition must exploit specific vulnerabilities in a running process within an idiosyncratic computer architecture. If we actually discovered a signal that did function as a virus, it would be strong evidence that the signal was not extraterrestrial in origin.

    An interstellar “Trojan horse” is a horse of a different color, though: a more conceptually coherent threat. An arbitrary computing architecture could be described, along with code too vast, alien, obfuscated, or complex to be ‘denatured’. Obviously, a program running on an isolated machine is little direct threat, but assuming the signal is made public, it would soon be decoded and run on computers all across the world. Any number of sci-fi scenarios could bloom from this point, both good and bad. Given that a transmitting species would likely understand this dilemma, we should expect benevolent messages to be relatively simple.