SETI and Open Data

Are there better ways of studying the raw data from SETI? We may know soon, because Jill Tarter has announced that in a few months, the SETI Institute will begin to make this material available via the SETIQuest site. Those conversant with digital signal processing are especially welcome, but so are members of the general public as the site gears up to offer options for all ages. Tarter speaks of a ‘global army’ of open-source code developers going to work on data collected by the Allen Telescope Array, along with students and citizen scientists eager to play a role in the quest for extraterrestrial life.

SETI@home has been a wonderful success, but as Tarter notes in this CNN commentary, the software has been limited: you took what you were given and had no way to change the search techniques brought to bear on the data. I suspect that scattering the data to the winds could lead to some interesting research possibilities. We need the telescope hardware gathered at the Array to produce these data, but the SETI search goes well beyond a collection of dishes.

Consider that the sensitivity of an instrument depends only partly on its collecting area. We can gather all the SETI data we want from the expanding resources of the Allen Telescope Array, but the second part of the equation is how we analyze what we gather. Claudio Maccone has for some years championed the Karhunen-Loève Transform, developed in 1946, as a way of improving sensitivity to an artificial signal by a factor of up to a thousand. Using the KL Transform could help SETI researchers find signals that are deliberately spread across a wide range of frequencies, signals undetectable with earlier methods.

Image: Dishes at the ATA. What new methods can we bring to bear on how the data they produce are analyzed? Credit: Dave Deboer.

SETI researchers used a detection algorithm known as the Fourier Transform in early searches, on the assumption that a candidate extraterrestrial signal would be narrow-band. By 1965, it had become clear that the new Fast Fourier Transform could speed up the analysis, and the FFT became the detection algorithm of choice. It was in 1982 that French astronomer and SETI advocate François Biraud pointed out that here on Earth, we were rapidly moving from narrow-band to wide-band telecommunications. Spread spectrum methods are more efficient because the information, broken into pieces, is carried on numerous low-powered carrier waves that change frequency and are hard to intercept.
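To see the narrow-band assumption in action, consider a toy numerical experiment. In the sketch below (Python, with every amplitude, length and frequency invented for illustration), a steady tone concentrates its energy in a single FFT bin and towers over the noise floor, while a frequency-swept signal of the same power smears across hundreds of bins and vanishes:

```python
# Toy comparison: a narrowband tone versus a swept (spread spectrum stand-in)
# signal of equal power, both buried in noise. All parameters are invented.
import numpy as np

rng = np.random.default_rng(1)
n = 4096
t = np.arange(n)

tone = 0.2 * np.sin(2 * np.pi * 0.1 * t)                    # narrow-band
sweep = 0.2 * np.sin(2 * np.pi * (0.05 + 0.00004 * t) * t)  # wide-band sweep
noise = rng.standard_normal(n)

for label, sig in (("tone ", tone), ("sweep", sweep)):
    power = np.abs(np.fft.rfft(sig + noise)) ** 2
    # Peak-to-median ratio: how far the strongest bin rises above the floor.
    # The tone scores far higher; the sweep is indistinguishable from noise.
    print(f"{label}: peak bin / median bin = {power.max() / np.median(power):.0f}")
```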

What Biraud noticed, and what Maccone has been arguing for years, is that SETI methods built on the FFT cannot detect a spread spectrum signal. The KLT can: it digs out weak signals buried in noise that have hitherto been undetectable. And despite the burden the KLT’s calculations place even on our best computers, Maccone has devised methods to make it work with existing equipment, arguing that it should be programmed into the Low Frequency Array and Square Kilometre Array telescopes now under construction.
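Here is an equally simplified sketch of the KLT idea, emphatically a toy and not Maccone’s implementation: if a hidden waveform repeats across many recorded frames, the eigenvectors of the frames’ sample covariance matrix form the Karhunen-Loève basis, and the dominant eigenvector recovers the buried signal no matter how widely its energy is spread in frequency. Frame size, frame count and signal-to-noise ratio are all invented:

```python
# Toy KLT detection: eigendecompose the covariance of repeated data frames.
# The waveform, frame size and signal-to-noise ratio are invented.
import numpy as np

rng = np.random.default_rng(0)
frame, frames = 256, 200
t = np.arange(frame)

s = 0.3 * np.sin(2 * np.pi * (0.02 + 0.0008 * t) * t)  # wideband, below noise
X = s + rng.standard_normal((frames, frame))           # rows = noisy frames

C = X.T @ X / frames                  # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)  # Karhunen-Loeve basis = eigenvectors

# A coherent hidden signal yields one eigenvalue separated from the noise
# bulk; the matching eigenvector is the waveform itself (up to sign).
recovered = eigvecs[:, -1]
overlap = abs(recovered @ s) / (np.linalg.norm(recovered) * np.linalg.norm(s))
print(f"largest eigenvalue: {eigvals[-1]:.1f}, next: {eigvals[-2]:.1f}")
print(f"overlap of top eigenvector with hidden waveform: {overlap:.2f}")
```

The same wideband sweep the FFT smeared into invisibility now stands out as a single eigenvalue well separated from the noise bulk, which is the sense in which the KLT trades computation for sensitivity.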

But wait, wouldn’t a signal directed at our planet most likely be narrow in bandwidth? Presumably so, but extraneous signals picked up by chance might not be. It makes sense, then, to include methods that can detect both kinds of signal, making the radio search as broad as possible.

I bring all this up because it points to the need for an open-minded approach to how we process the abundant data that the Allen Telescope Array will be presenting to the world. By making these data available over the Web, the SETI Institute gives the field an enormous boost. We’re certainly not all digital signal analysts, but the more eyes we put on the raw data, the better our chance for developing new strategies. As Tarter notes:

This summer, when we openly publish our software detection code, you can take what you find useful for your own work, and then help us make it better for our SETI search. As I wished, I’d like to get all Earthlings spending a bit of their day looking at data from the Allen Telescope Array to see if they can find patterns that all of the signal detection algorithms may still be missing, and while they are doing that, get them thinking about their place in the cosmos.

And let me just throw in a mind-bending coda to the above story. KLT techniques have already proven useful for spacecraft communications (the Galileo mission employed KLT), but Maccone has shown how they can be used to extract a meaningful signal from a source moving at a substantial percentage of the speed of light. Can we communicate with relativistic spacecraft of the future when we send them on missions to the stars? The answer is in the math, and Maccone explains how it works in Deep Space Flight and Communications (Springer/Praxis, 2009), along with his discussion of using the Sun as a gravitational lens.


Kepler: Hold the Data?

Not long ago I sent out a ‘tweet’ on the Centauri Dreams Twitter feed talking about the number of planet detection candidates the Kepler mission was working with. Almost immediately I discovered that the story had become unavailable at the Nature News site, making me wonder whether the figures were right. But the story is back up (available here) and I can cite it once again. Thus:

Since its launch on 6 March 2009, Kepler, with its 0.95-metre telescope, has been staring at the same field of stars near the northern star Vega, looking for tiny reductions in starlight caused by a planet passing in front of a star’s face. In January, the Kepler team announced the discovery of five new exoplanets. [Kepler principal investigator William] Borucki says that the team, as of last week, has found 328 more candidates — but that as many as 50% of these may be false positives, where objects such as binary stars confuse the picture.

328 candidates, and much work ahead in weeding out the genuine planets from the false positives. But that’s what this kind of work is all about, a reminder that we’ll need to let Kepler run its three-year course before the data are truly nailed down and we have identified (we assume) some terrestrial worlds around Sun-like stars, not to mention small planets around M-dwarfs that may prove to be in the habitable zone. No need to be in a hurry for results, in other words, and if you read the Nature News story, you’ll see that there’s no point in hoping for quick answers anyway.
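It’s worth remembering just how small the signals being vetted are. Transit depth is simply the ratio of the planet’s disk to the star’s, and a quick calculation (round textbook radii; my own illustration, nothing from the Kepler pipeline) shows why candidate vetting is so delicate:

```python
# Transit depth is the area ratio of planet to star. Round textbook radii.
R_SUN_KM, R_EARTH_KM, R_JUPITER_KM = 696_000, 6_371, 69_911

def transit_depth_ppm(r_planet_km, r_star_km=R_SUN_KM):
    """Fractional dimming, in parts per million, for a central transit."""
    return (r_planet_km / r_star_km) ** 2 * 1e6

print(f"Earth analog:   {transit_depth_ppm(R_EARTH_KM):.0f} ppm")    # ~84 ppm
print(f"Jupiter analog: {transit_depth_ppm(R_JUPITER_KM):.0f} ppm")  # ~10,000 ppm
```

A dip of 84 parts per million is easily mimicked by grazing eclipsing binaries and background blends, which is why so many of those 328 candidates may not survive.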

Quick answers are unlikely because, in addition to the inexorable rhythms of good research, the Kepler team’s new policy, as recommended by a NASA advisory panel, may be to hold back data on 400 ‘objects of interest’ until February of 2011, by which time some false positives will be eliminated. The issue is tricky: researchers are usually allowed to keep data private until publication, allowing the kind of rigor that ensures the work’s accuracy. Caution in making media announcements that could lead to later retractions is completely understandable. The question is how long a delay is reasonable, given the multi-year period Kepler needs to confirm some detections. We should have a final decision soon on whether the strict policy will be put in place.

This comes up now because in June, Kepler is supposed to release data, exposing the information to a wider audience that can work with the material to confirm planets. The original policy would have required the team to turn over its first 43 days’ worth of data, but now we wait for a final decision on possible policy changes, due in a week or so. According to Nature News, the Kepler team had hoped to keep 500 objects private until the mission’s end in late 2013, so the subcommittee’s recommendation actually represents a compromise.

From the article:

Many astrophysics programmes allow researchers a proprietary period with the data. For instance, guest observers on the Hubble Space Telescope get exclusive use of their data for a year before public release. But the tradition for NASA Discovery missions — small, principal-investigator-led missions like Kepler — is to make calibrated data available immediately. That policy has already been changed once for Kepler, last year, when the team was given more than a year to pursue confirmations and work out the kinks in its data processing.

But Borucki says more time is needed because a mission launch delay meant that the team missed out on a season of the ground-based follow-up observations that are needed to verify candidate exoplanets. He also worries about releasing “half-baked” candidates that the media will jump on without an understanding of their uncertainty. “My worry is less of being scooped than it is of putting out inaccurate estimates of what exoplanets are really like out there,” he says.

The mission team’s views aren’t unanimously held by other astronomers, and the article quotes dissenting remarks from Scott Gaudi (Ohio State), who believes that other teams could help the Kepler crew confirm candidate planets. ESA’s Malcolm Fridlund, project scientist for CoRoT, works under a system in which data are kept proprietary for one year, but he’s quick to note that the upcoming PLATO (Planetary Transits and Oscillations of Stars) mission will set a different agenda, one that will require the immediate dissemination of data. I’m with Fridlund in believing that with fast release, “You get a larger community and you get a bigger workforce for free. It’s clear that the more people you get involved, the more support you get.”


Life Throughout the Solar System?

Just as SETI is redefining its parameters, astrobiology has been going through a shift that widens our notion of habitable zones. Not so long ago, the concept seemed simple. Take a Sun-like star and figure out at what distance a planet could maintain liquid water on its surface. Assume, in other words, that the life you’re looking for is more or less like what’s found on Earth, and therefore needs the same conditions to persist. Now we’re finding remote venues like Enceladus that remind us liquid water can turn up in unusual places, and we’ve parachuted a probe onto a world, Titan, where it’s not inconceivable that exotic forms of life can develop.
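That older, simple picture can be captured in a few lines: scale the Sun’s liquid-water boundaries by the square root of stellar luminosity. A minimal sketch, using the often-cited Kasting et al. (1993) solar limits of roughly 0.95 and 1.37 AU as illustrative inputs:

```python
import math

def habitable_zone_au(luminosity_lsun):
    """Scale the Sun's liquid-water limits by sqrt(L). The 0.95-1.37 AU
    solar boundaries follow Kasting et al. (1993); treat as illustrative."""
    scale = math.sqrt(luminosity_lsun)
    return 0.95 * scale, 1.37 * scale

# A Sun-like star, and an M-dwarf at 2% of solar luminosity:
print(habitable_zone_au(1.0))    # (0.95, 1.37)
print(habitable_zone_au(0.02))   # roughly (0.13, 0.19): the zone hugs the star
```

Tidy, but as Enceladus and Titan show, nature doesn’t respect the formula.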

Throw in the possibility that objects as distant as the Kuiper Belt may contain subsurface liquids and what used to be a constrained habitable zone seems to be vast indeed. And perhaps we’ve already found another living planet, as astrobiologist Dirk Schulze-Makuch tells Lee Billings in a recent interview. Along with David Darling, Schulze-Makuch is the author of the new book We Are Not Alone (Oneworld Publications, 2010), which fires an unexpected shot across the bow with its subtitle: Why We Have Already Found Extraterrestrial Life.

Image: Viking 2 image of Mars’ Utopian Plain. Credit: NASA.

The location of this ‘find,’ of course, is Mars: the authors argue that the Viking landers probably did discover life in the 1970s, despite all the subsequent analysis that seemed to rule it out. As Schulze-Makuch tells Billings:

In some ways the timing was bad for Viking. A lot of progress was made after its life-detection experiments were already on or on their way to Mars: The discovery of all the ecosystems at undersea hydrothermal vents, and the extremophile research of the early 1980s really changed how we think about life and its limitations. The Viking researchers thought life on Mars would be heterotrophic, feeding off abundant organic compounds distributed everywhere all over the Martian surface. That picture was wrong, and studies of extremophiles on Earth have made us think differently about Mars. Some people say Viking tried to do too much, too early, and as a result of its ambiguous results, nothing has happened with Martian life-detection experiments ever since.

Viking’s three life-detection experiments produced results that could not have been more frustrating. The Labeled Release Experiment produced a positive result, the Gas Exchange Experiment a negative one, and the Pyrolytic Release Experiment proved ambiguous. That left it up to Viking’s Gas Chromatograph Mass Spectrometer, says Schulze-Makuch, and its subsequent failure to detect organic matter led scientists to conclude Mars was barren of life.

But the latter result should raise eyebrows:

…the results of the GC-MS were always somewhat odd. This is because we know from the Martian meteorites that there are organics on Mars. Also, it’s been shown that the same instrument could not detect organics in the Dry Valleys of Antarctica or from hydrothermal soil, places on Earth where we know that a small but significant population of microbes makes a living. So the question is, why did the GC-MS not detect the organics present on Mars? Was the concentration too low? Were they in a form that was not detectable? Or, were they all oxidized to carbon dioxide before they could be measured as organics?

Schulze-Makuch opts for the last option and fits it into a hypothesis he and Joop Houtkooper have developed: that Martian organisms use a mixture of hydrogen peroxide and water as their intracellular fluid. The heat of the GC-MS experiment would have destabilized the hydrogen peroxide, oxidizing any organic compounds to carbon dioxide, which is just what the GC-MS detected. Life on a dry desert world might well adapt hydrogen peroxide and water for these purposes, and if so, Viking’s Gas Exchange and Pyrolytic Release experiments may simply have used too much water, overwhelming the Martian microbes.

Mars, so tantalizingly close to our older notions of a habitable zone, is the place we can explore most readily with our robotic instrumentation, but the real action may be much further out in the Solar System. That’s because the Earth is close enough to both Mars and Venus to have exchanged materials with each in the past, leading to the possibility of contamination. To find an independent start to life, what some are calling a ‘second genesis,’ Titan, Europa, Ganymede and Enceladus may be better venues to explore. Of Titan, Schulze-Makuch says:

Titan is a lower priority than Mars, since it is much, much harder to get there, but for finding life that is almost certainly of independent origin, Titan should be the top priority. If we understand organic chemistry correctly and the reactions behind it, it seems reasonable to think there should be life there. Even if we don’t find life there, we can still see how far organic chemistry can evolve in its prebiotic phase. Titan is a natural laboratory for that.

As our astrobiological ideas have widened, we’ve learned not only that we have numerous worlds to explore for life on nearby planets and moons, but that our own planet may contain forms of life we have barely begun to catalog. What we find will invariably lead to speculation about what direction life may have taken on planets around other stars. The big questions will persist: Is life as tenacious elsewhere as it proves to be on Earth? And even if so, is its origin more constrained than the conditions that allow it to spread? Getting past our experience of life on Earth demands preparing ourselves for unusual answers, and it’s not inconceivable that life may exist from the upper clouds of Venus to the remote edges of the Solar System.


An Archaeological Approach to SETI

Changing approaches to SETI are getting public attention these days, as witness a new article in The Economist that makes reference to the probable cause of the interest, the publication of Paul Davies’ The Eerie Silence (Houghton Mifflin Harcourt, 2010). Subtitled ‘Renewing Our Search for Alien Intelligence,’ Davies’ book makes accessible to the general public the kind of discussion we’ve often had in these pages, asking whether our SETI strategies at radio and optical wavelengths are too limited to have a real chance of success. The Economist piece is just one sign of the new interest.

After all, technologies like spread spectrum encoding are already masking straightforward radio communications, while conventional broadcasting is giving way to such heavy use of fiber optics that a planet like ours may go dark at radio wavelengths within a relatively short time as civilizations go, no more than an infinitesimal flicker in cosmological terms. Thus the interest in alternatives like hunting for Dyson spheres, a search Dick Carrigan has actively championed. The few searches for such spheres have been unsuccessful, but we’ve only begun to look for signs of the technologies of vastly more powerful cultures.

Dyson Spheres and Their Signature

In any case, Dyson spheres would be tough to locate, although it’s interesting to note that the Allen Telescope Array will investigate thirteen of what Carrigan calls the ‘least implausible Dyson sphere candidates’ once it becomes fully operational. Carrigan built up his list of candidates by going through data from the Infrared Astronomical Satellite (IRAS), which spotted hundreds of thousands of infrared sources. A Dyson sphere should radiate in the infrared, but it would have to be distinguished from natural sources like stars with thick dust shells, whose infrared signal might closely mimic that of an artificial construct.

Carrigan goes into all this on his Web site, where he notes that the Calgary Group has classified all the available low-resolution spectra from IRAS into categories including some that would help identify a Dyson sphere candidate. Thus U stands for a type of object with an ‘unusual spectrum showing a flat continuum,’ while F stands for ‘featureless’ objects that may correspond to O or C stars with small amounts of dust. The C category flags possible late-type cool giant stars with circumstellar shells of carbon dust and emission in the infrared, while H is for HII regions, encompassing various kinds of nebulae.
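In code, the winnowing Carrigan describes amounts to a filter over catalog classes and fitted temperatures. The sketch below is deliberately a toy, with invented rows and cuts rather than his actual selection criteria, but it shows the shape of the logic:

```python
# Toy candidate filter in the spirit of Carrigan's search: keep infrared
# sources with 'unusual' (U) or 'featureless' (F) low-resolution spectra
# and fitted blackbody temperatures plausible for a habitable-zone swarm.
# The catalog rows, class labels and cuts are invented for illustration.
sources = [
    {"name": "IRAS A", "lrs_class": "U", "t_fit_k": 290},
    {"name": "IRAS B", "lrs_class": "C", "t_fit_k": 310},   # carbon-star look-alike
    {"name": "IRAS C", "lrs_class": "F", "t_fit_k": 150},
    {"name": "IRAS D", "lrs_class": "H", "t_fit_k": 95},    # HII region
    {"name": "IRAS E", "lrs_class": "U", "t_fit_k": 1200},  # too hot for a swarm
]

candidates = [
    s for s in sources
    if s["lrs_class"] in ("U", "F") and 100 <= s["t_fit_k"] <= 600
]
print([s["name"] for s in candidates])   # ['IRAS A', 'IRAS C']
```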

All these could be considered Dyson sphere look-alikes, and Carrigan notes that more data are becoming available through the Spitzer Space Telescope and surveys like 2MASS. The key is to pick out an actual Dyson sphere signature from all the noise, one Carrigan describes this way:

…an advanced civilization inhabiting a solar system might break up the planets into very small planetoids or pebbles to form a loose shell that would collect all the light coming from the star. The shell of planetoids would vastly increase the available “habitable” area and absorb all of the visible light. The stellar energy would be reradiated at a much lower temperature. If the visible light was totally absorbed by the planetoids a pure Dyson Sphere signature would be an infrared object with luminosity equivalent to the hidden star and a blackbody distribution with a temperature corresponding to the radius of the planetoid swarm. For the case of the Sun with the planetoids at the radius of the Earth the temperature would be approximately 300 ºK.
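That figure of roughly 300 K falls out of simple blackbody bookkeeping. If each planetoid absorbs sunlight over its cross-section and re-radiates from its entire surface, the swarm’s temperature at a given radius follows directly; here’s a short sketch of the arithmetic (standard physical constants; the framing is mine, not Carrigan’s code):

```python
import math

SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
L_SUN = 3.828e26         # solar luminosity, W
AU = 1.496e11            # metres

def swarm_temperature_k(luminosity_w, radius_m):
    """Blackbody temperature of planetoids that absorb over their
    cross-section and re-radiate over their full surface."""
    return (luminosity_w / (16 * math.pi * SIGMA * radius_m**2)) ** 0.25

T = swarm_temperature_k(L_SUN, AU)
print(f"swarm at 1 AU: {T:.0f} K")                 # ~278 K, Carrigan's ~300 K
print(f"Wien peak: {2.898e-3 / T * 1e6:.1f} um")   # emission peaks near 10 um
```

An emission peak near 10 microns is why IRAS, a mid-infrared survey, is a natural hunting ground.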

Signs of a Kardashev Type II Culture

But back to The Economist, which notes that Dyson spheres are but one thing an advanced civilization might spend its time creating. A Kardashev Type II civilization is one capable of exploiting all the energy output of its star, leading to the possibility of interesting forms of stellar engineering. From the article:

Any civilisation that has built a Dyson sphere will have to have been around for a long time, of course—and in the very long run its star will start to change in unpleasant ways, ballooning to form a red giant. Another signature of advanced technology would be an attempt to slow this process down. Red giants are created when a star exhausts its supply of hydrogen at its core, with the result that the inner layer contracts and the outer layers expand, forming a redder and much larger star. If the star’s outer layers could be mixed into the core, that would slow the process of inflation down. And, presumably, a sufficiently advanced civilisation would try to do that if it could.

Let’s assume so. Fortunately for distant observers, such an engineering feat would show a particular signature, even if, like a Dyson sphere’s, it can be confused with natural phenomena:

Such a star would look odd, though. It would be bluer than it should be and would be of a type known to astronomers as a “blue straggler”. Although, again, there are perfectly natural reasons these might form. The universe, though, is an ancient place, so many civilisations could be very old indeed. Perhaps, then, it will be a sign like this—of a technological civilisation millions of years old—that is seen, rather than some upstart that has not even got its radio waves under control.

Leaving Assumptions Behind

New technologies will help us pursue other observational lines of research. Using space-based telescopes and, more recently, instruments on the ground, we’re learning how to characterize the atmospheres of extrasolar planets. The process is sure to pick up as we refine our techniques, and in the not so distant future we may be able to spot telltale signs of industrial activity on a distant world as well as probing its atmosphere for the chemical signs of life. Carrigan suggests chlorofluorocarbons (CFCs) as one possibility, molecules that could only point to technology, because no known natural process produces them.

All of these approaches take us in a new direction, one that, as Dick Carrigan notes, makes no assumptions about the motivation of the originating civilization. Indeed, he likens the hunt to a form of archaeology, one closer to the search for exoplanets than the attempt to snag a radio signal. For more, see Carrigan, “Starry Messages: Searching for Signatures of Interstellar Archaeology,” FERMILAB-PUB-09-607-AD (abstract), which I’ll be discussing in more detail soon, along with cross-references to Davies’ The Eerie Silence. For as Davies shows, there are many more ways to conceive how the activities of an extraterrestrial civilization could modify the heavens.


Confirming General Relativity at Large Scales

The discovery that the universe’s expansion is accelerating has led some to wonder whether General Relativity breaks down at large scales. But new work by Fabian Schmidt and colleagues at Caltech casts doubt on a rival theory known, economically enough, as f(R). Where General Relativity explains dark energy in terms of a cosmological constant, viewing it as the energy of empty space, f(R) takes another tack, seeing the cosmic acceleration as the result of a modification of gravitational theory itself. Such a modification would alter the way matter clumps over time into galaxy clusters, and that provides a useful way to test the theory.

Schmidt and team took mass estimates of 49 galaxy clusters from Chandra X-ray Observatory observations and compared them with theoretical predictions and with data from supernova studies, the cosmic microwave background and the large-scale distribution of galaxies. Compellingly, they found no deviation from gravity as described by General Relativity on scales larger than 130 million light years. Says Schmidt:

“This is the strongest ever constraint set on an alternative to General Relativity on such large distance scales. Our results show that we can probe gravity stringently on cosmological scales by using observations of galaxy clusters.”

In essence, the growth of galactic clusters is being used as a test of modified gravity scenarios at the largest levels, and may turn out to be useful in investigating alternate models growing out of string theory and other theories invoking higher dimensions. It’s helpful in this regard to see that a second, independent study tests gravity across cosmological distances. Tapping Chandra data along with observations from the ROSAT telescope, a team at Stanford University examined the growth of galaxy clusters to see how their evolution fits with the predictions of General Relativity. And for a second time, GR wins out, says David Rapetti (Kavli Institute for Particle Astrophysics and Cosmology), who led the study:

“Einstein’s theory succeeds again, this time in calculating how many massive clusters have formed under gravity’s pull over the last five billion years. Excitingly and reassuringly, our results are the most robust consistency test of General Relativity yet carried out on cosmological scales.”

Image: This composite image of the galaxy cluster Abell 3376 shows X-ray data from the Chandra X-ray Observatory and the ROSAT telescope in gold, an optical image from the Digitized Sky Survey in red, green and blue, and a radio image from the VLA in blue. The “bullet-like” appearance of the X-ray data is caused by a merger, as material flows into the galaxy cluster from the right side. The giant radio arcs on the left side of the image may be caused by shock waves generated by this merger. Credit: X-ray (NASA/CXC/SAO/A. Vikhlinin; ROSAT), Optical (DSS), Radio (NSF/NRAO/VLA/IUCAA/J.Bagchi).

Why are galaxy clusters so useful? Their growth is influenced both by the expansion rate of the universe and by the properties of gravity over large scales. Other methods, such as observing supernovae or the large-scale distribution of galaxies, measure distances that depend only on the expansion rate and are not sensitive to the properties of gravity. It’s the growth rate of cluster structure that these new studies exploit, and because that growth is driven by gravity, the observations are a telling indicator of how gravity operates at the cosmological level.
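To make this concrete, the quantity such studies lean on is the linear growth factor D(a), which obeys an ordinary differential equation whose source term encodes the strength of gravity. The sketch below integrates it for flat ΛCDM with round parameter values of my choosing, using a crude ‘rescaled G’ stand-in for modified gravity rather than a genuine f(R) calculation; even a 10% change in the source term shifts the predicted growth noticeably:

```python
# Linear growth factor D(a) for flat LCDM; geff rescales Newton's constant
# as a crude stand-in for modified gravity (geff = 1 recovers GR).
import numpy as np
from scipy.integrate import solve_ivp

OM, OL = 0.3, 0.7                      # round matter / dark-energy densities

def E2(a):                             # (H/H0)^2 for flat LCDM
    return OM / a**3 + OL

def growth_ode(a, y, geff):
    """y = [D, dD/da]."""
    D, Dp = y
    dlnE_da = -1.5 * OM / (a**4 * E2(a))
    Dpp = -(3.0 / a + dlnE_da) * Dp + 1.5 * geff * OM / (a**5 * E2(a)) * D
    return [Dp, Dpp]

a0 = 1e-3                              # start deep in matter domination, D ~ a
for geff, label in [(1.0, "GR"), (1.1, "gravity 10% stronger")]:
    sol = solve_ivp(growth_ode, (a0, 1.0), [a0, 1.0], args=(geff,), rtol=1e-8)
    print(f"{label}: growth since a=0.001 is x{sol.y[0, -1] / a0:.0f}")
```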

The first paper is Schmidt et al., “Cluster Constraints on f(R) Gravity,” Physical Review D 80, 083505 (2009). Preprint available, and abstract here. The second team’s paper has been accepted by Monthly Notices of the Royal Astronomical Society. See the preprint of Mantz et al., “The Observed Growth of Massive Galaxy Clusters 1: Statistical Methods and Cosmological Constraints,” with other papers in the series likewise available at the arXiv site. This Chandra news release is helpful.
