SETI: A Detectable Neutrino Signal?

Somehow I never thought of the IceCube neutrino telescope as a SETI instrument. Its sensor strings are deployed in holes drilled 1,450 to 2,450 meters deep into the Antarctic ice, instrumenting more than a cubic kilometer of it, and the observatory is fine-tuned to detect neutrinos. That makes it a useful tool for studying violent events like galactic collisions and the formation of quasars, providing insights into the early universe. But SETI?

Perhaps, says Zurab Silagadze (Novosibirsk State University), who notes that most SETI work in the past has focused on centimeter wavelength electromagnetic signals. Says Silagadze:

Here we question this old wisdom and argue that the muon collider, certainly in reach of modern day technology… provides a far more unique marker of civilizations like our own [type I in Kardashev’s classification]… Muon colliders are accompanied by a very intense and collimated high-energy neutrino beam which can be readily detected even at astronomical distances.


Image: The IceCube array in the deep ice, with Eiffel Tower suggesting scale. The dark cylinder is the AMANDA detector, incorporated into IceCube. Credit: NSF.

Muons are elementary particles that, like all such particles, have a corresponding antiparticle of opposite charge. Because they have no known substructure, muons and antimuons offer interesting opportunities for a collider. Their advantage over protons is that, as point particles, they deliver all of their energy to a collision; the effective collision energy is about ten times higher than for a proton beam of the same energy, since a proton’s energy is shared among its constituent quarks and gluons. Moreover, muons are some 200 times heavier than electrons and so produce far less synchrotron radiation, which means you can reach higher energies with a cheaper collider of smaller circumference. Here’s a short backgrounder from Physics World on this.
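The synchrotron advantage is easy to quantify: at a fixed beam energy and ring radius, radiated power scales as the inverse fourth power of the particle mass. A rough sketch, using standard rest-mass values:

```python
# Back-of-envelope: synchrotron energy loss scales as (E/m)^4 / r
# for a particle of energy E and mass m in a ring of radius r, so at
# the same E and r the heavier particle radiates (m_light/m_heavy)^4
# as much power.
M_ELECTRON_MEV = 0.511    # electron rest mass, MeV/c^2
M_MUON_MEV = 105.66       # muon rest mass, MeV/c^2

def synchrotron_suppression(m_light, m_heavy):
    """Factor by which the heavier particle radiates less power
    at the same beam energy and ring radius."""
    return (m_light / m_heavy) ** 4

factor = synchrotron_suppression(M_ELECTRON_MEV, M_MUON_MEV)
print(f"Muons radiate ~{1 / factor:.1e} times less than electrons")
```

The resulting suppression, roughly two billion, is why a muon ring can be compact where an electron ring of the same energy would be ruinously lossy.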

So it makes sense that, if we can get around formidable practical challenges, we’ll eventually want to develop a muon collider. So, presumably, would an extraterrestrial civilization. And indeed, Silagadze discusses the practical uses of a high-energy neutrino beam in, for example, the study of the inner structure of a planet, or the use of collimated neutrino beams for communications. A 1979 paper by Mieczyslaw Subotowicz went so far as to argue that advanced cultures might deliberately choose neutrino channels for interstellar communications to shut out immature emergent civilizations from the ongoing conversation.

For that matter, is it possible that neutrinos could be used to set interstellar time standards? Note the following from Silagadze’s paper, which places these ideas in the context of the Kardashev scale for measuring the growth of technological civilizations:

Neutrino SETI was also proposed earlier with somewhat different perspective… It was suggested that type II (which have captured all of the power from their host star) and type III civilizations, spread throughout the Galaxy, may require interstellar time standards to synchronize their clocks. It is argued that mono-energetic 45.6 GeV neutrino pulses… produced in a futuristic dedicated electron-positron collider of huge luminosity may provide such standards. If there is an extraterrestrial civilization of this type nearer than about 1 kpc using this synchronization method, the associated neutrinos can be detected by terrestrial neutrino telescopes with an effective volume of the order of km³ of water…
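The oddly specific 45.6 GeV figure is no accident: it is half the Z boson mass. An electron-positron collider tuned to the Z resonance produces Z particles essentially at rest, and each Z decaying to a neutrino-antineutrino pair emits two back-to-back neutrinos carrying half the Z mass apiece. A quick check:

```python
# Why 45.6 GeV? An e+e- collider tuned to the Z resonance produces
# Z bosons at rest; each Z -> nu nu-bar decay yields two back-to-back
# neutrinos, each carrying half the Z mass (neutrino masses are
# negligible at this scale).
M_Z_GEV = 91.19  # Z boson mass, GeV/c^2

neutrino_energy = M_Z_GEV / 2
print(f"Mono-energetic neutrino line at {neutrino_energy:.2f} GeV")
```

A sharp spectral line at exactly this energy would be hard to explain astrophysically, which is what makes it attractive as an artificial beacon.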

IceCube, anyone? The beauty of neutrino SETI is that it can readily run in the background of concurrent neutrino-based astrophysical studies. Thus keeping an eye out for possibly artificial high-energy neutrino signals produced in muon colliders light years away makes a certain degree of sense. Will it succeed? Silagadze quotes Cocconi and Morrison’s classic paper: “The probability of success is difficult to estimate: but if we never search, the chance of success is zero.”

The paper is Silagadze, “SETI and muon collider,” Acta Physica Polonica B39 (2008), pp. 2943-2948 (available online). The paper on neutrino channels for interstellar communications is Subotowicz, “Interstellar Communication by Neutrino Beams,” Acta Astronautica 6 (1979), pp. 213-220 (abstract).

TESS Mission Fails to Make the Cut

NASA has made its choices, and TESS is not one of them. The Transiting Exoplanet Survey Satellite would have used six telescopes to observe the brightest stars in the sky, a remarkable 2.5 million of them, hoping to find more than 1,000 transiting planets ranging from Jupiter-class giants down to rocky worlds like our own. An entrant in the agency’s Small Explorer program, TESS could have accelerated the timeframe for discovering another habitable world, assuming all went well.

Not that we don’t have Kepler at work on 100,000 distant stars, looking for transits that can give us some solid statistical knowledge of how often terrestrial (and other) planets occur. And, of course, the CoRoT mission is actively in the hunt. But TESS would have complemented both, looking at a wide variety of stars, many of which would have been M-dwarfs. Not long ago I referred to a Greg Laughlin post that noted a 98 percent probability that TESS would locate a potentially habitable transiting planet orbiting a red dwarf within 50 parsecs of the Earth.
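Laughlin’s optimism about red dwarfs has a simple geometric basis: for a circular orbit, the probability that a planet transits as seen from a random direction is roughly the stellar radius divided by the orbital distance, and M-dwarf habitable zones sit very close in. A sketch with illustrative numbers (a 0.3 solar-radius star with a habitable-zone planet at 0.1 AU; these are assumptions for scale, not TESS mission figures):

```python
# Geometric transit probability ~ R_star / a for a circular orbit
# with a >> R_star. Illustrative comparison: a small M dwarf's
# habitable zone versus the Sun-Earth case.
R_SUN_KM = 6.957e5   # solar radius, km
AU_KM = 1.496e8      # astronomical unit, km

def transit_probability(r_star_rsun, a_au):
    """Chance a randomly oriented circular orbit shows transits."""
    return (r_star_rsun * R_SUN_KM) / (a_au * AU_KM)

p_mdwarf = transit_probability(0.3, 0.1)   # ~1.4 percent
p_earth = transit_probability(1.0, 1.0)    # ~0.5 percent
print(f"M-dwarf HZ: {p_mdwarf:.1%}, Sun-Earth analog: {p_earth:.2%}")
```

Roughly a threefold geometric edge per star, multiplied across the sheer number of nearby M-dwarfs, is what drives the high odds of a detection.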

Were that the case, the results could have been handed over to the James Webb Space Telescope, scheduled for launch near the end of the putative TESS mission, for further investigation. JWST, so the thinking goes, could then take a spectrum and tell us something about conditions in that planet’s atmosphere. Retrieving data from the atmospheres of such planets is crucial to astrobiology and we’ll get it done one day, but perhaps not as soon as we hoped.

Getting a mission into space is no easy matter in the best of times (see Alan Boss’ The Crowded Universe for vivid proof of this). Consider that the two Small Explorer (SMEX) finalists were chosen from an original 32 submitted in January of 2008. The SMEX missions are capped at $105 million each, excluding the launch vehicle. That cost would depend on the vehicle — the last time I looked, an Atlas V would command $130 million. We’re talking relatively small investment for a solid scientific return, even if that return doesn’t include exoplanetary results on this round.

One of the two proposals now to be developed into full missions is the Interface Region Imaging Spectrograph, which will use a solar telescope and spectrograph to look at the Sun’s chromosphere. The other is the Gravity and Extreme Magnetism SMEX mission, which will measure the polarization of X-rays emitted by neutron stars and stellar-mass black holes, as well as the massive black holes found at the centers of galaxies.

Given that one of NASA’s stated aims with the SMEX program is “…to raise public awareness of NASA’s space science missions through educational and public outreach activities” (see this news release), the agency may have missed an opportunity with TESS. We’re close to the detection, through radial velocity or transit studies, of a terrestrial planet around another star. That’s going to put the study of that planet’s atmosphere for life signs high on everyone’s agenda, including the public’s. From the PR perspective, TESS was a gold-plated winner.

Brute-Force Engineering and Climate

The eruption of Mt. Tambora in Indonesia in 1815 pumped so much sulfur dioxide into the stratosphere that New England farmers found their fields frosted over in July. Climate change, it seems, can be quick and overwhelming, at least on short timescales. The eruption of Mt. Pinatubo in the Philippines in 1991 cooled global temperatures for several years by about half a degree Celsius. Sulfur dioxide works.

So how about this: we send a fleet of airships high into the stratosphere, tethered to hoses that pump ten kilograms of sulfur dioxide skyward every second. The airships then spew the gas into the upper atmosphere, an aerosolized pollutant that, turning the skies Blade Runner red, shields the planet from the Sun’s heat. Call it geo-engineering, an extreme form of human climate manipulation that is the subject of a recent story in The Atlantic.
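For scale, here is a quick tally of what ten kilograms per second amounts to over a year, set against a rough literature figure of about 20 million tonnes of sulfur dioxide injected by Pinatubo (my comparison, not a number from the Atlantic piece):

```python
# What does 10 kg of SO2 per second amount to? An annual total,
# compared (illustratively) with Pinatubo's estimated ~20 million
# tonnes of stratospheric SO2 in 1991.
RATE_KG_PER_S = 10
SECONDS_PER_YEAR = 365.25 * 24 * 3600

annual_tonnes = RATE_KG_PER_S * SECONDS_PER_YEAR / 1000
pinatubo_tonnes = 20e6  # rough literature estimate

print(f"~{annual_tonnes:,.0f} tonnes per year, "
      f"or {annual_tonnes / pinatubo_tonnes:.1%} of a Pinatubo annually")
```

A few hundred thousand tonnes a year is a small fraction of a single large eruption, which is part of why proponents consider the scheme technically, if not politically, plausible.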

Into the Anthropocene

Writer Graeme Wood notes that our activities have been transforming the planet for centuries now, leading some to dub our era the ‘anthropocene’ period. Says Wood:

…humans have reshaped about half of the Earth’s surface. We have dictated what plants grow and where. We’ve pocked and deformed the Earth’s crust with mines and wells, and we’ve commandeered a huge fraction of its freshwater supply for our own purposes. What is new is the idea that we might want to deform the Earth intentionally, as a way to engineer the planet either back into its pre-industrial state, or into some improved third state. Large-scale projects that aim to accomplish this… constitute some of the most innovative and dangerous ideas being considered today to combat climate change.

For it turns out that the sulfur dioxide idea is just one among many. Scottish engineer Stephen Salter discusses a strategy using a fleet of 1,500 ships to churn seawater, spraying it high into the clouds to add moisture and make them more reflective. Roger Angel (University of Arizona) proposes a series of huge electromagnetic guns that would launch a sun-shield made up of millions of Frisbee-sized disks to the L1 Lagrangian point, scattering a fraction of incoming sunlight.

Freelancing Global Climate Change

The danger here, besides the unintended consequences that could so quickly attend such schemes, is that international cooperation on climate change could quickly be rendered irrelevant. Solutions like sulfur dioxide are cheap enough — $100 billion would be enough, Wood says, to reverse anthropogenic climate change entirely, and it might cost far less — that a single country could take on the challenge itself.

Wood turns to geophysicist Raymond Pierrehumbert (University of Chicago) for thoughts on possible unintended consequences of the sulfur dioxide strategy. The geophysicist reminds him of the Greek legend of Dionysius II, who to make a philosophical point suspended a sword over Damocles’ head from a single hair:

According to Pierrehumbert, sulfur aerosols would cool the planet, but we’d risk calamity the moment we stopped pumping: the aerosols would rain down and years’ worth of accumulated carbon would make temperatures surge. Everything would be fine, in other words, until the hair snapped, and then the world would experience the full force of postponed warming in just a couple of catastrophic years. Pierrehumbert imagines another possibility in which sun-blocking technology works but has unforeseen consequences, such as rapid ozone destruction. If a future generation discovered that a geo-engineering program had such a disastrous side effect, it couldn’t easily shut things down. He notes that sulfur-aerosol injection, like many geo-engineering ideas, would be easy to implement. But if it failed, he says, it would fail horribly. “It’s scary because it actually could be done,” he says. “And it’s like taking aspirin for cancer.”

A Carbon-Cutting Alternative

Cutting carbon emissions seems like a far preferable solution, and a geo-engineering strategy along these lines goes back to Freeman Dyson, who suggested as early as 1977 creating forests of trees engineered to draw carbon from the air more effectively, trapping it in the topsoil. Other carbon-withdrawal schemes call for large towers whose grids would be coated with a chemical solution that binds carbon-dioxide molecules; one such design, by David Keith (University of Calgary), would stash the captured carbon deep underground.

Ponder the implications when any one of the 38 people on the planet with $10 billion or more in private assets could try to reverse climate change single-handedly. There’s one more candidate solution to the Fermi paradox — technological civilizations run afoul of their own technology as the cost of tackling massive projects drops to the point where individuals or small groups can destroy an ecosystem while attempting to fix it. Is a game-changing technology to fix climate change worse than the problem? Perhaps a more judicious view is that a technological big fix is what Wood calls “…the biggest and most terrifying insurance policy humanity might buy — one that pays out so meagerly, and in such foul currency, that we’d better ensure we never need it.”

Huge Outburst from a Magnetar

We get yet another example of space-based observatories complementing each other with the recent outburst of X-rays and gamma rays detected last August. The Swift satellite first noted the event on August 22, while the European Space Agency’s XMM-Newton satellite began making detailed spectral studies of the radiation twelve hours later, followed by the Integral observatory. The outburst went on for more than four months, accompanied by hundreds of smaller bursts.

The source of these events was a magnetar, a type of neutron star that is the most highly magnetized object known, with a magnetic field some 10,000 million times stronger than Earth’s. The new magnetar, christened SGR 0501+4516, belongs to the class known as Soft Gamma-Ray Repeaters (SGRs) and is the first such object found in the last decade. Magnetars are known for spectacular periods of irregular burst activity, their luminosity changing by up to ten orders of magnitude on timescales of just a few milliseconds.
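Taking ESA’s multiplier at face value, and assuming Earth’s surface field is roughly 0.5 gauss (my assumption for scale, not a figure from the release), the numbers work out to:

```python
# Scale of a magnetar's field: ESA's "10,000 million times Earth's,"
# with Earth's surface field taken as ~0.5 gauss (an assumed round
# value for illustration).
EARTH_FIELD_GAUSS = 0.5
FACTOR = 1e10  # 10,000 million

magnetar_gauss = EARTH_FIELD_GAUSS * FACTOR
magnetar_tesla = magnetar_gauss * 1e-4  # 1 tesla = 10^4 gauss

print(f"~{magnetar_gauss:.0e} gauss (~{magnetar_tesla:.0e} tesla)")
```

For comparison, the strongest sustained laboratory magnets reach tens of tesla; the field implied here is several orders of magnitude beyond that, and the most extreme magnetars are stronger still.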


We occasionally discuss doomsday scenarios from nearby supernovae and the like, speculating on conditions that could limit planetary habitability, but in these troubled financial times, ESA’s own comparison may be even more dire. Speculating on what effect a magnetar at half the Moon’s distance from the Earth would have, ESA notes that its magnetic field would wipe the data off every credit card on the planet (more here).

Image: Illustration of a magnetar. Magnetars are the most intensely magnetized objects in the Universe. Their magnetic fields are some 10,000 million times stronger than Earth’s. Credit: NASA.

SGR 0501+4516, though, is about 15,000 light years away, unknown until the recent outburst. We still have much to learn about magnetars, including the crucial question of how they form. With only fifteen known in the Milky Way, competing formation ideas circulate. One holds that a magnetar is the remnant core of a highly magnetic star at the end of its life; the other invokes the death of a normal star whose core is spun up in its final convulsions, driving a dynamo that amplifies the magnetic field. Both ideas are still in play.

Up next is observation of SGR 0501+4516 in its quiescent state, so that those data can be compared with the results from the outburst. XMM-Newton will be on the case next year. The paper is Rea et al., “The first outburst of the new magnetar candidate SGR 0501+4516,” published online in Monthly Notices of the Royal Astronomical Society (June 12, 2009). Preprint available. Interesting to see that the paper suggests the distinction between SGRs and the other type of magnetar — Anomalous X-Ray Pulsars (AXPs) — may not stand up to scrutiny. We may be on our way to unifying these sub-groups into a single family of magnetars.

Two Angles on Meteorites

Meteorites are in the news in two starkly different ways this week, but I’ll lead with a story that has implications for how planetary systems like ours are born. Philipp Heck (University of Chicago) and colleagues have been analyzing interstellar grains from the Murchison meteorite, a large object that fell near the town of Murchison, Victoria in Australia in 1969.


The Murchison grains are thought to have been blown into space by dying stars long before the formation of Earth. We’d like to know more about such grains because they became incorporated into the earliest solids forming in the Solar System, and hence offer a window into that era. Moreover, their composition helps us understand a bit more about their history. “The concentration of neon,” says Heck, “produced during cosmic-ray irradiation, allows us to determine the time a grain has spent in interstellar space.”

Image: A fragment of the Murchison meteorite. Copyright New England Meteoritical Services, 2001.

The interstellar travel time for these grains, however, turns out to be less than expected, falling into a range between three million and 200 million years, less than half the previous 500 million year estimate for interstellar space exposure. In fact, of the 22 grains studied, only three lived up to the longer estimate. A period of intense star formation one to two billion years before the Sun’s birth, one that produced large quantities of dust, could account for the early incorporation of the grains.
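Heck’s neon method can be sketched schematically: cosmogenic ²¹Ne accumulates at a roughly constant rate while a grain is exposed to cosmic rays, so the measured concentration divided by the production rate gives the exposure age. The numbers below are purely hypothetical, not values from the Heck et al. paper:

```python
# Cosmic-ray exposure dating, schematically: cosmogenic 21Ne builds up
# at a roughly constant production rate P during exposure, so
# exposure age t = (measured 21Ne) / P. Units cancel as long as the
# concentration and the per-Myr production rate use the same base unit.
def exposure_age_myr(ne21_measured, production_rate_per_myr):
    """Exposure age in Myr from a cosmogenic 21Ne concentration and
    its production rate (hypothetical illustrative inputs)."""
    return ne21_measured / production_rate_per_myr

# Hypothetical grain: 50 concentration units of 21Ne, produced at
# 1 unit per million years of exposure.
print(f"Exposure age ~ {exposure_age_myr(50.0, 1.0):.0f} Myr")
```

The real analysis must also correct for neon produced before ejection and after the grain joined the meteorite parent body, which is where much of the measurement difficulty lies.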

Seen in one light, objects like the Murchison meteorite are storehouses of information about the early Solar System. Seen in another, they and their larger cousins form a class of dangerous objects that could pose a threat we need to anticipate. On that score, we have, until recently, studied incoming objects by using satellites built to detect nuclear bomb tests. They’ve done priceless double-duty, offering up data on asteroids and meteoroids entering the atmosphere. Leonard David points out in a recent story that most incoming objects are never seen from the ground because they fall over ocean or in broad daylight.


Image: A fireball over Norway in 1998. Credit: Arne Danielsen.

But data on incoming bolides are now being treated as classified, and that’s bad news for the study of future impacts. The asteroid 2008 TC3, for example, was the first case of an astronomical detection of such an object before it hit. Data on the subsequent fireball from surveillance satellites proved crucial in locating the impact zone of this object in the Sudan. Tracking smaller impactors is a key part of the overall plan to study and identify dangerous Near Earth Objects, so let’s hope the scientific community can make this case clearly enough to cause policy-makers to reconsider.

The paper on the Murchison grains is Heck et al., “Interstellar Residence Times of Presolar SiC Dust Grains from the Murchison Carbonaceous Meteorite,” Astrophysical Journal 698 (2009), pp. 1155-1164 (abstract). A University of Chicago news release is available.