NASA: The Hunt for Good Ideas

Is NASA going to start pushing back into the realm of truly innovative ideas? Maybe so, to judge from what Robert Braun continues to say. Braun, who joined the agency in February, is NASA’s chief technologist, heading a recently revived office that coordinates mission-specific technologies at the ten NASA centers. This story in IEEE Spectrum notes that Braun is soliciting ‘disruptive technologies’ through a series of ‘grand challenges.’ Most of these relate to short-term space activities such as Earth observation missions, but enhancing robotics and pushing new ideas in space propulsion have obvious implications for deep space operations.

From Susan Karlin’s story at the IEEE Spectrum site:

The grand challenges address three areas: accessing space more routinely, managing space as a natural resource, and future quests. Achieving these goals mostly boils down to improvements in spacecraft propulsion, energy use, and safety; advances in astronaut health, communication technology, and artificial intelligence; a better understanding of near-Earth environments, such as meteors, solar wind, and cosmic rays; observations of climate change and predicting natural disasters; and searching for extraterrestrial life and Earth-like worlds.

That’s a pretty wide range, but what I find encouraging is Braun’s dogged emphasis on open competition. Recall that NASA’s now defunct Institute for Advanced Concepts purposely encouraged work from outside the agency to avoid the ‘not invented here’ syndrome. Braun’s office says the agency is now looking for ideas from anywhere in the world, including academia, private spaceflight and aeronautics firms, and individual inventors. All ideas will go through technical peer review, with winners ultimately chosen by Braun and his team and supported by grant and prize money of up to $1 million per project to assist in developing the technology.

The article quotes Braun on the process:

“I’m talking about an open competition model from an open community of innovators,” says Braun. “Not where we say, ‘Here’s a solicitation, and if you work for the government or a university, you can compete for this award.’ I’m talking about strategically defining the technologies needed over the next 10 or 20 years, putting those capabilities on the street in a competitive bid, and then having the community—folks in government, academia, industry, and citizen innovators working in their garages—form ad hoc teams on their own.”

The open model served NIAC well, and if it is true that the new mandate will also involve resurrecting NIAC itself, so much the better. Have a look sometime at the still-available NIAC site under NIAC Funded Studies to see the range of work Robert Cassanova and team oversaw at the Institute, everything from antimatter collection to redesigning living organisms for non-terrestrial environments. Getting the Institute back in operation would be a solid win for those advocating a return to the study of futuristic concepts as part of NASA’s mission.

When Deborah Gage interviewed Braun in August about NIAC, he had this to say:

Looking longer term, there’s the NIAC [NASA Institute for Advanced Concepts], and we’re proud to be bringing NIAC back—it’s one of the 10 programs in Space Technology. It’s modest dollar value, but it was a great way in my opinion for NASA to engage external innovators in small and larger businesses and academia to get their visions of the future.

One problem NIAC had previously was that it was so revolutionary, with 40-years-and-out system concepts, that there were no technology programs to carry along the innovator’s idea. So the innovator would win funding and study the concept for a year and there would be no place for that idea to go.

Now we have a way to transition a NIAC idea from concept to flight, and we’ve worked hard on that.

All this is developing out of a National Research Council report a year ago that called for NIAC’s return, a report headed up by Braun while he was still at the Georgia Institute of Technology in Atlanta. Among the things the report noted: over a nine-year period, NASA invested $36.2 million in NIAC, covering 168 grants. Some of these grants received a total of $23.8 million from outside organizations, a sign of their ability to attract partners, and 28 percent of the 42 Phase II grant projects lived on after NIAC funding was terminated.

NIAC’s innovative ideas, in other words, have proven significant, and if NASA is going to return to a culture of innovation, an organization within the agency has to spearhead the effort. We’ll see how all this develops in the context of Braun’s ‘grand challenges’ and the mandate for finding good ideas whatever their source. More on the grand challenges will appear soon on the Office of the Chief Technologist’s Web site, while the Centennial Challenges prize program for ‘citizen inventors’ is already online. Braun recommends studying the National Academies’ decadal surveys to ponder which technologies are most likely to need a new perspective.

Dust and Fast Missions

The recent debate between Jean Schneider (Paris Observatory) and Ian Crawford (University of London) is the sort of dialogue I’d like to see more of in public forums. When I began researching Centauri Dreams (the book) back in 2002, I was deeply surprised by the sheer energy flowing into interstellar flight research. True, it lacked focus and tended to be done by researchers in their spare time, as opposed to being funded by universities or government agencies, but I had not realized that the topic itself was under such serious investigation by so many scientists.

All those fascinating concepts, from laser sails to fusion runways, were the catalyst for this site, where keeping an eye on the ongoing discussion is the order of the day. In an era of short-term thinking and instant gratification through one gadget or another, taking a longer look at the human enterprise and where it is going is an imperative. One way to do that is to consider whether our species has a future in deep space, and just what the shape of that future might be. Discussions like Schneider’s and Crawford’s look long-term, at what we might one day accomplish with our technologies, and whether or not interstellar missions really are feasible.

We can surely say this much: Nothing in physics rules out interstellar flight, even though accomplishing it in the relatively near term would require extremely long mission times for robotic vehicles whose expense would dwarf their potential utility, even if we had the patience to wait out their journey. Let’s hope that by continued research we can learn to do better, and that means taking existing concepts and reworking them in light of new technologies to see what the possibilities are. All of which is fascinating stuff, and necessary as foundation-building even if we are, as seems most likely, one or more centuries away from the launch of any such mission.

Measurements of the Medium

First steps need to be taken in any enterprise. On that score, I remind you of the interstellar dust issue that Jean Schneider first raised in his original story in Astrobiology. We’re worried about a fast-moving vehicle (Schneider talks about 0.3c, though he and Crawford later drop the number to 0.1c) and the prospects of its encountering dust grains that could produce lethal damage. We’ll eventually need precursor missions to the edge of the Solar System and beyond to understand how the interstellar medium differs from interplanetary space in terms of dust.

In the afterword to his novel Flying to Valhalla, Charles Pellegrino makes a vivid case for potential disaster:

“Flying through space at significant fractions of lightspeed is like looking down the barrel of a super particle collider. Even an isolated proton has a sting, and grains of sand begin to look like torpedoes.”

Much data gathering is ahead, but the process has already begun. For now, we have New Horizons on its way to Pluto/Charon and the Kuiper Belt, taking useful readings through the mission’s dust counter, which is, by a happy choice, named after Venetia Burney, the English girl who named Pluto (Michael Byers’ account of the Pluto naming discussion in his novel Percival’s Planet is wonderful). New Horizons is now just beyond the orbit of Uranus and, as I write this morning, is 2 hours and 32 minutes light time (just over 18 AU, or 2,738,437,000 kilometers) from the Earth.

Image: An artist’s view of New Horizons approaching the Pluto/Charon flyby. Credit: SwRI.

Instruments that have measured dust beyond the orbit of Jupiter have been rare, beginning with those aboard the Pioneer 10 and Pioneer 11 spacecraft and followed by the dust analyzer aboard Cassini (more about this, and Voyager 2’s contribution, in a moment). The New Horizons dust counter is fairly straightforward, a thin plastic film on a honeycombed aluminum structure about the size of a cake pan mounted on the outside of the spacecraft. Each dust particle that strikes the detector sets off a unique signal, allowing its mass to be inferred. The fact that this is a student-built project (though with NASA engineering standards and professionally built flight instruments) brings home the excitement of this deep space mission in a way that, let’s hope, will galvanize future scientists and engineers.

Voyager 2 was also useful in dust measurements, though in a more indirect way. While the Pioneers carried dust counters, Voyager 2 detected space-borne dust through its effect on the spacecraft’s plasma wave instrument. The latter was designed to measure charged particles inside the magnetic fields of the gas giant planets, but usefully enough, it also registered a hit when the spacecraft encountered dust, picking up the plasma the vaporized particle created. Cassini carried a Cosmic Dust Analyzer to measure interplanetary dust grains beginning with its gravity assist at Venus in 1999 and lasting until arrival at Saturn in 2004.

It will be fascinating to see how New Horizons’ dust data compare with those from earlier missions (thus far what is being seen agrees with data from the Galileo and Ulysses missions in Jupiter space). What we learn about dust in the Kuiper Belt will be entirely new information as New Horizons speeds through this vast region, giving us clues about what an interstellar precursor mission might one day encounter. Some scientists, Schneider among them, believe dust could be a showstopper for probes moving at 10 percent or more of the speed of light. If that proves to be the case, we’re in for serious rethinking of interstellar mission concepts.

Interstellar Flight and Long-Term Optimism

It’s fascinating to watch the development of online preprint services from curiosity (which is what the arXiv site was when Paul Ginsparg developed it in 1991) to today’s e-print options, hosted at Cornell and with mirrors not just at the original Los Alamos National Laboratory site but all around the world. Then, too, the arXiv is changing in character, becoming an early forum for discussion and debate, as witness Ian Crawford’s comments on Jean Schneider’s Astrobiology paper. We looked at Crawford’s criticisms of Schneider yesterday. Today we examine Schneider’s response, likewise a preprint, and published online in a fast-paced digital dialogue.

Schneider (Paris Observatory) focuses here on nuclear fusion and antimatter by way of making the case that interstellar flight will be a long and incredibly difficult undertaking. A bit of context: Schneider’s real point in the original Astrobiology piece wasn’t to offer a critique of interstellar flight ideas, but to call attention to the gap that will occur after we have made the first detection of biosignatures on exoplanets. We’ll have evidence of life when that occurs, but it may be centuries, in Schneider’s view, before we know what that life looks like, because unlike relatively nearby places like Mars, we’ll find it a monumental undertaking to send a probe to an exoplanet.

Antimatter and Its Dilemmas

Crawford, of course, questioned whether interstellar flight was as difficult to imagine as Schneider believed, and the two remain on separate paths. Schneider’s concerns about antimatter are solid: He’s looking at what it would take to produce the antimatter needed to drive a 100-ton spacecraft, and finds that the vehicle would require some 10²⁷ erg at 0.1c, the velocity Crawford uses as a base. And the problem here is daunting, for producing the requisite antimatter would demand about 200 terawatts of power over ten years of continuous production. Today’s total instantaneous power production on Earth is about 20 terawatts.
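
To make these figures concrete, here is a quick back-of-envelope check in Python, a minimal sketch using only the numbers quoted above (a 100-ton craft, 0.1c, 200 terawatts over ten years); the non-relativistic kinetic energy formula and the neglect of propellant mass are deliberate simplifications:

c = 3.0e8            # speed of light, m/s
m_ship = 1.0e5       # 100 metric tons in kg
v = 0.1 * c          # cruise velocity

ke_joules = 0.5 * m_ship * v**2     # non-relativistic kinetic energy of the ship alone
ke_erg = ke_joules * 1.0e7          # 1 joule = 10^7 erg
print(f"Kinetic energy: {ke_joules:.1e} J, or {ke_erg:.1e} erg")   # ~5e26 erg, of order 10^27

seconds_per_year = 3.156e7
e_production = 200e12 * 10 * seconds_per_year     # 200 TW sustained for ten years
print(f"Energy from 200 TW over ten years: {e_production:.1e} J")

# The gap between the two numbers reflects the (very low) efficiency assumed
# for turning wall-plug power into usable antimatter.
print(f"Production energy / ship kinetic energy: {e_production / ke_joules:.0f}")

Whatever efficiency one assumes for turning wall-plug power into antimatter, the 200 terawatt figure alone is ten times the planet’s entire present power output, which is the heart of Schneider’s objection.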

But the problem gets trickier still. Schneider doesn’t go into antimatter storage, but listen to what Frank Close says about the issue in his book Antimatter (Oxford University Press, 2009):

The propaganda for antimatter power is that to take a manned spaceship to Mars, the three tonnes of chemical propellant could be reduced to less than a hundredth of a gram of antimatter, the mass of a grain of rice….However, the promoters of such hopes say less about the weight of the technology that will be required for containing the antimatter. Large numbers of antiprotons or positrons imply a large concentration of electric charge, which has to be contained. To store even one millionth of the amount needed for a Mars trip would require tons of electric force pushing on the walls of the fuel tank…

And so on. These are major issues. We’ve been able to store about a million anti-protons at once thus far, which seems like a big number, but Close points out that it’s a billion billion times smaller than what you would need for a gram. Be aware, too, that since the discovery of the anti-proton in 1955, the total of our anti-proton production is less than a millionth of a gram. None of this is to rule out future advances in antimatter production or collection (we’ve looked at James Bickford’s ideas on antimatter harvesting from space before in these pages). But you can see why Schneider is a skeptic about antimatter as rocket fuel, at least as of now.
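
Close’s ‘billion billion’ figure is easy to verify with a rough count, assuming (as above) a stored population of about a million antiprotons:

m_proton = 1.67e-27                      # antiproton mass in kg (same as the proton)
antiprotons_per_gram = 1.0e-3 / m_proton # about 6e23
stored = 1.0e6                           # roughly the largest population stored at once so far
print(f"Antiprotons in one gram: {antiprotons_per_gram:.1e}")
print(f"Shortfall factor: {antiprotons_per_gram / stored:.1e}")   # ~6e17, roughly the 'billion billion' Close cites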

Fusion: A Promise Deferred

The fusion argument divides those who see the idea’s bright promise from those who see its frustrating history. Schneider notes how conceptually simple fusion is, but the simple fact is that seventy years after the invention of the basic concept of deuterium/tritium fusion, we still can’t make it work in any stable production facility. He continues:

The ITER fusion facility is not expected to achieve production energy at a demonstration level before 2030, that is, almost a century after the nuclear fusion concept was invented. The author correctly mentions the developments in miniaturization. As an example, he cites the National Ignition Facility (a similar, less advanced project called « Mega Joule Laser » exists in Europe). But this facility, with all its control and cooling systems, is presently quite a non-miniaturized building. In spite of the fact that presently it will only provide impulsive (non continuous) fusion energy, presently at a slow rate of one impulse per hour, one can imagine that in the future these impulses can be accumulated to provide a sufficient acceleration to the spacecraft. But it requires an initial energy of a few mega joules per 1 nanosecond impulse, and in the spacecraft this energy must come from somewhere.
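
Taking the figures in that quote at face value, a few megajoules per nanosecond pulse and one pulse per hour, the gulf between peak and average power is easy to make explicit. The sketch below also estimates, purely for illustration, the pulse rate a craft would need in order to deliver the kinetic energy computed earlier over a hypothetical twenty-year acceleration phase; the 2 MJ pulse energy and the twenty-year figure are assumptions of mine, not Schneider’s numbers:

e_pulse = 2.0e6          # assume 'a few megajoules' means roughly 2 MJ per pulse
t_pulse = 1.0e-9         # 1 nanosecond pulse, as in the quote
pulses_per_hour = 1.0    # the repetition rate cited in the quote

peak_power = e_pulse / t_pulse                    # ~2e15 W: petawatt-scale peaks
avg_power = e_pulse * pulses_per_hour / 3600.0    # only a few hundred watts on average
print(f"Peak power per pulse: {peak_power:.1e} W")
print(f"Average power at one pulse per hour: {avg_power:.0f} W")

# Hypothetical requirement: deliver ~4.5e19 J (100-ton craft reaching 0.1c)
# over a 20-year acceleration phase, ignoring every inefficiency.
ke_needed = 4.5e19
burn_seconds = 20 * 3.156e7
required_avg_power = ke_needed / burn_seconds     # ~7e10 W
print(f"Pulse rate needed: {required_avg_power / e_pulse:.0f} per second")

Going from one pulse an hour in a building-sized facility to tens of thousands of pulses per second aboard a spacecraft is one way of stating the scale of the problem Schneider has in mind.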

Schneider is open to ideas, and notes how presumptuous it is to predict what will happen after a century or more, calling for the debate over these issues to go on. In certain respects, I don’t find his views as different from Crawford’s as they might at first appear. He correctly cites the interstellar dust problem and the danger of debilitating high-speed collisions with particles, but both authors are aware of how much we have to learn about the interstellar medium. As we start accumulating the data we’ll need, we have to take risk evaluation into account.

Calculating the Odds

How? Current missions may launch with somewhere between a 0.1 and 1 percent chance of failure, but we already know that, barring an unforeseen breakthrough, interstellar missions will be incredibly costly, at least in the early going. Reducing the risk is thus mandatory (Schneider would like to see it go to something like one in several thousand), and doing that increases the cost. I don’t imagine Crawford would argue with this, though the two disagree on timelines, with Crawford more optimistic about the near term and Schneider arguing that centuries will likely pass before we can speak about a true interstellar probe. Referring to the Technology Readiness Level classification, a key part of risk evaluation, he has this to say:

For interstellar travel, we are at best at level 1 (or even 0.5), while a « Flight proven » mission will realistically require first a precursor mission to secure the technological concept, including shielding mechanisms, at say 500 to 1000 Astronomical Units. As a comparison, I can take the nulling interferometry concept for the infrared detection of exo-Earths. It was invented in the late 70s (Bracewell 1978) and is still not foreseen for a launch by ESA and NASA before 2030, that is, 50 years after the invention of the concept for a mission at least 100 times easier and cheaper than interstellar travel.

The Millennial View

Those with a near-term bent will see a profound gulf between these two views, but I think those of us who are looking long-term will find the debate converging on common themes. Neither of these men rules out interstellar flight on scientific grounds, and both are aware of the huge difficulties that must be overcome for it to occur. If you are an optimist over a given technology, you might believe, for example, that fusion will succeed in revolutionizing space propulsion and thus pave the way for an early mission. But if you’re content with the idea that interstellar flight is going to occur, the question of just which century it occurs in carries a bit less weight.

I don’t, then, find this statement off-putting:

To deserve an interstellar travel mission, an exoplanet will require very solid clues of biosignatures (to quote Willy Benz, « Extraordinary claims require exceptional proofs »). I hope that current radial velocity monitorings will discover the existence of habitable planets around Alpha Cen A or B, and that in the coming decades these planets will reveal solid biosignatures. But what if the nearest planet with credible biosignatures lies at 10 pc? Even at a speed of 0.1c, the travel will last 400 years.

True enough, and if biosignatures do turn out to be this rare, we’ll have to re-evaluate our options (and our motives) if we don’t develop the technologies to make the journey faster than 0.1c. But the issue isn’t whether we can cross ten parsecs. It’s whether interstellar flight as an idea has any merit, and it remains to be seen what mission concepts might one day develop for much closer stars. We’re in a necessary period of evaluation, one in which new solar systems are being opened up on a regular basis, and what we find in them will doubtless affect the motivations for a mission.

I’m with Crawford in noting the large number of propulsion options being investigated in the literature, but like Schneider, I wouldn’t want to single out a particular one as the most likely to succeed (although I admit to a continuing interest in beamed lightsail concepts). It’s too early for that, and the excitement of this era in our investigations is precisely that we’re at the entry level, scrambling for ideas, weighing technologies, speculating about future breakthroughs like nanotech that could revolutionize the game. Let the debate continue indeed, and let’s hope we can keep it at the lively level these two scientists have established through their dialogue.

The paper is Schneider, "Reply to 'A Comment on "The Far Future of Exoplanet Direct Characterization" – the Case for Interstellar Space Probes'," accepted at Astrobiology (preprint).

Interstellar Flight: The Case for a Probe

Back in May I looked at Jean Schneider’s thoughts on what we might do if we discovered a planet in the habitable zone of a nearby star. In an article for Astrobiology called “The Far Future of Exoplanet Direct Characterization,” Schneider (Paris Observatory) reviewed technologies for getting a direct image of an Earth-like planet and went on to discuss how hard it would be to get actual instrumentation into another solar system. His thoughts resonate given recent findings about Gliese 581g (although the latest data from the HARPS spectrograph evidently show no sign of the planet, a startling development as we investigate this intriguing system).

Whether or not Gl 581g exists and is where we think it is, Schneider’s pessimism about getting an actual payload into another solar system has attracted the attention of Ian Crawford (University of London), who is quick to point out that astronomical remote-sensing will ultimately be inadequate, especially for biological follow-up studies of initial biomarker detections. As we have done with nearby objects like Mars, we will eventually need to send instruments into these systems to study everything from basic biochemistry to evolutionary history.

Slower Speeds, Improved Technologies

But are such missions possible? Schneider and colleagues assumed 0.3c as a typical velocity for interstellar missions and went on to discuss the huge difficulties in accelerating a payload to such values. Crawford is skeptical, seeing 0.3c as an arbitrary over-estimate of the speed required, and he reminds us that the better developed proposals in the literature tend to focus on values around 0.1c. As he notes, “0.3c will be an order of magnitude more difficult owing [to] the scaling of kinetic energy with the square of the velocity.”
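
The quadratic scaling Crawford invokes is easy to check. In the sketch below, the one-ton payload is an arbitrary illustrative figure, not a value from either paper:

import math

def kinetic_energy(mass_kg, beta):
    """Relativistic kinetic energy, (gamma - 1) * m * c^2."""
    c = 3.0e8
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    return (gamma - 1.0) * mass_kg * c**2

m = 1000.0   # a hypothetical one-ton payload
ke_01 = kinetic_energy(m, 0.1)
ke_03 = kinetic_energy(m, 0.3)
print(f"KE at 0.1c: {ke_01:.2e} J")
print(f"KE at 0.3c: {ke_03:.2e} J")
print(f"Ratio: {ke_03 / ke_01:.1f}")   # ~9.5, close to the Newtonian (0.3/0.1)^2 = 9

The relativistic result is close to the Newtonian factor of nine, which is why Crawford calls 0.3c roughly an order of magnitude harder.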

The original Daedalus study remains the most detailed engineering assessment yet available; it described a fusion-based craft that would require fully 50,000 tons of nuclear fuel to attain 12 percent of lightspeed. Daedalus, in other words, is well beyond our capabilities. But Crawford’s point is to remind us that we’re learning more as we go, and that premature pessimism may overlook useful technological advances. On fusion, then:

…technical advances in a number of fields have occurred which may make fusion-powered vehicles of the Daedalus-type more practical as long-term solutions to the problem of interstellar travel. These include developments in miniaturization and nanotechnology, which would ensure that a much less massive payload would be required than was then assumed, and developments in inertial confinement fusion research for power generation on Earth. Indeed, the National Ignition Facility recently commissioned at the Lawrence Livermore National Laboratory in California (https://lasers.llnl.gov) is, albeit unintentionally, building up technical competencies directly relevant to the development of fusion-based space propulsion systems. For these reasons, there is a strong case for a reassessment of the Daedalus concept in light of updated scientific and technical knowledge, and at least one such study is currently underway…

Surviving Interstellar Dust

That study, of course, is Project Icarus, in which Crawford is an active player. But any vehicle, whether fusion based or using more exotic concepts like antimatter or laser-pushed lightsails, runs into the interstellar dust problem. Drop the speed from 0.3c to 0.1c and the issue is partially mitigated. Dust was a showstopper for Schneider, but Crawford notes that, assuming an interstellar dust density of 6.2×10⁻²⁴ kg m⁻³, erosion at 0.1c would remove on the order of 5 kg m⁻² of shielding material over a six light year flight. So while shielding material like beryllium certainly adds to the mass of the probe, it does not rule out the mission.
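
The raw inputs behind an estimate like this can be sketched quickly. In the code below, the dust density and flight distance come from the text, but the beryllium vaporization energy is a rough assumption of mine, so treat the result as an order-of-magnitude floor rather than a reproduction of Crawford’s calculation:

c = 3.0e8
v = 0.1 * c
rho_dust = 6.2e-24                  # interstellar dust density, kg/m^3 (from the text)
distance = 6 * 9.46e15              # six light years in meters

column_mass = rho_dust * distance   # dust swept up per square meter of frontal area
specific_ke = 0.5 * v**2            # kinetic energy per kg of dust at 0.1c, ~4.5e13 J/kg
energy_per_m2 = column_mass * specific_ke
print(f"Dust column mass: {column_mass:.1e} kg/m^2")         # ~3.5e-7
print(f"Impact energy delivered: {energy_per_m2:.1e} J/m^2")  # ~1.6e7

# Assume very roughly 3e7 J/kg to heat and vaporize beryllium (my assumption),
# giving a strict thermodynamic floor on the shield mass that could be removed.
erosion_floor = energy_per_m2 / 3.0e7
print(f"Minimum shield mass vaporized: {erosion_floor:.2f} kg/m^2")

Real erosion is messier than simple vaporization; cratering and spallation typically eject more material than this floor implies, which is presumably how figures of a few kilograms per square meter arise.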

One of the problems with this discussion is that we know so little about the upper bounds of the size distribution of interstellar dust particles. Crawford sifts through the literature on the subject, discussing the effect of impacts from 100-μm grains, which could strike as often as twice per square meter over the course of a six light year flight. Collecting the needed data on the interstellar medium will be a priority before we can seriously think about launching such a probe.

And if a spacecraft were large enough, Crawford adds, various methods for sensing potential danger could be employed, using radar, for example, to detect incoming grains and laser or electromagnetic methods to destroy or deflect them before impact. The Daedalus study examined the idea of using a fine cloud of small dust particles ejected from the vehicle that would destroy any incoming large grains before they reached the primary vehicle. The latter approach was conceived for a probe entering a denser interplanetary environment at destination, but there is no reason such measures couldn’t be deployed throughout interstellar cruise.

An Astrobiologically Driven Probe

In his conviction that interstellar flight is difficult but not impossible, Crawford echoes Robert Forward:

Journeys to the nearer stars with travel-times of decades (necessitating velocities of the order of ten percent of the speed of light) will be a considerable technological (as well as economic and political) undertaking. The magnitude of the difficulties should not be underestimated, but neither should they be exaggerated. There is a large technical literature…, which demonstrates that rapid interstellar space travel is not physically impossible and is a legitimate technological goal for the centuries ahead. Ultimately, the development of this capability may be the only way to follow-up any detections of biosignatures that may be made in the atmospheres of Earth-like planets orbiting nearby stars in the coming decades…

A primary driver for interstellar flight is likely to be astrobiology. As we develop a mature space exploration infrastructure in our own Solar System to explore life’s possibilities from Mars to Enceladus to the Kuiper Belt, we will also be creating the necessary technologies that will take us further out. Astronomical observations can only tell us so much, leaving us with the need to get an orbiter or a lander to places like Titan or Europa, and giving us the longer term model of direct probes returning data from astrobiologically interesting planets around other stars.

The paper is Crawford, “A Comment on ‘The Far Future of Exoplanet Direct Characterization’ — the Case for Interstellar Space Probes.” Accepted by Astrobiology (preprint). Jean Schneider has responded to Crawford’s comments and I’ll look at what he has to say tomorrow.

A Look Into Titan’s Haze

Voyager’s controllers thought so much of Titan that when Voyager 1 approached Saturn and the choice arose between sending it on to the outer planets or taking a sharp jog off the ecliptic to view the enigmatic moon, they chose the latter. We all know the result: Titan remained as mysterious as ever, its surface shrouded in orange haze. But you can see why they needed that look. Here was a moon that was large enough to be a planet, with a thick atmosphere and all kinds of speculation about what was on its surface. No wonder Titan was the most tempting of targets, and one of huge scientific value.

These days we routinely get Cassini imagery from Titan flybys, and the place has gained definition. I remember once asking Geoffrey Landis, having read his superb 2000 novel Mars Crossing, whether Mars had become more or less an everyday place to him, like Cleveland (he lives there, working at NASA GRC). And it was true: After you study and survey and write novels about a place, it does become tangible, acquiring familiar placenames and recognizable surface features. Titan is becoming like that, with its lakes of liquid methane/ethane, its well-studied seasonal change, its steadily building catalog of placenames.

Image: Voyager 1 couldn’t see through Titan’s haze, but after numerous Cassini passes and simulations here on Earth, we’re learning much more about this world’s atmosphere and its implications. Credit: NASA/JPL/Caltech.

And now we’re gaining insights about the chemical processes at work on the distant moon. It turns out that, as we’ve long suspected, the basic ingredients for life are available here, and what’s more, they can form in the outer layers of the atmosphere with no need for a surface. As revealed in simulations carried out in France by a team led by researchers from the University of Arizona, a wide variety of complex organic molecules, including amino acids and nucleotide bases, can be produced in a simulated Titan atmosphere. Sarah Hörst, who works at the university’s Lunar and Planetary Lab, helped to lead the research effort:

“Our team is the first to be able to do this in an atmosphere without liquid water,” says Hörst. “Our results show that it is possible to make very complex molecules in the outer parts of an atmosphere.”

The five nucleotide bases life uses on Earth to make the genetic materials DNA and RNA – cytosine, adenine, thymine, guanine and uracil – all turn up in the simulations, and so do the amino acids glycine and alanine. The work was presented at the Division of Planetary Sciences meeting in Pasadena that has produced so much interesting outer system work this year, including the results on Eris and the work on Neptune’s interactions with the Kuiper Belt that we looked at on Friday. Titan’s haze, we now learn, may be a reservoir of prebiotic molecules, and we have the interesting possibility that such precursors may have fallen to Earth out of its own primordial haze as life began to take hold here.

That smoggy orange ball Voyager 1 saw is filled with aerosols laden with organic molecules. Here things get even more interesting, because while Titan’s surface probably lacks the energy needed to break apart atmospheric nitrogen, methane and carbon monoxide and rearrange them into more complex prebiotic molecules, the upper reaches of the atmosphere are exposed to ultraviolet radiation, not to mention charged particles from the Sun that have been deflected by Saturn’s magnetic field. The researchers mixed gases found in Titan’s atmosphere in the lab in Grenoble and simulated the energy hitting the upper atmosphere by using microwaves, causing some of the raw materials to bond together.

Image: A window into Titan’s atmosphere: Energized by microwaves, the gas mix inside the reaction chamber lights up like a pink neon sign. Thousands of complex organic molecules accumulated on the bottom of the chamber during this experiment. (Photo: S. Hörst)

The results were striking: some 5,000 different molecular formulae turned up in these simulations, as analyzed by a mass spectrometer. Says Hörst:

“We really have no idea how many molecules are in these samples other than it’s a lot. Assuming there are at least three or four structural variations of each, we are talking up to 20,000 molecules that could be in there. So in some way, we are not surprised that we made the nucleotide bases and the amino acids… We never started out saying, ‘we want to make these things,’ it was more like ‘hey, let’s see if they’re there.’ You have all those little pieces flying around in the plasma, and so we would expect them to form all sorts of things.”

Is the same kind of chemistry at work in the atmospheres of planets around other stars? The Titan simulations give the idea continued plausibility, another indication that life may be commonplace in the galaxy or, at least, that the conditions for its formation should not be rare. We’ve come a long way from the iconic Chesley Bonestell images of Titan to today’s hazy landscapes sculpted by liquid ethane/methane and laden with a dense prebiotic soup of an atmosphere. It’s a fascinating world, as we’ve long suspected, and it’s giving us clues about the chemistry that produces biological material and eventually leads to life.
