Enceladus Hotspots, and Memories of Orion

Although we’ve been preoccupied largely with theoretical matters this week, I don’t want it to close without reference to the new Cassini imagery of Enceladus. This shot was made at a phase angle of 145 degrees when Cassini was about 14,000 kilometers from Enceladus, during the flyby of November 21. The remarkable jets spraying from the fractured surface in the south polar region are clearly visible.

Image: Dramatic plumes, both large and small, spray water ice out from many locations along the famed “tiger stripes” near the south pole of Saturn’s moon Enceladus. The tiger stripes are fissures that spray icy particles, water vapor and organic compounds. More than 30 individual jets of different sizes can be seen in this image and more than 20 of them had not been identified before. At least one jet spouting prominently in previous images now appears less powerful. Credit: NASA/JPL/SSI.

I keep thinking about Project Orion, back in the crazy days before the Test Ban Treaty of 1963 closed down the nuclear option. The prospect of taking a huge vessel with a crew of 100 all the way to Saturn as early as 1968 was much in the air at Los Alamos, and Enceladus was to have been the ultimate destination, chosen because observations of the distant moon seemed to show plenty of ice on the surface. So gleefully did the Orion team ponder its propulsive capabilities that project leader Ted Taylor wanted to install a two-ton barber’s chair on the ship, a poke in the eye to chemical rocketry and all its limitations. The atomic spaceship was going to be big.

As to Enceladus, Dyson recalled:

“We knew very little about the satellites in those days. Enceladus looked particularly good. It was known to have a density of .618, so it clearly had to be made of ice plus hydrocarbons, really light things, which were what you need both for biology and for propellant, so you could imagine growing your vegetables there…”

The quote is from George Dyson’s Project Orion: The True Story of the Atomic Spaceship (New York: Henry Holt, 2002), and it always makes me wonder what Dyson and crew would have thought of Enceladus, with its extraordinary sprays of fine material showing geological activity and the possibility of liquid water, when they actually arrived. The new Cassini imagery shows us more jets than ever before, more than thirty individual geysers in one mosaic, with changes to previously seen jets that are telling:

“This last flyby confirms what we suspected,” said Carolyn Porco, imaging team lead based at the Space Science Institute in Boulder, Colo. “The vigor of individual jets can vary with time, and many jets, large and small, erupt all along the tiger stripes.”

The section of Baghdad Sulcus shown below combines heat data with visible-light images for a 40-kilometer stretch of this, the longest of the so-called ‘tiger stripes.’ Peak temperatures along Baghdad Sulcus reach 180 kelvin, perhaps as high as 200 kelvin, which the Cassini team believes is the result of heating by upwelling water vapor. This is an intense effect — the heat is confined to a narrow region about a kilometer wide along the fracture, and its strength varies along the length of the fissure.

Image: A mosaic combining high-resolution data from the imaging science subsystem and composite infrared spectrometer aboard NASA’s Cassini spacecraft. Pockets of heat appear along one of the mysterious fractures in the south polar region of Saturn’s moon Enceladus. The fracture, named Baghdad Sulcus, is one of the so-called “tiger stripe” features that erupt with jets of water vapor and ice particles. It runs diagonally across the image. This mosaic, obtained on Nov. 21, 2009, shows a 40-kilometer (25-mile) segment of Baghdad Sulcus and illustrates the correlation between the geologically youthful surface fractures and anomalously warm temperatures recorded in the south polar region. It shows the highest-resolution data yet of the heat leaking from the moon’s interior along the tiger stripes.

This JPL news release goes into more detail:

While the heat appears to emanate mostly from the main Baghdad tiger stripe, some of the fractures branching off or parallel to it also appear warmer and active to varying degrees, though this needs to be confirmed by further analysis. The total amount of infrared energy and the relative amounts given off at different wavelengths show that the highest temperatures along Baghdad Sulcus are limited to a region no more than tens of meters (yards) across. Most of the heat measured by the infrared spectrometer probably arises from the warm flanks of the active fractures, rather than their central fissures. The narrow central fissure is probably even warmer than the 180 Kelvin (minus 140 degrees Fahrenheit) detected – possibly warm enough for liquid water in the fractures to be the source of the observed jets.

Carolyn Porco refers to Enceladus’ “organic-rich, liquid sub-surface environment” as “the most accessible extraterrestrial watery zone known in the Solar System.” The temperature differential between places like Baghdad Sulcus and the 50 kelvin reading of the surrounding surface is fascinating, and tells us that melting underground ice in these regions may not be all that difficult. Too bad we missed out on Orion’s 1968 journey, but the eight Cassini flybys thus far are telling us much about this unexpectedly interesting moon, surely the target of a future mission of its own.


Other Life in the Multiverse?

What conditions would you say are ‘congenial to life’? For physicist Robert Jaffe and colleagues at MIT, the phrase refers to places where stable forms of hydrogen, carbon and oxygen can exist. Jaffe explains why:

“If you don’t have a stable entity with the chemistry of hydrogen, you’re not going to have hydrocarbons, or complex carbohydrates, and you’re not going to have life. The same goes for carbon and oxygen. Beyond those three we felt the rest is detail.”

It’s an important issue in Jaffe’s work because he wants to see whether other universes could harbor life. We know that slight changes to the laws of physics would disrupt the evolution of the universe we live in. Had the strong nuclear force, for example, been just a bit stronger or weaker, stars would have produced few of the elements needed to build planets. Remove the electromagnetic force and light would not exist, nor would atoms and chemical bonds.

Nudging Nature’s Parameters

Run through the constants of nature and you’ll find many that have to show precise values for life as we know it to have formed. Thus the idea that there may be not one but many universes, each with its own laws, and the thought that we happen to occupy a universe where the conditions that make life a possibility managed to fall into place.

Anthropic reasoning like this — things have to be this way because otherwise we couldn’t be here to think about all this — suggests that multitudes of universes exist, a multiverse in which almost all the universes would be devoid of life and, indeed, matter as we know it. Jaffe is interested in finding out whether universes with different physical laws might not be so inhospitable to life after all. His team focused on universes with nuclear and electromagnetic forces that allow atoms to exist. Another stipulation: Universes that allowed stable forms of hydrogen, carbon and oxygen.

Then it became a matter of playing with nature’s building blocks. Take quarks: In our universe, the ‘down’ quark is roughly twice as heavy as the ‘up’ quark, so that neutrons are 0.1 percent heavier than protons. Jaffe’s team lightened the down quark so that protons were up to one percent heavier than neutrons. According to this modeling, hydrogen would no longer be stable, but the heavier isotopes deuterium and tritium would be. Carbon-14 could exist and so would a form of oxygen. It’s a different universe from ours, but the models say life could emerge in it.
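The splittings quoted above are easy to sanity-check. A minimal sketch, using standard proton and neutron masses (and noting that the real neutron-proton difference also has an electromagnetic contribution, so this is illustration rather than the full physics):

```python
# Back-of-envelope check on the mass splittings quoted above.
# Masses in MeV/c^2 (standard PDG values).
M_PROTON = 938.272
M_NEUTRON = 939.565

split = (M_NEUTRON - M_PROTON) / M_PROTON * 100
print(f"neutron heavier than proton by {split:.2f}%")  # ~0.14%, i.e. roughly 0.1 percent

# In the altered universe, the proton ends up as much as 1 percent
# heavier than the neutron, so the neutron would sit near:
m_neutron_alt = M_PROTON / 1.01
print(f"neutron mass in the altered universe: ~{m_neutron_alt:.0f} MeV/c^2")
```

The point of the exercise is how small the lever is: shifting nuclear masses by a fraction of a percent is enough to flip which particles are stable.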

Other quark variations, including one where the ‘up’ and ‘strange’ quarks have roughly the same mass, unlike in our universe, produced atomic nuclei made up of neutrons and a hyperon called the ‘sigma minus,’ which would replace protons. The fact that we have a reasonable understanding about quark interactions makes them useful for studies of this kind, but changing other physical laws is even trickier business.

Into a ‘Weakless’ Universe

Nonetheless, Lawrence Berkeley National Laboratory researchers have modeled universes that lack one of the four fundamental forces of ours. Without the weak force, big bang nucleosynthesis — turning groups of four protons into helium 4 nuclei of two protons and two neutrons — would not have been possible. But when the team at LBNL removed the weak nuclear force in their models, they were able to tweak the other three forces to compensate. Stable elements could form in this universe as well.

Note what’s happening here. Rather than changing a single constant, the LBNL researchers tweaked several. After all, in a multiverse that can keep spewing out universe after universe, all combinations would seem to be possible and you can keep trying until you get it right. This Scientific American article by Alejandro Jenkins (MIT) and Gilad Perez (now at the Weizmann Institute) gets into the specifics:

In the weakless universe, the usual fusing of protons to form helium would be impossible, because it requires that two of the protons convert into neutrons. But other pathways could exist for the creation of the elements. For example, our universe contains overwhelmingly more matter than antimatter, but a small adjustment to the parameter that controls this asymmetry is enough to ensure that the big bang nucleosynthesis would leave behind a substantial amount of deuterium nuclei. Deuterium, also known as hydrogen 2, is the isotope of hydrogen whose nucleus contains a neutron in addition to the usual proton. Stars could then shine by fusing a proton and a deuterium nucleus to make a helium 3 (two protons and one neutron) nucleus.

But would these stars be anything like what we are familiar with? The article continues:

Such weakless stars would be colder and smaller than the stars in our own universe. According to computer simulations by astrophysicist Adam Burrows of Princeton University, they could burn for about seven billion years—about the current age of our sun—and radiate energy at a rate that would be a few percent of that of the sun.

A Strange But Living Universe

A strange place, this ‘weakless’ universe. Supernova explosions of the kind that synthesize and distribute heavy elements in our universe would not occur, at least not from the same causes, but a different kind of supernova caused by accretion rather than gravitational collapse would be possible, allowing elements to seed interstellar space. A planet like ours circling one of the weakless stars would need to be six times closer to the Sun to stay habitable. And check this out:

Weakless Earths would be significantly different from our own Earth in other ways. In our world, plate tectonics and volcanic activity are powered by the radioactive decay of uranium and thorium deep within Earth. Without these heavy elements, a typical weakless Earth might have a comparatively boring and featureless geology—except if gravitational processes provided an alternative source of heating, as happens on some moons of Saturn and Jupiter.

Chemistry, on the other hand, would be very similar to that of our world. One difference would be that the periodic table would stop at iron, except for extremely small traces of other elements. But this limitation should not prevent life-forms similar to the ones we know from evolving. Thus, even a universe with just three fundamental forces could be congenial to life.

Accounting for the Cosmological Constant

Still tantalizing is the cosmological constant, a measure of the amount of energy found in empty space. The discovery of the continuing acceleration of the universe’s expansion has brought ‘dark energy’ into the picture, implying a cosmological constant that is positive as well as minute, allowing the universe to form structure. It’s a constant that seems fine-tuned to a remarkable degree, and as the article notes, “…the methods our teams have applied to the weak nuclear force and to the masses of quarks seem to fail in this case, because it seems impossible to find congenial universes in which the cosmological constant is substantially larger than the value we observe. Within a multiverse, the vast majority of universes could have cosmological constants incompatible with the formation of any structure.”

All of this is almost joyously theoretical, basing itself on a theory of inflation that conceives of small pockets of spacetime that inflate so rapidly that it is impossible to travel between them. Inflation is highly regarded though not definitively understood, and different values for the constants of nature in the universes it produces seem like a reasonable conjecture. And the cosmological constant itself is an example of fine-tuning on such a scale that it may require the existence of a multiverse to give us a rational explanation for how we lucked into this one.


Millis: Approaches to Interstellar Flight

How do you go about pushing the frontiers of propulsion science? Tau Zero Foundation founder Marc Millis discussed the question in a just-published interview with h+ Magazine. One aspect of the question is recognizing where we are today. Millis is on record as saying that it may be two to four centuries before we’re ready to launch an Alpha Centauri mission. Why the delay? The problem is not so much high-tech savvy as available energy, and Millis evaluates it by comparing the energy we devote to rocketry today against the entire Earth’s energy consumption.

The question is how much energy we produce and how much we consume, and what percentage of that is devoted to spaceflight. You can see and hear Millis discussing his calculations on the matter in a presentation he made at the TEDx Brussels 2009 session, one that is linked to from the interview. Obviously, the time to the Centauri stars decreases if we decide to put ten times more energy into the space program than we have historically done. Will we make such a choice?
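To see why energy dominates the argument, consider a back-of-envelope sketch in the spirit of Millis’ calculation. The probe mass, cruise speed and world energy figure below are my own illustrative assumptions, not Millis’ numbers:

```python
# Rough energy bookkeeping for an interstellar probe.
# All figures are illustrative assumptions for scale.
C = 2.998e8                    # speed of light, m/s
WORLD_ENERGY_PER_YEAR = 5e20   # J, rough global primary energy use (assumption)

probe_mass = 1000.0            # kg -- a small unmanned probe (assumption)
cruise_speed = 0.1 * C         # 10% of lightspeed, a ~44-year trip to Alpha Centauri

# Classical kinetic energy is a fair lower bound at 0.1c (the
# relativistic correction is under one percent). This ignores all
# propulsion inefficiencies, which in practice dominate.
ke = 0.5 * probe_mass * cruise_speed**2
print(f"kinetic energy at cruise: {ke:.1e} J")
print(f"fraction of annual world energy use: {ke / WORLD_ENERGY_PER_YEAR:.4f}")
```

Even before any propulsion losses, a one-tonne probe at a tenth of lightspeed carries kinetic energy approaching a tenth of a percent of humanity’s annual output, which is why the available-energy curve, not cleverness, sets the timetable.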

While we’re working such issues out, Millis advocates backing off the idea of choosing a single best approach for interstellar flight. We’re a long way from actually flying such a mission, and rather than attempting to choose a single course, we do better by researching the entire range of possibilities:

Relative to the technology, as a culture we’re so used to thinking how we can get “there” the quickest, or what’s the best single approach. When it comes to interstellar flight and learning to live beyond Earth, this thinking sidetracks us because we’re so far from fruition in our understanding of interstellar space options, that there’s no way for us to pick “the” one way. Instead, there are many different options and unknowns. We stand to gain a lot more from the attempt to understand them – chipping away at them rather than not doing anything at all. By researching the spectrum of possibilities, we’re likely to be better off in the near term.

A research plan that looks laterally, the way a mountain climber evaluates the best path up? We haven’t explicitly tried that approach in interstellar studies, but Millis backs it:

I really want to change the paradigm of how we look at interstellar flight. It’s not just a matter of trying to get there quickly or to find “the best approach,” rather it’s finding the smartest things we can do today that set the stage for a more productive future. At the Tau Zero Foundation, we cover simple solar sails to the seemingly impossible faster-than-light. Rather than trying to identify the best approach, we’re trying to identify the next steps that students can work on to chip away at where their own personal interests lie.

Most of the h+ interview is spent on current issues, such as the cancellation of the Constellation program and the most realistic way to get to Mars, but those with a yen for breakthrough ideas will enjoy Millis’ thoughts on faster than light travel and the time paradoxes it might introduce. Does quantum entanglement show instantaneous connections between particles or are there other explanations, and are there faster than light implications in all this? Read the interview for more, and bear in mind that the book Millis edited with Eric Davis, Frontiers of Propulsion Science, gets into such questions with a vengeance. Re quantum entanglement and its implications, even Millis calls that the hardest chapter in the book, a statement with which most scientists would agree.


Missions Cometary and Otherwise

The Stardust spacecraft recently completed a course adjustment maneuver as it continues on its way to comet Tempel 1. The burn began at 2221 UTC on February 17 and lasted 22 minutes and 53 seconds. The net result: a change in the spacecraft’s speed of 24 meters per second. That may not sound like much, but it has big ramifications for this interesting mission. Tempel 1 is a rotating object, and mission scientists want to have a look at places previously imaged by the Deep Impact mission of 2005 (and yes, this is the first time we’ve revisited a comet). Shifting Stardust’s encounter time by eight hours and 20 minutes should maximize the chances of seeing the right surface features on the 2.99-kilometer-wide potato-shaped mass.
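The quoted burn figures imply a remarkably gentle push, as a quick check shows:

```python
# Sanity check on the trajectory correction figures quoted above:
# a 24 m/s speed change spread over 22 minutes 53 seconds.
delta_v = 24.0               # m/s, from the mission update
burn_time = 22 * 60 + 53     # seconds -> 1373 s

accel = delta_v / burn_time
print(f"average acceleration: {accel * 1000:.1f} mm/s^2")  # ~17.5 mm/s^2
```

Under two hundredths of a meter per second squared, yet sustained for nearly 23 minutes it is enough to move the encounter by more than eight hours a year later.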

The burn took place almost exactly a year before the spacecraft will reach Tempel 1. You may remember Stardust as the first spacecraft to collect cometary samples and return them to Earth for study. After the sample return capsule was retrieved in 2006, controllers changed the mission to Stardust-NExT (New Exploration of Tempel) and a four and a half year journey to Tempel 1 began. The spacecraft, meanwhile, is holding up quite well, according to project manager Tim Larson (JPL):

“We could not have asked for a better result from a burn with even a brand-new spacecraft. This bird has already logged one comet flyby, one Earth return of the first samples ever collected from deep space, over 4,000 days of flight and approximately 5.4 billion kilometers (3.4 billion miles) since launch.”

Stardust-NExT should be productive not only in providing high-resolution images of the comet’s surface, but in showing us any surface changes that have occurred between successive visits to the inner Solar System. Tempel 1 orbits the Sun every 5.5 years and was photographed spectacularly when the Deep Impact mission’s impactor blasted a crater roughly 200 meters across and 30 to 50 meters deep into the object. That crater could not be imaged well by Deep Impact because of obscuring dust. Now we’ll see how that site looks today and learn more about how Jupiter-family comets like Tempel 1 evolve.

Image: Comet Tempel 1 as seen by NASA’s Deep Impact mission. Credit: NASA.

Meanwhile, the European Space Agency has whittled down an extensive list of mission proposals to three survivors, which will now enter the definition phase, the step before any final decision is reached on implementation. Both Euclid, which will map galactic distributions to ferret out the underlying structure of dark matter and dark energy, and PLATO (PLAnetary Transits and Oscillations of stars) are of particular interest, with PLATO homing in on the frequency of planets around other stars, including terrestrial worlds. Like CoRoT, the transit-hunting PLATO would also be an asteroseismological observatory, probing stellar interiors.

An assessment study prepared by EADS-Astrium on PLATO notes this distinction between it and earlier missions:

Both CoRoT and Kepler missions feature limitations in terms of minimum planet size, maximum orbital period, number of detected exoplanets and capability of further characterization of exoplanets and their host stars. PLATO will offer order of magnitude improvement of the science with respect to CoRoT and Kepler missions, filling the need for a further generation mission, observing more stars with increased magnitude and observing significantly smaller exoplanets, with significantly longer orbital periods.

Astrium and Thales Alenia Space ran parallel assessment studies on PLATO which were presented in December of last year. Both make for interesting reading. The Astrium study is linked above and the Thales study is here.

The third mission is Solar Orbiter, which would approach the Sun to within 62 solar radii. Fifty-two proposals were narrowed to six in 2008 and have now been pared down again, a reminder of how difficult it is to get a mission studied, approved, built and launched. Two of the three missions are likely to be chosen for launch slots, but all three face budgetary challenges as they go through the definition phase, which will extend to mid-2011.


Relativistic Rockets, Antimatter and More

Interstellar theorist Richard Obousy (Baylor University) has some thoughts about William and Arthur Edelstein’s ideas on flight near the speed of light. As discussed in these pages on Friday, the Edelsteins, in a presentation delivered at the American Physical Society, had argued that a relativistic rocket would encounter interstellar hydrogen in such compressed form that its crew would be exposed to huge radiation doses, up to 10,000 sieverts in the first second. Because even a 10-centimeter layer of aluminum shielding would stop only a tiny fraction of this energy, the Edelsteins concluded that travel near lightspeed would be all but impossible.
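The scale of the hazard is easy to appreciate with an order-of-magnitude sketch. The cruise speed and hydrogen density below are my own illustrative assumptions, not the Edelsteins’ figures, and the result is an energy flux, not a dose:

```python
import math

# Kinetic energy carried by interstellar hydrogen sweeping over
# the bow of a relativistic ship (illustrative assumptions only).
C = 2.998e8       # speed of light, m/s
M_P = 1.673e-27   # proton mass, kg
N_H = 1.0e6       # ~1 hydrogen atom per cm^3 -> atoms per m^3 (assumption)

v = 0.99 * C      # cruise at 99% of lightspeed (assumption)
gamma = 1 / math.sqrt(1 - (v / C) ** 2)

ke_per_proton = (gamma - 1) * M_P * C**2   # J per incoming proton
flux = N_H * v                             # protons per m^2 per second
power_per_m2 = ke_per_proton * flux        # W/m^2 on the forward surface

print(f"gamma at 0.99c: {gamma:.1f}")
print(f"energy flux on the bow: {power_per_m2:.1e} W/m^2")
```

Hundreds of kilowatts per square meter of penetrating particle radiation, before shielding, gives a feel for why the Edelsteins regard the problem as severe; converting that into a crew dose is where their much larger numbers come from.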

Obousy, who handles the Project Icarus Web site, has his own credentials related to high speed travel, authoring a number of papers like the recent “Casimir energy and the possibility of higher dimensional manipulation” (abstract) that press for continued work into breakthrough propulsion. And when he talked to astrophysicist Ian O’Neill about warp drive concepts last week, Obousy said that we are in the process of laying down “…a mathematical and physical framework for how such a device might function, given the convenient caveat of a ‘sufficiently advanced technology.'” The device, he said, is purely theoretical as of now and we have no evidence that it could be built.

But should we keep investigating? On that score, Obousy has no doubts. With regard to shielding, he argues that metamaterials that bend radiation around objects are a place to begin, offering a conceivable barrier against the kind of radiation the Edelsteins are talking about. All of which makes for lively reading, as does Obousy’s continuing work on the Project Icarus team. Icarus is the descendant of Project Daedalus, the 1970s-era starship design created by the British Interplanetary Society. And while the Icarus guidelines focus on fusion as the propulsion method of choice, Obousy’s interests extend not just to warp drive but also to antimatter possibilities.

The latter is of interest because of its huge energy density, drawing on the abundant energy available within all matter. A single kilogram of matter contains 9 × 10^16 J of energy. “[I]n simpler terms,” says Obousy, “about five tonnes of antimatter would theoretically be enough to fuel all the world’s energy consumption for a single year.” But as he notes in this entry on the Project Icarus site, storage is a huge problem. Positively charged positrons exert a Coulombic force of repulsion against each other, one of the reasons we can store only tiny amounts with current technology.
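Both figures follow directly from E = mc². A quick check, taking a rough value for annual world energy use (my assumption, for scale):

```python
# Checking Obousy's figures. Annihilating antimatter with an equal
# mass of ordinary matter releases 2*m*c^2.
C = 2.998e8
WORLD_ENERGY_PER_YEAR = 5e20   # J, rough global figure (assumption)

e_per_kg = C**2                # J released per kg of matter, E = mc^2
print(f"energy per kg of matter: {e_per_kg:.1e} J")  # ~9.0e16 J, matching the text

antimatter = 5000.0            # kg -- Obousy's "five tonnes"
total = 2 * antimatter * e_per_kg   # factor 2: the matter it annihilates counts too
print(f"five tonnes of antimatter annihilated: {total:.1e} J")
print(f"years of world energy use: {total / WORLD_ENERGY_PER_YEAR:.1f}")
```

The result lands within a factor of two of a year’s global consumption, which is as close as a back-of-envelope figure like this can be expected to get.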

Ideal storage involves neutral antimatter — antimatter with no net charge — which points to antihydrogen (a stable atom containing a single positron and an antiproton) as a solution. Storing antihydrogen in the form of a Bose Einstein Condensate is one possibility for packing more of the stuff in less space.

As to the vast cost of antimatter production, Obousy has this to say:

With regards to the question of production, current methods utilized at CERN are prohibitively expensive and generation of antihydrogen in quantities that would be valuable to spaceflight would cost trillions of dollars. Despite this, it’s important to recognize that CERN is not a dedicated antimatter production facility and that antihydrogen production is a remarkable, yet tertiary goal of the facility. According to recent research, a low-energy antiproton source could be constructed in the USA at a cost of around $500M over a five year period, and would be an important first step for mass production of antimatter. However the overall roadmap for antimatter propulsion would involve timescales closer to 50 years.

If we start talking near-future uses of antimatter, though, tiny quantities could be put to work in projects like Steven Howe’s antimatter sail, which would use antihydrogen to initiate a fission reaction in a small, uranium-coated sail. Howe developed this idea for NASA’s now defunct Institute for Advanced Concepts. The antimatter, which drifts from storage unit to sail, causes fission as it encounters the uranium, producing neutrons and fission fragments that leave the sail at enormous speeds. NASA’s John Cole, who studied the antimatter sail idea while at Marshall Space Flight Center, told me in 2003 that the sail could develop specific power on the order of 2000 kilowatts per kilogram, enough to drastically shorten human missions to the outer planets even if Howe’s estimates are an order of magnitude off.

Or could antimatter be used as a trigger for fusion? Obousy is interested in the prospect:

Although a spacecraft propelled by antimatter may be many decades away, it may be possible to use antimatter in the near future to catalyze nuclear fusion reactions. Only very small quantities would be required and this might provide an alternative method for liberating energy from fusion. Because Icarus must use current, or near technology, it is possible that Icarus will utilize this form of propulsion…

And he adds:

Clearly a multitude of technological hurdles must be overcome before antimatter use becomes routine in space exploration. However, the fundamental theoretical issues have been proved. Antimatter exists, antihydrogen can be created technologically, antihydrogen can be stored. The rest is progress.

For more on the potential uses of antihydrogen in propulsion, see Nieto et al., “Controlled antihydrogen propulsion for NASA’s future in very deep space,” NASA/JPL Workshop on Physics for Planetary Exploration, 2004 (available online).
