Oxygen, Carbon Dioxide on Rhea

Interesting chemistry on the surface of Saturn’s moon Rhea seems a natural conclusion following the announcement of the discovery of oxygen in its evanescent atmosphere. And what a difference from Saturn’s largest moon, Titan, whose atmosphere is not only thick, but packed with nitrogen and methane, with little trace of carbon dioxide or oxygen. Rhea’s tenuous exosphere, which includes carbon dioxide, is so thin that its density of oxygen is about five trillion times lower than that of Earth’s atmosphere. Even so, interesting things may happen on an icy surface in this scenario.

I’m diverted as I write this — pardon the digression, but this does come back to Rhea — because I’m thinking about German friends who are fans of the wonderful 1960s television show Raumpatrouille (Space Patrol), which followed the adventures of the spaceship Orion some years before Star Trek ever appeared on German screens. First broadcast in 1966, Space Patrol went on to achieve true cult status in West Germany and has legions of fans even today, with numerous novels following long after the show’s demise. And a true Space Patrol fan knows that in episode one, Captain McLane takes the Orion down to the icy surface of Rhea, a place Star Trek’s Captain Kirk never ventured.

Image: A scene from Space Patrol. Credit: Bavaria Atelier GmbH.

But enough of 1960s television. The evidence for both oxygen and carbon dioxide on Rhea comes from the continuing work of the Cassini mission. The assessment that accompanies it is that high-energy particles striking the surface dislodge atoms, molecules and ions into the atmosphere. We’re looking, then, at the chemical decomposition of surface water ice as it is irradiated by plasma from Saturn’s magnetosphere, an indication that radiation can create complex chemistry on small, icy worlds like this one. Ben Teolis (Southwest Research Institute), a Cassini team scientist and lead author of the paper on this work, thinks processes like this may be widespread:

“The new results suggest that active, complex chemistry involving oxygen may be quite common throughout the solar system and even our universe,” said Teolis. “Such chemistry could be a prerequisite for life. All evidence from Cassini indicates Rhea is too cold and devoid of the liquid water necessary for life as we know it.”

But translate these results to an icy object with liquid water under the surface. According to Teolis (see this JPL news release), if a mechanism existed to transport oxygen and carbon dioxide from the surface to a sub-surface ocean, complex compounds might flourish and lead to the formation of life. As Saturn keeps insisting to us through its varied moons, the earliest precursors of life may be relatively common. “Rhea is turning out to be much more interesting than we had imagined,” says Cassini project scientist Linda Spilker (JPL). But then, that’s been true of almost all the large moons we’ve looked at around our gas giants.

Image: This fantastic view shows, from left to right, Saturn’s moons Mimas, Dione and Rhea, on the far side of Saturn’s nearly edge-on rings. The trailing hemispheres of all three moons are sunlit here, and wispy markings can be seen on the limbs of both Dione and Rhea. The diameter of Mimas is 397 kilometers, Dione is 1,118 kilometers and Rhea is 1,528 kilometers. The image was taken in visible blue light with the Cassini spacecraft narrow-angle camera on March 15, 2005, at a distance of approximately 2.4 million kilometers from Saturn. The image scale is 14 kilometers per pixel. Credit: NASA/JPL/Space Science Institute.

So how much oxygen are we talking about? The Cassini plasma spectrometer, and in particular its electron spectrometer, found peak densities of oxygen of around 50 billion molecules per cubic meter, with carbon dioxide levels at 20 billion molecules per cubic meter. Moreover, the plasma spectrometer saw strong indications of streams of positive and negative ions with masses that corresponded to ions of oxygen and carbon dioxide. The assumption that the oxygen in Rhea’s atmosphere comes from surface water ice seems reasonable given the rate at which Saturn’s magnetic field sprays the surface with energetic particles trapped in the field.

The carbon dioxide is susceptible to other interpretations, perhaps the irradiation of organic molecules trapped in Rhea’s water ice, but possibly the result of dry ice dating back to the solar nebula. A third possibility: Carbon dioxide could be the remnant of carbon-rich materials left by meteor strikes, all of which fits with the dark, carbon-based coating that appears on the surface.

The paper is “Cassini Finds an Oxygen-Carbon Dioxide Atmosphere on Saturn’s Icy Moon Rhea,” Science Express 26 November 2010 (abstract).


Pulsar Navigation for Deep Space

We’ve seen some remarkable feats of celestial navigation lately, not the least of which has been the flyby of comet Hartley 2 by the EPOXI mission. But as we continue our push out into the Solar System, we’re going to run into the natural limits of our navigation methods. The Deep Space Network can track a spacecraft from the ground and achieve the kind of phenomenal accuracy that can thread a Cassini probe through a gap in the rings of Saturn. But positional errors grow with distance, and can mount up to 4 kilometers per AU of distance from the Earth.
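To get a feel for how that error budget accumulates, here is a back-of-the-envelope sketch. The 4 kilometers per AU figure comes from the discussion above; the target distances are my own illustrative round numbers:

```python
# Ground-based tracking error grows roughly linearly with distance from Earth:
# about 4 km of positional uncertainty per AU (figure cited above).
ERROR_PER_AU_KM = 4.0

# Illustrative distances, not mission specifications.
targets = {
    "Jupiter (~5 AU)": 5.0,
    "Neptune (~30 AU)": 30.0,
    "Voyager 1 (~115 AU in 2010)": 115.0,
    "Inner Oort cloud (~10,000 AU)": 10_000.0,
}

for name, au in targets.items():
    print(f"{name}: ~{ERROR_PER_AU_KM * au:,.0f} km position uncertainty")
```

By the time a probe reaches even the nearest cometary distances, the uncertainty has grown to tens of thousands of kilometers, which is why an onboard, Earth-independent method becomes attractive.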

To go beyond the Solar System, we’ll need a method that works independently, without the need for ground station assistance. Pulsar navigation is one way around the problem. Imagine a spacecraft equipped with a radio telescope that can determine its position by analyzing the signals from distant pulsars. These super-dense remnants of stellar explosions emit a beam of electromagnetic radiation that is extremely regular, and as we’ve seen in these pages before, that offers a navigational opportunity, especially when we’re dealing with millisecond pulsars.

Scientists have been studying how to use pulsars for navigation since the objects were first discovered, and several proposals have surfaced that are based on measuring the time of arrival of pulses or the phase difference between pulses, all in reference to the Solar System barycenter, the center of mass for all orbiting objects in the system. But a new paper from Angelo Tartaglia (INFN, Torino) and colleagues takes a look at an operational approach for defining what they call an ‘autonomous relativistic positioning and navigation system’:

We assume that a user is equipped with a receiver that can count pulses from a set of sources whose periods and positions in the sky are known; then, reckoning the periodic electromagnetic signals coming from (at least) four sources and measuring the proper time intervals between successive arrivals of the signals allow to localize the user, within an accuracy controlled by the precision of the clock he is equipped with.

Moreover, the spacecraft determines its own position solely by reference to the signals it receives, which no longer have to flow from Earth:

This system can allow autopositioning with respect to an arbitrary event in spacetime and three directions in space, so that it could be used for space navigation and positioning in the Solar System and beyond. In practice the initial event of the self-positioning process is used as the origin of the reference, and the axes are oriented according to the positions of the distant sources; all subsequent positions will be given in that frame.

Hence the term ‘autonomous’ to describe the system. Marissa Cevallos did a terrific job on the pulsar navigation story in a recent online post, talking to the researchers involved and noting a key problem of conventional spacecraft navigation: We can use Doppler shift to calculate a spacecraft’s position, but we lack accuracy when it comes to generating a three-dimensional view of the vehicle’s trajectory. What pulsars could provide is the ability to place the spacecraft in that three-dimensional frame, as the Italian team was able to demonstrate through computer simulations using software that worked with artificial signals to test the method.

This is celestial navigation of a kind that conjures up sailing ships deep in southern seas in the 18th Century, using the stars to fix their position. We know, of course, that neither stars nor pulsars are fixed in the sky, but the regularity of pulsars is such a huge advantage that we can adjust for long-term movement. What is more problematic is the weakness of the pulsar signal, which could demand the use of a large radio telescope aboard the spacecraft. That will remain an issue for work outside the Solar System, but in the inner Solar System, the Italian team wonders whether we could combine pulsar signals with those of local transmitters. From the paper:

For the use in the Solar system, one could for instance think to lay down regular pulse emitters on the surface of some celestial bodies: let us say the Earth, the Moon, Mars etc. The behaviour of the most relevant bodies is indeed pretty well known, so that we have at hands the time dependence of the direction cosines of the pulses: this is enough to apply the method and algorithm we have described and the final issue in this case would be the position within the Solar system. In principle the same can be done in the terrestrial environment: here the sources of pulses would be onboard satellites, just as it happens for GPS, but without the need of continuous intervention from the ground: again the key point is a very good knowledge of the motion of the sources in the reference frame one wants to use.

What’s fascinating about this work is that while it does not consider the numerous technological problems involved in building such a positioning system, it does define an autonomous method which fully moves the positioning frame from Earth to spacetime, in what the authors call a ‘truly relativistic viewpoint.’ The paper goes on:

The procedure is fully relativistic and allows position determination with respect to an arbitrary event in flat spacetime. Once a null frame has been defined, it turns out that the phases of the electromagnetic signals can be used to label an arbitrary event in spacetime. If the sources emit continuously and the phases can be determined with arbitrary precision at any event, it is straightforward to obtain the coordinates of the user and his worldline.

The spacecraft using these methods, then, is fully capable of navigating without help from the Earth. For nearby missions, emitters on inner system objects can supplement the observation of a single bright pulsar to produce the data necessary for the positional calculation, but deep space will demand multiple pulsars and the onboard capabilities of an X-ray telescope to acquire the needed signals. How we factor that into payload considerations is a matter for future engineering — right now the key task is to work out the feasibility of a pulsar navigation system that could one day guide us in interstellar flight.

The paper is Tartaglia et al., “A null frame for spacetime positioning by means of pulsating sources,” accepted for publication in Advances in Space Research (preprint). See also Ruggiero et al., “Pulsars as celestial beacons to detect the motion of the Earth” (preprint).

Related (and focused on the analysis of X-ray pulsar signals): Bernhardt et al., “Timing X-ray Pulsars with Application to Spacecraft Navigation,” to be published in the proceedings of High Time Resolution Astrophysics IV – The Era of Extremely Large Telescopes, held on May 5-7, 2010, Agios Nikolaos, Crete, Greece (preprint). Thanks to Mark Phelps for the pointer to this one.


Astrobiology on the Cheap

Keeping space missions separate can be a difficult challenge when so many satellites are launched on a single rocket. Take O/OREOS (Organism/Organic Exposure to Orbital Stresses). The small satellite rode into space on an Air Force Minotaur IV rocket on the 19th, a launch we noted here in connection with the NanoSail-D solar sail demonstrator. For NanoSail-D was itself carried into space as part of the FASTSAT payload bus (Fast, Affordable Science and Technology Satellite), and FASTSAT and O/OREOS were subsumed under a mission called Space Test Program S26. Not to mention a number of other satellites from universities and industry that hitched a ride on the same booster.

All of this produces not just confusion but acronym fatigue. Nonetheless, interesting science is in the works. O/OREOS is all about conducting astrobiology science experiments on the cheap using nanosatellites (CubeSats), helping scientists plan future experiments on how organic molecules are changed by exposure to space. Says Pascale Ehrenfreund (a project scientist at George Washington University):

“The O/OREOS science team is excited to receive the first real-time measurements from samples onboard two science experiments. This will demonstrate that CubeSat technologies can be used for future missions to address fundamental astrobiology objectives.”

The satellite will conduct its experiments, some lasting as long as six months, autonomously after receiving the command from the ground station in Santa Clara, California. The Space Environment Viability of Organics (SEVO) experiment aims to monitor four classes of organic molecules as they are exposed to space conditions, using molecules now known to be widely distributed throughout the galaxy. The organics, housed in micro-environments, are to be exposed to solar ultraviolet and visible light, trapped-particle radiation (energetic protons and electrons held by the Earth’s magnetosphere) and cosmic radiation. Researchers will look for changes in UV, visible and near-infrared light absorption.

Image: This artist’s concept represents complex organic molecules, known as polycyclic aromatic hydrocarbons. These large molecules, composed of carbon and hydrogen, are distributed throughout the Milky Way and other galaxies like it. They play a significant role in star and planet formation and are very common on Earth. They form any time carbon-based materials are not burned completely. They can be found in sooty exhaust from cars and airplanes, and in charcoal broiled hamburgers and burnt toast. Aromatic molecules identified in meteorites might have been beneficial to the origin of life on Earth. Image credit: NASA/JPL-Caltech

The Space Environment Survivability of Live Organisms (SESLO) payload, meanwhile, will look at the activity of microorganisms adapting to the space environment, measuring their growth and health while exposed to weightlessness and radiation. The experiment carries two kinds of microbes in a dried, dormant state, Halorubrum chaoviatoris (found in salt ponds) and Bacillus subtilis (common in soil), and will revive and grow them in orbit. The O/OREOS researchers will measure their population density and color changes as the microbes consume dyed liquid nutrients.

CubeSats are a useful way to proceed because they’re so much less expensive than the alternatives, as Ehrenfreund notes:

“Secondary payload nanosatellites, like O/OREOS are an innovative way to extend and enhance scientists’ opportunities to conduct research in low Earth orbit by providing an alternative to the International Space Station or space shuttle investigations. With O/OREOS we can analyze the stability of organics in the local space environment in real-time and test flight hardware that can be used for future payloads to address fundamental astrobiology objectives.”

All of that, of course, plays into the much larger picture of contributing to our knowledge of life’s origin and distribution in the universe. Not bad for a nanosatellite no larger than a loaf of bread that weighs a mere 5.4 kg, operating in a low-Earth orbit some 644 kilometers above the surface. O/OREOS is the first NASA CubeSat to fly with two distinct and independent science experiments on an autonomous satellite, and is constructed from off-the-shelf commercial and NASA-designed parts to operate as an automated, self-contained space laboratory. The O/OREOS mission dashboard is operational, as is the Twitter feed @NASA_OOREOS.


A Cosmic Gravitational Wave Background?

A gravitational wave is a ripple in spacetime, one that follows naturally from the theory of general relativity — Einstein did, in fact, predict the existence of such waves back in 1916. Yet so far we have had nothing but an indirect detection in the form of the Hulse-Taylor binary (PSR B1913+16), a pulsar in a binary system that includes a second neutron star, the two orbiting around a common center of mass. The 1993 Nobel Prize in physics went to Russell Hulse and Joseph Hooton Taylor (Princeton University), who showed that the system’s orbital decay corresponds with the loss of energy due to the kind of gravitational waves Einstein predicted.

What we now need is a direct detection, but these waves have proven to be a tricky catch. Consider this: The distance between two spacecraft flying five million kilometers apart would be changed by about a picometer by the effects of gravitational waves. That’s a distance 100 million times smaller than the width of a human hair, some 0.000000000001 meters. Yet it’s the distance scientists hope to measure with the LISA mission (Laser Interferometer Space Antenna), in which three spacecraft will fly in a triangle connected by laser beams, a formation flight orbiting the Sun roughly 20 degrees behind the Earth.
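A quick sanity check puts those numbers in the units physicists actually use. Gravitational wave detectors are characterized by dimensionless strain, the fractional change in length; the arm length and displacement are from the figures above, the strain is my own arithmetic:

```python
# LISA's target sensitivity expressed as dimensionless strain h = dL / L.
arm_length_m = 5.0e9      # 5 million kilometers between spacecraft
displacement_m = 1.0e-12  # ~1 picometer of path-length change to resolve

strain = displacement_m / arm_length_m
print(f"h ~ {strain:.0e}")  # prints "h ~ 2e-22"
```

A fractional length change of a few parts in 10^22 is the scale LISA must resolve, which is why suppressing laser noise below the signal is the whole game.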

The plan: Aboard each of the spacecraft will be a cube of platinum and gold that floats freely in space. Passing gravitational waves should cause the distance between the cubes to vary, and the good news out of the Jet Propulsion Laboratory this week is that after six years of working on the LISA technology, scientists have tuned its phase meters (laser beam detectors) to the point where such a detection should be possible, the laser ‘noise’ dropping sufficiently to allow the evanescent wave signature through. Thus JPL physicist Bill Klipstein:

“In order to detect gravitational waves, we have to make extremely precise measurements. Our lasers are much noisier than what we want to measure, so we have to remove that noise carefully to get a clear signal; it’s a little like listening for a feather to drop in the middle of a heavy rainstorm.”

A feather in a rainstorm indeed, and even that seems to understate the case. JPL is now demonstrating that its instruments are sensitive enough to make detecting gravitational waves a possibility. The principle should sound familiar, as it’s basically interferometry (though with a time delay), often discussed here in terms of pooling the resources of multiple telescopes so as to produce an effective aperture equal to the separation of the telescopes. In LISA’s case, passing gravitational waves alter the distances between the test masses aboard the three spacecraft, a change revealed by the distances traveled by the laser beams.

Ground-based data processing will then tell us whether the light detected by the onboard phase meters shows any variation in distance between the spacecraft. By introducing artificial noise into their detectors, the JPL team has been able to show that its data processing techniques can filter it out, highlighting those one-picometer distance changes scientists hope to see. The LISA mission is a joint project between the European Space Agency and NASA that would launch around 2020 if selected. The National Research Council’s decadal report, which more or less put an end to the Space Interferometry Mission, has given LISA a high recommendation.

Image: The gravitational-wave sky, as observed by LISA. The plane of the Galaxy is visible as the white horizontal band of emission from millions of Galactic binaries. The dots and squares mark the locations of a small fraction of the black-hole mergers and capture events that LISA will observe, while the purple background represents the relic gravitational radiation that LISA may detect from the very early Universe. Credit: NASA/ESA.

If it succeeds, the LISA mission will open up a new way of observing the universe, looking for low frequency gravitational waves (0.03 milliHertz to 0.1 Hertz), a band thought to contain the emission from massive black hole binaries of the kind that form after the merger of entire galaxies. Gravitational waves are all about huge astronomical events like this, generated by compact objects like stellar remnants falling into galactic black holes and neutron stars in tight binaries. The behavior of spacetime when pushed to extremes may teach us not only about dense matter and stellar remnants, but about the expansion history of the universe itself.

What would a Cosmic Gravitational Wave Background look like, and what could it tell us about the earliest moments of the universe? Remember that the Cosmic Microwave Background emerged about 400,000 years after the Big Bang, and represents the oldest light we can see via electromagnetic astronomy. Gravitational waves should be able to propagate to us from before that era, making them a probe of the early universe that could identify the existence of new fundamental objects like cosmic superstrings, if they exist, thus offering a window into the various models of string theory and sketching the history of the phase transitions that shaped the cosmos.


A ‘Benchmark’ Brown Dwarf

The WISE mission has received a lot of press in terms of discovering nearby brown dwarfs, but it’s clear that finding low-temperature objects is a major investigation at many Earth-bound sites as well. That includes the UKIRT (United Kingdom Infrared Telescope) Deep Sky Survey’s project to find the coolest objects in our galaxy, an effort that has paid off in the form of a unique binary system. One of the objects here is a cool, methane-rich T-dwarf, while the other is a white dwarf, the two orbiting each other despite a separation of a quarter of a light year.

Understanding Brown Dwarf Atmospheres

We need to put this find in context. In the absence of hydrogen fusion at the core, brown dwarfs depend upon gravitational contraction as their internal energy source. Cooling slowly over time as they shed their energies, brown dwarfs emit most of their radiation in the infrared, with spectra showing absorption bands of water, methane, carbon monoxide and other molecules in the stellar atmosphere, the absorption patterns being dependent on the star’s temperature. And in the study of brown dwarfs, what’s going on in that atmosphere has a lot to say about what we can surmise.
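Wien’s displacement law makes the point about infrared emission concrete. A quick illustration, with the temperatures below chosen as rough representative values rather than measurements from this system:

```python
WIEN_B = 2.898e-3  # Wien displacement constant, meter-kelvins

def peak_wavelength_um(temp_k):
    """Blackbody peak emission wavelength in microns for a temperature in kelvin."""
    return WIEN_B / temp_k * 1e6

for label, t in [("Sun-like star", 5800), ("L dwarf", 1800), ("T dwarf", 1000)]:
    print(f"{label} (~{t} K): peak emission near {peak_wavelength_um(t):.1f} microns")
```

A Sun-like star peaks near half a micron, in visible light, while a T-dwarf near 1000 K peaks around 3 microns, squarely in the infrared — hence the need for surveys like UKIRT and 2MASS to find these objects at all.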

A number of surveys, ranging from the 2-Micron All Sky Survey to the UKIRT effort and the Sloan Digital Sky Survey, have been identifying brown dwarfs and pushing our knowledge down into the range of very low temperature objects. But the paper on the binary find notes the fact that many of the processes going on in brown dwarf atmospheres are not well understood, adding:

…the nature of BD [brown dwarf] evolution means that the mass-luminosity relation depends strongly on age, and in the absence of well constrained atmospheric properties there is no way to accurately determine mass and age… Identifying objects where one can pin down these properties independently can help aid the calibration of models.

But we do know a good deal about white dwarfs, so finding brown dwarfs in association with white dwarfs is helpful. Only a few such binaries have been identified — five, to be precise — and these five pair white dwarfs with the somewhat warmer L-dwarfs. The new binary system is the first discovery of a T-dwarf in association with a white dwarf, and all indications are that it has survived for close to 5 billion years, the wide separation reflecting the loss of mass as the white dwarf expelled its outer layers and thus weakened the gravitational pull between the stars.
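The widening of the orbit follows from a standard result: for slow, isotropic mass loss, a binary’s separation grows in inverse proportion to the total mass. A rough sketch of the effect — the masses and initial separation below are illustrative guesses, not values from the paper:

```python
def widened_separation(a_initial_au, m_total_initial, m_total_final):
    """Final separation after slow (adiabatic) mass loss, using the
    standard relation a_final / a_initial = M_initial / M_final."""
    return a_initial_au * (m_total_initial / m_total_final)

# Illustrative: a 1.5 solar-mass progenitor plus a 0.05 solar-mass brown
# dwarf, with the progenitor shedding its envelope to leave a 0.6 solar-mass
# white dwarf.
a0 = 6000.0  # AU, assumed initial separation
a_final = widened_separation(a0, 1.5 + 0.05, 0.6 + 0.05)
print(f"{a_final:,.0f} AU")  # ~14,300 AU
```

With numbers in this range the orbit more than doubles in size, comfortably on the way to the quarter light-year (roughly 16,000 AU) separation seen today — provided the mass loss was gentle enough not to unbind the pair.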

The Beauty of a Binary

What we have, then, is a look into the physics of ultra-cool stellar temperatures (temperatures less than 1000 degrees Celsius), with the white dwarf establishing the age of both objects. UKIRT scientists are referring to the find as a ‘Rosetta stone’ for methane dwarfs like the one in this system. The T-dwarf is about Jupiter-size and, like the gas giant, is too cool to power up hydrogen fusion, so that it becomes cooler and cooler over time. The white dwarf companion is a star that, having used up its nuclear fuel, has expelled its outer layers, leaving a cooling core about the size of the Earth, in a process that will eventually happen to our Sun.

What we have in the new binary is a system in which the so-called ‘planetary’ nebula formed by white dwarf material has fully dissipated over time, leaving us with the two widely spaced stars. Says Avril Day-Jones (Universidad de Chile):

“In about 6 billion years’ time, when our Sun ‘dies’ and becomes a white dwarf itself, the stars in the newly-discovered system will have changed dramatically. The methane dwarf will have cooled to around room temperature, and the white dwarf will have cooled to 2700 Celsius or the temperature of the methane dwarf at the start of its life.”

The twin objects are now known as LSPM 1459+0857 A and B, a binary that has held together despite the perturbations of the white dwarf’s history and the system’s own passage through the galactic disk. The paper notes that “This system is an example of how wide BD binary companions to white dwarfs make good benchmark objects, which will help test model atmospheres, and may provide independent means to calibrate BD properties of field objects.”

And although the binary is the first candidate system under study by the UKIRT team, the expectation is that many more will be found by combining the brown dwarf search with survey results on white dwarfs from the Sloan Digital Sky Survey. The paper calls for follow-up parallax measurements of the two components and fuller spectral studies of the T-dwarf, which would improve our estimates of the system’s age, peg the radius and mass of the white dwarf, and thus maximize the effectiveness of the benchmark provided by the cool brown dwarf.

The paper is Day-Jones et al., “Discovery of a T dwarf + white dwarf binary system,” accepted by Monthly Notices of the Royal Astronomical Society (preprint).