The Exoplanets Rising conference, now in progress at the Kavli Institute for Theoretical Physics (UC Santa Barbara), is offering a treasure trove of online material, including one I’m currently watching, a presentation by Magali Deleuil (Astronomy Observatory of Marseilles Provence) on CoRoT results. It’s extraordinary for those of us who couldn’t be at the conference to have quick access to talks by the likes of Michel Mayor, Lisa Kaltenegger, Geoff Marcy and Debra Fischer on everything from transit puzzles to metallicity trends.
Interesting to note that CoRoT is now, according to Deleuil, at the end of the ‘nominal lifetime of the instrument,’ although CoRoT’s extended mission has been approved and the spacecraft will remain in operation until March of 2013. Deleuil says that thirteen observing runs have been completed, totaling 75,000 light curves from ‘stares’ of more than 60 days and 50,000 light curves from 25-day stares. Since February of 2007, 150 planetary candidates have emerged, fifty of which have transit depths of less than 3 milli-magnitudes (mmag).
Image: One of the methods for detecting exoplanets is to look for the drop in brightness they cause when they pass in front of their parent star. Such a celestial alignment is known as a planetary transit. Such transits block a tiny fraction of the light that COROT is able to detect. Credit: CNES.
‘Transit depth’ refers to the amount of the star’s light that is blocked by the planet. The largest transit depth yet recorded belongs to HD 189733b, a ‘hot Jupiter’ in Vulpecula that shows a depth of approximately 3 percent. Transiting gas giants generally have transit depths on the order of 1 percent. All this is useful because transit depth can be used to calculate the size of the planet. The depth is equal to the ratio of the area of the planet to the area of the star, and we know the rough size of the star because we know its spectral type.
The 3 mmag Deleuil mentioned in her talk works out to just 0.3 percent, a tribute to the sensitivity of the instrument in detecting smaller worlds. Just how sensitive is CoRoT? Deleuil says the instrument can detect Neptune-size planets no matter what the magnitude of the star, but a two Earth-radii planet requires stars brighter than 14th magnitude. Between Kepler and CoRoT we are going to build a useful catalog of ‘super Earths’ as we keep homing in on a true Earth twin.
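The arithmetic behind those figures is simple enough to sketch. The following Python snippet (my own back-of-envelope check, not from the talk) converts a transit depth quoted in milli-magnitudes into a fractional dip in starlight, and turns a fractional dip into the implied planet-to-star radius ratio:

```python
import math

def depth_from_mmag(delta_mmag):
    """Fractional drop in flux for a transit depth given in milli-magnitudes.

    A magnitude difference dm corresponds to a flux ratio of 10**(-0.4 * dm),
    so the fractional dip is 1 - 10**(-0.4 * dm).
    """
    dm = delta_mmag / 1000.0  # milli-magnitudes -> magnitudes
    return 1.0 - 10.0 ** (-0.4 * dm)

def radius_ratio(depth):
    """Planet-to-star radius ratio implied by a fractional transit depth.

    The depth equals the ratio of the disk areas: depth = (Rp / Rs)**2,
    so Rp / Rs = sqrt(depth).
    """
    return math.sqrt(depth)

# CoRoT's 3 mmag threshold works out to roughly a 0.3 percent dip
print(depth_from_mmag(3.0))   # ~0.00276

# HD 189733b's roughly 3 percent depth implies Rp/Rs of about 0.17
print(radius_ratio(0.03))     # ~0.173
```

Given the star's radius from its spectral type, that ratio hands you the planet's radius directly, which is what makes transit depth such a useful observable.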
I have a lot of listening ahead — David Charbonneau (Harvard CfA) is next on my list with an overview of transit discoveries, and I want to check Paul Kalas (UC Berkeley) on direct detection results. Chris Sotin (JPL) speaks this morning on ocean worlds and Lisa Kaltenegger (Harvard CfA) this afternoon on habitability and biosignatures. Abstracts and poster presentations are also available on the Kavli site, making it a real watering hole for those following the exoplanet hunt.
I know I should be staggered by everything about the Large Hadron Collider, but frankly, what really has me jazzed this morning is that I’m writing this with a window on one side of my screen showing a live webcast from CERN and another in an upper corner showing a Saturnian moon. There is something truly science fictional about being able to follow ongoing events both here and in space from a PC fed by a worldwide dataflow, and what events they are. The LHC is emphatically in business. Following the successful collision of two 3.5 teraelectronvolt beams (1106 GMT), CERN director general Rolf Heuer said the obvious: “It’s a great day to be a particle physicist.”
Remarkable things will come out of the LHC, and it’s stunning to see the quality of data flowing out of CERN from events that are no more than an hour old, a tribute to the quality of the installation and the team behind it. I’m keeping a CERN window open on my (wide) screen, and the Twitter flow via #LHC is great fun — much jesting about the failure to produce black holes or time travelers, and one tweet (via @tiannwillemse) with this: “LHC destroys universe. News update: Backup universe seamlessly springs into existence. Calamity averted.”
Time will tell what new physics will eventually flow from the LHC. And what about the odd news from Mimas? New high-resolution imagery shows unusual patterns, among which is an unexpected hot region that, as shown in the image, looks for all the world like the iconic Pac-Man. Mimas, remember, is the moon with the enormous Herschel Crater, but we’re finding that it’s unusual in other ways as well, based on data from Cassini’s February 13 flyby.
Image: This figure illustrates the unexpected and bizarre pattern of daytime temperatures found on Saturn’s small inner moon Mimas (396 kilometers, or 246 miles, in diameter). The data were obtained by the composite infrared spectrometer (CIRS) on NASA’s Cassini spacecraft during the spacecraft’s closest-ever look at Mimas on Feb. 13, 2010. Credit: NASA/JPL/GSFC/SWRI/SSI.
Why do we see such unusual temperature variations? Cassini’s composite infrared spectrometer shows that the warmest region of the moon is not found in early afternoon along the equator but in the morning, along one edge of the moon’s disc (hence the Pac-Man shape), where temperatures reach 92 K. Contrast that with the average 77 K on the rest of the moon, with a warm spot found around the Herschel Crater at 84 K.
No one has a quick answer on the odd V-shaped pattern, but theories are emerging:
“We suspect the temperatures are revealing differences in texture on the surface,” said John Spencer, a Cassini composite infrared spectrometer team member based at Southwest Research Institute in Boulder, Colo. “It’s maybe something like the difference between old, dense snow and freshly fallen powder.”
You would also assume that icy debris from Saturn’s E-ring would keep Mimas light in color, but what we see on the surface are dark streaks along crater walls and dark debris at the foot of each wall. The image below tells the story. One possibility is that Mimas accumulates silicate minerals or carbon-rich particles over time, possibly the result of impurities already embedded in surface ice. The darker material would be concentrated and left behind when the brighter ice evaporates.
Image: This false-color view of Saturn’s moon Mimas from NASA’s Cassini spacecraft accentuates terrain-dependent color differences and shows dark streaks running down the sides of some of the craters on the region of the moon that leads in its orbit around Saturn. The image was taken during Cassini’s closest-ever flyby of the moon. The false-color image shows how colors vary across the moon’s surface, particularly the contrast between the bluish terrain on the right side of this view near Herschel Crater and greenish terrain elsewhere. The origin of the color differences (exaggerated by computer enhancement) is not yet understood, but may be caused by subtle differences in the surface composition between the two terrains. Credit: NASA/JPL/SSI.
Mimas has received little attention given the surprises we’ve found on Enceladus, but now we see that it has mysteries of its own. The larger lesson is that extending our reach into new areas invariably confounds older assumptions, as we have found in the exoplanet hunt from the discovery of the first pulsar planets on. Who knows what further surprises our own system holds (I’m looking at you, New Horizons), and who knows what fundamental insights we may gain as the work at CERN ramps up? Theories give us possibilities, but it’s the accumulation of raw data that tells the story.
If we’re going to get lucky with SETI, it’s probably going to be through the reception of an interstellar beacon rather than the chance detection of an electronic emission from space. Sure, chance catches are possible, and for all we know odd receptions like the Wow! signal of 1977 might be cases in point. But we can’t confirm such signals because they’re one-shot affairs, whereas a beacon, designed to be received over interstellar distances, just might give us other options.
Understanding the Interstellar Beacon
So what can we say about beacons? In a guest editorial for the SETI League, former NASA SETI signal detection analyst Bob Krekorian takes a shot at the problem. Krekorian assumes a space-faring species will place its transmitter inside the habitable zone, as close to the parent star as feasible, to take advantage of the huge amounts of energy available there. If we were building such a beacon, we might decide to place it between the orbits of Earth and Venus, for energy reasons and efficient servicing.
Image: Looking toward galactic center in the infrared. Finding an extraterrestrial beacon may mean re-examining the assumptions that drive the search. Credit: Suzan Stolovy (SSC/Caltech) et al., JPL-Caltech, NASA.
In any case, imagine a beacon that rotates synchronously as it orbits its star. Rather than sending out a beam in all directions, it operates so that, in Krekorian’s words, “a strip of declination is illuminated at its orbital opposition and conjunction modes.” The advantages of this more targeted method seem considerable, as the author goes on to explain:
For potential contactees, the result is a gain of two orders of magnitude in received power over an omni-directional antenna and elimination of the Doppler drift accelerations from both planetary rotation and orbital motion. Assuming a continuous wave or a pulse signal, the signal types expected, detection is a much easier proposition, since the signal trajectory does not cut across detection channels in the time-frequency plane during the observation period.
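That two-orders-of-magnitude figure is easy to sanity-check with beam geometry. An antenna’s gain over an omnidirectional transmitter is roughly the ratio of the full sphere’s solid angle to that of the illuminated patch. This sketch (my own back-of-envelope arithmetic, not Krekorian’s, and assuming a thin declination strip centered on the celestial equator) asks how narrow the strip must be to achieve a gain of 100:

```python
import math

def strip_gain(width_deg):
    """Gain of a beam illuminating a full band of declination.

    A thin strip of angular width w (radians) around the celestial equator,
    covering all right ascensions, subtends roughly 2*pi*w steradians,
    versus 4*pi for an omnidirectional antenna, so gain ~ 2 / w.
    """
    w = math.radians(width_deg)
    return (4.0 * math.pi) / (2.0 * math.pi * w)

def strip_width_for_gain(gain):
    """Strip width in degrees needed to achieve a given gain."""
    return math.degrees(2.0 / gain)

# A 100x power gain corresponds to a strip only about 1.15 degrees wide
print(strip_width_for_gain(100.0))
```

A strip a bit over a degree wide is easily achievable with a modest aperture, which is what makes the scheme plausible: the beacon trades all-sky coverage for a sweeping band that still, over an orbit, illuminates a predictable set of targets.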
All of which has further implications for how we proceed:
When the assumption is made that the signal remains stationary in a channel, coherent detection (match filtering) methods can be used to significantly improve detection sensitivity. Finally, one of the biggest challenges facing potential SETI recipients like us or anybody is the monumental real time signal processing (computer operations per second) effort that is required, if all possible signal drift paths resulting from planetary rotation are considered.
A Galactic Test for Membership?
The problems in finding such a signal are, of course, enormous, and it’s interesting to speculate, as Krekorian goes on to do, about ways to increase the information rate in a communications channel. The Shannon-Hartley theorem shows us the connection between channel capacity (bits per second), bandwidth and signal-to-noise ratio (SNR). The interplay of these factors is intriguing because it sets up a potential test (what Krekorian calls ‘an entrance fee’) for joining any organization of communicating civilizations.
You would think, for example, that increasing bandwidth is always a preferable option, for it allows you to pump more bits per second into your signal. But widening the channel also admits more noise power, which drags down your signal-to-noise ratio. Having found a signal, here’s how we could cope with this at our end:
SNR is directly proportional to telescope collecting area, so once we find the signal, larger collecting areas can be built to recover SNR and take advantage of the high data rates. This puts some of the burden of maximizing channel capacity on the receiving party, maybe the entrance fee required to join the galactic telecommunication federation.
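The tradeoff Krekorian describes falls straight out of the Shannon-Hartley theorem: capacity is C = B·log2(1 + S/N), and for flat thermal noise the noise power grows with bandwidth (N = N0·B), so widening the channel buys capacity only with diminishing returns, while a bigger collecting area raises S and recovers SNR. A short illustration with made-up signal and noise figures (the specific numbers are placeholders, not measured values):

```python
import math

def capacity(bandwidth_hz, signal_w, noise_density_w_per_hz):
    """Shannon-Hartley channel capacity in bits per second.

    C = B * log2(1 + S / N), where total noise power N = N0 * B
    for a flat (thermal) noise spectral density N0.
    """
    noise_w = noise_density_w_per_hz * bandwidth_hz
    return bandwidth_hz * math.log2(1.0 + signal_w / noise_w)

S, N0 = 1e-20, 1e-26  # hypothetical received power and noise density

# Widening the band raises capacity, but each added hertz is worth less,
# because the wider channel also admits more noise power.
for B in (1e3, 1e4, 1e5):
    print(f"B = {B:8.0f} Hz -> C = {capacity(B, S, N0):10.0f} bit/s")

# Doubling the collecting area doubles S, recovering SNR at fixed bandwidth.
print(capacity(1e5, 2 * S, N0) / capacity(1e5, S, N0))
```

Running this shows total capacity still climbing with bandwidth while the bits per hertz fall, which is exactly why the burden of keeping the channel usable shifts to the receiver’s collecting area.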
To Find an Interstellar Roadmap
Intriguing stuff, and it’s important to note Krekorian’s point that SETI, as formerly practiced at NASA, proceeded in two ways: a targeted search looked at a single star for 1000 seconds, and an all-sky survey looked briefly throughout the celestial sphere. Both of these strategies would fail if we assume that a stellar orbital beacon of the kind Krekorian outlines is broadcasting. Let’s assume that any civilization smart enough to build such a beacon and powerful enough to keep it in operation would know the pitfalls emerging civilizations like ours would encounter. In that case, what would they do to make it obvious?
Is there, then, an interstellar roadmap that is, in Krekorian’s words, “obvious to every emerging civilization like us based on the fundamentals of physics, astrophysics and mathematics, especially number theory, to connect one to the broadcast”? Perhaps, as Krekorian muses in an earlier SETI League editorial, the acquisition signal will be in the form of a pointer that directs us to the primary communications channel. In any case, we would expect that extraterrestrials choosing to make themselves known would make the problem of detecting their signal as straightforward as possible. In the absence of government funding, we now rely on the private sector to uncover possible strategies.
Kepler is going to give us a better idea of how likely we are to find such a beacon. If terrestrial worlds in the habitable zones of their stars turn out to be relatively common, we’ll be spurred to ask why we have yet to receive a signal and ponder whether SETI’s methods are not, as Krekorian implies, severely behind the times. If terrestrial worlds are found to be scarce, the problem is magnified by the potential distance between us and possible senders. SETI will continue in either case, but so must the relentless effort to discover what methods would most reliably result in a signal, assuming there is one out there to be had.
Related: The Guardian reviews Paul Davies’ The Eerie Silence.
Ordinary baryonic matter (think protons and neutrons) is thought to account for no more than one-sixth of the total mass in the universe, the rest being dark matter that does not reflect or emit light. Usefully, though, dark matter does interact with the rest of the universe through gravity, and it can be probed by studying gravitational lensing. Here the light of distant galaxies is deflected by the gravity of foreground concentrations of mass. All matter, whether baryonic or dark, is sensitive to this effect, making it possible to study dark matter on a large scale.
Data from the Cosmic Evolution Survey (COSMOS) offer this possibility, revealing just how dark matter is distributed in the cosmos. And by supplementing COSMOS with redshift data, we’re finding that the survey offers clues to dark energy as well. But first, some background on how COSMOS data have been used in dark matter work. The cornerstone was 1,000 hours of observations with Hubble’s Advanced Camera for Surveys, the largest project the telescope has ever conducted.
Structure on the Largest Scale
COSMOS is gigantic, covering 1.6 square degrees of sky (about nine times the area of the full Moon), and the survey has been supplemented with spectra from the European Southern Observatory’s Very Large Telescope and imagery from the Subaru and CFHT telescopes in Hawaii, as well as data from ESA’s XMM-Newton space observatory. In 2007 we saw the result: confirmation of the idea that normal matter accumulates along the densest concentrations of dark matter. The map shows a network of filaments that intersect in massive structures that correspond to the location of galactic clusters. From the paper on this work:
Our results are consistent with predictions of gravitationally induced structure formation, in which the initial, smooth distribution of dark matter collapses into filaments then into clusters, forming a gravitational scaffold into which gas can accumulate, and stars can be built.
The ‘scaffold’ metaphor is powerful, and the image below offers the overview:
Image: This three-dimensional map offers a first look at the web-like large-scale distribution of dark matter, an invisible form of matter that accounts for most of the Universe’s mass. The map reveals a loose network of dark matter filaments, gradually collapsing under the relentless pull of gravity, and growing clumpier over time. This confirms theories of how structure formed in our evolving Universe, which has transitioned from a comparatively smooth distribution of matter at the time of the big bang. The dark matter filaments began to form first and provided an underlying scaffolding for the subsequent construction of stars and galaxies from ordinary matter. Credit: NASA, ESA and R. Massey (California Institute of Technology).
The implication is that without dark matter, the universe would not have contained enough mass for structures to collapse and galaxies to form. The map extends roughly halfway back to the beginning of the universe and reveals the increasingly clumpy character of dark matter as it collapses under gravity. The international team behind this work is saying that the map goes beyond inferences of dark matter to direct observations revealing its effects.
Understanding how dark matter is distributed in space and time is essential if we’re to make sense of the way galaxies appeared and clustered. And this is interesting: The galaxies with active star formation are found in less populated voids and dark matter filaments. Says Nick Scoville (Caltech):
“It is remarkable how the environment on the enormous cosmic scales seen in the dark matter structures can influence the properties of individual stars and galaxies – both the maturity of the stellar populations and the progressive “down-sizing” of star formation regions to smaller galaxies is clearly dependent on the dark matter environment.”
COSMOS and Dark Energy
I bring all this up as necessary background for the more recent dark energy work, which likewise taps the COSMOS survey and its 575 overlapping views of the same part of the universe. For galactic clustering provides clues to dark energy, whose repellent force on matter may have influenced dark matter clumps. The evolution of what we see in the cosmos today is driven on the one hand by gravitational collapse and on the other by the accelerating expansion of the universe. Here I’ll quote Ludovic Van Waerbeke (University of British Columbia), who has worked with a team led by Tim Schrabback (Leiden University) in studying the 446,000 galaxies in the COSMOS field. The researchers supplemented the Hubble data with redshift data from ground telescopes to assign distances to 194,000 of the COSMOS galaxies.
Van Waerbeke lays out the findings:
“Our results confirmed that there is an unknown source of energy in the universe which is causing the cosmic expansion to speed up, stretching the dark matter further apart exactly as predicted by Einstein’s theory.”
Here the dark matter is crucial, for the analysis depends upon the large-scale distribution of matter. Schrabback’s team worked with algorithms designed to make sense of the distorted shapes of distant galaxies as affected by weak gravitational lensing. The notion of a universe composed of dark matter, normal baryonic matter and dark energy receives powerful support:
“Dark energy affects our measurements for two reasons. First, when it is present, galaxy clusters grow more slowly, and secondly, it changes the way the Universe expands, leading to more distant — and more efficiently lensed — galaxies. Our analysis is sensitive to both effects,” says co-author Benjamin Joachimi from the University of Bonn.
“Our study also provides an additional confirmation for Einstein’s theory of general relativity, which predicts how the lensing signal depends on redshift,” adds co-investigator Martin Kilbinger (Institut d’Astrophysique de Paris).
Image: This image shows a smoothed reconstruction of the total (mostly dark) matter distribution in the COSMOS field, created from data taken by the NASA/ESA Hubble Space Telescope and ground-based telescopes. It was inferred from the weak gravitational lensing distortions that are imprinted onto the shapes of background galaxies. The color coding indicates the distance of the foreground mass concentrations as gathered from the weak lensing effect. Structures shown in white, cyan and green are typically closer to us than those indicated in orange and red. To improve the resolution of the map, data from galaxies both with and without redshift information were used. The new study presents the most comprehensive analysis of data from the COSMOS survey. The researchers have, for the first time ever, used Hubble and the natural “weak lenses” in space to characterise the accelerated expansion of the universe. Credit: NASA, ESA, P. Simon (University of Bonn) and T. Schrabback (Leiden Observatory).
This fascinating work is proof of the power of gravitational lensing. We generate massive datasets that can be mined and, when mixed with confirmatory information from other venues, used to create a picture of the universe’s structure at the largest scale. In doing so, we’re learning how much the night sky that we can see is shaped by forces that we are only now becoming capable of studying.
For more on the dark energy work, the paper is Massey et al., “Dark matter maps reveal cosmic scaffolding,” Nature 445 (7 January 2007), pp. 286-290 (abstract). The upcoming paper on dark energy in Astronomy & Astrophysics is Schrabback et al., “Evidence for the accelerated expansion of the Universe from weak lensing tomography with COSMOS.” This ESA news release is helpful.
Making discoveries with new space missions always seems frustratingly slow, probably because with missions like Kepler, our expectations are so high. So it’s interesting to ponder what all is involved in getting the data analyzed and the discoveries pegged. This post from the Kepler team’s Charlie Sobeck points out that the first five planets Kepler found were the result of six weeks of flight data and about 25 days of ground-based observing to eliminate the false positives and determine the mass of the planets and properties of the host stars.
Nothing runs as smoothly as we might wish. The Kepler team had to sort through much of the data manually because the data processing software is not yet fully functional at NASA Ames. But word from the site is that a major software upgrade has finished development and can now be applied to analysis of almost a full year of data. It’s also worth noting that the Kepler mission is now being managed by Ames rather than the Jet Propulsion Laboratory. Getting past that kind of management transition has got to be a spur for new results.
Sobeck does note a problem with one of the spacecraft’s detector modules, which stopped working on January 9:
Under normal operations, each module and its electronics convert light into digital numbers. For the darkest parts of the image between stars, we expect these numbers to be very small (but not zero). Correspondingly, for the brightest stars in the image, much larger numbers are expected creating an image of each observed star and its background neighborhood. The numbers we see coming out of failed module 3 are all very similar in size and considerably lower than the normal levels. These numbers produce an image that looks like the “snow” on a television that has very bad reception. There are no stars visible in these images.
The problematic module is on the periphery of the field of view, and because the spacecraft rotates by 90 degrees every three months, the problem should not compromise the mission — the module observes a different part of the Kepler field of view each season. You can see the module’s position (and the spacecraft’s rotation) in the following image:
Image: Kepler field of view showing position of the failed module for each season. Credit: Charlie Sobeck/Kepler science team.
The module may be recoverable, but several weeks will go by before the team makes the attempt, while the current situation is reviewed. But even in the worst-case scenario of total module failure, Sobeck points out that no part of the Kepler field of view is rendered unobservable by the problem. At worst, the reduction in results might reach 5 percent, a figure, he adds, that “…shouldn’t significantly affect the Kepler science performance.”