Earthly Windows into Dark Energy

While lamenting the budgetary problems of space-based missions like SIM — the Space Interferometry Mission — I often find myself noting in the same breath that technological advances have us doing things from the ground we used to think possible only from space. Make no mistake, we need to develop space-based interferometry for future studies of exoplanet atmospheres and their possible biomarkers. But it’s gratifying that the next generation of extremely large ground-based telescopes equipped with adaptive optics, like the Giant Magellan Telescope, will also give us powerful tools for studying exoplanets.

Image: An artist’s rendering of the Giant Magellan Telescope in its enclosure. Credit: Giant Magellan Telescope Organization.

The same holds true for another intriguing line of investigation. We’ve known about dark energy since the late ’90s, when two groups — the Supernova Cosmology Project and the High-z Supernova Search Team — discovered that the expansion of the universe is accelerating. It’s now believed that dark energy makes up some 70 percent of the mass and energy of the universe, a startling thought given that we know almost nothing about it. No wonder dark energy is considered by many the most significant problem facing physics today.

Pursuing its nature is something we can do in space, as we saw yesterday in discussing the WFIRST mission advocated by the National Research Council’s Decadal Survey. But we can do a great deal from the ground as well. HETDEX (the Hobby-Eberly Telescope Dark Energy Experiment) is a project to discover what dark energy is. An upgrade to the Hobby-Eberly Telescope at the University of Texas at Austin’s McDonald Observatory, HETDEX has just received an $8 million grant from the National Science Foundation to carry out its dark energy survey.

Here’s the method: HETDEX will survey approximately one million star-forming galaxies between 10 and 11 billion light years away, with the goal of learning whether dark energy is constant through time. At the heart of the survey is a spectrograph called VIRUS, to be assembled and aligned at Texas A&M University. VIRUS will be made up of 150 copies of a single spectrograph design, allowing it to capture spectra from 33,000 points on the sky simultaneously, with fiber optics carrying the light to the spectrograph array.
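To give a sense of what ‘between 10 and 11 billion light years’ means in redshift terms, here is a minimal sketch of my own (it is not part of the HETDEX pipeline, and the cosmological parameters are illustrative rather than the survey’s adopted values) that uses astropy to invert the light-travel time:

```python
# A rough illustration, not HETDEX software: convert a light-travel distance
# of 10-11 billion light years into the corresponding redshift, assuming an
# illustrative flat Lambda-CDM cosmology.
import astropy.units as u
from astropy.cosmology import FlatLambdaCDM, z_at_value

cosmo = FlatLambdaCDM(H0=70 * u.km / u.s / u.Mpc, Om0=0.3)

for lookback in (10 * u.Gyr, 11 * u.Gyr):
    z = z_at_value(cosmo.lookback_time, lookback)
    print(f"light emitted {lookback} ago comes from redshift z ~ {float(z):.1f}")
```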

Image: Artist’s concept of the upgraded Hobby-Eberly Telescope. The VIRUS spectrographs are contained in the curved gray “saddlebags” on the side of the telescope. They receive light through the green cables, which contain bundles of fiber-optic lines. This illustration shows the telescope without its enclosing dome. Credit: McDonald Observatory/HETDEX Collaboration.

In making this survey, HETDEX also opens up an extremely useful window on the early universe. Says Robin Ciardullo (Penn State):

“The HETDEX observations, in addition to revealing key information about the expansion history of the universe, will also provide important insights into the formation of galaxies. We will be obtaining information from a time before our sun was born. We believe that many of the objects HETDEX detects will someday evolve into galaxies similar to the Milky Way, so the experiment will also be providing a glimpse into our galaxy’s infancy.”

The HETDEX survey is to begin in January of 2012 and, according to reports from project partners, is currently on track. After three years of operations the survey will conclude and its data will be released to the public. Thus HETDEX joins existing dark energy projects like the Dark Energy Survey (Fermilab) and the Baryon Oscillation Spectroscopic Survey (part of the Sloan Digital Sky Survey). Will a firm fix on dark energy emerge from one or more of the three? Let’s hope so, for understanding what makes the cosmos accelerate is fundamental to physics.

Ultimately, the ability to make such surveys using space-based instrumentation holds out the promise of going even deeper into these mysteries, but it’s heartening that as we survive economic downturns and build toward an eventual space infrastructure, we can accomplish so much from our planetary surface. Learning what makes the universe tick is an end in itself, but there’s also the tantalizing possibility that at some point in the future, we’ll learn whether there is an energy here that can be put to work, turning a system-wide space infrastructure into an interstellar one.


Losing SIM: Thoughts on Exoplanetary Strategy

For all the excitement the Kepler mission has generated, we sometimes forget its limitations. Kepler is engaged in a transit hunt for exoplanets that will help us identify not just gas giants but planets the size of our own. But it’s a brute-force method, looking at a huge number of stars to identify the few whose planetary systems are aligned properly for us to see transits. And the necessary limitation is that when we do find terrestrial-sized worlds, we’ll be unable to do much by way of follow-up, because most of those planets will be thousands of light years away.

This is not to diminish Kepler’s critical work (nor that of CoRoT), for in no other way are we currently gaining this kind of overview of the planetary environment around a wide range of stars. But Philip Horzempa reminds us in a recent post on The Space Review that we have follow-up missions in the pipeline that are now losing their funding. Specifically, the Space Interferometry Mission (known as SIM Lite in its last incarnation) received no backing from the Astronomy and Astrophysics Decadal Survey, recently released by the National Research Council. If SIM still hangs by a thread, it is getting to be a very slim thread indeed.

The Next Phase of the Planet Hunt

I want to talk about SIM (and I’ll try to avoid elegiac tones) because it illustrates the kind of questions we face as we try to press beyond finding exoplanets to eventually bringing Earth-class planets into close scrutiny. In its various incarnations, SIM was a radically different concept from Kepler, one that would have used an interferometer to combine the light from separate telescopes to obtain high-resolution data on the positions of stars. So exquisitely sensitive would SIM’s instruments have been that they could have detected Earth-class planets as well as the familiar ‘hot Jupiters.’ More important still, SIM would have brought the terrestrial planet hunt closer to home by looking for nearby Earths circling Sun-like stars.
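To see why that kind of sensitivity is so demanding, it helps to estimate the size of the positional wobble involved. The sketch below is my own back-of-the-envelope calculation, not a SIM specification; it simply shows the scale of the signal an astrometric hunt for Earths around nearby stars has to measure.

```python
# Back-of-the-envelope astrometric wobble of a Sun-like star caused by an
# orbiting planet, as seen from a given distance. Illustrative only, not a
# figure from the SIM design documents.
EARTH_TO_SUN_MASS_RATIO = 3.0e-6  # approximate M_earth / M_sun

def astrometric_signature_uas(planet_mass_msun, a_au, distance_pc):
    """Angular wobble of the host star in microarcseconds.

    For a star of one solar mass: alpha [arcsec] ~ (M_planet / M_star) * a [AU] / d [pc].
    """
    return planet_mass_msun * a_au / distance_pc * 1e6

# An Earth analog in a 1 AU orbit around a Sun-like star 10 parsecs away:
wobble = astrometric_signature_uas(EARTH_TO_SUN_MASS_RATIO, 1.0, 10.0)
print(f"wobble ~ {wobble:.2f} microarcseconds")  # ~0.3 microarcseconds
```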

Detecting the telltale ‘wobbles’ of planets around the stars closest to us is an obvious priority, and whether we have SIM or not, we still need a plan that gets a comparable result. Let me quote Horzempa on what he thinks SIM’s significance would have been in terms of future missions that will look for biosignatures in the atmospheres of exoplanets:

The key will be finding the “exoEarths” that are close to us, meaning within 30 light-years. Only those Earths that are close enough to our solar system will reflect sufficient light from their parent Sun to allow telescopes, of any design, to examine them in detail. SIM is the only mission capable of detecting those nearby Earths. SIM will be a guide, a “GPS” for those who seek other Earths. Without SIM, all future endeavors to examine and chart those nearby Earths will stall.

I disagree with that last statement, although I do think SIM would have put us on a faster track. And the main reason for that is what Horzempa goes on to say:

In addition, SIM will be a pathfinder for the use of interferometers in space. This is a vital technology for future projects such as the planned Life Finder and Planet Imager missions, which will use arrays of space telescopes. Those arrays will use the technique of interferometry to combine their light and produce exquisite data, and maps, of those nearby Earths. Without SIM’s pioneering effort, those projects will be delayed for decades.

Losing SIM in the Budget

Horzempa is blunt about what happened to a SIM project that was already highly developed and ready to enter its implementation stage. What he calls the ‘big-ticket’ items like Constellation and the James Webb Space Telescope took precedence in the mid-2000s, with work on SIM reduced to a low but continuing level as ground testing of hardware proceeded. Given all that, what has him exercised is the fact that the Astro2010 report did not include SIM in its list of recommended missions for the next decade. The WFIRST mission is proposed instead, consuming the funding that might have made SIM a space-borne reality. Horzempa again:

The decision of Astro2010 to eliminate nearly all traces of funding for planet-hunting space telescopes is breathtaking. They did offer a token to those studying exoplanets by adding a microlensing capability to the WFIRST infrared telescope. Microlensing can detect exoplanets by detecting subtle brightening of stars. However, microlensing will only find exoplanets that orbit stars located 20,000 light-years away. Distance, as noted above, is crucial to any follow-up missions. Microlensing, like the Kepler mission, can detect Earth-sized planets, but they will be so far away that it will be almost impossible to locate the stars around which they orbit. Like Kepler, microlensing will return data on the statistics of planet sizes, but will lead us no closer to finding a warm, water-bearing Earth twin.

Should NASA put $2 billion into WFIRST, whose primary purpose is the study of dark energy? That is Horzempa’s central question. He notes that the already developed SIM would take about half as much to get ready for launch, leaving sufficient resources for a smaller mission with a dark energy focus. The key point is that while WFIRST exists only as a concept, SIM was the work of a team that had reached every technical milestone required of it and had built hardware.

Looking Toward Nearby Stars

Without SIM, the question becomes: where do we stand on identifying and characterizing nearby planets? WFIRST’s microlensing capabilities will be used to detect exoplanets tens of thousands of light years away, but what about the nearby stars whose planets’ atmospheres we will eventually study for signatures of life? Shouldn’t these stars be NASA’s logical focus after Kepler?

I pulled out the decadal survey again to look at its treatment of the terrestrial-planet hunting process. The report goes from the Kepler transit survey directly to WFIRST’s microlensing studies, followed up by improved radial velocity measurements on existing ground-based telescopes, looking for planets a few times more massive than Earth as targets for future missions. It calls on the James Webb Space Telescope to study the atmospheric or surface composition of small planets orbiting the coolest red stars. The missing link here is the precision that a SIM mission would bring to what the survey assumes will be ground-based follow-ups.
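For a sense of what that ground-based follow-up implies, here is a rough estimate of my own, using the standard circular-orbit approximation rather than anything in the survey report, of the radial velocity signal a planet a few times Earth’s mass would impose on a Sun-like star:

```python
# Rough radial-velocity semi-amplitude for a circular, edge-on orbit, using
# the common approximation K ~ 28.4 m/s * (Mp/M_Jup) * (M*/M_sun)^(-2/3)
# * (P/yr)^(-1/3). Illustrative only, not a figure from the decadal survey.
M_EARTH_IN_M_JUP = 1.0 / 317.8

def rv_semi_amplitude_ms(mp_mearth, period_yr, mstar_msun=1.0):
    """Stellar reflex velocity in m/s (circular orbit, sin i = 1)."""
    return 28.4 * (mp_mearth * M_EARTH_IN_M_JUP) * mstar_msun ** (-2.0 / 3.0) * period_yr ** (-1.0 / 3.0)

# A planet of three Earth masses in a one-year orbit around a Sun-like star:
print(f"K ~ {rv_semi_amplitude_ms(3.0, 1.0):.2f} m/s")  # ~0.27 m/s, below the ~1 m/s precision of the era's best spectrographs
```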

That precision could help us greatly in identifying small planets around stars close to the Sun. Right now we have three ongoing studies of the Alpha Centauri system, looking for planets around Centauri A and B. We’ve just been through a wave of public interest in a nearby planet that may not even exist — Gliese 581g — because it was thought to be in the habitable zone of its star. Nearby stars exert an understandable fascination because of the prospect of closer study.

The other missing link is the technology of space-based interferometry. Budgetary realities are what they are, and the decadal survey process has served us well for many years in prioritizing science. Yet thankfully, there is a sense in which good mission ideas never really die. To get those biosignatures we hope to find and, one day, actual images of a distant planetary surface, we’ll tap the interferometric expertise that is SIM’s legacy. But the betting here is that a true terrestrial planet finder using these methods is still decades away.


Exoplanet Atmospheres: What We Don’t Know

What happens in the atmosphere of a tidally locked world in the habitable zone of a red dwarf? We have solid simulation work suggesting that habitable conditions could exist there, but it’s also true that we’re in the early stages of these investigations and we have no actual examples to work with. Drawing hasty conclusions is always dangerous, particularly when we’re talking about the details of atmospheric circulation on a planet no one has ever seen.

Take Gliese 581g. Assuming it exists — and there is still a bit of doubt about this, although the consensus seems to be that it’s really there — we can place it in a temperature zone that would allow life. We don’t know for a fact, though, that it isn’t a water world, covered entirely with deep ocean, a planet that migrated from beyond the snowline into its present position. And even if it is a rocky planet with a substantial atmosphere, our simulations of atmospheric circulation only represent the best that is known today. This early in the game, we should expect surprises.

Addendum: I’m evidently wrong about the consensus re Gliese 581g, as it appears that its existence really is doubtful. Check the comments to this post to learn more.

A Lesson from a ‘Hot Jupiter’

An instructive case is the planet Upsilon Andromedae b, which, although hardly in a habitable region (it’s a hot Jupiter orbiting — and tidally locked to — an F-class primary in a 4.6-day orbit), has yielded useful information about conditions within its own atmosphere. The blindingly obvious notion that a tidally locked planet should have its hottest region directly in the center of the star-facing side turns out to be untrue, and not just by a small amount. New work with the Spitzer Space Telescope tells us that Upsilon Andromedae b’s hot spot is fully 80 degrees away from high noon, on the side of the planet rather than the star-facing center.

Ian Crossfield, lead author of the paper on this work, has this to say about Upsilon Andromedae b:

“We really didn’t expect to find a hot spot with such a large offset. It’s clear that we understand even less about the atmospheric energetics of hot Jupiters than we thought we did.”

Upsilon Andromedae b is thus a cautionary tale, as if more were needed, about the dangers of over-extrapolation. Exoplanet atmospheric science is, in any case, an infant discipline, and aside from the simulations of red dwarf planet atmospheres (Joshi, Haberle and Reynolds’ work at NASA Ames kicked this off in a 1997 paper in Icarus), we’ve focused almost entirely on hot Jupiters, using transits where available to study atmospheric composition. We’ve found water, methane, carbon dioxide and carbon monoxide in their atmospheres in a series of remarkable investigations.

The Spitzer instrument measured the combined infrared light of Upsilon Andromedae b and its star, which is how the unusual temperature distribution came to light. The system turns out to be brightest when the planet is at the side of the star as seen from Earth rather than when it is behind the star showing its star-facing side (Upsilon Andromedae b does not transit). It’s incumbent upon theorists to come up with a solution to this latest atmospheric riddle. In the hunt are star-planet magnetic interactions and the effects of supersonic winds, but one suspects we’ll be hearing of other possibilities.
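As a toy illustration of what that measurement implies (this is a simplified model of my own, not the fit used by Crossfield and colleagues), shifting the hot spot away from the substellar point shifts the peak of the planet’s thermal phase curve away from the moment its dayside faces us:

```python
# Toy thermal phase curve for a tidally locked planet whose hot spot is
# offset from the substellar point. A simplified first-order model, not
# the one fitted by Crossfield et al.
import numpy as np

hot_spot_offset_deg = 80.0                  # offset reported for Upsilon Andromedae b
phase_deg = np.linspace(0.0, 360.0, 721)    # 180 deg = dayside facing the observer

# With no offset, the planet's contribution peaks at phase 180 deg; an
# offset hot spot rotates that peak by the same angle.
relative_flux = 1.0 + 0.3 * np.cos(np.radians(phase_deg - 180.0 - hot_spot_offset_deg))

peak_phase = phase_deg[np.argmax(relative_flux)]
print(f"phase curve peaks at {peak_phase:.0f} deg rather than 180 deg")
```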

The Danger of Quick Assumptions

Back to the red dwarf question. We have to be careful about making too many assumptions about Gliese 581g and any other planet discovered to be in the habitable zone of a red dwarf. I admit to being optimistic, but there is so much we don’t yet know. Suppose that Gl 581g is indeed an ocean planet. Centauri Dreams reader Dave Moore has commented in the past on a paper by Timothy Merlis and Tapio Schneider (Caltech) that offers encouraging results regarding tidally locked ocean planets, with moderate temperatures and a super-rotating atmosphere that maintains a mild climate (interestingly, the coldest parts of the planet are the poles, not the anti-stellar point).

Is a rocky world in the habitable zone going to be equally moderate in its temperatures, and will its precipitation patterns follow those suggested by Merlis and Schneider for a water world? We can speculate all we want about the hydrological cycle and so on, but it’s the encounter of theory (and simulations) with observational data that handily supplies the monkey wrench, as it just has in the case of Upsilon Andromedae b. Until we have observations of tidally locked red dwarf planets beyond their wraith-like appearance in hundreds of hours of radial velocity data, continued caution on habitability is the only recourse.

The Merlis and Schneider paper is “Atmospheric dynamics of Earth-like tidally locked aquaplanets,” in press at the Journal of Advances in Modeling Earth Systems (full text). The Upsilon Andromedae b paper is Crossfield et al., “A New 24 micron Phase Curve for Upsilon Andromedae b,” accepted by The Astrophysical Journal (preprint).


The Interstellar Tool Builders

Long before I knew what ideas for interstellar flight were out there in the literature, I always saw the idea of a trip between the stars in Homeric terms. It would be an epic journey that, like that of Odysseus, would resonate throughout human history and become the stuff of legend, even myth. In back of all that was the belief that any vehicle we could design that could carry people and not just instruments to the stars would be a ‘generation ship,’ in which the crew were born, raised their families, lived their lives and died while the ship, moving at maybe 1 percent of light speed, pressed on toward its destination.

That familiar science fiction trope still has a ring of truth about it, because if for some reason we as a species decided we absolutely had to get a few human beings to Alpha Centauri, about the only option we would have in the near term is a solar sail and a close-pass gravity assist by the Sun, and even in the best-case scenario, that still works out to around a thousand-year journey. Epic indeed. Now and then I re-read old science fiction tales like Brian Aldiss’ Non-Stop (published in the US as Starship) and Robert Heinlein’s Orphans of the Sky to be reminded of how firmly such notions managed to settle into our psyche as the idea of star travel grew. And I think about Odysseus and his crew on that wine-dark sea that would almost destroy them.
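The arithmetic behind those numbers is straightforward, at least as a cruise-speed estimate that ignores acceleration and deceleration:

```latex
t = \frac{d}{v}, \qquad
t_{0.01c} \approx \frac{4.4\ \mathrm{ly}}{0.01\,c} \approx 440\ \mathrm{years}, \qquad
v_{1000\ \mathrm{yr}} \approx \frac{4.4\ \mathrm{ly}}{1000\ \mathrm{yr}} \approx 0.0044\,c
```

In other words, a thousand-year crossing to Alpha Centauri corresponds to an average speed a little under half a percent of light speed.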

We’d like to do better, of course, which is why long-shot concepts like Alcubierre’s ‘warp drive’ continue to intrigue us. Igor Smolyaninov, now at the University of Maryland, has been working with metamaterials to study the Alcubierre concept of altering spacetime itself so that a ship could move in a kind of ‘bubble’ that never locally exceeds the speed of light yet radically reduces travel times. As Richard Obousy showed us yesterday, Smolyaninov’s interest is in developing the tools needed to study the warping of spacetime in the laboratory.

Metamaterials are engineered structures that alter how light behaves as it moves through them, and Smolyaninov is interested in how they can help us simulate the behavior of light under the kind of extreme gravitational conditions a warp drive would create. I think Richard did a terrific job of showing us that what we are doing here is simulating exotic light-ray trajectories, made possible by the unique properties of metamaterials. We are definitely not talking about creating anything beyond a simulation of what happens to spacetime near a sublight warp drive.

What Smolyaninov wants to demonstrate, then, is a tool for the further study of these ideas, and I leave it to those better qualified than I am to judge the quality of his findings. Warp drive is a fantastically problematic concept, one that demands exotic matter with negative energy density, one that suffers from instabilities due to quantum effects, and one that may well be prohibited by the laws of physics, as Smolyaninov is quick to point out. Nonetheless, he adds:

…our results demonstrate that physics of a gradually accelerating warp drive can be modeled with newly developed “perfect” magnetoelectric metamaterials built from split ring resonators. Since even low velocity physics of warp drives is quite interesting, such a lab model deserves further study.

I’m reminded of Claudio Maccone’s continuing work on the Karhunen-Loève Transform (KLT) and the mathematical tools that may one day be needed to communicate with spacecraft moving at relativistic speeds. Yes, we are generations away from such spacecraft (although the KLT techniques Maccone studies have already been used for spacecraft communication with the Galileo mission), but creating the mathematical frameworks and the laboratory experiments to help us study the implications of fast interstellar flight is part of building toward that future.
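For readers curious about what the KLT actually does, here is a minimal numpy sketch of the basic Karhunen-Loève decomposition: find the eigenbasis of a signal’s covariance matrix and note how few components are needed to capture most of its energy. It is a generic textbook illustration, not Maccone’s relativistic formulation and not anything flown on Galileo.

```python
# Minimal Karhunen-Loeve transform: compute the eigenbasis of a process's
# sample covariance matrix and measure how much of the total variance the
# leading components capture. Generic illustration only, not Maccone's
# relativistic formulation.
import numpy as np

rng = np.random.default_rng(0)

# Simulate realizations of a smooth random process (strongly correlated samples).
n_samples, n_obs = 128, 2000
t = np.linspace(0.0, 1.0, n_samples)
true_cov = np.exp(-0.5 * ((t[:, None] - t[None, :]) / 0.1) ** 2) + 1e-8 * np.eye(n_samples)
data = rng.multivariate_normal(np.zeros(n_samples), true_cov, size=n_obs)

# KLT basis: eigenvectors of the sample covariance, strongest first.
eigvals, eigvecs = np.linalg.eigh(np.cov(data, rowvar=False))
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]

# Energy compaction: a handful of KLT components carry nearly all the variance.
for k in (1, 5, 10):
    print(f"top {k:2d} components capture {eigvals[:k].sum() / eigvals.sum():.1%} of the variance")
```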

Also along the lines of fast travel comes news from NASA Ames director Pete Worden that the center has begun a project with DARPA called ‘the Hundred Year Starship.’ Worden made the announcement at a Long Now Foundation event in San Francisco, but went on to spend most of his talk on unrelated near-term ideas like electric propulsion and microwave thermal propulsion, in which power beamed to a spacecraft heats a propellant, reducing the fuel that must be carried onboard, to the benefit of payload size.

The latter ideas have little to do with the ‘Hundred Year Starship,’ but just who is in charge of this project, how NASA and DARPA will develop it, and what its goals are (does it envision a starship that can be launched within a hundred years, or a starship that can complete a hundred-year mission to a star?) all remain to be seen. DARPA, according to this account, is putting $1 million into the project, with $100,000 from NASA. More on this as it becomes available, and here’s to the notion of keeping our enthusiasm for the challenge even at this early stage of building the tools we need to define it. On that score, an occasional dose of science fiction is highly recommended.


Exploring Alcubierre’s Ideas in the Lab

by Richard Obousy

Physicist Richard Obousy has long been fascinated with the Casimir force, dark energy, and the stability of higher dimensions. His dissertation at Baylor University, in fact, focused on the possibility that dark energy could be an artifact of Casimir energy in extra dimensions. Now project leader of Project Icarus, Obousy here takes a look at a recent paper by Igor Smolyaninov (University of Maryland) that explores the Alcubierre ‘warp drive’ concept from the standpoint of material parameters. Can warp drive be modeled in the laboratory, and under what constraints? Finding the answer may yield new information about this exotic concept. As Smolyaninov says in his paper, “We will find out what kind of metamaterial geometry is needed to emulate a laboratory model of the warp drive, so that we can build more understanding of the physics involved.”

Fermat’s principle dictates that light rays follow the shortest optical paths in media; effectively, they are geodesics. Einstein’s General Theory of Relativity (GR) provides the theoretical tools for studying fields in curved geometries.

General Relativity can also predict the path a light ray would travel under unusual gravitational conditions: near black holes, big bangs, wormholes and warp drives, for example. One of the underlying principles behind GR is that matter affects space in such a way that space becomes curved, and the theory contains all the necessary mathematical machinery to model this.

Until recently, there has been no way to recreate such exotic gravitational conditions in a lab; the best we could do was simulate them on a computer. However, with the advent of metamaterials, we now have a much improved capability to manipulate electromagnetic (EM) radiation and to physically simulate the remarkable phenomena predicted by GR.

This has been done recently in the context of the Alcubierre Warp Drive.
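For reference, the spacetime being emulated is Alcubierre’s original warp-bubble geometry, with the bubble moving along x at velocity v_s(t) and f(r_s) a smooth function equal to 1 inside the bubble and 0 far from it:

```latex
ds^2 = -c^2\,dt^2 + \bigl(dx - v_s(t)\,f(r_s)\,dt\bigr)^2 + dy^2 + dz^2
```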

One aspect of Igor Smolyaninov’s recent work that I would like to emphasize is that it is a simulation of the path a light ray would travel – not a duplication of the phenomenon. See, for example, the figure below. GR predicts that an object as massive as the sun has the capacity to bend space in such a way that the path a light ray takes becomes curved when it enters the influence of the star’s gravity. A star, therefore, can appear to be somewhere different from a star-chart’s prediction because the sun has bent the path of its light.
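The standard quantitative statement of that effect is the general relativistic deflection of a light ray passing a mass M at impact parameter b:

```latex
\delta\theta = \frac{4GM}{c^2 b}, \qquad \delta\theta_{\odot} \approx 1.75'' \ \text{for a ray grazing the solar limb}
```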

Now, in the absence of the sun, the light ray could be coerced into following a similar path by the simple inclusion of a device such as a prism or a mirror, which refracts or reflects the light, forcing it to take the same path it would have taken in the presence of the sun.

Image: The bending of starlight by a powerful gravitational field, compared with the ‘simulated’ bending produced by inserting a prism or mirror in the appropriate position. The illustration shows a prism, but a mirror, angled correctly, could achieve the same effect. Credit: Richard Obousy.

Of course, the prism or mirror is not actually bending spacetime; it is merely mimicking the effect. Analogously, metamaterials give experimentalists the ability to mimic, or simulate, exotic light-ray trajectories, thanks to the remarkable property that metamaterials can exhibit a negative refractive index (when examined macroscopically).
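In terms of Snell’s law, a negative index means the refracted ray emerges on the same side of the surface normal as the incident ray, the opposite of what ordinary glass does:

```latex
n_1 \sin\theta_1 = n_2 \sin\theta_2, \qquad n_2 < 0 \;\Rightarrow\; \theta_2 < 0
```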

So a black hole, warp drive or big bang is not actually being created; its effects on light are simply being simulated.

In the paper ‘Metamaterial-based model of the Alcubierre warp drive’, Smolyaninov derives some of the features a metamaterial would need in order to simulate the Alcubierre warp drive in the lab. Due to certain physical restrictions, only a simulation of sub-light velocities is possible, up to about 25 percent of the speed of light. Of course, no propulsion actually occurs – just a simulation of the space in the immediate vicinity of a sublight warp drive.

Although Smolyaninov’s paper won’t directly assist us in getting to Alpha Centauri, it is certainly a welcome tool for physicists interested in exploring some of the properties of warp drives in the lab!

The paper is Smolyaninov, “Metamaterial-based model of the Alcubierre warp drive,” available as a preprint.
