Extremely Large Telescopes and the Hunt for Terrestrial Worlds

How large can a telescope get? Today’s largest optical telescopes boast 10-meter mirrors (33 feet across). But the recent Royal Astronomical Society meeting in Birmingham (UK) heard the case for much larger instruments, on the order of 50 to 100 meters (165-330 feet) in diameter, optical instruments the size of the Deep Space Network’s largest antennas. Moreover, such instruments would have as much as forty times the spatial resolution of the Hubble Space Telescope, despite operating at the bottom of Earth’s atmosphere.

European astronomers have been engaged in this study for the past four years; you can see a synopsis of their work in an online brochure called “Extremely Large Telescopes: The Next Step In Mankind’s Quest For The Universe” (PDF warning). The conclusions of their report are remarkable:

The vast improvement in sensitivity and precision allowed by the next step in technological capabilities, from today’s 6-10 m telescopes to the new generation of 50-100 m telescopes with integrated adaptive optics capability, will be the largest such enhancement in the history of telescopic astronomy. It is likely that the major scientific impact of these new telescopes will be discoveries we cannot predict, so that their scientific legacy will also vastly exceed even that rich return which we can predict today.

The key here is that the spatial resolution of a telescope improves in proportion to its diameter. For that reason, Extremely Large Telescopes (ELTs) will be able to join space-borne missions like Terrestrial Planet Finder in an active search for Earth-like planets, and should be able to analyze the makeup of their atmospheres. The report estimates that a 30-meter telescope equipped with adaptive optics should be able to study Jupiter-class planets around stars tens of light years away. A 100-meter telescope would be able to detect Earth-like planets around stars as far as 100 light years away, a volume containing roughly 1,000 Sun-like stars.
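To put rough numbers on that scaling, here is a minimal sketch of the diffraction limit via the Rayleigh criterion (θ ≈ 1.22 λ/D), assuming visible light at 550 nm; the aperture values are illustrative stand-ins for Hubble-, Keck- and OWL-class instruments:

```python
import math

def diffraction_limit_arcsec(diameter_m: float, wavelength_nm: float = 550.0) -> float:
    """Rayleigh criterion: theta ~ 1.22 * lambda / D, converted from radians to arcseconds."""
    wavelength_m = wavelength_nm * 1e-9
    theta_rad = 1.22 * wavelength_m / diameter_m
    return math.degrees(theta_rad) * 3600.0

# Hubble-class (2.4 m), Keck-class (10 m) and OWL-class (100 m) apertures
for d in (2.4, 10.0, 100.0):
    print(f"{d:6.1f} m -> {diffraction_limit_arcsec(d):.5f} arcsec")
```

The 2.4-to-100-meter ratio works out to about 42, in line with the forty-fold improvement over Hubble cited above; in practice, how closely a ground-based telescope approaches this theoretical limit depends on its adaptive optics.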

Dr. Isobel Hook of Oxford University told the RAS meeting that a 50 to 100-meter telescope could be built within 10 to 15 years for a cost of about 1 billion Euros. A major design study is now beginning in Europe to develop the technology needed to build such telescopes. For its part, the European Southern Observatory has been working since 1997 on the OWL (Overwhelmingly Large Telescope) feasibility study, which is expected to be completed by the fall of this year. OWL would be the mother of all ELTs, a 100-meter telescope that would be capable of detecting life-bearing planets around other stars.

Image (click to enlarge): A crude simulation showing the increase of angular resolution with telescope size. OWL angular resolution will be 40 times better than that of the Hubble Space Telescope. Credit: European Southern Observatory.

Centauri Dreams’ note: Space-based and ground observatories are both necessary no matter how efficiently Earth-based telescopes can gather light. Space telescopes can work with x-ray and infrared wavelengths that are absorbed or blocked by the atmosphere. Adaptive optics do wondrous things in compensating for atmospheric blurring, but they can only get you into the near-infrared, leaving a wide range of observations to their space-based counterparts.

Two curves are converging here. First, we are at a point where the advantages of space telescopes are offset by their cost and the difficulty of maintenance, as the Hubble instrument has so graphically demonstrated. The second curve follows the cost of ground-based instruments and weighs their feasibility as mirror sizes continue to increase. At some point, our space-based infrastructure should be flexible enough to operate instruments in space that will eclipse even the best foreseeable ground telescopes at all wavelengths — and at lower cost — but that day may be decades away.

Voyager and the Benefits of ‘Slow Science’

The Washington Post takes note of the possible suspension of funding for the two Voyager spacecraft in a story by Rick Weiss called Our Incredible Shrinking Curiosity. As discussed here in a previous entry, NASA is eyeing the Voyager budget of $4.2 million per year as it ponders cutbacks. Such news, says Weiss, leaves him “…depressingly convinced that these 8 billion-mile-long extensions of human curiosity are indeed now smarter, or at least more enlightened, than the mortals who made them.”

In Weiss’ view, “…the U.S. scientific enterprise is riddled with evidence that Americans have lost sight of the value of non-applied, curiosity-driven research — the open-ended sort of exploration that doesn’t know exactly where it’s going but so often leads to big payoffs.”

But is it a lack of curiosity that motivates such cuts, or something more temporal? Isn’t the real culprit our inability to think in the longer time frames required of what Stewart Brand calls ‘slow science’? The Voyagers have been on their journey since 1977, and probably have fifteen years or so of useful science left in them. But assuming they last until 2020, we’re talking a 43-year mission, one that has been in some cases handed off to a new generation of scientists, passed along like an heirloom out of the past. Slow science glories in such projects, but we live in a bottom-line culture that has trouble seeing beyond the next quarterly return, much less into the next century.

Centauri Dreams would argue that this applies to the other examples Weiss discusses of projects threatened with extinction. Geologists who received Congressional authorization to deploy thousands of sensors to learn more about tectonic forces have found only a fraction of the money appropriated, and a mere 62 of the planned 7000 sensors have been deployed. Meanwhile, the Department of Energy is ending the BTeV project at Fermilab, jeopardizing our chances of finding the so-called ‘supersymmetric’ particle. The demand for immediate results in terms of products and profits makes it all too easy to cut back on projects that might take decades to bear fruit.

Here’s what Brand has to say about slow science in his book The Clock of the Long Now (New York: Basic Books, 1999):

Enormous, inexorable power is in the long trends, but we cannot measure them or even notice them without doing extremely patient science. These days science is more often driven at commercial or even fashion velocity than at the deliberate pace of governance or the even slower pace of nature. As history accelerates, people become fast learners, and that’s good, but it is also a problem. ‘Fast learners tend to track noisy signals too closely and to confuse themselves by making changes before the effects of previous actions are clear,’ says decision analyst James March. Quiz shows and classroom teachers reward the quick answerer. This is not helpful in domains where the quick answer is the wrong answer.

A case in point: a nine-year study in Africa concluded that burning new woody growth in open grassland could not prevent the woods from taking over. A forty-year study of the same issue proved the opposite; researchers learned that it simply takes more than ten years of annual fires to keep woody rootstocks from resprouting.

How much might we learn about the heliosphere and the interstellar medium beyond by studying the data the two Voyagers will still be returning for another fifteen years? Raising the alarm on this issue impacts more than two spacecraft. It affects the way we do science itself, our motivations for the pursuit, our appreciation for the challenges the universe presents us. To go beyond the futility of the daily quick fix requires recapturing the sense of science as legacy, a priceless gift to be passed along to those who come after us.

Fundamental Constant May Need Tweaking

Michael Murphy has been studying the fundamental constants of nature — numbers that are key to any given theory of how the universe works — for the past five years. His work at the Institute of Astronomy at Cambridge University has particularly focused on possible changes to the fine structure constant, a number central to electromagnetism, and therefore crucial to the interaction between light and matter. If its value were slightly different, life could not exist, although tiny changes over time could be tolerated.

Normally denoted by the Greek letter α (alpha), the fine structure constant can be worked out through experiment to great precision. According to Murphy, the numbers come to 1/alpha = 137.03599958, with an experimental uncertainty of a mere 0.00000052. But as the astronomer told the Physics 2005 conference at the University of Warwick (UK) today, the fine structure constant may have had a slightly different value in the early universe.
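For reference, the measured value quoted above follows from the definition α = e²/(4πε₀ħc). A quick sketch (an illustration using CODATA recommended values for the constants, not Murphy’s measurement method) recovers it:

```python
import math

# CODATA recommended values (SI units)
e = 1.602176634e-19      # elementary charge, C
hbar = 1.054571817e-34   # reduced Planck constant, J s
c = 299792458.0          # speed of light, m/s
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m

# Fine structure constant: alpha = e^2 / (4 * pi * eps0 * hbar * c)
alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(f"1/alpha = {1 / alpha:.8f}")  # close to the 137.036 figure quoted above
```

Because α is dimensionless, a change in its value over cosmic time would be a genuine change in physics, not an artifact of the unit system — which is why absorption-line astronomy can probe it at all.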

Murphy bases his conclusion on observations performed with the Keck 1 10-meter telescope on Mauna Kea (Hawaii). By recording the spectra of distant quasars as their light passes through intervening gas clouds, Murphy’s team can see how the light is absorbed and measure the atomic ‘fingerprint’ of the clouds’ atoms as they were ten billion years ago. These clouds are primarily hydrogen, but also contain traces of heavier atoms like magnesium, silicon, iron, nickel and zinc. A comparison between these spectra and comparable measurements here on Earth shows wavelengths slightly different from those expected. And because the positions of these absorption lines are governed by the fine structure constant, Murphy believes he is seeing changes in its value.

Image: The Hubble Space Telescope’s Advanced Camera for Surveys (ACS) has provided the clearest visible-light view yet of the nearby quasar 3C 273. Can light from such objects help us find cracks in physical theories previously thought solid? Credit: NASA and J. Bahcall (IAS)

From Murphy’s Web site:

…our results have the potential to revolutionise the way we understand the universe on all scales, from the subatomic to the universal. All of modern physical theory is based on the assumption that the laws of physics remain the same no matter where or when you happen to be. Physicists have what is called a “standard model” of the universe which allows them to explain all observed phenomena. This standard model cannot explain variations in the constants and so, if our results are correct, the standard model would need a complete overhaul: we will have discovered the first hint of a completely new set of physical laws, hitherto unseen and not to be understood for some time.

Ahead for Murphy is work with a new dataset from the Very Large Telescope in Chile. But he believes that a different type of experiment is needed to confirm the work. Possible candidates: measurements of fluctuations in the cosmic background radiation, or measurements of the abundance of elements in the primordial universe. In each case, obtaining results with the needed precision may be some years away, so what we are dealing with is a suggestion of a changing constant that could force us to reevaluate our knowledge of fundamental physical laws.

Centauri Dreams’ take: Work like Murphy’s underlines the need for NASA to reconstitute its Breakthrough Propulsion Physics program. While the chance of major breakthroughs in a short timeframe is slight, the point is that a modest investment in continuing research allows us to examine the puzzling anomalies around the edges of known theory that may — or may not — point to further physics in need of understanding. Until we comprehend the connection between gravity and quantum mechanics, we should assume a new set of physical laws remains to be discovered, with ramifications we cannot anticipate.

For more, see the article “Are the laws of nature changing with time?” in Physics World. For Murphy’s recent work, see the paper “Limits on variations in fundamental constants from 21-cm and ultraviolet quasar absorption lines” at the ArXiv site. An exhaustive list of publications from Murphy and team is here.

GQ Lupi B: Exoplanet or Brown Dwarf?

The recent image of a possible planet around the star GQ Lupi has met with understandable enthusiasm in the press, but we still don’t know whether the small object just to the right of the star in the image below is a planet or a brown dwarf. The boundary between brown dwarf and planet is tricky terrain, but the European Southern Observatory fixes it at roughly 13 Jupiter masses, the critical mass needed to ignite deuterium fusion. Brown dwarfs, then, are objects heavier than that.

But the observations of Ralph Neuhäuser and colleagues do not provide a direct estimate of the smaller object’s mass. Because GQ Lupi and its companion presumably formed at the same time, the new object is young, and traditional models for such calculations may not apply. But using them, according to a press release from the ESO, implies that the object is somewhere between 3 and 42 Jupiter masses. In other words, based on what we can determine so far, GQ Lupi b is either a planet or a brown dwarf.

Image: The photo above shows the VLT NACO image, taken in the Ks-band, of GQ Lupi. The feeble point of light to the right of the star is the newly found cold companion. It is 250 times fainter than the star itself and is located 0.73 arcsecond west. At the distance of GQ Lupi, this corresponds to a projected separation of roughly 100 astronomical units. North is up and East is to the left. Credit: European Southern Observatory.
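The 100 AU figure follows from small-angle geometry: by the definition of the parsec, one arcsecond of angular separation corresponds to one astronomical unit of projected separation per parsec of distance. A quick sketch, assuming a distance to GQ Lupi of roughly 140 parsecs (an illustrative round value; the caption itself does not state the distance):

```python
# One arcsecond of angular separation corresponds to 1 AU of projected
# separation per parsec of distance (small-angle definition of the parsec).
def projected_separation_au(sep_arcsec: float, distance_pc: float) -> float:
    return sep_arcsec * distance_pc

# 0.73 arcsec at an assumed ~140 pc gives roughly 100 AU
print(f"{projected_separation_au(0.73, 140.0):.1f} AU")
```

Note this is the projected (sky-plane) separation; the true orbital distance could be larger if the companion sits in front of or behind the star along our line of sight.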

From the press release:

These early phases in brown dwarf and planet formation are essentially unknown territory for models. It is very difficult to model the early collapse of the gas clouds given the conditions around the forming parent star. One set of models, specifically tailored to model the very young objects, provide masses as low as one to two Jupiter-masses. But as Ralph Neuhäuser points out “these new models still need to be calibrated, before the mass of such companions can be determined confidently”.

The astronomers also stress that from the comparison between their VLT/NACO spectra and the theoretical models of co-author Peter Hauschildt from Hamburg University (Germany), they arrive at the conclusion that the best fit is obtained for an object having roughly 2 Jupiter radii and 2 Jupiter masses. If this result holds, GQ Lupi b would thus be the youngest and lightest exoplanet to have been imaged.

The only way to sort this out is through further observations. Meanwhile, you can read an unedited version of the Neuhäuser paper “Evidence for a co-moving sub-stellar companion of GQ Lupi” here (PDF warning). The paper has already been accepted for publication in Astronomy and Astrophysics.

Emergence of the ‘Dark Energy Star’

“It’s a near certainty that black holes don’t exist,” says George Chapline. A physicist at the Lawrence Livermore National Laboratory, Chapline has an alternative explanation: when a massive star collapses, what remains is not a black hole but a star that’s filled with dark energy. Some 70 percent of the universe seems to be composed of dark energy, though no one knows precisely what it is. Chapline’s work may lead to new insights into the stuff.

A preprint of Chapline’s paper “Dark Energy Stars” appears at the ArXiv site, where the author throws down the gauntlet early on:

In the 1950s a consensus was reached, partly as a result of meetings such as a famous meeting at Chapel Hill in 1957, that although quantum effects might be important below some very small distance, on any macroscopic scale the predictions of classical general relativity (GR) should be taken seriously. In the summer of 2000 Bob Laughlin and I realized that this cannot possibly be correct. Indeed I am sure it will be a puzzle to future historians of science as to why it took so long to realize this.

And again,

The picture of gravitational collapse provided by classical general relativity cannot be physically correct because it conflicts with ordinary quantum mechanics.

Strong assertions, but we have to come to grips with the apparent conflict between general relativity, where universal time does not exist and clocks operate at different speeds depending on their situation, and quantum mechanics, which demands a universal time for its equations to work. An event horizon, as postulated around the now familiar concept of a black hole, “…makes it impossible to everywhere synchronize atomic clocks,” according to Chapline.

Image: The spiral galaxy NGC 300. Black holes are thought to lurk at the center of many such galaxies, but are they really huge stars filled with dark energy? Credit: Jet Propulsion Laboratory.

The upshot is that when a star collapses, space-time inside it goes through a ‘quantum phase transition’ into dark energy. The dark star’s negative gravity could cause matter, once sucked in, to re-emerge. A story on Nature.com explains the result:

If the dark-energy star is big enough, Chapline predicts, any electrons bounced out will have been converted to positrons, which then annihilate other electrons in a burst of high-energy radiation. Chapline says that this could explain the radiation observed from the centre of our galaxy, previously interpreted as the signature of a huge black hole.

He also thinks that the Universe could be filled with ‘primordial’ dark-energy stars. These are formed not by stellar collapse but by fluctuations of space-time itself, like blobs of liquid condensing spontaneously out of a cooling gas. These, he suggests, could be stuff that has the same gravitational effect as normal matter, but cannot be seen: the elusive substance known as dark matter.

Centauri Dreams’ note: Chapline’s theories are one way to explain gamma-ray bursts, a subject we looked at just yesterday. Matter falling onto the surface of a dark energy star could trigger these phenomena, the result of interactions among the decay products as the matter re-emerges from the star.