GRB Burst Tests Special Relativity

Gamma ray bursts (GRBs) are much in the news. GRB 090423 turns out to be the most distant explosion ever observed, an event that occurred a scant 630 million years after the Big Bang. We’re detecting the explosion of a star that occurred when the first galaxies were beginning to form. Current thinking is that the earliest stars in the universe were more massive than those that formed later, and astronomers hope to use GRB events to piece together information about them. GRB 090423 was evidently not the death of such a star, but more sensitive equipment like the Atacama Large Millimeter/submillimeter Array (ALMA), expected online within three years, will study more distant GRBs and open up this early epoch further.


And then there’s all the fuss about Einstein. The Fermi Gamma-ray Space Telescope, which has already captured more than a thousand discrete sources of gamma rays in its first year of observations, detected a burst in May tagged GRB 090510. Evidently the result of a collision between two neutron stars, the burst took place in a galaxy 7.3 billion light years away, yet a pair of photons from the event detected by Fermi’s Large Area Telescope arrived just nine-tenths of a second apart.

Image: This view of the gamma-ray sky constructed from one year of Fermi LAT observations is the best view of the extreme universe to date. The map shows the rate at which the LAT detects gamma rays with energies above 300 million electron volts — about 120 million times the energy of visible light — from different sky directions. Brighter colors equal higher rates. Credit: NASA/DOE/Fermi LAT Collaboration.

These photons are interesting because their energies differed by a factor of a million, yet their arrival times were so close that the difference is probably due solely to the processes involved in the GRB event itself. Einstein’s special theory of relativity rests on the notion that all forms of electromagnetic radiation travel through space at the same speed, no matter what their energy. The results seem to confirm this, leading to newspaper headlines heralding ‘Einstein Wins This Round,’ and so on.

And at this point, Peter Michelson (Stanford University, and principal investigator for Fermi’s Large Area Telescope) echoes the headlines:

“This measurement eliminates any approach to a new theory of gravity that predicts a strong energy dependent change in the speed of light. To one part in 100 million billion, these two photons traveled at the same speed. Einstein still rules.”

All of this matters because we’re trying to work out a theory of quantum gravity, one that helps us understand the structure of spacetime itself. The special theory of relativity holds that the energy or wavelength of light has no effect on its speed through a vacuum. But quantum theory implies that, at scales trillions of times smaller than an electron, spacetime should be discontinuous. Think of it as frothy or foamy.


In that case, we would expect shorter wavelength light (at higher energies) to be slowed compared to light at longer wavelengths [but see Erik Anderson’s comment below, which argues that this should be reversed, with greater delays for lower-energy photons]. So the theories imply, but they have remained untestable because we lack the tools to tease out information at the Planck length and beyond.

Image: In this illustration, one photon (purple) carries a million times the energy of another (yellow). Some theorists predict travel delays for higher-energy photons, which interact more strongly with the proposed frothy nature of space-time. Yet Fermi data on two photons from a gamma-ray burst fail to show this effect, eliminating some approaches to a new theory of gravity. Credit: NASA/Sonoma State University/Aurore Simonnet.

So does special relativity fail at the quantum level? The Fermi instrument has given us the chance to use astronomical data to test these ideas. The 0.9 second difference between the arrival times of the higher and lower-energy gamma rays tells us that quantum effects that involve light slowing proportional to its energy do not show up at least until we get down to about eight-tenths of the Planck length (10⁻³³ centimeters).
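
For a sense of where a limit like that comes from, here is a quick back-of-the-envelope sketch in Python. It is my own illustration rather than the Fermi team's analysis: it assumes a linear energy dependence of photon speed, ignores cosmological expansion, and takes the high-energy photon at roughly 31 GeV, the value generally quoted for this burst.

# Back-of-the-envelope limit on an energy-dependent photon speed, in the
# spirit of the Fermi GRB 090510 result. Illustrative only: a linear
# dispersion is assumed and cosmological corrections are ignored.

SECONDS_PER_YEAR = 3.156e7
E_PLANCK_GEV = 1.22e19                  # Planck energy in GeV

distance_ly = 7.3e9                     # ~7.3 billion light years to the burst
travel_time_s = distance_ly * SECONDS_PER_YEAR   # ~2.3e17 seconds in flight

delta_t_s = 0.9                         # observed spread in arrival times
delta_e_gev = 31.0                      # high-energy photon ~31 GeV; its companion
                                        # carried roughly a millionth of that

# For a linear dispersion, delta_t ~ (delta_E / E_QG) * travel_time, so the
# tiny observed delta_t puts a lower bound on the quantum-gravity scale E_QG:
e_qg_gev = delta_e_gev * travel_time_s / delta_t_s

print(f"E_QG > {e_qg_gev:.1e} GeV (~{e_qg_gev / E_PLANCK_GEV:.1f} x Planck energy)")

Even this crude estimate lands within a factor of a few of the published limit, which is why a 0.9 second spread accumulated over 7.3 billion light years is such a powerful constraint.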

That’s useful information indeed, but we still need a way to place Einsteinian gravity into a theory that handles all four fundamental forces, unlike the Standard Model, which covers only three. So what GRB 090510 gives us is early evidence about the structure of spacetime of the sort that future Fermi data can help to refine. Einstein confirmed? At this level, yes, but pushing deeper into the quantum depths via GRB study may help us see, in ways that laboratory experiments cannot, how gravity at the quantum level fits into the fabric of the universe.

The paper is Abdo et al., “A limit on the variation of the speed of light arising from quantum gravity effects,” Nature advance online publication 28 October 2009 (abstract). The paper on GRB 090423 is Chandra et al., “Discovery of Radio Afterglow from the Most Distant Cosmic Explosion,” submitted to Astrophysical Journal Letters and available online.


Advancing Action at NASA (and Beyond)

Back in 2003, I went to Glenn Research Center in Cleveland for a meeting with Marc Millis. The Breakthrough Propulsion Physics project that Millis headed had recently been shut down, but I had the sense that this might be temporary and was eager to talk to him about what BPP had thus far accomplished. My feeling about its reinstatement proved to be inaccurate, and just four years later, NASA also shut down its Institute for Advanced Concepts in Atlanta, leaving a conceptual void at the agency’s core.

Two Takes on Futuristic Studies

NIAC and BPP were working opposite sides of the street even when both were fully funded. Whereas NIAC took the more short-term perspective, funding research projects with implications for space in the not-too-distant future, BPP plunged into far more theoretical terrain, looking at everything from engineering the vacuum to wormhole physics and the potential for warp drive. You could trace some of this impulse back to the Vision-21 gathering in 1990 at what was then Lewis Research Center, a conclave that charged presenters with examining technologies that might emerge not within decades but over the course of the next thousand years.

NIAC had a different slant, serving (in the words of the recent National Research Council report on the institute), as an “open forum for the external analysis and definition of space and aeronautics advanced concepts to complement the advanced concepts activities conducted within NASA.” NIAC opened the doors to non-NASA researchers, receiving more than 1300 proposals and awarding a total of 168 grants in the nine years of its existence. The topics evaluated emphasized known physics and, in most cases, practicality, stopping well short of the highly theoretical and determinedly visionary studies of BPP, which targeted breakthroughs.

NIAC and Its Limitations

I mention all this because the findings of the NRC report, discussed in these pages yesterday, are cheering in one sense, sobering in another, and in both senses deeply instructive. The committee recommends that NASA establish a new organization (NIAC2), one tasked with seeking out “visionary, far-reaching, advanced concepts with the potential of significant benefit to accomplishing NASA’s charter and to begin the process of maturing these advanced concepts for infusion into NASA’s missions.” There is much in the report about the new organization’s placement within NASA (I refer you to the document on this — all the suggestions about internal structure seem sound to me).

Then we run into this:

NIAC’s focus on revolutionary advanced concepts with a time horizon of 10 to 40 years in the future often put its projects too far out of alignment with the nearer-term horizons of the NASA mission directorates, thereby diminishing the potential for infusion into NASA mission plans. The committee recommends that NIAC2 should expand its scope to include concepts that are scientifically and/or technically innovative and have the potential to provide major benefit to a future NASA mission in 10 years and beyond.

We’re talking about a practical problem of working in conjunction with the rest of NASA, and it’s an understandable one. As funding for long-term ideas began to dry up and resources were funneled to flight-system development and operations, a natural disconnect developed between NIAC and the agency it was established to serve. One way of getting around this, the committee believes, is to open future NIAC2 proposals to teams within NASA as well as to those outside the agency.

The committee is advocating a NIAC with a much more limited range. Consider:

…the committee found that NIAC’s focus only on concepts that were revolutionary was too restrictive. There is a spectrum of advances, ranging from incremental or evolutionary improvements in individual components through innovative combinations of existing technologies to produce new results, to concepts that are truly revolutionary because they replace existing capabilities with something very different or enable new missions not previously possible.

Taking ‘Revolutionary’ Off the Table

And the report goes on to spell out the problem. Revolutionary concepts are often considered too ‘far out’ to be relevant to NASA’s immediate needs — in other words, designing an optimum Titan rover doesn’t help when you’ve been tasked with a return to the Moon, and interstellar laser sail design isn’t even on the map. The new phrase being pushed for NIAC2 is ‘technically innovative’ rather than ‘revolutionary,’ with a restriction that concepts, to be funded, should have the potential to provide a major benefit to a future NASA mission or system.

This report is a trenchant document, one that acknowledges the funding issues that have crippled advanced studies at the agency. In making these recommendations (and many others), the NRC committee is also highlighting the fact that without funding dedicated to the task, NASA cannot develop any organization — call it NIAC2 or whatever you choose — that examines the kind of long-term, possibly far-future technologies once considered by the Breakthrough Propulsion Physics project. In that sense, this is sobering reading, a call for a new NIAC but a less visionary one.

New Paradigms Emerging

Advanced studies are clearly moving off the NASA campus, much to the dismay of many who would like to devote their talents to these areas. Given all this, organizations like the Tau Zero Foundation hope to carry the spirit of investigation that motivated Breakthrough Propulsion Physics into a philanthropic, non-governmental setting. Just as we’re seeing a growing move toward commercial space activities, so private funding for futuristic research is increasingly necessary.

Our horizons are shrinking at the government space agency level. Long-term — and this is the only context we can put this in — the human determination to push ideas and explore new places will win out. If that involves surviving funding shortfalls, weathering recessions, shifting to new models of revenue and changing focus from the near future to a far murkier outcome measured in centuries, even millennia, so be it.

“The mind adapts and converts to its own purposes the obstacle to our acting,” said the Roman emperor Marcus Aurelius in a book (his Meditations) that I return to often. “The impediment to action advances action.” Acquire that spirit and good things must of necessity follow, no matter how challenging the research picture currently looks.


NIAC Redux: A Visionary Future

The decision to close NASA’s Institute for Advanced Concepts (NIAC) in 2007 was a blow to the research community, especially given the fact that the agency’s Breakthrough Propulsion Physics project had been shelved some years previously. These twin haymakers to the study of futuristic technologies emphasized the lack of support for spending money on anything beyond the near-term, and reminded us that forty years after the fact, we still can’t manage even a return to the Moon.

NIAC seemed to offer something better. Established in 1998, the Atlanta-based program offered non-NASA scientists a chance to delve into revolutionary space and aeronautics concepts, with a multi-tiered funding strategy and the potential for the best ideas to receive further study within the agency (or, in a number of cases, from sources outside it). NIAC was hardly a budget-breaker, totaling $36.2 million spread across the nine years of its existence.

A New Report Looks at Invigorating Research

Now we have a new report on NIAC’s effectiveness and possible future iterations, one commissioned by the NASA administrator and performed by the National Research Council. Fostering Visions for the Future: A Review of the NASA Institute for Advanced Concepts is available online, either readable there or (the better choice) via a quick download. The report takes a critical look at the current situation:

In the 1980s, under the pressure of limited budgets, NASA retreated from its exciting, risk-taking, high-technology culture. At present, its big programs, all very costly, relate either to continued low-Earth-orbit human spaceflight with little cutting-edge technology involved, or to the planned return of humans to the Moon in a manner that looks remarkably like the Apollo program with an infusion of existing 21st-century technology. Today, NASA’s investment in advanced concepts and long-term technological solutions to its strategic goals is minimal.

What to do? Getting research on longer-term technologies back into the NASA playbook is critical. NIAC not only sparked ideas, but led to solid results. The program supported 126 Phase I studies and 42 Phase II studies in the course of its brief existence, and about 29 percent of the Phase II efforts went on to secure additional funding from NASA and other sources. The NIAC Web site, still available, offers the corpus of this work. And the report highlights three projects that appear to have had an impact on NASA’s long-term plans.

Mini-Magnetosphere Plasma Propulsion

We’ve looked at M2P2 several times on this site, fascinated with Robert Winglee’s work on using energy from space plasmas to accelerate payloads. Winglee’s team at the University of Washington used Phase II funding to perform laboratory tests on their proposed magnetic-inflation process and to confirm the effect, work that led to further evaluation at Marshall Space Flight Center. Here’s a quick precis from the report:

These experiments were able to quantify the performance of the prototype through comparative studies of the laboratory test results with the simulation results and provided strong evidence that the high thrust levels (1-3 N) reported in the original description should be achievable for low energy input (~500 kW) and low propellant consumption… Further testing to measure the thrust levels attainable by the prototype, however, did not confirm measurable thrust. In the 2001 to 2002 time frame, the M2P2 concept was considered a viable, emerging technology by the NASA Decadal Planning Team and the NASA Exploration Team. Through peer review, the M2P2 effort was deemed highly innovative and technically competent. In 2002, a review panel that included plasma experts concluded there were additional unresolved technical issues that centered around magnetic field strengths, mass, and power requirements. While partially addressed by the M2P2 team, this work came to a stop due to changing priorities within the agency.

Micro-Arcsecond X-ray Imaging Mission

This is Webster Cash’s work at the University of Colorado at Boulder. Phase I work validated the idea of using an array of x-ray mirrors on free-flying spacecraft that would be coordinated to focus the x-rays on a set of beam-combining and detector spacecraft. Additional work at MSFC was promising, and the x-ray interferometry proposal went into Phase II funding, being incorporated into NASA strategic plans in 2000. From the report:

Dubbed MAXIM, the concept appeared in the NRC decadal survey of astronomy and astrophysics released in 2000, which identified x-ray interferometry for $60 million in funding over the following 10 years. Cash has selected as a long-range goal to image the event horizon of a black hole. While the technical implementation remains extremely challenging, the fact that the laboratory demonstration of this capability was published in Nature testifies to the significance of this accomplishment.

Indeed. The concept seems more than workable, and it went on to receive further study:

NASA has continued support to further define and develop high-resolution x-ray imaging missions, and Cash’s interferometry concept has remained among the leading contenders. The MAXIM Pathfinder mission was the subject of a NASA Goddard Space Flight Center (GSFC) Integrated Mission Design Center study in 2002. In 2004 MAXIM received a $1 million 3-year grant from NASA’s Astronomy and Physics Research and Analysis Program to further develop the optics for this concept. Today, the technology of x-ray interferometry that was the subject of the initial NIAC study is the first of three competing methods that NASA is pursuing under its Black Hole Imager mission.

New Worlds Observer

This again is a Webster Cash proposal, and one we’ve examined under various names (all incorporating ‘New Worlds’) on this site (a search will pull these up). Now we’re into Terrestrial Planet Finder territory, with a project that was refined through NIAC funding to study pinhole camera and occulting mask designs for direct imaging of planetary systems around other stars. This one also went into Phase II and the results were strong:

During Phase II, Cash and his collaborators demonstrated suppression performance (reduction of starlight intensity) <10⁻⁷ in a laboratory test of a miniature occulter. Both a publication in Nature in July 2006 and the laboratory demonstration testify to the significance and technical competence of the basic concept and the research supported by NIAC.

In fact, the various New Worlds designs Cash has studied make it clear that former NASA administrator Dan Goldin’s dream of imaging and even mapping an extraterrestrial world is not out of the question in coming decades. NASA liked this one a lot:

With the completion of the NIAC Phase II study, NASA provided significant additional support for Cash’s occulter concept, and it is now one of the competitive concepts for the Terrestrial Planet Finder program. In addition, both Ball Aerospace Corporation and Northrop Grumman Corporation have made internal investments to further develop the concept in conjunction with Cash and his team. In February 2008, NASA announced that a team led by Cash was awarded $1 million for the New Worlds Observer as one of its Astrophysics Strategic Missions Concept Studies (ASMCS). That study has been completed and the results will be used to prepare the New Worlds Observer mission concept for the NRC’s Astronomy and Astrophysics Decadal Survey, Astro2010.

Solid Results, and a Problem

While these three projects in particular stand out, it’s interesting to see that fourteen Phase I and Phase II projects received an additional $23.8 million in funding from a wide range of organizations, demonstrating (in the language of the report) “…the significance of the nation’s investment in NIAC’s advanced concepts.” The report also identifies a key problem:

One of the weaknesses of the NIAC program was the lack of sufficient funding to mature the selected concepts to the point that a NASA program could take substantial interest. By design, NIAC concepts completing Phase II were certainly not at a technology readiness level that allowed adoption by a NASA flight program. This technology-readiness disconnect between the external innovators and NASA program personnel made infusion of NIAC concepts into future agency missions or strategic plans exceedingly difficult.

The NRC report recommends that NASA reestablish an entity something like NIAC, recognizing that the agency needs open methods to secure access to new mission and system concepts from any source, not just those developed within NASA. The agency also needs to develop effective processes for evaluating new ideas for future missions and bringing them to fruition. A new program (the report calls it NIAC2) would be key to this.

Recognize this: At present, there is no NASA organization responsible for soliciting, evaluating and developing advanced concepts and infusing them into NASA planning.

A new NIAC would help, but would it look like the old one? More on this tomorrow.


Habitable Moons and Kepler

While we’ve looked several times in these pages at David Kipping’s work on exomoons, the investigation of moons much closer to home reminds us that finding a habitable satellite of another planet may not be out of our reach. After all, we’re gaining insights into possible habitats for at least microbial life on (or in) places like Europa and Enceladus, and speculations about similar biospheres within some Kuiper Belt objects also keep them in contention.

So what about a habitable moon around a distant gas giant? Kipping (University College London) has now gone to work on the question in relation to the Kepler space telescope. His findings are striking: A Saturn-sized planet in the habitable zone of an M-dwarf star would allow the detection of an exomoon down to 0.2 Earth masses.


Image: A habitable exomoon would offer an exotic vista, a view that may be more common in the galaxy than we have previously imagined. Credit: Dan Durda.

Now that sounds unusual, given that Kepler can’t find planets of such small size. How, then, does Kipping hope to find exomoons at this scale? The answer is that an exomoon detection depends on two measurements, neither of which demands observing the dip in starlight caused by the moon itself. What Kipping’s team is looking for is the effect the moon has on the planet, and the transit of that Saturn-class world across the face of an M-dwarf is something Kepler can work with.

The method relies on two sets of observations, the first being transit timing: variations in the interval between a planet’s successive transits, caused by its wobble around the planet-moon barycenter, can be the signal of a moon. Add transit timing variation to the second measurement — transit duration — and you can nail down the presence of that moon. Transit duration reflects the speed at which the planet actually passes in front of the star, which also varies as the moon tugs on it. Detecting Life-Friendly Moons, on the Astrobiology Magazine site, explains that the two signals are out of phase with each other when a moon is responsible, screening out other possible causes.
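
To get a feel for the size of the timing effect, here is a small Python sketch of the barycentric wobble. The numbers (an M-dwarf host of 0.4 solar masses, a Saturn-mass planet in a roughly 40-day habitable-zone orbit, and a 0.2 Earth-mass moon at a Titan-like distance) are my own illustrative choices, not values taken from Kipping’s paper.

import math

# Rough size of the transit timing wobble a small moon induces on its host
# planet. All inputs are illustrative assumptions, not Kipping's numbers.

G = 6.674e-11                      # m^3 kg^-1 s^-2
M_SUN, M_EARTH = 1.989e30, 5.972e24
AU = 1.496e11

m_star = 0.4 * M_SUN               # M-dwarf host
m_planet = 95 * M_EARTH            # Saturn-mass planet
m_moon = 0.2 * M_EARTH             # small, potentially habitable moon
a_planet = 0.17 * AU               # assumed habitable-zone orbit for a dim star
a_moon = 1.2e9                     # moon-planet separation in metres (Titan-like)

# Planet's orbital period and velocity around the star (Kepler's third law)
p_planet = 2 * math.pi * math.sqrt(a_planet**3 / (G * m_star))
v_planet = 2 * math.pi * a_planet / p_planet

# The planet circles the planet-moon barycentre at this distance...
a_wobble = a_moon * m_moon / (m_planet + m_moon)

# ...so each transit arrives early or late by up to roughly this much:
ttv_seconds = a_wobble / v_planet

print(f"planet period ~{p_planet / 86400:.0f} days, "
      f"wobble ~{a_wobble / 1e3:.0f} km, TTV amplitude ~{ttv_seconds:.0f} s")

Timing shifts of this order, tens of seconds per transit, are the kind of signal Kepler’s photometry can track, which is why the moon itself never has to be seen.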

The Kepler researchers have their hands full looking for exoplanets, but an exomoon survey using Kepler data is certainly in the running for future investigation. If a moon as small as a third of an Earth mass can hold onto a magnetic field, life could develop despite the presence of nearby planetary radiation belts. Such a moon, Kipping believes, would be detectable in the right transit situation as much as 500 light years from the Sun.

“There may be just as many habitable moons as habitable planets in our galaxy,” Kipping tells Astrobiology Magazine, an audacious thought with all kinds of ramifications for the Fermi paradox. Remarkably, combining the two measurements his team works with would allow Kipping to estimate both the moon’s orbital period and its mass. Run that to its limit and you get this: Knowing the orbital period would allow prediction of a lunar eclipse whose spectroscopic study could then reveal the signature of gases in the moon’s atmosphere. That’s pushing current technology to the max, but it’s increasingly feasible as we tune up our resources.

The paper is Kipping et al., “On the detectability of habitable exomoons with Kepler-class photometry,” Monthly Notices of the Royal Astronomical Society, published online 24 September, 2009 (abstract).


Refining the Deuterium Starship

by Adam Crowl

Adam Crowl has been following Friedwardt Winterberg’s fusion concepts for some time, and now weighs in with a look at Winterberg’s latest thinking on the use of deuterium reactions in advanced propulsion designs. If fusion is our best bet for interstellar missions, we need to get past the limitations of deuterium/tritium, which produces a neutron flux of such proportion that a manned mission would pay a huge penalty in shielding. Winterberg’s ideas on thermonuclear deuterium reactions offer a technique with high exhaust velocities, one with interesting echoes of Project Orion.

Back in the 1960s, Robert Enzmann imagined immense fusion-propelled starships that saved tankage mass by carrying their fusion fuel – chiefly deuterium – as a huge frozen ball. Enzmann and his co-workers eventually found that deuterium isn’t a very strong solid, and that a tank of some sort would be needed for mechanical support under acceleration. Even so, attaching a starship to a great big mass of frozen deuterium still seems a good idea in light of Dr. Friedwardt Winterberg’s updated e-print from June, entitled “Advanced Deuterium Fusion Rocket Propulsion for Manned Deep Space Missions.”


Of course the trick is igniting the deuterium-deuterium reaction and getting a high fusion-burnup fraction out of the fuel target. Winterberg is an unapologetic skeptic of SF-style interstellar ‘short-cuts’, rightly pointing out the lack of evidence for anything like wormholes or warp drives. In his view, interstellar travel can only be seriously contemplated on the grand scale via fusion rockets.

Image: The Enzmann starship, as envisioned by artist Rick Sternbach in a 1973 issue of Analog.

Igniting Deuterium (and the Benefits)

The two fusion reactions presently within our technological reach both rely on relatively abundant deuterium: the deuterium-tritium (D-T) and deuterium-deuterium (D-D) reactions. The D-T reaction has two liabilities – most of its energy goes into uncharged neutrons, and the tritium has to be continually bred in a lithium jacket subjected to that same neutron bombardment.
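
To put a rough number on that first liability, the short Python sketch below compares how much of each reaction’s yield the neutrons carry off, using standard textbook Q-values. This is my own illustration, not a calculation from Winterberg’s paper, and it ignores the follow-on burning of the tritium and helium-3 that D-D produces.

# Fraction of fusion energy carried away by neutrons, from textbook Q-values
# in MeV. A rough comparison only: real burnup chains shift these numbers.

DT = {"total": 17.6, "neutron": 14.1}                  # D + T -> He-4 + n
DD_NEUTRON_BRANCH = {"total": 3.27, "neutron": 2.45}   # D + D -> He-3 + n
DD_PROTON_BRANCH = {"total": 4.03, "neutron": 0.0}     # D + D -> T + p

dt_fraction = DT["neutron"] / DT["total"]

# The two D-D branches occur with roughly equal probability, so average them:
dd_total = DD_NEUTRON_BRANCH["total"] + DD_PROTON_BRANCH["total"]
dd_fraction = DD_NEUTRON_BRANCH["neutron"] / dd_total

print(f"D-T: ~{dt_fraction:.0%} of the energy leaves as neutrons")
print(f"D-D (primary branches only): ~{dd_fraction:.0%}")

The difference is what makes D-D worth the extra trouble of igniting it.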

The D-D reaction is not as easily ignited as D-T, but Winterberg’s work makes the prospects look good with sufficient effort. In this most recent paper he suggests a way of reducing the wasteful loss of kinetic energy to electrons and neutrons. First, the electrons are allowed to escape the fusion target after the initiation of the compression process, which merely requires the target to have a sufficiently small width. Second, the neutron/ion kinetic energy fraction can be altered by causing an auto-catalytic fusion detonation wave to form in the burning front of the reaction, which focuses the x-rays produced by the reaction onto the unburnt fuel, preparing it for fusion in turn.

The exhaust velocity is fairly high, an estimated 19,000 km/s (0.063 c), which means a 120,000 ton starship attached to 12,000,000 tons of deuterium can manage a delta-v of ~0.29 c. With an efficient magnetic sail to handle braking at the destination, the cruise speed itself approaches ~0.29 c, albeit with the mass penalty of the sail.
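
The delta-v figure follows directly from the rocket equation. Here is a quick Python check using the numbers above; it uses the same non-relativistic rocket equation the quoted delta-v evidently comes from, a reasonable approximation at six percent of lightspeed.

import math

# Check of the quoted delta-v via the classical (Tsiolkovsky) rocket equation.

C_KM_S = 299_792.458

v_exhaust = 19_000.0        # km/s, Winterberg's estimated exhaust velocity
m_dry = 120_000.0           # tons of starship
m_fuel = 12_000_000.0       # tons of frozen deuterium

mass_ratio = (m_dry + m_fuel) / m_dry          # ~101
delta_v = v_exhaust * math.log(mass_ratio)     # ~88,000 km/s

print(f"delta-v ~ {delta_v:,.0f} km/s (~{delta_v / C_KM_S:.2f} c)")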

The Shadow of Orion

Winterberg’s discussion gives some interesting insights into the “Orion” atomic rocket effort. Freeman Dyson wasn’t alone in being disturbed by the fallout issues of the fission pulse units and their triggers, and it was this concern that inspired Winterberg to develop non-fission triggers for pulsed nuclear fusion. Thus the fusion starships inspired by Winterberg’s early work on electron beam ignition, “Daedalus” among them, trace back in turn to “Orion” and the fallout fears of its developers.

Non-fission triggers are a laudable goal of fusion propulsion research, but they pose a conundrum for slowing the proliferation of thermonuclear weapons. Winterberg prefers a very high-powered proton beam ignition system for in-space flight, one that would require a mile-long Super-Marx Generator here on Earth to store up enough power for the beam. Not a proliferation risk in either manifestation. However, his preferred fusion launcher system does have military implications, using high-explosive driven argon UV lasers to trigger a two-stage D-T/D-D fusion pulse. Such a device could be used as a low-radioactivity nuclear weapon – tritium dispersal poses its chief radiological hazard.


Image: Pure deuterium fusion explosion ignited with an intense ion beam. D deuterium rod, h hohlraum, I ion beam, B magnetic field, R miniature target rocket chamber, H2 solid hydrogen, L laser beam to heat hydrogen in miniature rocket chamber. Credit: F. Winterberg.

Fission and the Space Imperative

Winterberg’s final discussion of keV-energy chemical super-explosives is the more disturbing prospect in some ways. Lines of research indicate that some elements can be forced into metastable bonds between their inner electron shells, thus storing up keV energies in chemical form. Regular outer-shell chemistry produces mere electron volts of energy per reaction, released as visible and ultraviolet photons, but keV energies mean x-ray photons, the sine qua non of fusion triggering.

In all likelihood such super-explosives will require dynamic compression to create, with apparatus probably far too heavy for bombs – but what if such materials could be ‘quenched’ and kept stable at ordinary temperatures and pressures? Then we’d have a real proliferation risk. Ted Taylor, Dyson’s late co-worker on “Orion”, once dreamt up a solution to non-fission triggering of fusion reactions, but never told anyone for fear of the implications.

As Arthur C. Clarke once said, nuclear power not only makes spaceflight possible, but imperative…
