A Lab Experiment to Test Spacetime Distortion

Sonny White’s work on exotic propulsion has galvanized the press, as witness this story in the Daily Mail, one of many articles in newspapers and online venues. I was fortunate enough to be in the sessions at the 100 Year Starship Symposium where White, an engaging and affable speaker, described what his team at Eagleworks Laboratories (Johnson Space Center) is doing. The issue at hand is whether a so-called ‘warp drive’ that distorts spacetime itself is possible given the vast amounts of energy it demands. White’s team believes the energy problem may not be as severe as originally thought.

Here I’ll quote Richard Obousy, head of Icarus Interstellar, who told Clara Moskowitz at Space.com: “Everything within space is restricted by the speed of light. But the really cool thing is space-time, the fabric of space, is not limited by the speed of light.”

On that idea hangs the warp drive. Physicists Michael Pfenning and Larry Ford went to work on Miguel Alcubierre’s 1994 paper, the first to examine the distortion of spacetime as a driver for a spacecraft, and found that such a drive would demand amounts of energy beyond anything available in the known universe. And that was only the beginning. Alcubierre’s work demanded positive energy to contract spacetime in front of the vessel and negative energy to expand spacetime behind it. Given that we do not know whether negative energy densities can exist, much less be manipulated by humans, the work remained completely theoretical.

Image: A starship (in the center of the ring) taking advantage of the distortion of spacetime. Credit: Harold White.

But interesting things have developed since the original Alcubierre paper. Running quickly through what White told the Houston audience, Chris Van Den Broeck was able to reduce the energy costs of a warp drive significantly, and other theorists have continued to drop the numbers. White’s team has been examining ways to continue that progression, but what is eye-catching is that he is working on a laboratory experiment to “perturb spacetime by one part in ten million” using an instrument called the White-Juday Warp Field Interferometer to create the minute spacetime disruption.

I know of no teams other than White’s who are looking at lab work that could tell us whether a perturbation of spacetime can actually be created. From a NASA document on this work:

Across 1cm, the experimental rig should be able to measure space perturbations down to ~1 part in 10,000,000. As previously discussed, the canonical form of the metric suggests that boost may be the driving phenomenon in the process of physically establishing the phenomenon in a lab. Further, the energy density character over a number of shell thicknesses suggests that a toroidal donut of boost can establish the spherical region. Based on the expected sensitivity of the rig, a 1cm diameter toroidal test article (something as simple as a very high-voltage capacitor ring) with a boost on the order of 1.0000001 is necessary to generate an effect that can be effectively detected by the apparatus. The intensity and spatial distribution of the phenomenon can be quantified using 2D analytic signal techniques comparing the detected interferometer fringe plot with the test device off with the detected plot with the device energized.
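The sensitivity quoted in that passage is easy to sanity-check. A minimal sketch, assuming a typical 633 nm He-Ne laser wavelength (the wavelength is my illustrative assumption, not a detail from the NASA document):

```python
# Back-of-envelope check of the quoted rig sensitivity: a spacetime
# perturbation of 1 part in 10^7 across a 1 cm interferometer leg.
# The 633 nm He-Ne wavelength is an assumed illustrative value.

leg_length = 0.01          # m, the 1 cm span cited in the document
strain = 1e-7              # 1 part in ten million
wavelength = 633e-9        # m, typical He-Ne laser (assumption)

path_change = leg_length * strain          # optical path difference
fringe_shift = path_change / wavelength    # fraction of one fringe

print(f"path change: {path_change * 1e9:.1f} nm")
print(f"fringe shift: {fringe_shift:.4f} of a fringe")
```

In other words, the apparatus would have to resolve a path difference of about a nanometer, a few thousandths of a fringe, which is demanding but within reach of careful tabletop interferometry.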

So it’s interesting stuff, and it takes us to an even lower energy requirement, from the mass-energy of a planet the size of Jupiter to, in White’s view, a mass about the size of one of our Voyager probes. The reduction in the exotic matter/negative pressure required is managed by optimizing the warp bubble thickness and also by oscillating the bubble intensity, which according to White’s mathematics reduces the stiffness of spacetime. Thus we go from a Jupiter-sized portion of exotic matter to an amount weighing less than 500 kg.
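The scale of that claimed reduction is worth making concrete. A rough sketch, using round illustrative masses (the ~700 kg Voyager-class figure is my assumption; the text says only “less than 500 kg” of exotic matter and “about the size of one of our Voyager probes”):

```python
# Scale of the claimed reduction in exotic mass-energy: from a
# Jupiter-class mass down to a Voyager-class mass. Both masses
# are round illustrative figures, not values from White's papers.

c = 2.998e8              # m/s, speed of light
jupiter_kg = 1.9e27      # kg, approximate mass of Jupiter
voyager_kg = 700.0       # kg, rough Voyager probe mass (assumption)

e_jupiter = jupiter_kg * c**2   # mass-energy, E = mc^2
e_voyager = voyager_kg * c**2

print(f"Jupiter-scale requirement: {e_jupiter:.1e} J")
print(f"Voyager-scale requirement: {e_voyager:.1e} J")
print(f"reduction factor: {jupiter_kg / voyager_kg:.1e}")
```

Even the reduced figure is an enormous amount of mass-energy, but the claimed improvement spans some twenty-four orders of magnitude.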

White said the test was an attempt to prove that spacetime perturbation is possible, and likened it to a ‘Chicago pile moment.’ It was in 1942 that the first demonstration of a controlled nuclear reaction produced just half a watt of power, but a year later a four megawatt reactor was already in operation. With no tidal forces inside the warp bubble and a proper acceleration of zero, a future craft would be an undemanding platform in which to travel, and White pointed out that clocks aboard the spacecraft would move at the same rate as clocks back on Earth. It’s an exotic idea, but one that White’s lab testbed will now poke and prod to see if it’s possible.

Addendum: Al Jackson just sent me an email about other matters, but it includes a portion that’s specifically related to the above topic that I want to quote:

“I did my doctoral stuff in General Relativity. When I was in Austin for Armadillocon last August, I asked my adviser, Richard Matzner, about the Alcubierre deal. Since Richard does a lot of numerical GR and knows Alcubierre (who is an ace numerical GR guy), he says he never heard him talk about his warp drive. Richard is not much interested in it either; he thinks the solution is Lyapunov unstable. I have seen some works from Italy about Alcubierre and other ‘exotic matter’ warp solutions that show the models are unstable. Richard said he thinks Kip Thorne is no longer interested in it. I have never seen a really ‘heavy hitter’ like Hawking or Thorne, or a whole lot of other first string GR theorists, ever remark on Alcubierre or the other recent solutions. There was a ‘name’ relativist, William A. Hiscock, who did; he felt the solutions were not physical, but he thought people should keep trying. Alas that guy died young, only a few years ago.

But it is interesting that these solutions exist. I think, it’s going to take more imagination and further discoveries before something can be made of this.”


Exotic Detections: Wormholes and Worldships

SETI always makes us ask what human-centered assumptions we are making about extraterrestrial civilizations. When it comes to detecting an actual technology, like the starships we’ve been talking about in the last two posts, we’ve largely been forced to study concepts that fit our understanding of physics. Thus Robert Zubrin talks about how we might detect a magsail, or an antimatter engine, or a fusion-powered spacecraft, but he’s careful to note that the kind of concepts once studied by the Breakthrough Propulsion Physics Project at NASA may be undetectable, since we really don’t know what’s possible and what its signature might be.

I mentioned zero-point energy in a previous post because Zubrin likewise mentions it, an idea that would draw from the energy of the vacuum at the quantum level. Would a craft using such energies — if it’s even possible — leave a detectable signal? I’ve never seen a paper on this, but it’s true that one classic paper has looked at another truly exotic mechanism for interstellar travel, the wormhole. These shortcuts through spacetime make space travel a snap. Because they connect one part of the universe to another, you go in one end and come out the other, emerging into another place and, for all we know, another time.

The fact that we don’t know whether wormholes exist doesn’t mean we can’t think about how to detect one, although the authors of the classic paper on wormhole detection make no assumptions about whether or not any intelligent species would actually be using a wormhole. The paper is “Natural Wormholes as Gravitational Lenses,” and it’s no surprise to find that its authors are not only wormhole specialists like Matt Visser and Michael Morris, but physicists with a science fiction connection like John Cramer, Geoffrey Landis, Gregory Benford and the formidable Robert Forward.

Image: A wormhole presents a shortcut through spacetime. Can one be detected? Credit: Wikimedia Commons.

The analysis assumes that the mouth of a wormhole would accrete mass, which would give the other mouth a net negative mass that would behave in gravitationally unusual ways. Thus the GNACHO (gravitationally negative anomalous compact halo object), which playfully echoes the acronym for massive compact halo objects (MACHOs). Observationally, we can look for a gravitational lensing signature that will enhance background stars by bending light in a fundamentally different way than what a MACHO would do. And because we have MACHO search data available, the authors propose checking them for a GNACHO signature.

In conventional gravitational lensing, when a massive object moves between you and a much more distant object, a greatly magnified and distorted image of the distant object can be seen. Gravitational lensing like this has proven a useful tool for astrophysicists and has also been a means of exoplanet detection. But when a wormhole moves in front of another star, it should de-focus the light and dim it. And as the wormhole continues to move in relation to the background star, it should create a sudden spike of light. The signature, then, is two spikes with a steep lowering of light between them.
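That twin-spike signature can be sketched with the standard point-lens toy model from the microlensing literature, with the mass taken negative. The formula below is the commonly quoted negative-mass analogue of the familiar point-lens magnification; treat the exact normalization as illustrative rather than as the paper’s own derivation:

```python
import math

def negative_mass_magnification(u):
    """Total magnification of a background star lensed by a
    negative-mass point lens. u is the source-lens separation in
    Einstein-radius units. For u < 2 no images form and the star
    vanishes; at u = 2 the magnification diverges, producing the
    two bright spikes flanking a dark interval described above.
    (Negative-mass analogue of the point-lens formula; the
    normalization here is illustrative.)"""
    if u <= 2.0:
        return 0.0
    return (u * u - 2.0) / (u * math.sqrt(u * u - 4.0))

# Sample a lens drifting past the line of sight: spike, darkness,
# spike -- unlike the single smooth peak of a positive-mass
# (MACHO) microlensing event.
for u in (3.0, 2.1, 2.001, 1.5, 2.001, 2.1, 3.0):
    print(f"u = {u:6.3f}  magnification = {negative_mass_magnification(u):.3f}")
```

A search algorithm scanning MACHO survey light curves would thus be looking for this double-spike-with-dropout shape rather than the smooth symmetric brightening of an ordinary lensing event.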

The authors think we might find the first solid evidence for the existence of a wormhole in our data by looking for such an event, saying “…the negative gravitational lensing presented here, if observed, would provide distinctive and unambiguous evidence for the existence of a foreground object of negative mass.” And it goes without saying that today’s astronomy, which collects information at a rate far faster than it can be analyzed, might have such evidence tucked away in computer data waiting to be discovered by the right search algorithms.

Would a wormhole be a transportation device? Nobody knows. Assuming we discover a wormhole one day, it would likely be so far away that we wouldn’t be able to get to it to examine its possibilities. But it’s not inconceivable that a sufficiently advanced civilization might be able to create an artificial wormhole, creating a network of spacetime shortcuts for instantaneous travel. Matt Visser has discussed a wormhole whose mouth would be held open by negative energy, ‘…a flat-space wormhole mouth framed by a single continuous loop of exotic cosmic string.’ A primordial wormhole might survive from the early universe. Could one also be created by technology?

Civilizations on the Brink

More conventional means of transport like solar or laser-powered sails present serious problems for detection. In Jerry Pournelle and Larry Niven’s The Mote in God’s Eye, an alien lightsail is detected moving at seven percent of the speed of light, its spectrum the same as the star that it is approaching but blueshifted, which is how analysts have determined it is a sail. The novel’s detection occurs with far more sophisticated observatories than we have in our day, when finding a solar or lightsail in transit would be a tricky thing indeed. A fusion rocket, for example, would emit largely in the X-ray range and could be detectable for several light years, but a lightsail, whose signature shifts with shape and orientation, is a far more elusive catch.

I remembered reading something about this in Gregory Matloff’s Deep Space Probes (Springer, 2005) and checked the book to extract this:

If ET prefers non-nuclear travel, he might utilise a laser or maser light sail. If the starship is near enough and the laser/maser is powerful enough, reflections from the sail might be observable as a fast-moving and accelerating monochromatic ‘star.’ However, detection will depend on sail shape and orientation as well as other physical factors.

Therefore, it is not as easy to model the spectral signature of these craft as it is energetic nuclear craft. A starship accelerated using lasers or masers may be easier to detect during deceleration if a magsail is used.

Writing in the comments to yesterday’s post, Centauri Dreams reader James Jason Wentworth recalls Larry Niven’s short story “The Fourth Profession,” which has a lightsail detection something like the one in The Mote in God’s Eye:

“All right. The astronomers were studying a nearby nova, so they caught the intruder a little sooner. It showed a strange spectrum, radically different from a nova and much more constant. It got even stranger. The light was growing brighter at the same time the spectral lines were shifting toward the red.

“It was months before anyone identified the spectrum.

“Then one Jerome Finney finally caught wise. He showed that the spectrum was the light of our own sun, drastically blue-shifted. Some kind of mirror was coming at us, moving at a hell of a clip, but slowing as it came.”

Some sails could be truly gigantic, and we can imagine worldships large enough to require sails with radii on a planetary scale, detectable when near their home or destination stars but hard to find while in cruise. Matloff goes on to suggest that any search for this kind of ship should look near stars from which an entire civilization might be emigrating. A star like Beta Hydri is a possibility, a nearby (21 light years) solar-type star now evolving off the main sequence. This is the longest shot of all, but finding unusual signatures in visible light near a star leaving the main sequence would at least compel a second look.

The wormhole paper is John Cramer, Robert L. Forward, Gregory Benford et al., “Natural Wormholes as Gravitational Lenses,” Physical Review D (March 15, 1995): pp. 3124-27 (available online). See also Matloff and Pazmino, “Detecting Interstellar Migrations,” in Astronomical and Biochemical Origins and the Search for Life in the Universe, ed. C. B. Cosmovici, S. Bowyer and D. Werthimer, Editrici Compositori, Bologna, Italy (1997), pp. 757-759.


Resolving the Pioneer Anomaly

Anomalies are always fascinating because they cause us to re-examine our standard explanation for things. But in the case of the so-called ‘Pioneer anomaly,’ the Jet Propulsion Laboratory’s Slava Turyshev, working with a group of scientists led by JPL’s John Anderson, needed an explanation for practical reasons. The possibility that there was new physics to be detected had the scientists wondering about a deep space mission to investigate the matter, but missions are expensive and the case for a genuine Pioneer effect had to be strengthened or else put to rest.

All of this led Turyshev to begin a multi-year data-gathering mission of his own, scouring records related to Pioneer wherever they might be found to see if what was happening to the spacecraft could be explained. The effect was tiny enough that it was originally dismissed as the result of leftover propellant in the fuel lines, but that explanation wouldn’t wash. Something was causing the two Pioneers to decelerate back toward the Sun, a deceleration that was finally measured as being about 300 inches per day squared (0.9 nanometers per second squared).

I love Turyshev’s quote on the matter, as seen in this JPL news release:

“The effect is something like when you’re driving a car and the photons from your headlights are pushing you backward. It is very subtle.”

Subtle indeed, but combing through telemetry and Doppler data, the team made a number of memorable finds, starting with the discovery of dozens of boxes of magnetic tapes stored under a staircase at JPL itself. This one is a story I always cite when talking about the danger of data loss in a time of digital information. Fully 400 reels of magnetic tape were involved, carrying records from the 114 onboard sensors that charted the progress of each of the two missions. All this information had to be transferred to DVD, as did other data from floppy disks that had been preserved at NASA’s Ames Research Center by mission engineer Larry Kellogg.

Image: For old times’ sake (and because these guys are heroes of mine), great figures from the Pioneer era, here seen celebrating after what turned out to be one of the last contacts with Pioneer 10 in 2002 (the final contact was made in January of 2003). Left to right: Paul Travis, Pioneer senior flight controller; Larry Lasher, Pioneer project manager; Dave Lozier, Pioneer flight director and Larry Kellogg, project flight technician. Credit: NASA Ames.

We came just that close to losing the key Pioneer data altogether, a reminder of the need to back up information and convert it into new formats to ensure its preservation. The Pioneers were launched at a time when data were routinely saved on punch cards, which were then converted into different formats at JPL, and other information had to be tracked down at the National Space Science Data Center at NASA Goddard in Greenbelt, MD. All told, Turyshev and team collected about 43 gigabytes of data and – a close call indeed – one of the tape machines needed for replaying the magnetic tapes, an item about to be discarded.

In the salvaged files from the Pioneers we learn the secret of the anomaly: Heat from electrical instruments and the thermoelectric power supply produces the effect detected from Earth, or in the words of the recently published paper on this research, the anomaly is due to “…the recoil force associated with an anisotropic emission of thermal radiation off the vehicles.” Take account of the thermal recoil force and no anomalous acceleration remains. If this work stands up, the Pioneer anomaly, well worth investigating because it seemed to challenge our standard model of physics, can be explained in a way that is consistent with that model in every respect.
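A quick estimate shows why waste heat is a plausible culprit. A minimal sketch, using round illustrative figures for the spacecraft mass and RTG thermal output (these are my assumptions, not numbers from the Turyshev paper):

```python
# Rough check that anisotropically emitted waste heat can supply
# the anomalous force. The spacecraft mass and RTG heat values
# below are round illustrative assumptions.

c = 2.998e8                 # m/s, speed of light
accel = 0.9e-9              # m/s^2, the deceleration cited above
mass = 250.0                # kg, approximate Pioneer mass (assumption)
rtg_heat = 2500.0           # W, rough RTG thermal output (assumption)

force = mass * accel                  # N needed to decelerate the craft
directed_power = force * c            # W of photons emitted one way (F = P/c)
fraction = directed_power / rtg_heat  # anisotropy required

print(f"required force: {force:.2e} N")
print(f"directed thermal power: {directed_power:.0f} W")
print(f"fraction of RTG heat: {fraction:.1%}")
```

On these rough numbers, radiating only a few percent of the onboard waste heat preferentially in the direction of travel is enough to produce the measured deceleration, which is exactly the kind of fore-aft asymmetry the thermal models found.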

The paper is Turyshev et al., “Support for the Thermal Origin of the Pioneer Anomaly,” Physical Review Letters 108, 241101 (abstract). Credit for the Pioneer image at the top of the page: Don Davis/NASA.


FTL Neutrinos: Closing In on a Solution

The news that the faster-than-light neutrino results announced to such widespread interest by the OPERA collaboration have now been explained has been spreading irresistibly around the Internet. But the brief piece in ScienceInsider that broke the news was stretching a point with a lead reading “Error Undoes Faster-Than-Light Neutrino Results.” For when you read the story, you see that a fiber optic cable connection is a possible culprit, though as yet an unconfirmed one.

Sean Carroll (Caltech) blogged on Cosmic Variance that while he wanted to pass the news along, he was reserving judgment until a better-sourced statement came to hand. I’ve thought since the beginning that a systematic error would explain the ‘FTL neutrino’ story, but I still was waiting for something with more meat on it than the ScienceInsider news. It came later in the day with an official CERN news release, and this certainly bears quoting:

The OPERA collaboration has informed its funding agencies and host laboratories that it has identified two possible effects that could have an influence on its neutrino timing measurement. These both require further tests with a short pulsed beam.

So we have not just one but two possibilities here, both with ramifications for the neutrino timing measurements and both needing further testing. And let’s go on with the news release:

If confirmed, one would increase the size of the measured effect, the other would diminish it. The first possible effect concerns an oscillator used to provide the time stamps for GPS synchronizations. It could have led to an overestimate of the neutrino’s time of flight. The second concerns the optical fibre connector that brings the external GPS signal to the OPERA master clock, which may not have been functioning correctly when the measurements were taken. If this is the case, it could have led to an underestimate of the time of flight of the neutrinos. The potential extent of these two effects is being studied by the OPERA collaboration. New measurements with short pulsed beams are scheduled for May.

Image: Detectors of the OPERA (Oscillation Project with Emulsion-tRacking Apparatus) experiment at the Italian Gran Sasso underground laboratory. Credit: CERN/AFP/Getty Images.

We may well be closing in on an explanation for a result many scientists had found inconceivable. Here’s a BBC story on the possibility of trouble with the oscillator and/or an issue with the optical fiber connection. We learn here that a new measurement of the neutrino velocity will be taken in 2012, taking advantage of international facilities ranging from CERN and the Gran Sasso laboratory in Italy to Fermilab and the Japanese T2K. The story quotes Alfons Weber (Oxford University), who is working on the Minos effort to study the neutrino measurements at Fermilab:

“I can say that Minos will quite definitely go ahead… We’ve already installed most of the equipment we need to make an accurate measurement. Even if Opera now publish that ‘yes, everything is fine’, we still want to make sure that we come up with a consistent, independent measurement, and I assume that the other experiments will go forward with this as well.”

So this is where we are: An anomalous and extremely controversial result is being subjected to a variety of tests to find out what caused it. If I were a betting man, I would put a great deal of money on the proposition that the FTL results will eventually be traced down to something as mundane as the optical fiber connector that is now the subject of so much attention. But we’ll know that when it happens, and this is the way science is supposed to work. OPERA conducted numerous measurements over a three year period before announcing the FTL result. Let’s now give the further work time to sort out what really happened so we can put this issue to rest.


New Work on FTL Neutrinos

A paper in the December 24 issue of Physical Review Letters goes to work on the finding of supposed faster-than-light neutrinos by the OPERA experiment. The FTL story has been popping up ever since OPERA — a collaboration between the Laboratori Nazionali del Gran Sasso (LNGS) in Gran Sasso, Italy and the CERN physics laboratory in Geneva — reported last September that neutrinos from CERN had arrived at Gran Sasso’s underground facilities 60 nanoseconds sooner than they would have been expected to arrive if travelling at the speed of light.
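It is worth pausing on what a 60-nanosecond head start actually implies. A minimal sketch, assuming the commonly quoted ~730 km CERN-Gran Sasso baseline (a round figure, not a value from the OPERA paper):

```python
# What a 60 ns early arrival implies for the neutrino speed.
# The ~730 km CERN-Gran Sasso baseline is a commonly quoted
# round figure, used here as an assumption.

c = 2.998e8        # m/s, speed of light
baseline = 730e3   # m, CERN to Gran Sasso (approximate)
early = 60e-9      # s, the reported head start

flight_time = baseline / c        # light travel time, a few ms
excess = early / flight_time      # fractional speed excess (v - c)/c

print(f"light travel time: {flight_time * 1e3:.3f} ms")
print(f"(v - c)/c ~ {excess:.2e}")
```

So the claim amounts to neutrinos outpacing light by a few parts in a hundred thousand, a small fraction but one with enormous consequences for special relativity.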

The resultant explosion of interest was understandable. Because neutrinos are now thought to have a non-zero mass, an FTL neutrino would be in direct violation of the theory of special relativity, which says that no object with mass can attain the speed of light. Now Ramanath Cowsik (Washington University, St. Louis) and collaborators have examined whether an FTL result was possible. Neutrinos in the experiment were produced by particle collisions that yielded a stream of pions. The latter are unstable and decay into muons and neutrinos.

What Cowsik and team wanted to know was whether pion decays could produce superluminal neutrinos, assuming the conservation of energy and momentum. The result:

“We’ve shown in this paper that if the neutrino that comes out of a pion decay were going faster than the speed of light, the pion lifetime would get longer, and the neutrino would carry a smaller fraction of the energy shared by the neutrino and the muon,” Cowsik says. “What’s more, these difficulties would only increase as the pion energy increases. So we are saying that in the present framework of physics, superluminal neutrinos would be difficult to produce.”

This news release from Washington University gives more details, pointing out that an important check on the OPERA results is the Antarctic neutrino observatory called IceCube, which detects neutrinos from a far different source than CERN. Cosmic rays striking the Earth’s atmosphere produce neutrinos, some of which IceCube has recorded at energies 10,000 times higher than those of the neutrinos in the OPERA experiment. The IceCube results show that the high-energy pions whose decays produce these neutrinos generate neutrinos that come close to the speed of light but do not surpass it. This is backed up by conservation of energy and momentum calculations showing that the lifetimes of these pions would be too long for them to decay into superluminal neutrinos. The tantalizing OPERA results look more than ever in doubt.

Image: The IceCube experiment in Antarctica provides an experimental check on Cowsik’s theoretical calculations. According to Cowsik, neutrinos with extremely high energies should show up at IceCube only if superluminal neutrinos are an impossibility. Because IceCube is seeing high-energy neutrinos, there must be something wrong with the observation of superluminal neutrinos. Credit: ICE.WUSTL.EDU/Pete Guest.

As we continue to home in on what happened in the OPERA experiment, it’s heartening to see how many physicists are praising the OPERA team for their methods. Cowsik himself notes that the OPERA scientists worked for months searching for possible errors and, when they found none, published in an attempt to involve the physics community in solving the conundrum. Since then, Andrew Cohen and Sheldon Glashow have shown (in Physical Review Letters) that if superluminal neutrinos existed, they would radiate energy in the form of electron-positron pairs.

“We are saying that, given physics as we know it today, it should be hard to produce any neutrinos with superluminal velocities, and Cohen and Glashow are saying that even if you did, they’d quickly radiate away their energy and slow down,” Cowsik says.

The paper is Cowsik et al., “Superluminal Neutrinos at OPERA Confront Pion Decay Kinematics,” Physical Review Letters 107, 251801 (2011). Abstract available. The Cohen/Glashow paper is “Pair Creation Constrains Superluminal Neutrino Propagation,” Physical Review Letters 107, 181803 (2011), with abstract available here.


The SN 1987A Experiment

If neutrinos really do travel at a velocity slightly higher than the speed of light, we have a measurement that challenges Einstein, which explains the intense interest in the results at CERN that we discussed on Friday. I think CERN is taking exactly the right approach in treating the matter with caution, as in this statement from a Saturday news release:

…many searches have been made for deviations from Einstein’s theory of relativity, so far not finding any such evidence. The strong constraints arising from these observations make an interpretation of the OPERA measurement in terms of modification of Einstein’s theory unlikely, and give further strong reason to seek new independent measurements.

And this is followed up by a statement from CERN research director Sergio Bertolucci:

“When an experiment finds an apparently unbelievable result and can find no artifact of the measurement to account for it, it’s normal procedure to invite broader scrutiny, and this is exactly what the OPERA collaboration is doing, it’s good scientific practice. If this measurement is confirmed, it might change our view of physics, but we need to be sure that there are no other, more mundane, explanations. That will require independent measurements.”

All this is part of the scientific process, as data are sifted, results are published, and subsequent experiments either confirm or question the original results. I’m glad to see that the supernova SN 1987A has turned up here in comments to the original post. The supernova, which exploded in February of 1987 in the Large Magellanic Cloud, was detected by the “Kamiokande II” neutrino detector in the Kamioka mine in Japan. It was also noted by the IMB detector located in the Morton-Thiokol salt mine near Fairport, Ohio and the ‘Baksan’ telescope in the North Caucasus Mountains of Russia.

Neutrinos scarcely interact with matter, which means they escape an exploding star more quickly than photons, something the SN 1987A measurements confirmed. But SN 1987A is 170,000 light years away. If neutrinos moved slightly faster than the speed of light, they would have arrived at the Earth years — not hours — before the detected photons from the supernova. The 25 detected neutrinos were a tiny fraction of the total produced by the explosion, but their timing matched what physicists believed about their speed. The OPERA result, in other words, is contradicted by an experiment in the sky, and we have a puzzle on our hands, one made still more intriguing by Friday’s seminar at CERN, where scientists like Nobel laureate Samuel Ting (MIT) congratulated the team on what he called an ‘extremely beautiful experiment,’ one in which systematic error had been carefully checked.
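The “years, not hours” point follows from simple scaling. A minimal sketch, applying the ~2.5e-5 fractional speed excess implied by the OPERA timing (a round assumed figure) to the supernova’s distance:

```python
# How early SN 1987A's neutrinos would have arrived had they
# outpaced light by the fraction implied by OPERA. The ~2.5e-5
# excess is a round assumed figure for illustration.

distance_ly = 170_000          # light years, as cited above
excess = 2.5e-5                # assumed (v - c)/c

# To first order in the excess, the head start over light is
# simply excess * (light travel time), here in years:
early_years = distance_ly * excess
print(f"early arrival: {early_years:.1f} years")
```

An arrival roughly four years ahead of the light would have been unmissable; the actual neutrinos led the photons by mere hours, just as escape from the collapsing core predicts.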

Image: In February 1987, light from the brightest stellar explosion seen in modern times reached Earth — supernova SN1987A. This Hubble Space Telescope image from the sharp Advanced Camera for Surveys taken in November 2003 shows the explosion site over 16 years later. Supernova SN1987A lies in the Large Magellanic Cloud, a neighboring galaxy some 170,000 light-years away. That means that the explosive event – the core collapse and detonation of a star about 20 times as massive as the Sun – actually occurred 170,000 years before February 1987. Credit: P. Challis, R. Kirshner (CfA), and B. Sugerman (STScI), NASA.

It’s true that OPERA was working with a large sample — some 16,000 neutrino interaction events — but skepticism remains the order of the day, because as this New Scientist story points out, there is potential uncertainty in the neutrinos’ departure time, there being no neutrino detector at the CERN end. As for the GPS measurements, New Scientist labels them so accurate that they could detect the drift of the Earth’s tectonic plates. Can we still tease out a systematic error from the highly detailed presentation and paper produced by the CERN researchers? They themselves are cautious, as the paper makes clear:

Despite the large significance of the measurement reported here and the stability of the analysis, the potentially great impact of the result motivates the continuation of our studies in order to investigate possible still unknown systematic effects that could explain the observed anomaly. We deliberately do not attempt any theoretical or phenomenological interpretation of the results.

A prudent policy. Let’s see what subsequent experiments can tell us about neutrinos and their speed. The paper is The OPERA Collaboration, “Measurement of the neutrino velocity with the OPERA detector in the CNGS beam,” available as a preprint.
