A Test for Exotic Propulsion?

Can we calculate the gravitational field of a mass moving close to the speed of light? Franklin Felber (Starmark Inc) believes he can, with implications for propulsion. Back in 2006 we looked briefly at Felber’s work, describing what the physicist believes to be a repulsive gravitational field that emerges from his results. Felber discussed the matter at the Space Technology and Applications International Forum (STAIF) that year, where he presented his calculations of the ‘relativistically exact motion of a payload in the gravitational field of a source moving with constant velocity.’

Above a certain critical velocity, Felber believes, any mass will gravitationally repel other masses, an effect that is twice as strong in the forward direction of motion, but also works in the backward direction. An object lying in the narrow beam thus produced could be accelerated quickly and with little stress. He described the effect in a paper he submitted in 2005 to the arXiv site:

At radial approach or recession speeds faster than 3^(-1/2) times the speed of light, even a small mass gravitationally repels a payload. At relativistic speeds, a suitable mass can quickly propel a heavy payload from rest nearly to the speed of light with negligible stresses on the payload.

In other words, a mass moving faster than roughly 57.7 percent of the speed of light will repel other masses placed within what we could call an ‘antigravity beam’ in front of or behind it. If true, the effect would provide the energy source needed to produce accelerations otherwise impossible. In Felber’s studies, the supposed ‘antigravity’ effect grows stronger as the driver mass approaches ever closer to the speed of light.
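
That critical speed is simply the 3^(-1/2) figure from the quote expressed as a fraction of c (a quick arithmetic check, not part of Felber’s derivation):

\[
v_{\mathrm{crit}} = 3^{-1/2}\,c = \frac{c}{\sqrt{3}} \approx 0.577\,c
\]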

The advantages are listed in Felber’s recent 2009 paper:

This new means of ‘antigravity’ propulsion addresses the major engineering challenges for near-light-speed space travel of providing enormous propulsion energy quickly without undue stresses on a spacecraft. By conventional propulsion, acceleration of a 1-ton payload to 0.9c requires imparting a kinetic energy equivalent to about 30 billion tons of TNT. In the ‘antigravity beam’ of a speeding star or compact object, however, a payload would draw its energy for propulsion from the repulsive force of the much more massive driver. Moreover, since it would be moving along a geodesic, a payload would ‘float weightlessly’ in the ‘antigravity beam’ even as it was accelerated close to the speed of light.
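
As a rough check on that figure (my own back-of-the-envelope arithmetic, not a calculation taken from the paper), the relativistic kinetic energy of a 1-ton payload at 0.9c is

\[
E_k = (\gamma - 1)\,m c^2, \qquad \gamma = \frac{1}{\sqrt{1 - 0.9^2}} \approx 2.29,
\]
\[
E_k \approx 1.29 \times 10^3\ \mathrm{kg} \times (3 \times 10^8\ \mathrm{m/s})^2 \approx 1.2 \times 10^{20}\ \mathrm{J} \approx 2.8 \times 10^{10}\ \mathrm{tons\ of\ TNT},
\]

taking one ton of TNT as roughly 4.2 × 10^9 joules, in line with the ‘about 30 billion tons’ cited above.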

The effect would take place within a narrow cone, but it would be extraordinarily useful, in Felber’s view, if we could find a way to tap it, for the energy needed to reach these velocities would be supplied naturally by the driver mass, and the stresses of acceleration would be manageable tidal forces in free-fall motion along a geodesic. The result is what Felber calls ‘hypervelocity propulsion.’

To say this is problematic is to state the obvious. How to tap into these energies? Here’s Felber’s thought on that, from the 2006 paper:

Whether the payload is accelerated by a strong or a weak field, the payload travels along a geodesic. The only stresses on the payload, therefore, are the result of tidal forces in the accelerated frame of the payload. These stresses can be arranged by choice of the trajectory to be kept within acceptable limits. Greater practical problems for gravitational propulsion are finding a suitable and accessible driver mass at relativistic velocities, and maneuvering the payload in and out of the driver trajectory.

The italics are mine, highlighting a key issue — if Felber’s work (which draws on a 1924 David Hilbert paper that discussed the repulsion of relativistic particles by a static Schwarzschild field) is correct, then we still have the problem of arranging our payload in relation to the driver mass. In other words, taking advantage of these effects would itself require breakthroughs in space propulsion that would render the advantage of using the effect minimal. It would assume a highly advanced space infrastructure, one capable of ranging freely through deep space, and apparently a lot of luck.

But let’s put aside practicality and look at the effect itself. Theories abound and what we need are workable ways of testing them, which is why so many people are dissatisfied with the various string theory formulations — how do we confirm what seem to be purely mathematical constructs? Felber’s new paper argues that the Large Hadron Collider will be capable of testing his ideas by measuring the forces on a test mass. The physicist believes such a test could be performed without interfering with normal LHC operations, assuming we get the LHC to ‘normal’ operations any time soon.

Felber’s experiment would measure “… the repulsive gravitational impulses of proton bunches delivered in their forward direction to resonant detectors just outside the beam pipe. This test could provide accurate measurements of post-Newtonian parameters and the first observation of ‘antigravity’, as well as validating the potential utility of relativistic gravity for spacecraft propulsion in the distant future.” He believes such a test could be performed for less than one percent of the cost of NASA’s Gravity Probe B, whose total tariff may well have reached $1 billion. Lab tests can be cheaper than space tests, but will Felber’s ideas attract the needed funding even at these levels?

The 2009 paper is Felber, “Test of relativistic gravity for propulsion at the Large Hadron Collider” (abstract), while the 2006 paper is “Exact Relativistic ‘Antigravity’ Propulsion” (abstract). Technology Review looks at Felber here.


Asteroids: A Near Miss, An Informative Hit

New observations of asteroid Apophis, reported at the Division for Planetary Sciences meeting in Puerto Rico, indicate that the chances of its striking the Earth in 2036 must be recalculated, diminishing from roughly 1 in 45,000 to 1 in 250,000. There goes one disaster scenario, but enter another: An impact possibility exists for the year 2068. Says David Tholen (University of Hawaii):

“Our new orbit solution shows that Apophis will miss Earth’s surface in 2036 by a scant 20,270 miles, give or take 125 miles. That’s slightly closer to Earth than most of our communications and weather satellites.”

Too close for comfort, but a miss is a miss. Apophis reappears from behind the Sun in 2010 and is sure to be the object of even more intense scrutiny in years to come. But bear in mind that, just as we were able to refine our figures for the 2029 and 2036 encounters, we will probably be able to reduce the 1 in 333,000 chance now calculated for an Apophis strike in 2068.


Image: Asteroid Apophis will keep us occupied for some time in figuring out the chances it will strike the Earth at some future date. Perhaps more important, it serves as a reminder of the number of near-Earth objects we have yet to identify. Most of the data for the latest Apophis updates come from the University of Hawaii’s Institute for Astronomy in Manoa. Credit: UH/IA.

Remember that we’ve only known about this object since its 2004 discovery, and the subsequent finding that an impact probability (for April 13, 2029) existed caused a bit of a sensation before it was discounted. Now we’ve shoved Apophis back to 2068, but it continues to remind us that our catalog of Earth-crossing objects is anything but complete. Don Yeomans (JPL) notes that the Near-Earth Object Program Office at JPL offers a Twitter feed (@AsteroidWatch), or you can check the AsteroidWatch Web site.

Related: Also from the Division for Planetary Sciences meeting comes further word about the small asteroid 2008 TC3 that fell to Earth last year. This was the first asteroid to have been studied before it hit the Earth. Astronomers Marek Kozubal and Ron Dantowitz of the Clay Center Observatory (Brookline, MA) tracked the object for two hours just before impact. Peter Scheirich (Ondrejov Observatory, Czech Republic) and colleagues deduced that it was shaped like ‘a loaf of walnut raisin-bread.’


Image: Artist’s conception of 2008 TC3 approaching the Earth. Credit: SETI Institute.

I do a lot of baking and my walnut-raisin bread isn’t shaped much differently from my standard sandwich loaf, so I’m uncertain about the detail here. But it’s fascinating to learn from forensic evidence presented at the DPS session that fragments of 2008 TC3 are of a type called ‘polymict ureilite.’ The meteorites show traces of being heated to 1150-1300 degrees Celsius before cooling rapidly, a process in which carbon in the asteroid turned part of the iron in its olivine minerals into metallic iron. A SETI Institute news release explains the significance of this:

… asteroid 2008 TC3 is the remains of a minor planet that endured massive collisions billions of years ago, melting some of the minerals, but not all, before a final collision shattered the planet into asteroids.

The news release, by the way, is written by Peter Jenniskens, who traveled to the impact site in the Sudan to study the asteroid and retrieved numerous fragments. And this is interesting: Despite the torturous impacts — the carbon in these meteorites is the most ‘cooked’ of all known meteorites, and carbon crystals of graphite and nano-diamonds have been detected — some of the original organic material survived. Polycyclic aromatic hydrocarbons have turned up in abundance, and Michael Callahan and colleagues (NASA GSFC) reported that they even found traces of surviving amino acids.


An Oxygen-Rich Europan Ocean?

Long-time Centauri Dreams readers already know of my admiration for Richard Greenberg’s work on Europa, admirably summarized in his 2008 title Unmasking Europa: The Search for Life on Jupiter’s Ocean Moon (Copernicus). It’s a lively and challenging book, one in which Greenberg takes sharp issue with many of his colleagues, and although he played this aspect of the work down in a phone conversation when I reviewed the book, the animated back and forth makes for a fascinating look at how planetary science gets done.

In his book, Greenberg argues forcefully that the thickness of Europa’s ice is unlikely to be more than a few kilometers, and that its active resurfacing would make it possible for life-forms below the ice to occasionally be carried above it. That would be good news for our hopes of detecting life, of course, for it would obviate the need to drill through the ice sheet. A spacecraft’s electronics might not last long given radiation levels this close to Jupiter, but probably long enough to make that kind of identification, if such is indeed possible.

Image: View of a small region of the thin, disrupted, ice crust in the Conamara region of Jupiter’s moon Europa showing the interplay of surface color with ice structures. The white and blue colors outline areas that have been blanketed by a fine dust of ice particles ejected at the time of formation of the large (26 kilometer in diameter) crater Pwyll some 1000 kilometers to the south. A few small craters of less than 500 meters or 547 yards in diameter can be seen associated with these regions. These were probably formed, at the same time as the blanketing occurred, by large, intact, blocks of ice thrown up in the impact explosion that formed Pwyll. The unblanketed surface has a reddish brown color that has been painted by mineral contaminants carried and spread by water vapor released from below the crust when it was disrupted. The original color of the icy surface was probably a deep blue color seen in large areas elsewhere on the moon. The colors in this picture have been enhanced for visibility. Credit: NASA/JPL/University of Arizona.

The tortured ice surrounding a distant Europan ridge, then, could be the venue for our first discovery of non-terrestrial life. The larger question is whether the ocean floor on Europa actually provides the conditions for life. Here the answer also ties in to that active resurfacing, one that leaves few impact craters intact and suggests that what we see from a spacecraft could be no older than 50 million years or so. Greenberg notes at the ongoing Division for Planetary Sciences meeting in Puerto Rico this week that cracks on the surface continually fill with fresh ice, while surface areas already in place are gradually replaced.

Add in mechanisms that gradually deliver fresh material to the surface and you have the full set of resurfacing processes. They’re all Greenberg needs to estimate that the delivery rate of oxidizers into the ocean is fast, so fast that the oxygen concentration of this sub-surface ocean could exceed that of Earth’s oceans in just a few million years. The upshot: this is enough oxygen to support not just micro-organisms but larger creatures, macrofauna whose metabolisms demand more oxygen.

And this is intriguing: Greenberg argues that it would have taken a couple of billion years for the first oxygen to reach the ocean. That delay could be crucial, for early organic structures could be disrupted by oxidation. On Earth, oxygen’s late arrival allowed life to go from pre-biotic chemistry to organisms that evolved to manage oxygen’s damaging effects. The same mechanism might have allowed creatures to emerge in Europa’s ocean.

We’re talking, remember, about a global body of water, one containing about twice the liquid water of all Earth’s oceans combined. If Greenberg is right, that ocean contains a hundred times more oxygen than previously estimated, allowing roughly 3 billion kilograms of macrofauna to subsist if their need for oxygen is roughly the same as we find in terrestrial fish.

Read Unmasking Europa for a close look at the Europan surface as seen through Voyager and Galileo imagery, the latter unfortunately compromised by the failure of the spacecraft’s high-gain antenna. Thin ice is a model supported by the imagery, and Greenberg goes on to discuss it in terms of the tidal forces acting on the moon. The case for thin ice is powerful, but we need new missions to nail it down, the first of which, the Europa Jupiter System Mission, won’t arrive any earlier than 2026.


Planetary Habitability Quantified

Habitability is always a matter of definition. Is it a measure of suitability for human life? Or do we take the larger astrobiological view that it’s based on suitability for microbial life? In that case we go from a narrowly defined habitable zone here in our Solar System to one that could potentially stretch from the upper atmosphere of Venus to the suspected subsurface aquifers of some Kuiper Belt objects. But these qualitative definitions have thus far lacked a quantitative counterpart, a way to measure and compare potentially living worlds.

The matter has drawn the attention of Abel Mendez (University of Puerto Rico at Arecibo), who discussed his quantitative evaluation of planetary habitability at the Division for Planetary Sciences meeting this week in Fajardo, Puerto Rico. One of his results stands out immediately. Using finely tuned planetary models, Mendez found that among Mars, Venus, Europa, Titan and Enceladus, it is Enceladus that has the highest subsurface habitability. That makes the Saturnian moon a tempting but difficult target, the inaccessibility of its habitable region rendering Mars and Europa better compromises for near-term missions to places where habitability is still unresolved.

Take a look at the habitability functions for various worlds in the image below:


Image: This image shows a comparison of the potential habitable space available on Earth, Mars, Europa, Titan, and Enceladus. The green spheres represent the global volume with the right physical environment for most terrestrial microorganisms. On Earth, the biosphere includes parts of the atmosphere, oceans, and subsurface. The potential global habitats of the other planetary bodies are deep below their surfaces. Enceladus has the smallest volume but the highest habitat-to-planet size ratio, followed by Europa. Surprisingly, it also has the highest mean habitability H in the Solar System, although its habitable region is too deep for direct exploration. Mars and Europa are the best compromise between potential for life and accessibility. Credit: UPR Arecibo, NASA PhotoJournal.

Quantitative Habitability Theory could be useful if, as advertised, it sets up a baseline for comparing past climate scenarios here on Earth, or for examining other planets (including extrasolar ones) in light of a consistent set of principles. Here’s Mendez’ explanation of QH Theory:

“QH Theory is based on two new biophysical parameters: the habitability H, as a relative measure of the potential for life of an environment, or habitat quality, and the habitation M, as a relative measure of biodensity, or occupancy. Both parameters are related to other physiological and environmental variables and can be used to make predictions about the distribution, abundance, and productivity of primary producers, such as plants and phytoplankton, and microbial life in general. Initially, habitability was modeled from the environment’s temperature and humidity because they are easier to measure at planetary scales with ground or orbital instruments. Global habitability and habitations maps were constructed of terrestrial land and ocean areas with data gridded at various spatial and temporal resolutions. Preliminary work shows that the QH Theory is comparable to existing models in predicting terrestrial primary productivity.”
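
To give the flavor of what such an index might look like, here is a minimal sketch in Python of a habitability value built only from temperature and humidity, the two variables singled out in the quote above. The functional forms, thresholds, and helper names (thermal_suitability, habitability) are illustrative assumptions of mine, not Mendez’s actual QH model.

```python
# Toy habitability index H in [0, 1] from temperature and relative humidity,
# in the spirit of the QH approach described above. The response curves and
# limits below are illustrative assumptions, not Mendez's actual model.

def thermal_suitability(temp_c: float,
                        t_min: float = -15.0,
                        t_opt: float = 25.0,
                        t_max: float = 60.0) -> float:
    """Piecewise-linear response: 0 outside [t_min, t_max], peaking at t_opt."""
    if temp_c <= t_min or temp_c >= t_max:
        return 0.0
    if temp_c <= t_opt:
        return (temp_c - t_min) / (t_opt - t_min)
    return (t_max - temp_c) / (t_max - t_opt)

def habitability(temp_c: float, rel_humidity: float) -> float:
    """Combine temperature and water availability into a single index H."""
    water = max(0.0, min(1.0, rel_humidity))
    return thermal_suitability(temp_c) * water

if __name__ == "__main__":
    print(habitability(22.0, 0.8))   # temperate, humid site: ~0.74
    print(habitability(-40.0, 0.2))  # cold, dry site: 0.0
```

A planetary-scale version of the idea would evaluate an index like this over gridded temperature and humidity data, which is roughly what the global habitability maps mentioned in the quote describe.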

Mendez has established a quantity called Standard Primary Habitability (SPH) that uses a variety of criteria to characterize the surface habitability of a given world. Interestingly, while the current SPH of Earth is close to 0.7, the figure has been as high as 0.9 in earlier periods, including the late Cretaceous, the era that ended with the dinosaurs’ extinction. Earth achieved, in other words, a higher level of habitability then than it has today, at least as quantified by Mendez, until the events that led to the K/T extinction occurred.

James Kasting (Penn State) finds Mendez’ work useful, saying the methodology “…could also be extended to studies of planets around other stars that may be found during the next two decades.” Further integrating such factors as light, oxygen, carbon dioxide and nutrient concentrations will expand the model for Earth and help scientists extend it to habitable zones beyond.


Dark Energy’s Elusive Signature

It’s odd to think that there would be a connection between the large-scale structure of the universe and what we hope to achieve with deep space propulsion. But figuring out how things work on the largest scale may offer us valuable clues about what is possible and what is not. If we understand correctly how gravity works at the macro scale, then the evidence for ‘dark energy’ seems persuasive. Something is causing the universe not only to expand but to accelerate its expansion, and that something must operate against the force of gravity, which ought to be slowing the process down.

Which brings us to BOSS, the Baryon Oscillation Spectroscopic Survey, now beginning its operations after taking first light on the night of September 14-15. A part of the Sloan Digital Sky Survey III, BOSS will use the 2.5-meter telescope at Apache Point Observatory in New Mexico to measure the spectra of 1.4 million galaxies and 160,000 quasars by 2014. Out of this we should derive the most accurate data yet about the way the universe is put together at the largest levels. Newly designed spectrographs will be doing the leg work, more efficient at working in the infrared than instruments used by the original SDSS.


Image: One of the “first light” spectra taken by the Baryon Oscillation Spectroscopic Survey (BOSS). The top panel shows the targeted blue quasar, highlighted in the image of the sky; quasars are thought to be supermassive black holes in distant galaxies. At the bottom is the BOSS spectrum of the object, which allows astronomers to measure its “redshift”, and hence its distance. BOSS plans to collect millions of such spectra and use their distances to map the geometry of the universe. Credit: D. Hogg, V. Bhardwaj, and N. Ross.

Where dark energy comes into play is that BOSS will be able to study the way protons and neutrons — baryons — interacted with light to create pressure oscillations. These, in turn, would have created variations in density as they traveled through the early universe, finally ceasing when the cosmos was about 400,000 years old as conditions cooled enough to halt their propagation. Nikhil Padmanabhan (Yale University) puts it this way:

“Like sound waves passing through air, the waves push some of the matter closer together as they travel. In the early universe, these waves were moving at half the speed of light, but when the universe was only a few hundred thousand years old, the universe cooled enough to halt the waves, leaving a signature 500 million light-years in length.”

That signature is found in the spacing of the galaxies BOSS will be measuring, whose distribution is described by Daniel Eisenstein (University of Arizona):

“We can see these frozen waves in the distribution of galaxies today. By measuring the length of the baryon oscillations, we can determine how dark energy has affected the expansion history of the universe. That in turn helps us figure out what dark energy could be.”

Galaxies, in other words, are more likely to be separated by about 500 million light years than 400 or 600 million light years, a spacing that scientists are calling a ‘frozen’ sound wave signature. That same structural effect should be seen in the clustering of visible matter in quasars and intergalactic gas as well, and should also have caused clumping in dark matter. What effect dark energy has had on this structure throughout the universe’s expansion should be apparent through the data that BOSS gathers as the project compares these scales at different eras.
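
To picture how such a preferred separation would show up in survey data, here is a toy sketch in Python. It is mine, not the BOSS pipeline (which works with the full two-point correlation function of the real catalog): it scatters fabricated galaxy positions, injects extra pairs at a 500-million-light-year spacing, and histograms pair separations so that the bin near 500 shows a modest excess over its neighbors.

```python
import numpy as np
from scipy.spatial.distance import pdist

# Toy illustration of a 'frozen' acoustic scale in galaxy clustering.
# All positions are fabricated; BOSS measures real spectra and fits the
# two-point correlation function of ~1.4 million galaxies.

rng = np.random.default_rng(0)

BOX = 4000.0       # side of a toy survey cube, in millions of light-years
BAO_SCALE = 500.0  # the preferred separation quoted above

# Uniformly scattered 'field' galaxies.
field = rng.uniform(0.0, BOX, size=(1200, 3))

# Partners displaced by exactly BAO_SCALE in random directions, mimicking
# the slight excess of pairs at that separation.
directions = rng.normal(size=(400, 3))
directions /= np.linalg.norm(directions, axis=1, keepdims=True)
galaxies = np.vstack([field, field[:400] + BAO_SCALE * directions])

# Histogram of all pairwise separations; look for the bump near 500.
separations = pdist(galaxies)
counts, edges = np.histogram(separations, bins=np.arange(0.0, 1000.0, 25.0))
for lo, n in zip(edges[:-1], counts):
    print(f"{lo:4.0f}-{lo + 25:4.0f} Mly: {n}")
```

In the real data the excess is far subtler than in this cartoon, which is part of the reason BOSS needs spectra of well over a million galaxies to pin the scale down.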

Hard to believe that it was only in 1998 that firm evidence for dark energy first surfaced. We’ve been talking about it ever since and with more and more interest. Indeed, along with dark matter, it’s one of the great unanswered questions of physics. Now we can use a method beyond the supernova cosmology that revealed the universe’s acceleration in the first place, one that should complement these data and allow us to get a better handle on the dark energy phenomenon. Look for the first public data release from SDSS-III in December of 2010, yet another case of high-quality data becoming available for broad use on the Internet.

As to propulsion, there is simply no way to know what may come of all this. The continuing study of the universe’s structure may reveal clues to explain its acceleration without offering any helpful insights into exotic ways to push spacecraft. But one of the tenets of the Breakthrough Propulsion Physics project all along was that investigating questions like these not only helps in working out propulsion strategies but also sharpens our understanding of physics in exotic realms. We gain, then, in deepening knowledge of the physics involved even if dark energy’s seeming ‘anti-gravity’ turns out to be explained by other means. Let’s see what BOSS has to say.
