
Interstellar Flight and Long-Term Optimism

It’s fascinating to watch the development of online preprint services from curiosity (which is what the arXiv site was when Paul Ginsparg developed it in 1991) to today’s e-print options, hosted at Cornell and with mirrors not just at the original Los Alamos National Laboratory site but all around the world. Then, too, the arXiv is changing in character, becoming an early forum for discussion and debate, as witness Ian Crawford’s comments on Jean Schneider’s Astrobiology paper. We looked at Crawford’s criticisms of Schneider yesterday. Today we examine Schneider’s response, likewise a preprint, and published online in a fast-paced digital dialogue.

Schneider (Paris Observatory) focuses here on nuclear fusion and antimatter by way of making the case that interstellar flight will be a long and incredibly difficult undertaking. A bit of context: Schneider’s real point in the original Astrobiology piece wasn’t to offer a critique of interstellar flight ideas, but to call attention to the gap that will occur after we have made the first detection of biosignatures on exoplanets. We’ll have evidence of life when that occurs, but it may be centuries, in Schneider’s view, before we know what that life looks like, because unlike relatively nearby places like Mars, we’ll find it a monumental undertaking to send a probe to an exoplanet.

Antimatter and Its Dilemmas

Crawford, of course, questioned whether interstellar flight was as difficult to imagine as Schneider believed, and the two remain on separate paths. Schneider’s concerns about antimatter are solid: He’s looking at what it would take to produce the antimatter needed to power a 100-ton spacecraft, and finds that the vehicle would require 10²⁷ ergs at 0.1c, the velocity Crawford uses as a baseline. And the problem here is daunting, for producing the requisite antimatter would demand 200 terawatts sustained over ten years of continuous production. Today’s total instantaneous energy production on Earth is about 20 terawatts.
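For readers who want to check the arithmetic, a short Python sketch reproduces the order of magnitude. The efficiency figure at the end is my own inference from comparing Schneider’s two numbers, not something stated in his paper:

```python
import math

C = 2.998e8                # speed of light, m/s
SECONDS_PER_YEAR = 3.156e7

def kinetic_energy_joules(mass_kg, v_over_c):
    """Relativistic kinetic energy, (gamma - 1) * m * c^2."""
    gamma = 1.0 / math.sqrt(1.0 - v_over_c ** 2)
    return (gamma - 1.0) * mass_kg * C ** 2

# 100-ton (1e5 kg) craft at 0.1c: ~4.5e19 J, i.e. ~4.5e26 erg --
# within a factor of two of the 10^27 erg quoted above (which may
# fold in deceleration or propulsion inefficiency).
ke = kinetic_energy_joules(100_000, 0.1)
print(f"kinetic energy: {ke:.2e} J = {ke * 1e7:.2e} erg")

# Schneider's production budget: 200 TW sustained for ten years.
budget = 200e12 * 10 * SECONDS_PER_YEAR
print(f"production budget: {budget:.2e} J")

# Comparing the two implies an assumed wall-plug-to-antimatter
# conversion efficiency of well under one percent -- my inference,
# not a figure from the paper.
print(f"implied efficiency: {ke / budget:.2%}")
```

The gap between the energy the craft needs and the energy budget for making its fuel is, of course, exactly the production-inefficiency problem Schneider is pointing at.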

But the problem gets trickier still. Schneider doesn’t go into antimatter storage, but listen to what Frank Close says about the issue in his book Antimatter (Oxford University Press, 2009):

The propaganda for antimatter power is that to take a manned spaceship to Mars, the three tonnes of chemical propellant could be reduced to less than a hundredth of a gram of antimatter, the mass of a grain of rice…. However, the promoters of such hopes say less about the weight of the technology that will be required for containing the antimatter. Large numbers of antiprotons or positrons imply a large concentration of electric charge, which has to be contained. To store even one millionth of the amount needed for a Mars trip would require tons of electric force pushing on the walls of the fuel tank…

And so on. These are major issues. We’ve been able to store about a million anti-protons at once thus far, which seems like a big number, but Close points out that it’s a billion billion times smaller than what you would need for a gram. Be aware, too, that since the discovery of the anti-proton in 1955, the total of our anti-proton production is less than a millionth of a gram. None of this is to rule out future advances in antimatter production or collection (we’ve looked at James Bickford’s ideas on antimatter harvesting from space before in these pages). But you can see why Schneider is a skeptic about antimatter as rocket fuel, at least as of now.
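Close’s comparison is easy to verify in a couple of lines of Python (the antiproton mass is the standard value; the “million antiprotons stored” figure comes from the discussion above):

```python
M_ANTIPROTON_G = 1.6726e-24        # antiproton mass in grams (standard value)

per_gram = 1.0 / M_ANTIPROTON_G    # antiprotons making up one gram: ~6e23
stored_at_once = 1e6               # the ~million antiprotons stored so far

ratio = per_gram / stored_at_once
print(f"antiprotons per gram: {per_gram:.2e}")
print(f"shortfall factor:     {ratio:.2e}")  # ~6e17: Close's 'billion billion'
```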

Fusion: A Promise Deferred

The fusion argument divides those who see the idea’s bright promise from those who see its frustrating history. Schneider notes how conceptually simple fusion is, but the simple fact is that seventy years after the basic concept of deuterium/tritium fusion was conceived, we still can’t make it work in any stable production facility. He continues:

The ITER fusion facility is not expected to achieve production energy at a demonstration level before 2030, that is, almost a century after the nuclear fusion concept was invented. The author correctly mentions the developments in miniaturization. As an example, he cites the National Ignition Facility (a similar, less advanced project called « Mega Joule Laser » exists in Europe). But this facility, with all its control and cooling systems, is presently quite a non-miniaturized building. In spite of the fact that presently it will only provide impulsive (non continuous) fusion energy, presently at a slow rate of one impulse per hour, one can imagine that in the future these impulses can be accumulated to provide a sufficient acceleration to the spacecraft. But it requires an initial energy of a few mega joules per 1 nanosecond impulse, and in the spacecraft this energy must come from somewhere.

Schneider is open to ideas, and notes how presumptuous it is to predict what will happen after a century or more, calling for the debate over these issues to go on. In certain respects, I don’t find his views as different from Crawford’s as they might at first appear. While he correctly cites the interstellar dust problem and the danger of high-speed, debilitating collisions with particles, both authors are aware of how much we have to learn about the interstellar medium. As we start accumulating the data we’ll need, we have to take risk evaluation into account.

Calculating the Odds

How? Current missions may launch with somewhere between a 0.1 and 1 percent chance of failure, but we already know that, barring an unforeseen breakthrough, interstellar missions will be incredibly costly, at least in the early going. Reducing the risk is thus mandatory (Schneider would like to see it go to something like one in several thousand), and doing that increases the cost. I don’t imagine Crawford would argue with this, though the two disagree on timelines, with Crawford more optimistic about the near-term, and Schneider arguing that centuries will likely pass before we can speak about a true interstellar probe. Referring to the Technology Readiness Level classification, a key part of risk evaluation, he has this to say:

For interstellar travel, we are at best at level 1 (or even 0.5), while a « Flight proven » mission will realistically require first a precursor mission to secure the technological concept, including shielding mechanisms, at say 500 to 1000 Astronomical Units. As a comparison, I can take the nulling interferometry concept for the infrared detection of exo-Earths. It was invented in the late 70s (Bracewell 1978) and is still not foreseen for a launch by ESA and NASA before 2030, that is, 50 years after the invention of the concept for a mission at least 100 times easier and cheaper than interstellar travel.

The Millennial View

Those with a near-term bent will see a profound gulf between these two views, but I think those of us who are looking long-term will find the debate converging on common themes. Neither of these men rules out interstellar flight on scientific grounds, and both are aware of the huge difficulties that must be overcome for it to occur. If you are an optimist over a given technology, you might believe, for example, that fusion will succeed in revolutionizing space propulsion and thus pave the way for an early mission. But if you’re content with the idea that interstellar flight is going to occur, the question of just which century it occurs in carries a bit less weight.

I don’t, then, find this statement off-putting:

To deserve an interstellar travel mission, an exoplanet will require very solid clues of biosignatures (to quote Willy Benz, « Extraordinary claims require exceptional proofs »). I hope that current radial velocity monitorings will discover the existence of habitable planets around Alpha Cen A or B, and that in the coming decades these planets will reveal solid biosignatures. But what if the nearest planet with credible biosignatures lies at 10 pc? Even at a speed of 0.1c, the travel will last 400 years.

True enough, and if biosignatures do turn out to be this rare, we’ll have to re-evaluate our options (and our motives) if we don’t develop the technologies to make the journey faster than 0.1c. But the issue isn’t whether we can cross ten parsecs. It’s whether interstellar flight as an idea has any merit, and it remains to be seen what mission concepts might one day develop to much closer stars. We’re in that necessary period of evaluation that is, as we speak, opening up new solar systems on a regular basis, and what we find in them will doubtless affect the motivations for a mission.
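For a sense of scale, here is a minimal Python sketch of idealized crossing times at constant cruise speed, with no acceleration or deceleration phases included; that simplification is presumably why Schneider’s round figure of 400 years exceeds the naive result:

```python
LY_PER_PC = 3.26   # light-years per parsec

def crossing_time_years(distance_pc, v_over_c):
    """One-way travel time at a constant cruise speed, ignoring
    acceleration and deceleration phases."""
    return distance_pc * LY_PER_PC / v_over_c

# Schneider's 10-parsec case at a few cruise speeds.
for v in (0.01, 0.1, 0.5):
    print(f"v = {v:4.2f} c -> {crossing_time_years(10, v):7.0f} years")
```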

I’m with Crawford in noting the large number of propulsion options being investigated in the literature, but like Schneider, I wouldn’t want to single out a particular one as the most likely to succeed (although I admit to a continuing interest in beamed lightsail concepts). It’s too early for that, and the excitement of this era in our investigations is precisely that we’re at the entry level, scrambling for ideas, weighing technologies, speculating about future breakthroughs like nanotech that could revolutionize the game. Let the debate continue indeed, and let’s hope we can keep it at the lively level these two scientists have established through their dialogue.

The paper is Schneider, “Reply to ‘A Comment on “The Far Future of Exoplanet Direct Characterization” – the Case for Interstellar Space Probes,’” accepted at Astrobiology (preprint).



  • Ronald October 13, 2010, 9:59

    The quote “Extraordinary claims require exceptional proofs” is said to come from Willy Benz here. However, I have seen this and similar quotes attributed to different people through the years.
    Most notably Carl Sagan’s version, “Extraordinary claims demand extraordinary evidence,” with regard to UFOs as being alien visitors.

    Further, I agree with Schneider, making a very, very small exception in the case of (terrestrial) planets near Alpha Centauri A and/or B.

    If these two don’t possess any suitable planets, the nearest even remotely sunlike stars (Epsilon Eridani and Tau Ceti) are almost 11 and 12 ly away respectively. If the nearest suitable targets are 30 or 40 ly away, still a mere hop in cosmic terms, we have a real problem with any (subluminal) probe.

    For a really good space telescope, such as the New Worlds Imager or FOCUS, this difference in distance will hardly be a challenge.

    As I argued in the previous thread: for a telescope, a 10x greater target distance only requires somewhat enhanced technical specifications plus maybe a modest percentage more cost.
    For an interstellar probe it would be a show-stopper.

  • Mike October 13, 2010, 13:33

    I think you’ve nailed it, Ronald, unless we are endowed with some amazing technological breakthrough.

  • Paul D. October 13, 2010, 16:33

    I would like to see laser propulsion using near-resonance scattering off ions trapped in a magnetic field. The scattering cross section can be very high, and the system could potentially operate at very high power density (since the “sail” is already vaporized.)

    Also: there’s a proposal for dark matter (from Zhitnitsky and colleagues) that would have most dark matter be small chunks of quark antimatter (explaining the baryon asymmetry of the universe by positing that antiquarks were preferentially sequestered into these chunks.) If this theory were correct, there would be large numbers of microscopic antimatter chunks, with masses in the ton range, passing through the solar system all the time.

  • John Freeman October 13, 2010, 16:48

    There is some comfort in the thought that the nearest we may need to come to true interstellar propulsion is an engine that will get us the 700 au to the solar focus. Using that we can get comparable results, in many ways, to a flyby mission for the nearest stars at least.

    The only reservation I have with this idea is that the solar focus allows only for passive scanning. [note at this point I’m going off the deep end a bit] Is there any way the solar focus could be used as the receiving end of an active scanning system?

    As an example of what I mean: the Arecibo radio telescope can be used to probe the poles of the planet Mercury. Could any conceivable source of radiation, or high speed particles, based in our own solar system be used in conjunction with a receiver at the solar focus as an interstellar equivalent? Just a thought.

  • Kenneth Harmon October 13, 2010, 16:49

    Ron and Mike,

    Sadly, I have to agree with both of you once the exception is made for Alpha Centauri. Barring some “physics miracle” over the next century, if there is nothing of interest around Alpha Centauri then any viable “brute force” options are gone (especially if there is a Brown Dwarf on the way). Distance is likely to be the biggest driver over the next few centuries for determining the viability of interstellar travel, since there are “Alpha Centauri ships”, which go out to ~5 ly, and there is everything else. The only real quibble I have with Ron and Mike’s assessment is the mention of Epsilon Eridani, since I can’t for the life of me understand why we would spend all the time and money to go there: it is a very, very young star system, and we are not likely to find much life there yet beyond some basic building blocks. It has planets, but they are young snowballs or toasty rocks. Instead of Epsilon Eridani, Epsilon Indi seems to hold much more promise over the longer term, although it also has a “distance problem”, as does anything beyond ~5 ly of Sol/Terra.

    Bottom line, the best major near-term step to take on the road to interstellar travel, be it “quasi-interstellar travel” to Alpha Centauri or the “real thing” to everywhere else beyond ~5 ly, is to get a multi-disciplinary Interstellar Survey Program launched over the next 10 years to take a detailed look at the Sol/Terra neighborhood out to 60 ly, with special focus on those star systems within 20 ly of Sol/Terra. Once we have this information in hand, and the distance issue better calibrated, then some serious planning and research can be done on interstellar travel beyond the current Icarus effort. If there are planets of interest around Alpha Centauri, and some type of bio-markers are found, then early interstellar travel may become much more viable, since there would be “demand pull” combined with a very hard but “doable” quality to it. In essence, Alpha Centauri is close enough that a way will be found within known physics to get there, although it may not be pretty.

    Real interstellar travel, i.e. travel beyond Alpha Centauri and ~5 ly, is very hard, and there are unlikely to be any shortcuts. In that case Marc Millis’s calculations on energy usage and cost effectiveness become key drivers, and we are not likely to see interstellar travel until the 24th or 25th century at the earliest. In fact, what may happen between now and ~2200 is full exploitation of our Solar System, and several unmanned/manned trips to Alpha Centauri (if anything is there), followed by many centuries before true interstellar travel emerges. However, it may be possible, once humans have reached Alpha Centauri, to actually launch a few probes from there to other surrounding star systems that are farther from Sol/Terra but within ~5 ly of Alpha Centauri. Known physics can be engineered over the next century to push human exploration out to a ~5 ly radius if there is a reason to do so, since a lot could be done with a couple of trillion invested over several decades. The key will be whether there is a driving requirement to do it, and for that answer much more basic data is needed.

  • Adam October 13, 2010, 17:49

    Like I said yesterday – and Paul says today – there’s not a lot of difference between their statements, but there is definitely a more optimistic attitude behind Ian’s. I also think Ian is probably closer to the truth in hoping for more exoplanet targets closer to Sol than the sprinkle of G stars. Red dwarfs shouldn’t yet be discounted in our ignorance of exoplanet options – even if Zarmina is a data chimaera.

  • kurt9 October 13, 2010, 18:43

    I’ve a question for those of you into biology.

    If the Eukaryote had never formed, would prokaryotic photosynthetic bacteria have been able, on their own, to create as oxygen-rich an atmosphere as Earth has today?

  • Adam October 13, 2010, 22:09

    Hi kurt9
    That’s really geobiology, and the current thinking is that rising oxygen levels needed burial of organic matter. A fair amount is buried in the ocean, but a lot more is degraded on the way to the sea floor. Now it seems the emergence of Rodinia before the Cryogenian paved the way for land’s colonization by lichen and the burial of organic matter in the near-shore. Lichens are, of course, a symbiosis of eukaryotic fungi and algae, so oxygenation probably did need some kind of eukaryotic lifeform. But the process was fitful and oxygen only slowly rose, receiving another boost or two as land plants became established and grew huge, leading to the high oxygen levels in the late Carboniferous/Permian. The reformation of a super-continent, this time Pangaea, ironically perhaps encouraged the re-emergence of anoxic deep-sea conditions (dominant through the Proterozoic) and the end-Permian mass-extinction via a deep-sea overturn with release of H2S/CO2 in quantity.

  • djlactin October 14, 2010, 2:43

    Yes. Cyanophytes (prokaryotic) were responsible for the first oxygenation of Earth’s atmosphere. In fact, the chloroplasts of eukaryote plants are “domesticated” cyanophytes.

  • Peter Popov October 14, 2010, 4:24

    The summary of Kenneth Harmon looks asymptotically correct, +/- a few centuries. So, assume a full account of all planets harboring life is done in the next 50 years from our present location. Next, assume no miracle physics is discovered. Finally, assume that some form of propulsion scheme is available at reasonable cost in say, 2400.

    Now can anyone predict what will be important to society in 300 years? Some 300 years ago people were concerned with matters such as the number of heavenly creatures containable on the head of a pin. I think in a matter of decades the technology will be there to synthesize all sorts of artificial critters – dinosaurs, blue elephants, elves, dwarves, dragons, etc. So, by 2400 there will be little to learn from another biosphere. Except if it contains intelligence. Now, present estimates of the nearest distance to ET, based on SETI research, are in the ballpark of 2,000 ly, if memory serves me right. So, to physically make contact you need to get relativistic. 0.3c is not even close.

    To make a present-day analogy, there is a similar situation inside our own neighbourhood. For years, people have been proposing a Mars soil sample return mission (robotic). Arguments are always given for the scientific value of such an enterprise. The original cost was prohibitive. As technology improves, the cost decreases. However, the original questions are either settled, or become irrelevant.

    It should be noted that the only really enduring positive argument throughout human history is the profit argument. Unless there is a clear profit argument, there will be no interstellar travel. If there is a profit argument for humans to spread through the solar system, then maybe, and only maybe, they will slowly diffuse into interstellar space. This process, however, will be a natural one, the complete opposite of an Apollo-style program.

    Now, all proposals I’ve seen on centauri-dreams are for an Apollo-style, extremely expensive program (~4% of US GDP at its peak), of zero practical utility (no direct $ profit, if any indirect). Moreover, what is subconsciously on everybody’s mind (including mine) is manned interstellar travel. The only situation I can think of in which such an enterprise may happen is the extremely unlikely event of a detection of a clearly hostile ET, followed by wide-spread panic and demand for action. It is essential that the hostile nature of such an ET is clear and the dangers imminent, that is, a few centuries into the future. The survival argument is the only other thing which has persisted through our history.

  • Ronald October 14, 2010, 5:37

    Kenneth Harmon: “…Epsilon Eridani since I can’t for the life of me understand why we would spend all the time and money to go there since it is a very very young star system, and we are not likely to find much life there yet beyond some basic building blocks of life”.

    Well, for colonization and terraforming, we do not need planets with full-blown life, maybe better even not. Young terrestrial planets with all the necessary building blocks and a primeval atmosphere may be very attractive targets.

    Other than that I fully agree with your post.

    BTW: Tau Ceti, object of a lot of SF, might not have more than a lot of dust and asteroids, maybe proto-planetoids, a so-called failed planetary system, due to its lack of building materials, i.e. very low metallicity.

    Beyond that, within 20 ly, as reasonable solartype targets:
    40 Eridani A (= Omicron(2) Eridani A)
    Sigma Draconis
    Eta Cassiopeiae A
    82 Eridani
    Delta Pavonis

    All of them, except Delta Pavonis, with rather low metallicity. DP quite high metallicity and gradually evolving off main-sequence into subgiant, getting quite bright.

  • Ronald October 14, 2010, 6:25

    kurt9: from what I have read, the long delay before significant build-up of oxygen in our atmosphere was not primarily due to the lack of eukaryotes, but due to oxygen sinks in the earth’s crust and oceans, particularly iron. Once those sinks got saturated, the Great Oxygenation Event happened about 2.4 gy ago. This was still largely done by prokaryotes (cyanobacteria).

    But if anyone wants to correct me on this, please do so.

  • kurt9 October 14, 2010, 12:50

    Guys, thanks.

    Let me change my question a little bit. Let’s say we have an “Earth-like” planet, but one that has only cyanobacteria (prokaryotes) and no other life at all. Assuming all of the other geobiological processes (accumulation of “dead” cyanobacteria on the ocean floor, etc.) occur as they did on Earth, would cyanobacteria on their own be sufficient to produce a 20% oxygen environment like what we have on Earth, without eukaryotes at all?

  • Eniac October 14, 2010, 20:06

    I have exactly the same impression as Ronald, from what I have read. In answer to kurt9, it seems land plants might have accelerated things, but cyanobacteria ought to have been sufficient to eventually create an oxygen atmosphere, depending on how much carbon burying goes on at sea. I think this meshes with Adam’s view, which appears to be the most informed here.

    My personal theory is that oxygen concentration is limited by wildfire, and thus is indirectly, but tightly, related to humidity and finally temperature. Hotter climate means greater humidity means less flammability means more oxygen. I have not been able to find any data or even opinion to back this up, though.

  • kurt9 October 14, 2010, 22:42

    You guys can guess where I’m going with this.

    Based on endosymbiosis theory and particularly the hydrogen hypothesis, I think the evolution of the Eukaryote is so unlikely an event that the Earth is the only example of it in this galaxy. If so, the availability of habitable planets is going to depend on the ability of prokaryotic photosynthetic life to generate enough free oxygen that we can breathe it without special apparatus.

    If prokaryotes can do this, then there are likely lots of habitable planets in the galaxy awaiting us. If not, then there’s not.

  • Eniac October 15, 2010, 13:56

    @Kurt9: I realize that I am probably missing a book I should have read, but I would like to seriously challenge the notion that the hydrogen hypothesis (which seems reasonable) somehow entails that the evolution of Eukaryotes is unlikely.

    Endosymbiosis is common in nature, and happens all the time. What is so special about Eukaryotes?

    Thanks for any clarification. Wikipedia did not answer this for me, last I checked.

  • kurt9 October 15, 2010, 15:21


    I recommend “Power, Sex, and Suicide” by Nick Lane for a detailed explanation of why endosymbiosis (via the hydrogen hypothesis) happened only once in the history of the Earth.

  • Eniac October 15, 2010, 17:44

    I grow suspicious when I am required to “go read the sacred text” if all I need is a sentence or two in answer to a specific question. Anyway, I’ll read it when I come across it and have some time….

  • Ronald October 17, 2010, 7:10

    kurt9: from what I read and think I understood, most of the Great Oxygenation Event (GOE) was done by cyanobacteria (blue-green algae) from about 2.4 gy onward. Because the atmospheric O2 level remained low (about 3 – 4%) for a very long time, until about 0.8 gy ago, the rise of it has often been attributed to the appearance of eukaryotes. However the first eukaryotes started appearing about 1.6 – 2.1 gy ago already (various algae).
    Therefore, the relatively low O2 levels of most of earth’s and life’s history is probably due to the slow and gradual saturating of the oxygen sinks in the crust and oceans.
    Nevertheless, it made me realize that the implications are the same and rather drastic with regard to biosignatures: for a very considerable part of its history a planet may have life present without it being detectable by means of abundant O2, simply because at first this life may be anaerobic, and then even with aerobic life, the O2 levels may remain low for long periods because of the oxygen sinks. Here on earth not until after some 3 gy (!) did atmospheric O2 level rise significantly (i.e. between 0.6 and 0.8 gy ago). For over 1 gy before that O2 level was some 3 – 4%.
    However, even a 3% O2 level is a good indicator of photosynthetic life, since O2 is so reactive.
    Technical challenge is now to detect such low O2 levels.

  • Eniac October 17, 2010, 12:14

    One particular endosymbiotic event rivaling the origin of the mitochondrion in importance is the uptake of a cyanobacterium to become the chloroplast of green plants. I am still hoping for a brief explanation of what about the mitochondrial genesis supports the quaint notion that it was a “once in the universe” type of event.

  • spaceman October 18, 2010, 1:34


    I think interstellar arks are much more technically feasible than relativistic rockets, based on near-term extrapolations. These behemoth spaceships would take hundreds of years to reach Alpha Centauri and thousands to reach a hypothetical planet lying ~10 pc from Sol. Calculations: ark travel time to Alpha Centauri: 4.36 ly / 0.01c = 436 years; ark travel time to a hypothetical planet at 10 pc: 32.6 ly / 0.01c = 3,260 years.

    Some questions for the group:

    1. If they were to launch an ark to Alpha Centauri, how many of you would sign up for the trip?

    2. What technology could get a fairly large ark to travel at 1% the speed of light?

  • Ronald October 18, 2010, 15:58

    Spaceman: I like your poll in question nr. 1.


    – If it were a fast trip, say 50% of lightspeed, travel time about 8 years in combination with hibernation/suspended animation, survival chance >= 90%: I would definitely sign up.

    – If it were a somewhat fast trip, say 10% of lightspeed, travel time about 40 years in combination with hibernation/suspended animation and life extension (preventing me from arriving at the biological age of 88), survival chance >= 90%: I might sign up (50/50).

    – If it were a slow trip in an ark, say 1% of lightspeed, travel time about 400 years, generational thing i.e. no hibernation/suspended animation, no life extension, i.e. I will never get there myself: count me out.

  • Duncan Ivry October 19, 2010, 16:59

    @ Spaceman

    Enter an ark to Alpha Centauri: only if I’m desperate.

  • pier makanda October 22, 2010, 2:34

    I am only a layman and I fear my question must be totally ignorant because I haven’t seen it considered here, but what about the possibility of wormhole travel i.e. that we could get to anywhere in the Universe instantaneously? Do I just read too much sci-fi?

  • Paul Gilster October 22, 2010, 7:59

    pier, wormholes are definitely in the realm of science fiction, but they’re also being investigated. Let me quote Marc Millis in a recent article:

    “Before 1988, traversable wormholes were considered merely make-believe. And before 1994, warp drives were seen as impossible science-fiction plot devices. These challenges have now matured into normal scientific discourse and even appear as homework problems in general-relativity textbooks.

    “Rather than attempting to break the light speed limit through spacetime, these theoretical approaches manipulate spacetime itself to create shortcuts (wormholes) or to move ‘bubbles’ of spacetime (warp drives). The rate at which spacetime can move is inferred from the faster-than-light expansion that physicists say occurred just after the big bang.

    “The contentious issues are the implications of time travel, the magnitude of energy required, and the assertion that the energy must be “negative.” Although negative energy states are observed in nature, there are unresolved debates regarding how much energy these states can hold, and for how long. Another recent conclusion is that wormholes appear to be theoretically more energy-efficient than warp drives.”

    Here’s the link:


  • Duncan Ivry October 22, 2010, 20:54

    @ pier makanda and @ Paul Gilster

    Sorry, with respect to wormholes and warp drive space, I disagree. Those are totally speculative, and there has been no serious proposal — not in the far distance — how to construct — really construct and not handle merely theoretically — such “things”; or, at least, a proposal how to do a physical experiment by which space is warped the way *those* theories suggest — not even this (I emphasize *those* because space can indeed be warped, when we bring some mass along, but that’s only Einstein’s gravity theory). That all these “challenges have now matured into normal scientific discourse” is only an opinion.

  • hakgum kim October 22, 2010, 22:12

    A spacecraft expedition should be composed of at least three ships.

    One ship is very dangerous.

    On a long journey, the failure of one important part would make everything in vain.

    Like Columbus’s sailing.

  • Paul Gilster October 23, 2010, 8:58

    Re wormholes, Duncan Ivry writes:

    That all these “challenges have now matured into normal scientific discourse” is only an opinion.

    We’re certainly not in the realm of engineering studies, but that wormhole physics has entered the discussion is undeniable. The two papers that placed it there appeared in 1988, the work of Kip Thorne and (then) graduate students Michael Morris and Uli Yurtsever:

    Morris, M. S. and Thorne, K. S., “Wormholes in spacetime and their use for interstellar travel: A tool for teaching general relativity,” American Journal of Physics, Vol. 56, 1988, pp. 395-412.

    Morris, M. S., Thorne, K. S. and Yurtsever, U., “Wormholes, time machines, and the weak energy condition,” Physical Review Letters, Vol. 61, 1988, pp. 1446-1449.

    The major focus here was the stability of wormholes and the energy required to create and maintain them, found in these papers to be an amount of negative energy comparable to the mass of the planet Jupiter (for a wormhole of one-meter radius).

    Matt Visser’s book-length treatment appeared in 1995:

    Visser, M., Lorentzian Wormholes: From Einstein to Hawking, AIP Press, New York, 1995.

    The astronomical signature of a wormhole was discussed in this paper:

    Cramer, J., Forward, R. L., Morris, M., Visser, M., Benford, G., and Landis, G.L., “Natural wormholes as gravitational lenses,” Physical Review D, Vol. 51, 1995, pp. 3117–3120.

    And Eric Davis has probed wormhole physics in this study:

    Davis, E., “Advanced Propulsion Study,” AFRL-PR-ED-TR-2004-0024, Air Force Research Laboratory, Edwards Air Force Base, p. 65, April 2004.

    Wormholes are a small and highly theoretical niche, but the idea that they are not under scrutiny in the literature is incorrect. Frontiers of Propulsion Science (2009) contains an overview of wormhole research with further references.

  • Marc G Millis October 23, 2010, 13:16

    Ideas mature through different levels as they march toward fruition, and each of us has a threshold beyond which we will not (or cannot) consider them; even me. For you, obviously, the space-warping notions are beyond that threshold, and hence you can focus your attention on approaches closer to fruition. Those nearer fields are also in need of serious progress.

    That is not to say, however, that everyone shares, or should share, that same threshold. Space-warping for space travel is indeed discussed in peer-reviewed scientific journals and books, even though it might still turn out to be impossible. Gilster’s list is just a sampling. Progress is not made by giving up. Others are finding value in pursuing these notions. As a minimum example of their utility, the space-warping notions have provided a deeper understanding of the subtleties and limits of general relativity, such as the energy conditions, causality, and unsolved issues regarding the nature of empty space.

    Rather than trying to get everyone to agree on which prospects are best, I would much prefer that each of us pick the approaches that resonate most with us as individuals, and then work to improve the prospects of our chosen approaches. Given that we are still a long way off from launching anything, we’ve all got some time to make real progress in our chosen interests without needing to agree on the ‘best’ overall approach.

    Discouraging notions because they are beyond some personal threshold does not add value. Providing pointed critiques targeted at the genuine edge of knowledge does add value. Improving the state of the art of a chosen approach is even better.

    Personally, the goal that resonates with me is to figure out the nature of inertial frames as it pertains to non-propellant space drives. Even with this focus, I encourage all of you to pursue whatever approach you can make the most progress with, from the seemingly simple solar sail to the seemingly impossible faster-than-light travel, and anything in between.

    Ad astra incrementis!

  • Kenneth Harmon October 23, 2010, 17:18

    Two quick questions. The first for Marc Millis. There seems to be extensive debate in the posts above about Warp Drive and Wormholes. How about the issue of Hyper-Space. Is there much serious discussion in Scientific Journals about the potential existence of Hyper-Space and travel within it as something distinct from Wormholes and Warp Drive?

    The second question is for Ron and anybody else inclined to answer. Why is there no significant discussion of Epsilon Indi as a likely potential star system near Sol/Terra where life might be found? There is now strong evidence that Epsilon Indi is a much older star system than was once believed, and yet it does not seem to make anybody’s short list even though it is less than 11 light years from Sol/Terra. What am I missing about Epsilon Indi that keeps it from being a very good candidate?

    Thank You

  • Paul Gilster October 23, 2010, 18:24

    No planets found yet around Epsilon Indi (some speculation about a gas giant), though I believe there’s an interesting binary brown dwarf in the system. Like you, I’m also interested in what others think about Epsilon Indi.

  • Ronald October 23, 2010, 19:41

    Kenneth: the reason why I never include Epsilon Indi in my lists of promising solar-type candidates is that it is a K5 star with a luminosity of only 0.17 solar. This means that the midpoint of the habitable zone for a terrestrial planet would be just within 0.5 AU, where tidal locking becomes a problem within a few (approx. 3–4) Gyr.

    I usually consider K2/K3 or 0.2 – 0.25 solar luminosity the minimum for a solar type star.
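    The arithmetic behind that 0.5 AU figure is just the inverse-square law: the distance receiving Earth-equivalent stellar flux scales as the square root of the luminosity. A quick sketch (ignoring spectral corrections to the habitable zone):

```python
import math

def hz_midpoint_au(luminosity_solar):
    """Distance in AU receiving Earth-equivalent flux from a star
    of the given luminosity (in solar units): d = sqrt(L / Lsun)."""
    return math.sqrt(luminosity_solar)

print(hz_midpoint_au(0.17))  # Epsilon Indi: ~0.41 AU
```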

  • Duncan Ivry October 24, 2010, 19:38

    Paul Gilster and Marc Millis, thank you both for your answers.

    The point was whether wormholes and warp drive space have “matured into normal scientific discourse”, and *not* whether they “entered the discussion” and are “under scrutiny in the literature”. I recognize very well the subtle but substantial difference. And I do not agree that my statements are based only on some personal threshold beyond which I will not or cannot consider an idea. I insist on having *reasons*.

    Giving information about the shortcomings of some approaches is useful. Being encouraging or discouraging is of minor importance, whereas allocating scarce resources — human knowledge and skills, and money — is much more important.

    If you put too many resources into highly speculative discussions, you run a high risk of achieving nothing, except having talked nicely. Merely theoretical physical research without experiments — the state of (e.g.) wormhole research — has to answer some critical questions. That’s how good science works. This should encourage the researcher to do better work.

    I recommend reading “Advanced Propulsion Study” by Eric W. Davis — Paul, you mentioned it above (google “Advanced Propulsion Study Davis”). It’s rather readable even for physics lay people, and gives you a good overview about many concepts (and it contains figures). It’s a “literature and program search to carry out and document a technical assessment of the latest concepts …”

    The “Alcubierre Warp Drive Effect” is only described briefly and “will not be considered further in the study” — it is not assessed.

    The topic “Traversable Wormholes” is handled a little more broadly. “The author (Davis, 2004)” — a study titled “Teleportation Physics Study” — “reviewed and summarized few schemes that might allow one to engineer a traversable wormhole in the lab.” Where is the technical assessment? Nowhere. And engineering in the lab? “Engineering the Spacetime Metric” (likewise “Engineering the Vacuum”) consists of merely theoretical formulae. Nothing that allows engineering.

    Davis cites one theoretical study after another, including his own studies, where purely theoretical statements are made, where purely speculative devices are proposed, e.g. “nuclear explosion magnetic compression or ultrahigh-intensity tabletop lasers to explore the possibility of creating a wormhole in the lab.” Did someone perform experiments? As far as I can tell: No. And even if someone does, there is always the risk of not creating something … er … looking like a wormhole.

    Davis did “probe” wormhole physics only in the sense of piling up formulae and references, supplemented with some incorrect statements. It’s similar with many pertinent publications I looked at.

    You may remember, several times I said: At least *propose* a physical experiment! Scanning literature, you will find a lot of “proposals”, but not of what a professional physicist would call an experiment.

    There are *reasons* for calling all this highly speculative, merely theoretical, non-mature, non-normal … well, okay, physics. But it’s not a case of a personal threshold level of being able to accept something.

  • Marc G Millis October 24, 2010, 21:47

    Kenneth Harmon;
    “Hyperspace” is a term that is used in general relativity, but it does not mean the same thing as the fictional spaceflight notion of hyperspace. I have not yet run across a scientific article dealing with the notion of another set of dimensions with different speed limits, though to be fair, I haven’t hunted for that sort of thing either. I’m not sure what search terms would even separate it from the terms hyperspace (and hyper-surface) as used in general relativity. If it’s out there, I’m sure someone will bring it to our attention.


  • Dan M. November 12, 2010, 3:22

    I don’t know what system planet “e” of Gliese 581 is in, but I believe it is 20 light years away. It is my understanding that this planet (“e”) is the most “Earth-like” that has been discovered. I was just wondering why it has not been mentioned in any of the previous posts?

  • Paul Gilster November 12, 2010, 8:31

    Dan M., the planet you’re talking about is Gliese 581g, not e. It was considered to be more ‘Earth-like’ than any other planet found, in that it seemed to be in the habitable zone of its star, an M-class red dwarf. But it now appears that the discovery is questionable, as higher resolution data from the HARPS spectrograph finds no trace of the planet. We’ve had quite a few discussions of the planets around Gliese 581 — to find them, plug the term ‘gliese’ into the search engine here.

  • Dan M. November 12, 2010, 17:26

    Paul Gilster,
    Thank you for taking the time to respond to my question. I was linked to this site from another one and really enjoyed this one.

  • Paul Gilster November 12, 2010, 21:15

    My pleasure, Dan. Feel free to have a look around and I hope you’ll become a regular.

  • Mephane July 14, 2011, 4:17

    There is also another issue some people tend to forget. It’s what I call the “Googolplex Analogy”.

    Someone once estimated that if you wrote a computer program to print the number googolplex in decimal (i.e. 10000…) on screen and started it now, then for the next couple of hundred years each generation of computers would be so much faster that the running program would inevitably be overtaken by a copy started later on a newer machine, and that one overtaken again in turn, simply because the computation takes so long.

    I think the same applies to interstellar travel. A probe sent out in the year 2200 with a speed of 0.1c would easily be able to overtake any of the Voyager probes. A probe sent in the year 2300 with 0.2c would overtake that one, too. A manned space ship with warp drive sent out in the year 3000 (just speculating) could go out and collect and bring back all of the probes (assuming they are able to precisely locate them still).
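    The overtake date is easy to estimate if we assume constant speeds and a straight-line course (a toy model, nothing more):

```python
def overtake_year(t1, v1, t2, v2):
    """Year in which a probe launched at t2 with speed v2 (in units of c)
    catches a probe launched earlier at t1 with slower speed v1.
    Solves v1 * (t - t1) = v2 * (t - t2) for t."""
    assert t2 > t1 and v2 > v1, "second probe must be later and faster"
    return (v2 * t2 - v1 * t1) / (v2 - v1)

t = overtake_year(2200, 0.1, 2300, 0.2)
print(t, 0.1 * (t - 2200))  # overtake year, and distance out in light years
```

    With those numbers the later probe catches the first around the year 2400, roughly 20 light years out, still well short of many interstellar targets; which is exactly the point.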

    Send out a multi-generation space ship to a habitable planet, and by the time the crew arrives they might find it already fully colonized by humans who started the same journey later, with a much faster ship.

    Now that I think about it, there surely must be a novel about exactly this, heh.

  • Paul Gilster July 14, 2011, 14:52

    A.E. van Vogt’s ‘Far Centaurus’ is one of many science fiction treatments of this idea. You might also find one of our earlier discussions of the issue interesting: