Centauri Dreams

Imagining and Planning Interstellar Exploration

A Machine-Driven Way to the Stars

Are humans ever likely to go to the stars? The answer may well be yes, but probably not if we’re referring to flesh-and-blood humans aboard a starship. That’s the intriguing conclusion of Keith Wiley (University of Washington), who brings his background in very large computing clusters and massively parallel image data processing to bear on the fundamental question of how technologies evolve. Wiley thinks artificial intelligence (he calls it ‘artificial general intelligence,’ or AGI) and mind-uploading (MU) will emerge before other interstellar technologies, thus disrupting the entire notion of sending humans and leading us to send machine surrogates instead.

It’s a notion we’ve kicked around in these pages before, but Wiley’s take on it in Implications of Computerized Intelligence on Interstellar Travel is fascinating because of the way he looks at the historical development of various technologies. To do this, he has to assume there is a correct ‘order of arrival’ for technologies, and goes to work investigating how that order develops. Some inventions are surely prerequisites for others (the wheel precedes the wagon), while others require an organized and complex society to conceive and build the needed tools.

Some technologies, moreover, are simply more complicated, and we would expect them to emerge only later in a given society’s development. Among the technologies needed to get us to the stars, Wiley flags propulsion and navigation as the most intractable. We might, for example, develop means of suspended animation, and conquer the challenges of producing materials that can withstand the rigors and timeframes of interstellar flight. But none of these are useful for an interstellar mission until we have the means of accelerating our payload to the needed speeds. AGI and MU, in his view, have a decided edge in development over these technologies.

Researchers report regularly on steady advances in robotics and AI, and many are even comfortable speculating on AGI and MU. There is wide disagreement on such matters, to be sure, but the presence of ongoing research and regular discussion of these technologies demonstrates that their development is well under way. On the other hand, no expert in any field is predicting that construction of the first interstellar spaceships will commence in a comparable time frame. DARPA’s own call to action is a 100-year window, and rightfully so.

Wiley is assuming no disruptive breakthroughs in propulsion, of course, and relies on many of the methods we have long discussed on Centauri Dreams, such as solar sails, fusion, and antimatter. All of these are exciting ideas that are challenged by the current level of our engineering. In fact, Wiley believes that the development of artificial general intelligence, mind uploading and suspended animation will occur decades to over a century before the propulsion conundrum is resolved.

Consequently, even if suspended animation arrives before AGI and MU — admittedly, the most likely order of events — it is still mostly irrelevant to the discussion of interstellar travel since by the time we do finally mount the first interstellar mission we will already have AGI and MU, and their benefits will outweigh not just a waking trip, but probably also a suspended animation trip, thus undermining any potential advantage that suspended animation might otherwise offer. For example, the material needs of a computerized crew grow as a slower function of crew size than those of a human crew. Consider that we need not necessarily send a robotic body for every mind on the mission, thus vastly reducing the average mass per individual. The obvious intention would be to manufacture a host of robotic bodies at the destination solar system from raw materials. As wildly speculative as this idea is, it illustrates the considerable theoretical advantages of a computerized over a biological crew, whether suspended or not. The material needs of computerized missions are governed by a radically different set of formulas specifically because they permit us to separate the needs of the mind from the needs of the body.

We could argue about the development times of various technologies, but Wiley is actually talking relatively short-term, saying that none of the concepts currently being investigated for interstellar propulsion will be ready any earlier than the second half of this century, if then, and these would only be the options offering the longest travel times compared to their more futuristic counterparts. AGI and MU, he believes, will arrive much earlier, before we have in hand not only the propulsion and navigation techniques we need but also the resolution of issues like life-support and the sociological capability to govern a multi-generational starship.

The scenario assumes not that starflight is impossible, nor that generation ships cannot be built. It simply assumes that when we are ready to mount a genuine mission to a star, it will be obvious that artificial intelligence is the way to go, and while Wiley doesn’t develop the case for mind-uploading in any detail because of the limitations of space, he does argue that if it becomes possible, sending a machine with a mind upload on the mission is the same as sending ourselves. But put that aside: Even without MU, artificial intelligence would surmount so many problems that we are likely to deploy it long before we are ready to send biological beings to the stars.

Whether mediated by human or machine, Wiley thinks moving beyond the Solar System is crucial:

The importance of adopting a realistic perspective on this issue is self-evident: if we aim our sights where the target is expected to reside, we stand the greatest chance of success, and the eventual expansion of humanity beyond our own solar system is arguably the single most important long-term goal of our species in that the outcome of such efforts will ultimately determine our survival. We either spread and thrive or we go extinct.

If we want to reach the stars, then, Wiley’s take is that our focus should be on the thorny issues of propulsion and navigation rather than life support, psychological challenges or generation ships. These will be the toughest nuts to crack, allowing us ample time for the development of computerized intelligence capable of flying the mission. As for the rest of us, we’ll be vicarious spectators, which the great majority of the species would be anyway, whether the mission is manned by hyper-intelligent machines or actual people. Will artificial intelligence, and especially mind uploading, meet Wiley’s timetable? Or will they prove as intractable as propulsion?

The SN 1987A Experiment

If neutrinos really do travel at a velocity slightly higher than the speed of light, we have a measurement that challenges Einstein, which accounts for the intense interest in the results from CERN that we discussed on Friday. I think CERN is taking exactly the right approach in treating the matter with caution, as in this statement from a Saturday news release:

…many searches have been made for deviations from Einstein’s theory of relativity, so far not finding any such evidence. The strong constraints arising from these observations make an interpretation of the OPERA measurement in terms of modification of Einstein’s theory unlikely, and give further strong reason to seek new independent measurements.

And this is followed up by a statement from CERN research director Sergio Bertolucci:

“When an experiment finds an apparently unbelievable result and can find no artifact of the measurement to account for it, it’s normal procedure to invite broader scrutiny, and this is exactly what the OPERA collaboration is doing, it’s good scientific practice. If this measurement is confirmed, it might change our view of physics, but we need to be sure that there are no other, more mundane, explanations. That will require independent measurements.”

All this is part of the scientific process, as data are sifted, results are published, and subsequent experiments either confirm or question the original results. I’m glad to see that the supernova SN 1987A has turned up here in comments to the original post. The supernova, which exploded in February of 1987 in the Large Magellanic Cloud, was detected by the “Kamiokande II” neutrino detector in the Kamioka mine in Japan. It was also noted by the IMB detector located in the Morton-Thiokol salt mine near Fairport, Ohio and the ‘Baksan’ telescope in the North Caucasus Mountains of Russia.

Neutrinos scarcely interact with matter, which means they escape an exploding star more quickly than photons, something the SN 1987A measurements confirmed. But SN 1987A is 170,000 light years away. If neutrinos moved slightly faster than the speed of light, they would have arrived at the Earth years — not hours — before the detected photons from the supernova. The 25 detected neutrinos were a tiny fraction of the total produced by the explosion, but their timing matched what physicists believed about their speed. The OPERA result, in other words, is contradicted by an experiment in the sky, and we have a puzzle on our hands, one made still more intriguing by Friday’s seminar at CERN, where scientists like Nobel laureate Samuel Ting (MIT) congratulated the team on what he called an ‘extremely beautiful experiment,’ one in which systematic error had been carefully checked.
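The arithmetic behind that “years, not hours” point is easy to check. Here is a minimal sketch in Python, assuming (purely for illustration) that the neutrinos carried OPERA’s reported fractional speed excess of about 2.5 × 10⁻⁵ all the way from the Large Magellanic Cloud:

```python
# Back-of-envelope check of the SN 1987A argument, assuming the neutrinos'
# fractional speed excess matches OPERA's reported ~2.5e-5.
distance_ly = 170_000          # distance to SN 1987A in light years
excess = 2.5e-5                # assumed (v - c) / c

# Light takes 170,000 years to cover the distance; a particle faster by
# the fraction `excess` arrives earlier by (d/c) * excess / (1 + excess).
early_arrival_years = distance_ly * excess / (1 + excess)
print(f"Neutrinos would lead the photons by ~{early_arrival_years:.1f} years")
```

That four-year lead is what the 1987 detections rule out: the neutrinos arrived only hours ahead of the light, as expected from the physics of the explosion itself.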

Image: In February 1987, light from the brightest stellar explosion seen in modern times reached Earth — supernova SN1987A. This Hubble Space Telescope image from the sharp Advanced Camera for Surveys taken in November 2003 shows the explosion site over 16 years later. Supernova SN1987A lies in the Large Magellanic Cloud, a neighboring galaxy some 170,000 light-years away. That means that the explosive event – the core collapse and detonation of a star about 20 times as massive as the Sun – actually occurred 170,000 years before February 1987. Credit: P. Challis, R. Kirshner (CfA), and B. Sugerman (STScI), NASA.

It’s true that OPERA was working with a large sample — some 16,000 neutrino interaction events — but skepticism remains the order of the day, because as this New Scientist story points out, there is potential uncertainty in the neutrinos’ departure time, there being no neutrino detector at the CERN end. As for the GPS measurements, New Scientist labels them so accurate that they could detect the drift of the Earth’s tectonic plates. Can we still tease out a systematic error from the highly detailed presentation and paper produced by the CERN researchers? They themselves are cautious, as the paper makes clear:

Despite the large significance of the measurement reported here and the stability of the analysis, the potentially great impact of the result motivates the continuation of our studies in order to investigate possible still unknown systematic effects that could explain the observed anomaly. We deliberately do not attempt any theoretical or phenomenological interpretation of the results.

A prudent policy. Let’s see what subsequent experiments can tell us about neutrinos and their speed. The paper is The OPERA Collaboration, “Measurement of the neutrino velocity with the OPERA detector in the CNGS beam,” available as a preprint.

On Neutrinos and the Speed of Light

If you’re tracking the interesting news from CERN on neutrinos moving slightly faster than the speed of light, be advised that there is an upcoming CERN webcast on the matter at 1400 UTC later today (the 23rd). Meanwhile, evidence that the story is making waves is not hard to find. I woke up to find that my local newspaper had a headline — “Scientists Find Signs of Particles Faster than Light” — on the front page. This was Dennis Overbye’s story, which originally ran in the New York Times, but everyone from the BBC to Science Now is hot on the trail of this one.

The basics are these: A team of European physicists has measured neutrinos traveling from the particle accelerator at CERN to the facility beneath the Gran Sasso in Italy — about 730 kilometers — and arriving about 60 nanoseconds sooner than light would have made the journey. The measured speed is about 0.0025 percent (2.5 parts in a hundred thousand) greater than the speed of light, a tiny deviation, but one of obvious significance if confirmed. The results are being reported by OPERA (Oscillation Project with Emulsion-Tracking Apparatus), a group led by physicist Antonio Ereditato (University of Bern).

Neutrinos are nearly massless subatomic particles that definitely should not, according to Einstein’s theory of special relativity, be able to travel faster than light, which accounts for the explosion of interest. According to this account in Science Now, the OPERA team measured roughly 16,000 neutrinos that made the trip from CERN to the detector, and Ereditato is quoted as saying that the measurement itself is straightforward: “We measure the distance and we measure the time, and we take the ratio to get the velocity, just as you learned to do in high school.” The measurement has an uncertainty of 10 nanoseconds.
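The distance-over-time ratio Ereditato describes takes only a few lines to sketch. The figures below are the approximate published ones (a roughly 730-kilometer baseline and a 60-nanosecond early arrival); this is a back-of-envelope illustration, not the collaboration’s actual timing analysis:

```python
# Sketch of the distance-over-time comparison using the approximate
# published figures; not OPERA's actual analysis.
c = 299_792_458.0              # speed of light, m/s
baseline_m = 730e3             # CERN to Gran Sasso, metres (approx.)
early_s = 60e-9                # neutrinos arrive ~60 ns early

light_time = baseline_m / c            # ~2.44 milliseconds
neutrino_time = light_time - early_s
fractional_excess = (light_time - neutrino_time) / neutrino_time
print(f"(v - c)/c ~ {fractional_excess:.2e}")   # roughly 2.5e-5
```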

It’s hard to do any better than Ereditato himself when bringing caution to these findings. Let me quote the Science Now story again:

…even Ereditato says it’s way too early to declare relativity wrong. “I would never say that,” he says. Rather, OPERA researchers are simply presenting a curious result that they cannot explain and asking the community to scrutinize it. “We are forced to say something,” he says. “We could not sweep it under the carpet because that would be dishonest.”

And the BBC quotes Ereditato to this effect: “My dream would be that another, independent experiment finds the same thing. Then I would be relieved.” One reason for the relief would be that other attempts to measure neutrino speeds have come up with results consistent with the speed of light. Is it possible there was a systematic error in the OPERA analysis that gives the appearance of neutrinos moving faster than light? The timing is obviously exquisitely precise and critical for these results, and a host of possibilities will now be investigated.

This paragraph from a NatureNews story is to the point:

At least one other experiment has seen a similar effect before, albeit with a much lower confidence level. In 2007, the Main Injector Neutrino Oscillation Search (MINOS) experiment in Minnesota saw neutrinos from the particle-physics facility Fermilab in Illinois arriving slightly ahead of schedule. At the time, the MINOS team downplayed the result, in part because there was too much uncertainty in the detector’s exact position to be sure of its significance, says Jenny Thomas, a spokeswoman for the experiment. Thomas says that MINOS was already planning more accurate follow-up experiments before the latest OPERA result. “I’m hoping that we could get that going and make a measurement in a year or two,” she says.

Unusual results are wonderful things, particularly when handled responsibly. The OPERA team is making no extravagant claims. It is simply putting before the scientific community a finding that even Ereditato calls a ‘crazy result,’ the idea being that the community can bring further resources to bear to figure out whether this result can be confirmed. Both the currently inactive T2K experiment in Japan, which directs neutrinos from its facility to a detector 295 kilometers away, and a neutrino experiment at Fermilab may be able to run tests to confirm or reject OPERA’s result. A confirmation would be, as CERN physicist Alvaro de Rujula says, ‘flabbergasting,’ but one way or another, going to work on these findings is going to take time, and patience.

The paper “Measurement of the neutrino velocity with the OPERA detector in the CNGS beam” is now up on the arXiv server (preprint).

Addendum: For an excellent backgrounder on neutrino detection and the latest measurements, replete with useful visuals, see Starts With a Bang. Thanks to @caleb_scharf for the tip.

And this comment from a new Athena Andreadis post is quite interesting:

If it proves true, it won’t give us hyperdrives nor invalidate relativity. What it will do is place relativity in an even larger frame, as Einsteinian theory did to its Newtonian counterpart. It may also (finally!) give us a way to experimentally test string theory… and, just maybe, open the path to creating a fast information transmitter like the Hainish ansible, proving that “soft” SF writers like Le Guin may be better predictors of the future than the sciency practitioners of “hard” SF.

Exoplanet Discoveries via PC

Because the financing for missions like Kepler is supported by tax dollars, it’s gratifying to see the public getting actively involved in working with actual data from the Kepler team. That’s what has been going on with the Planet Hunters site, where 40,000 users from a wide variety of countries and backgrounds have been analyzing what Kepler has found. Planet hunter Debra Fischer (Yale University), a key player in the launch of Planet Hunters, has this to say:

“It’s only right that this data has been pushed back into the public domain, not just as scientifically digested results but in a form where the public can actively participate in the hunt. The space program is a national treasure — a monument to America’s curiosity about the Universe. It is such an exciting time to be alive and to see these incredible discoveries being made.”

So far, so good on the citizen science front. Using publicly available Kepler data, Planet Hunters has found two new planets, both of them discarded initially by the Kepler team for a variety of technical reasons. Fischer believes the odds on the detections being actual planets are 95 percent or higher. The candidate planets have periods of 10 and 50 days, and radii from two and a half to eight times that of the Earth. One of them is conceivably a rocky world, though not in the habitable zone. Several dozen Planet Hunters users had spotted the planet candidates.

Image: One of the tutorial figures explaining how to use Planet Hunters at the site. The time it takes a planet to complete one orbit is called the orbital period. For transiting planets, this can be determined by counting the number of days from one transit to the next. Planets in longer period orbits will be more challenging to detect, both for humans and for computers because a transit will not appear in every 30-day set of light curve data. Large planets with short orbital periods are the easiest ones to detect. The most challenging detections will be small planets with long orbital periods. These will require patience and care, but are the real treasures in the Kepler data. Credit: Planet Hunters.
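The tutorial’s point about counting days between transits amounts to very little code. A toy sketch, with entirely made-up transit times:

```python
# Toy illustration of the tutorial's point: the orbital period is just the
# spacing between successive transits. The times below are invented.
transit_days = [3.2, 13.4, 23.6, 33.8]   # hypothetical transit midpoints (days)
gaps = [b - a for a, b in zip(transit_days, transit_days[1:])]
period = sum(gaps) / len(gaps)            # average gap = orbital period
print(f"Estimated orbital period: {period:.1f} days")
```

Note how a planet with a period longer than a single 30-day data set would leave at most one transit per set, which is exactly why long-period worlds are the hard cases the caption describes.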

Following up the detection, astronomers used the Keck Observatory to study the host stars. A new study is to be published on the discoveries in the Monthly Notices of the Royal Astronomical Society, marking the first time the public has used NASA space mission data to find planets around other stars. So while the heavy lifting continues to be done by the Kepler team itself, public science has proven to be a helpful supplement, bringing more eyes to the data at hand. And, of course, the next round of Kepler data provides just as intriguing a hunting ground.

When it began, Planet Hunters was described as a bet on the ability of humans to beat computers, at least occasionally, because of the way people can use pattern recognition. The Kepler team uses computer algorithms fine-tuned to analyze light curve data because of the sheer number of stars the mission is working with. But while computers excel at finding what they are trained to find, the potential for surprise is always there as tens of thousands of users put pattern recognition to work to examine light curves, track down anomalies, and pay close attention to transit signals. For this kind of analysis, using powerful computers with a widely distributed human backup component is proving ideal for the task at hand.

NEOWISE: Rethinking the Dinosaur Killer

With a fierce interest in nearby brown dwarfs, I often neglect the significant part of the WISE mission devoted to asteroids. WISE (Wide-field Infrared Survey Explorer) has catalogued more than 157,000 asteroids in the main belt and discovered 33,000 new objects as part of its NEOWISE activities. Here the benefits of infrared wavelengths become apparent, for we know little about the reflectivity of a given asteroid and thus have trouble figuring out how large it is. Working in the infrared, WISE can relate the light it receives to the size and temperature of the object. Having established size, mission scientists can re-calculate the asteroid’s reflectivity.
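The reflectivity problem can be seen in the standard relation between an asteroid’s visible absolute magnitude H, its geometric albedo and its diameter. This is not WISE’s thermal model, just an illustration of why an unmeasured albedo leaves the size badly uncertain:

```python
import math

# Standard relation between absolute magnitude H, geometric albedo p_V,
# and diameter D in kilometers:  D = (1329 / sqrt(p_V)) * 10**(-H / 5)
def diameter_km(abs_mag_h: float, albedo: float) -> float:
    return 1329.0 / math.sqrt(albedo) * 10 ** (-abs_mag_h / 5.0)

# The same H = 18 object could be a small bright rock or a large dark one:
print(diameter_km(18.0, 0.25))   # ~0.67 km if highly reflective
print(diameter_km(18.0, 0.05))   # ~1.5 km if dark
```

A factor-of-five uncertainty in albedo translates into more than a factor of two in diameter, which is the gap infrared thermal measurements close.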

NEOWISE is actually an enhancement to the WISE data processing system that makes for better detection of moving objects in the WISE data. In addition to the asteroids mentioned above, NEOWISE has also detected more than 500 Near-Earth Objects (NEOs) and roughly 120 comets. We’ve had plentiful studies at visible wavelengths from groups like the Catalina Sky Survey, Spacewatch and the Near Earth Asteroid Tracking Program, and we’ve also examined some objects with radar, although that requires the objects to pass fairly close to the Earth. Thus we have uncovered plenty of objects, including some 7,600 NEOs and hundreds of thousands of main belt asteroids, but have been able to physically characterize only a small number of them.

NEOWISE begins to bridge this gap, and along the way is uncovering information about the so-called ‘dinosaur killer’ asteroid, once thought to be associated with the object known as (298) Baptistina. It was back in 2007 that Baptistina was suggested as the source of the object that struck the Earth in the Cretaceous/Tertiary extinction event 65.5 million years ago, contributing to the demise of the dinosaurs. The theory: Baptistina’s parent body, perhaps 170 kilometers in diameter, collided with another 60-kilometer wide asteroid about 160 million years ago, sending a chunk into an Earth-crossing trajectory. Moreover, Baptistina has the characteristics of a carbonaceous chondrite, matching the impactor at Chicxulub crater in the Yucatan.

Image: Scientists think that a giant asteroid, which broke up long ago in the main asteroid belt between Mars and Jupiter, eventually made its way to Earth and led to the extinction of the dinosaurs. Data from NASA’s WISE mission likely rules out the leading suspect, a member of a family of asteroids called Baptistina, so the search for the origins of the dinosaur-killing asteroid goes on. This artist’s concept shows a broken-up asteroid. Credit: NASA/JPL-Caltech.

Call it the ‘Baptistina bombardment,’ a prolonged surge in impact activity that affected both the Earth and the Moon — researchers in 2007 pointed to the prominent crater Tycho as another possible sign of these events. But WISE has now demonstrated, through study of 1,056 members of the Baptistina family, that the parent asteroid broke up much more recently than 160 million years ago, perhaps only 80 million years ago. The question then becomes whether a Baptistina fragment could still have hit the Earth about 15 million years after the collision. Amy Mainzer (JPL), principal investigator of NEOWISE, is co-author of a new paper on the issue:

“This doesn’t give the remnants from the collision very much time to move into a resonance spot, and get flung down to Earth 65 million years ago. This process is thought to normally take many tens of millions of years.”

Mainzer is referring to locations in the main belt where the gravitational forces of the gas giants can nudge asteroids out of their orbits and into a collision course with Earth. And it appears that Baptistina is now in the clear on this charge, with the actual dinosaur killer (if indeed the event was solely precipitated by an asteroid strike, which has not been conclusively proven) remaining unknown. What we do have is plenty of evidence that a 10-kilometer asteroid impacted the Earth 65 million years ago, and we have WISE data that may eventually trace the actual culprit.

The paper suggesting Baptistina’s involvement in the dinosaur extinction event is Bottke et al., “An asteroid breakup 160 Myr ago as the probable source of the K/T impactor,” Nature 449 (September 6, 2007), pp. 48-53 (abstract). And see Larry Klaes’ story Finding the Dino Killer for more background on the original Baptistina theory. The Mainzer paper is Masiero et al., “Main Belt Asteroids with WISE/NEOWISE I: Preliminary Albedos and Diameters,” accepted by the Astrophysical Journal (preprint).

A Wary Look at Habitable Worlds

The confirmation of a planet circling two stars, recounted in these pages yesterday, is actually the result of a long process. Jean Schneider (CNRS/LUTH – Paris Observatory) noted in a follow-up comment to the Kepler-16b story that investigation of such systems dates back to 1990 (see citation below), while Alex Tolley has pointed out that the great space artist Chesley Bonestell was painting imaginary planets orbiting binary stars fully sixty years ago. So the idea isn’t new, but the confirmation was obviously useful, and in more ways than we might have expected.

For one thing emerging from the Kepler-16b paper is that the smaller of the two stars in this binary system, an M-dwarf, is now the smallest low-mass star to have both its mass and radius measured at such precision. The question of stellar mass and M-dwarfs is significant because a new paper by Philip Muirhead (Cornell University) and colleagues goes to work on the parameters of low-temperature Kepler planetary host stars and finds stellar radii that are roughly half the values reported in the Kepler Input Catalogue. The authors believe these values correlate better with the estimated effective temperatures (Teff) of these stars and suggest a striking possibility:

The effective temperatures, radii and masses of the KOIs imply different planet-candidate equilibrium temperature estimates, such that 6 planet-candidates are terrestrial-sized and have equilibrium temperatures which may permit liquid water to reside on the planet surface, assuming Earth-like albedos and re-radiation fractions. Scaling the Earth’s equilibrium temperature of 255 K by the orbital semi-major axis, stellar Teff and stellar radius of the KOIs in this letter, we find that KOIs 463.01, 1422.02, 947.01, 812.03, 448.02 and 1361.01 all have equilibrium temperatures between 217 K and 261 K: the limits of the habitable zone as described in Kasting et al. (1993).

This one has struck a nerve and it’s easy to see why, as we are suddenly looking at six Earth-like planets in the habitable zone of their stars. I’ve received quite a few links to the paper (and thanks to all who sent them, as this is often how I find interesting work!), but we first have to note a few qualifiers. The authors point out, for example, that this work assumes “the same albedo, re-radiation fraction and greenhouse effect” as are found in our own system, an assumption that may well be challenged for a terrestrial planet orbiting a red dwarf star.
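The scaling the authors describe — holding Earth-like albedo and re-radiation fixed while varying stellar temperature, stellar radius and orbital distance — can be sketched as follows. The example star and orbit are invented for illustration:

```python
import math

T_EARTH = 255.0    # Earth's equilibrium temperature, K
T_SUN = 5777.0     # solar effective temperature, K (approx.)

def equilibrium_temp(teff_k: float, radius_rsun: float, a_au: float) -> float:
    """Scale Earth's 255 K equilibrium temperature by stellar Teff, stellar
    radius and orbital distance, keeping the Earth-like albedo and
    re-radiation assumptions fixed: T_eq scales as Teff * sqrt(R*/a)."""
    return T_EARTH * (teff_k / T_SUN) * math.sqrt(radius_rsun / a_au)

# A made-up M-dwarf planet: Teff = 3500 K, R* = 0.35 Rsun, a = 0.12 AU
print(f"{equilibrium_temp(3500.0, 0.35, 0.12):.0f} K")
```

The square-root dependence on stellar radius is why halving the catalogued radii shifts equilibrium temperatures enough to move candidates into, or out of, the 217–261 K habitable-zone window.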

I’m also cautious because the physical parameters of exoplanet-hosting stars are so crucial to our understanding of the detected exoplanets themselves. Here we run into issues, and the authors are quick to point this out. We have detailed information about the Sun, for example, that helps us calibrate models for Sun-like stars, so our analyses of mass, effective temperature, radius and other values seem logical and well-founded. But M-dwarfs are a different story because few such stars are both bright enough and close enough for us to obtain accurate parallaxes and direct measurements of their radii. The authors also note a discrepancy between radii as measured in eclipsing binaries and the predictions of at least some stellar evolution models.

The authors go on to say this:

Although there remains a monotonic correspondence between spectral type (the observational parameter) and effective temperature, Teff, the calibration of this relationship is not as advanced as it is for solar-type stars. M dwarf atmospheres are fully convective, rich in molecular absorption features and depart substantially from blackbody emission at all wavelengths…, so the empirical effective temperature scale is particularly challenging.

Muirhead and team went at their work using the TripleSpec Spectrograph at Palomar, observing 84 Kepler ‘objects of interest’ (KOIs) with effective temperatures (as described by the Kepler Input Catalogue) of less than 4400 K. The resultant mass and radius estimates derived in this paper reduce the size of the planet candidates to the Earth-analogue worlds reported here. This would obviously be a significant finding, but I think we have to wait for a response from the Kepler team, and in particular those involved with the Kepler Input Catalogue, to put the work into perspective. An error of this size would be extreme and, as at least one commenter has noted here, such an error should have shown up in the work on Kepler-16b, yet evidently did not.

I’m a writer, not an astrophysicist, so I’m intrigued but waiting for follow-up work to sort this out. This is, after all, how science works, an interplay of data and analysis that is adjusted as new data emerge. We’ll soon learn whether we have to modify our views of other Kepler candidates to match this result. In the meantime, I’m interested to learn what readers think of the Muirhead team’s analysis.

The paper is Muirhead et al., “Near-Infrared Spectroscopy of Low-Mass Kepler Planet-Candidate Host Stars: Effective Temperatures, Metallicities, Masses and Radii,” submitted to Astrophysical Journal Letters (preprint). On early work on circumbinary planets, see Schneider & Chevreton, “The Photometric Search for Earth-sized Extrasolar Planets by Occultation in Binary Systems,” Astronomy & Astrophysics 232, pp. 251-257 (1990). Abstract available.

Charter

In Centauri Dreams, Paul Gilster looks at peer-reviewed research on deep space exploration, with an eye toward interstellar possibilities. For many years this site coordinated its efforts with the Tau Zero Foundation. It now serves as an independent forum for deep space news and ideas. In the logo above, the leftmost star is Alpha Centauri, a triple system closer than any other star, and a primary target for early interstellar probes. To its right is Beta Centauri (not a part of the Alpha Centauri system), with Beta, Gamma, Delta and Epsilon Crucis, stars in the Southern Cross, visible at the far right (image courtesy of Marco Lorenzi).

On Comments

If you'd like to submit a comment for possible publication on Centauri Dreams, I will be glad to consider it. The primary criterion is that comments contribute meaningfully to the debate. Among other criteria for selection: Comments must be on topic, directly related to the post in question, must use appropriate language, and must not be abusive to others. Civility counts. In addition, a valid email address is required for a comment to be considered. Centauri Dreams is emphatically not a soapbox for political or religious views submitted by individuals or organizations. A long form of the policy can be viewed on the Administrative page. The short form is this: If your comment is not on topic and respectful to others, I'm probably not going to run it.
