A New Tool for Researchers

Searching the Internet has always been dicey, given the wide range of sites you’re likely to pull up on any topic and the varying quality each may bring. Google has done good work at narrowing Web results: its ‘site-specific search,’ for example, allows you to search within a universe of sites related to a particular topic.

Now the company has gone one better with a new engine called Google Scholar, a test version of which is available.

The beauty of Google Scholar is that your search is limited to journal articles, books, preprints, technical reports and theses, the kind of material serious researchers need to uncover without sifting through all the chaff. As this article in Nature makes clear, the new service does a fine job of finding the relevant articles on your topic, using variations on the familiar Google algorithms, which study the links between Web pages as a guide to their usefulness.

But instead of studying links to other pages, Google Scholar uses citations, creating indices that offer still more valuable clues about which paper may be most useful. Suddenly the searcher is in a peer-reviewed realm, much of which has been made available to Google through CrossRef Search, a pilot engine that takes searchers to the publishers’ sites, where they can read an abstract or, if they are subscribers to the content, the full text.
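Google hasn’t published Scholar’s ranking method, so the following is only a toy sketch of the citation idea, using an invented mini-corpus: among documents matching a query, those cited most often by the others float to the top.

```python
# Toy citation-based ranking -- an illustration of the idea, not
# Google Scholar's actual algorithm. Each invented "paper" lists
# its index terms and the papers it cites.
from collections import Counter

papers = {
    "singer1980": {"terms": {"pellet", "propulsion"}, "cites": []},
    "andrews":    {"terms": {"interstellar", "propulsion"}, "cites": ["singer1980"]},
    "survey":     {"terms": {"pellet", "propulsion"}, "cites": ["singer1980", "andrews"]},
}

def rank_by_citations(query):
    """Return papers containing every query term, most-cited first."""
    counts = Counter(ref for p in papers.values() for ref in p["cites"])
    matches = [name for name, p in papers.items() if query <= p["terms"]]
    return sorted(matches, key=lambda name: counts[name], reverse=True)

print(rank_by_citations({"pellet", "propulsion"}))
# ['singer1980', 'survey'] -- the heavily cited paper leads
```

Real citation indices add far more (author disambiguation, venue weighting, text relevance), but the core move of turning links into scores is the same trick PageRank plays on the Web.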

And as the Nature story indicates, there’s also an interesting twist: “Google Scholar has a subversive feature…Each hit also links to all the free versions of the article it has found saved on other sites, for example on personal home pages, elsewhere on the Internet.” Finding what you need is going to become a lot easier as this service matures.

Try conventional Google on a search for information about Clifford Singer’s ‘pellet propulsion’ concepts. Singer discussed firing a stream of pellets from an accelerator to drive a space probe, replacing the stream of photons that might be used to push a lightsail. The pellets would be vaporized on reaching the vehicle, creating a plasma exhaust. A key advantage: you avoid the collimation problems of a laser or particle beam, both of which spread over interstellar distances.
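The rationale is simple momentum accounting, a standard result rather than anything drawn directly from Singer’s paper: a slow, massive pellet delivers far more momentum per joule of beam energy than light can.

$$
\frac{p_{\mathrm{pellet}}}{E_{\mathrm{pellet}}} = \frac{mv}{\tfrac{1}{2}mv^{2}} = \frac{2}{v},
\qquad
\frac{p_{\mathrm{photon}}}{E_{\mathrm{photon}}} = \frac{1}{c}
$$

For the same energy, pellets at speed $v$ thus deliver $2c/v$ times the momentum of a photon beam; at $v = 0.1c$, that is a factor of twenty.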

Using ‘singer,’ ‘pellet’ and ‘propulsion’ as search terms, I get 98 hits on regular Google, about ten of them high quality. Google Scholar gives me 18, but leads the citations off with the key one: Singer’s 1980 paper in JBIS called “Interstellar Propulsion Using a Pellet Stream for Momentum Transfer.” Even more useful is the link that comes with that top citation, offering a quick Web search for places where the paper is cited. Thus I can go immediately to papers that discuss my topic. Other top-rated items are also useful, and include Dana Andrews’ seminal “Interstellar Transportation Using Today’s Physics.”

We still face the problem that so many key resources, especially older ones, are not yet available on the Web (and it continues to boggle the mind that a source like the Journal of the British Interplanetary Society still offers no online bibliography). But Google Scholar is emphatically a step in the right direction, one that will be welcome in the toolbox of any serious researcher.

An Anomaly from the Edge of the Solar System

Those of us with still-fresh memories of Voyager 2’s encounter with Neptune in 1989 find it gratifying that both Voyager probes are still returning good science. It’s even more remarkable that the Pioneer 10 and 11 probes are still in the thick of things, but anomalies in their journeys beyond the orbit of Pluto offer tantalizing clues to some unexplained phenomenon in the far reaches of the Solar System.

As this article in Nature points out, since 1980 the Pioneers have been returning radio signals that have kept shifting to shorter and shorter wavelengths. The implication: both spacecraft are decelerating, even if only by the slightest amount. Some are calling this the ‘Pioneer anomaly,’ and it may just point to a new principle in physics, perhaps involving exotic forces or undiscovered forms of matter.
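For scale, a steady blueshift drift in a two-way Doppler link maps onto a line-of-sight acceleration as

$$
\frac{\dot{f}}{f} = \frac{2\,a_P}{c}, \qquad a_P \approx 8.7 \times 10^{-10}\ \mathrm{m\,s^{-2}},
$$

where the value of $a_P$ is the figure quoted in the published analyses by Anderson, Turyshev and colleagues. That is roughly ten billion times weaker than Earth’s surface gravity, which is why it took years of accumulated tracking data for the effect to stand out.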

On the other hand, it may have a much more mundane explanation, such as a fuel leak that could be affecting the probes’ progress. Either way, engineers faced with designing navigation systems for deep space need to know the answer, and a mission to follow the Pioneer craft is now being proposed by Slava Turyshev of NASA’s Jet Propulsion Laboratory in Pasadena, California. In a preprint available online at the ArXiv site, Turyshev and colleagues Michael Martin Nieto and John D. Anderson summarize the issue:

The inability to find a standard explanation for the anomaly, combined with the evident lack of suitable experimental opportunities, motivated an interest in developing a designated mission to study the detected signal… The mission could lead to a determination of the origin of the discovered anomaly; it could also characterize its properties to an accuracy of at least three orders of magnitude below the anomaly’s size. The mission must be capable to discover if the anomaly is due to some unknown physics or else to an on-board systematic. Either way the result would be of major significance. If the anomaly is a manifestation of new or unexpected physics, the result would be of truly fundamental importance. However, even if the anomaly turns out to be an unknown manifestation of an on-board systematic, its understanding would vitally affect the design of future precision space navigation, especially in deep space. Furthermore, technologies and mission design solutions envisioned for this experiment will be vital to many space missions that are to come.

Making this mission happen will mean designing a spacecraft with navigation and instrumentation capabilities beyond those of NASA’s Jupiter Icy Moons Orbiter, which is scheduled for a launch around 2015. A more realistic option may be to develop new instrumentation that could be incorporated into a mission already being planned. The Pioneers themselves won’t be able to help — both are far enough away that they can no longer communicate with Earth.

“Lessons Learned from the Pioneers 10/11 for A Mission to Test the Pioneer Anomaly” is now available here on the ArXiv server (PDF warning). Image credit (above): NASA.

By the Light of a Passing Star

The Microlensing Planet Finder (MPF) is a proposed mission that would use the gravitational lensing effect to achieve extraordinary detection capabilities. As presented in The Microlensing Planet Finder: Completing the Census of Extrasolar Planets in the Milky Way (PDF warning), MPF could find planets down to 0.1 Earth mass, in orbits as close as 0.7 AU to their parent stars. And unlike any other planet-finding technology, MPF would be able to find free-floating planetary bodies unassociated with any star.

Gravitational microlensing is perhaps the most exotic planet-finding technique. A star or planet can act as a kind of lens, magnifying a more distant bright star behind it. It is the gravitational field of the foreground star that, as Einstein predicted, focuses the light of the distant star, just as a glass lens focuses light in a telescope. By analyzing the light produced by such an event, astronomers can find telltale anomalies that indicate the presence of planets around the foreground star.
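For reference, the standard point-lens formulas (textbook lensing results, not spelled out in the article): a background source at angular separation $u$ from the lens, measured in units of the Einstein radius $\theta_E$, is magnified by

$$
A(u) = \frac{u^{2}+2}{u\sqrt{u^{2}+4}}, \qquad
\theta_E = \sqrt{\frac{4GM}{c^{2}}\,\frac{D_S - D_L}{D_L D_S}},
$$

where $M$ is the lens mass and $D_L$, $D_S$ are the distances to lens and source. A planet orbiting the lens star perturbs this smooth curve, producing exactly the kind of short secondary spike described below.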

And the method works. Last April a new planetary system was found in the constellation Sagittarius, some 17,000 light-years away. The system, consisting of a planet 1.5 times as massive as Jupiter orbiting a red dwarf at about 3 AU, acted as a magnifying lens for a background star some 24,000 light-years away in the direction of the galactic core. What scientists saw was a brightening that indicated the foreground star was affecting light from the object behind it. The fact that they observed two spikes of brightness rather than one indicated the presence of the planetary companion around the nearer star. An analysis of the light curve showed that the smaller object was only 0.4 percent of the mass of the red dwarf.


Image: The location of the new discovery near the center of our Milky Way galaxy. The images span a 120-day period in which the star became much brighter due to microlensing. There were two spikes in brightness, indicating that the passing star had a planet in orbit around it. The “lens” or foreground star and planet are 17,000 light-years away in the constellation Sagittarius, while the background star is 24,000 light-years away. Credit: NASA/JPL-Caltech.
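As a quick consistency check on the story’s numbers (the stellar mass below is inferred for illustration, not stated in the article): a mass ratio of 0.4 percent together with a 1.5 Jupiter-mass planet implies a lens star of about

$$
M_{\star} \approx \frac{M_p}{q} \approx \frac{1.5 \times 9.5\times10^{-4}\,M_{\odot}}{0.004} \approx 0.36\,M_{\odot},
$$

squarely in the red dwarf range.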

The kicker is that, for gravitational microlensing to work, the stars have to be aligned almost perfectly. That means huge numbers of stars have to be monitored, a requirement the Microlensing Planet Finder would meet by focusing on the packed starfield in the direction of the Milky Way’s center. David Bennett, Ian Bond et al., authors of the paper cited above, summarize MPF’s goals:

• Provide the first census of planets like those in our own solar system, with sensitivity to analogs of all solar system planets except for Mercury and Pluto.
• Discover 66 terrestrial planets, 3300 gas giants, and 110 ice giant planets (if our solar system is typical).
• Discover Earth-like planets at 1-2.5 AU from their stars even if only a few per cent of stars have such planets.
• Discover Earth-like planets within a few months of launch.
• Discover free-floating planets, not gravitationally bound to any star.
• Provide direct measurement of star:planet mass ratios, plus planet masses and separations for planets orbiting solar-type stars.

The authors believe that MPF and the upcoming Kepler mission would work together to “…provide measurements of the frequency of Earth-like and larger extrasolar planets at all star:planet separations and will complete the first census of extrasolar planets like those in our own Solar System.”

For more on gravitational microlensing and the Sagittarius discovery, see this excellent JPL animation. A Notre Dame press release covering David Bennett’s work on the Sagittarius find can be found here.

A Quote for the Weekend

From the remarkable H. G. Wells, in a 1902 lecture at London’s Royal Institution:

“It is conceivable that some great unexpected mass of matter should presently rush upon us out of space, whirl sun and planets aside like dead leaves before the breeze, and collide with and utterly destroy every spark of life upon this earth… It is conceivable, too, that some pestilence may presently appear, some new disease, that will destroy not 10 or 15 or 20 per cent of the earth’s inhabitants as pestilences have done in the past, but 100 per cent, and so end our race… And finally there is the reasonable certainty that this sun of ours must some day radiate itself toward extinction… There surely man must end. That of all such nightmares is the most insistently convincing. And yet one doesn’t believe it. At least I do not. And I do not believe in these things because I have come to believe in certain other things–in the coherency and purpose in the world and in the greatness of human destiny. Worlds may freeze and suns may perish, but there stirs something within us now that can never die again.”

Of the three outcomes Wells gauges to be most fearsome for mankind’s future, the first is the most telling when it comes to encouraging interstellar research. We know that massive impacts from asteroids and comets have caused species extinctions on our planet, and there is no reason to believe such events could not happen again. That means we need a space-based infrastructure in the outer Solar System to intercept and alter the course of objects likely to hit the Earth. The technology developed in building that infrastructure will make possible our first probes to nearby stars.

Finding Planets in the Datastream

A new project called PlanetQuest will soon offer a way to get involved personally with the hunt for extrasolar planets. The idea is to use the power of distributed computing, as the hugely influential SETI@home project has already done, letting people run data analysis software as a screensaver that operates whenever their computer is idle. PlanetQuest will be designed to hunt for planets by scanning high-density star fields for occlusions — in other words, for evidence that an extrasolar planet has moved between us and its star.

Laurance Doyle, an astrophysicist at the SETI Institute, notes that while occlusions may be rare (after all, the stellar system must be lined up with ours so that planetary orbits cross our line of vision), the hunt will also yield dividends in terms of our knowledge of variable stars, as well as broader issues like stellar stability and evolution.
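To give a flavor of what a distributed client might actually compute, here is a minimal sketch of occlusion (transit) detection: slide a fixed-width box across a light curve and flag the window whose flux drops furthest below the surrounding level. The approach and all the numbers are illustrative assumptions, not PlanetQuest’s actual pipeline.

```python
# Minimal box-dip search over a light curve -- an illustration only,
# not PlanetQuest's actual analysis code.
import numpy as np

def find_transit(flux, width):
    """Return (start index, depth) of the deepest box-shaped dip."""
    best_start, best_depth = 0, 0.0
    for start in range(len(flux) - width):
        in_box = flux[start:start + width]
        out_box = np.concatenate([flux[:start], flux[start + width:]])
        depth = out_box.mean() - in_box.mean()   # positive for a dip
        if depth > best_depth:
            best_start, best_depth = start, depth
    return best_start, best_depth

# Synthetic data: a quiet star with noise, plus an injected 1% occlusion.
rng = np.random.default_rng(42)
flux = 1.0 + 0.002 * rng.standard_normal(1000)
flux[400:430] -= 0.01
start, depth = find_transit(flux, width=30)
print(f"dip begins near index {start}, depth ~{depth:.4f}")
```

A real search would fold the data over many trial orbital periods and weigh each candidate against variable-star signatures, which is one reason the variable-star dividends Doyle mentions come along naturally.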

But even as we accumulate new data, we still have the problem of managing what we have. Consider the Sloan Digital Sky Survey, which intends to map one-quarter of the entire sky in detail, covering hundreds of millions of celestial objects. The results of the Survey are offered electronically to the scientific community and the public; by Survey’s end, about 15 terabytes of information will have been produced, a figure approaching the 24 or so terabytes of textual information held by the Library of Congress.

Here’s what astronomer and computer scientist Alexander Szalay has to say about the problem of massive data sets in a Johns Hopkins press release:

“The massive amounts of data emerging from our newest instruments – telescopes, particle detectors, gene sequencers – demand a novel method of analysis that coalesces the skills of astronomers, biologists and others with those of the computer scientist, the computational scientist and the statistician… Most scientific data collected today will never be directly examined as ‘raw data’ by scientists; instead, it will be put online into ‘smart databases’ where it will be analyzed and summarized by computer programs before scientists even see or use it.”

Szalay has been working for a decade to create multidisciplinary ways of consolidating huge data sets as part of the science archive for the Sloan Digital Sky Survey. This work has also played a role in the creation of the National Virtual Observatory, which is, according to its Web site, “…developing tools that make it easy to locate, retrieve, and analyze astronomical data from archives and catalogs worldwide, and to compare theoretical models and simulations with observations.”

Szalay has just received a $1.2 million grant from the Gordon and Betty Moore Foundation to continue his team’s work on data analysis in cases where the sheer amount of information demands new and creative ways to extract good science from the dataflow. Just as digital techniques like adaptive optics have brought unexpected clarity to observations from Earth-bound telescopes, so data manipulation will play an ever larger role in exploring star systems for evidence of distant planets.