An Internet for Deep Space

Networking deep space should be a priority for future missions. If we can set up robust networking between spacecraft, we relieve the Deep Space Network of a huge burden: communicating directly with each spacecraft for tasks that are essentially routine. No more maneuvering huge dishes to catch one fleeting signal, at least not for missions to come. Instead, we could rely on spacecraft to manage their own file transfers, move their own traffic to astronauts (remember the video mail in 2001: A Space Odyssey?), and handle local operations.

Why not use the Internet we’ve already got? Unfortunately, the TCP/IP (Transmission Control Protocol/Internet Protocol) tools we use today are ‘chatty’: the computers that run them exchange data back and forth many times over the course of a transaction. Suppose you want to send a file through FTP (File Transfer Protocol). Doing so takes eight round trips of data between the computers involved before the file can be sent. Not only that, but FTP will time out after a few minutes of inactivity. Try that on a target ten light hours away and the delay times will stretch a routine file swap into a multi-day event.
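The arithmetic is unforgiving. Here is a quick back-of-the-envelope sketch of the handshake cost alone, using the eight-round-trip figure and the ten-light-hour target from the paragraph above:

```python
# Rough cost of a 'chatty' FTP-style exchange at interplanetary distance.
# The eight-round-trip figure and the ten-light-hour target come from the
# discussion above; everything else is simple arithmetic.
LIGHT_HOUR_SECONDS = 3600                    # one light hour of one-way delay
one_way_delay_s = 10 * LIGHT_HOUR_SECONDS    # ten light hours each way
round_trips = 8                              # handshakes before the file moves

setup_time_s = round_trips * 2 * one_way_delay_s
setup_time_days = setup_time_s / 86400
print(f"Handshake time before any data flows: {setup_time_days:.1f} days")
```

Nearly a week of delay before a single byte of the file itself is transmitted, and that assumes no timeouts along the way.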

Vinton Cerf, the visionary behind the original TCP/IP work (and now a Google vice president), has been developing new protocols to handle this problem for a decade. The term to master is Disruption-Tolerant Networking (DTN), and it’s just been through a successful shakedown. Because DTN assumes no continuous connection, it uses store-and-forward methods to hold data until they can be sent. Says Leigh Torgerson (JPL):

“In space today, an operations team has to manually schedule each link and generate all the commands to specify which data to send, when to send it, and where to send it. With standardized DTN, this can all be done automatically.”
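In code terms, store-and-forward means a node never discards data just because no link is currently up; bundles wait in a queue until a contact window opens. Here is a toy sketch of that behavior; the node names and fields are invented for illustration and bear no resemblance to a real DTN implementation’s API:

```python
from collections import deque

class DtnNode:
    """Toy DTN node: holds bundles until a link to the next hop is up."""
    def __init__(self, name):
        self.name = name
        self.queue = deque()      # stored bundles awaiting a contact
        self.links_up = set()     # next hops currently reachable

    def receive(self, bundle):
        self.queue.append(bundle)  # store; never drop for lack of a link

    def forward(self, nodes):
        """Forward queued bundles whose next hop is reachable; keep the rest."""
        still_waiting = deque()
        while self.queue:
            bundle = self.queue.popleft()
            hop = bundle["next_hop"]
            if hop in self.links_up:
                nodes[hop].receive(bundle)
            else:
                still_waiting.append(bundle)
        self.queue = still_waiting

# A relay stores a bundle while its downlink is disrupted...
nodes = {"relay": DtnNode("relay"), "earth": DtnNode("earth")}
nodes["relay"].receive({"data": "science file", "next_hop": "earth"})
nodes["relay"].forward(nodes)          # link down: bundle stays queued
nodes["relay"].links_up.add("earth")   # contact window opens
nodes["relay"].forward(nodes)          # now the bundle moves
```

The point of the sketch is the contrast with TCP/IP: nothing times out and nothing is retransmitted end to end; the data simply waits at the relay until the link returns.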

It’s satisfying to learn that the EPOXI spacecraft, part of the extended mission of the original Deep Impact vehicle, is one of the nodes on the nascent network, the other nine being simulated spacecraft run through the Jet Propulsion Laboratory. The goal is to get the technology ready for regular use, with a new round of testing scheduled for the International Space Station next summer. And EPOXI, which runs the EPOCh (Extrasolar Planet Observation and Characterization) observing program, adds a nice touch of interstellar drama to the first round, even though the spacecraft itself is, at 32 million kilometers, relatively close to home.

The first intensive use of interplanetary networking is surely going to occur around Mars, where landers can communicate with rovers and the satellites orbiting above. But it’s interesting from the point of view of pure communications that the DTN techniques could have uses here on Earth, particularly in areas where Internet access is needed but sender and receiver may have entirely different schedules. Until Net access is ubiquitous, there will be many places on this planet that can benefit from a system that knows enough to hold data until a signal is available.

For useful background documents (although the pages have not been updated in some time), check the Interplanetary Internet Project site. This NASA news release is also available.

‘Smart Dust’ for Planetary Exploration

Bringing computer networking to space exploration is a major step forward. It allows us to go beyond the old model of pointing radio dishes at a specific spacecraft and downloading information — a time-consuming process as we move from one spacecraft to another — to communicate instead with a single hub vehicle that could be processing data from a cluster of sources. That maximizes precious communications resources here on Earth and allows us to connect planetary rovers, for example, with base stations, orbiting spacecraft and other nearby vehicles.

We’ve talked about interplanetary networking before in terms of the InterPlanetary Internet Project (IPN), a key player in which is Internet legend Vinton Cerf. But extend the idea further, as John Barker (University of Glasgow) is doing today at the Royal Astronomical Society’s national meeting in Lancashire (UK). What Barker has in mind is using ‘smart dust’ — tiny computer chips surrounded by a polymer sheath — to form intelligent swarms for research on a planetary surface.

Here’s how Barker explains the idea:

“We envisage that most of the particles can only talk to their nearest neighbours but a few can communicate at much longer distances. In our simulations we’ve shown that a swarm of 50 smart dust particles can organise themselves into a star formation, even in turbulent wind. The ability to fly in formation means that the smart dust could form a phased array. It would then be possible to process information between the distributed computer chips and collectively beam a signal back to an orbiting spacecraft.”

Smart dust swarms on Mars

How to control the motion of smart dust in such an environment? The surface of the polymer sheath surrounding the chip can be smoothed or wrinkled depending on the application of a small voltage. Switching between rough and smooth modes adjusts the drag on the particle, allowing it to rise or settle toward the surface. Controlling these modes allows the particles to move toward their target despite ambient weather conditions.
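This rough/smooth switching amounts to a bang-bang controller. The one-dimensional toy model below caricatures the idea; the lift and settling rates and the target height are all invented for illustration, and this is not Barker’s simulation:

```python
import random

def step(height, mode, wind_lift):
    """One toy timestep: 'rough' mode means high drag, so the wind lofts
    the mote; 'smooth' mode means low drag, so it settles. All rates are
    illustrative, not physical."""
    if mode == "rough":
        return height + wind_lift
    return max(0.0, height - 1.0)

def control(height, target):
    """Bang-bang rule: wrinkle the sheath below the target height, smooth it above."""
    return "rough" if height < target else "smooth"

random.seed(1)
height, target = 0.0, 5.0
for _ in range(40):
    mode = control(height, target)
    height = step(height, mode, wind_lift=random.uniform(0.5, 2.0))
print(f"height after 40 steps: {height:.1f} (target {target})")
```

Even with random gusts, the mote hovers near its target because each wrong-direction excursion flips the sheath into the opposing mode.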

Image: The University of Glasgow team has studied the collective movement of motes towards a target located in a portion of the Martian surface that extends over a range of several kilometers. A smart dust mote has approximately the density and volume of a coarse grain of sand. It follows that the uncontrolled motion of smart dust is determined by the same Aeolian processes of saltation, settling and surface creep that govern the movement of sand in desert regions. Adjusting the surface of smart dust particles provides one way to control their motion. Credit: John Barker/University of Glasgow.

Barker notes a key problem with current technology: today’s chemical sensors are too large to fit on sand-grain-sized particles light enough to be carried by the Martian wind. But the denser atmosphere of Venus could loft particles up to a few centimeters in size. In any case, within not many years our chip components should shrink to a few nanometers across, making them, in Barker’s thinking, more like molecules diffusing through an atmosphere than dust grains.

Nanotechnology will have an extraordinary effect on space exploration, offering in a few short years options that would otherwise have been impossible. The team at Glasgow knows that and is clearly unafraid to think big. Acknowledging that smart dust is years away from deployment on an actual mission, Barker looks well down the road at interstellar implications: “Our first close-up studies of extra-solar planets could come from a smart dust swarm delivered to another solar system by ion-drive.”

Quantum Weirdness and Communications

‘Spooky action at a distance’ is still spooky no matter how you explain it. Einstein famously used the phrase to describe quantum entanglement, where two entangled particles appear to interact instantaneously even though separated in space. Now we’re talking about using the effect for communications, following the news that European scientists have proven that entanglement persists over a distance of 144 kilometers.

Fortunately for would-be communicators, a pair of entangled photons can be created in a process called spontaneous parametric down-conversion. Once entangled, the photons stay entangled until one of them interacts with a third particle. When that happens, the other photon changes its quantum state instantaneously. The beauty of entanglement for communications is that anyone trying to listen in on a message invariably disrupts the entangled system, a result that would be easily detectable.
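That tamper-evidence can be illustrated with a toy intercept-and-resend simulation. This is not the protocol used in the Canary Islands experiment, just a sketch of why an eavesdropper’s measurement shows up as errors; the fifty percent wrong-basis probability is the standard textbook figure:

```python
import random

def error_rate(n_bits, eavesdropper, seed=0):
    """Toy intercept-and-resend model. With probability 0.5 the eavesdropper
    measures in the wrong basis, collapsing the photon to a random value that
    sender and receiver later catch when they compare a sample of bits."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(n_bits):
        sent = rng.randint(0, 1)
        received = sent
        if eavesdropper and rng.random() < 0.5:   # wrong-basis measurement
            received = rng.randint(0, 1)          # state collapses at random
        errors += (received != sent)
    return errors / n_bits

print("quiet channel :", error_rate(10_000, eavesdropper=False))
print("tapped channel:", error_rate(10_000, eavesdropper=True))   # roughly 0.25
```

A quiet channel shows no disturbance at all, while a tapped one corrupts roughly a quarter of the compared bits, which is exactly the signature the legitimate parties look for.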

The security potential is obvious in a world where so much banking information takes digital form, and where the security needs of military communications are greater than ever. But is entanglement a theoretical exercise or can it operate in real-world conditions? To find out, the researchers needed to learn not just how far the effect could travel but also how it might be affected by local conditions. Would it be possible for a ground station, for example, to communicate with an orbiting satellite? Or would the atmosphere destroy the entanglement effect?

To find out, the European team used the European Space Agency’s one-meter Optical Ground Station on Tenerife (Canary Islands), situated 144 kilometers from an observatory on the nearby island of La Palma. The entangled pair was created on La Palma, with one photon sent toward Tenerife while the other remained at La Palma for comparison and study. The entanglement survived, implying that a ground-to-orbit connection is workable.

“We were sending the single-photon beam on a 144 kilometres path through the atmosphere, so this horizontal quantum link can be considered a ‘worst case scenario’ for a space to ground link,” says Josep Perdigues, ESA’s Study Manager. Up next: studying quantum entanglement at much greater distances, something that might be done by putting a quantum optical terminal on a dedicated satellite.

We’ll follow that mission concept as it develops. Meanwhile, theorists still have their work cut out for them. Just why does entanglement survive a journey through a medium in which it might be expected to interact with atmospheric molecules? We have much to learn about such bizarre effects, but the recent demonstration of a workable quantum computer from D-Wave Systems highlights how swiftly ‘spooky’ quantum properties are being harnessed for work in the macroscopic world around us.

A Boost for Optical Communications

Given how tricky it is to pick up accidental radio signals — “leakage” — from extraterrestrial civilizations, how hard would it be to communicate with our own probes once they’ve reached a system like Alpha Centauri? A front-runner for interstellar communications is the laser. JPL’s James Lesh analyzed the problem in a 1996 paper, concluding that a 20-watt laser system with a 3-meter telescope as the transmitting aperture could beam back all necessary data to Earth. It’s a system feasible right now.

Right now, that is, if we had some way to get the telescope, just a bit larger than the Hubble instrument, into Centauri space. But even though propulsion lags well behind laser technology for such a mission, we’re continuing to study how lasers can help closer to home. Their high frequencies allow far more data to be packed into the signal, while the highly focused beam uses a fraction of the power of radio. Data return becomes less of a trickle and more of a flood (imagine high-definition moving video from Mars).

How to handle atmospheric effects that can hamper Earth-based receivers? It’s a problem even on cloud-free days because dust, dirt and water vapor can still scatter light and deflect parts of the beam. Listen to Penn State’s Mohsen Kavehrad: “Free space optical communications offer enormous data rates but operate much more at the mercy of the environment…All of the laser beam photons travel at the speed of light, but different paths make them arrive at different times.”

The result: data ‘echoes’ that confound accurate reception. But the project Kavehrad is working on, funded through the Defense Advanced Research Projects Agency (DARPA), aims at achieving almost 3 gigabytes per second of data over a distance of 6 to 8 miles through the atmosphere. What the Penn State team has done is to bring digital signal processing methods to bear on laser communications to make the optical link more reliable. They call their approach free-space optical communications. Here’s how a Penn State news release describes the system’s operation:

Using a computer simulation called the atmospheric channel model developed by Penn State’s CICTR, the researchers first process the signal to shorten the overlapping data and reduce the number of overlaps. Then the system processes the remaining signal, picking out parts of the signal to make a whole and eliminate the remaining echoes. This process must be continuous with overlap shortening and then filtering so that a high-quality, fiber optic caliber message arrives at the destination. All this, while one or both of the sender and receiver are moving.
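The echo-removal step can be illustrated with the simplest possible channel model: a single delayed, attenuated copy of the signal. The sketch below is not the CICTR model; it assumes the channel’s delay and attenuation are already known, which in practice is the hard part an estimator has to solve:

```python
def add_echo(signal, delay, atten):
    """Toy multipath channel: every sample arrives again, delayed and attenuated."""
    out = list(signal)
    for i, s in enumerate(signal):
        if i + delay < len(out):
            out[i + delay] += atten * s
    return out

def remove_echo(received, delay, atten):
    """Recursive inverse filter: subtract the echo of the already-recovered
    samples. Assumes the channel's delay and attenuation have been estimated."""
    est = list(received)
    for i in range(delay, len(est)):
        est[i] -= atten * est[i - delay]
    return est

pulse_train = [1.0, 0.0, 0.0, 2.0, 0.0, 1.0, 0.0, 0.0]
garbled = add_echo(pulse_train, delay=2, atten=0.4)
restored = remove_echo(garbled, delay=2, atten=0.4)   # matches pulse_train
```

Real atmospheric links spread each pulse over many paths at once, so the production system must estimate and cancel a whole profile of echoes continuously, but the subtract-what-you-already-know principle is the same.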

The system works both for air-to-air and air-to-ground links, and provides fiber-optic quality signals. But extend the premise to the growing needs of the Deep Space Network to relieve spectrum overcrowding and provide reliable high-bandwidth links to spacecraft around the Solar System. We’re moving toward a future model of networked space vehicles, communicating not only with Earth but also with each other to coordinate data transfers that will one day be optical.

The bright future of optical communications relies on resolving complications like atmospheric distortion. NASA’s Table Mountain facility in the San Bernardino Mountains houses a one-meter laser telescope used as a testbed for refining data tracking in future space missions. That and a variety of space-borne tests have already demonstrated the viability of the concept. One day we may use it for deep space work, and, who knows, the reach of the laser may someday carry data from a distant star.

For those who want more details on the Alpha Centauri communications paper mentioned above, it’s Lesh et al., “Space Communications Technologies for Interstellar Missions,” Journal of the British Interplanetary Society 49 (1996): 7-14.

Out into the Celestial Pacific

It won’t get us to the stars, but the navigation practiced by ancient Polynesians, sailing by the stars, continues to fascinate a new generation. And since Centauri Dreams often cites the remarkable voyages of these people as they populated the Pacific, it seems appropriate to focus today on an Australian Broadcasting Corporation story about an art that has been all but lost. Hoturoa Kerr, a lecturer at the University of Waikato (Hamilton, NZ), is teaching celestial navigation in an oceanic context to his students.

Finding your way over ocean swells on a body of water as big as the Pacific sounds all but impossible, particularly if your vessel is a small, double-hulled canoe. But Kerr took a GPS receiver with him on a canoe journey from New Zealand to the Cook Islands aboard the Te Aurere, checking the work of a navigator on the craft who used the old methods. At the end of the journey, he found that the navigator had at no point been more than twelve miles off the GPS reading.

The Polynesians call it ‘way-finding,’ and it’s a method that relies on more than stars. As the night ends, the navigator takes a final bearing based on star positions, then checks the motion of the canoe as it travels over the ocean swells. During the daylight hours, he will keep the canoe in the same position with regard to the swells, which will usually change little before the Sun sets and the stars again emerge.

Way-finding was good enough to direct the diaspora that began 5,000 years ago as the ancestors of the Polynesian peoples pushed eastward into the Pacific. By 3,500 years ago they had occupied, in less than a half-dozen generations, the island chains of Fiji, Tonga, and Samoa. The next wave took them, now using larger double canoes, to Tahiti and the Marquesas, then across thousands of miles of open water to Hawaii, Easter Island, and New Zealand, navigating just by the stars, the wind, the ocean swells, and the flight of birds.

Te Aurere has thus far journeyed over 30,000 nautical miles using way-finding alone. “Spiritually it’s a canoe that carries us in terms of our mind and our thinking and everything else,” says Kerr. “And when you sail a canoe, you sail for distant horizons. So what I’m hoping is that with these young people it makes them look towards distant horizons as goals for them in their life.” Such horizons are always worthwhile. And perhaps they’re not so different from the far more distant horizons we may one day embark for out in the Orion Arm.

Optical Communications Success at JAXA

As we move up the frequency ladder toward optical communications, each step takes us closer to the kind of data traffic we’ll need for deep space missions into the Kuiper Belt and beyond. The idea is to pack as much information as possible into the signal. A stream of data transmitted from an antenna spreads at a diffraction-limited angle determined by the wavelength of the signal divided by the diameter of the antenna. Higher frequencies (shorter wavelengths), then, give us a much narrower beam, alleviating bandwidth crowding. And a laser communications system makes fewer demands upon a spacecraft’s power sources than radio.
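To put numbers on that wavelength-over-aperture relationship, here is a back-of-the-envelope Python sketch. The apertures and the Earth-Mars range are illustrative round numbers, not figures from any mission:

```python
C = 299_792_458.0  # speed of light, m/s

def beam_diameter(wavelength_m, aperture_m, range_m):
    """Diffraction-limited footprint: divergence angle ~ wavelength / aperture."""
    return (wavelength_m / aperture_m) * range_m

MARS_RANGE_M = 2.25e11  # ~1.5 AU, a representative Earth-Mars distance

# X-band radio (8.4 GHz) from a 4 m high-gain dish versus a 1064 nm laser
# from a 30 cm telescope; both apertures are assumed for illustration.
radio_m = beam_diameter(C / 8.4e9, 4.0, MARS_RANGE_M)
laser_m = beam_diameter(1064e-9, 0.30, MARS_RANGE_M)
print(f"radio footprint ~{radio_m/1e3:,.0f} km; laser footprint ~{laser_m/1e3:,.0f} km")
```

Even from a far smaller aperture, the laser beam arriving at Mars is more than a thousand times narrower than the radio beam, which is why so much more of the transmitted power actually lands on the receiver.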

So watch developments like the recent experiment performed by the Japan Aerospace Exploration Agency (JAXA) with interest. The agency carried out a successful optical test using laser beams between its ‘Kirari’ satellite (also known as the Optical Inter-orbit Communication Engineering Test Satellite) and a mobile ground station in Germany. The downlink occurred with the satellite at about 600 kilometers altitude and lasted for three minutes.

We have much to do to iron out a laser communications infrastructure, but demonstrating communications with a mobile station on Earth points to a newfound flexibility in these operations. Lasers will give us data rates a hundred times faster than current radio systems, and will offer mission planners the ability to pack more and more high-resolution tools onto their vehicles for uses such as synthetic aperture radar and hyper-spectral imaging that are far more demanding than photographs. And someday, lasers will carry data from our first dedicated interstellar probes as they close on nearby stars.