Our Best View of Europa

by Paul Gilster on November 25, 2014

Apropos of yesterday’s post questioning what missions would follow up the current wave of planetary exploration, the Jet Propulsion Laboratory has released a new view of Jupiter’s intriguing moon Europa. The image, shown below, looks familiar because a version of it was published in 2001, though at lower resolution and with considerable color enhancement. The new mosaic gives us the largest portion of the moon’s surface at the highest resolution, and without the color enhancement, so that it approximates what the human eye would see.

The mosaic of images that went into this view was put together in the late 1990s using imagery from the Galileo spacecraft, which again makes me thankful for Galileo, a mission that succeeded despite all its high-gain antenna problems, and eager for renewed data from this moon. The original data for the mosaic were acquired by the Galileo Solid-State Imaging experiment on two different orbits through the system of Jovian moons, the first in 1995, the second in 1998.

NASA is also offering a new video explaining why the interesting fracture features merit investigation, given the evidence for a salty subsurface ocean and the potential for at least simple forms of life within. It’s a vivid reminder of why Europa is a priority target.

Image (click to enlarge): The puzzling, fascinating surface of Jupiter’s icy moon Europa looms large in this newly-reprocessed color view, made from images taken by NASA’s Galileo spacecraft in the late 1990s. This is the color view of Europa from Galileo that shows the largest portion of the moon’s surface at the highest resolution. Credit: NASA/Jet Propulsion Laboratory.

Areas that appear blue or white are thought to be relatively pure water ice, with the polar regions (left and right in the image — north is to the right) bluer than the more nearly white equatorial latitudes. This JPL news release notes that the variation is thought to be due to differences in ice grain size in the two areas. The long cracks and ridges on the surface are interrupted by regions of disrupted terrain, where the crust has been broken apart and re-frozen. Just what do the reddish-brown fractures and markings have to tell us about the chemistry of the Europan ocean, and the possibility of materials cycling between that ocean and the ice shell?

Rosetta: Building Momentum for Deep Space?

by Paul Gilster on November 24, 2014

Even though its arrival on the surface of comet 67P/Churyumov-Gerasimenko did not go as planned, the accomplishment of the Rosetta mission is immense. We have a lander on the surface that was able to collect 57 hours’ worth of data before going into hibernation, and a mother ship that will stay with the comet as it moves ever closer to the Sun (the comet reaches its closest approach to the Sun on August 13 of next year).

What a shame the lander’s ‘docking’ system, involving reverse thrusters and harpoons to fasten it to the surface, malfunctioned, leaving it to bounce twice before coming to rest with its solar panels largely shaded. But we do know that the Philae lander was able to detect organic molecules on the cometary surface, with analysis of the spectra and identification of the molecules said to be continuing. The comet appears to be composed of water ice covered in a thin layer of dust. There is some possibility the lander will revive as the comet moves closer to the Sun, according to Stephan Ulamec (DLR German Aerospace Center), the mission’s Philae Lander Manager, and we can look forward to reams of data from the still-functioning Rosetta orbiter.

What an audacious and inspiring mission this first soft landing on a comet has been. Congratulations to all involved at the European Space Agency as we look forward to continuing data return as late as December 2015, four months after the comet’s closest approach to the Sun.

Image: The travels of the Philae lander as it rebounds from its touchdown on Comet 67P/Churyumov-Gerasimenko. Credit: ESA/Rosetta/Philae/ROLIS/DLR.

A Wave of Discoveries Pending

Rosetta used gravitational assists around both Earth and Mars to make its way to the target, hibernating for two and a half years to conserve power during the long journey. Now we wait for the wake-up call to another distant probe, New Horizons, as it comes out of hibernation for the last time on December 6. Since its January 2006 launch, the Pluto-bound spacecraft has spent 1,873 days in hibernation, fully two-thirds of its flight time, in eighteen hibernation periods ranging from 36 days to 202 days. Hibernation reduces wear on the spacecraft’s electronics and frees up an overloaded Deep Space Network for other missions.

When New Horizons transmits a confirmation that it is again in active mode, the signal will take four hours and 25 minutes to reach controllers on Earth, at a time when the spacecraft will be more than 2.9 billion miles from the Earth, and less than twice the Earth-Sun distance from Pluto/Charon. According to the latest report from the New Horizons team, direct observations of the target begin on January 15, with closest approach on July 14.
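
The quoted light time is easy to sanity-check: the delay is just the distance divided by the speed of light. A minimal sketch (my own arithmetic, not the mission team’s):

```python
LIGHT_SPEED_MILES_S = 186_282  # speed of light in vacuum, miles per second

distance_miles = 2.9e9  # "more than 2.9 billion miles"
seconds = distance_miles / LIGHT_SPEED_MILES_S
hours = int(seconds // 3600)
minutes = int(seconds % 3600 // 60)
print(f"one-way light time: {hours} h {minutes} min")  # ~4 h 19 min
```

At exactly 2.9 billion miles the delay works out to about 4 hours 19 minutes; the quoted four hours and 25 minutes corresponds to a distance a couple of percent greater, consistent with the ‘more than’ above.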

Nor is exploration slowing down in the asteroid belt, with the Dawn mission on its way to Ceres. Arrival is scheduled for March of 2015. Eleven scientific papers were published last week in the journal Icarus, including a series of high-resolution geological maps of Vesta, which the spacecraft visited between July of 2011 and September of 2012.

Image (click to enlarge): This high-resolution geological map of Vesta is derived from Dawn spacecraft data. Brown colors represent the oldest, most heavily cratered surface. Purple colors in the north and light blue represent terrains modified by the Veneneia and Rheasilvia impacts, respectively. Light purples and dark blue colors below the equator represent the interior of the Rheasilvia and Veneneia basins. Greens and yellows represent relatively young landslides or other downhill movement and crater impact materials, respectively. This map unifies 15 individual quadrangle maps published this week in a special issue of Icarus. Credit: NASA/JPL.

Geological mapping develops the history of the surface from analysis of factors like topography, color and brightness, a process that took two and a half years to complete. We learn that several large impacts, particularly the Veneneia and Rheasilvia impacts in Vesta’s early history and the much later Marcia impact, have been transformative in the development of the small world. Panchromatic images and seven bands of color-filtered images from the spacecraft’s framing camera, provided by the Max Planck Society and the German Aerospace Center, helped to create topographic models of the surface that could be used to interpret Vesta’s geology. Crater statistics fill out the timescale as scientists date the surface.

With a comet under active investigation, an asteroid thoroughly mapped, a spacecraft on its way to the largest object in the asteroid belt, and an outer system encounter coming up for mid-summer of 2015, we’re living in an exciting time for planetary discovery. But we need to keep looking ahead. What follows New Horizons to the edge of the Solar System and beyond? What assets should we be hoping to position around Jupiter’s compelling moons? Is a sample-return mission through the geysers of Enceladus feasible, and what about Titan? Let’s hope Rosetta and upcoming events help us build momentum for following up our current wave of deep space exploration.

Slingshot to the Stars

by Paul Gilster on November 21, 2014

Back in the 1970s, Peter Glaser patented a solar power satellite that would supply energy from space to the Earth, one involving space platforms whose cost was one of many issues that put the brakes on the idea, although NASA did revisit the concept in the 1980s and ’90s. But changing technologies may help us make space-based power more manageable, as John Mankins (Artemis Innovations) told his audience at the Tennessee Valley Interstellar Workshop.

What Mankins has in mind is SPS-ALPHA (Solar Power Satellite by means of Arbitrarily Large Phased Array), a system of his devising that uses modular and reconfigurable components to create large space systems, much as ants and bees build elegant and long-lived colonies on Earth. The goal is to harvest sunlight using thin-film reflector surfaces as part of an ambitious roadmap for solar power. Starting small — using small satellites and beginning with propulsion stabilization modules — we begin scaling up, one step at a time, to full-sized solar power installations. The energy harvested is beamed to a receiver on the ground.

Image: An artist’s impression of SPS-ALPHA at work. Credit: John Mankins.

All this is quite a change from space-based solar power concepts from earlier decades, which demanded orbital factories to construct and later maintain the huge platforms needed to harvest sunlight. But since the late 1990s, intelligent modular systems have come to the fore as the tools of choice. Self-assembly involving modular 10 kg units possessed of their own artificial intelligence, Mankins believes, will one day allow us to create structures of enormous size that can essentially maintain themselves. Thin-film mirrors to collect sunlight keep the mass down, as does the use of carbon nanotubes in composite structures.

There is no question that we need the energy if we’re thinking in terms of interstellar missions, though some would argue that fusion may eventually resolve the problem (I’m as dubious as ever on that idea). Mankins harked back to the Daedalus design, estimating its cost at $4 trillion and noting that it would require an in-space infrastructure of huge complexity. Likewise Starwisp, a Robert Forward beamed-sail design, which would need to power up beamers in close solar orbit to impart energy to the spacecraft. Distance and time translate into energy and power.

Growing out of the vast resources of space-based solar power is a Mankins idea called Star Sling, in which SPS-ALPHA feeds power to a huge maglev ring as a future starship accelerates. Unlike a fusion engine or a sail, the Star Sling allows acceleration times of weeks, months or even years, its primary limitation being the tensile strength of the material in the radial acceleration direction (a fraction of what would be needed in a space elevator, Mankins argues). The goal is not a single starship but a stream of 50 or 100 one to ten ton objects sent one after another to the same star, elements that could coalesce and self-assemble into a larger starship along the way.

Like SPS-ALPHA itself, Star Sling also scales up, beginning with an inner Solar System launcher that helps us build the infrastructure we’ll need. Also like SPS-ALPHA, a Star Sling can ultimately become self-sustaining, Mankins believes, perhaps within the century:

“As systems grow, they become more capable. Consider this a living mechanism, insect-class intelligences that recycle materials and print new versions of themselves as needed. The analog is a coral atoll in the South Pacific. Our systems are immortal as we hope our species will be.”

All of this draws from a 2011-2012 Phase 1 project for the NASA Innovative Advanced Concepts program on SPS-ALPHA, one that envisions “…the construction of huge platforms from tens of thousands of small elements that can deliver remotely and affordably 10s to 1000s of megawatts using wireless power transmission to markets on Earth and missions in space.” The NIAC report is available here. SPS-ALPHA is developed in much greater detail in Mankins’ book The Case for Space Solar Power.

Ultra-Lightweight Probes to the Stars

Knowing of John Rather’s background in interstellar technologies (he examined Robert Forward’s beamed sail concepts in important work in the 1970s, and has worked with laser ideas for travel and interstellar beacons in later papers), I was anxious to hear his current thoughts on deep space missions. I won’t go into the details of Rather’s long and highly productive career at Oak Ridge, Lawrence Livermore and the NRAO, but you can find a synopsis here, where you’ll also see how active this kind and energetic scientist remains.

Like Mankins, Rather (Rather Creative Innovations Group) is interested in structures that can build and sustain themselves. He invoked self-replicating von Neumann machines as a way we might work close to the Sun while building the laser installations needed for beamed sails. But of course self-replication plays out across the whole spectrum of space-based infrastructure. As Rather noted:

“Tiny von Neumann machines can beget giant projects. Our first generation projects can include asteroid capture and industrialization, giving us the materials to construct lunar colonies and expand to Mars and the Jovian satellites. We can see some of the implementing technologies now in the form of MEMS – micro electro-mechanical systems – along with 3D printers. As we continue to explore tiny devices that build subsequent machines, we can look toward expanding from colonization of our own Solar System into the problems of interstellar transfer.”

Building our system infrastructure requires cheap access to space. Rather’s concept is called StarTram, an electromagnetic accelerator that can launch unmanned payloads at Mach 10 (pulling 30 g’s at launch). The key here is to drop launch costs from roughly $20,000 per kilogram to $100 per kilogram. Using these methods, we can turn our attention to asteroid materials that can, via self-replicating von Neumann technologies, build solar concentrators, lightsails and enormous telescope apertures (imagine a Forward-class lens 1,000 meters in radius). 100-meter solar concentrators could change asteroid orbits for subsequent mining.

This is an expansive vision that comprises a blueprint for an eventual interstellar crossing. With reference to John Mankins’ Star Sling, Rather mused that a superconducting magnetically inflated cable 50,000 kilometers in radius could be spun around the Earth, allowing the kind of solar power concentrator just described to power up the launcher. Taking its time to accelerate, such a ring could launch a 30 kg payload to the stars, a lightweight probe reaching three percent of lightspeed within 300 days. The macro-engineering envisioned by Robert Forward still lives, to judge from both Rather’s and Mankins’ presentations, transformed by what may one day be our ability to create the largest of structures from tiny self-replicating machines.
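
Some rough numbers show why Mankins and Rather both point to tensile strength in the radial direction as the limiting factor. A back-of-envelope sketch using the figures above (the arithmetic is mine, not the speakers’):

```python
C  = 299_792_458.0   # speed of light, m/s
G0 = 9.81            # standard gravity, m/s^2

v_final = 0.03 * C          # three percent of lightspeed
t_accel = 300 * 86_400.0    # 300 days, in seconds
r_ring  = 5.0e7             # 50,000 km ring radius, in meters

a_tangential  = v_final / t_accel     # gentle spin-up along the track
a_centripetal = v_final**2 / r_ring   # radial load at release speed

print(f"tangential: {a_tangential:.2f} m/s^2 ({a_tangential / G0:.3f} g)")
print(f"centripetal at 0.03c: {a_centripetal:.2e} m/s^2 "
      f"({a_centripetal / G0:,.0f} g)")
```

The spin-up itself is leisurely, about 0.035 g, but holding a payload on a 50,000-kilometer circle at three percent of lightspeed implies a radial acceleration in the neighborhood of 165,000 g, which is why the cable’s tensile strength dominates the design.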

The Solar Power Pipeline

Back when I was writing Centauri Dreams in 2004, I spent some time at Marshall Space Flight Center in Huntsville interviewing people like Les Johnson and Sandy Montgomery, who were both in the midst of the center’s work on advanced propulsion. A major player in the effort that brought us NanoSail-D, Sandy has been interstellar-minded all along, as I discovered the first time I talked to him. I had asked whether people would be willing to turn their back on everything they ever knew to embark on a journey to another star, and he reminded me of how many people had left their homes in our own history to voyage to and live at the other side of the world.

Image: Edward “Sandy” Montgomery, NanoSail-D payload manager at Marshall (in the red shirt), and Charlie Adams, NanoSail-D deputy payload manager, Gray Research, Huntsville, Ala., look on as Ron Burwell and Rocky Stephens, test engineers at Marshall, attach the NanoSail-D satellite to the vibration test table. In addition to characterizing the satellite’s structural dynamic behavior, a successful vibration test also verifies the structural integrity of the satellite and gauges how the satellite will endure the harsh launch environment. Credit: NASA/MSFC/D. Higginbotham.

We’re a long way from making such decisions, of course, but Montgomery’s interest in Robert Forward’s work has stayed active, and in Oak Ridge he described a way to power up a departing starship that didn’t have to rely on Forward’s 1000-kilometer Fresnel lens in the outer Solar System. Instead, Montgomery points to building a power collector in Mercury orbit that would use optical and spatial filtering to turn sunlight into a coherent light source and stream it out into the Solar System through a series of relays built out of lightweight gossamer structures.

Work the calculations as Montgomery has and you wind up with 23 relays between Earth orbit and the Sun, with more extending deeper into the Solar System. Sandy calls this a ‘solar power pipeline’ that would give us maximum power for a departing sailcraft. The relaying of coherent light has been demonstrated already in experiments conducted by the Department of Defense, in a collector and re-transmitter system developed by Boeing and the US Air Force. Although some loss occurs because of jitter and imperfect coatings, the concept is robust enough to warrant further study. I suspect Forward would have been eager to run the calculations on this idea.

Wrapping Up TVIW

Les Johnson closed the formal proceedings at TVIW late on the afternoon of the 11th, and that night held a public outreach session, where I gave a talk running through the evolution of interstellar propulsion concepts in the last sixty years. Following that was a panel with science fiction writers Sarah Hoyt, Tony Daniel, Baen Books’ Toni Weisskopf and Les Johnson on which I, a hapless non-fiction writer, was allowed to have a seat. A book signing after the event made for good conversations with a number of Centauri Dreams readers.

All told, this was an enthusiastic and energizing conference. I’m looking forward to TVIW 2016 in Chattanooga. What a pleasure to spend time with these people.

TVIW: From Wormholes to Orion

by Paul Gilster on November 20, 2014

People keep asking what I think about Christopher Nolan’s new film ‘Interstellar.’ The answer is that I haven’t seen it yet, but plan to early next week. Some of the attendees of the Tennessee Valley Interstellar Workshop were planning to see the film on the event’s third day, but I couldn’t stick around long enough to join them. I’ve already got Kip Thorne’s The Science of Interstellar queued up, but I don’t want to get into it before actually seeing the film. I’m hoping to get Larry Klaes, our resident film critic, to review Nolan’s work in these pages.

Through the Wormhole

Wormholes are familiar turf to Al Jackson, who spoke at TVIW on the development of our ideas on the subject in science and in fiction. Al’s background in general relativity is strong, and because I usually manage to get him aside for conversation at these events, I get to take advantage of his good humor by asking what must seem like simplistic questions that he always answers with clarity. Even so, I’ve asked both Al and Marc Millis to write up their talks in Oak Ridge, because both of them get into areas of physics that push beyond my skillset.

Al’s opening slide was what he described as a ‘traversable wormhole,’ and indeed it was, a shiny red apple with a wormhole on its face. What we really want to do, of course, is to connect two pieces of spacetime, an idea that has percolated down from Einstein’s General Relativity through Schwarzschild, Wheeler, Morris and Thorne. The science fiction precedents are rich, with a classic appearance in Robert Heinlein’s Starman Jones (1953), the best of his juveniles, in my opinion. Thus our hero Max explains how to get around the universe:

You can’t go faster than light, not in our space. If you do, you burst out of it. But if you do it where space is folded back and congruent, you pop right back into our space again but it’s a long way off. How far off depends on how it’s folded. And that depends on the mass in the space, in a complicated fashion that can’t be described in words but can be calculated.

I chuckled when Al showed this slide because the night before we had talked about Heinlein over a beer in the hotel bar and discovered our common admiration for Starman Jones, whose description of ‘astrogators’ — a profession I dearly wanted to achieve when I read this book as a boy — shows how important it is to be precisely where you need to be before you go “poking through anomalies that have been calculated but never tried.” Great read.

If natural wormholes exist, we do have at least one paper on how they might be located, a team effort from John Cramer, Robert Forward, Michael Morris, Matt Visser, Gregory Benford and Geoffrey Landis. As opposed to gravitational lensing, where the image of a distant galaxy has been magnified by the gravitational influence of an intervening galaxy, a wormhole should show a negative mass signature, which means that it defocuses light instead of focusing it.

Al described what an interesting signature this would be to look for. If the wormhole moves between the observer and another star, the light would suddenly defocus, but as it continues to cross in front of the star, a spike of light would occur. So there’s your wormhole detection: Two spikes of light with a dip in the middle, an anomalous and intriguing observation! It’s also one, I’ll hasten to add, that’s never been found. Maybe we can manufacture wormholes? Al described plucking a tiny wormhole from the quantum Planck foam, the math of which implies we’d have to be way up the Kardashev scale to pull off any such feat. For now, about the best we can manage is to keep our eyes open for that astronomical signature, which would at least indicate wormholes actually exist. The paper cited above, by the way, is “Natural Wormholes as Gravitational Lenses,” Physical Review D (March 15, 1995): pp. 3124–27.
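
The signature Al sketched can be reproduced from the lensing result in the Cramer et al. paper: a point lens of negative mass produces no images at all when the line of sight passes within twice the Einstein angle (the dip), and the magnification diverges on either side of that boundary (the two spikes). A minimal numerical sketch of the light curve (the formula is from the cited paper; the trajectory and sampling are my own illustrative choices):

```python
import numpy as np

def negative_mass_magnification(u):
    """Total magnification of a point negative-mass lens versus the
    dimensionless impact parameter u (Cramer et al. 1995). No images
    form for u < 2, so the magnification there is zero."""
    u = np.asarray(u, dtype=float)
    A = np.zeros_like(u)
    outside = u > 2.0
    uo = u[outside]
    A[outside] = (uo**2 - 2.0) / (uo * np.sqrt(uo**2 - 4.0))
    return A

# Background star tracking behind the lens, closest approach u0 = 1
# (inside the u = 2 "shadow"); t in Einstein-radius crossing times.
t = np.linspace(-4.0, 4.0, 17)
u = np.hypot(1.0, t)
for ti, Ai in zip(t, negative_mass_magnification(u)):
    print(f"t = {ti:+.1f}   A = {Ai:4.2f}")
```

Printed against time, the magnification sits near 1 far from alignment, spikes as u approaches 2, and drops to zero in the middle: exactly the two-spikes-and-a-dip profile described above.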

Enter the Space Drive

To dig into wormholes, the new Thorne book would probably be a good starter, though I base this only on reviews, as I haven’t gotten into it yet. Frontiers of Propulsion Science (2009) also offers a look into the previous scholarship on wormhole physics and if you really want to dig deep, there’s Matt Visser’s Lorentzian Wormholes: From Einstein to Hawking (American Institute of Physics, 1996). I wanted to talk wormholes with Marc Millis, who co-edited the Frontiers of Propulsion Science book with Eric Davis, but the tight schedule in Oak Ridge and Marc’s need to return to Ohio forced a delay.

In any event, Millis has been working on space drives rather than wormholes, the former being ways of moving a spacecraft without rockets or sails. Is it possible to make something move without expelling any reaction mass (rockets) or in some way beaming momentum to it (lightsails)? We don’t know, but the topic gets us into the subject of inertial frames — frames of reference defined by the fact that the law of inertia holds within them, so that objects observed from this frame will resist changes to their velocity. Juggling balls on a train moving at a constant speed (and absent visual or sound cues), you could not determine whether the train was in motion or parked. The constant-velocity train is considered an inertial frame of reference.

Within the inertial frame, in other words, Newton’s laws of motion hold. An accelerating frame of reference is considered a non-inertial frame because the law of inertia is not maintained in it. If the conductor pulls the emergency brake on the train, you are pushed forward suddenly in this decelerating frame of reference. From the standpoint of the ground (an inertial frame), you aboard the train simply continue with your forward motion when the brake is applied.
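
The standard textbook way to state this (basic mechanics, not anything specific to Millis’ slides) is that transforming Newton’s second law into a frame accelerating at $\mathbf{a}_{\text{frame}}$ introduces a pseudo-force term:

```latex
m\,\ddot{\mathbf{x}} = \mathbf{F}
\quad \text{(inertial frame)}
\qquad \longrightarrow \qquad
m\,\ddot{\mathbf{x}}' = \mathbf{F} - m\,\mathbf{a}_{\text{frame}}
\quad \text{(accelerating frame)}
```

The extra −m a_frame term is the ‘push’ you feel when the brake is pulled; it vanishes in any frame moving at constant velocity, which is precisely what makes such frames inertial.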

We have no good answers on what causes an inertial frame to exist, an area where unsolved physics regarding the coupling of gravitation and inertia to other fundamental forces leaves open the possibility that one could be used to manipulate the other. We’re at the early stages of such investigations, asking whether an inertial frame is an intrinsic property of space itself, or whether it somehow involves, as Ernst Mach believed, a relationship with all matter in the universe. That leaves us in the domain of thought experiments, which Millis illustrated in a series of slides that I hope he will discuss further in an article here.

Fusion’s Interstellar Prospects

Rob Swinney, who is the head of Project Icarus, used his time at TVIW to look at a subject that would seem to be far less theoretical than wormholes and space drives, but which still has defeated our best efforts at making it happen. The subject is fusion and how to drive a starship with it. The Daedalus design of the 1970s was based on inertial confinement fusion, using electron beams to ignite fusion in fuel pellets of deuterium and helium-3. Icarus is the ongoing attempt to re-think that early Daedalus work in light of advances in technology since.

But like Daedalus, Icarus will need to use fusion to push the starship to interstellar speeds. Robert Freeland and Andreas Hein, both active players in Icarus, were also in Oak Ridge, and although Andreas was involved with a different topic entirely (see yesterday’s post), Robert was able to update us on the current status of the Icarus work. He illustrated one possibility using Z-pinch methods that can confine a plasma in order to heat it to fusion conditions.

Three designs are still in play at Icarus, with the Z-pinch version (Freeland coined it ‘Firefly’ because of the intense glow of waste heat that would be generated) relying on the same Z-pinch phenomenon we see in lightning. The trick with Z-pinch is to get the plasma moving fast enough to create a pinch that is free of hydrodynamic instabilities, but Icarus is tracking ongoing work at the University of Washington on the matter. As to fuel, the team has abandoned deuterium/helium-3 in favor of deuterium/deuterium fusion, a choice that must flow from the problem of obtaining the helium-3, which Daedalus assumed would be mined at Jupiter.

Freeland described the Firefly design as having an exhaust velocity of 10,000 kilometers per second, with a 25-year acceleration period to reach cruise speed. The cost: $35 billion a year spread out over 15 years. I noted in Rob Swinney’s talk that the Icarus team is also designing interstellar precursor missions, with the idea of building a roadmap. All told, 35,000 hours of volunteer research are expected to go into this project (I believe Daedalus was 10,000), with the goal of not just reaching another star but decelerating at the target to allow close study.
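
Those figures hint at why exhaust velocity dominates fusion starship design. A quick Tsiolkovsky-equation sketch (the cruise speed here is my illustrative assumption, not a number from the talk):

```python
import math

v_exhaust = 10_000.0   # km/s, the Firefly figure quoted above
v_cruise  = 13_500.0   # km/s, about 4.5% of c, assumed for illustration

# Accelerating to cruise and then decelerating at the target
# doubles the total delta-v the ship must supply.
delta_v = 2.0 * v_cruise
mass_ratio = math.exp(delta_v / v_exhaust)  # Tsiolkovsky rocket equation
print(f"mass ratio (wet/dry): {mass_ratio:.1f}")  # ~14.9
```

Even at a 10,000 km/s exhaust velocity, a decelerated mission demands roughly fifteen times more propellant than ship; halve the exhaust velocity and the ratio balloons past two hundred, which is why the choice of fusion reaction matters so much.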

Image: Artist’s conception of Icarus Pathfinder. Credit: Adrian Mann.

Let me also mention a design from the past that antedates Daedalus, which was begun in 1973. Brent Ziarnick is a major in the US Air Force who described the ARPA-funded work on nuclear pulse propulsion that grew into Orion, with work at General Atomics from 1958 to 1965. Orion was designed around the idea of setting off nuclear charges behind the spacecraft, which would be protected by an ablation shield and a shock absorber system to cushion the blasts.

We’ve discussed Orion often in these pages as a project that might have opened up the outer Solar System, and conceivably produced an interstellar prototype if Freeman Dyson’s 1968 paper on a long-haul Orion driven by fusion charges had been followed up. Ziarnick’s fascinating talk explained how the military had viewed Orion. Think of an enormous ‘battleship’ of a spacecraft that could house a nuclear deterrent in a place that Soviet weaponry couldn’t reach. At least, that was how some saw the Cold War possibilities in the early years of the 1960s.

The military was at this time looking at stretch goals that went way beyond the current state of the art in Project Mercury, and had considered systems like Dyna-Soar, an early spaceplane design. With a variety of manned space ideas in motion and nuclear thermal rocket engines under investigation, a strategic space base that would be invulnerable to a first strike won support all the way up the command chain to Thomas Power at the Strategic Air Command and Curtis LeMay, who was then Chief of Staff of the USAF. Ziarnick followed Orion’s budget fortunes as it ran into opposition from Robert McNamara and ultimately Harold Brown, who worked under McNamara as director of defense research and engineering from 1961 to 1965.

Orion would eventually be derailed by the Limited Test Ban Treaty of 1963, which banned nuclear detonations in the atmosphere and in space, but the idea still has its proponents as a way of pushing huge payloads to deep space. Ziarnick called Orion ‘Starfleet Deferred’ rather than ‘Starflight Denied,’ and noted the possibility of renewed testing of pulse propulsion without nuclear pulse units. The military lesson from Orion:

“The military is not against high tech and will support interstellar research if they can find a defense reason to justify it. We learn from Orion that junior officers can convince senior leaders, that operational commanders like revolutionary tech. Budget hawks distrust revolutionary tech. Interstellar development will be decided by political, international, defense and other concerns.”

Several other novel propulsion ideas, as well as a book signing event, will wrap up my coverage of the Tennessee Valley Interstellar Workshop tomorrow.

Building Large Structures in Space

by Paul Gilster on November 19, 2014

One thing the Tennessee Valley Interstellar Workshop did not offer was a lot of spare time. Les Johnson told attendees at the beginning that we would be working straight through. Between presentations and workshop sessions, that was pretty much the case, with no break at all between an 8:00 start and lunch, and afternoon sessions punctuated by breakout workshop sessions on four topics: communications and SETI; biology in small ecosystems; safety issues for interstellar missions; and a competition to reverse-engineer famous starships from science fiction literature. I finished up the after-dinner workshop session around 9:30 that first night.

An Encounter with ‘Dr. SETI’

It was a pleasure to finally meet the SETI League’s Paul Shuch in Oak Ridge. Paul and I have exchanged email for some time now, mostly about material we might use on our respective sites, and I’ve long admired the engineering and leadership skills he brings to a SETI all-sky survey that coordinates the efforts of 127 small receiving stations around the world. If you’re not aware of his Searching for Extraterrestrial Intelligence (Springer, 2011), realize that it contains 26 essays not only from some of SETI’s biggest names but also from science fiction writers like Stephen Baxter and David Brin, together encapsulating the key issues of the field.

Image: The SETI League’s Paul Shuch (center) receiving a copy of Interstellar Migration and the Human Experience from Robert Kennedy (left) and Sam Lightfoot.

Introduced by David Fields of Tamke-Allen Observatory at nearby Roane State Community College, Shuch ran through a synopsis of SETI’s history at the conference. He lingered over the beginnings of radio astronomy, when Karl Jansky tried to chase down the source of the interference that Bell Telephone Laboratories was picking up in trans-Atlantic communications (he was detecting the galactic center), and also pointed to Grote Reber, the Illinois ham radio operator who was, back in the 1930s and for almost a decade, the only practicing radio astronomer in the world. Paul’s discussion of the Ohio State ‘Wow!’ signal, logged by a surprised Jerry Ehman in 1977, reminded me how much the enigmatic reception still haunts us.

A chance reception of an extraterrestrial beacon? If so, it was one that only swept past our planet, for the Wow! signal was never detected again despite later efforts, and we’ll surely never know its true origin. Nor can we jump to conclusions, remembering Frank Drake’s first strong Project Ozma signal. It seemed to come from Epsilon Eridani (could SETI be this easy, Drake wondered?), but as Shuch explained, it turned out to be the reception of a U-2 spy plane, then highly classified but about to become public news after the shootdown of Francis Gary Powers.

Decades ago I wrote up a few articles for the journal of the Society of Amateur Radio Astronomers (SARA), a non-engineer addressing an audience of crack equipment makers and instrument tweakers. I hate to think how many mistakes I made in my analysis of the Drake Equation back then, but it’s a pleasure to recall those days considering Paul’s advocacy of SARA and the recent loss of Mike Gingell, a good friend and SARA member who had known Paul from meetings at Green Bank (where Project Ozma was born) and had a substantial backyard antenna array of his own.

The Beamed Sail and Survival

Jim Benford’s work on beamed sails continues under the aegis of Project Forward, an attempt to characterize and advance the science on strategies to get a sail up to interstellar velocities. Deployment of large sails is always an issue, but spin deployment has been demonstrated in the laboratory, with sails unfolding from their tight packages like Japanese origami. The big capital cost goes into the transmitters — a phased array left behind in the Solar System — and into operating the beamer; by contrast, each individual sail is relatively inexpensive to build. Build the infrastructure and you can launch many missions.

These are going to be installations that will require considerable expertise at handling large construction projects in space. In Benford’s words:

“We are going to have to learn the art of building very large arrays, and we’re going to have to build them in space. We know how to build the transmitters, but the structures will be on the scale of hundreds of kilometers, creating the same issues we will face in space-based solar power projects. Construction in space will invariably be managed by robots. Early work in that direction can be seen in SpiderFab, a robotic construction idea being studied by NASA.”

This wasn’t the only time SpiderFab came up at Oak Ridge, as we’ll see in a moment. The idea, championed by Robert Forward’s company Tethers Unlimited, would use 3D printing methods to build the needed systems without further human intervention. Robert Hoyt, a co-founder of Tethers Unlimited, describes SpiderFab as combining “… the techniques of fused deposition modeling (FDM) with methods derived from automated composite layup to enable rapid construction of very large, very high-strength-per-mass, lattice-like structures combining both compressive and tensile elements.” What you wind up with is a way to incorporate high-strength materials into system components like antennas that are optimized for space conditions.

Beamer pointing accuracy will have to be extreme, presenting a major challenge for sailship missions. The accuracy needed, Benford said, is micro-radian for early work, more or less the state of the art in the phased arrays used by the military. Taking a sailcraft all the way to the Oort Cloud would require accuracy at the nano-radian level, and would push into pico-radians when we’re talking about actual interstellar missions. A key here is that as the pointing accuracy of the array decreases, the acceleration on the sail has to increase, because the sail will not be able to stay under a tightly focused beam as long as it would with a more precise array.
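
The scaling behind these requirements is simple: a pointing error of θ radians displaces the beam by roughly θ times the range. A minimal sketch (the distances and error levels paired below are my own illustrative choices):

```python
AU = 1.496e11  # astronomical unit, meters

cases = [
    ("inner system, 1 AU",        1.0 * AU,   1e-6),   # micro-radian
    ("Oort Cloud, 10,000 AU",     1.0e4 * AU, 1e-9),   # nano-radian
    ("interstellar, ~250,000 AU", 2.5e5 * AU, 1e-12),  # pico-radian
]

for name, distance_m, theta_rad in cases:
    wander_km = theta_rad * distance_m / 1e3
    print(f"{name}: {theta_rad:.0e} rad -> beam wanders ~{wander_km:,.0f} km")
```

Even at these accuracies the beam wanders by tens to thousands of kilometers at the target, which is why lower pointing precision forces higher accelerations: the sail must reach cruise speed before it drifts out of the beam.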

Image: Dinner with the interstellar crowd after the first day’s last plenary session. That’s Jim Benford at far left, then James Early, Sandy Montgomery and Michael Lynch.

What a pleasure to spend time at dinner not only with Benford but James Early, who has written up (with Richard London) ideas on how sails will interact with the interstellar medium. The researchers worked with a beryllium sail as a reference point and studied the effect of local interstellar dust on both sail and payload. In this study, dust grains and atoms of interstellar gas actually pass through the thin sail materials with little loss of energy, creating scant damage.

Moreover, sails turn out to offer a way of protecting the interstellar vehicle, because the deployed thin foil converts incoming dust grains or neutral gas atoms into free electrons and ions. “These charged particles,” the authors write, “can then be easily deflected away from the vehicle with electrostatic shields.” I wrote these ideas up in a 2012 essay called Lightsails: Safe Passage After All?, but I want to catch up with Early to see whether he’s done further work on the matter. The original paper is “Dust Grain Damage to Interstellar Laser-Pushed Lightsail,” Journal of Spacecraft and Rockets, July-Aug. 2000, Vol. 37, No. 4, pp. 526-531.

Building on Complexity

SpiderFab certainly has its advocates, as do any ideas that advance the notion of building deep space structures on large scales. Andreas Hein, who is not only deputy director of the Initiative for Interstellar Studies but also the head of Project Hyperion for Icarus Interstellar, has been asking whether current trends in engineering — and SpiderFab is certainly indicative of one of these — point to a future where even highly complex products can be produced in a fraction of the time they currently demand. Where do projects like SpiderFab ultimately take us?

Image: SpiderFab combines techniques evolved from terrestrial additive manufacturing and composite layup with robotic assembly to enable on-orbit construction of large spacecraft components optimized for the zero-g environment. Credit: Tethers Unlimited/NASA.

The factors in play are numerous and include the advent of mass customization, highly flexible production lines, additive manufacturing (3D printing) and artificial intelligence in the factory. As computation takes over more and more cognitive tasks, we are exploring new domains in design that may one day allow us to automate a function we always considered purely human: our innate creativity. As engineers work with ever higher-level building blocks, Hein noted, productivity rises and the technological ecosystem becomes increasingly commoditized.

“A smartphone is today’s case in point,” Hein said. “We have the computing capability to make it the basis of a CubeSat surrounded by the added supporting structure, a satellite that can be created for approximately $3500. Mass produced technology opens up opportunities like this. Additive manufacturing, ubiquitous computing, the ‘Internet of things’ and artificial intelligence are all relevant for a future in which we will create complex systems in space on demand.”

It’s an optimistic trend that when extrapolated to 2060, as Hein did, gives us the possibility of serious deep space missions funded by private capital, assuming the continued growth not only of engineering productivity but of wealth. Whether or not these trends cooperate is another matter, for we can’t anticipate social or economic upheaval that can break our best projections. But taking a shot at a perceived future is a way to provoke scientific thought, not to mention stimulating the minds of science fiction authors, of which there were several in Oak Ridge. It will be interesting to see what stories may spin out of the ideas they heard described at TVIW.

TVIW: Caveats for Long-Duration Missions

by Paul Gilster on November 18, 2014

When he opened the Tennessee Valley Interstellar Workshop in Oak Ridge last week, Les Johnson told the audience that sessions would begin and end on time. Punctuality is a trait that I assume works well in Johnson’s day job at Marshall Space Flight Center, and it certainly was appreciated in Oak Ridge, where the delays and overruns that mar so many conferences just didn’t occur. That kept the pace brisk and the presenters solidly on topic throughout.

That sense of pace and direction is making TVIW into one of my favorite gatherings. Today I’m going to run through some of the presentations from the first day, beginning with the multidisciplinary note with which I closed yesterday’s post. What we gain by keeping a wide range of backgrounds in play among the presenters is a chance to spot hidden assumptions, some of which can prove deadly when not properly evaluated. Monday’s TVIW talks helped clarify what we’ve learned about the human presence in space and just how much we have yet to determine.

Image: Les Johnson calls the first session into order in Oak Ridge.

Problems of Survival in Deep Space

Biologist Robert Hampson (Wake Forest School of Medicine) was a familiar face when he took the podium on Monday morning, having appeared at the last TVIW in Huntsville. What Dr. Hampson brings to the table is a rich background in the neurosciences that includes research into cognition, behavior and learning.

All of these come into play when we’re talking about the issues astronauts will face when dealing with long-duration spaceflight. In Huntsville, Hampson had outlined our need for a biomedical research laboratory in weightless conditions, so that we could do the kind of detailed research into artificial gravity that we need before we can think about how to provide it on a mission. The Oak Ridge talk followed up on the idea, explaining the need for a livable habitat where access to vacuum and solar radiation is readily available. A further option would be to place it outside Earth’s magnetosphere to study radiation in that environment and how to mitigate it.

We tend to shrug off the gravity problem by assuming that we can create a rotating habitat, but the ‘tin cans on a string’ notion — two segments joined by a tether — leaves unanswered the question of how long the tether should be and how fast the rotation. The speed of rotation turns out to be critical because while the vestibular system can adapt to constant linear velocity, rotation is perceived as acceleration. In a spinning habitat, a sudden head turn can bring on vertigo.

Moreover, all the work we’ve done in zero-g aboard vehicles like the International Space Station has led only to marginal results. Microgravity causes physiological changes that can range from loss of calcium to fluid retention to a reduction in muscle mass and a decrease in the volume and pumping capacity of the heart. Only gravity has the ability to resolve these problems, which is why we need the space lab to explore what forms artificial gravity can take. Hampson said that if astronauts took an extended zero-g mission to Mars, they might be unable to function upon arrival because of Mars’ own gravity, even though it is a paltry 38 percent of that found on Earth.

The lab design resulting from Hampson’s research would allow research subjects and scientists to live in an eight-deck space divided into two four-deck structures connected by a tether, an installation containing both a human and an animal lab, with each of the two segments providing about 1000 square feet of research space. Another significant issue for study here: The degradation of memory, found on Earth in patients who have undergone radiation therapy for cancer, and likewise produced by an overdose of radiation in space. The ideal, then, would be to place the biomedical laboratory at the Earth-Moon L2 point outside the magnetosphere, where all these issues can be studied in the best environment for microbiological and biochemical tests.

Human Prospects on Mars

Oak Ridge National Laboratory’s Fred Sloop also delved into the question of gravity’s effects, noting the huge role that evolution under 1 g has played in the development of our physiology. We’re already talking about private colony missions to places like Mars, but we have to overcome the factors Hampson talked about as well as the embrittlement of bone in zero-g, which can cause as much bone loss for an astronaut in a single month as a menopausal woman loses in a year. Bone demineralization appears most strongly in the pelvis, said Sloop, and with loss of bone we get calcium phosphate released into the body, along with calcium oxalate.

The result: The formation of kidney stones. We also see that extended microgravity causes muscle atrophy, with muscle mass down to 70 percent of preflight after 270 days in orbit. Fluid shifts occur as bodily fluids distribute to the upper portion of the body, a shift that can involve cardiovascular changes and a decrease in blood volume as the red blood cell count drops. The injury potential upon re-entry is significant, for on a long-duration mission, the spine can lengthen more than 7 centimeters. Changes in cognition and mental imagery can impair function.

Sloop believes that despite mechanical countermeasures — MIT, for example, is studying a ‘skin suit’ that mimics a 1 g load on bone and muscle — the best recourse will be artificial gravity created by rotation. “We need to find out what the minimum gravity to retain physiological health really is,” Sloop added. “Is 1 g necessary, or can we get by with less? Mars gravity at .38 g may be sufficient for long-term colonists once they have arrived, but at this point we don’t really know.” In space, a nominal design for a 1 g habitat rotating at 4 rpm with a rotational radius of 56 meters may work, but will it ward off these ills over a 30-month mission to Mars?
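
That nominal design is easy to check, since spin ‘gravity’ is just centripetal acceleration, a = ω²r. A quick verification of the quoted figures, plus the same spin rate sized for Mars gravity (the 0.38 g case is my own extrapolation, not Sloop’s):

```python
import math

G0 = 9.81  # standard gravity, m/s^2

def spin_gravity_g(rpm, radius_m):
    """Centripetal acceleration a = omega^2 * r, expressed in g."""
    omega = rpm * 2.0 * math.pi / 60.0  # convert rpm to rad/s
    return omega**2 * radius_m / G0

print(f"4 rpm at 56 m: {spin_gravity_g(4, 56):.2f} g")  # ~1.00 g

# Radius needed for Mars-level gravity at the same 4 rpm spin:
omega = 4 * 2.0 * math.pi / 60.0
print(f"radius for 0.38 g: {0.38 * G0 / omega**2:.0f} m")  # ~21 m
```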

A Historical Perspective on Colonization

You can see why rushing a deep-space mission under the assumption that we have sufficient experience in nearby space would be a mistake. But the issues aren’t solely biological. Sam Lightfoot (South Georgia State College) tackled the assumptions we bring with us when we attempt to colonize new lands, as revealed in historical precedent. The first colony planted by the Spanish in the United States was not St. Augustine but an attempt in the barrier islands of Georgia by the conquistador Lucas Vazquez de Ayllon, who landed in the area in 1526.

Ayllon thought he had brought what he needed — after all, he had tools, livestock and weapons — but many of the tools proved unsuited to the environment. Ayllon’s horses did not adapt well to the humid, sandy islands, and European methods of farming failed. The colony’s maps were incomplete and inaccurate, water was in short supply and disease became rampant. Unwilling to exploit local food resources, the colonists held out for familiar European staples like wheat. Their housing disintegrated because they were using wattle and daub techniques suited to the dry climate of Spain.

Ayllon, whose colony had to be evacuated back to Havana, was one of a string of failures whose colonization efforts have been all but forgotten. Pánfilo de Narváez made even Ayllon’s attempt look good. Equally unprepared for the actual conditions he found, de Narváez took over 300 conquistadores with him, a group with few skills adapted to local conditions. Only four of his men would survive the colonization attempt, walking up the length of Florida and making their way somehow to Mexico City. In sharp contrast, Hernando de Soto was able to survive because he brought equipment suited to the terrain, along with flexibility in leadership.

The lessons are clear enough, and even more stark when we consider that the next wave of human colonization will be in an environment far more unyielding, and much more difficult to reach and resupply, than even the conquistadores had to contend with. I took away from these multidisciplinary sessions the need to question our most basic assumptions. Fred Sloop’s point about Mars’ gravity stands out: We don’t really know whether humans living at 0.38 g will be able to survive over the long haul. Such basic questions drive the need for research into areas we have found difficult to explore with the existing space infrastructure in low Earth orbit.

More tomorrow as I turn to issues not just of planetary but interstellar migration, looking at presentations that covered everything from beamed sails to ‘worldship’ habitats and the possibilities for space drives. Can we imagine a day when artificial intelligence and additive manufacturing produce the space infrastructure we need in decades rather than centuries? The Tennessee Valley Interstellar Workshop was an opportunity to talk about issues like these not only in the sessions but in informal dinner conversation. More about the proceedings tomorrow.

Going Interstellar at Oak Ridge

by Paul Gilster on November 17, 2014

When I was last in Oak Ridge, TN for the Tennessee Valley Interstellar Workshop in 2011, I arrived late in the evening and the fog was so thick that, although I had a map, I decided against trying to find Robert Kennedy’s house, where the pre-conference reception was being held. This year the fog held off until the first morning of the conference (it soon burned off even then), and I drove with Al Jackson out to the Kennedy residence, finding the quiet street surrounded by woods still lit with fall colors and the marvelous clean air of the Cumberland foothills.

A house full of interstellar-minded people makes for lively conversation almost anywhere you turn. I quickly met the SETI League’s Paul Shuch, with whom I’ve often corresponded but never spoken to in person, and our talk ranged over SETI’s history, the division into a targeted search and a broader survey (the latter is the SETI League’s bread and butter), and why looking for signals through a very narrow pipe (Arecibo) should be only one out of a spectrum of strategies.

Robert’s 15 cats were largely locked in a room somewhere, but four of them had been allowed to roam, along with a small, inquisitive dog. I spent time at the reception with Marc Millis, with Icarus Interstellar’s Robert Freeland (Andreas Hein was also at the conference, having flown over all the way from Germany, and so was Rob Swinney, who came in from Lincoln in the UK — Rob leads Project Icarus, the ongoing attempt to redesign the original Daedalus starship), and conference organizers David Fields, Les Johnson and Martha Knowles.

John Rather, a tall, friendly man, extended a hand, and I suddenly realized this was the John Rather who had done so much to analyze Robert Forward’s sail concepts in the 1970s, working under a contract with JPL. Later I would see a photo of him with Heinlein and felt the familiar surge of science fictional associations that many scientists bring to this work. I didn’t see Jim Early until the next night, when we had dinner at a table with Jim Benford and Al Jackson, but I have much to say about his paper on sails and the interstellar medium, which showed in 2000 that damage from deep space gas and dust should be minimal. I had covered this paper with Jim’s email help just two years ago and only later put Jim’s face together with the story.

And so it goes at events like this. You meet people with whom you’ve had correspondence and there is a slight mental block before you put them in the context of the work they have done. I would say that these blocks show I’m getting older, but the fact is that I’ve always been slow on the uptake. That’s why I find conferences so valuable: as soon as I make the needed connections, ideas start to sprout and connect with older materials I’ve written about. In any case, we may benefit here by getting some new material from several TVIW attendees, with whom I discussed writing up concepts from their presentations and the workshop sessions.

Image: TVIW 2014’s reception getting started at Robert Kennedy’s house in Oak Ridge.

Sara Seager: Of Exoplanets and Starshades

“Think about this. People growing up today have always had exoplanets in their lives.” So said Sara Seager, launching the conference after Les Johnson’s introduction on Monday the 10th. Based at MIT, Seager is a major figure in the exoplanet hunt whose work has earned plaudits in the scientific community and also in the popular press. To get to know not just the details of her work but also her character, I’d recommend Lee Billings’ Five Billion Years of Solitude (Current, 2014), which offers a fine look at the scientist and the challenges she has faced in terms of personal loss.

We always talk about habitable zones, Seager reminded the audience, because of our fascination with finding an Earth 2.0. But a habitable zone can be hard to predict, especially when you’re dealing with variable exoplanet atmospheres, a Seager specialty. In fact, habitability could turn out to be planet-specific.

“You can have planets with denser atmospheres that are much farther from the habitable zone than ours,” Seager said. “Molecular hydrogen is a greenhouse gas. It absorbs continuously across the wavelength spectrum and in some cases could extend the habitable zone out as far as 10 AU. You could also have planets closer to their star than Venus is to the Sun if the planet had less water vapor to begin with, and thus less absorbing power. The boundaries of what we call the habitable zone are controversial.”

Seager has written two textbooks, one of them — Exoplanet Atmospheres: Physical Processes (Princeton University Press, 2010) — specifically on how we characterize such atmospheres. Transmission spectroscopy helps us study a gaseous envelope even when the planet itself cannot be seen: we compare the star’s spectrum taken while the planet transits in front of it, when starlight filters through the planet’s atmosphere, with the spectrum taken when the planet is out of transit. Teasing out atmospheric molecules isn’t easy, but five exoplanet spectra have been studied in great detail using these methods, and according to Seager, several dozen more have been measured.

The problem here is that transits rely on the fortuitous alignment between planet and star, so that we observe the planet moving across the face of the star. But transits are hugely helpful nonetheless, and given the transit depth, small M-dwarf stars with a ‘super-Earth’ around them would be the easiest to work with — Seager calls these not Earth 2.0 but Earth 2.5. Missions like TESS (Transiting Exoplanet Survey Satellite) will home in on the closest 1000 M-dwarfs to look for transits. TESS launches in 2017 and Seager believes we might find such a world by the early 2020s, a tidally locked planet no more than tens, or hundreds, of light years from our Sun.
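
The geometry behind that preference is straightforward: to first order, a transit dims the star by the ratio of the two disk areas, so depth ≈ (Rp/Rs)². A quick comparison (the stellar and planetary sizes below are generic illustrative values, not Seager’s):

```python
R_EARTH_KM = 6_371.0
R_SUN_KM   = 695_700.0

def transit_depth_ppm(r_planet_earths, r_star_suns):
    """First-order transit depth (Rp/Rs)^2 in parts per million."""
    ratio = (r_planet_earths * R_EARTH_KM) / (r_star_suns * R_SUN_KM)
    return ratio**2 * 1e6

print(f"Earth transiting the Sun:             {transit_depth_ppm(1.0, 1.0):>6.0f} ppm")
print(f"1.5 R_E super-Earth, 0.2 R_Sun dwarf: {transit_depth_ppm(1.5, 0.2):>6.0f} ppm")
```

The same measurement is more than fifty times deeper around the small star, which is why ‘Earth 2.5’ — a super-Earth transiting a nearby M-dwarf — is the practical first target.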

To get around the transit alignment problem, NASA has long been studying starshade concepts, with a precisely shaped starshade flying tens of thousands of kilometers from a space-based telescope. Using such a configuration, we can start to overcome the problem of glare from the star masking the presence of a planet. Earth is ten billion times fainter than the Sun in visible light, but a properly shaped starshade can reduce the contrast, particularly in the infrared. Can the upcoming Wide-Field Infrared Survey Telescope (WFIRST), designed for wide-field imaging and spectroscopic surveys of the near-infrared sky, be adapted for starshade capability?

Seager thinks that it can, and gives the idea an 80 percent chance of happening. This will involve a guide camera and communications system for closed loop formation flying. That leaves us with a host of issues including deployment — Seager showed testing on small starshade segments — and propulsion — how do you move the starshade around as you change the alignment between shade and telescope to fix upon a new target? Re-targeting, Seager noted, takes time, and solar-electric propulsion may be one way to handle the propulsion requirement. Centauri Dreams regular Ashley Baldwin, who follows space telescope issues in great detail, will be writing up starshade concepts here in the near future.
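
The inner working angle (IWA) that appears in the image caption below is set by simple geometry: roughly the shade’s radius divided by its separation from the telescope. A rough sketch with numbers in the neighborhood of the probe-class studies (both values are my illustrative assumptions):

```python
import math

RAD_TO_MAS = 180.0 / math.pi * 3600.0 * 1000.0  # radians -> milliarcseconds

shade_radius_m = 17.0   # ~34 m diameter starshade, assumed
separation_m   = 3.7e7  # 37,000 km, "tens of thousands of kilometers"

iwa_mas = shade_radius_m / separation_m * RAD_TO_MAS
print(f"inner working angle: ~{iwa_mas:.0f} mas")  # ~95 mas

# For comparison, an Earth analog at 1 AU viewed from 10 parsecs
# sits 100 mas from its star, so this geometry just reaches it.
```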

Image: Schematic of the starshade-telescope system (not to scale). Starshade viewing geometry with inner working angle (IWA) independent of telescope size. Credit: Exo-S: Starshade Probe-Class Exoplanet Direct Imaging Mission Concept: Interim Report. For more on the most recent work on starshades, including this report, see exep.jpl.nasa.gov/stdt.

The Great Days of 1983

The Tennessee Valley Interstellar Workshop, now in its third iteration with a fourth planned for 2016 in Chattanooga, is beginning to remind me of a storied conference held in 1983. The Conference on Interstellar Migration was held at Los Alamos in May of that year. It was designed to be multidisciplinary and included practitioners of anthropology, sociology, physics and astronomy as the attendees engaged on issues of emerging technologies, historical migrations, and the future of our species. The proceedings volume, Interstellar Migration and the Human Experience, is a key text for those trying to place our interstellar ambitions in context.

TVIW has always had a bit of the multidisciplinary about it, as we’ll see tomorrow, when I talk about papers not only from the physics perspective (Jim Benford on beamed sail work), but biology (Robert Hampson), biochemistry (Fred Sloop) and anthropology (Sam Lightfoot). This conference did not have as striking a mix among disciplines as the Los Alamos conference, but I’ve appreciated that the organizers have continued to bring in perspectives from a variety of sciences, the result of which is usually a helpful cross-pollination of ideas. We’ll be looking at how some of these played out as this week continues with more of my report from Oak Ridge.



The Transition from Rocky to Non-Rocky Planets

by Paul Gilster on November 14, 2014

As I decompress from the Tennessee Valley Interstellar Workshop (and review my notes for next week’s report), I have the pleasure of bringing you Andrew LePage’s incisive essay on a key exoplanet question. Are some of the planets now considered potentially habitable actually unlikely to support life? Recent work gives us some hard numbers on just how large and massive a planet can be before its composition is more likely to be closer to Neptune’s than the Earth’s. The transition from rocky to non-rocky planets is particularly important now, when our instruments are just becoming able to detect planets small enough to qualify as habitable. LePage, who writes the excellent Drew Ex Machina, remains optimistic about habitable planets in the galaxy, but the case for many of the planets so far identified as habitable may be weaker than we had thought. A prolific writer, Drew is also a Senior Project Scientist at Visidyne, Inc., where he specializes in the processing and analysis of remote sensing data.

by Andrew LePage


For much of the modern era, astronomy has benefited greatly from the efforts of amateur scientists. While amateur astronomers equipped with telescopes have certainly filled many important niches left by the far less numerous professionals in the field, others armed with nothing more than a computer and an Internet connection are capable of making important contributions as well. One project taking advantage of this resource is Planet Hunters.

The Planet Hunters project was started four years ago by the Zooniverse citizen science program to enlist the public’s help in searching the huge photometric database of NASA’s Kepler mission for transits caused by extrasolar planets. While automated systems have been able to uncover thousands of candidate planets, they are limited to finding only what their programmers designed them to find – multiple, well-defined transits occurring at regular intervals. The much more adaptable human brain can spot patterns in the changing brightness of stars that a computer program might miss but that could still indicate the presence of an extrasolar planet. Currently in Version 2.0, the Planet Hunters project has uncovered 60 planet candidates to date through the efforts of 300,000 volunteers worldwide.
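To see why rigid automation can miss things, consider what a minimal transit search does: fold the light curve at a trial period and look for a consistent dip. The toy sketch below is for illustration only and is in no way a stand-in for the real Kepler pipeline; it expects time and flux as NumPy arrays and scores each trial period by the depth of the deepest phase bin:

import numpy as np

def fold_and_score(time, flux, period):
    """Score a trial period by the depth of the deepest bin of the
    phase-folded light curve (a crude proxy for a transit signal)."""
    phase = (time % period) / period
    edges = np.linspace(0.0, 1.0, 51)          # 50 phase bins
    idx = np.digitize(phase, edges) - 1        # bin index 0..49
    depths = [flux[idx == i].mean() for i in range(50) if np.any(idx == i)]
    return np.mean(depths) - np.min(depths)    # deeper dip -> higher score

def best_period(time, flux, trial_periods):
    """Return the trial period whose folded light curve dips deepest."""
    scores = [fold_and_score(time, flux, p) for p in trial_periods]
    return trial_periods[int(np.argmax(scores))]

A search like this only rewards dips that repeat at a strictly fixed period, and a human eye scanning the raw light curve is not bound by that assumption. That limitation matters for the discovery described next.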

A paper just published in The Astrophysical Journal by a team of astronomers led by Joseph Schmitt (Yale University) describes the latest find by Planet Hunters. The target of interest is a billion-year-old, Sun-like star called Kepler 289, located about 2,300 light years away. Automated searches of the Kepler data had earlier found two planets orbiting this distant star: a large super-Earth with a radius 2.2 times that of the Earth (or RE) in a 34.5-day orbit, originally designated Kepler 289b (called PH3 b in the new paper), and a gas giant with a radius of 11.6 RE in a 125.8-day orbit, Kepler 289c (now also known as PH3 d). The new planet, PH3 c, has a radius of 2.7 RE and a mean orbital period of 66.1 days. With a mean stellar flux about 11 times that of Earth, this planet is highly unlikely to be habitable, but its properties have profound implications for assessing the potential habitability of other extrasolar planets.

The planet had been missed by earlier automated searches because its orbital period varies regularly by 10.5 hours over the course of ten orbits due to its strong interactions with the other two planets, especially PH3 d. Because of this strong dynamical interaction, it was possible for Schmitt et al. to use the Transit Timing Variations or TTVs observed in the Kepler data to compute the masses of these three planets much more precisely than could be done using precision radial velocity measurements. The mass of the outer planet, PH3 d, was found to be 132±17 times that of Earth (or ME) or approximately equivalent to that of Saturn. The mass of the inner planet, PH3 b, was poorly constrained with a value of 7.3±6.8 ME. The newest discovery, PH3 c, was found to have a mass of 4.0±0.9 ME which, when combined with the radius determined using Kepler data, yields a mean density of 1.2±0.3 g/cm3 or only about one-fifth that of the Earth. Models indicate that this density is consistent with PH3 c possessing a deep, hot atmosphere of hydrogen and helium making up about half of its radius or around 2% of its total mass.
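The quoted density is easy to verify from the mass and radius; readers who want to check the arithmetic can do it in a few lines, using the standard values for Earth’s mass and radius:

import math

M_EARTH_KG = 5.972e24
R_EARTH_M = 6.371e6

def bulk_density_gcc(mass_me, radius_re):
    """Mean density in g/cm^3 from a mass in Earth masses and a
    radius in Earth radii."""
    mass_kg = mass_me * M_EARTH_KG
    volume_m3 = (4.0 / 3.0) * math.pi * (radius_re * R_EARTH_M) ** 3
    return (mass_kg / volume_m3) / 1000.0    # kg/m^3 -> g/cm^3

print(bulk_density_gcc(4.0, 2.7))    # PH3 c: ~1.1, matching the quoted 1.2 +/- 0.3
print(bulk_density_gcc(17.0, 2.35))  # Kepler 10c, discussed below: ~7.2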

PH3 c is yet another example of a growing list of known low-density planets with masses just a few times that of the Earth that are clearly not terrestrial, or rocky, in composition. Before the Kepler mission, such planets were thought to exist, but their exact properties were unknown because none are present in our solar system. As a result, the position in parameter space of the transition from rocky to non-rocky planets, and the characteristics of that transition, were unknown. So when astronomers were developing size-related nomenclature to categorize the planets they expected Kepler to find, they somewhat arbitrarily defined “super-Earth” as any planet with a radius in the 1.25 to 2.0 RE range regardless of its actual composition, while planets in the 2.0 to 4.0 RE range were dubbed “Neptune-size”. This has generated some confusion over the term “super-Earth” and has led to claims about the potential habitability of these planets being made in the absence of any real understanding of their true nature. Now that Kepler has found planets in this size range, astronomers have started to examine the mass-radius relationship of super-Earths.

The first hints about the characteristics of this transition from rocky to non-rocky planets were discussed in a series of papers published earlier this year. Using planetary radii determined from Kepler data and masses found by precision radial velocity measurements and analysis of TTVs, it was found that the density of super-Earths tends to rise with increasing radius, as would be expected of rocky planets. But somewhere in the 1.5 to 2.0 RE range a transition is crossed, beyond which larger planets tend instead to become less dense. The interpretation of this result is that planets with radii greater than about 1.5 RE are increasingly likely to have substantial envelopes of volatiles such as water (including high-pressure forms of ice at high temperatures) and thick atmospheres rich in hydrogen and helium that decrease a planet’s bulk density. As a result, these planets can no longer be considered terrestrial or rocky planets like the Earth but would be classified as mini-Neptunes or gas dwarfs depending on the exact ratios of rock, water and gas.


Image: It now appears that many of the fanciful artist depictions of super-Earths are wrong and that most of these planets are more like Neptune than the Earth (NASA Ames/JPL-Caltech).

A detailed statistical study of this transition was submitted for publication this past July by Leslie Rogers (a Hubble Fellow at the California Institute of Technology), who is also one of the coauthors of the PH3 c discovery paper. Rogers confined her analysis to transiting planets with radii less than 4 RE whose masses had been constrained by precision radial velocity measurements. She excluded planets with masses determined by TTV analysis, since that sample may be affected by selection biases favoring low-density planets (for a planet of a given mass, a large low-density planet is more likely to produce a detectable transit event than a smaller high-density one). Rogers then determined the probability that each of the 47 planets in her Kepler-derived sample was rocky by comparing the properties of those planets, and the associated measurement uncertainties, to models of planets with various compositions. Next, she performed a statistical analysis to assess three different models of the mass-radius distribution for the sample. One model assumed an abrupt, step-wise transition from rocky to non-rocky planets, while the other two assumed different types of gradual transitions in which some fraction of the population of planets of a given radius was rocky while the balance was non-rocky.

Rogers’ analysis clearly showed that a transition between rocky and non-rocky planets takes place at 1.5 RE, with a sudden step-wise transition mildly favored over more gradual ones. Taking into account the uncertainties in her analysis, Rogers found that the transition from rocky to non-rocky planets takes place at no greater than about 1.6 RE at a 95% confidence level. Assuming a simple linear transition in the proportions of rocky and non-rocky planets, no more than 5% of planets with radii of about 2.6 RE will have densities compatible with a rocky composition, again at a 95% confidence level. PH3 c, with a radius of 2.7 RE, exceeds the threshold found by Rogers and, based on its density, is clearly not a terrestrial planet.
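For concreteness, here is what a simple linear transition of the kind Rogers tested looks like in code. The endpoint values below are illustrative choices tuned to reproduce the fractions quoted above (fully rocky below 1.5 RE, about 5% rocky at 2.6 RE); they are not taken from her paper:

def rocky_fraction(radius_re, r_lo=1.5, r_hi=2.66):
    """Illustrative linear transition: every planet below r_lo Earth
    radii is rocky, none above r_hi, with a linear ramp in between."""
    f = (r_hi - radius_re) / (r_hi - r_lo)
    return min(1.0, max(0.0, f))

for r in (1.0, 1.5, 1.6, 2.0, 2.6, 2.7):
    print(f"{r:.1f} Re -> {rocky_fraction(r):5.1%} likely rocky")

On this toy ramp, PH3 c at 2.7 RE falls to a zero rocky fraction, consistent with its measured density.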

An obvious potential counterexample to Rogers’ maximum rocky planet size threshold is the case of Kepler 10c, which made the news early this year. Kepler 10c, with a radius of 2.35 RE determined by Kepler measurements and a Neptune-like mass of 17 ME determined by radial velocity measurements, was found to have a density of 7.1±1.0 g/cm3. While this density, which is greater than Earth’s, might lead some to conclude that Kepler 10c is a solid, predominantly rocky planet, Rogers counters that its density is in fact inconsistent with a rocky composition by more than one sigma. Comparing the measured properties of this planet with various models, she finds only about a 10% probability that Kepler 10c is predominantly rocky in composition. It is much more likely that it possesses a substantial volatile envelope, albeit one smaller than Neptune’s given its higher density.

While much more work remains to be done to better characterize the planetary mass-radius function and the transition from rocky to non-rocky planets, one of the immediate impacts of this work is on the assessment of the potential habitability of extrasolar planets. About nine planets found to date in the Kepler data have been claimed by some to be potentially habitable. Unfortunately, all but two of these planets, Kepler 62f and 186f, have radii greater than 1.6 RE and it is therefore improbable that they are terrestrial planets, never mind potentially habitable planets.

This still leaves about a dozen planets, discovered by precision radial velocity surveys, that have been frequently cited as potentially habitable but whose radii are not known. We do, however, know their MP sin i values, where MP is the planet’s actual mass and i is the inclination of its orbit to our line of sight. Since this angle cannot be derived from radial velocity measurements alone, only the minimum mass of the planet can be determined, or the probability that the actual mass lies in some range. Despite this limitation, the MP sin i values can serve as a useful proxy for radius.

Rogers optimistically estimates that her 1.6 RE threshold corresponds to a planet with a mass of about 6 ME, assuming an Earth-like composition (still roughly 50% larger than the measured mass of PH3 c, now known to be a non-rocky planet). About half of the planets claimed by some to be potentially habitable have minimum masses that exceed this optimistic 6 ME threshold, while the rest have better than even odds of their actual masses exceeding it. If the threshold for the transition from rocky to non-rocky planets is closer to the 4 ME mass of PH3 c, the odds of any of these planets being terrestrial are worse still. The unfortunate conclusion is that none of the planets discovered so far by precision radial velocity surveys are likely to be terrestrial planets, and they are therefore poor candidates for potential habitability.
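Odds like these follow from orbital geometry. For randomly oriented orbits, cos i is uniformly distributed, which gives P(M > Mt) = 1 - sqrt(1 - (MP sin i / Mt)^2) when the minimum mass is below the threshold Mt. The sketch below assumes this simplest isotropic prior and ignores any selection effects, so treat the percentages as illustrative:

import math

def prob_mass_exceeds(m_min, m_threshold):
    """Probability that the true mass exceeds m_threshold given the
    measured minimum mass (M sin i), for randomly oriented orbits.
    If m_min already exceeds the threshold, so must the true mass."""
    if m_min >= m_threshold:
        return 1.0
    x = m_min / m_threshold
    return 1.0 - math.sqrt(1.0 - x * x)

# Against Rogers' optimistic 6 Earth-mass ceiling for rocky worlds:
for m_min in (3.0, 4.5, 5.5, 7.0):
    print(f"M sin i = {m_min} Me -> P(M > 6 Me) = {prob_mass_exceeds(m_min, 6.0):.0%}")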

Please do not get me wrong: I have always been a firm believer that the galaxy is filled with habitable terrestrial planets (and moons, too!). But in the rush to find such planets, it now seems that too many overly optimistic claims have been made about too many worlds before enough information was available to properly gauge their bulk properties. Preliminary results on the planetary mass-radius relationship now hint that the maximum size of a terrestrial planet is probably about 1½ times the radius of the Earth, or around 4 to 6 times Earth’s mass. Any potentially habitable planet, in addition to being inside the habitable zone of the star it orbits, must also be smaller than this. While recent work suggests that planets of this size might be common, our technology is unfortunately only just able to detect them at this time. With luck, over the coming years as more data come in, we will finally have a more realistic list of potentially habitable planet candidates that will bear up better under close scrutiny.

The discovery paper for PH3 c by Schmitt et al., “Planet Hunters VII: Discovery of a New Low-Mass, Low Density Planet (PH3 c) Orbiting Kepler-289 with Mass Measurements of Two Additional Planets (PH3 b and d)”, The Astrophysical Journal, Vol. 795, No. 2, ID 167 (November 10, 2014) can be found here. The paper by Leslie Rogers submitted to The Astrophysical Journal, “Most 1.6 Earth-Radius Planets are not Rocky”, can be found here.

For a fuller discussion of how Rogers’ work impacts the most promising planets thought by many to be potentially habitable, please refer to Habitable Planet Reality Check: Terrestrial Planet Size Limit on my website Drew Ex Machina.



Tennessee Valley Interstellar Workshop

by Paul Gilster on November 10, 2014

I’m at the Tennessee Valley Interstellar Workshop in Oak Ridge for the next few days. As I’ve done at past conferences, I’ll need to spend my time taking the notes that will be turned into next week’s entries here. That means no further posts until Friday, though I’ll try to keep the comment moderation going, perhaps with a few delays. TVIW 2014 has lined up a good group of speakers including, besides MSFC’s Les Johnson himself (TVIW’s founder), exoplanet hunter Sara Seager, beamed sail specialist Jim Benford, the SETI League’s Paul Shuch and TZF founder Marc Millis, along with a healthy representation from Icarus Interstellar. I’m also looking forward to the workshop tracks and will be participating in one called “Language as Reality: A Near-Term Roadmap for Exploiting Opportunities and Natural Experiments Here on Terra Firma to Inform *C*ETI.” Expect a complete report when I get back.



Interstellar Arrival: Slowing the Sail

by Paul Gilster on November 7, 2014

Some final thoughts on hybrid propulsion will wrap up this series on solar sails, which grew out of ideas I encountered in the new edition of the Matloff, Johnson and Vulpetti book Solar Sails: A Novel Approach to Interplanetary Travel (Copernicus, 2014). The chance to preview the book (publication is slated for later this month) took me in directions I hadn’t anticipated. Solar Sails offers a broad popular treatment of all the sail categories and their history, as you’d expect, but this time through I focused on its four technical chapters, which helped me review the details of sail theory.

And because I kept running into the idea of multiple modes of propulsion, my thoughts on avoiding doctrinaire solutions continue to grow. In fact, I’d venture to say that probing the possibilities of multimodal propulsion may offer serious opportunities for insight. Centauri Dreams regular Alex Tolley came up with one of these yesterday, asking whether a sail mission to Jupiter space might exploit the planet’s huge magnetic field as an assist. Alex invokes Pekka Janhunen’s ideas about electric sails. Let me quote from the Solar Sails book on what Janhunen has in mind:

Similar to the magsail, this concept uses the solar wind for producing thrust. However, different from the magsail, this sail interacts with the solar plasma via a mesh of long and thin tethers kept at high positive voltage by means of an onboard electron gun. In its baseline configuration, the spacecraft spins and the tethers are tensioned by centrifugal acceleration. It should be possible to control each wire voltage singly, at least to within certain limits.

We get thrust out of this when protons from the solar wind, positively charged, are repelled by the positive voltage of the spacecraft’s tethers, while electrons are captured and ejected — otherwise, their growing numbers would neutralize the voltage in the tether mesh. But Alex also brings to mind Mason Peck’s interesting work at Cornell on miniaturized ‘Sprites,’ tiny chip-like spacecraft that could use the Lorentz force to accelerate in directions perpendicular to the magnetic field. Remember that Jupiter’s magnetic field is 18,000 times stronger than Earth’s, a useful resource if we can tap it even so far as to adjust the orbits of planetary probes.
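For a feel of the numbers, the magnitude of the Lorentz force on a charged object moving across a magnetic field is F = qvB. The sketch below uses entirely hypothetical spacecraft values (the charge a Sprite could actually hold and maintain is the hard engineering question), with a field of about 4 gauss, roughly what prevails near Jupiter’s cloud tops:

# Order-of-magnitude Lorentz force on a charged chip-sat near Jupiter.
# Every spacecraft value below is a hypothetical placeholder.
CHARGE_C = 1e-5         # assumed net charge on the craft
SPEED_M_S = 20_000      # assumed speed across the co-rotating field
B_TESLA = 4e-4          # ~4 gauss, roughly Jupiter's cloud-top field
MASS_KG = 0.01          # a 10-gram Sprite-class craft (assumed)

force_n = CHARGE_C * SPEED_M_S * B_TESLA   # F = qvB (v perpendicular to B)
accel = force_n / MASS_KG
print(f"force ~ {force_n:.0e} N, acceleration ~ {accel:.0e} m/s^2")
print(f"delta-v over one day ~ {accel * 86_400:.0f} m/s")

Even that tiny force, sustained for a day, amounts to several hundred meters per second of delta-v, which is why a deep planetary magnetic field is such a tempting resource.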

Alex’s thoughts on the matter deserve to be quoted:

We often think of sail ships as clipper ships – i.e. using large surfaces to capture or direct the wind to move. But modern ships use screws. There have also been numerous wind turbine designs that offer advantages over canvas sails, even if they are not as aesthetic to the eye. (Clipper ships were possibly the most pleasing ship designs ever built). Might we be thinking too much in terms of sails that mimic the romance of traditional sails, rather than designs that might offer better performance, albeit with some aesthetic loss?

Interstellar Arrival

A sense of aesthetics produces pleasing designs, but what looks best isn’t always what we need. Back in my flying days some of us used to talk about (and in a few cases actually fly) the great aircraft designs of the 1930s and later. Although I never got my hands on the controls of one, a great favorite was the Beech Staggerwing, a gorgeous design with negative wing stagger, meaning that the lower wing is farther forward than the upper. Designs like this could be sleek and lovely because of the medium they worked in. But spacecraft don’t need wings and streamlined fuselages, and our Voyagers and Cassinis look nothing like the wilder designs of early science fiction because they don’t need to, never encountering a planetary atmosphere.


Image: The Beech Model 17 Staggerwing, first produced in 1932. Credit: Wikimedia Commons.

A beamed lasersail on its way to Alpha Centauri may be anything but a thing of beauty. Once the mission enters its cruise phase, the sail can be safely stowed, and one good use for it would be to shroud the payload, offering additional protection against radiation. We’re always trying to think of ways to get more value out of existing assets, which is what extended missions are all about. Or think about the Benfords’ JPL work that revealed desorption. No one with an eye for design would think of painting a desorption layer onto a sailcraft, but it’s conceivable that desorption (the release of CO2, hydrocarbons and hydrogen from within the manufactured sail as it heats up under the beam) could give an added kick to interplanetary sails pushed by powerful microwave beams.

Thinking about beamed sails brings me to Robert Forward and the ‘staged sail’ concept he worked out for stopping at another star. The sail has three divisions, as shown in the diagram below, taken from his paper on a manned mission to Epsilon Eridani. ‘Staging’ the sail means shedding first the outer ring, then the middle one, until only the inner ring is left. As the spacecraft approaches the star, it slows down using laser light beamed from our Solar System and reflected off the now separated outer sail, which directs the light back onto the two remaining sail segments and their payload. Ingenious tinkering let Forward use the second sail detachment as the way the crew got home, with laser light boosting the much smaller inner sail by reflection from the middle segment.


Image: Robert Forward’s staged sail concept. What he calls a ‘paralens’ in the diagram is an enormous Fresnel lens in the outer Solar System, made of concentric rings of lightweight, transparent material with free space between the rings. Credit: Robert Forward.
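Whatever the staging details, the deceleration available is bounded by photon momentum: a perfectly reflecting sail intercepting beam power P feels a force of 2P/c. A rough sketch with deliberately modest, illustrative numbers of my own choosing (Forward’s figures were orders of magnitude more ambitious) shows the scale of the problem:

C = 299_792_458.0  # speed of light, m/s

def sail_deceleration(beam_power_w, craft_mass_kg, reflectivity=1.0):
    """Deceleration of a sail intercepting the full beam:
    F = (1 + reflectivity) * P / c for reflected photons."""
    force_n = (1.0 + reflectivity) * beam_power_w / C
    return force_n / craft_mass_kg

accel = sail_deceleration(10e9, 1000.0)       # 10 GW beam, 1-tonne craft
print(f"deceleration: {accel:.3f} m/s^2")     # ~0.067 m/s^2
years = (0.10 * C / accel) / 3.156e7          # time to shed 0.1 c
print(f"time to shed 0.1 c: {years:.0f} years")

Shedding ten percent of lightspeed at that rate takes about fourteen years of continuous illumination, which is why Forward’s designs demanded such enormous beam powers and lens apertures.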

Staged sails are hard to see as anything but a longshot — the success of the mission depends not only upon perfect execution of the staging process but, crucially, upon the laser beam from Earth illuminating the sail segments effectively at the right moment. Forward was fully aware of the risks, and you can find discussion in places like his novel Rocheworld (Baen, 1990) of how politics on Earth might affect the use of the expensive beam. I for one wouldn’t want to put my life in the hands of a design that depends so crucially upon decisions made far from the spacecraft.

Interestingly, like Mason Peck, Forward had some thoughts on how we might use the Lorentz force as well. Remember that a charged object moving through a magnetic field experiences this force at right angles to its direction of motion and the magnetic field itself. Out of this you get ‘thrustless turning,’ which both Forward and Philip Norem thought could be used for deceleration. Instead of staged sails, you get an electrostatically charged probe — think of Janhunen’s electric sail tethers — on a trajectory that goes well beyond the target star. The spacecraft’s interactions with the galactic magnetic field bend its trajectory so that it approaches the target from behind.
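The scale of such a maneuver follows from the gyroradius formula r = mv/(qB). With the galactic field measured in microgauss, even a generous charge-to-mass ratio bends a trajectory only over light-year scales, which is why the Norem approach swings so far past the target. A sketch in which the spacecraft parameters are hypothetical placeholders, not figures from Forward or Norem:

# Turning radius (gyroradius) of a charged probe in the galactic field.
LIGHT_YEAR_M = 9.461e15
B_GALACTIC_T = 1e-10          # ~1 microgauss, a typical interstellar value
SPEED_M_S = 0.1 * 3.0e8       # cruise at 10% of lightspeed
CHARGE_TO_MASS = 10.0         # C/kg, assumed net charge-to-mass ratio

radius_m = SPEED_M_S / (CHARGE_TO_MASS * B_GALACTIC_T)   # r = mv/(qB)
print(f"turning radius ~ {radius_m / LIGHT_YEAR_M:.1f} light years")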

Once it’s inbound to the destination system, a laser beam from Earth can be turned on it to slow it down for arrival. The idea is anything but aesthetic, just as the Janhunen sail would look closer to a porcupine than to the silvery lozenge of an early SF starship. It is also hampered by the fact that mission times, already measured in decades at minimum, are tripled by this maneuver. I should mention that Solar Sails authors Gregory Matloff and Les Johnson have also explored the use of electrodynamic tethers to supply power to an Alpha Centauri expedition, even if a Norem-style arrival seems too lengthy.

Creative thinking about these matters often springs from putting two or more solutions together to see what can happen. What I’ve always admired about the interstellar community is its ability to re-examine older concepts to look for interesting cross-pollination of ideas. As we move into the era of increasingly tiny components, it’s heartening to think how many designs will be affected by new nanotechnological possibilities. Mason Peck has talked about using Jupiter’s magnetic field to spew thousands of ‘Sprites’ out on interstellar trajectories. What else can we imagine as we look for extended uses of existing tech and ponder where they might lead us?

Forward’s paper on staged sails is “Roundtrip Interstellar Travel Using Laser-Pushed Lightsails,” Journal of Spacecraft and Rockets 21 (1984), pp. 187-195. The Norem paper is “Interstellar Travel: A Round Trip Propulsion System with Relativistic Capabilities,” AAS 69-388 (June 1969). Forward’s paper on Lorentz force turning is “Zero-Thrust Velocity Vector Control for Interstellar Probes: Lorentz Force Navigation and Circling,” AIAA Journal 2 (1964), pp. 885-889. Matloff and Johnson discuss electrodynamic tethers in “Applications of the Electrodynamic Tether to Interstellar Travel,” JBIS 58 (June 2005), pp. 398-402.

