Building Large Structures in Space

One thing the Tennessee Valley Interstellar Workshop did not offer was a lot of spare time. Les Johnson told attendees at the beginning that we would be working straight through. Between presentations and workshop sessions, that was pretty much the case, with no break at all between an 8:00 start and lunch, and afternoon sessions punctuated by breakout workshop sessions on four topics: communications and SETI; biology in small ecosystems; safety issues for interstellar missions; and a competition to reverse-engineer famous starships from science fiction literature. I finished up the after-dinner workshop session around 9:30 that first night.

An Encounter with ‘Dr. SETI’

It was a pleasure to finally meet the SETI League’s Paul Shuch in Oak Ridge. Paul and I have exchanged email for some time now, mostly about material we might use on our respective sites, and I’ve long admired the engineering and leadership skills he brings to a SETI all-sky survey that coordinates the efforts of 127 small receiving stations around the world. If you’re not aware of his Searching for Extraterrestrial Intelligence (Springer, 2011), realize that it contains 26 essays not only from some of SETI’s biggest names but also from science fiction writers like Stephen Baxter and David Brin, encapsulating the key issues of the field.

Image: The SETI League’s Paul Shuch (center) receiving a copy of Interstellar Migration and the Human Experience from Robert Kennedy (left) and Sam Lightfoot.

Introduced by David Fields of Tamke-Allen Observatory at nearby Roane State Community College, Shuch ran through a synopsis of SETI’s history. He lingered over the beginnings of radio astronomy, when Karl Jansky tried to chase down the source of the interference that Bell Telephone Laboratories was picking up in trans-Atlantic communications (he was detecting the galactic center), and also pointed to Grote Reber, the Illinois ham radio operator who was, back in the 1930s and for almost a decade, the only practicing radio astronomer in the world. Paul’s discussion of the Ohio State Wow! signal, logged by a surprised Jerry Ehman in 1977, reminded me how much the enigmatic reception still haunts us.

A chance reception of an extraterrestrial beacon? If so, it was one that only swept past our planet, for the Wow! signal was never detected again despite later efforts, and we’ll surely never know its true origin. Nor can we jump to conclusions, remembering Frank Drake’s first strong Project Ozma signal. It seemed to come from Epsilon Eridani (could SETI be this easy, Drake wondered?), but as Shuch explained, it turned out to be the reception of a U-2 spy plane, then highly classified but about to become public news after the shootdown of Francis Gary Powers.

Decades ago I wrote up a few articles for the journal of the Society of Amateur Radio Astronomers (SARA), a non-engineer addressing an audience of crack equipment makers and instrument tweakers. I hate to think how many mistakes I made in my analysis of the Drake Equation back then, but Paul’s advocacy of SARA made it a pleasure to recall those days, tinged with sadness at the recent loss of Mike Gingell, a good friend and SARA member who had known Paul from meetings at Green Bank (where Project Ozma was born) and had a substantial backyard antenna array of his own.

The Beamed Sail and Survival

Jim Benford’s work on beamed sails continues under the aegis of Project Forward, an attempt to characterize and advance the science on strategies for getting a sail up to interstellar velocities. Deployment of large sails is always an issue, but spin deployment has been demonstrated in the laboratory, with sails unfolding from their tight packages like Japanese origami. The big capital cost goes into the transmitters, a phased array left behind in the Solar System; between construction and operation, the beamer so dominates the budget that each individual sail is relatively inexpensive by contrast. Build the infrastructure and you can launch many missions.

These are going to be installations that will require considerable expertise at handling large construction projects in space. In Benford’s words:

“We are going to have to learn the art of building very large arrays, and we’re going to have to build them in space. We know how to build the transmitters, but the structures will be on the scale of hundreds of kilometers, creating the same issues we will face in space-based solar power projects. Construction in space will invariably be managed by robots. Early work in that direction can be seen in SpiderFab, a robotic construction idea being studied by NASA.”

This wasn’t the only time SpiderFab came up at Oak Ridge, as we’ll see in a moment. The idea, championed by Tethers Unlimited, the company Robert Forward co-founded with Robert Hoyt, would use 3D printing methods to build the needed systems without further human intervention. Hoyt describes SpiderFab as combining “… the techniques of fused deposition modeling (FDM) with methods derived from automated composite layup to enable rapid construction of very large, very high-strength-per-mass, lattice-like structures combining both compressive and tensile elements.” What you wind up with is a way to incorporate high-strength materials into system components like antennas that are optimized for space conditions.

Beamer pointing accuracy will have to be extreme, presenting a major challenge for sailship missions. The accuracy needed, Benford said, is micro-radian for early work, more or less the state of the art in the phased arrays used by the military. Taking a sailcraft all the way to the Oort Cloud would require accuracy at the nano-radian level, and would push into pico-radians when we’re talking about actual interstellar missions. A key here is that as the pointing accuracy of the array decreases, the acceleration on the sail has to increase, because the sail cannot stay under a tightly focused beam as long as it could with a more precise array.
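
Benford’s point about the coupling between pointing and acceleration can be illustrated with a back-of-the-envelope sketch (my own toy numbers, not figures from his talk): a pointing jitter of θ radians walks the beam sideways by roughly θ times the range, so a sail of radius r can only be trusted to stay under the beam out to a range of about r/θ, and v² = 2az then sets the acceleration needed to reach cruise speed within that range.

    # Toy model of beamer pointing vs. required sail acceleration.
    # Jitter theta (rad) displaces the beam ~theta*z at range z, so a sail
    # of radius r_sail loses the beam near z_max ~ r_sail / theta. Reaching
    # v_target within z_max requires a >= v_target^2 / (2 * z_max).
    # Illustrative values only -- not Benford's actual mission figures.

    def required_acceleration(theta_rad, r_sail_m, v_target_ms):
        z_max = r_sail_m / theta_rad              # range before beam wander ~ sail radius
        return v_target_ms ** 2 / (2.0 * z_max)   # from v^2 = 2*a*z

    v = 0.01 * 3.0e8  # 1% of lightspeed, a nominal interstellar-precursor speed
    for theta in (1e-6, 1e-9, 1e-12):             # micro-, nano-, pico-radian pointing
        a = required_acceleration(theta, r_sail_m=50.0, v_target_ms=v)
        print(f"jitter {theta:.0e} rad -> accel {a:.3g} m/s^2 ({a / 9.81:.3g} g)")

With the 50-meter sail assumed here, micro-radian pointing demands thousands of g of acceleration to reach one percent of lightspeed, while pico-radian pointing relaxes the requirement to a small fraction of a g, which is exactly the trade Benford described.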

Image: Dinner with the interstellar crowd after the first day’s last plenary session. That’s Jim Benford at far left, then James Early, Sandy Montgomery and Michael Lynch.

What a pleasure to spend time at dinner not only with Benford but James Early, who has written up (with Richard London) ideas on how sails will interact with the interstellar medium. The researchers worked with a beryllium sail as a reference point and studied the effect of local interstellar dust on both sail and payload. In this study, dust grains and atoms of interstellar gas actually pass through the thin sail materials with little loss of energy, creating scant damage.

Moreover, sails turn out to offer a way of protecting the interstellar vehicle because the deployed thin foil offers a way to convert dust grains or neutral gas atoms into free electrons and ions. “These charged particles,” the authors write, “can then be easily deflected away from the vehicle with electrostatic shields.” I wrote these ideas up in a 2012 essay called Lightsails: Safe Passage After All?, but I want to catch up with Early to see whether he’s done further work on the matter. The original paper is “Dust Grain Damage to Interstellar Laser-Pushed Lightsail,” Journal of Spacecraft and Rockets, July-Aug. 2000, Vol. 37, No. 4, pp. 526-531.

Building on Complexity

SpiderFab certainly has its advocates, as do any ideas that advance the notion of building deep space structures on large scales. Andreas Hein, who is not only deputy director of the Initiative for Interstellar Studies but also the head of Project Hyperion for Icarus Interstellar, has been asking whether current trends in engineering — and SpiderFab is certainly indicative of one of these — point to a future where even highly complex products can be produced in a fraction of the time they currently demand. Where do projects like SpiderFab ultimately take us?

Image: SpiderFab combines techniques evolved from terrestrial additive manufacturing and composite layup with robotic assembly to enable on-orbit construction of large spacecraft components optimized for the zero-g environment. Credit: Tethers Unlimited/NASA.

The factors in play are numerous and include the advent of mass customization, highly flexible production lines, additive manufacturing (3D printing) and artificial intelligence in the factory. As computation takes over more and more cognitive tasks, we are exploring new domains in design that may one day allow us to automate a function we always considered purely human: our innate creativity. When engineers can work with higher-level building blocks, Hein noted, productivity increases as the technological ecosystem becomes increasingly commoditized.

“A smartphone is today’s case in point,” Hein said. “We have the computing capability to make it the basis of a CubeSat surrounded by the added supporting structure, a satellite that can be created for approximately $3500. Mass produced technology opens up opportunities like this. Additive manufacturing, ubiquitous computing, the ‘Internet of things’ and artificial intelligence are all relevant for a future in which we will create complex systems in space on demand.”

It’s an optimistic trend, and extrapolating it to 2060, as Hein did, gives us the possibility of serious deep space missions funded by private capital, assuming continued growth not only in engineering productivity but in wealth. Whether these trends cooperate is another matter, for we can’t anticipate the social or economic upheavals that can break our best projections. But taking a shot at a perceived future is a way to provoke scientific thought, not to mention stimulating the minds of science fiction authors, of whom there were several in Oak Ridge. It will be interesting to see what stories may spin out of the ideas they heard described at TVIW.

TVIW: Caveats for Long-Duration Missions

When he opened the Tennessee Valley Interstellar Workshop in Oak Ridge last week, Les Johnson told the audience that sessions would begin and end on time. Punctuality is a trait that I assume works well in Johnson’s day job at Marshall Space Flight Center, and it certainly was appreciated in Oak Ridge, where the delays and overruns that mar so many conferences just didn’t occur. That kept the pace brisk and the presenters solidly on topic throughout.

Image: Les Johnson calls the first session into order in Oak Ridge.

That sense of pace and direction is making TVIW into one of my favorite gatherings. Today I’m going to run through some of the presentations from the first day, beginning with the multidisciplinary note with which I closed yesterday’s post. What we gain by keeping a wide range of backgrounds in play among the presenters is a chance to spot hidden assumptions, some of which can prove deadly when not properly evaluated. Monday’s TVIW talks helped clarify what we’ve learned about the human presence in space and just how much we have yet to determine.

Problems of Survival in Deep Space

Biologist Robert Hampson (Wake Forest School of Medicine) was a familiar face when he took the podium on Monday morning, having appeared at the last TVIW in Huntsville. What Dr. Hampson brings to the table is a rich background in the neurosciences that includes research into cognition, behavior and learning.

All of these come into play when we’re talking about the issues astronauts will face when dealing with long-duration spaceflight. In Huntsville, Hampson had outlined our need for a biomedical research laboratory in weightless conditions, so that we could do the kind of detailed research into artificial gravity that we need before we can think about how to provide it on a mission. The Oak Ridge talk followed up on the idea, explaining the need for a livable habitat where access to vacuum and solar radiation is readily available. A further option would be to place it outside Earth’s magnetosphere to study radiation in that environment and how to mitigate it.

We tend to shrug off the gravity problem by assuming that we can create a rotating habitat, but the ‘tin cans on a string’ notion — two segments joined by a tether — leaves unanswered the question of how long the tether should be and how fast the rotation. The speed of rotation turns out to be critical because while the vestibular system can adapt to steady linear motion, it registers rotation as acceleration. Vertigo can be the result of a sudden head turn.

Moreover, all the work we’ve done in zero-g aboard vehicles like the International Space Station has led only to marginal results. Microgravity causes physiological changes that can range from loss of calcium to fluid retention to a reduction in muscle mass and a decrease in the volume and pumping capacity of the heart. Only gravity has the ability to resolve these problems, which is why we need the space lab to explore what forms artificial gravity can take. Hampson said that if astronauts took an extended zero-g mission to Mars, they might be unable to function upon arrival because of Mars’ own gravity, even though it is a paltry 38 percent of that found on Earth.

The lab design resulting from Hampson’s research would allow research subjects and scientists to live in an eight-deck space divided into two four-deck structures connected by a tether, an installation containing both a human and an animal lab, with each of the two segments offering about 1000 square feet of research space. Another significant issue for study here: the degradation of memory, seen on Earth in patients who have undergone radiation therapy for cancer, which can likewise be produced by an overdose of radiation in space. The ideal, then, would be to place the biomedical laboratory at the Earth-Moon L2 point outside the magnetosphere, where all these issues can be studied in the best environment for microbiological and biochemical tests.

Human Prospects on Mars

Oak Ridge National Laboratory’s Fred Sloop also delved into the question of gravity’s effects, noting the huge role that evolution under 1 g has played in the development of our physiology. We’re already talking about private colony missions to places like Mars, but we have to overcome the factors Hampson talked about as well as the embrittlement of bone in zero-g, which can cause as much bone loss for an astronaut in a single month as a menopausal woman loses in a year. Bone demineralization appears most strongly in the pelvis, said Sloop, and with loss of bone we get calcium phosphate released into the body, along with calcium oxalate.

The result: the formation of kidney stones. We also see that extended microgravity causes muscle atrophy, with muscle mass down to 70 percent of preflight values after 270 days in orbit. Fluid shifts occur as bodily fluids redistribute to the upper portion of the body, a shift that can involve cardiovascular changes and a decrease in blood volume as the red blood cell count drops. The injury potential upon re-entry is significant, for on a long-duration mission the spine can lengthen by more than 7 centimeters. Changes in cognition and mental imagery can impair function.

Sloop believes that despite mechanical countermeasures — MIT, for example, is studying a ‘skin suit’ that mimics a 1 g load on bone and muscle — the best recourse will be artificial gravity created by rotation. “We need to find out what the minimum gravity to retain physiological health really is,” Sloop added. “Is 1 g necessary, or can we get by with less? Mars gravity at .38 g may be sufficient for long-term colonists once they have arrived, but at this point we don’t really know.” In space, a nominal design for a 1 g habitat rotating at 4 rpm with a rotational radius of 56 meters may work, but will it ward off these ills over a 30-month mission to Mars?
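
The arithmetic behind that nominal design is simple centripetal acceleration, a = ω²r. A quick check in Python (my own sketch, using only the figures quoted above) confirms the 56-meter number and shows how much smaller the habitat could be if Mars-level gravity proves adequate:

    import math

    def spin_gravity(rpm, radius_m):
        """Centripetal acceleration a = omega^2 * r for a rotating habitat."""
        omega = rpm * 2.0 * math.pi / 60.0   # rev/min -> rad/s
        return omega ** 2 * radius_m

    def radius_for_g(rpm, g_level=1.0):
        """Rotational radius needed for g_level * 9.81 m/s^2 at a given spin rate."""
        omega = rpm * 2.0 * math.pi / 60.0
        return g_level * 9.81 / omega ** 2

    print(radius_for_g(4.0))                 # ~55.9 m: the 1 g, 4 rpm design
    print(spin_gravity(4.0, 56.0) / 9.81)    # ~1.0 g at 56 m
    print(radius_for_g(4.0, 0.38))           # ~21 m if 0.38 g (Mars) is enough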

A Historical Perspective on Colonization

You can see why rushing a deep-space mission under the assumption that we have sufficient experience in nearby space would be a mistake. But the issues aren’t solely biological. Sam Lightfoot (South Georgia State College) tackled the assumptions we bring with us when we attempt to colonize new lands, as revealed in historical precedent. The first colony planted by the Spanish in what is now the United States was not St. Augustine but an attempt in the barrier islands of Georgia by the conquistador Lucas Vázquez de Ayllón, who landed in the area in 1526.

Ayllón thought he had brought what he needed — after all, he had tools, livestock and weapons — but many of the tools proved unsuited to the environment. Ayllón’s horses did not adapt well to the humid, sandy islands, and European methods of farming failed. The colony’s maps were incomplete and inaccurate, water was in short supply and disease became rampant. Unwilling to exploit local food resources, the colonists held out for European staples like wheat. Their housing disintegrated because they were using wattle and daub techniques suited to the dry climate of Spain.

Ayllón, whose colony had to be evacuated back to Havana, was one of a string of failures whose colonization efforts have been all but forgotten. Pánfilo de Narváez made even Ayllón’s attempt look good. Equally unprepared for the actual conditions he found, Narváez took over 300 conquistadores with him, a group with few skills adapted to local conditions. Only four of his men would survive the colonization attempt, walking up the length of Florida and making their way somehow to Mexico City. In sharp contrast, Hernando de Soto’s expedition was able to survive because he brought equipment suited to the terrain, along with flexibility in leadership.

The lessons are clear enough, and even more stark when we consider that the next wave of human colonization will be in an environment far more unyielding, and much more difficult to reach and resupply, than even the conquistadores had to contend with. I took away from these multidisciplinary sessions the need to question our most basic assumptions. Fred Sloop’s point about Mars’ gravity stands out: We don’t really know whether humans living at 0.38 g will be able to survive over the long haul. Such basic questions drive the need for research into areas we have found difficult to explore with the existing space infrastructure in low Earth orbit.

More tomorrow as I turn to issues not just of planetary but of interstellar migration, looking at presentations that covered everything from beamed sails to ‘worldship’ habitats and the possibilities for space drives. Can we imagine a day when artificial intelligence and additive manufacturing produce the space infrastructure we need in decades rather than centuries? The Tennessee Valley Interstellar Workshop was an opportunity to talk about issues like these not only in the sessions but in informal dinner conversation. More about the proceedings tomorrow.

Going Interstellar at Oak Ridge

When I was last in Oak Ridge, TN for the Tennessee Valley Interstellar Workshop in 2011, I arrived late in the evening and the fog was so thick that, although I had a map, I decided against trying to find Robert Kennedy’s house, where the pre-conference reception was being held. This year the fog held off until the first morning of the conference (it soon burned off even then), and I drove with Al Jackson out to the Kennedy residence, finding the quiet street surrounded by woods still lit with fall colors and the marvelous clean air of the Cumberland foothills.

A house full of interstellar-minded people makes for lively conversation almost anywhere you turn. I quickly met the SETI League’s Paul Shuch, with whom I’ve often corresponded but never spoken to in person, and our talk ranged over SETI’s history, the division into a targeted search and a broader survey (the latter is the SETI League’s bread and butter), and why looking for signals through a very narrow pipe (Arecibo) should be only one of a spectrum of strategies.

Robert’s 15 cats were largely locked in a room somewhere, but four of them had been allowed to roam, along with a small, inquisitive dog. I spent time at the reception with Marc Millis, with Icarus Interstellar’s Robert Freeland (Andreas Hein was also at the conference, having flown over all the way from Germany, and so was Rob Swinney, who came in from Lincoln in the UK — Rob leads Project Icarus, the ongoing attempt to redesign the original Daedalus starship), and conference organizers David Fields, Les Johnson and Martha Knowles.

John Rather, a tall, friendly man, extended a hand, and I suddenly realized this was the John Rather who had done so much to analyze Robert Forward’s sail concepts in the 1970s, working under a contract with JPL. Later I saw a photo of him with Heinlein and felt the familiar surge of science fictional associations that many scientists bring to this work. I didn’t see Jim Early until the next night, when we had dinner at a table with Jim Benford and Al Jackson, but I have much to say about his paper on sails and the interstellar medium, which showed in 2000 that damage from deep space gas and dust should be minimal. I had covered this paper with Jim’s email help just two years ago and only later put Jim’s face together with the story.

And so it goes at events like this. You meet people with whom you’ve had correspondence and there is a slight mental block before you put them in the context of the work they have done. I would say that these mental blocks show I’m getting older, but the fact is that I’ve always been slow on the uptake. That’s why I find conferences so valuable: as soon as I make the needed connections, ideas start to sprout and connect with older materials I’ve written about. In any case, we may benefit here by getting some new material from several TVIW attendees, with whom I discussed writing up concepts from their presentations and the workshop sessions.

Image: TVIW 2014’s reception getting started at Robert Kennedy’s house in Oak Ridge.

Sara Seager: Of Exoplanets and Starshades

“Think about this. People growing up today have always had exoplanets in their lives.” So said Sara Seager, launching the conference after Les Johnson’s introduction on Monday the 10th. Based at MIT, Seager is a major figure in the exoplanet hunt whose work has earned plaudits in the scientific community and also in the popular press. To get to know not just the details of her work but also her character, I’d recommend Lee Billings’ Five Billion Years of Solitude (Current, 2014), which offers a fine look at the scientist and the challenges she has faced in terms of personal loss.

We always talk about habitable zones, Seager reminded the audience, because of our fascination with finding an Earth 2.0. But in fact a habitable zone can be hard to predict, especially when you’re dealing with variable exoplanet atmospheres, a Seager specialty. Indeed, exoplanet habitability could turn out to be planet-specific.

“You can have planets with denser atmospheres that are much farther from the habitable zone than ours,” Seager said. “Molecular hydrogen is a greenhouse gas. It absorbs continuously across the wavelength spectrum and in some cases could extend the habitable zone out as far as 10 AU. You could also have planets closer to their star than Venus is to the Sun if the planet had less water vapor to begin with, and thus less absorbing power. The boundaries of what we call the habitable zone are controversial.”

Seager has written two textbooks, one of them — Exoplanet Atmospheres: Physical Processes (Princeton University Press, 2010) — specifically on how we characterize such atmospheres. Transmission spectroscopy helps us study a gaseous envelope even when the planet itself cannot be seen, because we can compare the spectrum of the star alone, taken when the planet is out of transit, with the spectrum taken during transit, when starlight filters through the planet’s atmosphere. Teasing out atmospheric molecules isn’t easy, but five exoplanet spectra have been studied in great detail using these methods, and according to Seager, several dozen more have been measured.

The problem here is that transits rely on a fortuitous alignment of the planet’s orbit with our line of sight, so that we observe the planet moving across the face of the star. But transits are hugely helpful nonetheless, and given the deeper transits they produce, small M-dwarf stars with a ‘super-Earth’ around them would be the easiest to work with — Seager calls these not Earth 2.0 but Earth 2.5. Missions like TESS (Transiting Exoplanet Survey Satellite) will home in on the closest 1000 M-dwarfs to look for transits. TESS launches in 2017, and Seager believes we might find such a world by the early 2020s, a tidally locked planet no more than tens, or hundreds, of light years from our Sun.
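
The reason M-dwarfs make the hunt easier drops out of simple geometry: the transit depth is the ratio of the areas of the planetary and stellar disks, (Rp/Rs)². A short sketch (with illustrative radii of my own choosing, not Seager’s numbers):

    # Transit depth = (R_planet / R_star)^2: a super-Earth crossing a small
    # M dwarf blocks far more of the stellar disk, fractionally, than Earth
    # crossing the Sun. Radii below are illustrative.

    R_SUN_IN_EARTH_RADII = 109.2

    def transit_depth(r_planet_earth, r_star_sun):
        """Fractional dip in starlight during transit."""
        r_star_earth = r_star_sun * R_SUN_IN_EARTH_RADII
        return (r_planet_earth / r_star_earth) ** 2

    print(transit_depth(1.0, 1.0))   # Earth/Sun: ~8.4e-5, an 84 ppm dip
    print(transit_depth(1.5, 0.2))   # super-Earth/M dwarf: ~4.7e-3, ~56x deeper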

To get around the transit alignment problem, NASA has long been studying starshade concepts, with a precisely shaped starshade flying tens of thousands of kilometers from a space-based telescope. Using such a configuration, we can start to overcome the problem of glare from the star masking the presence of a planet. Earth is ten billion times fainter than the Sun in visible light, but a properly shaped starshade can reduce the contrast problem, particularly in the infrared. Can the upcoming Wide-Field Infrared Survey Telescope (WFIRST), designed for wide-field imaging and spectroscopic surveys of the near-infrared sky, be adapted for starshade capability?
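
The geometry that drives those huge separations is worth a moment: the shade must subtend an angle at the telescope no larger than the star-planet separations being probed, so the geometric inner working angle is roughly the shade radius divided by the flying distance. Here is a sketch with illustrative values of my own, loosely in the range studied for Exo-S rather than its exact design:

    import math

    ARCSEC_PER_RAD = 180.0 / math.pi * 3600.0   # ~206265

    def inner_working_angle_mas(shade_radius_m, separation_km):
        """Geometric IWA ~ shade radius / shade-telescope distance, in milliarcsec."""
        iwa_rad = shade_radius_m / (separation_km * 1000.0)
        return iwa_rad * ARCSEC_PER_RAD * 1000.0

    # A ~34 m diameter shade flying ~37,000 km from the telescope:
    print(inner_working_angle_mas(17.0, 37000))  # ~95 mas

    # For scale, a 1 AU orbit seen from 10 parsecs subtends 100 mas,
    # so this geometry reaches Earth-like orbits around nearby stars.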

Seager thinks that it can, and gives the idea an 80 percent chance of happening. This will involve a guide camera and communications system for closed-loop formation flying. That leaves us with a host of issues, including deployment (Seager showed testing on small starshade segments) and propulsion: how do you move the starshade around as you change the alignment between shade and telescope to fix upon a new target? Re-targeting, Seager noted, takes time, and solar-electric propulsion may be one way to handle the propulsion requirement. Centauri Dreams regular Ashley Baldwin, who follows space telescope issues in great detail, will be writing up starshade concepts here in the near future.

Image: Schematic of the starshade-telescope system (not to scale). Starshade viewing geometry with inner working angle (IWA) independent of telescope size. Credit: Exo-S: Starshade Probe-Class Exoplanet Direct Imaging Mission Concept: Interim Report. For more on the most recent work on starshades, including this report, see exep.jpl.nasa.gov/stdt.

The Great Days of 1983

The Tennessee Valley Interstellar Workshop, now in its third iteration with a fourth planned for 2016 in Chattanooga, is beginning to remind me of a storied conference held in 1983. The Conference on Interstellar Migration was held at Los Alamos in May of that year. It was designed to be multidisciplinary and included practitioners of anthropology, sociology, physics and astronomy as the attendees engaged with issues of emerging technologies, historical migrations, and the future of our species. The proceedings volume, Interstellar Migration and the Human Experience, is a key text for those trying to place our interstellar ambitions in context.

TVIW has always had a bit of the multidisciplinary about it, as we’ll see tomorrow, when I talk about papers not only from the physics perspective (Jim Benford on beamed sail work), but biology (Robert Hampson), biochemistry (Fred Sloop) and anthropology (Sam Lightfoot). This conference did not have as striking a mix among disciplines as the Los Alamos conference, but I’ve appreciated that the organizers have continued to bring in perspectives from a variety of sciences, the result of which is usually a helpful cross-pollination of ideas. We’ll be looking at how some of these played out as this week continues with more of my report from Oak Ridge.

The Transition from Rocky to Non-Rocky Planets

As I decompress from the Tennessee Valley Interstellar Workshop (and review my notes for next week’s report), I have the pleasure of bringing you Andrew LePage’s incisive essay on a key exoplanet question. Are some of the planets now considered potentially habitable actually unlikely to support life? Recent work gives us some hard numbers on just how large and massive a planet can be before its composition is likely to be closer to Neptune’s than to Earth’s. The transition from rocky to non-rocky planets is particularly important now, when our instruments are just becoming able to detect planets small enough to qualify as habitable. LePage, who writes the excellent Drew Ex Machina, remains optimistic about habitable planets in the galaxy, but so far the case for many of those identified as such may be weaker than we had thought. A prolific writer, Drew is also a Senior Project Scientist at Visidyne, Inc., where he specializes in the processing and analysis of remote sensing data.

by Andrew LePage

For much of the modern era, astronomy has benefitted greatly from the efforts of amateur scientists. But while amateur astronomers equipped with telescopes have certainly filled many important niches left by the far less numerous professionals in the field, others interested in astronomy equipped with nothing more than a computer and an Internet connection are capable of making important contributions as well. One project taking advantage of this resource is Planet Hunters.

The Planet Hunters project was originally started four years ago by the Zooniverse citizen science program to enlist the public’s help in searching through the huge photometric database of NASA’s Kepler mission looking for transits caused by extrasolar planets. While automated systems have been able to uncover thousands of candidate planets, they are limited to finding only what their programmers designed them to find – multiple, well defined transits occurring at regular intervals. The much more adaptable human brain is able to spot patterns in the changes in the brightness of stars that a computer program might miss but could still indicate the presence of an extrasolar planet. Currently in Version 2.0, the Planet Hunters project has uncovered 60 planet candidates to date through the efforts of 300,000 volunteers worldwide.

A paper just published in The Astrophysical Journal by a team of astronomers, with Joseph Schmitt (Yale University) as lead author, describes the latest find by Planet Hunters. The target of interest is a billion-year-old, Sun-like star called Kepler 289 located about 2,300 light years away. Automated searches of the Kepler data had earlier found two planets orbiting this distant star: a large super-Earth with a radius 2.2 times that of the Earth (or RE) in a 34.5-day orbit, originally designated Kepler 289b (called PH3 b in the new paper), and a gas giant with a radius of 11.6 RE in a 125.8-day orbit, Kepler 289c (now also known as PH3 d). The new planet, PH3 c, has a radius of 2.7 RE and a mean orbital period of 66.1 days. With a mean stellar flux about 11 times that of Earth, this planet is highly unlikely to be habitable, but its properties have profound implications for assessing the potential habitability of other extrasolar planets.

The planet had been missed by earlier automated searches because its orbital period varies regularly by 10.5 hours over the course of ten orbits due to its strong interactions with the other two planets, especially PH3 d. Because of this strong dynamical interaction, it was possible for Schmitt et al. to use the Transit Timing Variations or TTVs observed in the Kepler data to compute the masses of these three planets much more precisely than could be done using precision radial velocity measurements. The mass of the outer planet, PH3 d, was found to be 132±17 times that of Earth (or ME) or approximately equivalent to that of Saturn. The mass of the inner planet, PH3 b, was poorly constrained with a value of 7.3±6.8 ME. The newest discovery, PH3 c, was found to have a mass of 4.0±0.9 ME which, when combined with the radius determined using Kepler data, yields a mean density of 1.2±0.3 g/cm3 or only about one-fifth that of the Earth. Models indicate that this density is consistent with PH3 c possessing a deep, hot atmosphere of hydrogen and helium making up about half of its radius or around 2% of its total mass.
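
It’s easy to check the density figure from the quoted mass and radius: in Earth-scaled units the bulk density is just Earth’s mean density (about 5.51 g/cm3) times (M/ME)/(R/RE)³. A quick consistency check on the numbers above (my own arithmetic, not a calculation from the paper):

    # Bulk density from mass (Earth masses) and radius (Earth radii):
    # rho = rho_Earth * M / R^3, with rho_Earth ~ 5.51 g/cm^3.

    RHO_EARTH = 5.51  # g/cm^3

    def bulk_density(mass_me, radius_re):
        """Mean density in g/cm^3."""
        return RHO_EARTH * mass_me / radius_re ** 3

    print(bulk_density(4.0, 2.7))    # PH3 c: ~1.1 g/cm^3, matching the quoted 1.2 +/- 0.3
    print(bulk_density(17.0, 2.35))  # Kepler 10c (discussed below): ~7.2 vs 7.1 +/- 1.0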

PH3 c is yet another example of a growing list of known low-density planets with masses just a few times that of the Earth that are obviously not terrestrial or rocky in composition. Before the Kepler mission, such planets were thought to exist but their exact properties were unknown because none are present in our solar system. As a result, the position in parameter space of the transition from rocky to non-rocky planets and the characteristics of this transition were unknown. So when astronomers were developing size-related nomenclature to categorize the planets they expected to find using Kepler, they somewhat arbitrarily defined “super-Earth” to be any planet with a radius in the 1.25 to 2.0 RE range regardless of its actual composition. Planets in the 2.0 to 4.0 RE range were dubbed “Neptune-size”. This has generated some confusion over the term “super-Earth” and has led to claims about the potential habitability of these planets being made in the total absence of an understanding of the true nature of these planets. Now that Kepler has found planets in this size range, astronomers have started to examine the mass-radius relationship of super-Earths.

The first hints about the characteristics of this transition from rocky to non-rocky planets were discussed in a series of papers published earlier this year. Using planetary radii determined from Kepler data and masses found by precision radial velocity measurements and analysis of TTVs, it was found that the density of super-Earths tended to rise with increasing radius as would be expected of rocky planets. But somewhere around the 1.5 to 2.0 RE range, a transition is passed where larger planets tended to become less dense instead. The interpretation of this result is that planets with radii greater than about 1.5 RE are increasingly likely to have substantial envelopes of various volatiles such as water (including high pressure forms of ice at high temperatures) and thick atmospheres rich in hydrogen and helium that decrease a planet’s bulk density. As a result, these planets can no longer be considered terrestrial or rocky planets like the Earth but would be classified as mini-Neptunes or gas dwarfs depending on the exact ratios of rock, water and gas.

Image: It now appears that many of the fanciful artist depictions of super-Earths are wrong and that most of these planets are more like Neptune than the Earth (NASA Ames/JPL-Caltech).

A detailed statistical study of this transition was submitted for publication this past July by Leslie Rogers (a Hubble Fellow at the California Institute of Technology) who is also one of the coauthors of the PH3 c discovery paper. In her study, Rogers confined her analysis to transiting planets with radii less than 4 RE whose masses had been constrained by precision radial velocity measurements. She excluded planets with masses determined by TTV analysis since this sample may be affected by selection biases that favor low-density planets (for a planet of a given mass, a large low-density planet is more likely to produce a detectable transit event than a smaller high-density planet). Rogers then determined the probability that each of the 47 planets in her Kepler-derived sample were rocky planets by comparing the properties of those planets and the associated measurement uncertainties to models of planets with various compositions. Next, she performed a statistical analysis to assess three different models for the mass-radius distribution for the sample of planets. One model assumed an abrupt, step-wise transition from rocky to non-rocky planets while the other two models assumed different types of gradual transitions where some fraction of the population of planets of a given radius were rocky while the balance were non-rocky.

Rogers’ analysis clearly showed that a transition took place between rocky and non-rocky planets at 1.5 RE with a sudden step-wise transition being mildly favored over more gradual ones. Taking into account the uncertainties in her analysis, Rogers found that the transition from rocky to non-rocky planets takes place at no greater than about 1.6 RE at a 95% confidence level. Assuming a simple linear transition in the proportions of rocky and non-rocky planets, no more than 5% of planets with radii of about 2.6 RE will have densities compatible with a rocky composition to a 95% confidence level. PH3 c, with a radius of 2.7 RE, exceeds the threshold found by Rogers and, based on its density, is clearly not a terrestrial planet.

An obvious potential counterexample to Rogers’ maximum rocky planet size threshold is the case of Kepler 10c, which made the news early this year. Kepler 10c, with a radius of 2.35 RE determined by Kepler measurements and a Neptune-like mass of 17 ME determined by radial velocity measurements, was found to have a density of 7.1±1.0 g/cm3. While this density, which is greater than Earth’s, might lead some to conclude that Kepler 10c is a solid, predominantly rocky planet, Rogers counters that its density is in fact inconsistent with a rocky composition by more than one-sigma. Comparing the measured properties of this planet with various models, she finds that there is only about a 10% probability that Kepler 10c is in fact predominantly rocky in composition. It is much more likely that it possesses a substantial volatile envelope albeit smaller than Neptune’s given its higher density.

While much more work remains to be done to better characterize the planetary mass-radius function and the transition from rocky to non-rocky planets, one of the immediate impacts of this work is on the assessment of the potential habitability of extrasolar planets. About nine planets found to date in the Kepler data have been claimed by some to be potentially habitable. Unfortunately, all but two of these planets, Kepler 62f and 186f, have radii greater than 1.6 RE and it is therefore improbable that they are terrestrial planets, never mind potentially habitable planets.

This still leaves about a dozen planets that have been frequently cited as being potentially habitable that were discovered by precision radial velocity surveys whose radii are not known. However, we do know their MP sin i values, where MP is the planet’s actual mass and i is the inclination of the orbit to our line of sight. Since this angle cannot be derived from radial velocity measurements alone, only the minimum mass of the planet can be determined, or the probability that the actual mass is in some range. Despite this limitation, the MP sin i values can serve as a useful proxy for radius.
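
For randomly oriented orbits, cos i is uniformly distributed, which gives a simple closed form for the chance that a planet’s true mass exceeds some threshold M given its measured minimum mass m: P = 1 - sqrt(1 - (m/M)^2). A small sketch of that statistic, using hypothetical minimum masses of my own choosing rather than values for any of the planets LePage discusses:

    import math

    def prob_true_mass_exceeds(m_min, m_threshold):
        """P(true mass > m_threshold) given m_min = Mp * sin(i), assuming
        isotropic orbit orientations (cos i uniform on [0, 1])."""
        if m_min >= m_threshold:
            return 1.0                   # minimum mass already over the line
        s = m_min / m_threshold          # exceeding the threshold needs sin(i) < s
        return 1.0 - math.sqrt(1.0 - s * s)

    # Hypothetical minimum masses (Earth masses) against a 6 ME rocky limit:
    for m in (5.8, 5.0, 4.0):
        print(f"m sin i = {m} ME -> P(mass > 6 ME) = {prob_true_mass_exceeds(m, 6.0):.2f}")

The steep falloff (about 0.74, 0.45 and 0.25 for these three cases) shows why even minimum masses well below a threshold leave real uncertainty about whether a given world is rocky.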

Rogers optimistically estimates that her 1.6 RE threshold corresponds to a planet with a mass of about 6 ME assuming an Earth-like composition (still ~50% larger than the measured mass of PH3 c, now known to be a non-rocky planet). About half of the planets that some have claimed to be potentially habitable have minimum masses that exceed this optimistic 6 ME threshold, while the rest have better than even odds of their actual masses exceeding it. If the threshold for the transition from rocky to non-rocky planets is closer to the 4 ME mass of PH3 c, the odds of any of these planets being terrestrial are worse still. The unfortunate conclusion is that none of the planets discovered so far by precision radial velocity surveys are likely to be terrestrial planets, and they are therefore poor candidates for being potentially habitable.

Please do not get me wrong: I have always been a firm believer that the galaxy is filled with habitable terrestrial planets (and moons, too!). But in the rush to find such planets, it now seems that too many overly optimistic claims have been made about too many planets before enough information was available to properly gauge their bulk properties. Preliminary results on the planetary mass-radius relationship now hint that the maximum size of a terrestrial planet is probably about 1½ times the radius of the Earth, or around 4 to 6 times Earth’s mass. Any potentially habitable planet, in addition to being inside the habitable zone of the star it orbits, must also be smaller than this. Unfortunately, while recent work suggests that planets of this size might be common, our technology is only just able to detect them at this time. With luck, over the coming years as more data come in, we will finally have a more realistic list of potentially habitable planet candidates that will bear up better under close scrutiny.

The discovery paper for PH3 c by Schmitt et al., “Planet Hunters VII: Discovery of a New Low-Mass, Low Density Planet (PH3 c) Orbiting Kepler-289 with Mass Measurements of Two Additional Planets (PH3 b and d)”, The Astrophysical Journal, Vol. 795, No. 2, ID 167 (November 10, 2014) can be found here. The paper by Leslie Rogers submitted to The Astrophysical Journal, “Most 1.6 Earth-Radius Planets are not Rocky”, can be found here.

For a fuller discussion of how Rogers’ work impacts the most promising planets thought by many to be potentially habitable, please refer to Habitable Planet Reality Check: Terrestrial Planet Size Limit on my website Drew Ex Machina.

Tennessee Valley Interstellar Workshop

I’m at the Tennessee Valley Interstellar Workshop in Oak Ridge for the next few days. As I’ve done at past conferences, I’ll need to spend my time taking the notes that will be turned into next week’s entries here. That means no further posts until Friday, though I’ll try to keep the comment moderation going, perhaps with a few delays. TVIW 2014 has lined up a good group of speakers including, besides MSFC’s Les Johnson himself (TVIW’s founder), exoplanet hunter Sara Seager, beamed sail specialist Jim Benford, the SETI League’s Paul Shuch and TZF founder Marc Millis, along with a healthy representation from Icarus Interstellar. I’m also looking forward to the workshop tracks and will be participating in one called “Language as Reality: A Near-Term Roadmap for Exploiting Opportunities and Natural Experiments Here on Terra Firma to Inform *C*ETI.” Expect a complete report when I get back.
