Kepler-16b: Inside a Chilly Habitable Zone?

The annual meeting of the American Astronomical Society is now in session in Austin, sure to provide us with interesting fodder for discussion in coming days. Just coming off embargo yesterday was news of further study of the interesting Kepler-16 system. This one made quite a splash last fall when the planet known as Kepler-16b was discovered to orbit two stars, with the inevitable echoes of Star Wars and the twin suns that warmed the planet Tatooine. This planet, though, was a gas giant more reminiscent of chilly Saturn than a cozily terrestrial world.

Image: An artist’s conception of the Kepler-16 system (white) from an overhead view, showing the planet Kepler-16b and the eccentric orbits of the two stars it circles (labeled A and B). For reference, the orbits of our own solar system’s planets Mercury and Earth are shown in blue. New work out of the University of Texas at Arlington explores the question of habitability in a system like this. Credit: NASA/Ames/JPL-Caltech.

You’ll recall, too, that Kepler-16b circles both a K-dwarf with about 70 percent of the Sun’s mass and a red dwarf of about a fifth of a solar mass. Although the planet orbits them at a roughly Venus-like distance, its central stars are small and dim enough that temperatures appear too cold for life. At the Austin meeting, however, researchers from the University of Texas at Arlington made the case that an Earth-class planet could exist here as an exomoon orbiting the gas giant. They have no indication that such a planet actually exists, as Zdzislaw Musielak (UT-Arlington) is quick to point out, but the work is interesting nonetheless:

“This is an assessment of the possibilities,” said Musielak. “We’re telling them where a planet has to be in the system to be habitable. We’re hoping they will look there.”

Making such a moon habitable would require an atmosphere with a strong greenhouse effect, which high levels of gases like carbon dioxide or methane could provide. Such an atmosphere would widen what we would normally consider the habitable zone around the two stars. Al Jackson noted today in an email from Austin that all kinds of new ways to study Kepler candidates are coming to the fore, and remember that Kepler still has a long way to go before its primary mission is complete. As for exomoons, we’ve yet to identify one, but so much good work has been done on how to achieve such a detection that it surely won’t be long before we have candidate exomoons to focus on.

The paper on this work is not yet out, but I’ll announce it here when it’s available. Meanwhile, thoughts on how many habitable worlds are out there continue to be expansive. More on this tomorrow, as we return to news coming out of the Austin conference.


Innovative Interstellar Explorer: A Response to Questions

Ralph McNutt’s recent update on the progress of the Innovative Interstellar Explorer concept elicited plenty of comments, enough that Dr. McNutt wanted to answer them in a new post. Now at Johns Hopkins University Applied Physics Laboratory, McNutt is Project Scientist and a Co-Investigator on NASA’s MESSENGER mission to Mercury, Co-Investigator on NASA’s Solar Probe Plus mission to the solar corona, Principal Investigator on the PEPSSI investigation on the New Horizons mission to Pluto, a Co-Investigator for the Voyager PLS and LECP instruments, and a Member of the Ion Neutral Mass Spectrometer Team on the Cassini Orbiter spacecraft. With all that on his plate, it’s hard to see how he has time for anything else, but McNutt also continues his work as a consultant on the Project Icarus interstellar design study. His Innovative Interstellar Explorer is a precursor mission designed to push our technologies hard.

by Ralph McNutt

I typically do not get involved with commenting on comments just because of the time constraints of protracted discussions, but some of the questions raised by your readers are, I think, very good and deserving of a response. [The original post is Update on Innovative Interstellar Explorer — readers may want to skim through the comments there to get up to speed — PG].

Let me try to take the comments, for the most part, in order. At one point we did take a look at Sedna and the other large trans-Neptunian objects (TNOs). The orbit of Sedna (which can be found here) will move through ~60° of arc and through its perihelion between now and 2100 (just prior to the aphelion of Pluto), out of an orbital period of ~12,600 years. All of this motion is within 90 AU of the Sun, the orbital inclination is ~12°, and the object is certainly accessible with the appropriate “tweak” at a Jupiter gravity assist. Such an aim point does, however, constrain where one can aim with respect to the direction of the incoming interstellar wind. To exit the solar system rapidly, one wants a speed as high as possible. At a flyby speed of “only” ~17 km/s (about the speed of Voyager 1 past Titan, and faster than the ~13 km/s of New Horizons past Pluto), close imaging is problematic: with a radius of 1500 km, that is one object radius traveled in ~100 s. Several months of high-resolution imaging are possible with a large camera such as LORRI on New Horizons, but not with a cell phone camera (which would die rapidly in the space radiation environment shortly after launch anyway).
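A quick numerical sketch, entirely illustrative and using only the rough figures above (the 0.1-degree threshold and 100,000 km closest approach are made-up example values, not mission parameters), shows both how quickly the close-approach geometry changes and how long the target still subtends a useful angle for a capable camera:

```python
# Hypothetical back-of-the-envelope flyby timescales, using only the
# approximate numbers quoted above (not IIE mission analysis).
import math

FLYBY_SPEED_KM_S = 17.0    # roughly Voyager 1's speed past Titan
TARGET_RADIUS_KM = 1500.0  # radius figure used in the discussion above

# Time to cross one object radius at the flyby speed (~90 s).
t_radius_s = TARGET_RADIUS_KM / FLYBY_SPEED_KM_S
print(f"One object radius is covered in ~{t_radius_s:.0f} s")

def time_above_angular_size(min_angle_deg, closest_approach_km,
                            radius_km=TARGET_RADIUS_KM,
                            speed_km_s=FLYBY_SPEED_KM_S):
    """Seconds during which the target's angular diameter exceeds min_angle_deg,
    assuming a straight-line pass (small-angle approximation)."""
    max_range_km = 2.0 * radius_km / math.radians(min_angle_deg)
    if max_range_km <= closest_approach_km:
        return 0.0
    half_chord_km = math.sqrt(max_range_km**2 - closest_approach_km**2)
    return 2.0 * half_chord_km / speed_km_s

# Example: how long the target appears larger than 0.1 degree across
# for a made-up 100,000 km closest approach distance.
hours = time_above_angular_size(0.1, 1.0e5) / 3600.0
print(f"Angular diameter exceeds 0.1 deg for ~{hours:.0f} hours")
```

The close-in geometry changes within minutes, while long-range imaging remains useful for days or months only if the instrument has enough aperture, which is the point about LORRI-class cameras versus anything smaller.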

Eris, with an orbital inclination approaching 45°, is currently about 97 AU from the Sun (orbit here) and is inbound toward perihelion, crossing the plane of the ecliptic at ~90 AU in the early 2070s but still remaining outside of 83 AU in 2100. Makemake (orbit) is currently well above the plane of the ecliptic, passing through it just after the end of this century and just inside of 50 AU; its orbit has a relatively small eccentricity of ~0.16 and an inclination of ~29°. The real problem is that a flyby of a TNO is a different mission altogether.

It is perhaps also worth noting that nuclear electric propulsion has been looked at, and in some detail, under NASA’s Project Prometheus. The problem is that the power system needs a specific mass no greater than ~30 kg/kW (something noted by Ernst Stuhlinger back in the 1960s; Stuhlinger literally wrote the book on ion propulsion) for nuclear electric propulsion (NEP) to deliver an advantage in speed. But that figure has to include the mass of the system for dumping the reactor’s waste heat (the second law of thermodynamics at work) as well as its mechanical supports. The Prometheus architecture came in at over twice that, and that is the problem. To date, all NEP designs come in underpowered when engineering closure on the system as a whole is examined. Think of Hiram Maxim’s steam-powered airplane versus the gasoline-powered airplane of the Wright brothers. This is ultimately the problem with VASIMR as well: a more mass-efficient means of providing the wall-plug electricity is needed if it is ever to become a real system.
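One way to see why the ~30 kg/kW figure is the dividing line is Stuhlinger’s characteristic velocity, v_ch = sqrt(2ηt/α), which sets the scale of mission velocity a power-limited electric propulsion system can usefully deliver for powerplant specific mass α, conversion efficiency η, and thrust time t. The numbers below (a 10-year burn, 70 percent efficiency) are illustrative assumptions, not values from the IIE or Prometheus studies:

```python
# Stuhlinger's characteristic velocity: v_ch = sqrt(2 * eta * t / alpha),
# with alpha the powerplant specific mass (kg/W), eta the conversion
# efficiency, and t the thrust time. Illustrative values only.
import math

def characteristic_velocity_km_s(alpha_kg_per_kw, burn_time_years, efficiency=0.7):
    """Characteristic velocity (km/s) for a power-limited electric thruster."""
    alpha_kg_per_w = alpha_kg_per_kw / 1000.0
    t_seconds = burn_time_years * 365.25 * 86400.0
    return math.sqrt(2.0 * efficiency * t_seconds / alpha_kg_per_w) / 1000.0

# 30 kg/kW is the target cited above; 60 kg/kW is roughly "over twice that."
for alpha in (30.0, 60.0, 100.0):
    v_ch = characteristic_velocity_km_s(alpha, burn_time_years=10.0)
    print(f"alpha = {alpha:5.1f} kg/kW -> v_ch ~ {v_ch:4.0f} km/s")
```

Doubling the specific mass cuts the deliverable speed by a factor of about 1.4, which is why a design heavy with radiators and supporting structure quickly loses the advantage NEP is supposed to offer.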

The spacecraft mass question is a good one as well. We tried pushing that on the precursor to IIE that was funded by NIAC, an “all the stops pulled out” approach that reduced the spacecraft mass to ~150 kg including a payload. Again the problem is engineering closure. Even if I miniaturize the electronics down to microminiaturized solid-state components, I still need communications, guidance and control, power, thermal control, and a payload. The payload sensors have to be of finite size just to collect the data, even if all I am fighting is Poisson statistics, which can be traded against integration time (but it makes no sense to spend 10 years to make one measurement). Even with an iPad or equivalent that is radiation hardened, one cannot reduce the mass arbitrarily and still make the measurements that are the raison d’être for the effort in the first place. Ultimately, one runs into physical limits set by the properties of the materials from which one constructs components.
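The Poisson trade can be made explicit: the fractional uncertainty of a counting measurement scales as 1/√N, and the counts N scale with collecting area times integration time, so shrinking the sensor directly stretches the time needed for the same measurement quality. A minimal sketch with made-up flux and area numbers, purely for illustration:

```python
# Poisson-limited counting: SNR = sqrt(N), and N = flux * area * time,
# so halving the collecting area doubles the required integration time.
# Flux, SNR target, and areas below are made-up illustrative numbers.

def integration_time_s(target_snr, flux_per_cm2_s, area_cm2):
    """Seconds needed so that sqrt(counts) reaches target_snr."""
    counts_needed = target_snr ** 2
    return counts_needed / (flux_per_cm2_s * area_cm2)

flux = 1.0e-4      # counts per cm^2 per second (hypothetical particle flux)
target_snr = 10.0  # i.e. 10 percent counting uncertainty

for area in (100.0, 50.0, 10.0):  # detector collecting area in cm^2
    hours = integration_time_s(target_snr, flux, area) / 3600.0
    print(f"area = {area:6.1f} cm^2 -> ~{hours:5.1f} hours per measurement")
```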

Part of this is manufacturing and part is the physics of the material itself. Practicalities are also involved. For the NIAC effort, we looked at the idea (and not a new one) of using ultra-low power (ULP) electronics running at liquid nitrogen temperatures. But now I have a real problem in testing such devices, as the coefficients of thermal expansion of the materials as well as the Johnson noise can preclude operation at room temperature. I could fix that with a lab and facilities on the Moon, but now that infrastructure is required, and the technicians would have to work in space suits – and I have a scenario that does not close economically (and may not technically either). Everyone in the deep-space robotic business has mass reduction as a primary goal – on everything. One can build *something* for less than ~250 kg, but the indications are that to build the desired functionality, that type of mass limit will be “sporty.”

Image: IIE Initial Concept Closeup. Credit: JHU/APL.

With respect to launch vehicles, the use of “really big” vehicles for robotic missions has always been problematic because of the cost. There was a Voyager Mars Program in the late 1960s which envisioned using a Saturn V to send two large rovers to Mars. Similarly, we looked at implementing IIE with an Ares V combined with either a Centaur or a NERVA upper stage. While flyout times are reduced, the decrease is not a factor of two.

With respect to communications, in the NIAC work we looked at an IR optical communications system running at about 890 nm (see this paper). That was not the problem. The problem was holding spacecraft pointing well enough to keep the laser spot on the Earth from 1000 AU (the requirement for that more aggressive mission). One can certainly do the pointing with a sufficiently capable guidance system, but that drove the mass up even more. We found that the best trade was a high-gain antenna (HGA) of just under 3 meters diameter (about what is on Pioneer 10 and 11 and New Horizons). One driver is holding tolerances during manufacture, and another is holding them under the vibration environment imposed by the launch. Materials are not infinitely stiff (which is good, because then they would break), but that means corrections and feedbacks as required. The other interesting thing about a laser comm system running from ~5 light days out is that closed-loop operation is not credible, and the beam is sufficiently small and the distance sufficiently large that, to minimize power, I need a clock with an ephemeris that can be used to point the transmitter to where the Earth will be when the modulated laser carrier arrives there.
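A rough sketch of the geometry shows why open-loop pointing against an onboard clock and ephemeris is unavoidable at these distances. The 890 nm wavelength comes from the discussion above; the 30 cm transmit aperture and the round figure for Earth’s orbital speed are illustrative assumptions, not values from the NIAC study:

```python
# Rough geometry of an 890 nm laser link from 1000 AU. The 30 cm aperture
# and other round numbers are illustrative assumptions, not NIAC values.
AU_M = 1.496e11
C_M_S = 2.998e8

wavelength_m = 890e-9   # IR laser wavelength quoted above
aperture_m = 0.3        # hypothetical transmit telescope aperture
distance_m = 1000.0 * AU_M

# Diffraction-limited beam divergence ~ lambda / D (order of magnitude).
divergence_rad = wavelength_m / aperture_m
spot_diameter_km = divergence_rad * distance_m / 1000.0

# One-way light travel time, and how far Earth moves in that time.
light_time_s = distance_m / C_M_S
earth_motion_km = 29.8 * light_time_s  # Earth's orbital speed ~29.8 km/s

print(f"Beam divergence   : {divergence_rad * 206265:.2f} arcsec")
print(f"Spot at 1000 AU   : ~{spot_diameter_km:,.0f} km across")
print(f"Light travel time : {light_time_s / 86400:.1f} days one way")
print(f"Earth's motion    : ~{earth_motion_km:,.0f} km over that interval")
# The spot (a few hundred thousand km) is far smaller than the distance
# Earth travels while the signal is in flight (~15 million km), so the
# transmitter must be aimed at where Earth *will* be, not where it appears.
```

Sub-arcsecond pointing only pays off if the spacecraft knows, from its own clock and ephemeris, where to lead the target.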


Our Meaning-Stuffed Dreams

Gregory Benford’s work is so widely known that it almost seems absurd to introduce him, but for any Centauri Dreams readers who have somehow missed it, I challenge you to read In the Ocean of Night and not become obsessed with reading this author’s entire output. This week has been a science fictional time for Centauri Dreams, with discussion of SF precedents to modern discoveries in the comments for stories like Marc Millis’ ‘Future History.’ So it seems appropriate to end the week with an essay Greg published yesterday on his own site, one that appealed to me so much that I immediately asked him for permission to run it again here.

In the essay, Greg takes a look at science fiction writer Thomas Disch and in particular the way his thoughts on SF illuminate not just the genre but the world we live in. It’s insightful stuff, and makes me reflect on how our ideas of the future shape our upcoming realities. I will also admit to a fascination with science fiction’s history that never wanes, a passion that is reignited whenever I see serious thought being given to the intricate machinery of modern prose.

by Gregory Benford

I recently reread The Dreams Our Stuff Is Made Of by my old friend Thomas Disch (Free Press, 1998, $25). Tom is now gone, but his ideas about science fiction and where it has gone seem as fresh as ever.

Here are some thoughts on the book, which still bears consideration. This sadly sardonic survey of science fiction worries its subject from many angles: historical, literary, sociological. Science fiction (sf) is perhaps the defining genre of the twentieth century, its conquering armies still camped outside the Rome of the literary citadels.

It’s an old story. Throughout this century, conventional literature persistently avoided thinking about conceptually altered tomorrows, and retreated into a realist posture of fiction of ever-smaller compass. By foregrounding personal relations, the novel of character came, especially in a classic debate around World War I between Henry James and H.G. Wells, to claim the pinnacle of orthodox fiction. James won that argument, surrendering the future to the genre that would later increasingly set the terms of social debate.

Disch underpins his wryly witty observations with poet Delmore Schwartz’s resonant title from 1938, *In Dreams Begin Responsibilities*. This “pregnant truth” is his clarion call to the genre that once fascinated him but has plainly called to him less since the mid-1980s. Sf takes up Big Ideas, but does not always treat them well. This unfulfilled promise vexes Disch, and he rummages among the cranks, fakes and crazies that often camped near the Legions of the Future. He treats us to tours of mesmerism from the time of Poe, to UFOs and their exploiters (Whitley Strieber, a flagrant example), to the huge religion invented in an sf magazine, Scientology. These unseemly neighbors of the genre betray America’s great historical trouble: high dreams, ready gullibility. Some skepticism is quite in order, particularly in the New Age.

The persistence of cranks and fools in the ranks of sf is sobering. We’ll scarcely be invited to tea if we keep such companions. This blends with Disch’s class analysis of literature.

Still, “The difference between highbrow and low — between Eliot and Poe, between mainstream and scifi — is not one that can be mapped by the conventional criteria of criticism.” He supports this by showing that Poe is more a formalist than Eliot, and less given to overt lecturing and preachiness. Instead, “The essential difference is not one of aesthetics or of some subtler metaphysical nature, but of the two writers’ antithetical social and economic positions.” Poe was a popular, market-driven writer, a “magazinist,” while Eliot was supported by a high culture with subtle patronage.

Sf is best seen as the voice of a rising class that sprang from the burgeoning American masses: hopeful, middle-class technological types. Their very earnestness carried their arguments and visions into the souls of the one country most responsible for our visions of the future; sf has been notably an American creation since the great era of Wells.

Predictably, its grandiose dreams lead to its worst faults. Sf’s greatest vice is lecturing. In the face of such large ideas, many authors became the “School Teacher Absolute, a fate that would befall so many later sf writers — Heinlein, Asimov, Bradbury, Le Guin, Delany — that it must be considered an occupational hazard.” It can carry a writer away. Disch sees the later work of Philip K. Dick, particularly the important Valis, as “madness recollected in a state of borderline lucidity.”

Such faults go with the territory, but they do not dominate. The true strength of the genre lies in its power to convince by imagining. “A theory can be controverted; a myth persuades at gut level.”

We sf writers were often great makers of myth, some of it lifted from written sf and tarted up for media consumption. *Star Trek* is notorious for looting the more thoughtful work of writers for their striking effects, leaving behind most of the thought and subtlety. Of the show’s huge global audience, Disch observes, “few audiences like to be challenged,” for after all, “it is traditionally the prelude to a duel, not to a half-hour of light entertainment. Any artist’s first order of business is not to challenge but to entice.”

He views this most persistent of TV shows from a fashion angle: actors in pajamas. Their starship looks much like an office from the inside, with lookalike uniforms: “the same parables of success-through-team-effort that can be found on such later workplace-centered sitcoms as The Mary Tyler Moore Show and Designing Women.”

Trek was thus the prophet of the politically correct, multicultural future just ahead of us, with workplace equality conspicuously displayed. Disch wrings much humor from this insight, yet surely the crucial nature of both Star Trek and Star Wars lies in their invocation of family. The strangeness of outer-space futures had until then been so daunting for audiences that it typically served as the backdrop for horror (the Alien series, etc.).

Star Trek’s insight lay in the promise of going to the stars together, with well-defined stereotypes who could supply the emotional frame for the potentially jarring truths of these distant places. That is why the cultures they met proved so boring: “Blandness and repetition can be comforting, and comfort is a major desideratum in bedtime stories.” Alas, the genre set out to do more than rock us to sleep.

The market now mirrors his withering analysis. Despite his assertion that “three or four slots on the best-seller lists are occupied by SF titles,” in fact their occupants are fantasy tomes and Michael Crichton clones, not actual sf at all. Only one true sf novel I can recall from the 1990s made the lists for long: Arthur C. Clarke’s 3001, a media-driven sequel to a sequel to a sequel. Instead, fantasy reigns supreme.

Indeed, Disch believes that once space travel, sf’s grand metaphor, proved to mean long voyages to inhospitable places, the genre reverted to fantasy-like motifs. There is truth in this, both in the rise of genre fantasy in books (now plagued with a numbing sameness and endless trilogies) and in the ascendancy of Joseph Campbell (savant of the mythic-archetype theory of storytelling, as used by George Lucas in Star Wars) over John W. Campbell (tough-minded editor of Astounding magazine, the font of sf’s Golden Age, yet also the crucible of Scientology and crank ideas like the infamous Dean Drive).

This retreat from observable fact, that the moon is indeed a harsh mistress, signals for Disch the end of sf’s best days. Though he scorns the Heinlein-Pournelle wing of hard sf (“Space is like Texas, only larger.”), he confesses a fondness for that seminal work of physical exploration, Hal Clement’s Heavy Planet.

Certainly, “hardness” in the sense of scrupulous concern for the facts and methods of science remains for many the core of the field, and its always hopeful promise. Hardness has been appropriated by some for political hard-nosed analysis, often with a libertarian bias, sometimes even for a conservative one — a seeming contradiction, for a “literature of change.”

Clement’s seminal world-building took us to far exotica, to meet the strange face to face. Indeed, aliens are the most pointed sf motif. “If God can’t be coerced into breaking his silence, at least he can send emissaries,” a neat compression of science’s failure to reveal the holy, and sf’s literary attempt to find it metaphorically in the alien. Aliens are only passingly interesting to see; what one wants to do is talk to them, sense the strangeness of another mind.

Yet this is not the focus of the movies and TV, which have turned sf’s aliens into horror shows or neat parables. “Screenwriters do not have the luxury that novelists enjoy of taking the time to explain things, to pose riddles and work them out, to think. Such bemusements can be the glory of sf (as of the deductive mystery, another genre poorly served by film)” and we see it seldom in the torrent of special effects circuses pouring from our screens.

In the late 1990s we have entered an era when special effects can show us just about anything, sometimes at surprisingly little cost. This could liberate sf in the arena by which it is increasingly judged, the visual.

I believe this to be the great challenge to the genre: to use its insights and methods to reach the great potential audience with more than simple spectacle. The western made such a transition in the 1950s, producing its highest works (High Noon, The Searchers, Shane) before running out of conceptual gas.

Written sf may have lesser prospects. Media tie-in work fills a (thankfully) separate section of the sf shelves in the larger bookstores. In the rising tide of media spinoff novels and “sharecropping” of imaginative territories pioneered by early greats, Disch sees the genre’s probable fate: “more of the same and more of the sameness.”

Need this be so? I find the quantity of fine written sf has never been higher, counter-balancing the media tie-in clones. This goes little noticed in the windy passageways of the literary castles, for the division of that Wells-James debate persists. There is a curious mismatch between the reviewing media and the reading public. One would expect an efficient market to shape book reviewing to the great strengths of contemporary America: genres, from the hardboiled detective to cutting-edge sf to wispy, traditional fantasy.

In the end, Disch seems saddened because the promise of the New Wave, just breaking when he entered the field in the 1960s, hissed away into the sands of time. But the legacy of his generation is deeper, raising the net in the genre’s perpetual tennis match between conventional literature’s subtle, stylish stamina and sf’s blunt, intellectual energies. True, Disch’s fellow marchers have largely fallen silent, but the advance of hard sf after them used weaponry they had devised. From Clement’s beginning, hard sf has fashioned a whole armament of methods, from which mainstream mavens like Tom Clancy, and savvy insiders like Larry Niven and Jerry Pournelle, have built rich provinces of their own. Neal Stephenson’s cultural insights and technoriffs, too, have found a huge audience.

Genres are best seen as constrained conversations, and sf is the leader and innovator in this. Constraint is essential, defining the rules and assumptions open to an author. If hard sf occupies the center of science fiction, that is probably because hardness gives the firmest boundary.

Genres are also like immense discussions, with ideas developed, traded, and variations spun down through time. Players ring changes on each other: a steppin’-out jazz band, not a solo concert in a plush auditorium. Contrast “serious” fiction (more accurately described, I believe, as merely self-consciously solemn), which proceeds from canonical classics that supposedly stand outside of time, deserving awe, looming great and intact by themselves.

Disch seems to sense the central draw of sf, but because he has been so isolated from it for so long, his expedition never reaches the core. Genre pleasures are many, but the quality of shared values within an on-going discussion may be the most powerful, enlisting lifelong devotion in its fans. In contrast to the Grand Canon view, genre reading satisfactions are a striking facet of modern democratic (“pop”) culture.

Disch does deplore the recent razoring of literature by critics: the tribes of structuralists, post-modernists, deconstructionists. To many sf writers, “post-modern” is simply a signature of exhaustion. Its typical apparatus (self-reference, heavy dollops of obligatory irony, self-conscious use of older genre devices, pastiche and parody) betrays lack of invention, of the crucial coin of sf, imagination. Some deconstructionists have attacked science itself as mere rhetoric, not an ordering of nature, seeking to reduce it to the status of the ultimately arbitrary humanities. Most sf types find this attack on empiricism a worn old song with new lyrics, quite retro.

At the core of sf lies the experience of science. This makes the genre finally hostile to such fashions in criticism, for it values its empirical ground. Deconstructionism’s stress on contradictory or self-contained internal differences in texts, rather than on their link to reality, often merely leads to literature seen as empty word games.

Sf novels give us worlds which are not to be taken as metaphors, but as real. We are asked to participate in wrenchingly strange events, not merely watch them for clues to what they’re really talking about. Sf pursues a “realism of the future” and so does not take its surrealism neat, unlike much avant-garde work that is easily confused with it. These followers of James have yet to fathom this. The Mars and stars and digital deserts of our best novels are, finally, to be taken as real, as if to say: life isn’t like this, it is this.

The best journeys can go to fresh places, not merely return us to ourselves. Despite Disch’s sad eulogy for the genre’s past, which he considers its high point, I suspect there are great trips yet to be taken.


100 Year Starship Winner Announced

These are good times for Icarus Interstellar, which teamed with the Dorothy Jemison Foundation and the Foundation for Enterprise Development to win the 100 Year Starship proposal grant. Mae Jemison, the first African-American woman to fly in space, founded DJF in honor of her late mother. As lead on the proposal, her organization now takes on the challenge of building a program that can last 100 years and might one day result in a starship. Centauri Dreams congratulates the winning trio, and especially Kelvin Long, Richard Obousy and Andreas Tziolas, whose labors in reworking the Project Daedalus design at Icarus Interstellar have paid off. While the award was announced to the winners at the end of last week, I held up the news here while the three parties involved coordinated their own announcement. But I see that other venues are picking up the story, as in this Sharon Weinberger piece for the BBC and now a similar article in Popular Science, so it seems time to go ahead with at least a mention on Centauri Dreams while we await the official announcement from Jemison.


Resolving the Mysteries of Titan’s Weather

A robust new computer model that couples the atmosphere of Titan to a methane reservoir on the surface goes a long way toward explaining not just how methane is transported on the distant moon, but also why the various anomalies of Titan’s weather operate the way they do. The model comes out of Caltech under the guidance of Tapio Schneider, working with, among others, outer system researcher extraordinaire Mike Brown. It gives us new insights into a place where the average surface temperature hovers around a chilly -185 degrees Celsius (-300 F).

Image: NASA’s Cassini spacecraft chronicles the change of seasons as it captures clouds concentrated near the equator of Saturn’s largest moon, Titan. (Credit: NASA/JPL/SSI).

Titan can be a frustrating place for meteorologists to understand, because during the course of its year things happen that, in the early days of research, didn’t make a lot of sense. The moon’s equator, for example, is an area where little rain is supposed to fall, but when the Huygens probe arrived, it saw evidence of rain runoff in the terrain. Later, storms were found occurring in the area that did not fit then-current models of circulation. The new three-dimensional model simulates Titan’s atmosphere for 135 of its years, which works out to roughly 3000 Earth years, and it produces intense equatorial rains during Titan’s vernal and autumnal equinoxes.

According to the researchers, rain is indeed rare at low latitudes, but as Schneider says, “When it rains, it pours.” And the equatorial regions aren’t the only venue on Titan that the new model addresses. Titan’s methane lakes cluster around the poles, and it has been established by Cassini’s unceasing labors that more lakes exist in the northern than the southern hemisphere. According to Schneider, methane collects in lakes near the poles because sunlight is weak enough in those regions that little methane evaporates.

As to why more lakes are found in the northern hemisphere, let me quote from the Caltech press release on this work:

Saturn’s slightly elongated orbit means that Titan is farther from the sun when it’s summer in the northern hemisphere. Kepler’s second law says that a planet orbits more slowly the farther it is from the sun, which means that Titan spends more time at the far end of its elliptical orbit, when it’s summer in the north. As a result, the northern summer is longer than the southern summer. And since summer is the rainy season in Titan’s polar regions, the rainy season is longer in the north.

And there you have it — the summer rains in the southern hemisphere may be more intense because of stronger sunlight to trigger storms, but over the course of a year, more rain falls in the north, filling the lakes and accounting for their distribution. Older explanations that relied on methane-producing cryogenic volcanoes look to be in danger of being supplanted by the new model of atmospheric circulation on Titan, an explanation that requires nothing esoteric. The beauty of the work is that the model predicts what we should see in the near future: Rising lake levels in the north over the next 15 years, and clouds forming over the north pole in the next two.
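As a side note, the Kepler’s-second-law argument in the quoted passage is easy to check with a small calculation. Using Kepler’s equation for an orbit with Saturn’s eccentricity (~0.056), and assuming purely for illustration that Titan’s northern summer spans the aphelion half of the orbit (true anomalies 90° to 270°), the far half takes measurably longer than the near half. The sketch below is illustrative only, not the Caltech team’s calculation:

```python
# How much longer is the aphelion half of Saturn's orbit than the
# perihelion half? Illustrative only: "northern summer" is assumed,
# for simplicity, to span true anomalies 90-270 degrees.
import math

def fraction_of_period_to_true_anomaly(nu_rad, e):
    """Fraction of the orbital period from perihelion to true anomaly nu (0..pi)."""
    # True anomaly -> eccentric anomaly -> mean anomaly (Kepler's equation).
    E = 2.0 * math.atan(math.sqrt((1.0 - e) / (1.0 + e)) * math.tan(nu_rad / 2.0))
    M = E - e * math.sin(E)
    return M / (2.0 * math.pi)

e_saturn = 0.056
period_years = 29.46  # orbital period of Saturn (and hence Titan) around the Sun

near_half = 2.0 * fraction_of_period_to_true_anomaly(math.pi / 2.0, e_saturn)
far_half = 1.0 - near_half

print(f"Perihelion half of the orbit: {near_half * period_years:.1f} years")
print(f"Aphelion half of the orbit  : {far_half * period_years:.1f} years")
print(f"Ratio (far / near)          : {far_half / near_half:.2f}")
```

The exact split depends on where Saturn’s equinoxes actually fall relative to perihelion, but the sign of the effect, a longer northern summer, is what the press release is describing.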

The rising lake levels and forming clouds give us a set of testable predictions by which to evaluate the model, what Schneider calls “a rare and beautiful opportunity in the planetary sciences,” and one which should help us refine the model as events progress. “In a few years,” he adds, “we’ll know how right or wrong they are.” Adding to its weight is the fact that the model reproduces the observed distribution of clouds on Titan, which earlier atmospheric circulation models had failed to do. The paper is Schneider et al., “Polar methane accumulation and rainstorms on Titan from simulations of the methane cycle,” Nature 481 (5 January 2012), 58-61 (abstract).
