Centauri Dreams
Imagining and Planning Interstellar Exploration
The 3 Most Futuristic Talks at IAC 2015
Justin Atchison’s name started appearing in these pages all the way back in 2007 when, in a post called Deep Space Propulsion via Magnetic Fields, I described his work at Cornell on micro-satellites the size of a single wafer of silicon. Working with Mason Peck, Justin did his graduate work on chip-scale spacecraft dynamics, solar sails and propulsion via the Lorentz force, ideas I’ve tracked ever since. He’s now an aerospace engineer at the Johns Hopkins University Applied Physics Laboratory, where he focuses on trajectory design and orbit determination for Earth and interplanetary spacecraft. As a 2015 NIAC fellow he is researching technologies that enable asteroid gravimetry during spacecraft flybys. In the entry that follows, Justin reports on his trip to Jerusalem for this fall’s International Astronautical Congress.
by Justin A. Atchison
Greetings. I’m Justin Atchison, an aerospace engineer at the Johns Hopkins University Applied Physics Laboratory. I’m proud to have previously had my graduate research included on Centauri Dreams (1, 2, and 3). Now, I’m guest-writing an article about the three most futuristic talks I saw at the International Astronautical Congress in Jerusalem this past October. I was able to attend the conference thanks to a travel fellowship through the Future Space Leaders Foundation (FSLF). I’d strongly encourage any student or young professional (under 35) to apply for this grant next year. It’s a fantastic opportunity to attend this premier conference and interact with a variety of international leaders and thinkers in the aerospace field. FSLF also hosts the Future Space Event on Capitol Hill each summer, which offers engagement with members of the US Congress and aerospace executives on the latest and most relevant space-related subjects.
Image: Justin in the IAC-2015 exhibition hall trying on a protective harness that minimizes radiation exposure to the pelvic bones, which are particularly sensitive to radiation because they contain a high concentration of blood-producing bone marrow.
So with that note of thanks and recommendation, I give you “The 3 Most Futuristic Talks at IAC 2015.”
1. An Approach for Terraforming Titan
Abbishek G., D. Kothia, R.A. Kulkarni, S. Chopra, and U. Guven, “Space Settlement on Saturn’s Moon: Titan,” International Astronautical Congress, Jerusalem, Israel, IAC-15-D4.1.5, 2015.
University of Petroleum and Energy Studies, India
The authors of this paper explore options for terraforming Titan in the distant future. Specifically, this means liberating oxygen and increasing the surface temperature.
In addition to having water-ice, Titan is a candidate for human settlement for a few compelling reasons:
- Abundant Water-Ice – Water is obviously critical for life and is a source of oxygen.
- Solar Wind Shielding – Saturn’s magnetosphere “contains” Titan for 95% of its orbit period and is relatively stable.
- Earth-like Geology – Observations of Titan show a relatively young, Earth-like surface with rivers, wind-generated dunes, and tectonic-induced mountains.
- Native Atmosphere – Titan’s atmosphere is nitrogen rich (95%) and shows strata similar to Earth (troposphere, stratosphere, mesosphere, and thermosphere). The atmospheric pressure on the surface is only 60% higher than that on Earth.
However, Titan presents challenges for habitation, namely a lack of breathable oxygen and the presence of extremely cold surface temperatures (90K). The authors suggest that the solution to these two challenges is a nuclear fission plant that can dissociate oxygen from water and produce greenhouse gases.
Generating Oxygen – The main idea is to use beta or gamma radiation to set up a radiolysis process that converts hydrogen and oxygen atoms into usable constituents, including O2. This requires an artificial radiation source and a means of liberating the hydrogen and oxygen atoms from the native cold ice. They suggest a nuclear fission plant as the source of the radiation, and a duoplasmatron as the means of liberating and exciting the H and O atoms. The duoplasmatron would accelerate a beam of argon ions aimed at the water-ice. The collisions cause sputtering, in which the argon ions literally knock O and H atoms out of the ice. These atoms are then collected and irradiated to generate usable O2.
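For a sense of scale (a back-of-envelope estimate of my own, not a figure from the paper), the enthalpy of splitting water puts a hard floor on how much O2 a reactor of a given power could liberate, regardless of how efficient the duoplasmatron and sputtering steps turn out to be:

```python
# Rough lower bound on the energy cost of making O2 from water ice.
# Assumption (mine, not the paper's): ~286 kJ/mol to split liquid water; ice adds a few kJ/mol more.

DH_SPLIT = 285.8e3        # J per mol of H2O dissociated (2 H2O -> 2 H2 + O2)
M_O2 = 0.032              # kg per mol of O2

energy_per_kg_o2 = 2 * DH_SPLIT / M_O2          # two water molecules yield one O2
print(f"floor: ~{energy_per_kg_o2 / 1e6:.0f} MJ per kg of O2")            # ~18 MJ/kg

# Even a 1 GW reactor coupled with perfect efficiency could therefore free at most:
reactor_power = 1e9                              # W (illustrative size, not from the paper)
print(f"max O2 output: ~{reactor_power / energy_per_kg_o2:.0f} kg/s")     # ~56 kg/s
```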
Heating Up the Atmosphere – At about 9.5 AU from the Sun, Titan receives only ~1% as much solar energy as Earth. The goal for raising the temperature on Titan is to capture and retain that limited energy. The authors consider the generation of greenhouse gases as the solution. There are two options they suggest:
- If lightning is present on Titan, then the oxygen generated by the nuclear reactor can energetically react with the already-present nitrogen to produce nitrogen oxides, namely NO, NO2, N2O, and N2O2. Once these nitrogen oxides are able to raise the surface temperature by roughly 20 K, Titan’s methane lakes will begin to boil off, releasing gaseous methane as an additional greenhouse gas, and potentially raising the surface temperature to a habitable value.
- If lightning isn’t present, or if its generation of nitrogen oxides is too inefficient, one could boil the methane lakes directly using the previously mentioned nuclear reactor. In this setup, the reactor is simply increasing the amount of vapor in the already-present methane cycle (vaporization and condensation of methane). To cause the lakes to naturally vaporize, one needs to generate sufficient vapor to affect the global climate and raise the surface temperature by 20 K.
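The ~1% figure quoted above follows directly from the inverse-square law (my own quick check, using the solar constant at Earth of about 1361 W/m²):

$$
S_{\text{Titan}} \approx \frac{S_{\text{Earth}}}{(9.5)^{2}} \approx \frac{1361\ \text{W/m}^{2}}{90} \approx 15\ \text{W/m}^{2} \approx 1.1\%\ \text{of Earth's insolation}.
$$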
The authors don’t estimate the total time required for terraforming, the size of the nuclear plant required to start the process, or the maximum theoretical surface temperature achievable, but they nonetheless posit a potential path forward for planetary habitation…and that’s a meaningful contribution.
____________________
2. Eternal Memory
Guzman M., A. M. Hein, and C. Welch, “Eternal Memory: Long-Duration Storage Concepts for Space” International Astronautical Congress, Jerusalem, Israel, IAC-15-D4.1.3, 2015.
International Space University and Initiative for Interstellar Studies, France
How can present-day humanity leave a message for distant future civilizations (human or alien)? The question was first put into practice with Carl Sagan’s famous Voyager Golden Record. The authors of this review paper evaluate the requirements and near-term options available to store and interpret data for millions to billions of years in space. That’s a long enough timescale that you have to start to consider the remaining lifetime of the Sun (about 5 billion years) and the merging of the Milky Way and Andromeda galaxies (long-term dynamics may destabilize orbits in the solar system). There are a variety of current efforts, most of which are crowd-funded, including: The Helena Project, Lunar Mission Project, Time Capsule to Mars, KEO, The Rosetta Project, The Human Document Project, One Earth Message, and Moonspike.
The data storage mechanism has to survive radiation, micrometeoroids, extreme temperatures, vacuum, solar wind, and geologic processes (if landed on a planet or moon). As for where to place the data, the authors consider just about every option: Earth orbit, the Moon, the planets, planetary moons, Lagrange points, asteroids and comets, escape trajectories, and even an orbit around an M-star. The Moon appears to be a good candidate because it remains near enough to Earth for future civilizations to discover, yet distant enough to discourage casual access (make the archive too accessible and it might be destroyed by malicious or careless humans). One of the proposed implementations is to bury data at the lunar north pole, where regolith can shield it against micrometeoroid impacts.
There are a variety of near-term technologies available for this challenge, including three approaches that could likely survive the requisite millions to billions of years:
- Silica Glass Etching – “Silica is an attractive material for eternal memory concepts because it is stable against temperature, stable against chemicals, has established microfabrication methods, and has a high Young’s modulus and Knoop hardness.” In this implementation, femtosecond lasers are used to etch the glass and achieve CD-ROM like data densities. A laboratory test exposed a sample wafer to 1000°C heat for two hours with no damage.
- Tungsten Embedded in Silicon Nitride – A group in the Netherlands has developed and tested a process for patterning tungsten inside transparent, resilient silicon nitride. The resulting wafer can be read optically. The materials were selected for their high melting points, low coefficients of thermal expansion, and high fracture toughness. A sample QR code was generated and successfully tested at high temperatures, implying survivability on the order of 10⁶ years.
- Generational Bacteria DNA – This approach uses DNA as a means of storing data (see Data Storage: The DNA Option). Although this may sound extreme, consider that bacterial DNA has already survived millions of years in Earth’s rather unstable environment. It is a demonstrated high-density, resilient means of storing data. In this implementation, we would write data into the genome of a particularly hardy strain of bacteria and rely on its self-survival to protect our data for the future. The challenge is that a future civilization would need the capability to sequence the bacteria’s DNA and recognize the human-generated code. A toy encoding sketch follows this list.
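Here is the toy encoding sketch mentioned above (my own illustration, not the authors’ scheme; real DNA storage also needs error correction and sequence constraints such as avoiding long homopolymer runs). Arbitrary bytes are packed into a base sequence at two bits per nucleotide and read back out:

```python
# Toy DNA data encoding: 2 bits per base (A=00, C=01, G=10, T=11).
BASES = "ACGT"

def encode(data: bytes) -> str:
    """Map each byte to four nucleotides, most significant bits first."""
    return "".join(BASES[(byte >> shift) & 0b11]
                   for byte in data for shift in (6, 4, 2, 0))

def decode(seq: str) -> bytes:
    """Invert encode(): gather each run of four bases back into a byte."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        for base in seq[i:i + 4]:
            byte = (byte << 2) | BASES.index(base)
        out.append(byte)
    return bytes(out)

message = b"Hello, distant future"
strand = encode(message)
assert decode(strand) == message
print(len(strand), "bases encode", len(message), "bytes:", strand[:24] + "...")
```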
As humans continue to send probes to unique places in the solar system, I hope that we’ll consider and incorporate these new technologies. Who knows? In a few million years, our “cave paintings” may be hanging in some intergalactic museum.
____________________
3. Carbon Nanotubes for Space Solar Power…(And Eventually Interplanetary Travel?)
Gadhadar R., P. Narayan, and I. Divyashree, “Carbon-Nanotube Based Space Solar Power (CASSP),” 4th Space Solar Power International Student and Young Professional Design Competition, Jerusalem, Israel, 2015.
NoPo Nanotechnologies Private Limited and Dhruva Space, India.
Single-Walled Carbon Nanotubes (SWCNT) have remarkable properties:
- Incredibly high strength-to-weight ratio (300x steel) [~50,000 kN m / kg]
- High electrical conductivity (higher than copper) [10⁶–10⁷ S/m]
- High thermal conductivity (higher than diamond) [3500 W/(m K)]
- High temperature stability (up to 2800°C in a vacuum)
- Tailorable semiconductive properties (based on nanotube diameter)
- Ability to sustain high voltage densities (1-2 V/µm)
- Ability to sustain high current densities (~10⁹ A/cm²)
- High radiation resistance
These properties, specifically the strength-to-weight ratio, make them candidates for things like space elevators and momentum exchange tethers.
The authors of this paper posit a different application for SWCNT: space-based solar power. This is the concept in which an enormous array of solar cells is placed in orbit around Earth. Power is collected, and then beamed down to the surface at microwave frequencies for terrestrial use. The main idea is to collect solar power outside of Earth’s atmosphere, which attenuates something like 50-60% of the energy in the solar spectrum. The spacecraft has access to a higher flux of sunlight, which it then beams down to Earth at microwave frequencies, at which the atmosphere is transparent. A second advantage is that high-orbiting satellites have much shorter local nights (eclipses) than a site on Earth’s surface (at most about 75 minutes of darkness per day in geostationary orbit).
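That eclipse figure is easy to sanity-check (a rough estimate of my own, treating Earth’s shadow as a simple cylinder and looking only at the worst-case passes near the equinoxes):

```python
import math

# Worst-case eclipse duration for a geostationary satellite (cylindrical-shadow approximation).
R_EARTH = 6371.0          # km
R_GEO = 42164.0           # km, geostationary orbital radius
T_GEO_MIN = 23.934 * 60   # sidereal day in minutes

half_angle = math.asin(R_EARTH / R_GEO)                    # half-width of the shadow seen from orbit
eclipse_min = T_GEO_MIN * (2 * half_angle) / (2 * math.pi)
print(f"max eclipse ≈ {eclipse_min:.0f} minutes per day")  # ≈ 69 min; the penumbra nudges it past 70
```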
In this paper, the authors describe an implementation of a solar power satellite that would use semiconducting SWCNT as the solar cells. Based on the authors’ analysis, it’s feasible to mature this TRL 4 technology to achieve a peak specific power of 2 W/g at 10 cents per watt. This compares with current TRL 9 options that offer roughly 0.046 W/g at $250 per watt. The design is built entirely from different forms of SWCNT, which are used to make a transparent substrate, a semiconducting layer, and a conducting base. The three-layered assembly would have an areal density of 230 g/m², roughly a third of current technologies.
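To see what those headline numbers imply, here is a quick comparison (my arithmetic from the figures above, counting the cells only, with nothing budgeted for structure, power management, or the microwave transmitter):

```python
# Mass and cell cost of a 1 GW orbital array under the two specific-power/cost figures quoted.
P_TARGET_W = 1e9

options = {
    "SWCNT cells (projected, TRL 4)": {"w_per_g": 2.0,   "usd_per_w": 0.10},
    "Current cells (TRL 9)":          {"w_per_g": 0.046, "usd_per_w": 250.0},
}

for name, o in options.items():
    mass_tonnes = P_TARGET_W / o["w_per_g"] / 1e6       # grams -> tonnes
    cost_billion = P_TARGET_W * o["usd_per_w"] / 1e9    # dollars -> billions
    print(f"{name}: ~{mass_tonnes:,.0f} t of cells, ~${cost_billion:,.1f}B")
# SWCNT: ~500 t and ~$0.1B; current technology: ~21,739 t and ~$250.0B
```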
Additionally, the authors advocate for SWCNT based microwave transmitters, which could potentially be more efficient than traditional Klystron tubes and wouldn’t require active cooling.
As an added benefit, this type of SWCNT microwave source could potentially be used in the newly discussed (and certainly controversial) CANNAE drive. In the paper’s implementation, CANNAE propulsion would only be used for station-keeping…But it’s not hard to extrapolate and conceive of a solar powered, CANNAE-driven spacecraft for interplanetary exploration.
I have to admit, I’m a bit skeptical of the economics of space-based solar power concepts. But this paper is nonetheless exciting as it highlights the potential applications for this relatively new engineered material. I can’t wait to see how SWCNT are used in the coming decades and what new exploration technologies they’ll enable.
The Cereal Box
“No matter how these issues are ultimately resolved, Centauri Dreams opts for the notion that even the back of a cereal box may contain its share of mysteries.” I wrote that line in 2005, and if it sounds cryptic, read on to discover its origins, ably described by Christopher Phoenix. I first encountered Christopher in an online discussion group made up of physicists and science fiction writers, where his knack for taking a topic apart always impressed me. A writer whose interest in interstellar flight is lifelong, he is currently turning his love of science fiction into a novel that, he tells me “incorporates some of the ideas we talk about on Centauri Dreams as a background setting.” Today’s essay examines the ideas of a physicist who dismissed the idea of interstellar flight entirely, while using a set of assumptions Christopher has come to challenge.
by Christopher Phoenix
“All this stuff about traveling around the universe in space suits — except for local exploration which I have not discussed — belongs back where it came from, on the cereal box.”
Over fifty years ago, physicist Edward Purcell penned the boldest dismissal of interstellar flight on record in his paper “Radioastronomy and Communication Through Space”. In that paper, Purcell uses the elementary laws of mechanics to refute the possibility of starflight in total. There are many people, of course, who share his belief that we will never reach the stars.
Keeping a firm grounding in the laws of physics is absolutely necessary when researching interstellar travel. A healthy skeptical attitude can help keep researchers honest with themselves. Certainly, not everything we imagine is possible. Nor can we hope to ever reach for the stars if we do not keep our feet firmly planted in reality.
However, sometimes such extreme skepticism deserves some healthy skepticism itself. Even though Purcell’s equations aren’t wrong, he didn’t prove that starflight belongs back on the cereal box. Instead, he defines the problem of interstellar travel in such a way that it seems to be insurmountable.
Radioastronomy and Communication Through Space
Before we begin, I want to quickly introduce Purcell and this paper. Edward M. Purcell made important contributions to physics and radioastronomy. He shared the 1952 Nobel Prize in Physics for discovering nuclear magnetic resonance (NMR) in liquids and solids. Later, Purcell was the first to detect radio emissions from neutral galactic hydrogen, the famous “21cm line”. Many important developments in radioastronomy resulted from his work.
“Radioastronomy and Communication Through Space” was the first paper in the Brookhaven Lecture Series. These lectures were meant to provide a meeting ground for all the scientists at Brookhaven National Laboratory. In this paper, Purcell argued that traditional radio SETI, not interstellar travel, is our only way of learning about other planets in the galaxy.
Image: Edward Mills Purcell (1912-1997). Credit: Wikimedia Commons.
Purcell builds to his conclusion in three sections. The first section discusses then-recent discoveries in radioastronomy. Purcell tells how astronomers mapped the galaxy by observing radio emissions from neutral galactic hydrogen (the 21cm line). He notes in particular that we gathered all this information by capturing an astonishingly tiny amount of radio energy from space. Over nine years, the total amount of radio energy captured by all 21cm observatories added up to less than one erg (10⁻⁷ J).
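To put that number in perspective, a quick division (mine, not Purcell’s) turns it into an average collected power:

$$
\bar{P} \approx \frac{10^{-7}\ \text{J}}{9\ \text{yr} \times 3.15\times 10^{7}\ \text{s/yr}} \approx 4\times 10^{-16}\ \text{W}.
$$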
The paper then jumps from radioastronomy to more speculative topics. In the second section, Purcell takes on the idea of interstellar travel and runs some calculations on relativistic rockets. He concludes that interstellar flight is “preposterous”. In the final section of his paper, Purcell argues that radio messages can be sent between the stars for relatively little energy cost, while the energy required for interstellar travel is unobtainable.
I shall primarily discuss the second part of this paper, where Purcell argued against the possibility of interstellar travel.
“This is preposterous!”
From the start, Purcell considered fast interstellar travel as our only option. Purcell noted that relativity is not the obstacle to reaching another star within a single human lifetime. We cannot travel faster than light. However, if we travel at speeds close to that of light, time dilation becomes an important factor, reducing the amount of time that passes for us on our trip. You will age much less than your friends back home if you travel to the stars at relativistic speeds.
This is perfectly correct, in my view, so far as it goes. Special relativity is reliable. The trouble is not, as we say, with the kinematics but with the energetics… Personally, I believe in special relativity. If it were not reliable, some very expensive machines around here would be in deep trouble.
The problem, Purcell says, is building a rocket capable of carrying out this mission. He develops this argument by examining a particular example flight.
Let us consider a trip to a place 12 light years away, and back. Because we don’t want to take many generations to do it, let us arbitrarily say we will go and come back in 28 years earth time. We will reach a top speed of 99% speed of light in the middle, and slow down and come back. The relativistic transformations show that we will come back in 28 years, only ten years older. This I really believe… Now let us look at the problem of designing a rocket to perform this mission.
So, Purcell has defined the problem in a certain way. The starship must fly to another star and return to Earth within a human lifetime. To do so, it will reach a top speed of 99% the speed of light (C) in the middle of the voyage. The craft is a rocket, and it must carry all its propellant from the beginning of the trip. It cannot refuel anywhere. To reach 99% C within a short amount of time, the rocket must maintain an acceleration of one g for most of the trip.
Having laid out the starting assumptions for our trip, Purcell uses the relativistic rocket equation to calculate the amount of propellant the rocket will require to complete the trip. Remember that rockets are momentum machines. They throw a certain mass of propellant out the back, and the reaction force pushes the rocket. When that propellant is all gone, only the payload remains and the rocket has reached its final speed.
A rocket engine’s performance is determined by its exhaust velocity (Vex). This is the velocity at which propellant leaves the engine as measured by the rocket. The higher the Vex, the more efficiently the rocket engine uses propellant. Engineers refer to rocket efficiency as specific impulse (Isp). A rocket’s specific impulse is determined by its exhaust velocity.
If you have a rocket of a certain Vex, and you want to accelerate it to a certain maximum velocity (Vmax), physics imposes a definite relationship between the initial and final mass of the rocket. Engineers call this ratio a rocket’s mass ratio. The relationship is given by the rocket equation. Unfortunately, if our Vmax is much larger than our Vex, the mass ratio increases exponentially. This is because the rocket must accelerate not only the payload but also all the as-yet unused propellant. To go faster, you need more propellant, but then you need more propellant to carry that propellant, and so on.
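For reference, these are the standard relations Purcell is leaning on (textbook results, not spelled out in the post): the classical Tsiolkovsky rocket equation, and its relativistic counterpart for a single boost from rest to a final speed β = Vmax/c.

$$
\frac{m_i}{m_f} = \exp\!\left(\frac{V_{max}}{V_{ex}}\right),
\qquad\qquad
\frac{m_i}{m_f} = \left(\frac{1+\beta}{1-\beta}\right)^{\frac{c}{2\,V_{ex}}}.
$$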
So, our next problem is choosing an engine. We want to travel close to the speed of light, so we need an engine with the highest exhaust velocity (and thus highest Isp) possible. Chemical rockets have much too low a Vex to do this; they would require an unimaginably large amount of reaction mass to approach the speed of light. We need a far more powerful engine.
One type of engine that could perform far better than chemical rockets is the nuclear fusion rocket. So, Purcell first proposes using idealized nuclear fusion propellant. In this case, the rocket’s initial mass must be a little over a billion times its final mass to reach 99% C. A ten ton payload will require a ten billion ton rocket at the start of the journey. This is simply too much mass!
We need something far more potent. Purcell turns to idealized matter-antimatter (M/AM) propellant. Again, we assume the fuel is utilized with perfect efficiency. Matter annihilates with antimatter, and the resulting energy is exhausted as massless electromagnetic radiation (gamma rays), giving us a Vex of C. We can’t beat that.
Image: VARIES (Vacuum to Antimatter Rocket Interstellar Explorer System) is a concept developed by Richard Obousy that would create its own antimatter enroute through the use of powerful lasers. Credit: Adrian Mann.
The situation is vastly improved by M/AM propellant. To reach 99% C, the rocket’s initial mass must be only 14 times its final mass. But we must also slow down at the destination, and slowing down requires just as much effort as accelerating in the first place. After that, we must turn the ship around and return to Earth.
So, during the course of our flight, the rocket shall undergo four accelerations. On the trip away from Earth, the rocket will accelerate to 99% C, and then decelerate back down to rest at the destination star. After turning around, it will accelerate back to 99% C on the trip home and then decelerate back down to rest at Earth. To do this, the rocket must start with an initial mass 40,000 times its final mass. To send a ten ton payload on this round trip will require a 400,000 ton rocket, consisting half of matter and half of antimatter.
The starship must accelerate at one g for most of the trip. At the outset of its journey, this rocket must radiate 10¹⁸ watts to accelerate its 400,000 tons of mass at one g. This is a little over the total power that the Earth receives from the sun. Except that this energy is emitted as gamma rays, which presents a shielding problem for any planet near the ship. In addition, once the rocket achieves relativistic velocities, cosmic dust and gas present a shielding problem for the ship itself. At these speeds, even tiny specks of matter will behave like pinpoint nuclear explosions, and individual protons will be transformed into deadly cosmic rays.
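Purcell’s figures drop straight out of the relativistic rocket equation above. Here is a back-of-envelope check (my own sketch; the exhaust velocities are idealizations, roughly c/8 for perfect fusion and exactly c for a photon/antimatter exhaust):

```python
# Reproducing Purcell's mass ratios and power figure for a 99%-of-c round trip.
BETA = 0.99                        # cruise speed as a fraction of c

def mass_ratio(beta, vex_over_c):
    """Relativistic rocket equation: one boost from rest to beta."""
    return ((1 + beta) / (1 - beta)) ** (1 / (2 * vex_over_c))

print(f"ideal fusion (vex ~ c/8), one boost:  {mass_ratio(BETA, 1/8):.1e}")    # ~1.6e9, 'over a billion'
print(f"photon rocket (vex = c), one boost:   {mass_ratio(BETA, 1.0):.1f}")    # ~14
print(f"photon rocket, four boosts:           {mass_ratio(BETA, 1.0)**4:.0f}") # ~39,600, i.e. ~40,000

# Radiated power needed to hold 400,000 tonnes at one g with a photon exhaust (P = F * c):
m_kg, g, c = 4e8, 9.81, 3.0e8
print(f"photon power at one g:                {m_kg * g * c:.1e} W")           # ~1.2e18 W
```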
Purcell concludes that these calculations prove that interstellar flight is “preposterous”, in this solar system or any other.
Rigging the game
There isn’t anything wrong with Purcell’s calculations. The problem is that Purcell wants to take this one set of calculations and prove that any form of interstellar travel is impossible. This isn’t very fair, since the starting conditions he picked in his example lead to his pessimistic conclusions. Let’s examine these assumptions.
Purcell’s first assumption is that we must travel at 99% C. Why must we travel so fast? Even to complete a trip to a nearby star within a human lifetime, you can travel slower than that. Purcell is committed to these extreme relativistic speeds in order to take advantage of time dilation and complete the round trip in a decade of shipboard time.
If we are willing to travel much slower, perhaps 10% C, or even 1% C, and let multiple generations of crew make the trip, the difficulties are greatly reduced. At slower speeds, propulsion requirements are far more reasonable, and deadly collisions with cosmic dust would be easier to defend against.
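For a sense of scale (my own arithmetic, taking the nearest stellar system, Alpha Centauri, at roughly 4.4 light years):

$$
t \approx \frac{d}{v}:\qquad \frac{4.4\ \text{ly}}{0.1\,c} \approx 44\ \text{yr},\qquad \frac{4.4\ \text{ly}}{0.01\,c} \approx 440\ \text{yr}.
$$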
Of course, there are many very difficult challenges to solve before we can launch such a ship. The travelers must recycle all their air and water, grow their own food, and build a stable society able to last for centuries. Some form of artificial gravity must be provided to prevent muscle and bone loss in microgravity. The habitable sections of the ship must be shielded from cosmic rays. But none of these represent hard physical limits arising from the laws of mechanics and nothing else.
This is all assuming humans are making the trip. Slow travel is made even easier if humans do not make the trip, just as we have done with our current robotic exploration of the solar system.
The second assumption is that the starship must return to Earth. Particularly if we must carry all the propellant we use from the outset, a round-trip mission is far more difficult than a one-way trip. But why must the starship return to Earth? There are many interesting missions that do not require the spacecraft to return to Earth. A colonizing expedition does not need, or even want, to return. Neither does a robotic probe. A fly-by probe like Daedalus doesn’t even need to carry propellant to slow down at the destination.
Purcell’s third questionable assumption is an interstellar vehicle must carry all its energy and reaction mass on board from the start of the trip. Is this really true? Think about in-situ resource utilization. An interstellar expedition could mine propellant from planetoids encountered at the destination. We can use propulsion systems that use the resources present in space, like gravitational assists, solar sails, or even interstellar ramjets. Granted, gravitational assists and solar sails could not get you anywhere near relativistic speeds, but they could work for slower travel.
Image: A Bussard ramjet in flight, as imagined for ESA’s Innovative Technologies from Science Fiction project. Credit: ESA/Manchu.
If the natural resources of space are not sufficient, there are other options. Rockets carry all their energy and reaction mass from the start. Beam-rider propulsion systems are an alternative that leave heavy engines, energy sources, and propellant back home. One such craft is a photon sail pushed by a laser. Another is a spacecraft propelled by a stream of relativistic pellets, each transferring momentum to the craft. As a cursory read of Mallove and Matloff’s excellent book The Starflight Handbook shows, we are not limited to rockets only.
Ultimately, Purcell’s conclusion that all speculation about interstellar travel belongs back “on the cereal box” simply doesn’t hold air in the vacuum of space.
SETI vs interstellar travel?
Purcell’s paper underscores an unfortunate split in the ranks of scientists. Many scientists interested in SETI maintain that interstellar flight is simply not feasible for any civilization. They argue that we don’t need to physically travel to other planetary systems in order to learn about the rest of the universe. We need only turn our radio telescopes to the sky and search for broadcasts from more advanced civilizations. If we find them, these advanced civilizations will hopefully tell us everything we want to know. We might even find that mature civilizations in space have formed a galactic community of communicating societies. Perhaps they might allow us to join the conversation once we demonstrate enough maturity to engage in interstellar radio communications. This is an exciting possibility, if a bit idealistic, and SETI deserves our support.
However, it is important to realize it is not an either-or question. We can research interstellar travel and carry out SETI searches at the same time. Even if SETI searches find communicative aliens to talk to, that will not negate the usefulness of interstellar travel. We will still need interstellar flight to investigate the countless solar systems where such civilizations are not present, and starflight is absolutely necessary for interstellar migration. But it seems like some SETI supporters don’t see it that way.
Denying starflight has become a fundamental tenet of the SETI worldview. It speaks directly to the question of whether it might be dangerous to contact alien civilizations. Many SETI supporters claim that we don’t have to worry about this question. If we assume interstellar travel is impossible, no civilization in space can physically threaten another. As Purcell claims in his paper:
It [communicating with ETI] is a conversation which is, in the deepest sense, utterly benign. No one can threaten anyone else with objects. We have seen what it takes to send objects around, but one can send information for practically nothing. Here one has the ultimate in philosophical discourse – all you can do is exchange ideas, but you do that to your heart’s content.
In my opinion, this is the real reason why Purcell argues so vehemently against the possibility of interstellar flight. In order for communication with ETI to be completely safe, interstellar travel must be impossible for any civilization anywhere in the universe. Contact with ETI becomes more complicated if there is a possibility of encountering them or their technology physically. Of course, we can’t be entirely sure messages from ETI will be entirely harmless either, if they contain instructions or information that might pose a danger.
I suspect that Purcell’s pessimistic arguments against starflight were driven more by his desire to believe that discourse with aliens comes without risks than by a genuine interest in the future of space travel. Whatever the disposition of aliens, we can’t allow our personal hopes and dislikes to bias our conclusions. While interstellar travel is very difficult, we can already conceive of ways that a sufficiently motivated civilization could reach the stars.
Directly Imaging a Young ‘Jupiter’
Centauri Dreams continues to follow the fortunes of the Gemini Planet Imager with great interest, and I thank Horatio Trobinson for a recent note reminding me of the latest news from researchers at the Gemini South installation in Chile. The project organized as the Gemini Planet Imager Exoplanet Survey is a three-year effort designed not for radial velocity or transit studies but for direct imaging of young Jupiters and debris disks around nearby stars. Operating at near-infrared wavelengths, the GPI itself uses adaptive optics, a coronagraph, a calibration interferometer and an integral field spectrograph in its high-contrast imaging work.
Launched in late 2014, the GPIES survey has studied 160 targets out of a projected 600 in a series of observing runs, all the while battling unexpectedly bad weather in Chile. Despite all this, project leader Bruce Macintosh (Stanford University), the man behind the construction of GPI, has been able to announce the discovery of the young ‘Jupiter’ 51 Eridani b, working with researchers from almost forty institutions in North and South America. The discovery was confirmed by follow-up work with the W.M. Keck Observatory on Mauna Kea (Hawaii).
Image: Discovery image of 51 Eri b with the Gemini Planet Imager taken in the near-infrared light on December 18, 2014. The bright central star has been mostly removed by a hardware and software mask to enable the detection of the exoplanet one million times fainter. Credits: J. Rameau (UdeM) and C. Marois (NRC Herzberg).
This is a world with about twice the mass of Jupiter, and this news release from the Gemini Observatory is characterizing it as “the most Solar System-like planet ever directly imaged around another star.” The reasons are obvious: 51 Eridani b orbits at about 13 AU, putting it a bit past Saturn in our own Solar System. And although 51 Eridani b is some 100 light years away, Macintosh and colleagues have found a strong spectroscopic signature of methane.
“Many of the exoplanets astronomers have imaged before have atmospheres that look like very cool stars” says Macintosh. “This one looks like a planet.”
Indeed, and we have further evidence that this is a planet rather than a brown dwarf in chance alignment with the star in the form of a recent paper that analyzes the motion of 51 Eridani b and finds it consistent with a forty-year orbit. Moreover, we’re going to be learning a great deal more about this interesting object in years to come, as the paper explains:
Continued astrometric monitoring of 51 Eri b over the next few years should be sufficient to detect curvature in the orbit, further constraining the semimajor axis and inclination of the orbit, and placing the first constraints on the eccentricity. Absolute astrometric measurements of 51 Eri with GAIA (e.g., Perryman et al. 2014), in conjunction with monitoring of the relative astrometry of 51 Eri b, will enable a direct measurement of the mass of the planet. Combined with the well-constrained age of 51 Eri b, such a determination would provide insight into the evolutionary history of low-mass directly imaged extrasolar planets, and help distinguish between a hot-start or core accretion formation process for this planet.
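As an aside, the forty-year orbit mentioned above is close to what Kepler’s third law predicts (a rough check of my own, assuming a host-star mass of about 1.75 solar masses for the F-type star 51 Eridani and the ~13 AU separation):

$$
P = \sqrt{\frac{a^{3}}{M_{*}}} \approx \sqrt{\frac{13^{3}}{1.75}} \approx 35\ \text{yr}
$$

with P in years, a in AU, and M* in solar masses, consistent with the quoted value given the uncertainty in the semimajor axis.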
Image: The Gemini Planet Imager utilizes an integral field spectrograph, an instrument capable of taking images at multiple wavelengths – or colors – of infrared light simultaneously, in order to search for young self-luminous planets around nearby stars. The left side of the animation shows the GPI images of the nearby star 51 Eridani in order of increasing wavelength from 1.5 to 1.8 microns. The images have been processed to suppress the light from 51 Eridani, revealing the exoplanet 51 Eridani b (indicated) which is approximately a million times fainter than the parent star. The bright regions to the left and right of the masked star are artifacts from the image processing algorithm, and can be distinguished from real astrophysical signals based on their brightness and position as a function of wavelength. The spectrum of 51 Eridani b, on the right side of the animation, shows how the brightness of the planet varies as a function of wavelength. If the atmosphere was entirely transmissive, the brightness would be approximately constant as a function of wavelength. This is not the case for 51 Eridani b, the atmosphere of which contains both water (H2O) and methane (CH4). Over the spectral range of this GPI dataset, water absorbs photons between 1.5 and 1.6 microns, and methane absorbs between 1.6 and 1.8 microns. This leads to a strong peak in the brightness of the exoplanet at 1.6 microns, the wavelength at which absorption by both water and methane is weakest. Credit: Robert De Rosa (UC Berkeley), Christian Marois (NRC Herzberg, University of Victoria).
Christian Marois (National Research Council of Canada) discusses the nature of the find:
“GPI is capable of dissecting the light of exoplanets in unprecedented detail so we can now characterize other worlds like never before. The planet is so faint and located so close to its star, that it is also the first directly imaged exoplanet to be fully consistent with Solar System-like planet formation models.”
As you would expect, 51 Eridani b is a young planet, young enough that the heat of its formation gives us a solid infrared signature, allowing its direct detection. In addition to being in an orbit that reminds us of the Solar System, the young world is probably the lowest-mass planet yet imaged, just as its atmospheric methane signature is the strongest yet detected. Given that the Gemini Planet Imager Exoplanet Survey is only a fraction of the way through its observing list, we can expect to find more planets in the target area within 300 light years of the Solar System.
The paper is Macintosh et al., “Discovery and spectroscopy of the young jovian planet 51 Eri b with the Gemini Planet Imager,” Science Vol. 350, No. 6256 (2 October 2015), pp. 64-67 (abstract). The follow-up paper is DeRosa et al., “Astrometric Confirmation and Preliminary Orbital Parameters of the Young Exoplanet 51 Eridani b with the Gemini Planet Imager,” accepted at The Astrophysical Journal Letters (preprint).
A Kepler-438b Caveat (and a Digression)
Before we go interstellar, a digression with reference to yesterday’s post, which looked at how we manipulate image data to draw out the maximum amount of information. I had mentioned the image widely regarded as the first photograph, Joseph Nicéphore Niépce’s ‘View from the Window at Le Gras.’ Centauri Dreams regular William Alschuler pointed out that this image is in fact a classic example of what I’m talking about. For without serious manipulation, it’s impossible to make out what you’re seeing. Have a look at the original and compare it to the image in yesterday’s post, which has been processed to reveal the underlying scene.
Image: The official reproduction of the first photograph, made in 2003 without any manual retouching. Joseph Nicéphore Niépce’s View from the Window at Le Gras, c. 1826. Gernsheim Collection, Harry Ransom Center / University of Texas at Austin. Photo by the J. Paul Getty Museum.
And here again is the processed image, a much richer experience.
The University of Texas offers this explanation of how the image was made:
“Niépce thought to capture this image using a light-sensitive material so that the light itself would “etch” the picture for him. In 1826, through a process of trial and error, he finally came upon the combination of bitumen of Judea (a form of asphalt) spread over a pewter plate. When he let this petroleum-based substance sit in a camera obscura for eight hours without interruption, the light gradually hardened the bitumen where it hit, thus creating a rudimentary photo. He “developed” this picture by washing away the unhardened bitumen with lavender water, revealing an image of the rooftops and trees visible from his studio window. Niépce had successfully made the world’s first photograph.”
As with many astronomical photographs, what the unassisted human eye would see is often the least interesting aspect of the story. While we always want to know what a person looking out a window would see, we learn a great deal more by subjecting images to a variety of filters.
Meanwhile, in the Rest of the Galaxy…
Habitable zone planets are a primary attraction of the exoplanet hunt, but so often a tight analysis shows that what we know of a world isn’t enough to confirm its habitable status. Kepler-438b is a case in point, a likely rocky world orbiting a red dwarf some 470 light years away in the constellation Lyra. The planet orbits the primary every 35.2 days, but writing in these pages last January, Andrew LePage estimated there was only a one in four chance that Kepler-438b is in the habitable zone, declaring it more likely to be a cooler version of Venus.
Now we have more evidence that a planet some in the media have called ‘Earth-like’ is in fact a wasteland, its chances of life devastated by hard radiation from the host star. Kepler-438 produces huge flares every few hundred days, each of them approximately ten times more powerful than anything we’ve ever recorded on the Sun. These ‘superflares’ carry energies of around 10³³ erg, although energies as high as 10³⁶ erg have been observed.
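For comparison (my own numbers, not from the release), the most energetic flares ever recorded on the Sun are on the order of 10³² erg, which is where the ‘ten times more powerful’ figure comes from:

$$
\frac{10^{33}\ \text{erg}}{\sim 10^{32}\ \text{erg}\ (\text{largest solar flares})} \approx 10,
\qquad
10^{33}\ \text{erg} = 10^{26}\ \text{J}.
$$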
But the flares are part of a larger problem for Kepler-438b. They are associated with coronal mass ejections (CMEs), a phenomenon likely to have stripped away the planet’s atmosphere entirely. In work to be published in Monthly Notices of the Royal Astronomical Society, David Armstrong (University of Warwick, UK) and colleagues analyze conditions around the red dwarf. Armstrong explains in a University of Warwick news release:
“If the planet, Kepler-438b, has a magnetic field like the Earth, it may be shielded from some of the effects. However, if it does not, or the flares are strong enough, it could have lost its atmosphere, be irradiated by extra dangerous radiation and be a much harsher place for life to exist.”
Image: The planet Kepler-438b is shown here in front of its violent parent star. It is regularly irradiated by huge flares of radiation, which could render the planet uninhabitable. Here the planet’s atmosphere is shown being stripped away. Credit: Mark A Garlick / University of Warwick.
The relationship of flares and CMEs is complicated, as are the effects of a magnetic field. From the paper:
It is possible that CMEs occur on other stars that produce very energetic flares, which could have serious consequences for any close-in exoplanets without a magnetic field to deflect the influx of energetic charged particles. Since the habitable zone for M dwarfs is relatively close in to the star, any exoplanets could be expected to be partially or completely tidally locked. This would limit the intrinsic magnetic moments of the planet, meaning that any magnetosphere would likely be small. Khodachenko et al. (2007) found that for an M dwarf, the stellar wind combined with CMEs could push the magnetosphere of an Earth-like exoplanet in the habitable zone within its atmosphere, resulting in erosion of the atmosphere. Following on from this, Lammer et al. (2007) concluded that habitable exoplanets orbiting active M dwarfs would need to be larger and more massive than Earth, so that the planet could generate a stronger magnetic field and the increased gravitational pull would help prevent atmospheric loss.
A coronal mass ejection occurs when huge amounts of plasma are blown outward from the star, and the extensive flare activity on Kepler-438 makes CMEs that much more likely. With the atmosphere greatly compromised or stripped away entirely, the flares can do their work, bathing the surface in ultraviolet and X-ray radiation and a sleet of hard particles. For a time, Kepler-438b looked intriguing from an astrobiological standpoint, especially with its small radius of just 1.1 times Earth’s, but it takes a decidedly optimistic assessment of the habitable zone to include it in the first place, and it now appears that the chances for life here are remote.
The paper is Armstrong et al., “The Host Stars of Kepler’s Habitable Exoplanets: Superflares, Rotation and Activity,” accepted at MNRAS and available as a preprint.
Pluto and How We See It
As happened after yesterday’s post, I occasionally get requests for pictures of objects in natural color, as opposed to significantly enhanced images (at various wavelengths) designed to tease out structure or detail. Here are Pluto and Charon as seen by New Horizons’ LORRI (Long Range Reconnaissance Imager), with color data supplied by the Ralph instrument. The images in this composite are from July 13 and 14 and according to JHU/APL, “…portray Pluto and Charon as an observer riding on the spacecraft would see them.”
For those interested, Jenna Garrett wrote a fine piece for WiReD last summer called What We’re Really Looking at When We Look at Pluto that goes into the instrumentation aboard New Horizons and discusses the philosophical issues separating what we see from what is really there. Let me quote briefly from this:
It’s not hard for a photographer to understand why you’d question actually seeing Pluto—the same question has nagged photographers since Nicéphore Niépce made View from the Window at Le Gras in 1826. A camera is a simple machine: A lens and a shutter that allows the passage of light, which hits the chemical emulsion of film or the pixels of a digital sensor. That intervening technology takes photons bouncing off an object and interpolates them into data. More technology turns that data into an image. And still more technology disseminates that image so you might see it.
I don’t want to get too far into philosophy, but just for fun, here’s the Niépce image.
Image: The first recorded photograph, taken from a window in his study by Joseph Nicéphore Niépce. For more, visit this University of Texas site.
It’s always good to ask how images are processed, especially when dealing with data being returned from the edge of the Solar system through a number of instruments. New Horizons carries three imagers: The aforementioned LORRI and Ralph, along with Alice, an ultraviolet imaging spectrometer. Ralph has ten times the resolution of the human eye, but we use data as needed from the instrument packages to ferret out what scientists are looking for.
I also like this quote by Jon Lomberg, a deeply felt ratification of New Horizons from Garrett’s piece:
“You don’t really have to understand a lot about astronomy to know how difficult this is,” says Lomberg. “Getting it there, having it work for nine years and having it do exactly what they’re telling it to do. You just want to applaud. It makes everybody think Goddamn! that was a good thing to do!”
I’m sure that most of the Centauri Dreams audience is with Jon on that sentiment. But let’s move on to further news about Pluto from the Division for Planetary Sciences meeting in Maryland. In analyzing the New Horizons data, we learn that Pluto’s upper atmosphere is a good deal more compact and significantly colder than we had thought based upon earlier models. Pluto’s atmosphere seems to escape more or less the same way that atmospheric gases do on Earth or Mars, rather than acting as we would expect from a cometary body. Here is the already famous image that showed us ‘blue skies’ (of a sort) on Pluto.
Image: Pluto’s haze layer shows its blue color in this picture taken by the New Horizons Ralph/Multispectral Visible Imaging Camera (MVIC). The high-altitude haze is thought to be similar in nature to that seen at Saturn’s moon Titan. The source of both hazes likely involves sunlight-initiated chemical reactions of nitrogen and methane, leading to relatively small, soot-like particles (called tholins) that grow as they settle toward the surface. This image was generated by software that combines information from blue, red and near-infrared images to replicate the color a human eye would perceive as closely as possible. Credits: NASA/JHUAPL/SwRI.
Here again we’re aiming at something close to what the human eye would perceive.
We also learned at DPS that Pluto’s moons are unlike anything we’re familiar with in the rest of the Solar System. Most inner moons in the Solar System (including ours) move in synchronous rotation, with one face always toward the planet. But the small moons of Pluto do nothing of the kind. Hydra rotates 89 times during a single circuit of Pluto, while the rest of the small moons rotate faster than we would expect as well. Have a look at the chart that Mark Showalter (SETI Institute) prepared for presentation at DPS.
Image: Spin periods for the range of Pluto’s moons. Credit: NASA/Johns Hopkins University Applied Physics Laboratory/Southwest Research Institute.
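As a quick cross-check of the Hydra figure (my own arithmetic, taking Hydra’s orbital period around the Pluto-Charon barycenter as roughly 38 days):

$$
T_{\text{spin}} \approx \frac{38\ \text{days}}{89} \approx 0.43\ \text{day} \approx 10\ \text{hours}.
$$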
We may well be looking at chaotic spin rates — variable over time — and likely the result of the torque Charon exerts, which would keep the moons from settling into synchronicity. Showalter characterized these wobbling moons as ‘spinning tops,’ and it also appears that several of them may have been formed by merger, with two or more former moons coming together following the event that created Charon. That would make sense — surely there were a large number of objects after a massive impact, with the present system having consolidated from these. Here’s the slick video illustrating the motion of these moons that NASA has produced.
I love Showalter’s take on all this in a SETI Institute news release: “There’s clearly something fundamental about the dynamics of the system that we do not understand. We expected chaos, but this is pandemonium.”
Pluto’s Unexpected Complexities
Keeping up with a site like this can be a daunting task, especially when intriguing papers can pop up at any time and announcements of new finds by our spacecraft come in clusters. But site maintenance itself can be tricky. Recently Centauri Dreams regular Tom Mazanec wrote in with a project to be added to the links on the home page and before long, with my encouragement, he had sent a number of solid suggestions on exoplanet projects both Earth- and space-based, most of which have now been added. My thanks to Tom and all those who have at various times caught a broken link or added a suggestion for new links or stories.
We begin the week looking at work discussed at the Division for Planetary Sciences meeting in Maryland, starting with the continuing bounty coming in from New Horizons. I always like to quote Alan Stern, because as principal investigator for New Horizons, he is not only its chief spokesman but the guiding force that saw this mission become a reality. And I think he’s absolutely on target when he points to how rich a discovery Pluto is turning out to be:
“It’s hard to imagine how rapidly our view of Pluto and its moons are evolving as new data stream in each week,” says Stern. “As the discoveries pour in from those data, Pluto is becoming a star of the solar system. Moreover, I’d wager that for most planetary scientists, any one or two of our latest major findings on one world would be considered astounding. To have them all is simply incredible.”
Image: Pluto and Charon are revealing themselves as worlds of profound complexity. Credit: NASA/Johns Hopkins University Applied Physics Laboratory/Southwest Research Institute.
All this with a flyby, leading me to wonder what we might find with a Pluto orbiter.
Think about Voyager. It opened our eyes to new worlds for the first time. It flew by Io and gave us active volcanoes. It flew by Triton and we saw weird ‘cantaloupe terrain’ and nitrogen geysers. All these stay fixed in my mind as I remember first learning about them. But what New Horizons is showing us ranges from bizarre moons to possible ice volcanoes, the huge satellite Charon in a system that is practically a binary ‘planet,’ and surface features that tell us about an active world that was once thought to be inert. Who thought Pluto/Charon would be this complex!
Wright Mons and Piccard Mons, as it turns out, each appear to have a hole at their summit, the signature of a volcano, but one expected to cough up water ice, nitrogen, ammonia or methane in a melted slurry rather than lava. We can’t push this too far, because on a world about which we have so much to learn, we may be in for yet another surprise. And Oliver White, a postdoctoral researcher at NASA Ames, points out another of the unknowns:
“If they are volcanic, then the summit depression would likely have formed via collapse as material is erupted from underneath. The strange hummocky texture of the mountain flanks may represent volcanic flows of some sort that have travelled down from the summit region and onto the plains beyond, but why they are hummocky, and what they are made of, we don’t yet know.”
Image: Scientists using New Horizons images of Pluto’s surface to make 3-D topographic maps have discovered that two of Pluto’s mountains, informally named Wright Mons and Piccard Mons, could possibly be ice volcanoes. The color is shown to depict changes in elevation, with blue indicating lower terrain and brown showing higher elevation; green terrains are at intermediate heights. Credit: NASA/Johns Hopkins University Applied Physics Laboratory/Southwest Research Institute.
As this JHU/APL news release makes clear, Pluto’s surface is also showing us far more textures than we might have expected. Why do we see so few small craters? Neither Pluto nor Charon give us many of these, casting doubt on the older model of Kuiper Belt objects formed by the accumulation of small objects. Now you can see why 2014 MU69 is beginning to loom so large. This KBO may be a pristine primordial planetesimal, the first ever to be explored. Assuming the New Horizons mission is extended, a flyby of 2014 MU69 will give us another look at a class of objects that may have been formed quickly and at close to their current size.
But we still have a lot of explaining to do re Pluto’s surface itself. Trying to determine the age of a surface is often a matter of counting the crater impacts to see what has accumulated over time (think of the relatively smooth surface of Europa, which indicates continuing resurfacing that obscures impacts). On Pluto, we do find surfaces that point to the earliest era of the Solar System four billion years ago, but we also see things like Sputnik Planum, whose smooth and impact-free terrain looks to have been formed within the past ten million years, an eyeblink in astronomical time.
Image: A slide from Oliver White’s presentation at DPS, showing crater densities on Pluto’s surface. Credit: NASA/Johns Hopkins University Applied Physics Laboratory/Southwest Research Institute.
Other terrains on Pluto look to be somewhere in between, with evidence of cratering extending back not nearly as far as the oldest areas. So is Sputnik Planum, which is on the left of Pluto’s heart-shaped feature, an anomaly, or a marker for a surface that has been geologically active for much of its history? We’re looking at evidence for how objects in the outer Solar System formed, again a splendid reason to back the extension of New Horizons to 2014 MU69.
More on Pluto/Charon and the findings discussed at the DPS meeting tomorrow.