Centauri Dreams

Imagining and Planning Interstellar Exploration

Interstellar Probe: Prospects for ESA Technologies

The Interstellar Probe concept being developed at Johns Hopkins Applied Physics Laboratory is not alone in the panoply of interstellar studies. We’ve examined the JHU/APL effort in a series of articles, the most recent being NASA Interstellar Probe: Overview and Prospects. But we should keep in mind that a number of white papers have been submitted to the European Space Agency in response to the effort known as Cosmic Vision and Voyage 2050. One of these, called STELLA, has been put forward to highlight a potential European contribution to the NASA probe beyond the heliosphere.

Image: A broad theme of overlapping waves of discovery informs ESA’s Cosmic Vision and Voyage 2050 report, here symbolized by icy moons of a gas giant, a temperate exoplanet and the interstellar medium itself, with all it can teach us about galactic evolution. Among the projects discussed in the report is NASA’s Interstellar Probe concept. Credit: ESA.

Remember that Interstellar Probe (which needs a catchier name) focuses on reaching the interstellar medium beyond the heliosphere and studying the interactions there between the ‘bubble’ that surrounds the Solar System and interstellar space beyond. The core concept is to launch a probe explicitly designed (in ways that the two Voyagers currently out there most certainly were not) to study this region. The goal will be to travel faster than the Voyagers with a complex science payload, reaching and returning data from as far away as 1000 AU in a working lifetime of 50 years.

But note that phrase ‘as far away as 1000 AU’ and realize that it’s a highly optimistic stretch goal. A recent paper by McNutt et al., examined in the Centauri Dreams post linked above, explains the target by saying “To travel as far and as fast as possible with available technology…” and thus to reach the interstellar medium as fast as possible and travel as far into it as possible, with scientific data return lasting 50 years. From another paper, Brandt et al. (citation below), comes this set of requirements:

  • The study shall consider technology that could be ready for launch on 1 January 2030.
  • The design life of the mission shall be no less than 50 years.
  • The spacecraft shall be able to operate and communicate at 1000 AU.
  • The spacecraft power shall be no less than 300 W at end of nominal mission.

This would be humanity’s first mission fundamentally designed to reach beyond the Solar System, and it is drawing attention across the space community. How space agencies work together could form a major study in itself. For today, I’ll just mention a few examples: ESA’s Faint Object Camera (FOC) was aboard Hubble at launch, and the agency built the solar panels needed to power the instrument. The recent successes of the James Webb Space Telescope remind us that it launched with NIRSpec, the Near-InfraRed Spectrograph, and the Mid-InfraRed Instrument (MIRI), both contributed by ESA. And let’s not forget that JWST wouldn’t be up there without the latest version of the superb Ariane 5 launcher, Ariane 5 ECA. Nor should we neglect the cooperative arrangements in terms of management and technical implementation that have long kept the NASA connection with ESA on a productive track.

Image: This is Figure 1 from Brandt et al., a paper cited below out of JHU/APL that describes the Interstellar Probe mission from within. Caption: Fig. 1. Interstellar Probe on a fast trajectory to the Very Local Interstellar Medium would represent a snapshot to understand the current state of our habitable astrosphere in the VLISM, to ultimately be able to understand where our home came from and where it is going.

So it’s no surprise that a mission like Interstellar Probe would draw interest. Earlier ESA studies on a heliopause probe go back to 2007, and the study overview of that one can be found here. Outside potential NASA/ESA cooperation, I should also note that China is likewise studying a probe, intrigued by the prospect of reaching 100 AU by the 100th anniversary of the current government in 2049. So the idea of dedicated missions outside the Solar System is gaining serious traction.

But back to the Cosmic Vision and Voyage 2050 report, from which I extract this:

The great challenge for a mission to the interstellar medium is the requirement to reach 200 AU as fast as possible and ideally within 25-30 years. The necessary power source for this challenging mission requires ESA to cooperate with other agencies. An Interstellar Probe concept is under preparation to be proposed to the next US Solar and Space Physics Decadal Survey for consideration. If this concept is selected, a contribution from ESA bringing the European expertise in both remote and in situ observation is of significance for the international space plasma community, as exemplified by the successful joint ESA-NASA missions in solar and heliospheric physics: SOHO, Ulysses and Solar Orbiter.

I’m looking at the latest European white paper on the matter, whose title points to what could happen assuming the JHU/APL Interstellar Probe concept is selected in the coming Heliophysics Decadal Survey (as we know, this is a big assumption, but we’ll see). The paper, “STELLA—Potential European contributions to a NASA-led interstellar probe,” appeared recently in Frontiers in Astronomy and Space Sciences (citation below), highlighting possible European contributions to the JHU/APL Interstellar Probe mission, and offering a quick overview of its technology, payload and objectives.

As mentioned, the only missions to have probed this region from within are the Voyagers, although the boundary has also been probed remotely in energetic neutral atoms by the Interstellar Boundary Explorer (IBEX) as well as the Cassini mission to Saturn. We’d like to go beyond the heliosphere with a dedicated mission not just because it’s a step toward much longer-range missions but also because the heliosphere itself is a matter of considerable controversy. Exactly what is its shape, and how does that shape vary with time? Sometimes it seems that our growing catalog of data has only served to raise more questions, as is often the case when pushing into territories previously unexplored. The white paper puts it this way:

The many and diverse in situ and remote-sensing observations obtained to date clearly emphasize the need for a new generation of more comprehensive measurements that are required to understand the global nature of our Sun’s interaction with the local galactic environment. Science requirements informed by the now available observations drive the measurement requirements of an ISP’s in situ and remote-sensing capabilities that would allow [us] to answer the open questions…

We need, in other words, to penetrate and move beyond the heliosphere to look back at it, producing the overview needed to study these interactions properly. But let’s pause on that term ‘interstellar probe.’ Exactly how do we characterize space beyond the heliosphere? Both our Voyager probes are now considered to be in interstellar space, but we should consider the more precise term Very Local Interstellar Medium (VLISM), and realize that where the Voyagers are is not truly interstellar, but a region highly influenced by the Sun and the heliosphere. The authors are clear that even VLISM doesn’t apply here, for to reach what they call the ‘pristine VLISM’ demands capabilities beyond even the interstellar probe concept being considered at JHU.

Jargon is tricky in any discipline, but in this case it helps to remember that we move outward in successive waves that are defined by our technological capabilities. If we can get to several hundred AU, we are still in a zone roiled by solar activity, but far enough out to draw meaningful conclusions about the heliosphere’s relationship to the solar wind and the effects of its termination out on the edge. In these terms, we should probably consider JHU/APL’s Interstellar Probe as a mission toward the true VLISM. Will it still be returning data when it gets there? A good question.

IP is also a mission with interesting science to perform along the way. A spacecraft on such a trajectory has the potential for flybys of outer system objects like dwarf planets (about 130 are known) and the myriad KBOs that populate the Kuiper Belt. Dust observations at increasing distances would help to define the circumsolar dust disk on which the evolution of the Solar System has depended, and relate this to what we see around other stars. We’ll also study extragalactic background light that should provide information about how stars and galaxies have evolved since the Big Bang.

Image: A visualization of Interstellar Probe leaving the Solar System. Credit: European Geosciences Union, Munich.

The white paper lays out the range of outstanding science questions that come into play, so I’ll send you to it for the details, and ultimately to the two latest analytical descriptions out of JHU/APL listed in the citations below. Developing instruments to meet these science goals would involve study by a NASA/ESA science definition team, and of course depends on whether the Interstellar Probe concept makes it through the Decadal selection. It’s interesting to see, though, that among the possible contributions this white paper suggests from ESA is one involving a core communications capability:

One of the key European industrial and programmatic contributions proposed in the STELLA proposal to ESA is an upgrade of the European deep space communication facility that would allow the precise range and range-rate measurements of the probe to address STELLA science goal Q5 [see below] but would also provide additional downlink of ISP data and thus increase the ISP science return. The facility would be a critical augmentation of the European Deep Space Antennas (DSA) not only for ISP but also for other planned missions, e.g., to the icy giants.

Q5, as referenced above, refers to testing General Relativity at various spatial scales all the way up to 350 AU, and the authors note that less than a decade after launch, such a probe would need a receiving station with the equivalent of four 35-meter dishes, an architecture that would be developed during the early phases of the mission. On the spacecraft itself, the authors see the potential for providing the high gain antenna and communications infrastructure in a fully redundant X-band system that represents mature technology today. I’m interested to see that they eschew optical strategies, saying these would “pose too stringent pointing requirements on the spacecraft.”
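As a quick back-of-the-envelope check on what ‘the equivalent of four 35-meter dishes’ buys in collecting area, here is a minimal sketch. It uses geometric area only and ignores aperture efficiency and the losses involved in arraying antennas, so treat it as a rough guide rather than anything from the paper:

```python
import math

def dish_area(diameter_m: float) -> float:
    """Geometric collecting area of a circular dish (m^2)."""
    return math.pi * (diameter_m / 2) ** 2

four_35m = 4 * dish_area(35.0)    # four 35-m Deep Space Antenna class dishes arrayed
single_70m = dish_area(70.0)      # one 70-m dish for comparison

print(f"4 x 35 m dishes: {four_35m:,.0f} m^2")
print(f"1 x 70 m dish:   {single_70m:,.0f} m^2")
# Both come out to ~3,850 m^2: arraying four 35-m antennas roughly matches
# the collecting area of a single 70-m dish.
```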

STELLA makes the case for Europe:

The architecture of the array should be studied during an early phase of the mission (0/A). European industries are among the world leaders in the field. mtex antenna technology (Germany) is the sole prime to develop a production-ready design and produce a prototype 18-m antenna for the US National Radio Astronomy Observatory (NRAO) next-generation Very Large Array (ngVLA) facility. Thales Alenia Space (France/Italy) and Schwartz Hautmont (Spain) are heavily involved in the development of the new 35-m DSA antenna.

As the intent of the authors is to suggest possible European vectors for collaboration in Interstellar Probe, their review of key technology drivers is broad rather than deep; they’re gauging the likelihood of meshing areas where ESA’s expertise can complement the NASA concept, some of them needing serious development from both sides of the Atlantic. Chemical propulsion could work for IP, for example, given the options of using heavy-lift vehicles like NASA’s SLS and the possibility, down the road, of a SpaceX Starship or Blue Origin vehicle to complement the launch catalog. The availability of such craft, coupled with a passive gravity assist at Jupiter, points to a doubling of Voyager’s escape velocity, reaching 7.2 AU per year (roughly 34 kilometers per second).
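For readers who like to check the unit conversion, here is a minimal sketch turning the quoted 7.2 AU per year into km/s. The Voyager 1 comparison and the flight-time comments are my own arithmetic, not figures from the paper:

```python
AU_M = 1.495978707e11        # astronomical unit in meters
YEAR_S = 365.25 * 86400      # Julian year in seconds

def au_per_year_to_km_s(v_au_yr: float) -> float:
    """Convert a speed in AU/year to km/s."""
    return v_au_yr * AU_M / YEAR_S / 1e3

print(f"7.2 AU/yr = {au_per_year_to_km_s(7.2):.1f} km/s")   # ~34 km/s
print(f"3.6 AU/yr = {au_per_year_to_km_s(3.6):.1f} km/s")   # ~17 km/s, roughly Voyager 1's escape speed

# At 7.2 AU/yr the probe covers ~360 AU in its 50-year design life;
# reaching 1000 AU at that rate would take roughly 140 years, which is
# why that distance reads as a stretch goal.
```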

As to power, NASA is en route to bringing the necessary nuclear package online via the Next-Generation Radioisotope Thermoelectric Generator (NextGen RTG) under development at NASA Glenn. But improvements in communications at this range represent one area where European involvement could play a role, as does reliability of the sort that can ensure a viable mission lasting half a century or more. Thus:

Development and implementation of qualification procedures for missions with nominal lifetimes of 50 years and beyond. This would provide the community with knowledge of designing long-lived space equipment and be helpful for other programs such as Artemis.

This area strikes me as promising. We’ve already seen how spacecraft never designed for missions of such duration have managed to go beyond the heliosphere (the Voyagers), and developing the hardware with sufficient reliability seems well within our capabilities. Other areas ripe for further development are pointing accuracy and deep space communication architectures, thus the paper’s emphasis on ESA’s role in refining the use of integrated deep space transponders for Interstellar Probe.

Whether the JHU/APL Interstellar Probe design wins approval or not, the fact that we are considering these issues points to the tenacious vitality of space programs looking toward expansion into the outer Solar System and beyond, a heartening thought as we ponder successors to the Voyagers and New Horizons. The ice giants and the VLISM region will truly begin to reveal their secrets when missions like these fly. And how much more so if, along the way, a propulsion technology emerges that reduces travel times to years instead of decades? Are beamed sails the best bet for this, or something else?

The paper is Wimmer-Schweingruber et al., “STELLA—Potential European contributions to a NASA-led interstellar probe,” a whitepaper that was submitted to NASA’s 2023/2024 decadal survey based on a proposal submitted to the European Space Agency (ESA) in response to its 2021 call for medium-class mission proposals. Frontiers in Astronomy and Space Sciences, 17 November 2022 (full text).

For detailed information about Interstellar Probe, see McNutt et al., “Interstellar probe – Destination: Universe!” Acta Astronautica Vol. 196 (July 2022), 13-28 (full text) as well as Brandt et al., “Interstellar Probe: Humanity’s Exploration of the Galaxy Begins,” Acta Astronautica Vol. 199 (October 2022), 364-373 (full text).


KOBE: The Hunt for Habitable Zone K-dwarf Planets

From the standpoint of producing interesting life, K-dwarf stars look intriguing. Our G-class Sun is warm and cozy, but its lifetime is only about 10 billion years, while K-dwarfs (we can also call them orange dwarfs) can last up to 45 billion years. That’s plenty of time for evolution to work its magic, and while G-stars make up only about 6 or 7 percent of the stars in the galaxy, K-dwarfs account for three times that amount. We have about a thousand K-dwarfs within 100 light years of the Solar System.

When Edward Guinan (Villanova University) and colleague Scott Engle studied K-dwarfs in a project called “GoldiloKs,” they measured the age, rotation rate, and X-ray and far-ultraviolet radiation in a sampling of mostly cool G and K stars (see Orange Dwarfs: ‘Goldilocks’ Stars for Life?). Their work took in a number of K-stars hosting planets, including the intriguing Kepler-442, which has a rocky planet in the habitable zone. Kepler-442b is where we’d like it to be in terms of potential habitability, though it’s twice as massive as Earth, and it raises the question of why such detections seem so rare.

In other words, why do we have so few habitable zone planet detections around K-dwarfs? A new paper from astronomers with the European Southern Observatory points out that we have only a small number of such worlds at present. Have a look at the table below to see what I mean. Here we’re looking at known planets, which means either confirmed or validated, that are found within the ‘optimistic’ habitable zone of K-dwarfs with an effective temperature between 3800 K and 4600 K. ESO’s Jorge Lillo-Box and his fellow researchers go so far as to declare this lack of habitable zone worlds ‘the K-dwarf habitable zone desert’ in the paper, which has just appeared in Astronomy & Astrophysics (citation below).

Image: This is Table 1 from the paper, showing confirmed or validated K-dwarf planets in the habitable zone known as of May 2022.

A few points about all this. First of all, note the distinction above between ‘confirmed’ and ‘validated’ planets, two terms that are all too often conflated but mean different things. With the original Kepler mission (before the K2 extended mission), a planet candidate was a transit signal that passed various false-positive tests. A ‘validated’ planet is one that has been studied in follow-up observations and determined quantitatively to be more likely an exoplanet than a false positive. Thus ‘validated’ denotes a planet in whose existence we have greater confidence than in a mere ‘candidate.’ Confirmation, as for example through radial velocity measurement of the transiting planet’s mass, is the final step in turning a validated planet into a confirmed one.

So what’s happening in the K-dwarf desert, or more to the point, why is the desert there in the first place? Expectations are high, for example, that a K-dwarf like Centauri B might host a planet in the habitable zone, such orbits being allowed even in this close binary system. The authors point out that the reason may simply be lack of observation. There is understandable emphasis on G-class stars because they are like the Sun, while M-dwarfs are highly studied because planets there are more readily detectable. What is needed is a dedicated program of K-dwarf observations.

Image: This infographic compares the characteristics of three classes of stars in our galaxy: Sunlike stars are classified as G stars; stars less massive and cooler than our Sun are K dwarfs; and even fainter and cooler stars are the reddish M dwarfs. The graphic compares the stars in terms of several important variables. The habitable zones, potentially capable of hosting life-bearing planets, are wider for hotter stars. The longevity for red dwarf M stars can exceed 100 billion years. K dwarf ages can range from 15 to 45 billion years. And, our Sun only lasts for 10 billion years. Red dwarfs make up the bulk of the Milky Way’s population, about 73%. Sunlike stars are merely 6% of the population, and K dwarfs are at 13%. When these four variables are balanced, the most suitable stars for potentially hosting advanced life forms are K dwarfs, sometimes called orange dwarf stars. Credit: NASA, ESA, and Z. Levy (STScI).

Enter KOBE, standing for K-dwarfs Orbited By habitable Exoplanets. This is a survey, introduced by Lillo-Box and team, that monitors the radial velocity of 50 pre-selected K-dwarfs using the CARMENES spectrographs mounted on the 3.5m telescope at the Calar Alto Observatory in southern Spain. Given the capabilities of the instruments and planet occurrence expectations, the team believes it will find 1.68 ± 0.25 planets per star, with about half of these likely to be planets in the habitable zone.

The choice of K-dwarfs is interesting in various ways. I’ve focused on this class of star recently because while G-class stars like the Sun offer obvious analogies to our own Solar System, their size puts habitable zone orbits far enough from the star that a radial velocity campaign to detect them takes years. Bear in mind as well that, as I learned from this paper, G-class stars produce a lot more stellar noise than K-dwarfs, making detections more problematic, even with instruments like ESPRESSO.

With M-dwarfs as well, we run into intrinsic problems. They tend to be active stars, so that working at the levels needed to detect habitable zone planets likewise means extracting data from the noise (not to mention the effect of flares on potential habitability!). We’ll continue to put a lot of emphasis on M-dwarfs because, with habitable zones closer to their smaller stars, any planets there are more readily detectable, but given the advantages of K-dwarf detection, an effort like KOBE is well justified.

The authors make the case this way:

K-dwarfs have their HZ located at longer periods (typically 50-200 days), where planets can have their rotation and orbital periods decoupled, thus allowing the planet to have day-night cycles. Stellar activity and magnetic flaring is dramatically diminished for stars earlier than M3 and specially in the late K-type domain… Consequently, habitability is not threatened by these effects as much as it is in the HZ planets around M dwarfs. Besides, unlike in M-dwarfs, we can derive, in a standard way, precise and reliable stellar parameters, as well as chemical abundances that are relevant to a proper characterization of the planets and the star-planet connection…

Indeed, when young, K-dwarfs show UV and X-ray radiation levels 5 to 50 times lower than early M-dwarfs, an interesting point re habitability prospects. And as the authors note, this type of star offers ‘the best trade-off’ for detecting biosignature molecules through direct imaging using next-generation space observatories.

So I’m all for KOBE and similar efforts that may arise to populate our catalog of habitable planets around this interesting kind of star. Because it focuses on the detection of new worlds in the habitable zone, KOBE rules out stars that have already had exoplanets discovered around them or are highly monitored by other surveys. The effort runs through 2024, and if things go according to expectation, we should wind up with about 25 new planets in the so-called K-dwarf habitable zone desert.
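If you want the bookkeeping behind that figure, the sketch below simply turns the post’s own numbers into per-star rates. The implied habitable-zone rate (roughly 0.5 planet per star) is derived here from 25 planets over 50 targets, not taken from the KOBE paper itself:

```python
n_targets = 50               # K dwarfs in the KOBE sample
planets_per_star = 1.68      # expected detections per star (+/- 0.25)
hz_yield = 25                # habitable-zone planets anticipated by the survey

total_expected = n_targets * planets_per_star     # ~84 planet detections overall
hz_per_star = hz_yield / n_targets                # ~0.5 habitable-zone planet per star

print(f"Expected detections overall: ~{total_expected:.0f}")
print(f"Implied habitable-zone rate: ~{hz_per_star:.1f} planets per star")
```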

Image: The 3.5m telescope at Calar Alto Observatory under the Milky Way. Credit: CAHA.

It’s always useful to delve into anomalies that may simply be the result of observational biases as we assemble a more accurate picture of the systems in the stellar neighborhood. And while it’s a southern sky object and thus out of range of KOBE’s efforts at Calar Alto, I still have high hopes for Centauri B, the closest K-dwarf to Earth…

The paper is Lillo-Box et al., “The KOBE experiment: K-dwarfs Orbited By habitable Exoplanets Project goals, target selection, and stellar characterization,” Astronomy & Astrophysics Vol. 667 (15 November 2022) A102. Full text.


Simultaneous Growth of Planet & Star?

I’m interested in a new paper on planet formation, not only for its conclusions but its methodology. What Amy Bonsor (University of Cambridge) and colleagues are drawing from their data is how quickly planets can form. We’ve looked numerous times in these pages at core accretion models that explain the emergence of rocky worlds and gravitational instability models that may offer a way of producing a gas giant. But how long after the formation of the circumstellar disk do these classes of planets actually appear?

A planet like the Earth poses fewer challenges than a Jupiter or Saturn. Small particles run into each other within the gas and dust disk surrounding the young star, assembling planets and other debris through a process of clumping that eventually forms planetesimals that themselves interact and collide. Thus core accretion: The planet ‘grows’ in ways that are readily modeled and can be observed in disks around other stars.

But the gas giants still pose problems. Core accretion would suggest the growth of a solid core that gradually draws in mass until a dense atmosphere enshrouds it. But the core accretion process, according to the latest models, takes long enough that by the time it has finished, the disk is depleted. Gas giants are primarily made of hydrogen and helium, but these gases disappear from the disk in relatively short order. So a gas giant’s formation has to occur quickly and early, before the needed hydrogen and helium are blown off by radiation from the young star or consumed by it.

Gas giants around M-dwarfs may be one route to follow here, for core accretion seems to operate more slowly in M-dwarf systems, and we almost have to call in relatively short-order clumping – this is the disk instability model championed by Alan Boss – to explain how such worlds could form. So one observational path into gas giant formation is to look for such worlds around smaller stars, where their presence would indicate a model other than core accretion at work. But Bonsor and team have chosen an ingenious alternate route: They’re probing the atmospheres of white dwarf stars.

Bonsor points out that “[s]ome white dwarfs are amazing laboratories, because their thin atmospheres are almost like celestial graveyards.” Exactly so, because so-called ‘polluted’ white dwarfs show clear signs of heavy elements like magnesium, iron and calcium in their atmospheres, and the assumption is that these elements must be the result of small bodies within the stellar system falling into the parent star. So the method here is to use spectroscopic observations to probe the composition of asteroids that are long gone, but whose traces help us chart the conditions of their formation.

Studying the atmospheres of more than 200 polluted white dwarfs, the researchers found that the elements there can only be explained by the infall of asteroids that have undergone differentiation. In other words, they have gone through the process of melting, with iron sinking into their core while lighter elements rise to the surface. Amy Bonsor explains:

“The cause of the melting can only be attributed to very short-lived radioactive elements, which existed in the earliest stages of the planetary system but decay away in just a million years. In other words, if these asteroids were melted by something which only exists for a very brief time at the dawn of the planetary system, then the process of planet formation must kick off very quickly.”

Image: This is Figure 3 from the paper. Caption: The core- or mantle-rich materials in the atmospheres of white dwarfs are the collision fragments of planetesimals that formed earlier than ∼1 Myr, when large-scale melting was fueled by the decay of 26Al. Alternatively, in the most massive, close-in, highly excited, planetesimal belts, catastrophic collisions between Pluto-sized bodies (anything with D > 1,400 km) could supply most smaller planetesimals. Gravitational potential energy during accretion can fuel large-scale melting and core formation in these large bodies, such that almost all planetary bodies in the belt are the collision fragments of core–mantle differentiated bodies. tMS, tGB and tWD refer to the star’s main-sequence and giant branch lifetimes and the start of the white dwarf phase. Credit: Bonsor et al.

The researchers’ simulations of planetesimal and collisional evolution show that short-lived radioactive nuclides like Aluminium-26 (26Al) are the most likely heat source to explain the accreted iron core or mantle material. From the paper:

The need for enhanced abundances of 26Al to explain core- or mantle-rich white dwarf spectra provides distinct evidence for the early formation of planetesimals in exoplanetary systems contemporaneously with star formation. Rapid planetesimal formation offers an explanation for the difference in mass budgets between Class 0, I and II discs [6]. Our findings point to the growth of large, > 10 km-sized planetesimals, potentially even planetary cores, rather than just the coagulation of pebbles. The earlier planetary cores form, the more likely they are to grow to the pebble isolation mass and the more likely giant planet formation is to occur early-on, which can provide an explanation for substructures commonly observed with ALMA.
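To see why 26Al puts such a tight clock on planetesimal formation, here is a minimal decay sketch. The half-life value (~0.72 Myr) is a standard figure assumed here; it does not appear in the post itself:

```python
import math

HALF_LIFE_MYR = 0.72   # 26Al half-life in Myr (standard value, assumed for this sketch)

def fraction_remaining(t_myr: float) -> float:
    """Fraction of the initial 26Al left after t_myr million years."""
    return math.exp(-math.log(2) * t_myr / HALF_LIFE_MYR)

for t in (0.5, 1, 2, 3, 5):
    print(f"t = {t:>3} Myr: {fraction_remaining(t):6.1%} of the 26Al remains")

# After ~3 Myr less than 6% is left, and after 5 Myr under 1%: planetesimals
# large enough to melt from 26Al heating must therefore have formed within
# roughly the first million years of the system's life.
```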

Early planet formation helps explain how gas giant planets form and seems to put pressure on gravitational instability models, although there may be multiple routes to the same result. But it is striking that the researchers, who include scientists at Oxford, the Ludwig-Maximilians-Universität in Munich, the University of Groningen and the Max Planck Institute for Solar System Research in Göttingen, find evidence for the early formation of planetesimals “contemporaneously with star formation.”

Thus star and planet formation begin concurrently, under this model, with planets evolving during the collapse of the circumstellar disk. I’ve always found white dwarfs fascinating, but that we might probe the origins of stellar systems by analyzing the composition of their atmospheres is remarkable. It points to the continuing vitality of work on this class of star for understanding both planetary and stellar evolution.

The paper is Bonsor et al., “Rapid formation of exoplanetesimals revealed by white dwarfs,” Nature Astronomy 14 November 2022. Abstract.


Super Earths/Hycean Worlds

Dave Moore is a Centauri Dreams regular who has long pursued an interest in the observation and exploration of deep space. He was born and raised in New Zealand, spent time in Australia, and now runs a small business in Klamath Falls, Oregon. He counts Arthur C. Clarke as a childhood hero, and science fiction as an impetus for his acquiring a degree in biology and chemistry. Dave has kept up an active interest in SETI (see If Loud Aliens Explain Human Earliness, Quiet Aliens Are Also Rare) as well as the exoplanet hunt, and today examines an unusual class of planets that is just now emerging as an active field of study.

by Dave Moore

Let me draw your attention to a paper with interesting implications for exoplanet habitability. The paper is “Potential long-term habitable conditions on planets with primordial H–He atmospheres,” by Marit Mol Lous, Ravit Helled and Christoph Mordasini. Published in Nature Astronomy, this paper is a follow-on to Madhusudhan et al’s paper on Hycean worlds. Paul’s article Hycean Worlds: A New Candidate for Biosignatures caught my imagination and led to this further look.

Both papers cover Super-Earths, planets larger than 120% of Earth’s radius but smaller than the Sub-Neptunes, which are generally considered to start at twice Earth’s radius. Super-Earths occur around 40% of the M-dwarf stars examined and are projected to constitute 30% of all planets, making them the most common type in the galaxy. Hycean planets are a postulated subgroup of Super-Earths with a particular geology and chemistry: a water layer sitting above a rocky core and beneath a primordial hydrogen–helium atmosphere.

We’ll be hearing a lot more about these worlds in the future. They are similar enough to Earth to be regarded as a good target for biomarkers, but being larger than Earth, they are easier to detect via stellar Doppler shift or stellar transit, and their deep atmospheres make obtaining their spectra easier than with terrestrial worlds. The James Webb telescope is marginal for this purpose, but getting detailed atmospheric spectra is well within the range of the next generation of giant, ground-based telescopes: the 39-meter Extremely Large Telescope and the 24.5-meter Giant Magellan Telescope, both of which are under construction and set to start collecting data by the end of the decade (the status of the Thirty Meter Telescope is still problematic).

Earth quickly lost its primordial hydrogen-helium atmosphere, but once a planet’s mass reaches 150% of Earth’s, this process slows considerably and planets more massive than that can retain their primordial atmosphere for gigayears. Hydrogen, being a simple molecule, does not have a lot of absorption lines in the infrared, but under pressure, the pressure-broadening of these lines makes it a passable greenhouse gas.

If the atmosphere is of the correct depth, this will allow surface water to persist over a much wider range of insolation than with Earth-like planets. With a sufficiently dense atmosphere, the insulating effect is enough to maintain temperate conditions over geological lengths of time from the planet’s internal heat flow alone, meaning these planets can have temperate surface conditions even if they have been ejected from planetary systems and wander the depths of space.

Figure 1: This is a chart from Madhusudhan et al’s paper showing the range where Hycean planets maintain surface temperatures suitable for liquid water, compared with the habitable zone for terrestrial planets as derived by Kopparapu et al. ‘Cold Hycean’ refers to planets where stellar insolation plays a negligible part in heating the surface. Keep in mind that Lous et al regard the inner part of this zone as unviable due to atmospheric loss.

Madhusudhan et al’s models were a series of static snapshots under a variety of conditions. Lous et al’s paper builds on this by modeling the surface conditions of these planets over time. The authors take a star of solar luminosity with a solar evolutionary track and, using 1.5, 3 and 8 Earth mass planets, model the surface temperature over time at various distances and hydrogen overpressures, also calculating in the heat flow from radiogenic decay.

Typically, a planet will start off too hot. Its steam atmosphere will condense, leaving the planet with oceans; and after some period, the surface temperature will fall below freezing. The chart below shows the length of time a planet has a surface temperature that allows liquid water. (Note that, because of higher surface pressures, water in these scenarios has a boiling point well over 100°C, so the oceans may be considered inhospitable to life for parts of their range.)

Planets with small envelope masses have liquid water conditions relatively early on, while planets with more massive envelopes reach liquid water conditions later in their evolution. Out to 10 au, stellar insolation is the dominant factor in determining the surface temperature, but further out than that, the heat of radiogenic decay takes over. The authors plot the atmospheric mass in Earth masses, log(Matm/MEarth), on their Y axis, which I didn’t find very helpful. To convert this to an approximate surface pressure in bars, use the following rule of thumb: 10⁻⁶ ≈ 1 bar, 10⁻⁵ ≈ 10 bar, 10⁻⁴ ≈ 100 bar and so on.
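That rule of thumb follows from simple hydrostatic balance, P = M_atm · g / (4πR²). Here is a minimal sketch that assumes Earth’s gravity and radius throughout; a more massive planet’s higher gravity and larger radius would shift the numbers somewhat:

```python
import math

M_EARTH = 5.972e24      # kg
R_EARTH = 6.371e6       # m
G_SURF  = 9.81          # m/s^2, Earth surface gravity

def surface_pressure_bar(atm_mass_fraction: float) -> float:
    """Surface pressure (bar) for an atmosphere of the given mass fraction,
    assuming Earth-like gravity and radius: P = M_atm * g / (4*pi*R^2)."""
    m_atm = atm_mass_fraction * M_EARTH
    p_pa = m_atm * G_SURF / (4 * math.pi * R_EARTH**2)
    return p_pa / 1e5

for frac in (1e-6, 1e-5, 1e-4):
    print(f"M_atm/M_Earth = {frac:.0e}  ->  ~{surface_pressure_bar(frac):.3g} bar")
# Reproduces the rough rule of thumb above: 1e-6 ~ 1 bar, 1e-5 ~ 10 bar, 1e-4 ~ 100 bar.
```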

Figure 2: Charts a-c are for core masses of 1.5 (a), 3 (b) and 8 M⊕ (c). The duration of the total evolution is 8 Gyr. The color of a grid point indicates how long there were continuous surface pressures and temperatures allowing liquid water, τlqw. These range from 10 Myr (purple) to over 5 Gyr (yellow). Gray crosses correspond to cases with no liquid water conditions lasting longer than 10 Myr. Atmospheric loss is not considered in these simulations. d is the results for planets with a core mass of 3 M⊕, but including the constraint that the surface temperature must remain between 270 and 400 K. Every panel contains an ‘unbound’ case where the distance is set to 10⁶ AU and solar insolation has become negligible.

The authors then ran their model adjusted for hydrodynamic escape (Jeans escape is negligible). This loss of atmosphere mainly affects the less massive, closer-in planets with thinner atmospheres.

To quote:

The results when the hydrodynamic escape model is included are shown in Fig. 3. In this case, we find that there are no long-term liquid water conditions possible on planets with a primordial atmosphere within 2 au. Madhusudhan et al. found that for planets around Sun-like stars, liquid water conditions are allowed at a distance of ~1 au. We find that the pressures required for liquid water conditions between 1 and 2 au are too low to be resistant against atmospheric escape, assuming that the planet does not migrate at a late evolutionary stage.

Figure 3: Charts a-c are for core masses of 1.5 (a), 3 (b) and 8 M⊕ (c). d is the results for planets with a core mass of 3 M⊕, but including the constraint that the surface temperature must remain between 270 and 400 K. Note: escape inhibits liquid water conditions by removing the atmosphere for close-in planets with low initial envelope masses. Lower core masses are more affected.

The authors also note that their simulations indicate that, unlike terrestrial planets which require climatic negative feedback loops to retain temperate conditions, Hycean worlds are naturally stable over very long periods of time.

The authors then go on to discuss the possibility of life, pointing out that the surface pressures required are frequently in the 100 to 1000 bar range, comparable to Earth’s deep ocean and with similarly dim light levels, so photosynthesis is out. This is a problem for the search for biomarkers, because photosynthesis produces chemical disequilibria, which are considered a sign of biological activity, whereas chemotrophs, the sort of life forms you would expect to find, make their living by destroying chemical disequilibria.

The authors hope to do a similar analysis with red dwarf stars as these are the stars where Super-Earths occur most frequently. Also, they are the stars where the contrast between stellar and planetary luminosity gives the best signal.

Thoughts and Speculations

The exotic nature of these planets led me to examine their properties, so here are some points I came up with that you may want to consider:

i) The Fulton Gap—also called the small planet mass-radius valley. Small planets around stars have a distinctly bimodal distribution with peaks at 1.3 Earth radii and 2.4 Earth radii with a minimum at 1.8 Earth radii. Density measurements align with this distribution. Super-Earth densities peak, on average, at 1.4 Earth radii with a steady fall off above that. Planets smaller than about 1.5 Earth radii are thought to contain a solid core with shallow atmospheres, whereas planets above 1.8 Earth radii are thought to have deep atmospheres of volatiles and a composition like an Ice-Giant (i.e. they are Sub-Neptunes.)

Taking Lous et al’s planets, a 3 Earth mass planet would have an approximate radius of 1.3 Earth radii. An 8 Earth mass planet would have an approximate radius of 1.8 Earth radii (assuming densities similar to Earth’s). This would point towards the 8 Earth mass planets having an atmosphere too deep to make a Hycean world. The atmosphere would probably transition into a supercritical fluid.

ii) I compared the liquid water atmospheric pressures of our solar system’s giant planets with the expectations of the paper. I had trouble finding good figures, as the pressure-temperature charts peter out at the water-ice cloud level, but here are the approximate figures for the giant planets compared with the range on the 270 K-400 K graph that Lous et al produced:

Jupiter: 7-11 bar / 8-30 bar

Saturn: 10-20 bar / 25-100 bar

Neptune: 50+ bar (50 bar is the level at which ice clouds form) / 200-500 bar

Our giant planets appear to be on the shallow side of the paper’s expectations. This could be attributed to our giant planets having greater internal heat flow than the Super-Earths modeled, but that would make the deviation greatest for Jupiter and least for Neptune. The deviation, however, appears to increase in the other direction.

The authors of the paper note that their models did not take into consideration other greenhouse gases such as ammonia and methane likely to be found in Hycean planets’ atmospheres, which would add to the greenhouse effect and therefore give a shallower pressure profile for a given temperature. And from looking at our giant planets, this would appear to be the case.

This could mean that an unbound world would maintain a liquid ocean under something like 100+ bars of atmosphere rather than the 1000 bars originally postulated.

iii) Next, I considered the chemistry of Hycean worlds. Using our solar system’s giant planets as a guide, we can expect considerable quantities of methane, ammonia, hydrogen sulfide and phosphine in the atmospheres of Hycean worlds. The methane would stay a gas, but ammonia, being highly hydrophilic, would dissolve into the ocean. If the planet’s nitrogen to water ratio is similar to Earth’s, this would result in an approximately 1% ammonia solution. A ratio like Jupiter’s would give a 13% solution. (Ammonia cleaning fluids are generally 1-3% in concentration.) A 1% solution would have a pH of about 12, but some of this alkalinity may be buffered by the hydrosulfide ion (HS⁻) from the hydrogen sulfide in solution.
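For what it’s worth, the standard weak-base estimate puts a 1% solution a bit below that, near pH 11.5, and a Jupiter-like 13% solution right around pH 12. A minimal sketch, using the textbook Kb for ammonia (not a figure from the post) and ignoring both activity corrections and the hydrosulfide buffering just mentioned:

```python
import math

KB_NH3 = 1.8e-5      # base dissociation constant of ammonia at 25 C (standard textbook value)
M_NH3  = 17.03       # molar mass of NH3, g/mol

def ph_of_ammonia_solution(weight_percent: float) -> float:
    """Approximate pH of an aqueous ammonia solution.

    Uses the simple weak-base relation [OH-] = sqrt(Kb * C) and treats the
    solution as dilute (density ~1), so it is only a rough guide at higher
    concentrations."""
    conc_mol_l = weight_percent * 10.0 / M_NH3   # ~grams per liter / molar mass
    oh = math.sqrt(KB_NH3 * conc_mol_l)
    p_oh = -math.log10(oh)
    return 14.0 - p_oh

print(f"1% ammonia:  pH ~ {ph_of_ammonia_solution(1):.1f}")    # ~11.5
print(f"13% ammonia: pH ~ {ph_of_ammonia_solution(13):.1f}")   # ~12.1
```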

It then occurred to me to look at freezing point depression curves of ammonia/water mixtures, and they are really gnarly. An ammonia/water ocean, if cooled below 0°C, will develop an ice cap, but as the water freezes out, this increases the ammonia concentration, causing a considerable depression in the freezing point. If the ocean reaches -60°C, something interesting starts to happen. The ice crystals forming in the ocean and floating up to the base of the ice cap start to sink, as the ocean fluid, now 25% ammonia, is less dense than ice. This will result in an overturn of the ocean and the ice cap. Further cooling will result in the continued precipitation of ice crystals until the ocean reaches a eutectic mixture of approximately 2 parts water to 1 part ammonia, which freezes at -91°C. (For comparison, pure ammonia freezes at -78°C.) Note: all figures are for 1 bar.

When discussing the possibility of liquid water on planets, we have to include the fact that water under sufficient pressure can be liquid up to its critical point of 374°C. The paper takes this into account; but what we see here is that, aside from showing that the range of insolation over which planets can have liquid water is larger than we thought, the range that water can be liquid is also larger than we assumed.

While some passing thought has been given to the possibility of ammonia as a solvent for life forms, nobody appears to have considered water/ammonia mixtures.

iv) Turning from ammonia to methane, I began to wonder if these planets would have a brown haze like Titan. A little bit of research showed that the brown haze of Titan is mainly made of tholins, which are formed by the UV photolysis of methane and nitrogen. Tholins are highly insoluble in hydrocarbons, which is why Titan’s lakes are relatively pure mixtures of hydrocarbons. However, tholins are highly soluble in polar solvents like water. So a Hycean planet with a water cycle would rain out tholins that formed in the upper atmosphere, but if the surface was frozen like Titan’s, they would stay in the atmosphere, forming a brown haze.

This points to the possibility that there are significant differences in the composition of a Hycean planet’s atmosphere depending on whether its surface is frozen or oceanic, and this may be detectable by spectroscopy.

I’m looking forward to finding out more about these planets. In some ways, I feel that with respect to exosolar planets we are now in a position similar to that of our own solar system in the early 60s – eagerly awaiting the first details to come in.

References

Marit Mol Lous, Ravit Helled and Christoph Mordasini, “Potential long-term habitable conditions on planets with primordial H–He atmospheres,” Nature Astronomy, 6: 819-827 (July 2022). Full text.

Nikku Madhusudhan, Anjali A. A. Piette, and Savvas Constantinou, “Habitability and Biosignatures of Hycean Worlds,” The Astrophysical Journal (August 2021). Preprint.

Fulton et al., “The California-Kepler Survey. III. A Gap in the Radius Distribution of Small Planets,” The Astronomical Journal, 154 (3) 2017. Abstract.

Christopher P. McKay, “Elemental composition, solubility, and optical properties of Titan’s organic haze,” Planetary and Space Science, 8: 741-747 (1996). Abstract.


Stapledon’s Hawk

Walking along dark streets this morning, as autumn leaves gusted past under a deepening lunar eclipse, I realized that there was a reason for my recent foray into what I called ‘Stapledon thinking.’ The reason: Landscape by moonlight.

What these early walks remind me of is the beginning of Olaf Stapledon’s 1937 novel Star Maker, in which the narrator takes a similar walk in the darkness, musing on his personal relationships as well as his place in the larger structure of the cosmos (I’m using the word ‘structure’ there deliberately, as we’ll see later). The narrator walks to a hill overlooking houses below, somewhere near the sea.

There is a lighthouse. He sits down on the heather. And now ‘the hawk-flight of imagination,’ in Stapledon’s lovely phrase, takes over. An astral journey begins:

Imagination was now stimulated to a new, strange mode of perception. Looking from star to star, I saw the heaven no longer as a jeweled ceiling and floor, but as depth beyond flashing depth of suns. And though for the most part the great and familiar lights of the sky stood forth as our near neighbors, some brilliant stars were seen to be in fact remote and mighty, while some dim lamps were visible only because they were so near. On every side the middle distance was crowded with swarms and streams of stars. But even these now seemed near; for the Milky Way had receded into an incomparably greater distance. And through gaps in its nearer parts appeared vista beyond vista of luminous mists, and deep perspectives of stellar populations.

Image: William Olaf Stapledon (1886–1950), whose novels on humanity’s future depict a cosmos that dwarfs human understanding and challenges all our philosophy. Credit: Wikimedia Commons.

My flight of imagination the other day was hardly as dramatic, but the memory of the opening of Star Maker informed my thinking and led to my musings on the Fermi Question. For Stapledon’s narrator will travel deep into the cosmos in his astral form and, along the way, perceive things that pose deadly challenges to our anthropocentrism. Stapledon corresponded with H.G. Wells and was an influence on writers as disparate as C.S. Lewis, Brian Aldiss, Bertrand Russell and Vernor Vinge. He became a major factor in Arthur C. Clarke’s thinking – ponder Childhood’s End (1953), with its Overlords and transcendent ‘Overmind.’

Even more pointedly, consider Clarke’s Diaspar in The City and the Stars (1956), and the multi-hued seven-star asterism created by a long departed galactic empire in the novel. We’re getting at the roots of ‘Stapledon thinking’ when we talk about things of inconceivable (to us) scale being shaped by intelligences that may or may not be transcendent. Stapledon’s imagination knew few boundaries, a thought underlined by the fact that the idea of a star enclosed so that a civilization could use all of its energy was actually one of the tamer things his Star Maker traveler would encounter. Here is how the idea of such a sphere appears in the novel. The narrator sees the galaxy developing into a single intelligence subsuming its parts:

This whole vast community looked now beyond itself toward its fellow galaxies. Resolved to pursue the adventure of life and of spirit in the cosmical, the widest of all spheres, it was in constant telepathic communication with its fellows; and at the same time, conceiving all kinds of strange practical ambitions, it began to avail itself of the energies of its stars upon a scale hitherto unimagined. Not only was every solar system now surrounded by a gauze of light traps, which focused the escaping solar energy for intelligent use, so that the whole galaxy was dimmed, but many stars that were not suited to be suns were disintegrated, and rifled of their prodigious stores of sub-atomic energy.

And there you have what we generally call a ‘Dyson sphere.’ Let’s pause here to note that Freeman Dyson told everyone who would listen that he drew his concept originally from Stapledon, which is why I chose ‘Stapledon thinking’ as my focus even while elsewhere referring to ‘Dysonian SETI,’ the latter being the search for artifacts like such spheres around other stars. It would be just – and Greg Matloff does this – to refer to Stapledon/Dyson spheres, just as we might call the Kuiper Belt the Edgeworth/Kuiper Belt, after Irish astronomer Kenneth Edgeworth, who first predicted it in 1943. In the case of Dysonian SETI, the term seems right because it refers to a scientific search for artifacts, whereas Stapledon’s thinking was deeply philosophic in intent.

That philosophical aspect of Stapledon runs through his entire output and in Star Maker embraces a view of the universe that nudges toward the religious but then draws back from comfortable comparisons to suggest a cosmos that is beyond any human understanding, much less communion. From a SETI standpoint, we are confounded. The narrator’s astral journey encompasses universes within universes, pushing into civilizations that have emerged as global minds that are themselves finally aware of the Star Maker, an even more powerful intellect that cares not at all for the universes it has been creating, but simply makes, and evidently abandons, its earlier work. The narrator, indeed, calls the Star Maker an ‘artist.’ A calculating one, who chooses, when one creation doesn’t measure up (ours does not), to move on to another:

Here was no pity, no proffer of salvation, no kindly aid. Or here were all pity and all love, but mastered by a frosty ecstasy. Our broken lives, our loves, our follies, our betrayals, our forlorn and gallant defenses, were one and all calmly anatomized, assessed, and placed. True, they were one and all lived through with complete understanding, with insight and full sympathy, even with passion. But sympathy was not ultimate in the temper of the eternal spirit; contemplation was. Love was not absolute; contemplation was.

Here I’m reminded of Yeats as much as Stapledon:

Turning and turning in the widening gyre
The falcon cannot hear the falconer…

Star Maker contains many ideas that can conceivably evolve into technologies, even while exploring these deeply metaphysical realms. ‘Stapledon thinking,’ then, couples creativity and conjecture with philosophy, with the suggestion that the exploration of such concepts can be a forerunner of later science. Kepler had conceptions of a structured and mathematically tuned system of planetary orbits that would eventually produce his familiar laws of planetary motion. The ‘Platonic solids’ had nothing to do with it, as it turns out, but the laws he discovered still pertain.

I suggest that such thinking gives us insights into the Fermi Question in that ‘Where are they’ offers no solutions – to this point, anyway – but only a deepening series of probes. This is science fiction’s eternal ‘what if’ pushed about as hard as it can go. Because if there are other civilizations out there, we have no way of knowing how they function, or indeed think, or indeed perceive. We are on the shoals of ignorance.

Olaf Stapledon’s work echoes through science fiction to this day, and perhaps no more tellingly than in the work of Canadian writer and futurist Karl Schroeder. His question: Does our own ignorance about extraterrestrial civilizations imply that if life is indeed common in the universe, it must evolve to a point where its works are indistinguishable from nature? In his essential survey of Fermi ‘solutions’ The Great Silence: Science and Philosophy of Fermi’s Paradox, Milan Ćirković spends a good deal of time with Schroeder, recognizing how thoroughly the writer has explored these questions in novels like Permanence (Tor Books, 2002) and Lockstep (Tor, 2014), where outcomes that fit our lack of SETI success flow out of unusual premises.

‘Indistinguishable from nature’ is, of course, Schroeder’s canny nod to Clarke’s ‘indistinguishable from magic,’ and here is what he means (as drawn from The Deepening Paradox, an essay on his website). The italics are mine:

If the Fermi Paradox is a profound question, then this answer is equally profound. It amounts to saying that the universe provides us with a picture of the ultimate end-point of technological development. In the Great Silence, we see the future of technology, and it lies in achieving greater and greater efficiencies, until our machines approach the thermodynamic equilibria of their environment, and our economics is replaced by an ecology where nothing is wasted. After all, SETI is essentially a search for technological waste products: waste heat, waste light, waste electromagnetic signals. We merely have to posit that successful civilizations don’t produce such waste, and the failure of SETI is explained.

If a civilization produces no waste heat, is it somehow manipulating the laws of thermodynamics? We can push this conjectural realm still further. It was through Ćirković that I learned about Stanislaw Lem’s “The New Cosmogony,” which is included in his collection A Perfect Vacuum. Here we find a conjectured universe populated by the first civilizations to emerge into awareness, billions of years ago. Their operations are so embedded in the natural world that we perceive them as essential characteristics of the laws of physics, which they in fact manipulate to their own advantage. They have done this through all stages of the universe’s evolution. The work of these ‘Players,’ as Lem styles them, is utterly beyond our observation, or perhaps better to say, beyond our comprehension – we do observe it as nature itself.

We are indeed latecomers, whether the fantastic notions of Schroeder or Lem have traction or not. The formation of terrestrial-class planets could have begun as much as eight billion years before our own Solar System emerged, making the questions of how intelligence appears and how long civilizations last a pointed issue indeed. Ćirković notes about the Fermi Question that “…the very richness of the multidisciplinary and multicultural resources required by individual explanatory hypotheses enables us to claim that it is the most complex multidisciplinary problem in contemporary science.” His taxonomy of Fermi ‘solutions’ explores the entirety of this conceptual space as currently conceived.

Consider, for example, the matter of post-biological evolution, which Larry Klaes brought up in his recent essay. Is such evolution inevitable? If so, it would have an impact on how we do SETI. Here’s Ćirković:

Coupled with the ideas of interstellar colonization and astroengineering, postbiological evolution changes the entire game: we need not – and indeed should not – target habitable planets and circumstellar habitable zones in our SETI searches. Instead, we ought to focus on regions with the greatest amounts of resources, including metals and energy, as well as low working temperatures, as the best locales for optimized computation. Surveying warm, wet places would not make much sense.

And we’re clearly going to be finding further ‘solutions’ to the Fermi question as we proceed, for increasing capabilities in our instrumentation will suggest new prospects for discovery. The incontrovertible fact is that about other civilizations we have no data, and I am not one of those who is content to avoid speculation until such data arrive, if this ever happens. ‘Stapledon thinking,’ then, points to an amalgam of musing that is as much at home in hard science as it is in Plato or the films of Alain Resnais. It calls on us to pull out the stops and ask questions that some might find more comfortable to discuss in a pub than a faculty lounge. Or perhaps the pages of a science fiction novel, a field in which Stapledon’s influence will always loom large.

That such matters take us outside the realm of science and into philosophy and metaphysics should not surprise us. But it is equally clear that the science we practice on our species’ place in the universe inevitably raises questions it cannot yet answer. We probe, we analyze, we conceive of possibilities. We assume answers are out there.

We keep looking.


In Person or Proxy to Mars and Beyond?

Larry Klaes is well known in these parts for his extraordinary reviews of classic science fiction films. Today, however, he steps back from cinema to consider how we will expand into space. The crews on our deep space missions will doubtless be a lot different than some of those old black-and-white movies would suggest. Just how will our species adapt to the environments it will soon be exploring? There’s nothing quite so lush as our own blue and green planet, yet the imperative to move ever outward is a driver for our species. Mars is a case in point, but the long-range picture is that we’re looking off-planet and already pondering destinations beyond the Solar System. Re-shaping our expectations will be a part of what drives the scientists and engineers who equip us for the next steps. An earlier version of this essay was published by The Mars Society.

by Larry Klaes

In 1972, singer, pianist, and composer Sir Elton Hercules John (born 1947) released a song titled “Rocket Man”. This piece, which was inspired by a Ray Bradbury (1920-2012) science fiction story of the same name, has an individual who sees his job in outer space not as some grand adventure as one might expect of a typical astronaut, but rather as ordinary and isolating.

Not only does this Rocket Man miss Earth and his wife living there, declaring “it’s lonely out in space,” he also says that “Mars ain’t the kind of place to raise your kids/In fact it’s cold as hell/And there’s no one there to raise them/If you did.”

As a life-long space and astronomy enthusiast, I was highly disappointed with the song’s message when I first became aware of it. “Rocket Man” was a definite reflection of the counterculture era, in which many rejected what they saw as the militant flaws and antiquated traditions of a society that held back all but a select privileged few.

The space program fell into that category, being seen as a vehicle of a predominantly white male military-industrial complex. That it was also so publicly prominent in the news and entertainment media only made it an even easier target for criticism, in particular the kind that asked why we were spending money on sending humans to the Moon when there were so many problems on Earth that needed fixing first.

Even as a kid I knew this was an “apples and oranges” situation. The National Aeronautics and Space Administration, or NASA, was funded far less than most other government agencies of that era, a status that remains true to the present day. Diverting all its resources to social agendas would have been a temporary band-aid at best, not a real solution to modern civilization’s myriad problems.

Image: A future Mars settlement as envisioned by SpaceX. Is this how humanity will live on other worlds, or will something else be required?

Nevertheless, the general public, which had supported the early bold declaration of “sending a man to the Moon and returning him safely to the Earth” within ten years, had undergone a sea change by the time NASA was actually placing astronauts on our planet’s nearest celestial neighbor at the end of the 1960s and into the early 1970s.

I had grown up in that era of the early Space Age, when humans were actively circling Earth in preparation for landing representatives of our species on the Moon and robotic probes had begun to reveal other worlds such as Venus and Mars. I bought into the future storyline of the 1968 film 2001: A Space Odyssey and all the other pro-space entertainment so prevalent then: that humanity would almost automatically spread out and colonize first the Moon, then the other places in our Sol system, before moving on into the wider Milky Way galaxy.

I did not pay much attention to the geopolitical and social forces driving and affecting the space programs then, not just because I wasn’t able to fully comprehend them as a naive kid, but also because I felt they were only temporary issues, ones humanity would conquer as easily and rightly as we were doing with our move into outer space. After all, didn’t Star Trek show a future just a few centuries from now where all of humanity was united, we were flying about the galaxy in fancy starships, and dealing with new alien neighbors as part of a collective called the United Federation of Planets (UFP)?

So, when I heard Elton John warbling a very popular tune declaring that the starry realm was unpleasant, lonely, and no place to bring up children, I was concerned his words would only add fuel to the fire that was already setting back our “manifest destiny” in the Final Frontier at the beginning of the 1970s.

The Apollo lunar program was already being defunded after the seventeenth mission, which in turn killed off any plans for manned lunar colonies. The logical promise of sending humans on to the planet Mars after the success of Apollo (as soon as the 1980s, it was declared in certain circles) was also placed on a shelf. No one said such missions were being canceled, but it was pretty obvious that no one at NASA was seriously working on such an adventure by then, nor would they be any time soon.

Many in the West thought that America’s superpower rival, the Soviet Union, would pick up the gauntlet we had dropped: Soon there would be cosmonaut bootprints on the Moon and Mars as they went on to become the dominant society throughout the Sol system and beyond.

Since then, a lot has changed. The American manned space program is picking up again, with real plans to settle the Moon with a new generation of astronauts in this decade and to send these explorers on to Mars in the 2030s. There is also a new Space Race of a kind, this time mainly with China. Having jumped into that race with its first successful satellite launch in 1970, the “People’s Republic” now has a second crewed space station circling Earth while simultaneously conducting automated rover and sample-return missions to the Moon and operating its first wheeled explorer on the Red Planet.

My attitude and views on our ventures into the Final Frontier have also changed over the decades. I am still quite the space supporter, but I now see that expansion happening in rather different ways, particularly when it comes to whether we should venture into the void directly with fellow human beings.

When I used to read and hear certain professionals, whom I automatically assumed should be big supporters of manned space exploration and settlement, publicly state that robots were better suited than human beings for exploring the cosmic void, I was indignant. They were going against the virtually predestined vision of our species’ expansion into the Milky Way galaxy and all those other stellar islands out there. Humans had to be an integral part of this future, otherwise our species and society would end up either stagnating or outright destroying itself in the very nest of its birth. No one in their right mind would keep a child in its crib and expect it to develop properly.

What needs to be understood is that when the Space Age began in the 1950s (or the 1940s if you want to count the first rockets that breached the actual realm, if only briefly), humans were almost always the foremost choice for conducting all kinds of expeditions, be it on Earth’s surface, at sea, or in the skies. Space would have been no different.

Yes, there were many satellites that went up carrying no living organic beings at all, but their mechanisms and computer “brains” were primitive by current standards. For example, Mariner 2, the first probe to successfully explore the planet Venus in late 1962, contained a computer weighing just over eleven pounds that was capable of “a total of 11 real-time commands and a spare… along with a stored set of 3 onboard commands which could be modified,” according to Oran W. Nicks, then Director of Lunar and Planetary Programs for NASA, as he described in his wonderfully written book Far Travelers: The Exploring Machines (NASA SP-480, 1985).

Even the twin Voyager space probes, designed, built, and launched into the outer Sol system on much more complex missions over a decade after Mariner 2, carried multiple computers that were still less powerful than a modern-day automobile key fob. The onboard computers that helped land astronauts on the Moon with Apollo weighed roughly 70 pounds apiece and held only a few kilobytes of erasable memory, and they had to be specially designed by experts at the Massachusetts Institute of Technology (MIT).
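To put those figures in perspective, here is a minimal back-of-the-envelope comparison in Python. It assumes roughly 2,048 sixteen-bit words of erasable memory for the Apollo Guidance Computer and 8 GiB of RAM for a typical modern phone; both are rounded, illustrative values rather than figures taken from this essay.

```python
# Back-of-the-envelope memory comparison (illustrative, rounded figures only).
# Assumption: the Apollo Guidance Computer offered about 2,048 sixteen-bit
# words of erasable (read/write) memory; a typical modern smartphone
# carries roughly 8 GiB of RAM.

agc_erasable_bits = 2_048 * 16            # ~32,768 bits, about 4 KB
phone_ram_bits = 8 * 1024**3 * 8          # 8 GiB expressed in bits

print(f"AGC erasable memory: {agc_erasable_bits // 8 // 1024} KB")
print(f"A modern phone holds roughly {phone_ram_bits // agc_erasable_bits:,} times more")
```

Under these assumptions the ratio comes out to about two million to one, which is the scale of change the rest of this section is describing.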

On Earth, up until the first personal computers began showing up in large numbers in the late 1970s, the majority of “thinking” machines were bulky, heavy, and most often required trained specialists to operate them. So it is easy to see why most people back then assumed the best “computer” for exploring outer space was the three or so pounds of “gray matter” occupying the skull of a functioning and properly educated adult human.

This technology has certainly changed since the first two decades of the Space Age. The average person now routinely works and plays with lightweight computers possessing storage levels and functionalities that would have been pure science fiction to their parents and grandparents. The machines currently exploring the Moon and Mars have autonomous capabilities that allow them to independently run their own missions while also being smart enough to avoid potential hazards in these alien environments.

As one may easily imagine, computing and robotic technologies for space are only going to improve in the coming decades to the point that one may rightly question the purpose of sending humans to distant worlds when much more durable and far less expensive and resource-demanding robots equipped with sophisticated Artificial Intelligence (AI) minds could do the same tasks.

Deadly Rays and Dwellings

Lower resource demands and relatively cheaper funding aren’t the only reasons for sending machines rather than humans to explore other worlds. The cosmic environment beyond Earth is quite hazardous indeed for a species that has spent its entire existence evolving on a planet that is a virtual paradise for our biology compared to every other place in our Sol system (and who knows how far beyond).

Mars has often been considered the world most comparable to Earth, yet even the least harsh places on the Red Planet make Antarctica look like a tropical island. Possessing only a very thin atmosphere composed mostly of carbon dioxide, and no appreciable ozone layer or magnetic field, Mars is constantly bombarded by high levels of radiation from solar subatomic particles and cosmic rays. Solar ultraviolet rays also reach the Red Planet’s surface unabated. Meteoroids of most sizes are not slowed by the Martian air as they would be on Earth: orbiting probes have imaged the results of multiple recent impacts, and the rovers have found confirmed meteorites as they roam their rather narrow swaths of the Martian landscape.

For Martian settlers to survive all these dangers, they would need to either build structures with heavily reinforced, radiation-proof roofs, cover their settlement with local regolith, or bury their dwellings deep underground. In most of these cases, unless humanity develops some form of transparent radiation shielding, the residents will have to live without a direct view of, or easy access to, the surface of their new homeworld.

Can humans stand being in an artificial environment underground on Mars for decades or even their entire lives? Down the road, settlements may be made large and luxurious enough to recreate the nature found on Earth, but the early pioneers will probably not be so fortunate. Will they last long enough psychologically to establish a permanent residence on Mars?

It is easy for those of us who are living now in the relative comfort and safety of Sol 3 to assume that those first settlers on our planetary neighbor can “tough it out” like the pioneers of the olden days did, but those ancestors who sought a new life did so on a world they were already adapted to physiologically. Martian settlers will require a great deal of preparation and mechanical services just to keep the climate of the Red Planet from outright killing them within minutes if they are ever exposed to the raw environment. Running back to Earth in the event of a disaster is not a quick option.

Terrestrial explorers and settlers also did not need to worry about the effects of lower gravity: the pull of Mars at its surface is only 38 percent of that experienced by those living on Earth. Not only will this eventually weaken the first settlers and their descendants, it may also create unexpected health issues and affect how humans gestate in the womb and how they are born.
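That 38 percent figure falls straight out of Newton’s law of gravitation. The short sketch below recomputes it from standard reference values for the mass and radius of Mars; the constants are assumptions drawn from common references, not from the essay itself.

```python
# Surface gravity from Newton's law: g = G * M / r^2.
# Standard reference values (not from the essay) for Mars and Earth.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_MARS = 6.417e23    # mass of Mars, kg
R_MARS = 3.3895e6    # mean radius of Mars, m
G_EARTH = 9.81       # Earth's surface gravity, m/s^2

g_mars = G * M_MARS / R_MARS**2
print(f"Mars surface gravity: {g_mars:.2f} m/s^2")      # ~3.73 m/s^2
print(f"Fraction of Earth's:  {g_mars / G_EARTH:.0%}")  # ~38%
```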

Our natural satellite has even less gravity and protection from the celestial threats already mentioned. There too settlers will have to live underground and deal with the same situations as their Martian brethren. The other nearby worlds with solid surfaces such as Venus, Mercury, and the Galilean satellites also have their own unique challenges in addition to all those just described.

Circling Earth in Tin Cans

Human beings have been launching representatives of their species into the Final Frontier for just over six decades now. Yet outside of a handful of brief jaunts to the Moon, now fifty-plus years in the past, astronauts and cosmonauts have only experienced space directly as temporary residents of various space stations in perpetual fall around Earth, where their stays currently last between six months and one year.

Unlike space explorers bound for Mars, those in Low Earth Orbit (LEO) are but a matter of hours away from rescue and safety in the event of an emergency. The same applies to resupplying the residents of a space station. These facts will have a definite impact on our first deep-space venturers, who will be months to years away from any kind of help from Earth.

Space is Not a Convenient Species Safety Valve

One thing we cannot count on any time soon is space serving as a means of reducing human overpopulation on Earth, which at this moment is approaching eight billion individuals. An exodus on the scale required to start alleviating that environmental pressure is not going to happen in the foreseeable future, assuming reducing the human population ever becomes a goal in the first place. Besides, we cannot simply “displace” a fraction of our species without serious preparation, and that loops us back to the issues I have already addressed, the ones that will decide whether we can permanently settle space at all.

Of course, none of these obstacles may deter those individuals, organizations, and nations determined to live off Earth despite the various costs. One can easily envision the super wealthy constructing their own space habitats as the ultimate gated community. Others may turn a hollowed-out planetoid or comet into a WorldShip, or multigenerational space ark, and head off into the wider galaxy with their chosen acolytes, their fates left entirely in their own hands.

Should space become profitable, corporations will most certainly start moving humans and machines out there. Will conditions for such laborers be better than on Earth, or will it be a case of the old phrase “same song, different day”? While more robots and other artificial devices will be required to literally mine the Final Frontier, corporations may still find humans overall much cheaper to employ for collecting resources and maintaining the various services envisioned in space. That calculation would also include the cost of replacing such laborers when the situation calls for it.

Adapting Ourselves to the Reality

As we are such small creatures compared to the vastness of space and its many wild and dangerous environments, would it perhaps make more sense to change ourselves rather than try to make other worlds more like Earth?

Terraforming Mars, Venus, and even the Moon has been suggested for roughly a century now as a way for humanity to expand to the stars, but would it work? At the least, converting an entire planet into one resembling what we have now could take centuries or more. And one aspect of such a project we may never be able to change is that, except for Venus, those worlds will always have substantially less mass than Earth.

Now imagine beings who could live and work directly in outer space, or on any number of moons and planetary bodies, without the need for special suits and gear. It will very likely be easier, cheaper, and quicker to adapt humans to other places via bioengineering and cyborg technologies than to change an entire world to suit our physiological needs. These adaptations would certainly ensure the survival of our species, even if our descendants become quite different from their predecessors, us. That would not be all that unusual, considering how different we are from our distant prehistoric ancestors, and rather few are put off by the fact.

As an additional incentive, note how humans already spend billions in unrelenting efforts to make themselves better in all sorts of ways. Such desires have only increased as our technologies for these desired changes continue to improve. Space may become the ultimate reason for human durability and advancement. Perhaps this is all part of the process of our evolution, only we are facilitating the matter faster than nature has done in the past and in the directions we want it to go.

Science fiction most often envisions interstellar vessels with human crews as their defining feature and purpose. Even when a capable AI is included, it is the humans who run the show. However, just as autonomous machines have long been our first and continuing “ambassadors” to other worlds in our Sol system, so too will we likely see even more advanced versions plying their way to other star systems in the galaxy.

Artilects will be the “crew” of choice not only due to their multiple levels of durability and longevity, but also because their artificial minds will be able to process and comprehend far more than any human mind could, perhaps functioning even better than a cybernetically enhanced organic human brain. This will be a vital advantage in a galaxy of unknown factors, including when they encounter other minds that may be unlike anything we have ever dealt with before. Then their roles as ambassadors from our realm will be much more than just a clever designation.

We should not be disheartened by the fact that exploring and settling space with our species as it currently is may not be the best way to go in the really long term. Instead, with our new capabilities and knowledge, humanity can supersede what we once thought was the best way to expand into the Universe and do so in a way that will ensure our survival and success.

So, Rocket Man, you may have been ultimately correct about the expansion of humanity into space, but for reasons rather different from any you could have imagined. No offense intended: we are all products of our time and place, and you did highlight some important issues regarding permanent space utilization and settlement. The good thing is that we can and will evolve our collective understanding of existence, which in turn will allow us to adapt for the future, wherever it may be.

