GDEM: Mission of Gravity

If space is infused with ‘dark energy,’ as seems to be the case, we have an explanation for the continuing acceleration of the universe’s expansion. Or to speak more accurately, we have a value we can plug into the universe to make this acceleration happen. Exactly what causes that value remains up for grabs, and indeed frustrates current cosmology, for something close to 70 percent of the total mass-energy of the universe must consist of dark energy to make all this work. Add in the mystery of ‘dark matter’ and we actually see only some 4 percent of the cosmos.

So there’s a lot out there we know very little about, and I’m interested in mission concepts that seek to probe these areas. The conundrum is fundamental, for as a 2017 study from NASA’s Innovative Advanced Concepts office tells me, “…a straightforward argument from quantum field theory suggests that the dark energy density should be tens of orders of magnitude larger than what is observed.” Thus we have what is known as a cosmological constant problem, for the observed values depart from what we know of gravitational theory and may well be pointing to new physics.

The report in question comes from Nan Yu (Jet Propulsion Laboratory) and collaborators, a Phase I effort titled “Direct probe of dark energy interactions with a Solar System laboratory.” It lays out a concept called the Gravity Probe and Dark Energy Detection Mission (GDEM) that would involve a tetrahedral constellation of four spacecraft, a configuration that allows gravitational force measurements to be taken in four simultaneous directions. These craft would operate about 1 AU from the Sun while looking for traces of a field that could explain the dark energy conundrum.

Now JPL’s Slava Turyshev has published a new paper as part of a NIAC Phase II study advancing the GDEM concept. I’m drawing on the Turyshev paper as well as the work of the original NIAC collaboration, which is now finalizing its Phase II report. Let’s begin with a quote from the Turyshev paper on where we stand today, one that points to critical technologies for GDEM:

Recent interest in dark matter and dark energy detection has shifted towards the use of experimental search (as opposed to observational), particularly those enabled by atom interferometers (AI), as they offer a complementary approach to conventional methods. Situated in space, these interferometers utilize ultra-cold, falling atoms to measure differential forces along two separate paths, serving both as highly sensitive accelerometers and as potential dark matter detectors.

Thus armed with technology, we face the immense challenge of such a mission. The key assumption is that the cosmological constant problem will be resolved through the detection of light scalar fields that couple to normal matter. Like temperature, a scalar field has no direction but only a magnitude at each point in space. This field, assuming it exists, would need a mass low enough for it to couple to the particles of the Standard Model of physics with about the same strength as gravity. Were we to identify such a field, we would move into the realm of so-called ‘fifth forces,’ a force to be added to gravity, the strong nuclear force, the weak nuclear force and electromagnetism.

Can we explain dark energy by modifying General Relativity? Consider that dark energy’s effects are not detectable with current experiments, meaning that if dark energy is out there, its traces are suppressed on the scale of the Solar System. If they were not, the remarkable success scientists have had in validating GR would not have happened. A workable theory, then, demands a way to make the interaction between a dark energy field and normal matter depend on where it occurs. The term being used for this is screening.

We’re getting deep into the woods here, and could go further still with an examination of the various screening mechanisms discussed in the Turyshev paper, but the overall implication is that the coupling between matter and dark energy could change in regions where the density of matter is high, accounting for our lack of detection. GDEM is a mission designed to detect that coupling using the Solar System as a laboratory. In Newtonian physics, the gravitational gradient tensor (CGT), which governs how the gravitational force changes in space, has a trace of exactly zero in vacuum regions devoid of mass, a result consistent with General Relativity.

But in the presence of a dark energy field, the CGT trace would differ from zero. The discovery of such a variance from our conventional view of gravity would be revolutionary, and would support theories deviating from General Relativity.
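The trace condition is easy to verify numerically. Here is a minimal sketch of my own (illustration only, not mission code): the trace of the Hessian of a pure inverse-square potential vanishes away from mass, while adding a small Yukawa-type term, a common stand-in for a hypothetical scalar field whose range I have chosen arbitrarily, leaves a nonzero trace.

```python
# Numerical check: inverse-square gravity gives a traceless gradient
# tensor away from mass; a small Yukawa-type "fifth force" term does not.
import math

def hessian_trace(phi, point, h=1e-3):
    """Sum of second partial derivatives of phi at point (central differences)."""
    tr = 0.0
    for i in range(3):
        p_hi = list(point); p_hi[i] += h
        p_lo = list(point); p_lo[i] -= h
        tr += (phi(p_hi) - 2.0 * phi(point) + phi(p_lo)) / h**2
    return tr

GM = 1.0
L = 0.5  # range of the hypothetical scalar field (arbitrary units)

def newton(p):
    r = math.sqrt(sum(x * x for x in p))
    return -GM / r

def newton_plus_yukawa(p):
    r = math.sqrt(sum(x * x for x in p))
    return -GM / r * (1.0 + 0.01 * math.exp(-r / L))

pt = [1.0, 0.7, -0.3]
print(hessian_trace(newton, pt))             # ~0: inverse-square is traceless
print(hessian_trace(newton_plus_yukawa, pt)) # nonzero: a fifth-force signature
```

The point of the exercise: the signal GDEM hunts for is not a force itself but a departure from zero in a quantity that ordinary gravity keeps at zero.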

Image: Illustration of the proposed mission concept – a tetrahedral constellation of spacecraft carrying atomic drag-free reference sensors flying in the Solar system through special regions of interest. Differential force measurements are performed among all pairs of spacecraft to detect a non-zero trace value of the local field force gradient tensor. A detection of a non-zero trace, and its modulations through space, signifies the existence of a new force field of dark energy as a scalar field and shines light on the nature of dark energy. Credit: Nan Yu.

The constellation of satellites envisioned for the GDEM experiment would fly in close formation, separated by several thousand kilometers in heliocentric orbits. They would use high-precision laser ranging systems and atom-wave interferometry, measuring tiny changes in motion and gravitational forces, to search for spatial changes in the gravitational gradient, theoretically isolating any new force field signature. The projected use of atom interferometers here is vital, as noted in the Turyshev paper:

GDEM relies on AI to enable drag-free operations for spacecraft within a tetrahedron formation. We assume that AI can effectively measure and compensate non-gravitational forces, such as solar radiation pressure, outgassing, gas leaks, and dynamic disturbances caused by gimbal operations, as these spacecraft navigate their heliocentric orbits. We assume that AI can compensate for local non-gravitational disturbances at an extremely precise level…

From the NIAC Phase 1 report, I find this:

The trace measurement is insensitive to the much stronger gravity field which satisfies the inverse square law and thus is traceless. Atomic test masses and atom interferometer measurement techniques are used as precise drag-free inertial references while laser ranging interferometers are employed to connect among atom interferometer pairs in spacecraft for the differential gradient force measurements.

In other words, the technology should be able to detect the dark energy signature while nulling out local gravitational influences. The Turyshev paper develops the mathematics of such a constellation of satellites, noting that elliptical orbits offer a sampling of signals at various heliocentric distances, which improves the likelihood of detection. Turyshev’s team developed simulation software that models the tetrahedral spacecraft configuration while calculating the trace value of the CGT. This modeling, along with the accompanying analysis of spacecraft dynamics, demonstrates that the needed gravitational observations are within range of near-term technology.
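In the same spirit, a toy calculation of mine shows how differential acceleration measurements between spacecraft pairs recover the gradient tensor’s trace. The geometry and scales below are illustrative only, not GDEM’s actual baselines.

```python
# Toy constellation measurement: differential accelerations between a
# reference craft and three companions offset along orthogonal axes
# recover the diagonal of the gravity gradient tensor; their sum (the
# trace) vanishes for pure inverse-square gravity.
import math

GM = 1.0

def accel(p):
    """Newtonian acceleration, -GM * r_vec / |r|^3."""
    r = math.sqrt(sum(x * x for x in p))
    return [-GM * x / r**3 for x in p]

ref = [1.0, 0.0, 0.0]   # reference spacecraft position
d = 1e-5                # spacecraft separation (arbitrary units)
offsets = [[d, 0, 0], [0, d, 0], [0, 0, d]]

trace = 0.0
for i, off in enumerate(offsets):
    companion = [ref[j] + off[j] for j in range(3)]
    # one baseline yields one diagonal element of the gradient tensor
    trace += (accel(companion)[i] - accel(ref)[i]) / d

print(trace)  # ~0 for an inverse-square field; a dark energy field would shift it
```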

Turyshev sums up the current state of the GDEM concept this way:

…the Gravity Probe and Dark Energy Detection Mission (GDEM) mission is undeniably ambitious, yet our analysis underscores its feasibility within the scope of present and emerging technologies. In fact, the key technologies required for GDEM, including precision laser ranging systems, atom-wave interferometers, and Sagnac interferometers, either already exist or are in active development, promising a high degree of technical readiness and reliability. A significant scientific driver for the GDEM lies in the potential to unveil non-Einsteinian gravitational physics within our solar system—a discovery that would compel a reassessment of prevailing gravitational paradigms. If realized, this mission would not only shed light on the nature of dark energy but also provide critical data for testing modern relativistic gravity theories.

So we have a mission concept that could detect dark energy within our Solar System by measuring deviations from the classic Newtonian gravitational field. And GDEM is hardly alone as scientists work to establish the nature of dark energy. This is an area that has fostered astronomical surveys as well as mission concepts, including the Nancy Grace Roman Space Telescope, the European Space Agency’s Euclid, the Vera Rubin Observatory and the DESI collaboration (Dark Energy Spectroscopic Instrument). GDEM extends and complements these efforts as a direct probe of dark energy, one that could further our understanding after the completion of these surveys.

There is plenty of work here to bring a GDEM mission to fruition. As Turyshev notes in the paper: “Currently, there is no single model, including the cosmological constant, that is consistent with all astrophysical observations and known fundamental physics laws.” So we need continuing work on these dark energy scalar field models. From the standpoint of hardware, the paper cites challenges in laser linking for spacecraft attitude control in formation, maturation of high-sensitivity atom interferometers and laser ranging with one part in 10¹⁴ absolute accuracy. Identifying such technology gaps in light of GDEM requirements is the purpose of the Phase II study.

As I read this, the surveys currently planned should help us home in on dark energy’s effects on the largest scales, but its fundamental nature will need investigation through missions like GDEM, which would open up the next phase of dark energy research. The beauty of the GDEM concept is that it does not depend upon a single model, and the NIAC Phase I report notes that it can be used to test any modified gravity models that could be detected in the Solar System. As to size and cost, this looks like a Large Strategic Science Mission, what NASA used to refer to as a Flagship mission, about what we might expect from an effort to answer a question this fundamental to physics.

The paper is Turyshev et al., “Searching for new physics in the solar system with tetrahedral spacecraft formations,” Phys. Rev. D 109, 084059 (25 April 2024) (abstract / preprint).

White Holes: Tunnels in the Sky?

It’s good now and then to let the imagination soar. Don Wilkins has been poking into the work of Carlo Rovelli at the Perimeter Institute, where the physicist and writer explores unusual ideas, though perhaps none so exotic as white holes. Do they exist, and are there ways to envision a future technology that can exploit them? A frequent contributor to Centauri Dreams, Don is an adjunct instructor of electronics at Washington University, St. Louis, where he continues to track research that may one day prove relevant to interstellar exploration. A white hole offers the prospect of even a human journey to another star, but turning these hypothesized objects into reality remains an exercise in mathematics, although as the essay explains, there are those exploring the possibilities even now.

by Don Wilkins

Among the many concepts for human interstellar travel, one of the more provocative is an offspring of Einstein’s theories, the bright twin of the black hole, the white hole. The existence of black holes (BH), the ultimate compression stage for aging stellar masses above three times the mass of our sun, is announced by theory and confirmed by observation. White holes, the matter spewing counterparts of BHs, escape observation but not the explorations of theorists.

Carlo Rovelli, an Italian theoretical physicist and writer, now the Distinguished Visiting Research Chair at the Perimeter Institute, discusses all this in a remarkably brief book called, simply, White Holes (Riverhead Books, 2023) wherein he travels in company with Dante Alighieri, another author with experience at descents into perilous places. Rovelli makes two remarkable assertions. [1]

1) Rovelli states that another scientist, David Finkelstein, demonstrated that Einstein and other analysts were incorrect when they depicted what occurs as one enters a black hole. From the Finkelstein paper (citation below):

The gravitational field of a spherical point particle is then seen not to be invariant under time reversal for any admissible choice of time coordinate. The Schwarzschild surface, r=2m is not a singularity but acts as a perfect unidirectional membrane: causal influences can cross it but only in one direction. [2]

In other words, no time dilation, no spaghettification of trespassers entering a black hole. Schwarzschild’s solution only applies to distant observers; it does not describe the observer crossing the event horizon of the black hole.

2) Rovelli believes in the existence of white holes. His white hole is born when the black hole compresses its constituent parts into the realm of quantum mechanics. Rovelli speculates “… a black hole … quantum tunnels into a white one on the inside – and the outside can stay the same.”

In Figure 1, reflecting Rovelli’s intuition, a quantum mesh separates the black hole and white hole. At these minute dimensions, quantum tunneling drives matter out of the black hole, into the mouth of the white hole and back into the Universe.

Figure 1. Relationship between a black hole and a white hole. Credit: C. Rovelli/Aix-Marseille University; adapted by APS/Alan Stonebraker.

The exterior of a black hole and the exterior of a white hole are geometrically identical regardless of the direction of time. It is the horizon that is not reversible under time’s flow, and so it is in their interiors, one the time-reverse of the other, that the two objects differ.

In a paper he co-authored with Hal Haggard, Rovelli writes:

We have constructed the metric of a black hole tunneling into a white hole by using the classical equations outside the quantum region, an order of magnitude estimate for the onset of quantum gravitational phenomena, and some indirect indications on the effects of quantum gravity. [3]

Haggard and Rovelli acknowledge that the calculations do not result from first principles. A full theory of quantum gravity would supply that requirement.

Figure 2: Artist rendering of the black-to-white-hole transition. Credit: F. Vidotto/University of the Basque Country. [9]

Efforts to design a stable wormhole require buttressing the entrance or mouth of the wormhole with prodigious amounts of a hypothesized material, negative matter. Although minute amounts have been claimed to form in the narrow confines of a Casimir device, ideas on how to manufacture planetary-sized masses of negative matter are elusive. [4]

According to recent research, the stability of the WH is dependent upon which of the two major families of matter, bosons or fermions, forms the WH. Bosons are subatomic particles which obey Bose-Einstein statistics and whose spin quantum number has an integer value (0, 1, 2, …). Photons, gluons, and the W and Z bosons of the weak interaction are all bosons. The graviton, if it exists, is a boson. Theoretical analysis of stable traversable WHs founded on bosonic fields demonstrates a need for vast amounts of negative matter to hold open the mouth of a WH.

The other family, the fermions, have odd half-integer spins (1/2, 3/2, etc.). These particles, among them electrons, muons, neutrinos, and various compound particles, obey the Pauli Exclusion Principle. It is this family that a team of researchers employs to describe a two-fermion stable white hole [5]. Their configuration produces John Wheeler’s “charge without charge”, where an electric field is trapped within the structure without any physical electric charge present. The opening in the white hole, a few hundred Planck lengths across (a Planck length is 1.62 × 10⁻³⁵ meters), would be too small to pass even gamma rays.

Rovelli reenters the discussion here. [6] The James Webb Space Telescope has identified large numbers of black holes in the early Universe, more than anticipated. Rovelli describes white holes forming from these black holes as Planck-length-sized, chargeless entities, unable to interact with matter except through gravity. In other words, the descendants of the early black holes manifest as the material we describe as dark matter. Rovelli is working on a quantum sensor to detect these white holes.

If white holes can be detected, it might become possible to capture one. John G. Cramer, professor emeritus of physics at the University of Washington in Seattle, suggests accelerating the wormhole to almost the speed of light. [7] Aimed at Tau Ceti, he predicts:

The arrival time as viewed through a wormhole is T’ = T/γ, where γ is the Lorentz factor [γ = 1/√(1 − v²/c²)] and v is the wormhole-end velocity after acceleration. For reference, the maximum energy protons accelerated at CERN LHC have a Lorentz factor of 6,930. Thus, the arrival time at Tau Ceti of an LHC-accelerated wormhole-end would be 15 hours….Effectively, the accelerated wormhole becomes a time machine, connecting the present with an arrival far in the future.
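Cramer’s arithmetic checks out on the back of an envelope. In this quick sketch, the Tau Ceti distance of roughly 11.9 light years is my assumption; it is not given in the quote.

```python
# Back-of-envelope check of Cramer's arrival-time figure.
import math

def lorentz_gamma(v_over_c):
    # the standard special-relativistic Lorentz factor
    return 1.0 / math.sqrt(1.0 - v_over_c**2)

distance_ly = 11.9   # Tau Ceti, approximate (my assumption)
gamma = 6930.0       # LHC protons, as quoted by Cramer

# A near-lightspeed trip takes ~11.9 years in the outside frame, but
# only T/gamma as experienced through the wormhole:
hours = distance_ly * 365.25 * 24 / gamma
print(f"about {hours:.0f} hours")  # matches Cramer's ~15 hours
```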

Spraying accelerated electrons through the wormhole could expand the mouth to a size where it could be used as a sensor portal into another star system. The wormhole becomes a multi light-year long periscope, one that scientists could bend and twist to study up close and in detail the star and its companions. Perhaps the wormhole could be expanded enough to pass larger, physical bodies.

Constantin Aniculaesei and an international team of researchers may have overcome the need for an accelerator as large as the LHC to boost the wormhole to useful speed [8]. Using a novel wakefield accelerator, in which an intense laser pulse focused into a plasma excites nonlinear plasma waves that trap and accelerate electrons, the team produced 10 gigaelectronvolt (GeV) electron bunches. The wakefield accelerator was only ten centimeters long, although a petawatt laser was needed to excite the wakefields.

Cramer hypothesizes that fermionic white holes formed immediately after the Big Bang and may arrive today among cosmic rays. The gateways to the stars could be found in the cosmic ray bombardment of the Earth, or possibly trapped in meteorites. The heavy particles, if ensnared on Earth, would probably sink to the center of the planet.

All that is needed to find a fermionic white hole, Cramer suggests, is a mass spectrometer. But let me quote him on this:

[Wormholes] might be super-heavy components of cosmic rays….They might be trapped in rocks and minerals….In a mass spectrograph, they could in principle be pulled out of a vaporized sample by an electric potential but would be so heavy that they would move in an essentially undeflected straight line in the magnetic field. …wormholes might still be found in meteorites that formed in a gravity free environment.

The wormhole is essentially unaffected by a magnetic field. A mass detector would point to an invisible mass. The rest, as non-engineers like to say, is merely engineering.

If this line of reasoning is correct – a very large if – enlarged white holes could pass messages and matter through tunnels in the sky to distant stars.

References

1. Carlo Rovelli, translation by Simon Carnell, White Holes, Riverhead Books, USA, 2023

2. David Finkelstein, Past-Future Asymmetry of the Gravitational Field of a Point Particle, Physical Review, 110, 4, pages 965–967, May 1958, 10.1103/PhysRev.110.965

3. Hal M. Haggard and Carlo Rovelli, Black hole fireworks: quantum-gravity effects outside the horizon spark black to white hole tunneling, 4 July 2014, https://arxiv.org/pdf/1407.0989.pdf

4. Matt Visser, Traversable wormholes: Some simple examples, arXiv:0809.0907 [gr-qc], 4 September 2008.

5. Jose Luis Blázquez-Salcedo, Christian Knoll, and Eugen Radu, Traversable Wormholes in Einstein-Dirac-Maxwell theory, arXiv:2010.07317v2, 12 March 2022.

6. What is a white hole? – with Carlo Rovelli, The Royal Institution, https://www.youtube.com/watch?v=9VSz-hiuW9U

7. John G. Cramer, Fermionic Traversable Wormholes, Analog Science Fiction & Fact, January/February 2022.

8. Constantin Aniculaesei, Thanh Ha, Samuel Yoffe, et al., The Acceleration of a High-Charge Electron Bunch to 10 GeV in a 10-cm Nanoparticle-Assisted Wakefield Accelerator, Matter and Radiation at Extremes, 9, 014001 (2024), https://doi.org/10.1063/5.0161687

9. Carlo Rovelli, Black Hole Evolution Traced Out with Loop Quantum Gravity, Physics, 11, 127, 10 December 2018, https://physics.aps.org/articles/v11/127

Building the Gravitational Machine

A friend and I were sitting in a diner some time back talking mostly about old movies (my passion is for black-and-white films from 1927 to the death of Bogart in 1957). Somehow the topic of gravity came up, I suspect because we had homed in on early ’50s science fiction films. Anyway, I remember his eyebrows rising when I mentioned how puny a force gravity is. I can understand why. We think about massive objects when we think about gravity, but of course it takes a lot of mass to get a little gravity.

In fact, gravity is some 10³⁸ times weaker than the strong force that holds atomic nuclei together, easily illustrated by pointing out to my friend that I was overcoming an entire planet’s worth of gravity by lifting the salt shaker on the table. I learned from Greg Matloff and Eugene Mallove’s The Starflight Handbook that despite Freeman Dyson’s early interest in using the gravitational force to capture energy from astronomical objects, it was Stanislaw Ulam who first pondered the idea in print.

Now Ulam is an interesting figure, a name that resonates on Centauri Dreams in the context of nuclear pulse propulsion, which he first analyzed as far back as 1947 in a report for Los Alamos Scientific Laboratory. This grew into the Project Orion concept, with nuclear bombs exploded behind a flat steel plate, the crew protected by the mother of all shock absorbers. Ulam’s work on gravitational machines, however, analyzed how much energy could be extracted in a three-body system, one of which was a rocket, and what kind of velocities such a rocket could attain.

Image: Stanislaw Ulam (1909-1984), well known for his work on Orion, but also an early analyst of the extraction of gravitational energy.

Freeman Dyson’s notion, explained in the paper “Gravitational Machines” that we looked at last week, was to extract energy from a binary star system, as shown in the figure from the paper below. The ever imaginative Dyson, remember, was captivated by the possibilities of engineering on the part of advanced civilizations, whose works we might observe in the form of technosignatures. Here, the idea involves two stars, each of the Sun’s mass, revolving around a common barycenter. A spacecraft can be injected into an orbit that maximizes the gravitational effect, as Dyson explains:

The exploiters of the device are living on a planet or vehicle P which circles around the double star at a distance much greater than R. They propel a small mass C into an orbit which falls toward the double star, starting from P with a small velocity. The orbit of C is computed in such a way that it makes a close approach to B at a time when B is moving in a direction opposite to the direction of arrival of C. The mass C then swings around B and escapes with greatly increased velocity. The effect is almost as if the light mass C had made an elastic collision with the moving heavy mass B. The mass C will arrive at a distant point Q with velocity somewhat greater than 2V.

Image: This is Figure 1 from the Dyson paper. Caption: The solid line indicates the orbit of A and B; the dashed line indicates the orbit of C. Credit: Freeman Dyson.

Two options open up as we reach point Q:

At Q the mass C may be intercepted and its kinetic energy converted into useful form. Alternatively the device may be used as a propulsion system, in which case C merely proceeds with velocity 2V to its destination. The destination might be a similar device situated very far away, which brings C to rest by the same mechanism working in reverse.
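Dyson’s elastic-collision analogy can be caricatured in one dimension. This is my own illustration with arbitrary numbers: a light mass C meets heavy star B head-on while B moves toward it at orbital speed V; in B’s frame, C simply turns around, and transforming back to the rest frame yields the roughly 2V gain Dyson describes.

```python
# One-dimensional elastic-collision picture of Dyson's gravity machine.
def slingshot_exit_speed(v_in, V):
    """Light projectile at speed v_in, heavy body approaching at V (head-on)."""
    # In B's frame the closing speed is v_in + V; the swing-by reverses it;
    # adding B's own motion back gives v_in + 2V in the rest frame.
    return v_in + 2.0 * V

# A slow infall past a star with orbital speed ~600 km/s:
print(slingshot_exit_speed(10.0, 600.0))  # 1210.0 km/s, just over 2V
```

A small infall speed thus emerges amplified to “somewhat greater than 2V,” exactly as in the quote above.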

Why call this a ‘machine’? Dyson speculated that an advanced civilization could create ‘a whole ring of starting points P and end points Q’ around the same binary system, so that masses dropped into the system would emerge as a continuous stream of payloads, or cargo, or whatever. The point is that the energy source for this system is simply the gravitational potential between the two stars.

Then we can go further and extrapolate what happens as the machine continues to function. Over large timespans, the two stars will be drawn closer together, with the effect that their orbital velocity will necessarily increase. Thus the machine continues to operate, extracting energy from the system until the stars close to so tight a separation that no passage between them is possible. Dyson puts this limit where the distance between the centers of the two stars is four times the radius of each star.

Dyson calculated that the luminous energy radiated by Sun-like stars in a three-body system like this would be a more practical source than gravitational energy, but white dwarfs, far less luminous than the Sun, would ramp up the gravitational energy by a factor of a hundred. So there’s an interesting technosignature for you, a search for white dwarf binaries with the parameters defined by Dyson, marking a system that could accelerate objects to 2000 kilometers per second without any propellant.

The ever imaginative Dyson thought such a system of white dwarf binaries scattered around the galaxy could serve as a long-haul freight transportation network. More significantly, he went on to consider the more condensed form of star known as the neutron star, which at the time of writing was still a theoretical concept. “[T]he fact that none has yet been observed does not argue strongly against their existence.” And of course, it would not be long before Jocelyn Bell and Antony Hewish found the first pulsar in 1967.

If we were to choose a pair of white dwarf stars as our binary system, Greg Matloff notes in The Starflight Handbook, we might reach velocities of 0.009 c. This is roughly 2700 kilometers per second, not bad given our Voyager 1 travel speed of a mere 17.1 kilometers per second. Even so, it’s a long way to Proxima Centauri. If we could work with a pair of neutron stars, according to the calculations Dyson made, we might reach 0.27 c, or almost 81,000 kilometers per second. Now we’re moving out, reaching Proxima in a couple of decades. Then, of course, we’ve got to slow down.
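The arithmetic is worth checking. In this short sketch, Proxima’s distance of roughly 4.24 light years is my figure, not taken from Matloff or Dyson.

```python
# Checking the quoted speeds and the Proxima travel time.
C_KM_S = 299_792.458   # speed of light in km/s

v_wd = 0.009 * C_KM_S  # white dwarf binary machine
v_ns = 0.27 * C_KM_S   # neutron star binary machine

print(f"{v_wd:.0f} km/s")   # ~2698, the "roughly 2700" above
print(f"{v_ns:.0f} km/s")   # ~80944, the "almost 81,000" above

years_to_proxima = 4.24 / 0.27   # distance in ly over speed in c
print(f"{years_to_proxima:.1f} years")  # ~15.7, hence "a couple of decades"
```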

Adds Dyson:

…it may be said that the dynamics of stellar systems, under conditions in which gravitational radiation is important, is a greatly neglected field of study. In any search for evidences of technologically advanced societies in the universe, an investigation of anomalously intense sources of gravitational radiation ought to be included.

What an extraordinary thinker Dyson was! I look forward to the recent essay collection “Well, Doc, You’re In”: Freeman Dyson’s Journey through the Universe (MIT Press, 2022), just arrived here and placed at the top of my stack of necessary reading. Meanwhile, it’s intriguing to take the subject further still. Although Dyson didn’t push in this direction, Greg Benford has examined how truly advanced civilizations might create a different kind of gravitational machine to enable communications systems that would make using the electromagnetic spectrum seem quaint. More on that soon.

Freeman Dyson’s Gravitational Machines

What an intriguing thing to find Freeman Dyson’s “Gravitational Machines” paper popping up on arXiv. This one is yet another example of Dyson’s prescience, for in it he examines, decades before the actual event, how gravitational waves could be produced and detected, although he uses neutron stars rather than black holes as his focus. Fair enough. When this was written, in 1962, black holes were far more conjectural than they appear in most of the scientific literature today.

But what a tangled history this paper presents. First of all, how does a 1962 paper get onto arXiv? A quick check reveals the uploader as David Derbes, a name that should resonate with Dyson purists. Derbes (University of Chicago Laboratory Schools, now retired) is the power behind getting Dyson’s lectures on quantum electrodynamics, first given at Cornell in 1951, into print in the volume Advanced Quantum Mechanics (World Scientific Publishing, 2007). He’s also an editor on Sidney Coleman’s Lectures on Relativity (Cambridge University Press, 2022) and has written a number of physics papers.

“Gravitational Machines” has been hard to find. Dyson wrote it, according to my polymath friend Adam Crowl, for the Gravitational Research Foundation in 1962; Centauri Dreams regular Al Jackson corroborates this in an email exchange, noting that the GRF was created by one Roger Babson, who offered a prize for such papers. Astrophysicist Alastair G. W. Cameron added it to his early SETI tome Interstellar Communications: A Collection of Reprints and Original Contributions (W. A. Benjamin, 1963). The paper, a tight six pages, does not appear in the 1996 volume Selected Papers of Freeman Dyson with Commentary (American Mathematical Society, 1996).

So we can be thankful that David Derbes saw fit to post it on arXiv. Al Jackson noted in his email that Greg Benford and Larry Niven have used Dyson’s gravitational concepts in their work, so I suspect “Gravitational Machines” was a paper known to them at this early stage of their careers. A recent phone call with Jim Benford also reminded me of the Dyson paper’s re-emergence. Listen to Dyson’s familiar voice in 1962:

The difficulty in building machines to harness the energy of the gravitational field is entirely one of scale. Gravitational forces between objects of a size that we can manipulate are so absurdly weak that they can scarcely be measured, let alone exploited. To yield a useful output of energy, any gravitational machine must be built on a scale that is literally astronomical. It is nevertheless worthwhile to think about gravitational machines, for two reasons. First, if our species continues to expand its population and its technology at an exponential rate, there may come a time in the remote future when engineering on an astronomical scale will be both feasible and necessary. Second, if we are searching for signs of technologically advanced life already existing elsewhere in the universe, it is useful to consider what kinds of observable phenomena a really advanced technology might be capable of producing.

There’s the Dysonian reach into the far future, sensing where exponential technology growth might lead a civilization, and speculating at the most massive scale on the manipulation of matter as a form of engineering. But here too is the Dyson of ‘Dyson Sphere’ fame, tackling the question of whether or not such a project would be observable if undertaken elsewhere in the cosmos, just as he would go on to bring numerous other ideas on ‘technosignatures’ to our consciousness. Hence the term ‘Dysonian SETI,’ which I’ve often used here on Centauri Dreams.

Dyson goes on to speculate on the nature of eclipsing white dwarf binaries and their output of gravitational radiation, working the math to demonstrate the strength of such systems in terms of gravitational wave output, and finding that the output might be detectable. However, what catches his eye next is the idea of neutron star binaries, although he notes that at the time of writing, these objects were entirely hypothetical. But if they did exist (they do), their gravitational output should be “interesting indeed.”

…the loss of energy by gravitational radiation will bring the two stars closer with ever-increasing speed, until in the last second of their lives they plunge together and release a gravitational flash at a frequency of about 200 cycles and of unimaginable intensity.

It’s interesting that at the time Dyson wrote, Joseph Weber was mounting what must be the first attempt to detect gravitational waves, although he seems to have found nothing but instrumental noise. The LIGO (Laser Interferometer Gravitational-Wave Observatory) team would go on to cite Weber’s work following their successful detection of GW170817 in 2017, produced just as Dyson predicted by a neutron star binary. Calling such waves “a neglected field of study,” the 1962 paper adds this:

…the immense loss of energy by gravitational radiation is an obstacle to the efficient use of neutron stars as gravitational machines. It may be that this sets a natural limit of about 10⁸ cm/sec to the velocities that can be handled conveniently in a gravitational technology. However, it would be surprising if a technologically advanced species could not find a way to design a nonradiating gravitational machine, and so to exploit the much higher velocities which neutron stars in principle make possible.

At the end of the paper posted on arXiv, David Derbes adds a useful note, pointing out Dyson’s prescience in this field, and adding that he had secured Dyson’s permission to publish the article before the latter’s death. But as typical of Dyson, he also stressed that he wanted Weber’s contribution to be noted, which Derbes delivered on by inserting a footnote to that effect in the text. We can all thank David Derbes for bringing this neglected work of a masterful scientist back into wider view.

In the next post, I want to talk about how these gravitational wave energies might be exploited by the ‘machines’ Dyson refers to in the title of the paper. The paper is Dyson, “Gravitational Machines,” now available on arXiv.

Into the Maelström

“‘This,’ said I at length, to the old man — ‘this can be nothing else than the great whirlpool of the Maelström’… The ordinary accounts of this vortex had by no means prepared me for what I saw. That of Jonas Ramus, which is perhaps the most circumstantial of any, cannot impart the faintest conception either of the magnificence, or of the horror of the scene — or of the wild bewildering sense of the novel which confounds the beholder.” So wrote Edgar Allan Poe in 1841 in a short story called “A Descent into the Maelström,” reckoned by some to be an early instance of science fiction. In today’s essay, Adam Crowl explores another kind of whirlpool, armed with the tools of mathematics to take the deepest plunge imaginable, into the maw of a supermassive black hole. Adam’s always fascinating musings can be followed on his Crowlspace site.

by Adam Crowl

The European Southern Observatory’s (ESO) GRAVITY instrument is a beam combiner in the infra-red K-band that operates as a part of the Very Large Telescope Interferometer, combining infra-red light received by four different telescopes, out of the eight operated (four 8.2 metre fixed telescopes and four 1.8 metre movable telescopes).

The latest measurements of the stars orbiting the Milky Way’s Galactic Core Super-Massive Black Hole (SMBH), otherwise known as Sagittarius A* (pronounced as ‘Sagittarius A Star’), by the GRAVITY instrument have determined its mass and distance to new levels of accuracy:

R₀ = 8,275 ± 9.3 parsecs, and a mass of (4.297 ± 0.013) × 10⁶ solar masses.

Image: The galactic centre in infrared. Credit: NASA.

In round figures, that’s 27,000 light-years and 4.3 million Solar masses. The closest that light can approach a Black Hole and still escape is the Event Horizon, the spherical boundary at the Schwarzschild Radius, which works out to 2.95325 kilometres per solar mass. Thus 4.3 million solar masses is wrapped in an Event Horizon 12.7 million kilometres in radius. In aeons to come, when the Milky Way and M31 have collided and their black holes have coalesced, the combined Super Massive Black Hole (SMBH) will mass 100 million solar masses with an event horizon almost 300 million kilometres in radius.
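These scalings are easy to check. Here is a quick sanity pass in Python, using standard physical constants (the helper function name is just for illustration):

```python
# Sanity check of the Schwarzschild radius figures quoted above,
# using standard physical constants.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_sun = 1.989e30     # solar mass, kg

def schwarzschild_radius_km(mass_solar):
    """Event-horizon radius R_s = 2GM/c^2, returned in kilometres."""
    return 2 * G * (mass_solar * M_sun) / c**2 / 1e3

print(schwarzschild_radius_km(1))      # ~2.95 km per solar mass
print(schwarzschild_radius_km(4.3e6))  # Sgr A* today: ~12.7 million km
print(schwarzschild_radius_km(1e8))    # future merged SMBH: ~295 million km
```

The linear scaling with mass is what makes supermassive holes gentle at the horizon, as discussed below.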

Image: From Tales of Mystery and Imagination, by Edgar Allan Poe, with illustrations by Arthur Rackham (1935).

Into the Maelström

Mass-energy, so General Relativity tells us, puts dents into Space-time. Most concentrations of mass-energy, like stars, planets and galaxies, form shallow dents. Black Holes – like the future SMBH – go deeper, forming an inescapable waterfall of space-time inwards to their centres, the edge of which is the Event Horizon.

Light follows the curvature of space-time, traveling the shortest pathways (geodesics). At the Event Horizon the available geodesics all point towards the “middle” of the Black Hole. For particles with rest mass, like atoms, dust and space-ships, geodesics can’t be followed, merely approached, so they follow different pathways just as inexorably towards the centre.

Instead of flying radially inwards towards the future SMBH’s Centre, let’s ponder orbiting it. At most orbital distances a small mass circling a Black Hole experiences nothing different from orbiting any other large mass. Come too close to a small black hole, though, and extreme tidal forces take over, so to approach really closely without being torn to shreds, a really big Black Hole is needed. The future SMBH, massing 100 million solar masses with a Schwarzschild radius of 300 million kilometres, has very mild tidal forces at the Event Horizon, though they remain potentially significant for things as big as stars and planets.

We have multi-year ‘movies’ of stars orbiting around our SMBH, though none as close as we will explore. Close orbits get measured in multiples of M – half the Schwarzschild radius. At a radius of r = 3M space-time is so curved that the geodesics form a circle around the Black Hole. Light can thus orbit there – albeit unstably – building up to potentially extraordinary energy densities if nothing else gets in its way, forming a so-called Photon Sphere. But the centre of the Galaxy is full of dust and gas, so something is always getting in the way. Eventually even photons get so energetic they perturb each other out of the Sphere.

For particles that don’t follow geodesics, merely approximate them, the Innermost Stable Circular Orbit (ISCO) is further out, at r = 6M. Objects here travel at half the speed of light. Other shapes of orbits can dip a bit closer in, down to r = 4M. Deeper in and motion near the Black Hole is no longer “orbital”. You must point away from the Black Hole and apply thrust or in-fall is inevitable.

The equation of orbital motion from the ISCO radius (rI) all the way to the centre was only recently worked out in closed form for rotating and non-rotating (stationary) Black Holes. Previously numerical Relativity methods were used, complicating modelling of Accretion Disks around astrophysical Black Holes. The Equation of Motion of a test particle (i.e. very small mass) around a non-rotating Black Hole, which our future SMBH might approximate, is straightforward:

In the non-rotating case the solution takes the compact form r(φ) = 6M / (1 + 12/φ²), where φ is the angular distance traveled, running by convention from negative infinity to zero. I’ve plotted r against φ here:

The Red Circle at r = 3 M is the Photon Sphere and the Yellow Circle at r = 2M is the Schwarzschild Radius aka Event Horizon. In this case the plot starts at r = 5.95M with the test particle circling the Black Hole 6 times before hitting the central point. The proper time experienced by an observer spiralling into the Centre is a bit more complicated. We can parameterise x as follows to make the mathematics easier:

with χ running from a starting angle χ₀ down to 0. Then the Proper Time τ of the inspiral trajectory is:

The above equation is true for any black-hole, spinning or stationary. For a stationary Black Hole, rI = 6M, so the equation simplifies to:

But what is M? It’s the “geometrised” mass of the Black Hole, derived by multiplying the mass by G/c². Similarly the proper time comes out in “geometrised” units, so it needs to be divided by the speed of light, c, to convert to seconds.

In the case of the fall from r = 5.95M to r = 0 – with χ running from its starting value, set by the ratio 5.95/6, down to χ = 0 – the total time is τ = 1291.14M. In the case of our Galaxy’s future merged SMBH of 100 million solar masses, a proper time of M is 493 seconds. So the inspiral time is 176.6 hours, and the Event Horizon is reached with 1.32 hours to go.
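These figures can be cross-checked in a few lines. The sketch below uses the closed-form Schwarzschild inspiral r(φ) = 6M/(1 + 12/φ²) to recover the orbit count, and takes the quoted proper time of 1291.14M on trust for the unit conversion:

```python
import math

# Closed-form Schwarzschild inspiral from the ISCO (after Mummery & Balbus):
#   r(phi) = 6M / (1 + 12/phi^2),  with phi running from -infinity up to 0.
# Starting at r0 = 5.95M, how many circuits before reaching the centre?
r0 = 5.95                              # starting radius in units of M
phi0 = -math.sqrt(12 / (6 / r0 - 1))   # invert r(phi) for the start angle
orbits = abs(phi0) / (2 * math.pi)
print(orbits)                          # ~6 revolutions

# Unit conversion for the quoted proper time of 1291.14 M:
# the geometrised mass expressed in seconds is GM/c^3.
G, c, M_sun = 6.674e-11, 2.998e8, 1.989e30
M_sec = G * (1e8 * M_sun) / c**3       # ~493 s for 10^8 solar masses
hours = 1291.14 * M_sec / 3600
print(M_sec, hours)                    # ~493 s, ~177 hours
```

The six revolutions agree with the plot described above, and the GM/c³ conversion confirms that the 493-second figure corresponds to the 100-million-solar-mass merged hole, not today’s Sgr A*.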

Surviving the Plunge

Falling into a Black Hole is probably fatal. However, like any fall, it’s not gravity that’ll kill you, but the sudden stop at the end. The final destination is the concentration of mass at the very centre. As the Centre is approached, the gradient of the gravitational acceleration along the radial direction – the tidal force – becomes extreme.

Black holes are the pointy end of a spectrum of astrophysical objects. Stars exist due to their dynamic balance between the outward pressure from their fusion energy production and the inward pressure from their self gravity. When fusion energy production ends, the cores of stars begin collapsing, held above the Abyss of gravitational collapse by successive fusion energy reactions, then electron degeneracy pressure from squeezing free electrons too close together (via the Pauli Exclusion Principle), and when that isn’t enough, neutron degeneracy pressure and beyond.

Pressure is a measure of the ‘expansive’ energy packed into a volume. Dimensionally, Force/m² (Pressure) = Energy/m³ (energy per unit volume), so as the mutual gravitational squeeze pushes inwards on a mass of particles – particles pushing back against each other thanks to the Pauli Exclusion Principle for fermions (electrons, protons, neutrons and so on) – the pressure increases and increases in a feedback loop. Squeeze too hard and equilibrium is never achieved: relativity tells us that energy has mass, so the pressure itself adds to the inward pull of gravity as the particles are packed harder together. When this Gravity Squeeze – Push-Back Pressure process self-amplifies and runs away, the mass collapses ‘infinitely’ inwards, forming a Singularity. Such a Singularity cuts itself off from the rest of the Universe when it squeezes inwards past the mass’s Schwarzschild Radius, R_s = 2GM/c².

The resulting Event Horizon defines a Black Hole, by being a ‘surface of no return’ for everything, including light. Nothing escapes from within the Event Horizon. The minimum mass to cause such an inwards collapse and form a Black Hole for a mass of fermions (i.e. the same particles that make Stars, humans and space-ships) in the present day Universe is about 3 solar masses, squished into a volume smaller than 18 kilometres across.

Before we get to that point there are White Dwarfs and Neutron Stars – objects supported against collapse by Electron Degeneracy Pressure and Neutron Degeneracy Pressure, respectively. White Dwarfs are typically composed of carbon and oxygen – the ashes of helium fusion – and have observed masses anywhere between 0.1 and 1.3 Solar masses. Their radius is proportional to the inverse 1/3 power of their mass:

R* is a reference radius. For a cool white dwarf of 1 solar mass, the radius is just under 0.9 of Earth’s – about 5,600 km. A space vehicle falling from infinity on a flyby skimming such a star’s surface will rush past the lowest point of its trajectory at 6,900 km/s, experiencing over 430,000 gee of gravitational acceleration. In free-fall, however, it feels only the gradient of that acceleration – the tidal force:

In this example that comes to 0.15 gee per metre: a stretching along the radial direction to the white dwarf, pulling outwards on the near and far sides, plus a squeezing force half as strong directed laterally inwards from the sides. Easily resisted by small structures, like bodies and space-ships.
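Those flyby numbers follow from Newtonian formulas, which are adequate at white-dwarf compactness. A quick check in Python, using standard constants:

```python
import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
M_sun = 1.989e30    # solar mass, kg
g0 = 9.81           # standard gravity, m/s^2

M = 1.0 * M_sun     # white dwarf mass
r = 5.6e6           # closest approach, m (5,600 km)

v_peri = math.sqrt(2 * G * M / r)   # fall-from-infinity (parabolic) speed
accel = G * M / r**2                # gravitational acceleration at periapsis
tidal = 2 * G * M / r**3            # radial tidal gradient, per metre

print(v_peri / 1e3)   # ~6,900 km/s
print(accel / g0)     # ~430,000 gee
print(tidal / g0)     # ~0.15 gee per metre of separation
```

The lateral squeeze is GM/r³, half the radial stretch, as stated in the text.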

Neutron stars are smaller again – typically 20 km across for a 1.3 solar mass neutron star. A near-surface flyby isn’t recommended: with a similar mass packed into a radius hundreds of times smaller, the tidal forces are hundreds of millions of times stronger than in the white dwarf case. Close proximity to a magnetised neutron star is probably lethal anyway, thanks to the intense magnetic fields, long before the tidal forces rip you to shreds. Heavier neutron stars get smaller – just like white dwarfs – until they collapse completely into a black hole.

Black holes reverse the trend: the Event Horizon grows linearly with mass, and there’s no upper limit to that mass. Our future Galactic SMBH’s Event Horizon will be 295.325 million kilometres in radius, give or take. Substituting the Schwarzschild Radius equation into the tidal force equation shows that the tide at the horizon, 2GM/R_s³ = c⁶/(4G²M²), falls off with the square of the mass:

So the tidal force at the Event Horizon is 0.1 microgee per metre. The Moon could almost enter the Event Horizon peacefully…

How far into the SMBH can we, as Observers, then fall? If we can brace ourselves against 1,000 gees per metre of squeezing and stretching, then quite a long way…

Which gives a distance of 139,430 kilometres from the centre. In other words 99.953% of the way to the central Singularity.
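Both figures are quick to verify from the tidal gradient 2GM/r³. A sketch in Python:

```python
G, c = 6.674e-11, 2.998e8     # gravitational constant, speed of light (SI)
M_sun, g0 = 1.989e30, 9.81    # solar mass (kg), standard gravity (m/s^2)

M = 1e8 * M_sun               # future merged SMBH
Rs = 2 * G * M / c**2         # event horizon radius, ~2.95e11 m

# Tidal gradient at the horizon:
tidal_horizon = 2 * G * M / Rs**3
print(tidal_horizon / g0)        # ~1e-7 gee per metre

# Radius at which the gradient reaches 1,000 gee per metre:
r_limit = (2 * G * M / (1000 * g0)) ** (1 / 3)
print(r_limit / 1e3)             # ~139,000 km from the centre
print(100 * (1 - r_limit / Rs))  # ~99.95% of the way in
```

Since the horizon tide scales as 1/M², a big enough hole lets an observer coast far inside before the gradient becomes unbearable.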

What wonders might we see? Quantum Gravity has yet to give a clear answer. Traditionally an imploding mass ends in the Singularity, which is a geometrical point. But quantum particles can’t be reduced to a singular point and still retain their quantum information. One possibility, given the massively distorted space-time around the collapsing mass, is that the quantum particles all “bounce” after hitting Planck density and explode back outwards – seen by external Observers, in time-dilated fashion, as the slow leak from the Event Horizon that is Hawking Radiation. Or perhaps the particles “twist” into a higher dimension and bounce as a new Big Bang, forming another Universe – an emergence from a White Hole, since White Holes must keep expanding or else collapse back into another Black Hole.

None of those options is ‘healthy’ to be around as a flesh-and-blood Observer, so presently surviving the plunge is in doubt.

As we conclude, let’s check back in with Edgar Allan Poe, who knew a few things about terrifying plunges himself. In “A Descent into the Maelström,” he gives us what might be considered a 19th Century conception of a black hole and the journey into its bizarre interior:

“Looking about me upon the wide waste of liquid ebony on which we were thus borne, I perceived that our boat was not the only object in the embrace of the whirl. Both above and below us were visible fragments of vessels, large masses of building timber and trunks of trees, with many smaller articles, such as pieces of house furniture, broken boxes, barrels and staves. I have already described the unnatural curiosity which had taken the place of my original terrors. It appeared to grow upon me as I drew nearer and nearer to my dreadful doom. I now began to watch, with a strange interest, the numerous things that floated in our company. I must have been delirious — for I even sought amusement in speculating upon the relative velocities of their several descents toward the foam below. ‘This fir tree,’ I found myself at one time saying, ‘will certainly be the next thing that takes the awful plunge and disappears,’ — and then I was disappointed to find that the wreck of a Dutch merchant ship overtook it and went down before. At length, after making several guesses of this nature, and being deceived in all — this fact — the fact of my invariable miscalculation — set me upon a train of reflection that made my limbs again tremble, and my heart beat heavily once more.”

References

Mummery, A. & Balbus, S. “Inspirals from the innermost stable circular orbit of Kerr black holes: Exact solutions and universal radial flow,” Physical Review Letters 129, 161101 (12 October 2022).
https://doi.org/10.48550/arXiv.2209.03579

Fragione, G. and Loeb, A., “An Upper Limit on the Spin of SgrA* Based on Stellar Orbits in Its Vicinity” (2020) ApJL 901 L32
DOI 10.3847/2041-8213/abb9b4


White Paper: Why We Should Seriously Evaluate Proposed Space Drives

Moving propulsion technology forward is tough, as witness our difficulties in upgrading the chemical rocket model for deep space flight. But as we’ve often discussed on Centauri Dreams, work continues in areas like beamed propulsion and fusion, even antimatter. Will space drives ever become a possibility? Greg Matloff, who has been surveying propulsion methods for decades, knows that breakthroughs are both disruptive and rare. But can we find ways to increase the odds of discovery? A laboratory created solely to study the physics issues space drives would invoke could make a difference. There is precedent for this, as the author of The Starflight Handbook (Wiley, 1989) and Deep Space Probes (Springer, 2nd. Ed., 2005) makes clear below.

by Greg Matloff

We live in very strange times. The possibility of imminent human contraction (even extinction) is very real. So is the possibility of imminent human expansion.

On one hand, contemporary global civilization faces existential threats: global climate change, potential economic disruption from widespread application of artificial intelligence, the still-existing possibility of nuclear war, political instability at many levels, an apparently endless pandemic, and more.

Image: Gregory Matloff (left) being inducted into the International Academy of Astronautics by JPL’s Ed Stone.

On the other hand, humanity seems poised for the long-predicted but oft-delayed breakout into the solar system. United States and Chinese national space programs will compete for lunar resources. Elon Musk’s SpaceX has its sights fixed on establishing human settlements on Mars. Jeff Bezos’ Blue Origin is concentrating on construction of in-space settlements of increasing population and size.

Because of the discovery of a potentially habitable planet circling Proxima Centauri and the possibility of other worlds circling in the habitable zones of Alpha Centauri A and B, one wonders how many decades it will take for an in-space settlement with a mostly closed ecology to begin planning for a change of venue from Sol to Centauri.

Consulting the literature reveals that controlled (or partially controlled) nuclear fusion and the photon sail are today’s leading contenders to propel such a venture. But the literature also reveals that travel times of 500-1,000 years are expected for human-occupied vessels propelled by fusion or radiation pressure.

Before interstellar mission planners finalize their propulsion choice, an ethical question must be addressed. In the science-fiction story “Far Centaurus”, originally published by A. E. van Vogt in 1944, the crew of a 500-year sleeper ship to the Centauri system awakens to learn that they must go through customs at the destination. During their long interstellar transit, a breakthrough had occurred leading to the development of a hyper-fast warp drive.

We simply must evaluate all breakthrough possibilities, no matter how far-fetched they seem, before planning for generation ships. The initial crews of these ships and their descendants must be confident that they will be the first humans to arrive at their destination. Otherwise it is simply not fair to dispatch them into the void.

Recently, I was a guest on Richard Hoagland’s radio show “Beyond Midnight”. Although the discussion included such topics as the James Webb Space Telescope, panpsychism, stellar engineering and ‘Oumuamua, I was particularly intrigued when the topic of space drives came up.

Richard is especially interested in possible ramifications of Bruce E. DePalma’s spinning ball experiment, which has not been investigated in depth. He later sent along a 2014 e-mail released to the public from physics Nobel Prize winner Brian D. Josephson discussing another proposed space-drive candidate, the Nassikas thruster. Professor Josephson is of the opinion that this device is well worth further study, writing this:

The Nassikas thruster apparently produced a thrust, both when immersed in its liquid nitrogen bath, and for a short period while suspended in air, until it warmed to above the superconductor critical temperature, this thrust presenting itself as an oscillating motion of the pendulum biased in a particular direction. If this displacement is due to a new kind of force, this would be an important observation; however, until better controlled experiments are performed it is not possible to exclude conventional mechanisms as the source of the thrust.

It is in this area of controlled experiments that we need to move forward. A little research on the Web revealed that there are a fair number of candidate space drives awaiting consideration. Most of these devices are untested. DARPA, NASA and a few other organizations have applied a small amount of funds in recent years to test a few of them—notably the EMdrive and the Mach Effect Thruster.

Experimental analysis of proposed space drives has not always been done on such a haphazard basis. Chapter 13 of my first co-authored book (E. Mallove and G. Matloff, The Starflight Handbook, Wiley, NY, 1989) discusses a dedicated effort to evaluate these devices. It was coordinated by engineer G. Harry Stine, retired USAF Colonel William O. Davis (who had formerly directed the USAF Office of Scientific Research) and NYU physics professor Serge Korff.

Between 1996 and 2002, NASA operated a Breakthrough Propulsion Physics program. Marc G. Millis, who coordinated that effort, has contributed here on Centauri Dreams a discussion of the many hoops a proposed space drive must jump through before it is acknowledged as a true Breakthrough [see Marc Millis: Testing Possible Spacedrives]. These ideas were further examined in the book Marc edited with Eric Davis, Frontiers of Propulsion Science, where many such concepts were subjected to serious scientific scrutiny. When I discussed all this in emails with Marc, he responded:

“The dominant problem is the “lottery ticket” mentality (a DARPA norm), where folks are more interested in supporting a low-cost long-shot, rather than systematic investigations into the relevant unknowns of physics. In the ‘lottery ticket’ approach, interest is cyclical depending if there is someone making a wild claim (usually someone the sponsor knows personally – rather than by inviting concepts from the community). With that hype, funding is secured for ‘cheap and quick’ tests that drag out ambiguously for years (no longer quick, and accumulated costs are no longer cheap). The hype and null tests damage the credibility of the topic and interest wanes until the next hot topic emerges. That is a lousy approach.

“That taint of both the null results and ‘lottery ticket’ mentality is why the physics community ignores such ambitions. I tried to attract the larger physics community by putting the emphasis on the unfinished physics, and made some headway there. When the emphasis is on credibility (and funding available), physicists will indeed pursue such topics and do so rigorously. And they will more quickly drop it again if/when the lottery ticket advocates step up again.”

Marc advocates a strategic approach, which he tried to establish as the preferred norm at NASA BPP: identify the most ‘relevant’ open questions in physics, get reliable research done on those topics, and then let the results guide future inquiries. He believes the most relevant open questions concern the source (unknown) and deeper properties (conjectured) of inertial frames, followed by the additional unfinished physics of the coupling of the fundamental forces (including neutrino properties).

In light of this pivotal period in space history and the ever-increasing contributions of private individuals and organizations, it seems reasonable to conclude that now is an excellent time to establish a well-funded facility to continue the work of the Stine et al. team and the NASA Breakthrough Propulsion Physics program.
