With our eyes on a proposed interstellar future, we don’t want to neglect the real challenges of preserving the steps taken along the way. I’m thinking about this because of a post on an astronomy list (thanks to Larry Klaes for the pointer) by Richard Sanderson, who is curator of physical science at the Springfield Science Museum (MA). Sanderson is worried about the media upon which we store our information, and for good reason. Here’s the issue in a nutshell:
The difficulties that future historians may encounter are related to the ephemeral nature of digital information and the media used to store it. I can visit an old monastery in Europe, find a giant leather-bound astronomy book from the 17th century, blow off the dust, open it, and read the pages (provided I can read Latin). The only tools required are my eyes and hands. But imagine someone living in the 23rd or 24th century who finds an old box of computer diskettes or CDs. Even if the diskettes haven’t been corrupted and the CDs haven’t delaminated, will that person be able to access the information?
The question surfaces enough in the computer press to give at least some reassurance that it’s under active study, and of course we have projects like the Internet Archive at work, attempting to preserve such ephemera as short-lived Web sites, particularly those associated with concrete events that have their brief moment in the Sun and then disappear. Sanderson quotes Curator, a magazine for museum professionals, to the effect that while high-quality photographic film can last 500 years, unmanaged digital media have a life expectancy of only five. We have examples of books in our museums that have lasted for centuries, but in the digital age we’re constantly forced to transfer our data from one medium to another.
I can see signs of the problem in my own life. I’m using old pulp magazines, some of them collected forty years ago, some recently bought, to work on a book project. An Astounding Story: Science & Fiction in the Golden Age looks at the changing views of science presented by these magazines in the first third of the 20th Century. Anyone who has collected pulps knows that severe limitations are in play here. Not only did World War II paper drives lead to the destruction of vast stores of the magazines, but the paper they were printed on was high in acid, making their deterioration a process that is all but impossible to reverse.
Image: The August 1937 issue of Astounding Stories. Finding old issues of popular magazines is complicated by the fact that most people in their time saw no need to preserve them.
So I use the lovely old volumes with caution, otherwise keeping them stored in transparent envelopes made for the purpose. It’s remarkable to me how few of these magazines, which reflect their time and place so well, have been digitized against the day when they’re all but unusable. Simple OCR isn’t adequate, because we need the context as well as the content. We need to see the surrounding material, the ads, the illustrations, the environment of the text that informs how people in that era experienced the embedded stories and occasional science article.
Shifting media a bit, I’m also an inveterate collector of old movies. I have some 1400 films going back to the silent era and ending in the mid-1950s (1956, the year of Bogart’s final film, is usually my terminus). All of these were taped from movie channels and now reside on VHS tapes, yet another example of a technology whose day is passing. I could re-master them onto DVD, and a DVD recorder would let me reassemble many of them at far higher quality for my new HDTV, but the task is enormous. How to proceed, and when to switch?
Sanderson calls this the ‘stewardship’ problem:
Unlike books and paper archives, which only require a decent environment and some security, digital media require continual stewardship, which translates into a significant investment of time and money. As time goes on, data must be transferred over and over onto newer media so that both the information and the context (software) are preserved and made accessible using future digital equipment. The stewardship problem is exacerbated by the sheer volume of digital data being generated.
So are we entering a ‘dark age’ of science that passes along to the future a record containing large holes where digital data were mismanaged? Let’s hope not, but we have to be aware of the danger. Years ago in Reykjavik, as a UNC graduate student trying to master Old Icelandic, I examined a manuscript called the Flateyjarbók at the Háskóli Íslands (University of Iceland). The manuscript, 225 vellum leaves containing sagas of the Norse kings, historical annals and poetry, has survived since the late 14th Century and remains an exquisite testament to a people and their times. Knowing its fragility and its age, not to mention the uniqueness of its contents, we have no doubt that it needs to be preserved.
Image: A page from the Norse saga of King Sverre Sigurðsson in the Flateyjarbók, a manuscript that has endured for centuries, challenging us to find better ways to preserve data that may otherwise be lost. Credit: Árni Magnússon Institute, Reykjavik.
The trick is in understanding things that are closer to our own time. Materials that seem pedestrian to us can quickly disappear. In an era where so many journals are expanding their online presence, where peer review is being taken into new and productive realms on the Internet and the very idea of the book is being challenged by digital devices, we need to remember that science builds upon the efforts of those who have come before. As we take the first tentative steps toward an interstellar culture, we are in a transitional period in data stewardship that is fraught with peril. Let’s be careful about preserving that future history.
Keeping our eyes open over a whole range of wavelengths makes priceless science possible. Thus the new data on dark matter, culled from observations of the galactic cluster known as MACSJ0025.4-1222. The Hubble Space Telescope offered up images in the visual light range, sufficient to provide astronomers (thanks to the effects of gravitational lensing) with a map of dark matter associated with the cluster. The Chandra X-ray Observatory provided a balancing map of ordinary matter by showing us the distribution of hot gas in the cluster, the latter readily visible at the X-ray wavelengths in which Chandra works.
The result is the beautiful, if color-coded, image at the left. Here the dark matter is shown in blue, the ordinary matter in pink. The assumption is this: The two galactic clusters that formed MACSJ0025.4-1222 (each a quadrillion times the mass of the Sun) merged at titanic speeds, causing the hot gas (ordinary matter) within each to collide and slow. The dark matter, however, seems to have passed right through the collision event. The separation visible in the image supports a model of dark matter in which its particles interact little, if at all, with each other, although they are responsive to gravity.
Image: This newly studied cluster, officially known as MACS J0025.4-1222, shows a clear separation between dark and ordinary matter. This helps answer a crucial question about whether dark matter interacts with itself in ways other than via gravitational forces. Credit: X-ray(NASA/CXC/Stanford/S.Allen); Optical/Lensing(NASA/STScI/UC Santa Barbara/M.Bradac).
Located in the constellation Cetus (the Whale) at a distance of some 5.7 billion light years, this interesting cluster offers up independent evidence for an effect previously detected in 1E 0657-56, the so-called Bullet Cluster, in 2006 (but see this for John Moffat’s alternate take on the Bullet Cluster results). The latter was also formed through the collision of two large clusters, and like MACSJ0025.4-1222, showed a clear separation between normal and dark matter. Now we’re in interesting territory, for verifying the Bullet Cluster’s dark matter findings indicates they are not an exception. Have a look at the image below for clarification of the gravitational lensing technique used.
What we’re observing here is the effect of gravity from a galaxy cluster distorting the light of background galaxies, an effect predicted by Einstein. Two distorted images, representing the same galaxy, are visible above and below the actual location of the galaxy. Astronomers can use numerous such distorted images to pull together a map of gravity’s effects, thus showing the location of mass in the cluster. As we can see in the first image, most of the matter in MACSJ0025.4-1222 is dark. Image credit: NASA/CXC/M.Weiss.
Current thinking, in fact, is that dark matter is about five times more common than the ordinary matter we assumed, until not so long ago, to be the prime constituent of the universe. MACSJ0025 backs up these findings. From the spacefaring standpoint that invariably guides us in these pages, the question is again posed: Is there a kind of reaction mass in space that may offer clues to new forms of propulsion? Yet if this mass is what the researchers call “…a relatively collisionless form of dark matter,” can it ever be harnessed other than through its gravitational interactions with ordinary matter?
The paper is Bradač et al., “Revealing the properties of dark matter in the merging cluster MACSJ0025.4-1222,” accepted for publication in The Astrophysical Journal (abstract).
I like the logo for the Fermi Gamma-Ray Space Telescope, shown at the right. It’s appropriately stylish and, with that ‘beamed’ F emerging out of a galactic core, reminds us that the instrument will be opening a data window on the supermassive black holes found in such places. Fermi was until yesterday known as GLAST (the Gamma-ray Large Area Space Telescope), so the change of name moves us out of acronym territory and names the instrument in honor of one of the true pioneers of high-energy physics, as well as the author of the ever intriguing Fermi paradox.
We’ve talked about the latter in the context of the search for extraterrestrial life, wondering how Fermi’s famous ‘where are they?’ question might be answered. But the Fermi telescope, in space for just two and a half months, is giving signs of being quite a newsmaker itself, if perhaps less controversial. The image below presents a map put together from 95 hours of observation, an all-sky view showing the glow of gas and dust at gamma ray wavelengths, the result of collisions with cosmic rays. We can also see both the Crab and Vela pulsars, as well as a galaxy in Pegasus that is now undergoing a flaring episode.
Image: This all-sky view from GLAST reveals bright emission in the plane of the Milky Way (center), bright pulsars and supermassive black holes. Credit: NASA/DOE/International LAT Team.
‘First light’ from Fermi, in other words, has been remarkably productive, especially when you factor in the years it took the earlier Compton Gamma-Ray Observatory to produce a similar image. Indeed, the Large Area Telescope (LAT) aboard the spacecraft is itself thirty times more sensitive than any previous satellite detector and will be able to survey the entire sky several times each day. What’s going to be especially useful here is the fact that Fermi’s main instruments, the LAT and the GLAST Burst Monitor (GBM), will be able to detect gamma rays in a fabulously wide range of energies spanning a factor of ten million, according to the Fermi site. Between the two instruments, we’ll see more broadly than ever before in the gamma ray spectrum.
A sign of what’s to come is the fact that the GBM found 31 gamma-ray bursts in its first month of operation. Longer-lasting gamma-ray bursts (GRBs) are now thought to herald the explosive demise of massive stars, while GRBs less than two seconds in duration may be the result of the merger of two neutron stars, or the merger of a black hole and a neutron star. Fermi should be able to tell us more about the stars that produce GRBs, how gamma rays occur in the initial burst, and how jets can channel material out of a dying star. Because a single GRB can give off the same amount of energy that our Sun will radiate in its ten-billion-year lifetime, we have much to learn in that area alone, not to mention what Fermi will teach us about the processes at work in galactic cores.
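The comparison with the Sun’s lifetime output is easy to check on the back of an envelope. The sketch below uses the standard solar luminosity and an assumed ten-billion-year main-sequence lifetime; the result lands at roughly 10^44 joules, the same order as the isotropic-equivalent energy release of a typical long GRB, compressed from billions of years into seconds:

```python
# Back-of-envelope: total energy the Sun radiates over its
# main-sequence lifetime, for comparison with a gamma-ray burst.

L_SUN = 3.828e26           # solar luminosity, watts (IAU nominal value)
SECONDS_PER_YEAR = 3.156e7
LIFETIME_YEARS = 1e10      # assumed ~10-billion-year main-sequence lifetime

lifetime_energy_joules = L_SUN * LIFETIME_YEARS * SECONDS_PER_YEAR
print(f"Sun's lifetime energy output: {lifetime_energy_joules:.2e} J")
# On the order of 1e44 J -- comparable to a long GRB's
# isotropic-equivalent energy, released in mere seconds.
```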
by Kimberly Trent
Here we depart briefly from the norm by looking at the work of Kimberly Trent, a graduate student in the Applied Physics Program at the University of Michigan. Working as an intern with Marc Millis at NASA’s Glenn Research Center, Trent examined the broad issues of advanced propulsion and focused on a research topic that takes off on a Robert Forward idea from the 1960s. The goal: To develop a propulsion concept involving non-Newtonian frame-dragging effects, which Trent studies in relation to the work of Martin Tajmar. The details follow, in an article designed to show one student’s involvement in the kind of studies Tau Zero hopes to encourage at other institutions.
This past summer, I interned at the NASA Glenn Research Center in Cleveland, OH through the NASA Academy program. My individual research project was in theoretical spacecraft propulsion. This area involves research into devices and concepts such as space drives, warp drives, gravity control, and faster-than-light travel. This research began in the 1960s and was mainly carried out by private companies, individual efforts, and the Air Force. For example, Robert L. Forward, who founded Forward Unlimited and consulted for NASA and the Air Force, did studies on the possibility of generating gravity-controlling forces using an electromagnetic torus configuration with an accelerated mass flow instead of an accelerated electric current.
In the 1990s, governmental funding increased for theoretical propulsion research. One of the programs funded was the Breakthrough Propulsion Physics (BPP) Project (1996-2003) headed by Marc Millis, who was my advisor this summer. This project supported research that strove to uncover new physics that would lead to the development of more advanced propulsion methods. Since current propulsion technology is not suited for extended manned missions and interstellar travel, new physics, or a deeper understanding of the laws of nature, is needed so that more advanced propulsion methods can be developed. If new propulsion physics is discovered, a new class of technologies would result, revolutionizing spaceflight and enabling humanity to learn even more about the Universe.
Image: An artist’s conception of an interstellar ramjet. Credit: European Space Agency (ITSF)/Manchu.
On the other hand, even if a breakthrough does not exist, using the narrower goal of breakthrough propulsion introduces a different point of view for tackling the lingering unknowns of physics. Analyzing the unanswered questions of science from this perspective can provide insights that might otherwise be overlooked from just curiosity-driven science. Theoretical spacecraft propulsion research is important and useful because its minimum value, which is what can be learned along the way, can still be revolutionary. The BPP Project ended in 2003 when NASA’s priorities shifted.
Even though research has decreased in this area due to lack of funding, progress is still being made by private companies and individual effort. For example, Millis and some of the scientists who worked with him on the BPP Project, along with other scientists that he networks with around the world, continue to pursue research in their spare time. Millis founded the Tau Zero Foundation, which formally connects this group of researchers. In addition to collaborating on further research and reviewing papers, they create tools that will make it easier for the next generation of researchers to access past research in this field.
One of these tools is a textbook put together by Millis and his associates entitled Frontiers of Propulsion Science, which is pending publication through AIAA. A great amount of research in these areas has been compiled for the first time in this book. In addition, the book provides guidance for researchers who wish to enter the field, stressing to them the importance of approaching this research in a scientifically rigorous way. In order for this research to be taken seriously by critics and other scientists, Millis and his colleagues emphasize that researchers must strike a balance, conducting their study within the rigorous constraints of conventional physics while still remaining open to the possibility of results that may test or extend our understanding of those principles.
This summer, I used the manuscript of this book along with a number of other individual research papers to help me carry out my research. During the first part of the summer, I read through much of the book to familiarize myself with past and current theoretical spacecraft propulsion research. Then I decided on a research topic that was of interest to me. The topic I chose explores the non-Newtonian frame-dragging gravitational forces that have been observed near massive rotating astronomical objects, and that NASA’s Gravity Probe B has attempted to detect near the Earth. These forces can also be derived directly from Einstein’s field equations of general relativity.
Due to the small magnitude of these forces on a cosmological scale, it was thought that they could only be detected and studied through astronomy and very sensitive satellite experiments. Then in the early 2000s, Martin Tajmar started to investigate rotating ultra-cold rings cooled by liquid helium, and found that they appeared to generate macroscopic frame-dragging fields. These effects are still undergoing examination in ongoing experiments. However, preliminary mathematical analysis of these fields using Einstein’s equations shows that these frame-dragging forces and the Newtonian gravitational force may have a relationship analogous to that described by Maxwell’s equations for electricity and magnetism.
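The Maxwell-like relationship mentioned here is usually written in gravitoelectromagnetic (GEM) form. In one common convention (signs and factors of two vary across the literature, particularly for the gravitomagnetic field), the weak-field, slow-motion limit of Einstein’s equations gives:

```latex
\begin{aligned}
\nabla \cdot \mathbf{E}_g &= -4\pi G \rho_m \\
\nabla \cdot \mathbf{B}_g &= 0 \\
\nabla \times \mathbf{E}_g &= -\frac{\partial \mathbf{B}_g}{\partial t} \\
\nabla \times \mathbf{B}_g &= -\frac{4\pi G}{c^2}\,\mathbf{J}_m
  + \frac{1}{c^2}\frac{\partial \mathbf{E}_g}{\partial t}
\end{aligned}
```

Here $\rho_m$ and $\mathbf{J}_m$ are mass density and mass current, playing the roles of charge density and electric current; the $G/c^2$ coupling is what makes conventional frame-dragging effects so feeble.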
Using this preliminary analysis and these initial results from Tajmar’s experiments, I wrote a paper in which I performed a cursory frame-dragging propulsion assessment. I proposed a conceptual device similar to the gravitational version of the electromagnetic torus Robert Forward proposed in the 1960s. However, his torus was based on cosmological observations and therefore required a mass flow with the density of a dwarf star flowing through tubing with the diameter of a football field to produce a gravitational field at the center of the torus equal to 1g.
Using Tajmar’s results, rotating ultra-cold rings with a diameter of about 6 inches replaced the massive tubing and mass flow and resulted in a gravitational field with the same 1g magnitude at the center. Still, relativistic accelerations on the order of 10^11 m/s^2 would be needed even with this version of the torus. However, this analysis shows that we may be making progress towards figuring out how to make something like gravity control a possibility in the future.
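A rough sense of why conventional general relativity demands Forward’s extreme parameters can be had from the GEM analogy to a current loop, where the gravitomagnetic field at the center of a circular mass current is B_g = mu_g * I_m / (2R) with mu_g = 4πG/c². The sketch below plugs in assumed, Forward-scale numbers (white-dwarf density, football-field-scale tubing); these illustrative figures are not taken from Forward’s or Trent’s actual papers:

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8     # speed of light, m/s

def gravitomagnetic_field_center(mass_current, radius):
    """Conventional-GR gravitomagnetic field (units of 1/s) at the
    center of a circular loop carrying a steady mass current,
    by analogy with B = mu0 * I / (2R) for a magnetic current loop."""
    mu_g = 4 * math.pi * G / C**2
    return mu_g * mass_current / (2 * radius)

# Assumed illustrative parameters:
rho = 1e9                        # kg/m^3, white-dwarf-like density
tube_area = math.pi * 50.0**2    # m^2, ~100 m tube diameter
flow_speed = 1e3                 # m/s, assumed flow velocity
mass_current = rho * tube_area * flow_speed   # kg/s

B_g = gravitomagnetic_field_center(mass_current, radius=100.0)
print(f"B_g ~ {B_g:.1e} 1/s")
# Minuscule even at these extremes -- which is why any claim of
# macroscopic laboratory frame-dragging is so surprising.
```

Even with a dwarf-star-density flow, the conventional field is some twenty orders of magnitude too small to matter, which is the context in which Tajmar’s reported macroscopic effects would be extraordinary if confirmed.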
Keep in mind that this analysis was a preliminary assessment in which many assumptions were made. Its purpose was to demonstrate how the frame-dragging effect might be used for propulsion so as to stimulate additional thought and further research on this topic. At the end of my paper, the next steps for further research in this area were outlined. These include deriving equations for the torus configuration directly from Einstein’s equations so that a more exact analysis can be performed, and carrying out additional experiments on the frame-dragging effect Tajmar is observing so that it can be better understood. At the end of my summer program, Millis found the paper I wrote to be suitable for submission to the Journal of the British Interplanetary Society. We submitted the paper and are now awaiting peer review.
The latest Carnival of Space offers several posts with an interstellar bent in addition to our own discussion, linked to from the Carnival, about antimatter rocketry and the recent thinking of JPL’s Robert Frisbee. I notice that Gerald Cleaver and Richard Obousy’s ideas about warp drive continue to get play, with particular reference to the amount of energy that this purely theoretical construct might demand. As with Alcubierre’s own warp drive speculations, the energy levels are daunting, in Cleaver and Obousy’s case the equivalent of converting the planet Jupiter into energy (that actually beats many Alcubierre demands!).
Thus NextBigFuture‘s comment, rising naturally from this conundrum:
…it makes no sense to assume being able to convert a planetary mass into energy without having increased control of technology and information and increased economy. It is like assuming a group of cavemen get the designs for a supersonic plane but only have the economy of their tribe of six to fund it. The assumptions would also be that they need to transport their rock caves and the woolly mammoths and buffalo herds that they hunt.
A point well taken, and one reason why blogger Brian Wang looks to laser propulsion as an alternative, a prospect that appeals to near-term thinkers because it takes us back into the realm of known physics. Moreover, in its various manifestations, beamed power leaves the propellant at home so that the spacecraft can carry a greater payload. What we need to learn, of course, is how sails behave in space, an examination we’ve yet to begin — let’s hope SpaceX can help us get the duplicate NanoSail-D package (whose existence was revealed in these pages) onto an upcoming Falcon flight. The ill-fated Cosmos sail built by the Planetary Society was itself capable of being used for microwave beaming experiments, and the sense here is that a world of useful experimentation awaits if we can just deploy that first true sail.
Mike Simonsen at Simostronomy takes a look at recent computer simulations from Edward Thommes and team that model planetary formation, with results that some have found unsettling:
What they found is that our solar system represents the rare cases, where gas giants form, but do not migrate to the inner planetary system, and the final orbits of the planets in the system are fairly circular and stable. In many simulations, lots of gas giants formed in chaotic environments with collisions and eccentric orbits. In other simulations, plenty of smaller rocky planets formed, but hardly any gas giants materialized out of the proto-planetary disk. Only under specific, unique conditions do planetary systems like ours evolve.
We’ve discussed the Thommes work with interest here, but failed to catch a National Science Foundation interview, in which the scientist described Earth-like planets as fairly common: “…they’re almost like weeds, they’ll sprout up under almost any conditions.” The uncommon aspect of our Solar System, then, is the existence of those gas giants in their particular orbits, posing the question of what happens to Earth-like worlds when gas giants migrate inward, as they seem to do in many simulations.
In a wider context, what happens if we do find out that a planet like ours really is rare? How would we cope with the overthrow of yet another paradigm, the Copernican perspective that has us constantly assuming we are living on an average planet in an average galaxy, and that given enough time we will inevitably find other intelligent species on similar worlds? Not that the Thommes team takes us anywhere near that conclusion, but it does offer a challenging look at planet formation theories that will only be confirmed or refuted once we have the resources in space to detect small exoplanets. And let’s just say it twists the tail of the Copernican assumption rather provocatively.