A mission to the Sun’s gravity focus – or more precisely, the focal ‘line’ we might begin to use at around 650 AU – is never far from my mind. Any interstellar mission we might launch within the next thirty years or so (think Breakthrough Starshot, about which more next week) will essentially be shooting blind. We have little idea what to expect at Proxima Centauri b, if that is our (logical) target. But a mission to the solar gravity focus (SGL) would give us a chance to examine any prospective target at close hand.
Indeed, the effects are so powerful that, if we can exploit this opportunity, we should be able to see continents, weather patterns, oceans and more, provided we can disentangle the Einstein ring into which general relativity shapes the planet’s image. We’ve discussed the phenomenon many a time: the Sun’s gravitational well bends the light of whatever lies directly behind it as seen from the SGL, producing stupendous magnification and serving up the image as a ‘ring’ around the Sun, in the same way that astronomers now see some distant galaxies as rings around closer galaxies.

Image: The Einstein Ring and how we could sample it. By looking at different slices of the Einstein ring, enough information could be acquired for a computer deconvolution to reconstruct the planet. Credit: Geoffrey Landis (NASA GRC).
Within that ring there is bountiful information. Not only would we have an image we could reconstruct, but we would also have multipixel spectroscopy, allowing us to identify elements through the signature of light from the planet and to map these properties in more than one dimension. So fecund is the information in the Einstein ring that we could detect all this with a spacecraft telescope no more than a meter or so in diameter. And because the SGL focal line extends to infinity, we can keep taking observations as we move outward from 650 AU to perhaps 900 AU.
Now comes JPL scientist Slava Turyshev with a trade study – an analysis made to evaluate and select the best propulsion technique to make a flight to the SGL possible within a rational timeframe, here seen as roughly thirty years. That seems like a lot, but bear in mind that even our far-flung Voyagers have yet to reach a distance that’s even halfway to the SGL region. Remember, too, that once we find a way to propel a craft to the SGL, we have to choose a trajectory so precise that our target will be exactly opposite the Sun from the spacecraft. In this business, alignment is everything.
Each new Turyshev foray into SGL territory reminds us that this work reached Phase III status at the Jet Propulsion Laboratory under funding from NASA’s Innovative Advanced Concepts (NIAC) program. The potential showstoppers of an SGL mission are daunting, and have been examined in papers covering everything from sail design and ‘sundiver’ trajectories to deconvolution of an SGL image. Perhaps most futuristic has been the Turyshev team’s discussion of a payload launched as small packages that self-assemble en route into the completed observatory. Previous Centauri Dreams articles such as Solar Gravitational Lens: Sailcraft and Inflight Assembly or Good News for a Gravitational Focus Mission may be helpful, though the pace of stories on the SGL has been accelerating, and for the complete sequence I suggest a search in the archives.
All this is bringing me around to the scope of the propulsion problem. In addition to the need for precise positioning along the SGL focal line, the spacecraft must be able to move laterally within the image, which is of considerable size. One recent calculation found that an Earth-sized planet orbiting Epsilon Eridani (about 10.5 light years away) would project an image 12.5 kilometers in diameter at 630 AU from the Sun. One envisions multiple spacecraft taking pixel samples at various locations within the image plane. The image must then be produced by integrating these samples. This is ‘deconvolution,’ turning the Einstein ring into a coherent image free of ‘noise.’
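The scale of that image follows from simple lens geometry: the planet’s diameter shrinks by the ratio of the focal distance to the target distance. A rough check in Python, using assumed values (Earth’s diameter, Epsilon Eridani at roughly 10.5 light years) rather than anything taken from the paper itself:

```python
# Back-of-the-envelope estimate of the SGL image size for an exoplanet.
# Assumed values: an Earth-diameter planet at Epsilon Eridani (~10.5 ly),
# observed from a focal plane at 630 AU.
AU_PER_LY = 63_241           # astronomical units per light year
planet_diameter_km = 12_742  # Earth's diameter
target_distance_au = 10.5 * AU_PER_LY
focal_distance_au = 630

# The projected image scales linearly with the ratio of the two distances.
image_diameter_km = planet_diameter_km * focal_distance_au / target_distance_au
print(f"Projected image diameter: {image_diameter_km:.1f} km")  # ~12 km
```

The result lands right around the 12.5 km figure quoted above, and it also shows why moving the focal plane outward (say, to 900 AU) spreads the image over a proportionally larger region.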
As Geoffrey Landis (NASA GRC), who made this calculation, points out, the image is far larger than the spacecraft we send. Landis also notes that a one-meter telescope at the SGL collects the same amount of light as an 80-meter telescope would without the gravitational lens. So we definitely want to do this, but to make it happen, the spacecraft will need propulsion and power. All this has a bearing on payload, for in an environment where solar panels are not an option, we need a radioisotope or fission power source.

Back to the Turyshev paper. Propulsion emerges as perhaps the mission’s most significant challenge, although one the author thinks can be met. Here we run into what I call the ‘generation clock’: the desire to keep mission outcomes within the lifetimes of the researchers who launched the project. Twenty to thirty years in cruise is often mentioned in connection with the SGL mission, meaning we need the ability to reach 650 AU within that timeframe. A daunting task, for it implies an average speed of 154 kilometers per second over a 20-year cruise. On outbound trajectories we’ve yet to exceed Voyager 1’s 17.1 km/sec, which highlights the magnitude of the problem.
Image: JPL’s Slava Turyshev.
We can’t solve it with chemical rockets, not even with gravity assist strategies, but solar sails coupled with an Oberth maneuver loom large as a potential solution. Advances in materials science and the success of missions like the Parker Solar Probe remind us of the potential here, offering the option of deploying a sail in a tight perihelion pass to achieve a massive boost. To manage 650 AU in 20 years, we need an average of 32.5 AU per year. A perihelion pass at 0.05 AU (roughly 7,500,000 km) could deliver that speed, and the Parker Solar Probe has already shown that a spacecraft can survive at comparable distances from the Sun. Finding the metamaterials that would let a sail survive such a passage is an ongoing task.
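The 154 km/sec and 32.5 AU/year figures are two expressions of the same requirement, as a quick unit conversion shows (constants here are standard values, not taken from the paper):

```python
# Sanity check on the cruise-speed requirement: 650 AU in 20 years.
AU_KM = 1.496e8    # kilometers per astronomical unit
YEAR_S = 3.156e7   # seconds per year

distance_km = 650 * AU_KM
speed_kms = distance_km / (20 * YEAR_S)   # required average speed
speed_au_yr = 650 / 20                    # same requirement in AU/year

print(f"{speed_kms:.0f} km/s, i.e. {speed_au_yr:.1f} AU/yr")  # ~154 km/s, 32.5 AU/yr
# For comparison, Voyager 1's escape speed is about 17 km/s.
```

Note that this is an *average*: a sundiver trajectory front-loads its delta-v at perihelion and then coasts, so the post-Oberth speed must actually exceed this figure early in the mission.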
The paper sums the issue up:
Recent “extreme solar sailing” studies emphasize that very fast transits are achievable in principle only by combining ultra-low total areal density with very deep perihelia (a few solar radii), which moves the feasibility question from trajectory mechanics to coupled materials, thermal, and large-area deployment qualification. For example, [Davoyan et al., 2021] analyzed extreme-proximity solar sailing (≲ 5 R⊙) and discussed candidate metamaterial sail approaches together with the associated environmental and system challenges at these perihelia. These results reinforce the conclusion here: sub-20 yr sail-only access is not ruled out by physics, but it lives in a tightly coupled materials+structures+thermal qualification regime at mission scale.
So we have a lot to learn to make this happen. The paper notes that in moving from current sail readiness to what the SGL mission will demand, we go from sails in the 10-meter class up to sails as much as 300 meters in diameter, while still keeping the sail material astonishingly thin and capable of surviving perihelion temperatures. Operating at deep perihelia with metamaterials remains very low on the TRL scale, meaning technology readiness to produce and fly such a sail is nowhere near where it needs to be if we are to launch in the 2035-2040 window hoped for by mission planners. If we can launch multiple sails, we can consider self-assembly of the larger payload in transit, likewise at a very low TRL.
Importantly, this maturity gap is not a physics limit: it is a program-and-demonstration limit. A focused late-2020s/early-2030s development that couples (i) large-area deployment validation, (ii) deep-perihelion optical-property stability tests, and (iii) integrated areal-density demonstrations at the 10⁴–10⁵ m² scale could credibly raise the SGL-class sail system TRL into the mission-start window, particularly for the 25–40 yr-class access regime.

Image: Sailcraft example trajectory toward the Solar Gravity Lens. Taken from an earlier report by Turyshev et al.
Nuclear electric propulsion (NEP) offers certain advantages over solar sails, starting with the fission reactor that powers its thrusters, for as mentioned, solar power at these distances is not practical. Turyshev’s calculations make the needed comparison, yielding a mission that can reach 650 AU in 27 years, putting it in range of what the sail strategy can deliver. Using propellant remaining upon arrival at the SGL, the spacecraft can then manage the station-keeping and trajectory changes needed to collect the pixels of our exoplanet image. In terms of operations, then, as well as payload capability, NEP stands out. Note that here again we have thermal issues, for an NEP-powered craft will need its own close perihelion pass to boost velocity. Turyshev points out that NEP will also demand large, deployable radiators to shed waste heat.
Nuclear thermal propulsion (NTP) now comes into the discussion, as the author considers potential hybrid missions. In NTP, liquid hydrogen is heated by the reactor core to produce thrust through the exhaust nozzle. Capable of high specific impulse, this method is treated here as “a high-thrust injection stage,” one that could be used during an Oberth maneuver to increase the velocity of an NEP-equipped spacecraft. The nuclear issues persist: We need safety analyses and ground testing facilities for the reactor, radiological handling protocols, and additional flight approval processes.
The three propulsion options play against each other in interesting ways. Sails avoid both the flight-approval process for nuclear materials and the infrastructure needed for ground testing. But materials and deployment issues remain for these ultra-thin sails. An NEP engine that finds wider use beyond the SGL mission could lower incremental costs. And what if we tinker with mission duration? The fact remains that regardless of the choice of propulsion, we still have to operate in an environment that requires radioisotope or fission power, with all the implications for payload overhead that entails.
Programmatically, a credible 2035–2040 start requires aligning architecture choice with what can be demonstrated by the early 2030s. If minimum TOF [time of flight] is the primary requirement, solar sailing (with an explicit deep-perihelion materials and deployment qualification program) remains the most schedule-aligned approach. If delivered capability and operational robustness at the SGL dominate, NEP is uniquely attractive, but a 2035–2040 launch that depends on NEP for transportation must be preceded by an integrated stage demonstration that retires system-level coupling risks (thermal, EMI/EMC [Electromagnetic Interference / Electromagnetic Compatibility], plume, autonomy, and nuclear approval). In either case, SGL transportation should be treated as flagship-class in development complexity because the critical path runs through integrated demonstrations rather than through single-component maturity.
This is how missions get designed, and you can see how involved the process becomes long before actual hardware is built. My belief is that the grip of the generation clock is fading, for in dealing with issues like the SGL, we’re forced to contemplate scenarios in which those who plan the mission may not see its completion (although I hope Slava Turyshev is very much an exception!). In sending missions beyond the Solar System, we create gifts of data to future generations, who may well use what the SGL finds to plan missions much further afield, perhaps all the way to Proxima Centauri b.
The paper is Turyshev, “Propulsion Trades for a 2035-2040 Solar Gravitational Lens Mission,” currently available as a preprint. For more on acquisition of the lensed image, see Geoffrey Landis’ extremely useful slide presentation.



Nice article; good to see that people are continuing to think about using the solar gravitational lens.
A quick note, a little more technical detail is in the paper from the AIAA Aerospace Sciences Meeting: https://www.kiss.caltech.edu/papers/solar/papers/solar%20gravity%20lens-Landis-AIAA-2017.pdf
Like it or not, SLS Block 2 is the best option.
The author himself suggested use of both nuclear thermal upper stage and a nuclear electric bus.
I doubt solar sails will be of much utility outside of field measurements for the heliophysics guys trying to take over New Horizons.
The way to use a sail would require you to dive in very close to the sun. This will increase the thrust you get from the sail, and also multiply the delta-V you achieve, due to the Oberth effect of thrust deep in the sun’s gravitational well. However, the difficulty is that the sail will get very hot, unless it has extraordinarily high reflectivity.
Once the trajectory gets much past Mars’s orbit, the sail gets to be pretty useless.
Hi Jeff
NEP would require a space-going nuclear reactor – which doesn’t exist. Static power supplies have been proposed, but they are sub-optimal for propulsion applications. There’s lots of work on NTRs, but so much engineering knowledge is required and there’s so little political will to achieve it.
Do we need to go to the SGL at all?
The mission is unusual in that, unlike most conventional optical instruments, the focal image is far larger than the receiving telescope. The Einstein ring (ER) focal image is about 12.5 km in diameter and needs to be deconvolved to view the image. So the receiving telescope at the SGL must traverse the focal image to collect samples of the ER. Because the planet is moving, both orbiting its star and rotating on its axis, the image of the surface changes over time. Therefore, it would be better to deploy a swarm of telescopes to sample their sections of the ER at the same time and integrate those samples back on Earth.
This approach is the same as using multiple, distantly separated telescopes to integrate an image with a synthetic aperture of about their separation.
But if we can do this, why not use many more telescopes, much closer to the sun, to capture the small piece of the ER? The aperture is still about that of the sun, but the sampling sensors are a swarm situated in orbits that are approximately at the same relative position with regard to the ER. The swarm could orbit the sun, taking images of their portion of the ER as their orbits synchronize at the same relative point from the sun.
A large number of small telescopes is needed, orbiting at different angles to the sun’s equator, to capture many ER samples for the “same ER.”
IDK if this works to capture the needed information from the ER, but if it does, it would offer an advantage in that many different ERs could be captured, allowing planets and stellar surfaces to be captured from many stars, not just 1 target, as well as a much reduced time to deploy the telescopes.
All the data would be carefully linked to position and time so that it can be integrated later on terrestrial supercomputers to image the targets.
Do the telescopes even need to orbit the sun?
If we build the telescopes as statites, the synthetic aperture can be any size wanted, as long as the capture of photons is adequate. The photons wouldn’t need to be from the ER. Perhaps a maximum separation of 10 million km to create a synthetic telescope aperture of that size. The telescopes could be placed anywhere that is most operationally convenient.
We’ve seen that interferometry in optical wavelengths is possible from the imaging of the black hole. It should, in principle, be possible for a swarm of widely separated telescopes to image an exoplanet.
I agree algorithmic optical interferometry could render a SGL mission redundant, however I don’t see why a SGL would require it. A telescope scanning across a focal plane orders of magnitude larger than itself is no different than a telescope scanning the surface of a planet. Additional telescopes would speed up the scanning and make it easier to understand transient features.
If we’re talking about solar sails hundreds of metres in diameter, I wonder if it would be possible for them to do double-duty in the mission, perhaps by focusing light from a large area to a central telescope. That might reduce the amount of lateral movement needed.
Yes, or make a sail 12.5 km in diameter that does dual duty as an image collector with light sensors over its surface. This is very exciting.
Apart from NEP, note that solid-core NTP would be inadequate for the mission. NASA is currently looking into the liquid molten core CNTR option, but the specific impulse is still not quite there to meet mission requirements. Open-cycle gas-core NTP would be a better option (among others); more info here:
https://www.linkedin.com/pulse/advanced-propulsion-literature-paul-titze-9a57c/
Alex, I have wondered the same – whether many telescopes can be integrated into one using time stamps, supercomputers and perhaps AI for best fit. I suppose it could be calibrated on a well-defined object, such as a disc with known features like a light sail heading toward the target object.
@Michael
While I don’t see much development headway on propulsion, I think we can be sure that computer hardware and software (including AI) will continue to rapidly improve so that computational approaches for high-resolution images will prove to be the better approach to getting these exoplanet images. We’ve seen signs of this in previous posts and news articles. The relatively rapid advances in computational approaches are why I see robotic probes becoming smaller and more capable, pushing humans out of space-traveling exploration in favor of machines. Huge, distributed telescopes could well be part of the mix. Radio telescopes first, as they are doing on Earth, and then optical, as computation becomes fast enough to handle offline interferometry and eventually real-time.
I’d like to see swarms of small telescopes with star shades used collectively for such observations, as well as used individually for other observations. Mass production and small size keep costs down, while computing provides the “glue” for the swarm to act as a single telescope.
I think SpaceX prioritising the moon is a game changer. Lasers, particle beams and (what Mr Bedford likes most) microwave transmitters on the moon would allow us much more freedom to build huge drive systems. The first large-scale telescope integration could well be on the moon as well.
I hate to say it but I don’t see a science return of an SGL mission commensurate with the investment. The value, if any, is development of the component technologies that could be put to better use elsewhere, be it propulsion or otherwise.
With regard to optical interferometry, that is extraordinarily difficult. The problem is data collection, not data processing. The former must be done in real time while the latter can be done at leisure. Direct digital sampling is only doable at present up into the low GHz range. Optical is in the hundreds of THz, five orders of magnitude higher. All we can do at present is bit buckets.
Optical interferometry can only be done at present, and well into the future, with direct combining of telescope images. There are conceptual designs for instruments that push that technology to its limits, but they are still limited to hundreds of meters of separation. Optical interferometry requires combining alignment of multiple telescopes to within a few hundred angstroms.
As extraordinarily challenging as that is, it’s orders of magnitude easier than an SGL mission. But I love the idea anyway. Oh well, I won’t be around for it anyway.
I’m no expert, but I put your idea to an AI and, with some insistence, on its own it estimated four orders of magnitude. I also asked it about Moore’s law in sampling and it estimated, with much uncertainty, 1 order of magnitude per decade. Potentially, it could compete with an SGL mission, but I bet there’s more to it than that.
Nonetheless, the SGL mission would go down in history as one of the great heroic missions of antiquity. I think the preconception that its function is to give us a 2D pixelated image of a planet is a bit of a stumbling block. Even without so many telescopes, it could still give people a chance to obtain spectroscopic data from various portions of a planet by day and by night, giving a good sense of some of the topography, and also some notion of the chemical composition of the landscape and – if the planet be well chosen – its vegetation.
“I bet there’s more to it than that.”
Indeed. So why the AI slop? Provide a technical reference.
“great heroic missions of antiquity”
Heroes have a habit of dying in the act. Heroic doesn’t imply success.
“if the planet be well chosen”
Good luck with that!
You’ve presented no good reason for me to change my view of the matter.
@Ron S.
I read “heroic” that Mike used, in the sense of impressive scale:
https://www.merriam-webster.com/dictionary/heroic
Alright, I deserved that. I don’t know this field that well. But “AI slop” is a lovely short-cut, and the answer the thing delivered just now really impresses me, at least. In my own simple-minded search for “solar gravitational lens spectroscopic” on Arxiv, all I really got was the original Turyshev, 2018 reference, which does discuss it. But the AI delivered Turyshev, 2022, specifically about spectroscopy, though in the end it teases us with “specific imaging strategies for prospective exoplanet targets are yet to be worked out…” But Madurowicz, 2022, also found by the AI, is more satisfying: “A pseudoinverse-based image reconstruction scheme demonstrates that direct reconstruction of an Earth-like source from single measurements of the Einstein ring is possible when the critical caustic and observed SNR are sufficiently large.” Now, these are technical sources and I only understand a bit of them; I imagine you could readily pull much more out of this.
These telescope swarms could be used quickly to determine whether an SGL mission would be worth it. Thousands of potential planets could be scanned before deciding on the best target.
Even if this is doable, do we have a suitable target to look at? My understanding is that we can look at only one spot before we have to move several AU to focus on another. Thus, do we have a close enough star with a promising Earth-like planet worth spending so much effort and so many resources on? The targeted planet must be well studied beforehand to be sure it’s worth a closer but very expensive look. Any candidates in mind?
At present, Proxima Centauri b sounds like a good bet. A bit over Earth mass and in the habitable zone of its star.
I like the idea of Trappist as a victim of this. So many targets – so close together.
I still think a fission implosion drive is a viable engine. First Light Fusion has an impact design for fusion, but it is easily adapted for fission. The amount of fissile material needed is surprisingly small.
Paul,
Have you thought about writing a new book? It will be twenty-odd years since your last one. A sort of how far we have come. And thanks for signing the copy you sent me; it’s on my collectors’ shelf!
Glad you like the book, Michael. It was a pleasure to sign it for you. As to a new one, I purposely went to the Internet format in 2004 because I had seen how much occurred between writing something and seeing it finalized on the page of a printed book. So this site is how I decided to keep up with the field, because I could get right on the latest information. But there may well be another book in the works, a book of a different kind, and I’ll plan on saying a bit more about that in the future.
@Paul,
You might want to think about packaging up the posts by subject. The menu could have a topics selection (based on hashtags?) and the return could be a list of relevant posts, or better, posts packaged in a more book-like form for the viewer to “page” through, like the one-page-per-star format of Erik Anderson’s Vistas of Many Worlds: A Journey Through Space and Time.
Not that long ago, when there was interest in avoiding the extortionate prices of student textbooks, there were ways to package web content (uncopyrighted or free to use non-commercially) from different sources into “textbooks” to be distributed as PDFs or printed material for educational courses. There was software to do the extraction and formatting. It should be possible to do this from the large archive of CD posts, using different approaches, such as “planetary formation” or “The changing ideas of planetary formation in the 21st century.” The textbook packages could be updated periodically to incorporate new posts. They might even be offered as an educational resource.
Excellent ideas, Alex. Let me ponder these.
A mission to the solar gravity focus (SGL) that takes 25 to 30 years could become outdated as new technology emerges during that time.
For instance, if we used technology available in the next five years, we could build a NERVA-type nuclear thermal rocket with a specific impulse of about 800 seconds. This mission would reach the SGL in 30 years.
Looking ahead, in the next ten years, lunar city missions might build an electromagnetic catapult to send large cargo into lunar orbit. With the benefits of space and vacuum, a nuclear lightbulb thermal rocket with a specific impulse of 2,000 seconds could be created. The engine, scientific equipment, and several hydrogen-propellant drop tanks would be launched and assembled in lunar orbit. Not only is the engine significantly better, but it also allows much more propellant to be added to the mission at little additional cost.
At the optimal orbital window, the assembled spacecraft would launch toward Earth or Venus on a sunward trajectory, using planetary flybys to adjust its inclination as needed. The probe would then proceed to a close solar approach to execute an Oberth maneuver. The mission could approach the Sun closely by employing empty hydrogen propellant drop tanks as thermal shielding, which would be jettisoned after the closest approach.
This theoretical mission would reach the SGL with a larger, multipart scientific payload within 15 to 20 years after launch. This second mission would generate data both more rapidly and of higher quality. Additionally, subsequent missions could be launched at intervals of six months to one year, resulting in a volume of data that would render the initial mission largely obsolete in the context of scientific progress.
This concept could be expanded further to consider future missions employing nuclear fusion plasma engines. Such technology would not be developed exclusively for SGL missions but would serve a wide range of space exploration initiatives.
This is an inherent risk for any concept that requires rapid technological development: advances occur so quickly that long-term planning, execution, and mission durations based on earlier technological capabilities are rendered obsolete by the very technology that enabled the original project.
Excellent article.
When the specifications of the spacecraft were described for a close orbit around the sun, with a need for maneuvering, it occurred to me that a cigar-shaped vessel with a deployable, disposable solar sail on one end and a reflective shield on the other that could be cast off, would be appropriate. The needed fuel and power generation and whatever else is necessary could be accommodated in the central part.
Then it occurred to me that we had a recent visitor that might be the post-trip version of such a vessel.
A few readers seem to assume that conventional (even extremely large) classical optical facilities could be used to obtain true direct multipixel images of an exo-Earth at comparable fidelity. However, this is not the case — the required aperture/baseline and the associated system-level complexity to reach the necessary angular resolution and SNR become prohibitive, which is precisely why the SGL architecture is so compelling. I summarize the technical reasons in my recent paper (https://arxiv.org/abs/2506.20236), in case it’s useful.
Interesting paper. This is a far better reply to Mike Serfas above than any I could draft. But it is long! I’ve downloaded the paper and I hope to read some of it in the near future.
It can never compete, as agreed, but large telescope constellations could give a pretty good look at the planet before we throw a probe out that far.
I finally found a few hours to read Slava’s paper. I didn’t verify the math but that dedication of time hardly seemed necessary since even if there were errors it wouldn’t change the conclusions. The infeasibility is that large.
At the risk of offending the author, I would summarize the paper thusly (not including SGL and fly-by missions, which are in a different class), with respect to Earth-based and near-Earth observatories:
1. Not enough photons.
2. If there are enough photons, there are too many unwanted photons (noise) to extract an image beyond a single pixel.
3. If there are enough photons and a workable SNR, the distortions from signal combination are so large that no usable image beyond a few pixels can be extracted.
I assume that multi-year and even multi-day integration times are unattainable.
Although I can understand the need to quantify the difficulty of deriving an exoplanet image, all I can say for myself is that the conclusions are unsurprising. But SGL and fly-by missions have immense challenges of their own which cannot be overcome until well into the future.
My understanding from our previous conversation about sampling rate is that if Moore’s Law continues for another half century, it will be possible for probes at arbitrary separation (at least 1 AU, because it is easily achieved by probes escaping from Earth’s orbit) to exchange recorded signals to generate an image.
The paper says that with 12.6 square meters of surface area, it will take 27,000 years to generate 100 pixels. But I don’t see any reason why it should not be much easier, rather than harder, to make very large mirrors in space, or more sophisticated active metamaterials, once the technology for space manufacturing has matured. With 1.5 square kilometers of surface area, the image should be available in one minute.
I’m also curious how much information about the planet can be extracted from such a telescope without resorting to pixelization. For example, can you study the land and sea of an exoplanet image independently, without physically knowing where they are located?
But what I find most appealing about the notion of recording signals and working out the interferometry computationally is that I’m daydreaming it could allow the visualization of everything, in every direction, all at once, starting with data from a cloud of innumerable tiny probes bearing a suspicious resemblance to cosmic dust.
Since this concept first appeared on Centauri Dreams I have been fascinated by the physics and the technology involved. But I still have one big concern: the mission appears to assume that the sun resides at a fixed point at the center of the solar system. It does not. Besides obeying relativistic physics, it also observes Newtonian physics and moves about the center of mass of the solar system. Since Jupiter is the principal element of this multi-body system, at a thousandth the mass of the sun, to first order it applies a reactive force to the sun and a consequent acceleration and velocity, causing the sun to circle a center at a radius comparable to its own over twelve years; then you have Saturn and the rest adding additional wiggles to its path.
So, if we picture the sun as an omni-directional searchlight and the Einstein ring a cloud bank, any spacecraft we come up with will not have the pleasure of simply coasting in orbit at ~600 AU. With the principal cycle of this wandering path running about 12 years, the movement of the gravitational focal line will far exceed the spacecraft’s orbital rate, with additional gyrations on top.
If a star is observed in the solar equatorial plane or the plane of the ecliptic, I suppose this interference is reduced. But exoplanet targets above or below this plane will have considerably increased observational problems.
In other words, if you want to observe exoplanets with a gravitational lens, you would get better results by setting up your observatory around a star that has no planets of its own.
@wdk
The better locations for those nodes in a galactic network? Given the frequency of stars with planets, this may reduce the targets for observation.
As to our own system, it seems like the best time to do the data collection is when Jupiter is aligned with the sun and the SGL telescope, so that the wobble is along the sun–telescope axis.
Would not the Einstein rings be centered on the center of mass of the solar system, which I presume does not jiggle?
N.S., and A.T.,
With new concepts and looking on over the construction fence, sometimes one notices new opportunities or applications – and in this case there are some.
But the issue of solar “wandering” I just can’t get off my mind enough to enjoy the other prospects. When general relativity was confirmed by light bent by the sun during an eclipse, a background star was the light source at the edge of the solar disc, and that angular deflection is the same angle that tells us where to dispatch spacecraft, as though sent out to catch a fly ball in the solar system outfield. But the distance is on the order of 600 astronomical units, and the local circular velocity will be a fraction of the Earth’s: sqrt(1/600) of it, or about 1.22 km/sec. Now if the sun and Jupiter rotate about a “barycenter”, with a mass ratio of about 1000 to 1, the sun circles a point about 1/1000 of the distance to Jupiter, which is about 778 million kilometers. So the radius of the sun’s counter-rotation is on the order of 750,000 km, a little larger than its surface radius.
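The barycenter estimate can be sketched in a few lines; the mass and distance values below are standard reference figures, not taken from the paper, and they put the offset just outside the solar surface:

```python
# Sketch of the sun-Jupiter barycenter estimate above, using standard
# reference values (assumptions, not figures from the paper).
M_SUN_KG = 1.989e30
M_JUP_KG = 1.898e27            # roughly 1/1048 of a solar mass
A_JUP_KM = 7.78e8              # Jupiter's mean distance from the sun
R_SUN_KM = 6.96e5              # solar radius

# Two-body barycenter: offset of the sun's center from the center of mass.
r_bary_km = A_JUP_KM * M_JUP_KG / (M_SUN_KG + M_JUP_KG)
print(f"Barycenter offset: {r_bary_km:,.0f} km "
      f"= {r_bary_km / R_SUN_KM:.2f} solar radii")
```

With these values the offset comes out near 742,000 km, about 1.07 solar radii, so the sun's center does sweep out a circle slightly larger than the sun itself.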
Now to an observer in a solar orbit 600 AU out, that motion does not describe much of an arc on the celestial sphere over 11 years. True. But if you are using the sun to observe something directly behind it that is only micro-radians wide, the field of view behind it is rushing by in a sinusoidal fashion.
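It is worth estimating how fast the focal line actually sweeps. A rough sketch, assuming the sun's barycentric wobble is a circle of ~742,000 km radius with Jupiter's ~11.86-year period (both are assumed reference values); because the target star is hundreds of thousands of AU away, the focal point at 600 AU tracks the sun's displacement almost exactly 1:1:

```python
import math

# Rough estimate: transverse speed of the SGL focal line caused by the
# sun's barycentric wobble. Assumes a circular wobble of ~742,000 km
# radius with Jupiter's ~11.86-year orbital period.
WOBBLE_RADIUS_M = 7.42e8                  # ~742,000 km, in meters
PERIOD_S = 11.86 * 365.25 * 24 * 3600     # Jupiter's period, in seconds

v_wobble = 2 * math.pi * WOBBLE_RADIUS_M / PERIOD_S
print(f"Focal-line transverse speed: ~{v_wobble:.1f} m/s")
```

This comes out on the order of 10 m/s, so while the focal line wanders by more than a solar diameter over the cycle, the lateral speed to be matched looks like station-keeping rather than anything exotic.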
One way to look at this problem: it is like an aircraft trying to keep up with a rotating searchlight beam projected on clouds. The angular rate where the light is mounted is modest, but projected on the sky the spot moves faster than an aircraft on landing approach.
So either multiple sensor spacecraft are deployed at the focus, or where it is going to be during the observation campaign – or else a single spacecraft needs the capability to maneuver like a UFO, many times faster than orbital rate.
The sun is a small angular feature in the sky at the focal distance (about 1/600th of the half-degree disc we see from Earth). But the object or objects behind it are even more infinitesimal.
In a way this concept has drawn me into some interesting investigations and become an incentive to get a better understanding of GR and the possibilities it opens.
But as an alternative, arrays of larger space telescopes might be able to get a more continuous and nearer term picture of a particularly attractive target exoplanet. And still have the ability to slew to numerous other candidates.
@wdk
I’m reminded of another approach to developing an image. [I cannot remember the name(s)]. The first uses a single “pixel” detector that observes the object through a number of 2D arrays of different random transparent and opaque elements. The detector simply receives the aggregate light intensity for each “exposure”. A computer then uses the 2D array masks to build up a picture of the object. It works because each exposure has a different intensity that is determined by which elements in the mask are transparent.
A similar approach (which was once a CD post) did something similar, but used the rotation of the target exoplanet to offer different views of the landmasses. The paper showed that the exoplanet’s geographical features could be recreated from the many exposures, which were recorded as a single point of light by the receiving telescope.
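The rotational approach lends itself to a toy linear-inversion sketch. Everything below is illustrative: the smooth exp(cos) visibility kernel is a stand-in I chose only so the inversion is well conditioned, not the physically derived illumination kernel a real exocartography paper would use.

```python
import numpy as np

# Toy sketch of rotational unmixing: recover a 1-D longitudinal albedo
# map from a single-point light curve as the planet rotates.
rng = np.random.default_rng(0)
n_lon, n_obs = 8, 24                   # longitude bins, light-curve samples
true_map = rng.random(n_lon)           # hidden albedo per longitude bin

lon = 2 * np.pi * np.arange(n_lon) / n_lon
A = np.empty((n_obs, n_lon))
for k in range(n_obs):
    phase = 2 * np.pi * k / n_obs      # rotation phase at sample k
    A[k] = np.exp(np.cos(lon - phase)) # smooth, always-positive weighting

flux = A @ true_map                    # the observed light curve
recovered, *_ = np.linalg.lstsq(A, flux, rcond=None)
print("max reconstruction error:", np.abs(recovered - true_map).max())
```

In this noiseless, oversampled toy the map is recovered essentially exactly; real data would need regularization and a physical kernel.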
So, rather than trying to get the resolution of the exoplanet by clever interferometry of different, widely separated telescopes (as was done for the BH image), which Dr. Turyshev states is impractical at the extreme separations that would offer the aperture width of the sun, perhaps we can use the simpler methods outlined above to recreate the exoplanet’s geography. I don’t know if the separation of the telescopes is needed for these two techniques, just the total light collection area, which is achieved with many largish telescopes to aggregate the collected photons. As there will be noise, the many space-based telescopes should reduce that noise if they are accurately pointed in the same direction. The data is then just time-stamped intensities that are integrated with a computer. In the first method, the masks must be the same for each telescope so that the data can be integrated using the same mask for each exposure; AFAICS, there is no need to ensure the same mask is used for all the telescopes unless the exposure time is very long and a significant fraction of the rotation period. The second method just needs the data for each exposure to recreate the geography.
Some years ago, I simulated the first method on a computer, and it did work, albeit the image (a bright shape on a dark background) was somewhat imperfect. If the mask is 50% transparent, it would need 2x the collection area to collect the same number of photons as an unmasked telescope; N telescopes would capture N/2 photons compared to a single unmasked telescope. What I don’t know is whether the telescope separation buys you anything (I doubt it), but the collection area should, just as it does with any telescope. All the telescopes would also need sun shades to exclude the light from the exoplanet’s star, with preferably no other contaminating light source. For the first method, I don’t know how large the masks need to be, or even whether it would work on a point source. The second method should work, although I would need to find and read the paper to determine whether it works on a point source or not.
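For what it's worth, the masked single-pixel method is easy to reproduce in a few lines. This is a noiseless, oversampled toy (2x as many masks as pixels), so the least-squares reconstruction comes out essentially exact; with noise and fewer masks it degrades toward the "somewhat imperfect" images described above.

```python
import numpy as np

# Minimal simulation of the single-pixel "mask" camera: one detector,
# many random binary masks, least-squares reconstruction.
rng = np.random.default_rng(1)
h, w = 8, 8                                  # a tiny 8x8 "image"
scene = np.zeros((h, w))
scene[2:6, 3:5] = 1.0                        # bright shape on dark background

n_masks = 2 * h * w                          # oversample 2x for robustness
masks = rng.integers(0, 2, size=(n_masks, h * w)).astype(float)  # ~50% open

# Each exposure: total intensity passing the mask -> one number.
measurements = masks @ scene.ravel()

# Reconstruct by solving the linear system.
recovered, *_ = np.linalg.lstsq(masks, measurements, rcond=None)
error = np.abs(recovered - scene.ravel()).max()
print("max pixel error:", error)
```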
There is also the idea that a non-uniform surface can be used as a focusing mirror if the orientation of each point on the surface is known, so that a corrected image can be reconstructed. (A sort of post-processing the light rather than actively deforming the mirror as done today.)
Just a thought on using computational means to solve the exoplanet imaging problem without the long mission times that current and prospective propulsion technologies require for the SGL approach. Kipping’s “terrascope” is another approach, albeit with lower resolution than the SGL, using the Earth’s atmosphere rather than gravity as the lens to focus the light, but at a far more accessible orbital distance.
There is also the intersection of AI with these techniques. Running computationally heavy algorithms to acquire a final data point has been shown to be avoidable by AI “predicting” the final datum from the input data, significantly reducing the computational load. How might this be applied to optics to solve the exoplanet imaging problem?
Using AI to search time-stamped images gives interesting results. Perhaps a probe sent towards the star could emit laser light pointing back at the constellation of telescopes; this signal could then be used for synchronisation to improve accuracy.
No, the mission does not assume that the Sun is stationary — all is moving. Here is the paper that addresses this point: https://arxiv.org/pdf/2112.03019
Hello, S. T.,
Just spotted your paper at Centauri Dreams and downloaded it for further study.
And I should take some time to study it before coming to any further conclusions.
But in the meantime, the abstract on the first page definitely does address what I was trying to convey above. My apologies to the readers. While it essentially addresses the issue, I have had some intermittent interference online.
But now that we have the additional background which you have provided, I think I can narrow down what my concern has been:
We agree that if there were no other solar system bodies, the task of observing an exoplanet at a given declination and right ascension (celestial latitude and longitude) is less complicated than if it is observed with the perturbing masses of the solar system involved. For all practical purposes, in a circular 500–600 AU orbit at the lens focal point one could observe until the orbital angular rate shifts the celestial-sphere background away. If a spacecraft observatory had not obtained enough data by then, maneuvers could be performed to shift the exoplanet back into the field of view.
But with the n-body solar system, mostly Jupiter, the line of sight of exoplanet, sun and position on orbit is shifting much more rapidly. While the Jupiter–sun shift takes 11+ years to rotate 360 degrees, the field of view for observing a planet is on the order of 1/600th of the half-degree circular frame offered by the sun at 600 AU out, within which is a highly amplified image micro-radians in diameter.
Now my concern is that while the orbital velocity 600 AU out is only about 4% of Earth’s orbital velocity, or about 1.22 km/sec, the projected image of the exoplanet in the spherical lens can be moving at much higher velocities much of the time, in a Lissajous-like figure depending on where the target lies on the celestial sphere (in right ascension and declination, from a solar-system-based standpoint).
The last/first (?) time I encountered this matter, it occurred to me that additional “camera spacecraft” could be deployed, anticipating the image path. But all the same, when you are out at 600 AU, there is a lot of area to cover. That part still concerns me, not knowing what technology or maneuverability is needed to back the mission up.
Since I am very interested in exoplanets and technologies to investigate them, this has been a very interesting trip you have provided already. Pursuing the program effort thus far has resulted in a lot of background research that I should have done years earlier. So at worst, this mission investigation has been a carrot on a stick drawing me to examine general relativity and some related matters.
Best regards.
Might be something I am missing, but I do not understand why one would use a solar sail, NTP or NEP for an SGL mission. It seems to me that a plasma magnet sail would be the best propulsion method, since it would reach the SGL in only 7 years, as described in the Wind Rider article here, while not requiring radioactive isotopes for propulsion.