As the James Webb Space Telescope unfolds and makes its way to its final destination in space, NASA and ESA have done a great job of sharing the experience with the public. With webcasts, livestreams and a very active social media presence, the JWST team has allowed people to watch over the shoulders of engineers and scientists, as well as ask questions about the process of commissioning the new telescope.
The question asked most often on social media and at several press conferences seems to be: why weren’t cameras put on JWST to provide actual live footage from the telescope? Wouldn’t seeing it firsthand be better than just receiving telemetry?
After all, rockets now routinely beam back live footage from space as they deploy satellites (even JWST’s deployment was broadcast live), the Perseverance rover carried cameras that showed its descent to the Martian surface, and the Chinese space agency has deployed remote “selfie cameras” to monitor its Mars rover and orbiter.
“No one would love more to see Webb doing its thing than us,” said Keith Parrish, Commissioning Manager for the Webb telescope, during a NASA TV livestream of the final tensioning of the telescope’s sunshield. “But Webb changes shape a lot, and we would need multiple cameras in multiple locations. The engineering usefulness of the cameras wasn’t there unless things got very complicated very fast. Plus, it’s very shiny on one side and very dark on the other — by design. The cameras wouldn’t see anything on the dark side.”
“Adding cameras to watch an unprecedented, complicated deployment of such a precious spacecraft as Webb sounds like a no-brainer,” wrote Paul Geithner, Webb’s deputy project technical manager. “But in Webb’s case, there’s much more to it than meets the eye. It’s not as straightforward as adding a doorbell cam or even a rocket cam.”
Difficult lighting is one of the most obvious challenges. “Our gold-coated mirrors were photogenic on Earth, but the mirror side of Webb is pitch dark in space,” said Webb’s Twitter feed. “Meanwhile, the other, Sun-facing side of Webb is so shiny that cameras there would have glare & contrast issues.”
Another reason is the added engineering risk: cameras would need extra power and could interfere with the telescope’s sensitive electronics. Wiring harnesses attached to Webb to hold the cameras in place would cross the telescope’s moving parts, running the risk of leaking heat or causing vibrations. Additionally, since the infrared detectors for Webb’s instruments need to be at temperatures near absolute zero to work properly, visible-light cameras might not function at those temperatures, and any transfer of heat through wires could be an issue.
But what about remote cameras on cubesats? Mark McCaughrean, Senior Advisor for Science & Exploration at the European Space Agency and part of the JWST Science Working Group, has been providing a wealth of details on Webb via his Twitter feed. He emphasized what a huge engineering challenge adding cubesats would be, especially since cubesats weren’t yet a viable option when Webb’s design was finalized. “And how is that cubesat supposed to deploy, station keep, image, illuminate, relay back data, etc. all from 1 million km without adding more hardware, constraints, contamination and risk to JWST?”
Parrish said that engineers did actually try adding deployment cameras to full-scale mock-ups of Webb hardware at one point, but they found that the telescope’s comprehensive, built-in mechanical, thermal, and electrical sensors provided much better information on its status than cameras could.
“From an engineering perspective, the numbers tell us what is really happening,” he said. “We can take all the telemetry from all the sensors, and synthesize it into a visual for our teams. Maybe if we had started from day one with the concept of having cameras on board, we could have implemented that.”
But adding cameras in the middle of an already complicated and unprecedented design would have complicated – and likely delayed – JWST even more.