Soon after the Hubble Space Telescope settled into orbit following its launch in 1990, astronomers discovered a big problem. The images Hubble sent back to Earth were blurry—embarrassingly, disappointingly blurry.
Several crack teams of engineers and scientists from NASA, industry, and academia worked frantically to resolve this predicament. The problem, they discovered, was that the primary mirror was misshapen, and the offending curve focused the incoming light incorrectly.
To rescue Hubble, experts worked for months to build new optics to intercept the light bouncing off the telescope's main mirror and correct its aberrant shape. By 1993, they had what they wanted. Astronauts spacewalked out from the space shuttle Endeavour to the malfunctioning telescope and replaced two of Hubble's original cameras with new optical systems. Hubble's images suddenly snapped into focus, and one of those new cameras ended up becoming the most productive astronomical instrument of the last 50 years.
What started out as NASA's greatest embarrassment set in motion a new wave of optics research with broad implications for astronomy and vision science. It took NASA three years, a space shuttle launch, and US $700 million to fix Hubble. We're hoping to eliminate all that trouble for future telescopes by measuring and analyzing blur in real time, using just the data in an image.
In our work at the Jet Propulsion Laboratory (JPL) at Caltech, in Pasadena, Calif., under a contract with NASA, we have built software to help us fix all kinds of blur, including but not limited to the type that afflicted Hubble.
The general idea is simple. We're taking advantage of the fact that our future telescopes will include flexible mirrors that bend and move upon command. By understanding the deficiencies in an image, we can compensate for them by remote control—no astronauts needed. The power of this method lies in its ability to use an optical system's existing camera as a sensor to detect its own error, without installing any separate devices. This software-based approach has already extended our telescopes' ability to peer into the darkness of the universe. On Earth, we believe the software could enable vision scientists to enhance human eyesight beyond "perfect" 20/20 vision, opening up the possibility of "superhuman" vision.
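As a loose illustration of this idea (not JPL's actual software), the sketch below simulates a camera serving as its own wavefront sensor: a point source is blurred by an unknown defocus error, and we search over commands to a hypothetical flexible mirror, keeping whichever command makes the image sharpest. The function names and the single-coefficient "mirror" are simplifications invented for this example.

```python
import numpy as np

def point_spread(defocus, size=64):
    """Simulate the image of a point source seen through a circular
    aperture with a given amount of defocus wavefront error (radians)."""
    y, x = np.mgrid[-size // 2:size // 2, -size // 2:size // 2]
    r2 = (x**2 + y**2) / (size / 8)**2          # squared radius, pupil edge at 1
    pupil = (r2 <= 1.0).astype(float)           # circular aperture mask
    phase = defocus * (2 * r2 - 1) * pupil      # defocus term of the wavefront
    field = pupil * np.exp(1j * phase)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(field)))**2
    return psf / psf.sum()

def sharpness(img):
    """Muller-Buffington sharpness metric: sum of squared intensities,
    maximized when the wavefront error is zero."""
    return float(np.sum(img**2))

true_error = 2.0                                # unknown aberration in the optics
commands = np.linspace(-3, 3, 61)               # candidate mirror commands to try
best = max(commands, key=lambda c: sharpness(point_spread(true_error + c)))
# The winning command roughly cancels the unknown error (best ≈ -2.0),
# using nothing but the camera's own images as the sensor.
```

Real image-based wavefront sensing estimates many aberration terms at once rather than sweeping a single knob, but the principle is the same: the science camera's own data tells you which way to bend the mirror.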
Most adults don't have 20/20 vision. Imperfect vision is caused by aberrations in the way the lens of the eye transmits light to the retina. Those deviations occur in an important aspect of light called the wavefront, a surface of points on a light wave that are all in the same phase. You might be familiar with the term from ads for LASIK eye surgery. To guide the vision-correcting procedure, a beam of laser light is sent into the eye, reflects off the retina, and travels back out through the eye; a sensor then captures and maps the errors the eye imprints on that returning wavefront.
What causes the wavefront to become distorted? Objects emit or reflect light in spherical waves. Our eyes intercept a small portion of that wave surface, and at great distances this surface is effectively flat. To form a perfect image on your retina, your eye forces these flat waves to curve inward, so that the waves converge at one point on the focal plane behind your eye's lens. If the converging waves are not perfectly curved, not all of the light will come into focus at a single point on your retina. The result is a blurry or distorted image. All the points of deviation from the perfect wave shape are collectively called wavefront error. Plotted on a 2-D map, the error looks like mountainous terrain, with peaks and valleys marking each deviation from the ideal shape.
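That terrain map is easy to picture numerically. The sketch below builds a hypothetical wavefront-error map over a circular pupil from two simple aberration terms (the coefficients are invented for illustration) and summarizes the whole terrain with a single root-mean-square number, the figure opticians and telescope builders usually quote.

```python
import numpy as np

# Sample the pupil on a grid of points from -1 to 1 in both directions.
n = 128
y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
r2 = x**2 + y**2
pupil = r2 <= 1.0                               # points inside the circular pupil

# Two illustrative aberration terms with made-up coefficients, in waves:
astigmatism = 0.05 * (x**2 - y**2)              # peaks along x, valleys along y
spherical = 0.10 * (6 * r2**2 - 6 * r2 + 1)     # the class of error that afflicted Hubble

# The "terrain": height of the actual wavefront above or below the ideal one.
error_map = np.where(pupil, astigmatism + spherical, 0.0)

# One number for the whole map: root-mean-square deviation over the pupil.
rms = np.sqrt(np.mean(error_map[pupil]**2))
print(f"RMS wavefront error: {rms:.3f} waves")
```

Rendering `error_map` as a surface plot would show exactly the mountainous terrain described above, and driving the RMS toward zero is what "correcting the wavefront" means in practice.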