Our window into the digital universe has long been a glowing screen perched on a desk. It's called a computer monitor, and as you stare at it, light is focused into a dime-sized image on the retina at the back of your eyeball. The retina converts the light into signals that percolate into your brain via the optic nerve.
Here's a better way to connect with that universe: eliminate that bulky, power-hungry monitor altogether by painting the images themselves directly onto your retina. To do so, use tiny semiconductor lasers or special light-emitting diodes, one each for the three primary colors--red, green, and blue--and scan their light onto the retina, mixing the colors to produce the entire palette of human vision. Short of tapping into the optic nerve, there is no more efficient way to get an image into your brain.
The advantages, at least for some viewing situations, would be overwhelming. Scanning the light into only one of your eyes, for instance, would allow images to be laid over your view of real objects, giving you an animated, X-raylike glimpse of the simulated innards of something--a car's engine, say, or a human body. Alternatively, scanning slightly different images into each eye could render grippingly vivid three-dimensional scenes with pure, jewel-like spectral colors. Gamers could experience a heightened sense of reality that liquid-crystal-display goggles could never provide, because the laser or light-emitting diode system could dynamically refocus to simulate near and distant objects with utter realism.
Best of all, the system would waste essentially no photons, so it would be fantastically efficient and very well suited to the low-power requirements of mobile devices. In round numbers, lasers or LEDs would use hundreds of times less power than a small LCD screen typical of a subnotebook or handheld personal digital assistant. Imagine a cellphone or a PDA with a small, cameralike viewfinder that, by stimulating your retina when peered into, would show you an image rich in color and detail. The image would appear to your brain as a large, brightly lit display screen perhaps 65 centimeters away, which could be reconfigured quickly from, say, a traditional, boxy 4:3 format to the widescreen 16:9 format.
The forerunners of such systems, known as scanned-beam displays, are just now hitting the market. They are moving into several industries, including automotive service, to help service technicians keep track of the huge and ever-changing reams of repair data and display it precisely where and when they need it--in the service bay, while they are working on a car. This first-generation system, from the company I work for, Microvision Inc. of Bothell, Wash., was introduced to auto dealers earlier this year at the National Automobile Dealers Association Convention and Exposition in Las Vegas. The system is built around a lightweight display mounted on a baseball cap or visor [see diagram, Direct View]. In the current version, a wireless computer with a touch-pad control is worn on the belt.
Like a high-tech monocle, a clear, flat window angled in front of the technician's eye reflects scanned laser light to the eye. That lets the user view automobile diagnostics, as well as repair, service, and assembly instructions superimposed onto the field of vision. The information that the device displays comes from an automaker's service-information Web site through a computer running Microsoft Windows Server 2003 in the dealership or repair shop. The data gets to the display via an ordinary IEEE 802.11b Wi-Fi network, and all the technicians in the service center are able to access different information simultaneously from one server.
Those of you who remember the stern warnings printed on the side of the lasers in your school physics lab are probably wondering about the safety of aiming laser light directly into the eye. To ensure that its device is safe, Microvision applied rigorous safety standards from the American National Standards Institute, Washington, D.C., and the International Electrotechnical Commission, Geneva, derived from years of studying the effects of light on the eye. Laser light can be harmful because its beam is intense, capable of concentrating its power in a tiny area of incidence. This could be a problem if a fixed beam--as opposed to a scanned beam--were allowed to dwell on just one spot. We ensure that the retina is never overwhelmed by limiting the power of the laser light entering the eye to about a thousandth of a watt and using a high-reliability interlock circuit that turns on the laser only when the beam is scanning. Furthermore, because this very low-power light is continuously scanned onto the retina, its energy is dispersed over an area hundreds of thousands of times larger than a single spot of an incident beam.
Even at this very low power level, the monochrome system now being marketed, called the Nomad Expert Technician System, delivers images that are bright enough, and in a color distinct enough--a vivid red--that they can easily be seen over the background, even outdoors [see Direct View]. A test of the Nomad at the American Honda Motor Co. training center in Torrance, Calif., showed that skilled service technicians performed complex repair procedures in 39 percent less time, on average. Even at half that efficiency gain, a dealership would realize a net return on investment of US $2292 per technician per month, according to the Honda study. Honda has announced its intention to buy 3800 Nomad systems, which retail for $3995 each. Microvision has held trials with dealerships of many other leading automakers, including GM, DaimlerChrysler, and Volvo.
Much as mechanics do, medical doctors fix extremely complex machinery--the human body. Surgeons at the Wallace-Kettering Neuroscience Institute, the Baylor College of Medicine, and the Cleveland Clinic Foundation's Minimally Invasive Surgery Center have tested Microvision's see-through, or augmented-vision, laser-based display. As they operate, the surgeons can view vital patient data, including blood pressure and heart rate. And in such procedures as the placement of a catheter stent, overlaid images prepared from previously obtained magnetic resonance imaging or computed tomography scans assist in surgical navigation.
Several military units, including the U.S. Army's Stryker Brigade, are using adaptations of the system. The commander of a Stryker, an eight-wheel light-armored vehicle, can view its onboard battlefield computer with a helmet-mounted daylight-readable display. This enhances the commander's ability to observe the surroundings, choose the optimum path, command the vehicle, and use tactical information advantageously. Other military applications include a series of prototype helmet-mounted displays developed with the U.S. Army and Boeing Co. of Chicago. Currently in the initial stages of flight-testing, the system could be a relatively inexpensive way to provide utility- and attack-helicopter pilots with a digital display of the battle space.
The military gave scanned-beam technology its start in the 1980s as part of the U.S. Air Force's Super Cockpit program. Its team, led by Thomas A. Furness III, now at the University of Washington, Seattle, produced helmet-mounted displays with an extremely large field of view that let fighter pilots continuously see vital data such as weapons readiness. The displayed information moved with the pilot's head, giving him an unobstructed view of what was going on in front of him and helping him to distinguish friend from foe.
Now offshoots of that technology may even wind up in such mass-market products as digital cameras, where scanned-beam displays provide better image quality at lower power and cost than liquid-crystal-on-silicon and organic LED displays. Lasers supply the power necessary for bright, see-through, head-up displays; but Microvision's non-see-through displays, which require full color at low power and low cost--desirable for viewfinders and near-to-eye cellphone displays--rely instead on LEDs. Microvision is working with Canon Inc., Tokyo, on a near-to-eye microdisplay with several advantages over conventional optical and LCD electronic viewfinders. Linking the microdisplay to the camera's image sensor would give digital cameras the full viewfinder capabilities of a premium single-lens reflex camera. The user could preview a smooth, high-resolution, full-color image, allowing critical focus-control and depth-of-field adjustments to be made [see diagram, What You See Is What You Get].
There are just four primary components of a scanned-beam display: electronics, light sources, scanners, and optics. Yet with a modular approach, these simple elements can be combined to yield many different products [see diagram, Direct View, again].
Electronics acquire and process signals from an image or data source, such as a Web page or video camera. The processed signals contain information for the intensity and mix of color that best renders the intended image at each location that will be scanned, in sequence. These values are the individual picture elements--pixels--that make up the image. This information is stored in memory until needed, when the data pass through a digital-to-analog converter that controls the light source. Once the image has been rendered into memory, there is no need to recalculate it unless something has changed. The data can simply be replayed from memory, a feature that can be exploited to cut costs or save power.
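The render-once, replay-many idea can be sketched in a few lines of Python. This is a hypothetical illustration--the names `render_to_memory` and `refresh` are ours, and in the real device this stage is dedicated electronics feeding a digital-to-analog converter, not software:

```python
# Hypothetical sketch of the render-once, replay-many pipeline described
# above. In the real device this is dedicated electronics driving a
# digital-to-analog converter, not Python.

def render_to_memory(image_source, width, height):
    """Render the source once into (r, g, b) pixel values, stored in
    the order the beam will scan them: row by row, left to right."""
    return [image_source(x, y) for y in range(height) for x in range(width)]

def refresh(framebuffer, dac):
    """Replay the stored pixels on each refresh. Nothing is recomputed
    unless the image changes -- the power and cost saving noted above."""
    for rgb in framebuffer:
        dac(rgb)  # set the red, green, and blue source intensities

# Example: a simple gradient standing in for the image source
fb = render_to_memory(lambda x, y: (x % 256, y % 256, 128), 800, 600)
samples = []
refresh(fb, samples.append)
```

The point of the split is that `refresh` touches only memory: a static image costs no recomputation at all, which is where the power and cost savings come from.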
Depending on the application and cost and size requirements, we can use single color or multiple low-power solid-state lasers, laser diodes, or LEDs as the light source. In the case of a full-color electronic viewfinder display on a camera where low cost and power consumption are critical, modulated red, green, and blue LEDs produce color pixels of varied intensities to generate a complete palette of colors and shades.
If the light source is the paint, Microvision's proprietary microelectromechanical systems (MEMS) biaxial scanner is the brush that applies the image to the retina. The scanner's main component is a mirror 1.5 millimeters in diameter that rapidly sweeps the light beam horizontally to position the pixels in a row, while also moving the beam slowly downward to draw successive rows of pixels. This process continues until an entire field of rows has been placed and a full image appears to the user--quite similar to the process in a conventional cathode-ray television, in which magnetic deflection coils direct the electron beam to scan the phosphor-coated screen. But while a conventional display can create jagged edges on images because its pixels are fixed in the screen hardware, a scanned-beam display has no hard pixels: the continuously scanning beam creates a much smoother image.
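The biaxial sweep can be illustrated with a toy raster generator in Python--a sketch of the geometry only, not Microvision's actual drive waveforms:

```python
import math

def raster_positions(n_rows, pixels_per_row):
    """Yield normalized (x, y) beam positions for one frame.

    The fast horizontal axis follows a sinusoidal (resonant) sweep, as
    a torsional MEMS mirror naturally does; the slow vertical axis ramps
    steadily downward to lay out successive rows. Hypothetical sketch.
    """
    for row in range(n_rows):
        y = row / (n_rows - 1)  # slow axis: 0 at the top row, 1 at the bottom
        for px in range(pixels_per_row):
            # half a cosine period: x sweeps smoothly from 0 to 1 in one row
            x = 0.5 * (1 - math.cos(math.pi * px / (pixels_per_row - 1)))
            yield x, y

frame = list(raster_positions(600, 800))  # SVGA-sized raster: 480,000 positions
```

Note one consequence of the sinusoidal sweep: the beam moves slowest near the edges of each row, so equal-time samples are denser there; a real system compensates by adjusting pixel timing across the sweep.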
For applications in which the scanned-beam display is to be worn on the head or held close to the eye, we need to deliver the light beam into what is basically a moving target: the human eye. Constantly darting around in its socket, the eye has a range of motion that covers some 10 to 15 mm. One way to hit this target is to focus the scanned beam onto an optical element called an exit pupil expander. When light from the expander is collected by a lens and guided by a mirror and a see-through monocle to the eye, it covers the entire area over which the pupil may roam. For applications that require better image quality at less power, we can dispense with the exit pupil expander altogether, either by using a larger scan mirror to make a larger exit pupil or by actively tracking the pupil to steer light into it.
The simplicity and elegance of the scanned-beam concept belie the underlying complexity of the enabling advances of the past four decades in scanning, light-source, and image-processing technologies.
Early on, Microvision researchers identified the scanner as the crucial element in this emerging technology. Eight years ago, we built a scanner from a polished metal plate that combined the scan mirror with a stiff torsion spring resonant at about 20 kilohertz. When driven by magnetic coils, the plate swung in a large, twisting, resonant motion. With that proof of principle in hand, we developed a MEMS version of the scanner. MEMS are electromechanical devices that are photolithographically defined on a silicon wafer, much as integrated circuits are, in quantities of more than 100 per wafer.
A typical MEMS scanner today measures about 5 mm across, with a 1.5-mm-diameter scan mirror capable of motion on two scan axes simultaneously [see photo, Image Painter].
Using MEMS allows us to integrate the scanner, coil windings, and angle-sensor functions all on one chip. Such a scanner provides SVGA (800-by-600) equivalent resolution at a 60-hertz refresh rate and is now in production and in products. We expect higher performance per scanner as we more fully exploit the basic advantages of MEMS, which include the potential of very low costs in small packages. In addition, multiple scanners could provide higher-resolution images, each painting full detail in a tiled subarea. Eventually, costs will become low enough to make this practical, allowing the scanned-beam approach to surpass the equivalent pixel count of any other display technology.
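A quick back-of-envelope calculation shows why a torsional resonance near 20 kilohertz suits SVGA at 60 hertz, if we assume bidirectional scanning--pixels drawn on both the left-to-right and right-to-left sweeps, so each mirror oscillation paints two rows:

```python
# Back-of-envelope check of the scanner numbers quoted in the article,
# assuming bidirectional scanning (two rows per mirror oscillation).
rows, refresh_hz = 600, 60            # SVGA (800 by 600) at 60 Hz
rows_per_second = rows * refresh_hz   # 36,000 rows per second
mirror_hz = rows_per_second / 2       # 18,000 oscillations per second

pixel_rate = 800 * rows * refresh_hz  # 28,800,000 pixel intensities per second

print(f"mirror resonance ~{mirror_hz / 1000:.0f} kHz, "
      f"pixel rate ~{pixel_rate / 1e6:.1f} MHz")
```

The resulting 18 kHz is consistent with the roughly 20-kilohertz torsional resonance of our original metal-plate scanner; the margin above the minimum covers time lost to retrace and overscan.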
While the MEMS scanner is a relatively recent development, the laser, another indispensable element of the scanned-beam display, traces its origins back to 1960 and provides a compact source of spectrally pure, focused, virtually noise-free light. Microvision uses laser light sources in many of its see-through products because our customers' applications demand color-gamut and brightness levels far exceeding the capabilities of flat-panel displays, notebook displays, and even higher-end desktop displays. For today's commercial products, only red laser diodes are small enough, efficient enough, and cheap enough to use in such see-through mobile devices as Nomad. Blue and green diode-pumped solid-state lasers are still too expensive for bright, full-color, head-up or projection displays for mainstream markets, but that could change soon. In the mid-1990s Shuji Nakamura of Nichia Chemical Industries Ltd. (now Nichia Corp., Tokushima, Japan) demonstrated efficient blue and green LEDs, and then blue laser diodes made of gallium nitride. When these designs and materials are extended to green laser diodes, we'll be able to build bright, full-color see-through displays.
On another front, Microvision recognized that the total amount of light that enters your eye from a desktop display is actually less than a microwatt, and that this is small compared with what an LED can supply. Although the power required is low, the light must be collected and focused down to a pinpoint--easy to do with a laser, but not so easy with an LED. A scanned-beam display placed near the eye, such as a camera viewfinder, wastes little light, especially if it does not have to overcome a background scene. Even so, we've needed advances in LED technology to further concentrate the light coming from these devices.
Enter the edge-emitting LED. Unlike conventional LEDs, which emit light from the surface of the chip, an edge-emitting LED has a sandwich-like physical structure similar to that of an injection-laser diode, but it operates below the lasing threshold. These LEDs emit incoherent beams of light that, while not so fine as a laser's beam, provide a tenfold increase in brightness. We also use multiple inexpensive surface-emitting LEDs, each contributing a portion of the overall power, to achieve high brightness. Further performance improvements of LED materials driven by huge investments aimed at general lighting applications will increase the brightness and range of applications for scanned-beam displays based on green and blue gallium nitride devices and aluminum gallium indium phosphide red LEDs.
On top of improvements in LEDs, lasers, and MEMS, memory density and processor power are expected to double every two years, translating directly into better performance from our displays. Increased memory and computational capacity boost the update rate of the light source and refine its control, increasing resolution even further.
In addition to displaying images, scanned-beam technology can capture them. In a display, the data channel drives the light source through a digital-to-analog converter to paint a picture on a blank canvas. In image capture, the light source stays steadily on, and the data channel reads the reflections from the object through an analog-to-digital converter connected to a photodiode. The light source, beam optics, and scanner are essentially the same in both applications.
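The duality can be sketched in Python (hypothetical names; the real data channel is analog electronics): the same scan pattern either drives the source from stored pixel values or samples a photodiode at each beam position.

```python
# Sketch of the display/capture duality described above. The scan
# pattern is shared; only the direction of the data channel differs.

positions = [(x, y) for y in range(4) for x in range(4)]  # tiny 4-by-4 raster

def display(positions, pixel_data):
    """DAC path: stored pixel values modulate the source as the beam scans."""
    return list(zip(positions, pixel_data))

def capture(positions, photodiode):
    """ADC path: the source stays on; sample the reflection at each position."""
    return [(pos, photodiode(pos)) for pos in positions]

frame = display(positions, range(16))                     # paint stored data
image = capture(positions, lambda pos: pos[0] + pos[1])   # toy reflectance map
```

In both directions the output is one value per scan position, which is why the light source, beam optics, and scanner carry over unchanged between the two modes.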
Exploiting this versatility, we developed a design for an endoscope, the long, slender medical instrument that is used for examining the interior of a bodily organ or performing minor surgery. Composite red, green, and blue light from lasers travels down a single-mode fiber to the far tip of the endoscope. There, at the tip, a simple lens collects the light into a single beam that culminates in a fine point. A MEMS scanner directs the fine point of light over an area 10 to 100 mm from the tip. Reflected light collected by fibers and conducted back to detectors contains the information about the objects encountered. The detected light is digitized and, with software, reconstructed into an image of the object encountered by the scanned beam.
Microvision just completed a study showing that a 2.5-mm-diameter MEMS chip scanning at large angles would provide resolution as good as that of the leading endoscopes. This MEMS chip is smaller than the sensor chip that is used in CMOS or charge-coupled-device imagers. Small size, of utmost importance for minimally invasive surgery, combined with simple optics, results in a disposable endoscope probe. Such a probe would reduce the cost of medical procedures by saving on the time and cost of sterilization while minimizing the risk of cross-contamination.
Medical-device applications will take more time to develop and to qualify for use. Meanwhile, to provide revenue and gain experience in high-volume manufacturing, Microvision is applying this rather exotic technology to the $1.8-billion bar-code-scanner market with the $99.95 Flic laser bar-code scanner, which we introduced in September 2002. The resolution and scan speeds can be much lower than those needed for display applications, so we can reduce costs by using a plastic multifaceted scan mirror operated by the energy harvested from pressing the scan button. NCR Corp., Dayton, Ohio, has recently introduced the Flic scanner under its own label, called the RealScan Companion. Consumer applications will take longer to come to market, but we expect that in the next five years, our displays will pop up in cellphones and cameras, giving users an HDTV experience on the go, and at a fraction of the power, weight, and cost required by today's devices.