Digital Photography: The Power of Pixels
Digital photography changed not only how we take pictures but also how we communicate
Illustration: Frank Chimero
This is part of IEEE Spectrum's special report: Top 11 Technologies of the Decade
Ten years ago, photography for the most part meant film. We carried rolls of it on vacation, dropped it off for processing when we got back, picked up our prints, then put them in albums or scrapbooks or, more typically, in cardboard boxes. On occasion, we thought about sending a duplicate to distant relatives, but we'd often forget. Photographs were for documenting our history, for framing, for saving.
What a difference a decade makes! The vast majority of us haven't handled a roll of film in years—it's a retro novelty at best. Digital technology has changed the very nature of photography. Digital images are free and easy and can be instantly distributed. As a result, the vast majority of photos are no longer taken to capture special moments; they're used to communicate the ordinary, with less forethought than a phone call.
Of course, digital cameras didn't simply materialize in our hands a few years ago, although it may seem like it. You could trace their history back to 1969, when the charge-coupled device (CCD) was invented at Bell Telephone Laboratories, or to 1957, when the first digital image scanner was created at the U.S. National Bureau of Standards.
Or you could start in December 1975, when Steven Sasson, an electrical engineer at Eastman Kodak Co., in Rochester, N.Y., became the first person to pick up a digital camera and take a picture.
Sasson, hired by Kodak in 1973, fresh from a master's program at Rensselaer Polytechnic Institute, in Troy, N.Y., was a fish out of water. Kodak was essentially a company staffed by chemical and mechanical engineers, but in the early '70s it started hiring a handful of EEs to develop electronic controls for cameras, like exposure systems and motor drives. One of Sasson's first assignments was to check out the new 100- by 100-pixel CCD chip developed by Fairchild Semiconductor, to see if it would be useful for Kodak.
Photo: David Yellen
Say cheese! Electrical engineer Steven Sasson took the first-ever digital snapshot in December 1975.
Sasson decided that the best way to study the chip was to build it into a camera. Being an electrical engineer, he thought it would be cool to create a new, all-electronic camera, with no moving parts, rather than sticking the CCD into an existing mechanical body. He spent about a year on the effort, working on it in between other assignments, cobbling together the materials he needed from catalogs and used-parts bins. He found a tiny digital-data cassette recorder, adapted an analog-to-digital converter from a Motorola digital voltmeter, and grabbed a lens from an old 8-mm movie camera.
In December 1975 he pointed the completed prototype at a lab technician and took his first picture. He then went to his supervisor and told him that he'd turned that CCD into a working camera.
Sasson recalls his supervisor saying he'd bring some people to the lab for a demo. "No," Sasson responded, "it's portable. I can bring it to you." His supervisor was amazed.
Sasson started a round of demos, bringing groups of Kodak engineers and executives into a conference room, taking a quick picture of one of them, and then popping the tape out and putting it into a player to show it on a TV screen. "In 1976 we were taking pictures without film and viewing them without paper," he says.
Sasson's project never went beyond the prototype stage. At the time, he told Kodak executives that digital cameras wouldn't catch on until they could produce images with 2 million pixels; he thought that day would come in 1990 or 1995. And the executives, he recalls, recognized that this would be earth-shattering for the film photography business but believed they needn't be too concerned, because it wouldn't matter for a long time. Sasson built more cameras at Kodak over the years. Then, in 1990, he moved to the output side of digital imaging, developing color printers. His original digital camera patent, issued in 1978, expired in 1995.
Meanwhile, in 1981, Sony came out with an analog electronic camera, the Mavica. It recorded images using a television video signal, storing them on a floppy disk.
"I liked it because it woke Kodak up," Sasson recalls. "I also liked it because I knew that it was not going to succeed; it was analog, and to succeed it had to be digital."
About seven years later, Kodak created the first commercial megapixel digital camera, called the Hawkeye II Imaging Accessory. It was sold at a list price of US $23 000 each to U.S. government organizations; one camera went along on a shuttle mission in 1991.
Then in 1991, the company introduced a commercial black-and-white digital camera, the Kodak Professional Digital Camera System, later referred to as the DCS 100. In a sense, it was a step back from Sasson's prototype, because it wasn't an all-in-one device; instead, the system tethered a modified Nikon camera to 5 kilograms of electronics in a shoulder bag. It was marketed to news organizations at $20 000 to $25 000.
"Journalists just laughed at it," recalls John Henshall, then president of the British Institute of Professional Photography and a consultant with Kodak. But a few did begin using the device, because of two key features—it enabled them to immediately review the captured image on the electronic display, and it was possible to easily transmit these images by dial-up modem.
Kodak's $9995 DCS 200, released in 1992, put all the electronics inside the camera. Recalls photographer Stephen Johnson, "It was pretty amazing. I took my first images with it walking through the snow in Camden, Maine."
But it took Apple's marketing to finally make Sasson's vision of a handheld all-electronic consumer camera a reality. The under-$1000 Apple QuickTake 100, designed and manufactured for Apple by Kodak, came out in 1994. At its highest resolution, 640 by 480, it could store up to eight images on its internal memory. It sold only about 50 000 units, but it was a huge landmark.
"Apple legitimized the category," says Alexis Gerard, founder and president of analyst firm Future Image and of the 6Sight Future of Imaging Conference.
It came shortly after the creation of JPEG, the image compression standard that made the most of memory—still very expensive—and of Internet bandwidth. "Having a technology to crunch those huge files down to what the infrastructure could deal with was very important," Gerard says, adding, "Without JPEG, we would have had to wait another five to seven years" for the technology to catch up. And JPEG meant that digital photos taken by different cameras were fully interoperable.
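The arithmetic behind that claim is easy to sketch. Here is a rough back-of-the-envelope calculation (my own illustrative figures, not from the article): an uncompressed 24-bit RGB frame at the QuickTake 100's top resolution of 640 by 480, versus the same frame after JPEG compression at an assumed, typical 10:1 ratio.

```python
# Back-of-the-envelope sketch: why JPEG mattered when memory and
# bandwidth were scarce. The 24-bit-RGB and 10:1 figures are assumed
# typical values, not numbers taken from the article.

WIDTH, HEIGHT = 640, 480   # QuickTake 100's highest resolution
BYTES_PER_PIXEL = 3        # 24-bit RGB, uncompressed
JPEG_RATIO = 10            # assumed typical JPEG compression ratio

raw_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
jpeg_bytes = raw_bytes // JPEG_RATIO

print(f"uncompressed: {raw_bytes / 1024:.0f} KiB")   # 900 KiB
print(f"as JPEG:      {jpeg_bytes / 1024:.0f} KiB")  # 90 KiB
```

At roughly a tenth the size, a frame that would have swamped mid-'90s flash memory and dial-up links becomes something the infrastructure of the day could actually move and store.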
Apple's second digital camera, the QuickTake 150, again from Kodak, offered JPEG compression. After that, things quickly marched forward. The Casio QV10 in 1995 had the first built-in liquid-crystal display. Kodak introduced the first megapixel consumer color camera, the DC210, in 1997. The first camera phones appeared in Japan in 2001; they hit the United States in 2004.
"It was like a snowball rolling down a mountain; it gathered more and more snow until it blew away everything in its track," says Henshall.
The snowball also changed the essential meaning of photography. The principal purpose of photography had been to capture images for posterity. Today that is no longer primary. "Images are being used for more than just memories," Gerard says. "Many are not intended to be stored but purely to communicate information that only has value in the moment: Where did I park my car? What does this office space we're considering look like?"
Even as an art form, photography is changing, Gerard says. "If you point a camera at a good image in the real world, you will likely get a good image—there is no learning curve. For the first time we have a creative tool that people can jump into right away."
While digital photography has vanquished film, it is far from perfect. Although some people say the cameras could do a better job of matching what the eyes see—going to three dimensions, in particular—for the most part, it's not the cameras that need improving. "The capability of the cameras being sold today far outstrips the average consumer's ability to use them," Sasson says. The problem is what to do with the images once you've taken them. The scrapbooks and shoeboxes of the film world are being replicated in digital form, but they're overloaded and becoming impossible to manage. They're also not necessarily as reliable as a shoebox: Can you trust that your online photo storage company will be around in 50 years, or that computers will still read old camera formats?
"This is the last frontier," Sasson says. "How do you manage these images? How do you save them for 50 or 60 years, with format obsolescence, changing standards? Images are the only digital files that get more valuable the older they get."
For all of IEEE Spectrum's Top 11 Technologies of the Decade, visit the special report.