The Ubiquitous Webcam

Do we really want to know what the dog does when it’s home alone?

Illustration (James Steinberg): many hands around the globe holding mobile phones, their screens showing a single eye.

In the 1990s, when webcams first hit the market, I made up some special slides (yes, 35-mm slides!) to illustrate a story I used in several speeches.

I began the story by asking a question that has always mystified me: What did my dog do when I wasn’t home? Well, I said, I could install a webcam in the hall and watch. But then maybe I should tie the webcam to the dog’s head so I could follow his field of view. In my presentation, I would then show a charming picture of my dog with the webcam attached to his head. Everyone would laugh.

Even better, I’d say, let’s pass a law that all dogs have to have GPS-enabled webcams. Then when anything interesting was happening in the world, you’d need only tune in to the nearest dog to watch. I’d then show another picture, this time of heads of state at a summit meeting, with my dog off to the side wearing his webcam.

Well, nearly two decades have passed, and we don’t need dogs now—we have people with smartphones. And people are everywhere. We’ve come a long way since some students at the University of Cambridge installed what we would now call a webcam in their coffee room. For a while back then, such public webcams were a fascination. I don’t think they are especially popular now—there is too much to do to watch nothing happening somewhere else.

But when something does happen, all those cameras constitute an evolving capability that we have yet to fully exploit or even understand. Unlike my proposed cam-enabled dogs, people choose their pictures and videos. For better and for worse, we can far outdo my imaginary dogcams.

Several billion people with cellphones are moving around in a world characterized by an evolving visual ubiquity that also includes increasingly dense real-time closed-circuit TV surveillance, satellite imagery, and street-view photography. It’s amazing how quickly all this has happened. It seems only a few years ago that our main visual connection to faraway places was National Geographic magazine.

A hint of what we can do with all this imagery was seen in the recent DARPA Network Challenge. Ten red weather balloons were moored at various random places within the continental United States. A US $40 000 prize would go to the first competitor to locate all the balloons. Amazingly, it took a team from MIT less than 9 hours. This feat seems extraordinary when you consider that there are about 10 million square kilometers in the country. The MIT group solved two problems: how to encourage a lot of people to help, and how to filter all the input.
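How did they manage it? As widely reported, the MIT team’s answer to the first problem was a recursive referral incentive: whoever found a balloon earned a reward, whoever had recruited that finder earned half as much, and so on up the chain of invitations. Below is a minimal sketch of that idea in Python; the names and dollar amounts are illustrative rather than the team’s exact implementation.

    # Sketch of a recursive referral incentive, along the lines widely
    # reported for the MIT balloon-hunting team. Each find pays the finder,
    # and progressively smaller rewards flow up the chain of recruiters.
    # Amounts and names here are illustrative.
    def payouts(referral_chain, base_reward=2000.0):
        """referral_chain lists names from the finder outward: the finder
        first, then the person who recruited the finder, and so on.
        Returns a dict mapping each name to its reward."""
        rewards = {}
        reward = base_reward
        for person in referral_chain:
            rewards[person] = rewards.get(person, 0.0) + reward
            reward /= 2.0  # each level up the chain gets half as much
        return rewards

    # Example: Alice spotted a balloon; Bob recruited Alice; Carol recruited Bob.
    print(payouts(["Alice", "Bob", "Carol"]))
    # {'Alice': 2000.0, 'Bob': 1000.0, 'Carol': 500.0}

The elegance of such a scheme is that it rewards recruiting helpers as well as spotting balloons, which is exactly the first of the two problems the MIT group had to solve.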

Another example is the work of Ushahidi, an organization that uses crowdsourcing to aid humanitarian efforts. Following the devastating 2010 earthquake in Haiti, Ushahidi began an open-source project to build an accurate crisis map of the country by compiling and integrating real-time reports from volunteers in local neighborhoods. Its contribution to the relief effort was remarkable and could not have been obtained any other way.

Ongoing technological developments promise to augment this visual ubiquity further. Face recognition, along with the ability to track individuals through crowds and across multiple cameras, as in airport security, still needs algorithmic work to run closer to real time and with less human labor. There are also efforts to stitch together the billions of amateur photos on the Web into a worldwide panorama.
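To give a flavor of that last effort, here is a minimal sketch of pairwise feature matching, the basic building block that photo-stitching systems of this kind rest on. It assumes OpenCV is available; the file names are placeholders.

    # Sketch: match local features between two amateur photos, the first
    # step toward figuring out how they overlap. Uses OpenCV ORB features;
    # the file names are placeholders.
    import cv2

    img1 = cv2.imread("tourist_photo_1.jpg", cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread("tourist_photo_2.jpg", cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)

    # Brute-force Hamming matcher with cross-checking for more reliable matches.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    print(f"{len(matches)} candidate correspondences between the two photos")
    # With enough good correspondences, one can estimate how the photos
    # overlap (for example, with cv2.findHomography) and place both into a
    # common panorama. Repeat across millions of photos and you have the
    # beginnings of a worldwide mosaic.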

After all these years, I still don’t have a home webcam. I’d like to, but I don’t know what to do with it. I could point it at my driveway, which stretches westward for some distance through trees, but frankly nothing ever happens there—just like on most of the planet, I imagine. Maybe once a year my camera might see the glint of a deer’s eyes. I know that there are companies that offer cloud-based services to filter out all the nothingness using change-detection algorithms, but I fear I’d be left with only an annual summary that says—to borrow a literary phrase—“all quiet on the western front.” But then, you never know. And maybe my dog does something interesting while I’m gone, too.
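For what it is worth, the filtering those services perform can be as simple as comparing successive frames. Here is a minimal sketch of frame-differencing change detection, assuming OpenCV and a hypothetical recording of my driveway; commercial offerings are presumably far more sophisticated.

    # Sketch: flag video frames where something actually changes, so a year
    # of driveway footage boils down to the few moments worth watching.
    # Simple frame differencing with OpenCV; file name and threshold are
    # illustrative.
    import cv2

    cap = cv2.VideoCapture("driveway.mp4")  # hypothetical recording
    ok, prev = cap.read()
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frame_idx += 1
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray, prev_gray)
        _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        if cv2.countNonZero(mask) > 0.01 * mask.size:  # >1% of pixels changed
            print(f"Something moved at frame {frame_idx}")  # a deer, perhaps
        prev_gray = gray

    cap.release()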

This article originally appeared in print as “Visual Ubiquity.”
