How Neuromorphic Image Sensors Steal Tricks From the Human Eye

By prioritizing the dynamic parts of a scene, machines can capture images more efficiently

9 min read
Photo-illustration: The Voorhes

When Eadweard Muybridge set up his cameras at Leland Stanford’s Palo Alto horse farm in 1878, he could scarcely have imagined the revolution he was about to spark. Muybridge rigged a dozen or more separate cameras using trip wires so that they triggered in a rapid-fire sequence that would record one of Stanford’s thoroughbreds at speed. The photographic results ended a debate among racing enthusiasts, establishing that a galloping horse briefly has all four legs off the ground—although it happens so fast it’s impossible for anyone to see. More important, Muybridge soon figured out how to replay copies of the images he took of animal gaits in a way that made his subjects appear to move.

Generations of film and video cameras, including today’s best imaging systems, can trace their lineage back to Muybridge’s boxy cameras. Of course, modern equipment uses solid-state detectors instead of glass plates, and the number of frames that can be taken each second is vastly greater. But the basic strategy is identical: You capture a sequence of still images, which when played back rapidly gives the viewer the illusion of motion.
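The contrast the headline draws—frame-based capture versus an eye-like sensor that reports only what changes—can be sketched in a few lines of code. The toy model below is an illustrative assumption, not any vendor's API: each pixel emits an "event" only when its log intensity changes by more than a threshold, the way a dynamic vision sensor's pixels do, so a static background produces no data at all. The function name `frames_to_events`, the threshold value, and the test scene are all hypothetical.

```python
import math

def frames_to_events(frames, threshold=0.2):
    """Toy event-camera model: instead of recording every pixel of every
    frame, emit an event only where a pixel's log intensity changes by
    more than a threshold -- i.e., only the dynamic parts of the scene."""
    h, w = len(frames[0]), len(frames[0][0])
    # Per-pixel reference level, initialized from the first frame.
    ref = [[math.log1p(frames[0][r][c]) for c in range(w)] for r in range(h)]
    events = []  # (frame_index, row, col, polarity)
    for t in range(1, len(frames)):
        for r in range(h):
            for c in range(w):
                logv = math.log1p(frames[t][r][c])
                diff = logv - ref[r][c]
                if abs(diff) > threshold:
                    events.append((t, r, c, 1 if diff > 0 else -1))
                    ref[r][c] = logv  # reset reference where an event fired
    return events

# A mostly static 4x4 scene with one bright pixel stepping to the right:
# only the changing pixels generate data, not the 48 pixel samples a
# three-frame video would store.
frames = [[[0] * 4 for _ in range(4)] for _ in range(3)]
frames[1][2][2] = 255
frames[2][2][3] = 255
events = frames_to_events(frames)
# events: [(1, 2, 2, 1), (2, 2, 2, -1), (2, 2, 3, 1)]
```

Three events describe the whole sequence: the pixel at (2, 2) brightening, then dimming as the spot moves on, then (2, 3) brightening. That sparsity is the efficiency the subtitle refers to.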

Boston Dynamics AI Institute Targets Basic Research

Hyundai’s new robotics venture recalls Bell Labs’ and Xerox PARC’s glory days

4 min read
Photo-illustration: IEEE Spectrum; Photos: Boston Dynamics

This morning, Hyundai Motor Group and Boston Dynamics announced the launch of the Boston Dynamics AI Institute, to “spearhead advancements in artificial intelligence and robotics.” BDAII (I guess we’ll have to get used to that acronym!) will be located in Cambridge, Mass., with more than US $400 million of initial investment from Hyundai (Boston Dynamics’ parent company) and BD itself to get things started. Heading up the whole thing will be Boston Dynamics founder Marc Raibert himself, with Al Rizzi (Boston Dynamics’ chief scientist) as chief technology officer.

With Raibert in charge and $400 million behind it, the institute is betting that the kind of long-horizon basic research once done at Bell Labs and Xerox PARC can thrive again inside a corporate structure.


Where the President-Elect Candidates Stand on Key Issues

The four weigh in on climate change, education programs, and diversity

6 min read
Life Fellow Thomas Coughlin, Senior Members Kathleen Kramer and Maike Luiken, and Life Fellow Kazuhiro Kosuge are running for 2023 President-Elect.

Steve Schneider

Two virtual events were held in June and July for members to get to know the four candidates running for 2023 IEEE president-elect. President Ray Liu asked Thomas M. Coughlin, Kazuhiro Kosuge, Kathleen A. Kramer, and Maike Luiken questions submitted by members on issues important to them.

The candidates were asked about their plans for increasing diversity, equity, and inclusion at IEEE; expanding science, technology, engineering, and math education programs; and ways to attract and retain members. They also spoke about IEEE’s role in addressing the global climate crisis.


Modeling Microfluidic Organ-on-a-Chip Devices

Register for this webinar to enhance your modeling and design processes for microfluidic organ-on-a-chip devices using COMSOL Multiphysics

1 min read
Comsol

If you want to enhance your modeling and design processes for microfluidic organ-on-a-chip devices, tune into this webinar.

You will learn methods for simulating the performance and behavior of microfluidic organ-on-a-chip devices and microphysiological systems in COMSOL Multiphysics. Additionally, you will see how to couple multiple physical effects in your model, including chemical transport, particle tracing, and fluid–structure interaction. You will also learn how to distill simulation output to find key design parameters and obtain a high-level description of system performance and behavior.
