Beyond Pokémon GO: The Secret to a Better Augmented Reality Experience

Software that can analyze tiny motions in video allows real objects to be dynamically simulated in augmented reality

With interactive dynamic video, Pokémon characters can be made to appear to interact with the real world instead of merely overlaying it. Gif: Abe Davis/IEEE Spectrum

Whether or not you understand the recent drive to fill the world around you with obnoxious animated characters that you can only see as long as you hold your phone up in front of your face at all times, augmented reality does have the potential to enhance our world in ways that are occasionally useful. However, the AR experience is currently a sterile one, with augmentations overlaid on top of, but not really a part of, the underlying reality.

Abe Davis is a graduate student at MIT whom we've written about before in the context of using a candy wrapper and a camera as a microphone. He has also produced a more intimate video introduction of himself that's worth watching, but if you're not a fan of incredibly nerdy rap music, we'll just move on to his doctoral dissertation, which describes interactive dynamic video (IDV). Rather than using 3D graphics to model the motion characteristics of objects, IDV extracts motion information from a small amount of 2D video and then generates simulations of the objects in that video. This allows the augmented part of AR to interact directly with the reality part, turning static objects into objects that you (or your virtual characters) can play with.

To understand how this works, imagine a simple moving object, like a bedsheet hanging on a clothesline in a gentle breeze. As the wind blows, the sheet will ripple, and those ripples will consist of a horizontal component (movement across the sheet) as well as a vertical component (movement up or down the sheet). Once you've figured out these motion components, called resonant or vibration modes, you can simulate them individually or combine them in ways that mimic a breeze blowing from a different direction or at a different strength. The fundamental knowledge you gain about how the real sheet moves at a very basic level allows you (and a real-time video editor) to make the sheet in the video move in a realistic way as well, without being constrained by reality itself.
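The bedsheet analogy can be sketched in a few lines of code. This is purely an illustrative toy, not Davis's actual pipeline: we represent the sheet's motion as a weighted sum of two hypothetical mode shapes (the shapes, frequencies, and amplitudes below are all made-up assumptions), then "simulate" a different breeze simply by recombining the same modes with new weights.

```python
import numpy as np

# Toy modal superposition for a 1-D "bedsheet" (illustrative only).
n_points = 50
x = np.linspace(0, 1, n_points)          # positions across the sheet

# Hypothetical mode shapes (assumed, not measured from any video):
mode1 = np.sin(np.pi * x)                # fundamental: whole sheet sways together
mode2 = np.sin(2 * np.pi * x)            # second mode: halves move in opposition

def sheet_motion(t, a1, a2, f1=1.0, f2=2.3):
    """Displacement at time t as a weighted sum of the two vibration modes."""
    return (a1 * np.cos(2 * np.pi * f1 * t) * mode1
            + a2 * np.cos(2 * np.pi * f2 * t) * mode2)

# "Observed" gentle breeze: mostly the fundamental mode.
observed = sheet_motion(0.1, a1=1.0, a2=0.2)

# Simulated gustier breeze: same modes, different weights -- no physics
# engine or 3D model needed once the modes are known.
simulated = sheet_motion(0.1, a1=0.4, a2=1.5)

print(observed.shape)   # (50,)
```

The key point the sketch illustrates is that once the modes are identified, new motion is just a new mix of the same ingredients.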

This method can be applied to much more complicated objects than bedsheets, although it gets trickier to pick out all of the vibration modes. For it to work properly, you need a stable, baseline video of the scene, you need to move the object that you're interested in simulating, and you need to watch it move for long enough to accurately extract the vibration modes that you care about, which may take a minute or two. Unfortunately, this means that using it for Pokémon GO is probably not realistic unless you have a lot more patience and restraint than the typical Pokémon GO player seems to have, because you can't just wander around and point your phone at physical objects to animate them instantly (yet).
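The "watch it move for long enough" requirement comes down to frequency resolution: picking resonant modes out of a tracked point's motion takes many cycles of observation. The sketch below is a hedged stand-in for that idea (the signal is synthetic and the frequencies, frame rate, and noise level are all invented assumptions): it recovers two "vibration modes" from a minute of simulated 30-fps displacement data with a Fourier transform.

```python
import numpy as np

# Why a minute or two of video helps: recovering resonant frequencies from
# a tracked point's displacement signal. Synthetic data, illustrative only.
fps = 30.0
duration = 60.0                          # one minute of "video"
t = np.arange(0, duration, 1 / fps)

# Fake displacement of one tracked point: two modes plus camera noise.
rng = np.random.default_rng(0)
signal = (1.0 * np.sin(2 * np.pi * 1.5 * t)      # strong mode at 1.5 Hz
          + 0.5 * np.sin(2 * np.pi * 4.0 * t)    # weaker mode at 4.0 Hz
          + 0.1 * rng.standard_normal(t.size))   # sensor/tracking noise

# Spectrum of the motion: peaks mark the vibration modes. A longer
# recording gives finer frequency bins (resolution = fps / num_frames).
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, d=1 / fps)

# The two strongest non-DC peaks are the recovered modes.
peaks = freqs[np.argsort(spectrum[1:])[-2:] + 1]
modes = np.sort(peaks).round(1).tolist()
print(modes)   # [1.5, 4.0]
```

With only a second or two of footage, the bins would be too coarse and the noise too dominant to separate nearby modes cleanly, which is why a quick phone-waving capture isn't enough.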

As curmudgeonly as we are about whatever silly games kids (and some adults) are playing these days, the underlying technology here is very cool, and there's potential for many different applications. Generally, modeling a real-world object for any purpose, from civil engineering analysis to making animated movies, requires first making a detailed 3D model of that object and then applying a physics engine to get it to move. With interactive dynamic video, you can skip the complicated and time-consuming 3D modeling step and create a virtual model with reality-based physics. Such a model may not offer the same range of movement as a full 3D model, but that's a reasonable trade-off for only having to spend a minute or two with a video camera and a tripod to create it.

While Davis has no immediate plans to commercialize any of this (he's heading to Stanford for his postdoc in the fall), MIT has a patent on the technique, and it's not difficult for Davis to speculate about where we might see IDV in the near future.

“[W]hen you look at VR companies like Oculus, they are often simulating virtual objects in real spaces," he says. "This sort of work turns that on its head, allowing us to see how far we can go in terms of capturing and manipulating real objects in virtual space.”
