Spectrum editor is motion captured and turned into a guy at Game Developers Conference

I thought I was in my office yesterday, working on articles for upcoming Spectrum issues. Instead, I apparently was at the Game Developers Conference in San Francisco, where I was being featured at the Mova booth.

“Lots of journalists recognized you,” Mova founder Steve Perlman told me later.

I guess that’s a good thing. Except apparently instead of being my normal short, female self, I was a big, bald guy.

Let me explain.

Last year, around this time, I was editing an article by Eric Pavey of Electronic Arts on advances in computer graphics that are leading to realistic digital humans. In particular, Pavey talked about techniques for translating facial movements into digital images that could then be manipulated. I decided to get myself “captured” by a new system still under development, Contour, from Mova, a company incubated by Perlman’s Rearden Companies. It was a fascinating process, involving banks of cameras, phosphorescent paint, and a director with a clapboard. I felt like a star, and wrote about it.

And then promptly forgot about it. Mova, however, saved the digital me, and used it to show off its new “retargeting” tool at the Game Developers Conference.

Retargeting allows one actor’s performance to drive another actor’s face. In this demo (above), Contour took my original performance (left), made it into a digital mesh (right), and then took a few images of an actor named David. The technicians aligned David’s face with mine, and then the Contour system used the sequence of motions in my digital mesh to move the image of David’s face. The movements of my gaze, recorded by an eye-tracking tool, directed David’s eye motion. The results are distinctly odd, as you can see. That’s my voice, those are my expressions, but that sure isn’t me.
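Mova hasn’t published how Contour’s retargeting actually works, but the basic idea described above can be sketched as a simple delta transfer: record how each point of the source face moves away from its neutral pose, then apply those same displacements to a different face’s neutral pose. This is a toy illustration under the assumption that both meshes share the same vertex layout, not Contour’s real method:

```python
import numpy as np

def retarget(source_neutral, source_frames, target_neutral):
    """Transfer a captured performance from one face mesh to another.

    source_neutral: (V, 3) array, source face at rest
    source_frames:  (F, V, 3) array, captured source performance
    target_neutral: (V, 3) array, target face at rest (same topology assumed)
    returns:        (F, V, 3) array, the target face driven by the
                    source's motion
    """
    # Per-frame displacement of every vertex from the neutral pose:
    # this is the "performance" with the source's identity removed.
    deltas = source_frames - source_neutral
    # Re-apply that motion on top of the target's neutral geometry.
    return target_neutral + deltas

# Tiny example: a two-frame "performance" on a three-vertex mesh.
src_neutral = np.zeros((3, 3))
src_frames = np.stack([np.full((3, 3), 0.1), np.full((3, 3), 0.2)])
tgt_neutral = np.ones((3, 3))
out = retarget(src_neutral, src_frames, tgt_neutral)
```

Real systems must also handle faces with different shapes and proportions, which is where aligning David’s face with mine before transferring the motion comes in.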

What's the point? Well, suppose a movie director wanted to add a new scene long after shooting stopped and the lead actor had moved on to other projects and was not available. The director could use another actor's motion, and the lead actor's image. Or, more commonly, in translating a movie to a videogame, game developers would be able to use realistic digital facsimiles of the original actors, but would not need those actors to record entire motion sequences.

Perlman still can’t say exactly when Contour-generated faces will hit the big screen; he’s at the mercy of movie company nondisclosures. But he says he’s working with a number of major studios and A-list actors, and the results will likely premiere this year, in both movies and videogames.

Tech Talk

IEEE Spectrum’s general technology blog, featuring news, analysis, and opinions about engineering, consumer electronics, and technology and society, from the editorial staff and freelance contributors.
