Tech Talk

Q&A: Bringing Chappie to Life with VFX Supervisor Chris Harvey

What would existence be like for the first sentient robot? Pretty eventful, according to the action science-fiction movie Chappie, which officially opens today in the United States.

Directed by Neill Blomkamp, who previously helmed 2009’s District 9 and 2013’s Elysium, Chappie is the story of an eponymous upgraded police robot, programmed to learn and experience emotion. Post-upgrade, Chappie must then find his place in the world, while fending off forces determined to destroy him.

The key character of Chappie is computer-generated, based on a performance on-set by the actor Sharlto Copley. Creating a believable robot that could engage an audience’s sympathies, and a near-future world for it to inhabit, was largely the responsibility of Chris Harvey, Chappie’s visual effects (VFX) supervisor. Harvey talked with me about how the movie’s creators approached portraying futuristic but believable technology:

Stephen Cass: What is the role of a VFX supervisor?

Chris Harvey: I oversee all of the visual effects in the film. So that includes planning, in pre-production, how we are going to shoot things that are going to need effects integrated later. Then during the shoot we’re making sure that we’re shooting it in the way that we need and acquiring the appropriate data. Depending on the director we can be very involved in what those visual effects will be.… Once you get into post [production], you’re right next to the director: Everything from the effects creative staff is coming to you for approval and then presentation to the director.

SC: How did you work with Blomkamp—what was your brief?

CH: [I started working with Blomkamp] almost 8 months before shooting began… The brief was “we need to create this robot that everyone has to believe is real from a visual perspective [and make him so that people can] connect and relate and emote with him.” That was the big challenge of this movie—we had to take it past just believing he’s present.

SC: How do you go about building a world that’s realistic, but still visually engaging? How do you make extrapolations about how things might look in the future?

CH: Every film is different, because every film has a different visual style. … But the key is always looking to the real world and finding references.… Even if [the thing on screen] is something fantastical that no-one’s ever seen before, not having something to ground it in the familiar reality of our world will always make it much harder to accept as being real.

Now, with Chappie we took that right to the extreme. There’s literally no part of Chappie that is just completely invented. Everything about his design is purpose-driven. There’s a function to every component. There’s a ridiculous amount of real-world material built into him. All of his joints and gears and all of that stuff comes from references from real robotics that exist today, so that he feels very, very tangible.

SC: What were the biggest technical challenges in making Chappie feel real and relatable, especially around actors on physical sets?

CH: All of the shooting happened in real locations, so we were on location the whole time. We shot Sharlto in all these scenes, but sometimes [he’s not exactly where he needs to be] so you have to erase him. No one likes to talk about that, but there was an army of people and all they did was just erase Sharlto out of the frame! That’s not new, but it’s certainly something that deserves acknowledgement. The animation was certainly a technical/creative achievement. We did not use motion capture like everyone thinks we did. We used Sharlto as reference, but the animators hand key-framed [the robot] on top of him.…

An interesting challenge was that [Chappie] is one character, but he evolves throughout the course of a scene. So even though you might think of him as one digital model or asset, there are 16 versions of him that we had to track from shot to shot throughout the whole film. [And, for example,] he has a battery light on his chest. And for that battery light we had to make sure it tracked the [correct] percentage from shot to shot.…

One thing we really tried to do a lot, in terms of making [Chappie] feel present in a scene, is physical interaction. We really didn’t shy away from that. If people wanted to hug him or touch him—we really encouraged the actors to interact with him as much as possible. We put these chains on him that could shake around and bump into things.... Those things are very, very hard to achieve, but the more of it that we could do, the more it sells that he’s there.

SC: Once, audiences accepted robots like Robby. Now expectations are higher—are those expectations continuing to rise?

CH: Tremendously so. There’s just such an onslaught of big visual effects films. Huge amounts of eye candy. So people are becoming educated. It’s not just films—that technology is making its way into games and into everyday life. People are much more educated on it and much less accepting to the point where there’s a lot of people out there who literally go in looking for the mistakes. Which I think is a shame! 

Google Tests First Error Correction in Quantum Computing

Quantum computers won’t ever outperform today’s classical computers unless they can correct for errors that disrupt the fragile quantum states of their qubits. A team at Google has taken the next huge step toward making quantum computing practical by demonstrating the first system capable of correcting such errors.

Google’s breakthrough originated with the hiring of a quantum computing research group from the University of California, Santa Barbara, in the autumn of 2014. The UCSB researchers had previously built a system of superconducting quantum circuits that performed with enough accuracy to make error correction a possibility. That earlier achievement paved the way for the researchers—many now employed at Google—to build a system that can correct the errors that naturally arise during quantum computing operations. Their work is detailed in the 4 March 2015 issue of the journal Nature.

“This is the first time natural errors arising from the qubit environment were corrected,” said Rami Barends, a quantum electronics engineer at Google. “It’s the first device that can correct its own errors.”
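The idea behind such schemes is easiest to see in a classical toy model. Below is a minimal Python sketch of a repetition code with majority-vote decoding; it only illustrates the logic of redundancy-based error correction, not the superconducting-qubit circuits and parity measurements the Google team actually used, and the error rate and code size are illustrative assumptions.

```python
import random

def encode(bit, n=3):
    """Encode one logical bit as n redundant copies (repetition code)."""
    return [bit] * n

def apply_noise(codeword, p_flip=0.1):
    """Flip each physical bit independently with probability p_flip."""
    return [b ^ 1 if random.random() < p_flip else b for b in codeword]

def decode(codeword):
    """Majority vote recovers the logical bit if fewer than half the bits flipped."""
    return int(sum(codeword) > len(codeword) // 2)

# Monte Carlo estimate: the logical error rate falls well below the physical rate.
trials = 100_000
p_flip = 0.1
errors = sum(decode(apply_noise(encode(0), p_flip)) != 0 for _ in range(trials))
print(f"physical error rate: {p_flip:.3f}")
print(f"logical error rate : {errors / trials:.4f}")
```

With three copies and a 10 percent flip rate, the decoded bit is wrong only when two or more copies flip, so the logical error rate drops to roughly 3 percent; adding more redundancy pushes it lower still, which is the basic bargain any error-correcting code offers.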


An Unhackable QR Code to Fight Bogus Chips

To combat the rising threat posed by counterfeit microchips, researchers from the University of Connecticut suggest that the QR codes often used in ads and signage could be adapted into security tags that are nearly impossible to hack.

The UConn researchers claim that their new codes are very difficult to duplicate and decrypt, and could potentially help stem the flood of counterfeit electronics. They detailed their findings in the February issue of IEEE Photonics Journal.

Counterfeit electronics are on the rise, and could pose dangers to national security. For instance, a 2011 Senate Armed Services Committee hearing noted an increasing number of counterfeiting incidents were seen in the Department of Defense supply chain, from 3,868 incidents in 2005 to 9,356 incidents in 2008, mostly involving chips that came from China. In one key case, mission computers for high-altitude missiles used to destroy incoming missiles contained suspect counterfeit memory devices, a problem that cost nearly $2.7 million to fix.


A Drone with Bug Vision

Almost anything that flies, be it a plane, a spacecraft, or a drone, has an inertial navigation system with accelerometers and gyroscopes that control yaw, pitch, and roll, and thus the flight path. Flying insects like bees, however, don't have inertial systems to guide them; they rely exclusively on what they see. This has inspired two researchers at Aix-Marseille University in France to build a drone that imitates the way these insects navigate. Their goal was to design a craft that could fly and avoid obstacles by relying solely on visual cues. In a paper published in the 26 February issue of Bioinspiration & Biomimetics, Fabien Expert and Franck Ruffier describe how a drone, which they call BeeRotor, was able to traverse a circular tunnel, avoiding crashes and obstacles.

The tiny craft, which was guided by panoramic optic flow (OF) sensors and no inertial navigation, weighs 80 grams and is 47 centimeters long. It was attached to the end of a freely rotating arm at the center of a circular tunnel. Two rotors kept the drone aloft. The ceiling and the floor of the tunnel were covered with photographs of natural surfaces, and the speed at which the details of these photographs passed under or above the drone was continuously monitored by the panoramic optic flow sensors’ 26 photodiodes.

The principle of the optic flow guiding mechanism is quite simple, explains Ruffier. The robot does not measure its own speed or distance from the floor, but if the details of the underlying structure pass by too fast, the optic flow sensor triggers a feedback system that increases the speed of the rotors, moving the craft away from the floor, he says. The “eye” has four optic flow sensors, two directed toward the front of the craft and two that look backwards. Each sensor is equipped with a lens that focuses the image on six photodiodes that record the speed of a passing element as its image transfers from one pixel to the next. The eye keeps itself aligned with the nearest surface. “Orienting its eye allows the robot to avoid slopes of up to 30 degrees,” says Ruffier.
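For readers who want to see that loop written out, here is a minimal Python sketch of the optic-flow regulation principle described above. The setpoint, gain, and simplified physics are illustrative assumptions, not BeeRotor's actual parameters; the point is only that holding the measured flow near a setpoint regulates height without the craft ever measuring its speed or altitude directly.

```python
# Toy simulation of optic-flow regulation: the craft never measures its speed
# or its height; it only keeps the measured optic flow near a setpoint.
# All numbers below (setpoint, gain, physics) are illustrative assumptions.

FORWARD_SPEED = 1.0   # m/s, assumed constant
FLOW_SETPOINT = 2.0   # rad/s of ground-feature motion the craft tries to hold
GAIN = 0.5            # proportional gain of the feedback loop
DT = 0.05             # control-loop time step, in seconds

def optic_flow(speed, height):
    """Ventral optic flow: angular speed of ground details, roughly v / h."""
    return speed / max(height, 0.01)

height = 0.3  # start close to the floor (meters)
for step in range(100):
    flow = optic_flow(FORWARD_SPEED, height)
    # Details rushing past too fast -> climb; too slow -> descend.
    error = flow - FLOW_SETPOINT
    climb_rate = GAIN * error
    height += climb_rate * DT
    if step % 20 == 0:
        print(f"t={step * DT:4.2f}s  height={height:.3f} m  flow={flow:.2f} rad/s")

# The height settles near FORWARD_SPEED / FLOW_SETPOINT = 0.5 m, even though
# the controller never knows the craft's speed or altitude.
```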

The team is now working on a rotorcraft that will be able to leave its perch and fly around freely. This will require two additional optic flow sensor systems. They will control the roll and yaw so that the craft can veer to the right or left when it is approaching an obstacle.

What is the need for drones guided by optic flow sensors if inertial navigation systems have done the same job well for decades? Drone manufacturers are asking that question, but there is interest from the aerospace industry, especially in Europe, says Ruffier. The probes that land on the Moon, Mars, or comets contain inertial systems that represent about 20 percent of their weight. Space agencies, Ruffier says, are interested in visual navigation systems because they will be much lighter. And even if they don’t replace inertial systems on spacecraft, they could act as backup systems poised to save space missions, he adds.

Android Phone's Battery Use Can Reveal User Location

Most Android smartphone owners probably feel secure knowing that apps must ask permission to access their location. That sense of security is misplaced, say U.S. and Israeli researchers who have figured out how to track smartphone owners based on a mobile device’s battery use alone.


ISPs, Lobbyists, and Everybody Else React to FCC Net Neutrality Decision

Reactions to the Federal Communications Commission's net neutrality ruling are as sharply divided as the 3-2 party-line vote at a February 26 meeting streamed live on the web. “Today the FCC has taken historic action to protect the Internet for a next generation of Americans online," proclaimed Alan Davidson, Vice President of the nonprofit New America and Director of its Open Technology Institute. "This order imposes intrusive government regulations to solve a problem that doesn't exist," said Republican commissioner Ajit Pai before casting his nay vote.


FCC Votes "Yes" on Net Neutrality

As expected, the Federal Communications Commission today approved proposed Net Neutrality rules by a 3 to 2 vote. Two Democrats joined FCC chairman Tom Wheeler in voting for the rules. Two Republicans dissented at great length. The audience at the open meeting greeted the approval with loud applause. The rules apply to both wireless and fixed broadband. 

At the heart of the FCC proposal are three "bright line rules" embraced by net neutrality advocates. They bar broadband providers from blocking or throttling legal content and services transmitted over the Internet, and prohibit providers from charging content or service providers such as Netflix a premium for high-speed connections. 


FCC Gives Municipal Broadband Providers (and Internet Competition) a Boost

Today, in addition to reclassifying broadband from an “information service” to a “telecommunication service,” the Federal Communications Commission voted to preempt state laws in Tennessee and North Carolina that previously restricted municipal governments from expanding their broadband services and competing with commercial Internet service providers in surrounding areas. The decision was not a surprise, as FCC Chairman Tom Wheeler has been talking about preempting such state laws for over a year.

There are at least 19 states with laws that restrict or limit municipal broadband. Just last month, Missouri lawmakers proposed strengthening existing restrictions in their state. Today’s decision only applied to the two specific petitions filed with the FCC—one from The Electric Power Board of Chattanooga, Tennessee, and a second from the city of Wilson, North Carolina—but it may set a precedent that encourages other municipalities to submit petitions. In the meeting, Wheeler said, “I do hope, however, that this attention…calls out the activities of incumbents to block consumer choice and competition through legislation.”


Review: Is Your Data Worth a RAID from Western Digital?

Hearing someone tell you that you should back up your data is, in the words of one IEEE Spectrum editor, "about as impactful as your dentist telling you to floss." It's not fun, it's not exciting, and the short-term consequences are usually nonexistent, so we often don't do it. You know what, though? Teeth are replaceable. Your data aren't. Back your stuff up. And if you're looking for a place to start, Western Digital has some ideas, just announced today.


How to Make Multicore Chips Faster, More Efficient

Although transistors continue to get smaller and more numerous on each microchip, they have stopped getting faster: pushing clock speeds any higher would make chips too hot to operate. To continue improving electronics, chipmakers are instead giving chips more processing units, or cores, to execute computations in parallel.

The way in which a chip distributes its operations can make a big difference to performance. In 2013, MIT electrical engineer and computer scientist Daniel Sanchez and his colleagues showed a way to distribute data around the memory banks of multicore chips that could improve speed by about 18 percent on average.

Now, in simulations involving a 64-core chip, Sanchez and his colleagues find that a new way to distribute both computations and data on such a chip can boost computational speeds by 46 percent and reduce power consumption by 36 percent.
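The intuition behind that kind of gain can be shown with a toy cost model. The Python sketch below is not the MIT placement algorithm; it simply assumes a 64-core mesh in which memory accesses get more expensive with distance, and compares random data placement against placing each shared block in the bank that minimizes its users' total hops.

```python
import random

MESH = 8          # 8 x 8 mesh = 64 cores, one memory bank per core (an assumption)
N_BLOCKS = 200    # toy data blocks, each shared by two cores

def dist(a, b):
    """Manhattan distance between cores a and b on the mesh."""
    return abs(a % MESH - b % MESH) + abs(a // MESH - b // MESH)

random.seed(1)
# Each block is accessed by two cores (think producer and consumer threads).
sharers = [(random.randrange(64), random.randrange(64)) for _ in range(N_BLOCKS)]

def random_cost(a, b):
    """The block lands in an arbitrary bank; both sharers pay the hop count."""
    bank = random.randrange(64)
    return dist(a, bank) + dist(b, bank)

naive = sum(random_cost(a, b) for a, b in sharers)

# Locality-aware placement: pick the bank that minimizes the sharers' total hops.
smart = sum(min(dist(a, bank) + dist(b, bank) for bank in range(64))
            for a, b in sharers)

print(f"average hops per access, random placement : {naive / (2 * N_BLOCKS):.2f}")
print(f"average hops per access, locality-aware   : {smart / (2 * N_BLOCKS):.2f}")
```

Running this shows locality-aware placement cutting the average hop count by roughly half, which is the same basic lever, co-locating computations with the data they touch, that the MIT work exploits far more cleverly across real workloads.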
