Tech Talk

NASA's Dawn Makes History by Orbiting a Dwarf Planet

NASA’s Dawn spacecraft has become the first to orbit a dwarf planet. It recently began circling Ceres, the largest known body in the main asteroid belt, which lies between Mars and Jupiter.


Draw Biosensors on Your Skin

Someday soon, on-demand diagnostics could be as simple as doodling on your arm or leg. Special sensor-laden inks would help diabetics monitor their blood sugar and allow people to stay on top of other elements of their body chemistry. The write-once, read-several-times inks could also let homeowners test for toxic pollutants, and help soldiers detect explosives and nerve agents on the battlefield.

Researchers at the University of California, San Diego (UCSD) who developed the inks published their results in the 26 February issue of the journal Advanced Healthcare Materials. The main ingredients of these inks are the enzymes glucose oxidase, which responds to blood glucose, and tyrosinase, which responds to common pollutants known as phenols. To make the bio-inks serve as electrodes, the researchers added electrically conductive graphite powder. They also added chitosan, a clotting agent used in bandages, to help the ink stick to surfaces; xylitol, a sugar substitute, to stabilize the enzymes during their chemical reactions; and biocompatible polyethylene glycol, used in several drug-delivery applications, to bind all these ingredients together.

The scientists filled off-the-shelf ballpoint pens with their enzymatic inks. With one such pen, they could create a blood glucose sensor simply by drawing with glucose-sensitive ink on a transparent, flexible strip that included an electrode. When a drop of blood was placed on the sensor, the ink reacted with the glucose in it, and the sensor measured the reaction.
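
To get a feel for how such a readout might work in software: enzymatic glucose electrodes produce a current roughly proportional to glucose concentration, so turning a raw current into a reading can be a one-line calibration. The sketch below is a minimal illustration under that assumption; the sensitivity and baseline constants are invented, not taken from the UCSD paper.

```python
# Hypothetical sketch: converting an amperometric glucose-sensor current
# into a concentration estimate. The calibration constants below are
# invented for illustration, not drawn from the UCSD study.

SENSITIVITY_UA_PER_MM = 0.5   # assumed sensitivity, microamps per (mmol/L)
BASELINE_CURRENT_UA = 0.1     # assumed background current with no glucose

def glucose_mmol_per_l(measured_current_ua: float) -> float:
    """Estimate glucose concentration from the measured electrode current."""
    signal = max(measured_current_ua - BASELINE_CURRENT_UA, 0.0)
    return signal / SENSITIVITY_UA_PER_MM

# Example: a 2.6 microamp reading maps to (2.6 - 0.1) / 0.5 = 5.0 mmol/L,
# which is in the normal fasting range.
print(glucose_mmol_per_l(2.6))
```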

The UCSD team demonstrated that these biosensors were reusable. They only had to wipe the strips clean and draw more ink on them.

But the eye-popping achievement they reported was that the inks could eliminate painful finger pricks: sensors scrawled directly on the skin could take glucose readings through it, the researchers said in the paper detailing their findings, and deliver the results via a Bluetooth device. The investigators estimated that one pen held enough ink to write the equivalent of roughly 500 blood glucose sensor strips.

The pens could also be used to draw pollution-detecting sensors on leaves with the phenol-sensitive ink. The scientists noted that the inks could be modified to react with many other pollutants, such as heavy metals or pesticides, and could be drawn on a variety of surfaces, such as smartphone cases and building windows.

The UCSD researchers noted that the pens could let people build useful sensors cheaply and easily, wherever and whenever they are needed, without knowing in advance where sensors might be required. The group’s next steps include connecting the handwritten sensors wirelessly to monitoring devices and analyzing how they perform in difficult conditions such as extreme temperatures, varying humidity, and extended exposure to sunlight.

Q&A: Bringing Chappie to Life with VFX Supervisor Chris Harvey

What would existence be like for the first sentient robot? Pretty eventful, according to the action science-fiction movie Chappie, which officially opens today in the United States.

Directed by Neill Blomkamp, who previously helmed 2009’s District 9 and 2013’s Elysium, Chappie tells the story of its eponymous robot, an upgraded police droid programmed to learn and experience emotion. Post-upgrade, Chappie must find his place in the world while fending off forces determined to destroy him.

The key character of Chappie is computer-generated, based on an on-set performance by the actor Sharlto Copley. Creating a believable robot that could engage an audience’s sympathies, and a near-future world for it to inhabit, was largely the responsibility of Chris Harvey, Chappie’s visual effects (VFX) supervisor. Harvey talked with me about how the movie’s creators approached portraying futuristic but believable technology:

Stephen Cass: What is the role of a VFX supervisor?

Chris Harvey: I oversee all of the visual effects in the film. So that includes planning, in pre-production, how we are going to shoot things that are going to need effects integrated later. Then during the shoot we’re making sure that we’re shooting it in the way that we need and acquiring the appropriate data. Depending on the director, we can be very involved in what those visual effects will be.… Once you get into post [production], you’re right next to the director: Everything from the effects creative staff is coming to you for approval and then presentation to the director.

SC: How did you work with Blomkamp—what was your brief?

CH: [I started working with Blomkamp] almost 8 months before shooting began… The brief was “we need to create this robot that everyone has to believe is real from a visual perspective [and make him so that people can] connect and relate and emote with him.” That was the big challenge of this movie—we had to take it past just believing he’s present.

SC: How do you go about building a world that’s realistic, but still visually engaging? How do you make extrapolations about how things might look in the future?

CH: Every film is different, because every film has a different visual style. … But the key is always looking to the real world and finding references.… Even if [the thing on screen] is something fantastical that no one’s ever seen before, not having something to ground it in the familiar reality of our world will always make it much harder to accept as being real.

Now, with Chappie we took that right to the extreme. There’s literally no part of Chappie that is just completely invented. Everything about his design is purpose-driven. There’s a function to every component. There’s a ridiculous amount of real-world material built into him. All of his joints and gears and all of that stuff come from references to real robotics that exist today, so that he feels very, very tangible.

SC: What were the biggest technical challenges in making Chappie feel real and relatable, especially around actors on physical sets?

CH: All of the shooting happened in real locations, so we were on location the whole time. We shot Sharlto in all these scenes, but sometimes [he’s not exactly where he needs to be] so you have to erase him. No one likes to talk about that, but there was an army of people and all they did was just erase Sharlto out of the frame! That’s not new, but it’s certainly something that deserves acknowledgement. The animation was certainly a technical/creative achievement. We did not use motion capture like everyone thinks we did. We used Sharlto as reference, but the animators hand key-framed [the robot] on top of him.…

An interesting challenge was that [Chappie] is one character, but he evolves throughout the course of a scene. So even though you might think of him as one digital model or asset, there are 16 versions of him that we had to track from shot to shot throughout the whole film. [And, for example,] he has a battery light on his chest. And for that battery light we had to make sure it tracked the [correct] percentage from shot to shot.…
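
The bookkeeping problem Harvey describes, matching each shot to the right one of the 16 Chappie variants and to a battery-light value consistent with its neighbors, is essentially a continuity database. The sketch below is purely hypothetical; the shot IDs, version numbers, and battery levels are all invented for illustration.

```python
# Hypothetical sketch of shot-to-shot continuity tracking for an evolving
# digital character. All shot IDs, versions, and battery levels are invented.
from dataclasses import dataclass

@dataclass
class ShotRecord:
    shot_id: str            # editorial shot identifier (invented format)
    chappie_version: int    # which of the 16 evolving asset variants to load
    battery_percent: float  # value the chest battery light must display

# A continuity sheet in cut order:
continuity = [
    ShotRecord("sc042_sh010", chappie_version=3, battery_percent=78.0),
    ShotRecord("sc042_sh020", chappie_version=3, battery_percent=77.5),
    ShotRecord("sc043_sh010", chappie_version=4, battery_percent=75.0),
]

def check_battery_continuity(shots: list) -> None:
    """Flag shots where the battery level rises between consecutive cuts,
    which would be a continuity error unless a recharge is scripted."""
    for prev, cur in zip(shots, shots[1:]):
        if cur.battery_percent > prev.battery_percent:
            print(f"Continuity warning: battery jumps up at {cur.shot_id}")

check_battery_continuity(continuity)
```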

One thing we really tried to do a lot, in terms of making [Chappie] feel present in a scene, is physical interaction. We really didn’t shy away from that. If people wanted to hug him or touch him—we really encouraged the actors to interact with him as much as possible. We put these chains on him that could shake around and bump into things.... Those things are very, very hard to achieve, but the more of it that we could do, the more it sells that he’s there.

SC: Once, audiences accepted robots like Robby the Robot. Now expectations are higher—are those expectations continuing to rise?

CH: Tremendously so. There’s just such an onslaught of big visual effects films. Huge amounts of eye candy. So people are becoming educated. It’s not just films—that technology is making its way into games and into everyday life. People are much more educated on it and much less accepting to the point where there’s a lot of people out there who literally go in looking for the mistakes. Which I think is a shame! 

Google Tests First Error Correction in Quantum Computing

Quantum computers won’t ever outperform today’s classical computers unless they can correct for errors that disrupt the fragile quantum states of their qubits. A team at Google has taken the next huge step toward making quantum computing practical by demonstrating the first system capable of correcting such errors.

Google’s breakthrough originated with the hiring of a quantum computing research group from the University of California, Santa Barbara in the autumn of 2014. The UCSB researchers had previously built a system of superconducting quantum circuits that performed with enough accuracy to make error correction a possibility. That earlier achievement paved the way for the researchers, many of whom are now employed at Google, to build a system that can correct the errors that naturally arise during quantum computing operations. Their work is detailed in the 4 March 2015 issue of the journal Nature.

“This is the first time natural errors arising from the qubit environment were corrected,” said Rami Barends, a quantum electronics engineer at Google. “It’s the first device that can correct its own errors.”
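
The simplest way to see what error correction buys is the classical bit-flip repetition code, a distant ancestor of the schemes used in superconducting-qubit experiments like this one: store a logical bit redundantly and undo occasional flips by majority vote. The Monte Carlo sketch below is purely illustrative; the flip probability is invented, and nothing here models the actual Google hardware, which must detect errors without directly reading out its data qubits.

```python
# Minimal classical sketch of the bit-flip repetition-code idea:
# encode -> noise -> majority-vote decode. Illustrative only; real quantum
# error correction measures parity syndromes rather than the data itself.
import random

def encode(bit: int, n: int = 3) -> list:
    """Encode one logical bit as n redundant physical bits."""
    return [bit] * n

def apply_noise(bits: list, p_flip: float) -> list:
    """Flip each physical bit independently with probability p_flip."""
    return [b ^ (random.random() < p_flip) for b in bits]

def decode(bits: list) -> int:
    """Recover the logical bit by majority vote."""
    return int(sum(bits) > len(bits) / 2)

# With p_flip = 0.1, a 3-bit code fails only when two or more bits flip
# (about 2.8 percent), beating the unprotected 10 percent error rate.
trials = 100_000
errors = sum(decode(apply_noise(encode(0), 0.1)) != 0 for _ in range(trials))
print(f"logical error rate: {errors / trials:.4f}")
```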


An Unhackable QR Code to Fight Bogus Chips

To combat the rising threat posed by counterfeit microchips, researchers at the University of Connecticut suggest that the QR codes often used in ads and signage could be adapted into security markings that are nearly impossible to hack.

The UConn researchers claim that their new codes are very difficult to duplicate and decrypt, and could potentially help stem the flood of counterfeit electronics. They detailed their findings in the February issue of IEEE Photonics Journal.

Counterfeit electronics are on the rise, and could pose dangers to national security. For instance, a 2011 Senate Armed Services Committee hearing noted that counterfeiting incidents in the Department of Defense supply chain had risen from 3,868 in 2005 to 9,356 in 2008, mostly involving chips that came from China. In one key case, mission computers for high-altitude interceptors designed to destroy incoming missiles contained suspect counterfeit memory devices, a problem that cost nearly $2.7 million to fix.


A Drone with Bug Vision

Almost anything that flies, be it a plane, a spacecraft, or a drone, has an inertial navigation system with accelerometers and gyroscopes that control yaw, pitch, and roll, and thus the flight path. Flying insects like bees, however, don't have inertial systems to guide them; they rely exclusively on what they see. This has inspired two researchers at Aix-Marseille University in France to build a drone that imitates the way these insects navigate. Their goal was to design a craft that flies and circumvents obstacles by relying solely on visual cues. In a paper published in the 26 February issue of Bioinspiration & Biomimetics, Fabien Expert and Franck Ruffier describe how their drone, which they call BeeRotor, was able to traverse a circular tunnel while avoiding crashes and obstacles.

The tiny craft, guided by panoramic optic flow (OF) sensors and no inertial navigation, weighs 80 grams and is 47 centimeters long. It was attached to the end of a freely rotating arm at the center of a circular tunnel, and two rotors kept it aloft. The ceiling and the floor of the tunnel were covered with photographs of natural surfaces, and the speed at which the details of these photographs passed above or below the drone was continuously monitored by the optic flow sensors’ 26 photodiodes.

The principle of the optic flow guidance is quite simple, explains Ruffier. The robot does not measure its own speed or distance from the floor; instead, if the details of the underlying surface pass by too fast, the optic flow sensor triggers a feedback system that increases the speed of the rotors, moving the craft away from the floor, he says. The “eye” has four optic flow sensors, two directed toward the front of the craft and two looking backward. Each sensor is equipped with a lens that focuses the image on six photodiodes, which record the speed of a passing element as its image transfers from one pixel to the next. The eye keeps itself aligned with the nearest surface. “Orienting its eye allows the robot to avoid slopes of up to 30 degrees,” says Ruffier.
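
Ruffier’s description maps onto a very small control loop: hold the measured optic flow, which scales as ground speed divided by height above the surface, at a setpoint by adjusting thrust. The sketch below illustrates that principle; the setpoint, gain, and flow model are assumptions for illustration, not values from the BeeRotor paper.

```python
# Minimal sketch of optic-flow-based altitude regulation: the craft never
# measures speed or height directly, only the flow, and nudges its thrust
# to hold that flow at a setpoint. All constants are invented.

OF_SETPOINT = 2.0   # desired optic flow, rad/s (assumed)
GAIN = 0.5          # proportional feedback gain (assumed)

def optic_flow(ground_speed: float, height: float) -> float:
    """Optic flow seen by a downward-looking sensor scales as speed/height."""
    return ground_speed / max(height, 0.01)

def thrust_adjustment(measured_flow: float) -> float:
    """If the ground rushes past too fast (flow above setpoint), command
    more thrust to climb away from the surface, and vice versa."""
    return GAIN * (measured_flow - OF_SETPOINT)

# Example: flying at 3 m/s just 1 m above the floor gives a flow of 3.0,
# above the 2.0 setpoint, so the controller commands extra thrust.
flow = optic_flow(ground_speed=3.0, height=1.0)
print(f"flow={flow:.1f} rad/s, thrust delta={thrust_adjustment(flow):+.2f}")
```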

The team is now working on a rotorcraft that will be able to leave its perch and fly around freely. This will require two additional optic flow sensor systems. They will control the roll and yaw so that the craft can veer to the right or left when it is approaching an obstacle.

Why develop drones guided by optic flow sensors when inertial navigation systems have done the same job well for decades? Drone manufacturers are asking that question, but there is interest from the aerospace industry, especially in Europe, says Ruffier. The probes that land on the Moon, Mars, or comets carry inertial systems that represent about 20 percent of their weight. Space agencies, Ruffier says, are interested in visual navigation systems because they would be much lighter. And even if they don’t replace inertial systems on spacecraft, they could act as backup systems poised to save space missions, he adds.

Android Phone's Battery Use Can Reveal User Location

Most Android smartphone owners probably feel secure knowing that apps must ask permission to access their location. That sense of security is misplaced, say U.S. and Israeli researchers who have figured out how to track smartphone owners based on a mobile device’s battery use alone.


ISPs, Lobbyists, and Everybody Else React to FCC Net Neutrality Decision

Reactions to the Federal Communications Commission’s net neutrality ruling are as sharply divided as the 3-2 party-line vote at a February 26 meeting streamed live on the web. “Today the FCC has taken historic action to protect the Internet for a next generation of Americans online,” proclaimed Alan Davidson, vice president of the nonprofit New America and director of its Open Technology Institute. “This order imposes intrusive government regulations to solve a problem that doesn’t exist,” said Republican commissioner Ajit Pai before casting his nay vote.
