Tech Talk

U.S. Suspicions of China's Huawei Based Partly on NSA's Own Spy Tricks

Fears of Chinese espionage based on "back doors" built into computer hardware have prompted the U.S. government to block China's technology giant Huawei from doing business on U.S. shores. Such suspicions may come in large part from the knowledge that U.S. spies have already learned how to install similar "back doors" in computer hardware and software.


Sulon Cortex Headset Seeks to Meld Real and Virtual Worlds

Last week at the 2014 Game Developers Conference in San Francisco, three promising new virtual reality headsets were introduced. One of them, from a Canadian startup called Sulon Technologies, caught our eye because its creators decided to tackle one of the most challenging problems VR has yet to conquer: when you're wandering around a virtual environment, how do you keep from running into real walls?


Need a Space Robot? There’s An App for That

NASA's SPHERES smartphone-enabled robot travels on a bed of air when it's on the ground; aboard the Space Station, it will navigate in three dimensions.


Last week, the New York Times Magazine published “Silicon Valley’s Youth Problem,” complaining that, among other things, the ability to crank out code is trumping other tech talents like expertise in semiconductors or data storage, and that all the cool kids want to work for the same sexting app. Author Yiren Lu pointed to “the vague sense of a frenzied bubble of app-making and an even vaguer dread that what we are making might not be that meaningful.” The takeaway question: Should we be worried that apps are taking over the Silicon Valley mindshare? Bill Gates quickly weighed in, telling Rolling Stone that we shouldn’t worry. I wasn’t quite sure what I thought about the debate.

Then on Monday, I found myself at NASA Ames Research Center, invited to trail along with NASA Administrator Charles Bolden on a quick tour of a few laboratories.

Now, NASA is the last place I thought I’d find folks developing smart phone apps. But find them I did, in a laboratory dedicated to turning SPHERES—free-flying basketball-sized satellites that have been on board the International Space Station since 2006—into robots. SPHERES (Synchronized Position Hold, Engage, Reorient, Experimental Satellites) are 18-sided polyhedrons that move themselves in a weightless environment by shooting out puffs of carbon dioxide; they can be navigated remotely from the ground and oriented using beacons strategically placed inside the Space Station.

Turning the Spheres into robots involves giving them sensors and more smarts. The team working on this project started out by making a list of what kinds of sensors they needed to add, jotting the suggestions down on their smartphones. Then they suddenly had a light bulb moment: everything they needed was right in front of them, in their phones. So they made some modifications to a standard Android phone (removing the cellular modem—that is, putting it permanently into "airplane mode"—and swapping the flammable lithium-ion battery for alkaline batteries) and docked it to a test Sphere. The gyros on the phone tell the Sphere its orientation, the accelerometer tells it where it's going, and the camera allows it to do visual inspections, as well as navigate. The phone also lets the Sphere communicate via Wi-Fi, a capability it previously lacked.
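The navigation idea behind using the phone's accelerometer is classic dead reckoning: integrate acceleration once to get velocity, and again to get position. Here's a toy one-dimensional sketch of that principle (the function name and numbers are mine; NASA's actual flight software fuses several sensors and corrects for drift, which naive double integration does not):

```python
import numpy as np

def dead_reckon(accel, dt):
    """Integrate a series of accelerometer samples twice to estimate
    velocity and position over time.
    Toy example: one dimension, fixed sample period, no noise or
    drift correction."""
    accel = np.asarray(accel, dtype=float)
    velocity = np.cumsum(accel) * dt   # first integration: m/s
    position = np.cumsum(velocity) * dt  # second integration: m
    return velocity, position

# A Sphere thrusting at a constant 0.1 m/s^2 for 10 s, sampled at 10 Hz:
v, p = dead_reckon([0.1] * 100, dt=0.1)
```

In practice the accelerometer's noise makes the position estimate wander quadratically with time, which is one reason the camera is also used for navigation.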

A researcher working on the project told me that once this realization dawned and they had a phone attached to the test Sphere, developing the software was just a matter of writing an Android app. And with all the developer tools out in the world, that was fast and easy. They sent the SPHERES units aboard the Space Station their first smartphone, a Samsung Nexus S, in 2011. This year, that handset will be replaced with a Project Tango (Google's smartphone development effort) prototype that includes a 3-D position sensor.

NASA researchers expect the robotic Spheres to be able to freely move about the Space Station, beyond the small section delineated by navigational beacons. So far, one has been sent off to do visual inspections of the payload racks. Eventually, future generations (with different propulsion schemes and the sensors and smarts built in, instead of packed in a phone perched on the outside) will be able to go outside the Space Station to do external inspections, said Chris Provencher, Smart Spheres Project Manager.

So, what of that New York Times article, Silicon Valley's youth problem, and all the effort going into apps? Like Bill Gates, I'm no longer worried. Because yes, young techies may be wasting energy writing the latest version of "Hot or Not" (the latest variant on that, FYI, is an app called Tinder. Or it was last week). But far more energy is being saved by using apps and smartphones to jumpstart engineering development, instead of pulling together platforms and code from scratch. And the gap between a smartphone app and rocket science is turning out to be a lot smaller than it seems.

Follow me on Twitter @TeklaPerry.

New Computer Vision System: "Nope. She's Faking It"

A new computerized system is better than humans at telling a genuine expression of pain from a fake one.

Researchers who developed the system say it could be used to detect when someone is feigning illness or pain to escape work. It could also spot attempts to minimize or mask pain, which could be useful during, say, interrogations or health assessments.

According to a report in Wired, the new system is based on something called the Facial Action Coding System, developed by psychologist Paul Ekman. The system is used by animators to give their computerized characters realistic human expressions. The idea is that any facial expression can be mapped to a specific group of muscles in the face.

When people attempt to fake pain, they use the exact same facial muscles that are contracted during real pain. What distinguishes a deliberate expression from a spontaneous one is the dynamics: things like when, how much, and how quickly the muscles move. Humans, it turns out, aren’t great at picking up on this subtle difference in the dynamics of facial motion.

To test this, researchers at the University of California at San Diego and the University of Toronto first recorded videos of volunteers' facial expressions as they experienced real pain by dipping their arm in icy water, and also as they faked pain while putting their arm in warm water. Then the researchers showed the videos to 170 people and asked them to tell real pain expressions from fake ones. The observers guessed correctly only about half the time. With training, their accuracy went up to only 55 percent.

The researchers’ computer vision and machine learning system, on the other hand, was much better at spotting the difference in the dynamics of muscle movement. It could distinguish between real and fake expressions with 85 percent accuracy.

The study, which was published today in the journal Current Biology, shows that the single most predictive feature of fake expressions is the mouth, and how and when it opens. The researchers found that when people fake pain, their mouth-opening action during grimaces is too regular: both the interval between mouth openings and the length of time the mouth stays open are too consistent.
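One simple way to quantify that "too regular" timing is the coefficient of variation (standard deviation divided by mean) of the intervals between mouth openings. The sketch below uses invented numbers purely for illustration; it is not the study's actual feature pipeline, which relies on computer vision and machine learning over many facial-action measurements:

```python
import statistics

def regularity(intervals):
    """Coefficient of variation of a series of time intervals.
    Values near zero mean clockwork-like, suspiciously even timing."""
    return statistics.stdev(intervals) / statistics.mean(intervals)

# Seconds between successive mouth openings during a grimace
# (made-up data, purely for illustration):
faked_pain = [0.51, 0.49, 0.50, 0.52, 0.48]  # near-metronomic
real_pain = [0.20, 0.90, 0.35, 1.40, 0.55]   # ragged, spontaneous

regularity(faked_pain)  # ~0.03: too consistent
regularity(real_pain)   # ~0.71: natural variability
```

A classifier fed features like this, alongside the timing and amplitude of other facial actions, can exploit dynamics that human observers largely miss.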

In a press release, Marian Bartlett, research professor at UC San Diego's Institute for Neural Computation and an author of the study, said: "Our computer-vision system can be applied to detect states in which the human face may provide important clues as to health, physiology, emotion, or thought, such as drivers' expressions of sleepiness, students' expressions of attention and comprehension of lectures, or responses to treatment of affective disorders."

A Transistor that Stands Up to Blistering Nuclear Reactor Temperatures

Wonderful as silicon-based transistors are, they break down at temperatures above 350 °C. For higher-temperature environments, such as those found in jet engines and deep oil wells, researchers have had to turn to other options such as silicon carbide circuits, which can survive up to 550 °C.

Now, researchers at the University of Utah have made tiny plasma-based transistors that work at the blistering temperatures found inside nuclear reactors. While plasma transistors were first reported five years ago, the new devices are 500 times smaller than those early versions.

The new micro-plasma transistors work at temperatures of up to 790 °C. They could be used to make electronics for controlling robots that conduct tasks inside a nuclear reactor, says Massood Tabib-Azar, the professor of electrical and computer engineering at the University of Utah who developed the devices. Such extreme-temperature logic circuits could also control nuclear reactors in case of emergencies or nuclear attacks. Tabib-Azar and his postdoctoral researcher, Pradeep Pai, reported the plasma transistors online today in the journal IEEE Electron Device Letters.

In a conventional three-terminal field-effect transistor, the voltage applied at the gate terminal controls the current flowing through a semiconductor channel. A voltage that is above a certain threshold turns the device on.

The channel in a plasma transistor consists of a partially ionized gas, or plasma, instead of a semiconductor. An electron emitter, typically silicon, injects electrons into the plasma when a voltage is applied to it. Plasmas are generated at very high temperatures, making them suitable for an extreme-environment transistor. Today’s plasma transistors, which are used in light sources and medical instruments, are about 500 micrometers long and operate at more than 300 volts, requiring special high-voltage sources.

The new devices are between 1 and 6 microns in length and operate at one-sixth the voltage. Tabib-Azar and Pai made the transistors by first depositing layers of a metal alloy to form the gate on a 10-centimeter glass wafer. They deposited a thin layer of silicon on top of the gate. Then they etched away portions of the silicon film using a chemically reactive gas, creating cavities and empty spaces that they could fill with the plasma to form the transistor's channel. They used helium as the plasma source.

The researchers are working on connecting the devices to make logic circuits that they plan to test in the experimental nuclear reactor at the University of Utah.

In addition to working in nuclear reactors, the new extreme-temperature transistors could be used to generate X-rays. Instead of using bulky lenses and X-ray shaping hardware, engineers could use these tiny transistors to pattern microscale devices in silicon. Or this type of transistor could be incorporated in a smartphone, creating an X-ray imaging source to collect images of wounded soldiers on the battlefield, says Tabib-Azar.

Photo: Dan Hixson/University of Utah

Academic Inventions Funded by Industry Benefit Innovation

Industry money in university labs can raise eyebrows among researchers who worry that corporate interests might hoard academic inventions through exclusive licensing deals and stifle broader innovation. But a new study based on two decades of evidence from the University of California system suggests such fears surrounding industry-funded university research may be overblown.


Laser Makes More Accurate Radar System

Using a laser to generate radio-frequency radar pulses, a group of researchers has demonstrated a radar system that they say can be smaller, more efficient, and more accurate than anything available today.

The work could make it easier for radar systems to use software-defined radio, which allows users to rapidly change the signal they generate using software rather than analog hardware components such as mixers, amplifiers, and the like. Dispensing with these electronic components would make radars smaller, lighter, and more energy efficient, making them attractive for use aboard airplanes and in remote locations. They could even be switched on the fly from acting as radars to working as communications devices, says Paolo Ghelfi, an optical communications researcher at the National Inter-University Consortium for Telecommunications in Pisa, Italy. Ghelfi, head of research Antonella Bogoni, and their colleagues describe their photonics-based, fully digital radar system, PHODIR, in this week's issue of Nature.


The Newest Organized Labor Group: Start-up Employees

More than 300 employees of Silicon Valley companies gathered last Friday night in Palo Alto to share war stories and to start developing what organizers called a "Startup Employee Equity Bill of Rights."

Mary Russell, an attorney, and Chris Zaharias, who ran tech sales teams at a number of start-ups and recently founded SearchQuant, put together the event. They hope it represents the beginning of a movement, bringing more transparency and fairness to the process of equity compensation for start-up employees.

The launch was much bigger than the two had anticipated. They originally expected a couple of dozen attendees (start-up employees only, no founders or venture capitalists allowed), and had scheduled the meeting to be held in a downtown office. When RSVPs pushed 300, they scrambled to relocate it to the theater at a local high school. And the crowd of engineers, computer scientists, and business grads pretty much filled the joint.

They came because they had gotten raw deals from start-ups in the past, were negotiating employment contracts with start-ups right now, or were working for more established companies while hoping to join start-ups in the future. In any case, they came because they feel lost when they try to understand the way start-ups compensate employees.


IBM Watson Takes on the Genetics of Brain Cancer

Twenty patients with an aggressive form of brain cancer will have a new doctor on their medical team: the learned geneticist known as IBM Watson. In a collaboration announced today between IBM and the New York Genome Center, IBM's Jeopardy-beating AI will analyze the genomes of those 20 patients in hopes of providing insights for their oncologists. 

IBM has been promoting its AI as a killer app for health care, thanks to Watson's natural language processing skills and machine learning abilities. Over the past two years Watson has been engaged in a separate project at New York's Memorial Sloan-Kettering Cancer Center, in which doctors are training the AI to understand the language of medicine. In that project, Watson is being taught to read patients' records and search the medical literature for relevant suggestions on treatment. 

This new project will show that Watson can provide deeper analysis for such point-of-care applications, said IBM Research's Raminderpal Singh after a press conference in New York today. For these 20 cancer patients, Watson won't just scan the medical literature for information. The AI will also scan the genetic data from the patients' own healthy cells and cancer cells, and will then search for information that's relevant to the genetic mutations in the tumor. "As genome sequencing becomes more commonplace—and it will—we'll need a way to go from mutation information to clinically actionable information," said Singh.   


The Earth Was Lucky to Dodge a Massive Solar Magnetic Storm in 2012

When the largest magnetic storm ever recorded struck Earth in 1859, telegraph systems failed across North America and Europe, and some telegraph operators received electric shocks. Researchers recently analyzed a similarly huge magnetic storm that missed Earth by just nine days in 2012; had it hit, it could have caused trillions of dollars' worth of damage to satellites and power grids.



Tech Talk

IEEE Spectrum’s general technology blog, featuring news, analysis, and opinions about engineering, consumer electronics, and technology and society, from the editorial staff and freelance contributors.
