Tech Talk


CES 2017: AR, VR, and IoT Will Be Hot, 3D Printing Not

This week sees the annual consumer technology extravaganza that is the CES 2017 show in Las Vegas. Once almost an afterthought, technologically speaking, consumer electronics have become increasingly important in driving the entire global tech industry. What products companies choose to bring to the show often represent an interesting tension between hard-nosed calculations and corporate wish fulfillment about the direction tech is expected to take in the coming months and years.

At CES 2017 we in the IEEE Consumer Electronics Society expect to see a reduced focus on drones compared to 2016. Drones haven't gone away, but there are few solid practical applications for most consumers. Still, small inexpensive drones could be a growth area as toys and hobby vehicles. Instead, we expect to see a lot more focus on augmented reality (AR), virtual reality (VR), and home health. (And, of course, the occasional surprising and interesting product or announcement.)

There are many long- and short-form VR projects ongoing (both professional and amateur), helped by the availability of consumer versions of selfie-stick VR systems along with a variety of cameras. Social media sites and YouTube now offer 360-degree video support as a matter of course, also helping to drive adoption.

Wearables will be important, although the smart-watch market hasn't picked up as fast as many had hoped. These really need to find their killer applications (perhaps some AR application using phones and watches such as we’ve seen with Pokémon Go).

There will be an increase in Internet of Things (IoT) consumer applications (we look forward to seeing this year’s incarnation of the proverbial smart fridge) as well as cloud-based IoT offerings that provide services to consumers.

Wearable and cloud-based IoT services will also mean we’ll be seeing AI and machine-learning applications. These applications could be big enablers of new consumer services running on wearable devices as well as household voice-activated products; indeed, voice control will be a big theme at CES 2017, with new product introductions by Amazon, Google, and others. Machine intelligence will also make still and video images more useful with increasing capabilities for image recognition. Large enterprise companies with strong machine-learning capabilities will be showing how data from connected intelligent consumer devices will enable new ways to reach customers and offer them additional services.

I would also expect that there will be a greater focus on security and privacy, with the proliferation of connected consumer devices and recent reports that some of these devices have been hijacked as bots in denial-of-service attacks. Greater security and anonymity for shared content will be important safeguards to make sure that consumers feel safe with their connected devices and services.

Turning to televisions, 4K TVs now have a standard that takes full advantage of their potential, including HDR (high dynamic range) as well as their resolution and color capabilities. Coupled with decreasing prices, these TVs should see greater pickup by both leading-edge consumers and the higher end of mainstream consumers. Many consumers are increasingly considering 4K TVs for their next replacement TV. So lower-cost 4K TVs will be a big presence at CES. In addition, UHD (ultrahigh definition) streaming services will be present, as well as Blu-ray disc UHD players that will provide content for viewing on these displays. (Almost all new content is captured in at least 4K nowadays.)

On a smaller scale, there could be more maker-oriented items as well as craft projects including microbrewing (both coffee and beer) at CES 2017, although I don't expect to see the big 3D-printer displays we saw the last few years. However, unusual 3D printing (printed pancakes, anyone?) could be a sleeper hit at the 2017 show.

Finally, automobile technology will continue to play a big role at CES as more and more autonomous driving functions are included in new model cars. This will also include tying consumer applications into automobiles and mobile activities.

Toward the end of the CES show, on January 8, the 2017 IEEE ICCE Conference will have a focus on Virtual and Augmented Lifestyles. The ICCE conference focuses on consumer technologies that will be the hottest thing three years from today. As a teaser of what’s to come, new for this year are tracks organized with the IEEE Biometrics and RFID Councils, the IEEE Cloud Computing initiative, and the IEEE Society for the Social Implications of Technology.

About the Author: Tom Coughlin is an IEEE Senior Member and Chair of the IEEE Consumer Electronics Society Future Directions Committee.


Meet Jon Spaihts, the Writer Behind the Movie Passengers

Just released in time for the holiday weekend in the United States is the science fiction movie Passengers, starring two of Hollywood’s most bankable stars, Jennifer Lawrence and Chris Pratt. The action takes place aboard an interstellar colony ship on which all the crew and colonists have been placed in hibernation for the duration of the 120-year voyage—until a mishap wakes Pratt’s character up just 30 years into the trip.

IEEE Spectrum’s Stephen Cass talked with the screenwriter of Passengers, Jon Spaihts, about his inspiration for the movie and the process of bringing his ideas to life in a big Hollywood movie. (Mild spoilers below. The conversation has been edited for concision and clarity.)

Stephen Cass: One of the things I liked about the movie, from an engineering point of view, is the way that it depicts numerous seemingly unrelated minor system failures that herald an escalation towards catastrophic failure. This kind of cascading sequence pops up in real accident reports involving complex systems [PDF]. Were you aware of this pattern and decided to build a story around it, or did you have a specific storytelling problem and found the pattern offered a solution?

Jon Spaihts: I needed a technical crisis that would fit the profile of the story. Meaning that something very small needed to go wrong at the top of the story—leading to the awakening of our hero—but it ultimately needed to swell into a full-blown crescendo that would endanger the entire ship… And I needed something that could affect systems as disparate as the hibernation system and then real ship-threatening systems like propulsion and the powerplant. That led me to think about the only real common thread those systems have, and that was computer control. The notion was that there was a kind of mainframe computer, a central processor core that would be called by systems all over the ship for processing tasks, and by terrible misfortune that core took a crippling hit, leaving every auxiliary and minor processor on the ship to pick up the load, with everything running way past rated capacity for years on end until things started to fail, and then of course you get a rapid cascade.

SC: The other engineering aspect I liked is that Passengers is one of the very few screenplays dealing with interstellar travel that talks about the risk of running into debris at very high speeds between the stars, although scientists worry more about tiny dust or gas particles than the asteroid field that’s shown.

JS: (Laughs) Yeah, larger than I would have made them! They are Hollywood particles… It’s something that kind of flowed naturally from the investigation of the premise: If you’re going to make a 120-year journey at half the speed of light, then that really leads you to do a lot of math about the energies involved, the propulsion problem, and I looked a little bit at relativistic math, just to see if time dilation would substantially affect their experience, which at that speed it mostly doesn’t. It’s important for navigation and communication, but not terribly important for life span. But encountering even individual gas molecules at half the speed of light imparts tremendous energies—a potato-sized nickel-iron meteorite would really ruin your whole day! So there’s always going to be some sort of plough at the front of the ship to handle that. The notion [in Passengers] was something penetrating those defenses, which was supposed to be impossible.

SC: Nowadays, a hibernation pod has become a background trope. We’ve seen them in Planet of the Apes, we’ve seen them in Aliens…

JS: Yeah, like artificial gravity and force fields. They are just things that the audience accepts.

SC: Right. So in 2016 it seems risky to try to make a trope like that the central conceit of a movie. Why did you decide to go back and try to make a hibernation story fresh again?

JS: I actually think it’s a great way in to any [creative] space that has been paralyzed by cliché. The first script I sold to Warner Brothers was a movie called Shadow 19 (that hasn’t been made), which came out of exactly that kind of thought process. I was complaining to my brother about things that frustrated me about the Star Trek universe. And one of those was what felt to me like a failure of imagination about the ramifications of something as mind blowing as the transporter, or the phaser that made people disappear… The ship had ten God-like technologies that they never thought about! … I said, let’s talk about what the transporters are doing. Are they annihilating the original person and killing him, and then creating a perfect simulacrum over there? Doesn’t that raise a host of moral and philosophical issues? Are they buffering that information in some way—could they mass produce that guy at the other end? Just unpacking the trope led to a startling new story. Passengers results from unpacking the trope of the hypersleep pod that everybody just accepts in a science fiction starship. We say: wait, let’s talk about the ramifications of this and what it means for everyone. It gets very interesting. I think that’s often the best approach to cliché, to run straight at it and unpack it and make everybody look at the pieces.

SC: Another thing that modern audiences have gotten used to seeing is a “heavy industrial” look to movie spaceships.

JS: Yes, raw metal surfaces, unfinished steel!

SC: And Passengers is set on this beautiful spacecraft that looks like a high-end cruise liner with spiraling habitation rings. How much of that was in the original script?

JS: The rotating helix design was definitely the work of Guy Dyas, our production designer, who did an extraordinary job with all the spaces in the ship. The quality of the interiors was very much called out in the script, it was meant to evoke a luxury cruise ship of the future, with spaces of different character. There are a lot of modern concourses, which are the identity of the ship as it’s [designed for carrying passengers]. There are nostalgic spaces which are designed to call back to styles on Earth—you have a French restaurant, Italian restaurant, Mexican restaurant. And then the service compartments of the ship, which are for crew, are much more no-nonsense and utilitarian.

SC: Does that idea of different spaces in the movie also apply to the space suits? The suits the characters rely on don’t have a lot of features found on spacesuits today, such as maneuvering packs.

JS: Yes, those are recreational spacesuits, designed for safety and sightseeing. Somewhere else is the radiation-hardened, thruster-enabled, heavy-duty worksuit. But because the heroes don’t have access to the crew spaces, they haven’t found them!

Quicker Camera Chips Coming

If you want to capture a super-slo-mo film of the nanosecond dynamics of a bullet impact, or see a football replay in fanatical detail and rich color, researchers are working on an image sensor for you. Last week at the IEEE International Electron Devices Meeting in San Francisco, two groups reported CMOS image sensors that rely on new ways of integrating pixels and memory cells to improve speed and image quality.

Both groups are working on improving global-shutter image sensors. CMOS image sensors usually use what’s called a rolling shutter. Rolling-shutter cameras scan across a scene—that is, each row of the frame is captured at a slightly different moment. This makes them speedier, but it can cause distortion, especially when filming a fast-moving target like a car or a bullet. Global shutters are better for filming speeding objects because they snap the entire scene at once. CMOS sensors aren’t naturally suited to this, because the pixels are usually read out row by row. CCD image sensors, on the other hand, have a global shutter by definition, because all the pixels are read out at once, says Rihito Kuroda, an engineer at Tohoku University in Sendai, Japan. But they’re not ideal for high-speed imaging, either: due to their high-voltage operation, CCDs heat up and use a lot of power when operating at high shutter speeds.
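The row-by-row timing can be sketched in a few lines of Python. This is a toy model, not anyone's actual sensor pipeline: it just shows why a moving edge comes out slanted under a rolling shutter but straight under a global one.

```python
# Toy model of rolling- vs. global-shutter readout (an illustration only).
# Each row of a rolling-shutter frame is read `row_time_us` after the
# previous one, so a moving edge lands a little farther along in every
# row, producing the familiar slanted distortion.

def rolling_shutter_positions(rows, speed_px_per_us, row_time_us):
    """Horizontal position of a moving edge as each row is read out."""
    return [round(r * row_time_us * speed_px_per_us) for r in range(rows)]

def global_shutter_positions(rows):
    """With a global shutter, every row sees the edge in the same place."""
    return [0] * rows

skew = rolling_shutter_positions(rows=8, speed_px_per_us=0.5, row_time_us=2.0)
print(skew[-1] - skew[0])  # 7 pixels of slant across this tiny toy frame
```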

To get beyond the row-by-row, rolling shutter operation of CMOS, chip designers assign each pixel its own memory cell or cells. That provides a global shutter but with sacrifices. In the case of ultrahigh speed imaging, the sensors are constrained by their memory capacity, says Kuroda. By focusing on the design of a custom memory bank, Kuroda’s group has developed a CMOS image sensor that can take one million frames per second for a relatively long recording time—480 microseconds at full resolution—compared to previous ultrahigh speed image sensors.

Because storage is limited, it’s not possible to take a long, high-speed, high-resolution video—something must be sacrificed. Either the video has to be short, capturing only part of a high-speed phenomenon in great detail, or it must have lower spatial or temporal resolution. So Kuroda’s group focused on boosting storage in the hope of easing all three constraints at once.

Kuroda’s group made a partial test chip with 96 × 128 pixels. The image sensor is designed to be tiled to have a million or more pixels. Each pixel in the prototype has 480 memory cells dedicated to it, so the camera can take full-resolution images for 480 frames. Other sensors have captured video at higher frame rates, but they’ve had to do it either for a shorter period of time or with poorer spatial resolution.
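The arithmetic behind those figures is easy to check. A back-of-envelope sketch (not code from the paper): with per-pixel analog memory, the maximum recording length is simply the number of memory cells per pixel divided by the frame rate.

```python
# Back-of-envelope check of the reported figures: with per-pixel analog
# memory, maximum recording length = memory cells per pixel / frame rate.

def recording_time_us(memory_cells_per_pixel, frames_per_second):
    """Longest burst the sensor can store, in microseconds."""
    return memory_cells_per_pixel / frames_per_second * 1e6

print(recording_time_us(480, 1_000_000))  # 480.0, matching the 480-microsecond figure
```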

The Tohoku group designed a dense analog memory bank based on vertical capacitors built inside deep trenches in the silicon chip. Because the capacitors hold a variable amount of charge, rather than a simple 0 or 1 as in DRAM, lowering the amount of current that leaks out is critical, says Kuroda. The deeper the trenches, they found, the greater the volume of each capacitor and the lower the leakage current. Increasing volume with trenches rather than by spreading out over the chip saved space and allowed for greater density of memory cells. This meant more memory cells per pixel, which allowed for longer recordings. It also freed up space to put more pixels on the chip, improving the camera’s resolution.

Some of Kuroda’s earlier CMOS image sensor chips, which used planar rather than trenched capacitors, are already on the market in ultrahigh speed cameras (HPV X and X2 models) made by Shimadzu. He says the new million frame per second sensor will further improve products like them. To push things even further, Kuroda says the next step is to stack the pixel layer on top of the memory layer. This will bring each pixel closer to its memory cells, shortening the time it takes to record each frame and potentially speeding up the sensor even more.

This sort of camera is useful for engineers who need to follow the fine details of how materials fail—for example how a carbon fiber splits—in order to make them more resilient. Physicists can use them too, for example, to study the dynamics of plasma formation.

Separately, researchers from Canon’s device technology development headquarters in Kanagawa, Japan, reported memory-related improvements for high-definition image sensors that could be used to cover sporting events or in surveillance drones. While the Tohoku group is working on ultrahigh speed, the Canon group aims to improve the image quality of high-definition global shutter cameras operating at much lower frame rates of about 30 to 120 per second.

Like the Tohoku University chip, the Canon sensor closely integrates analog memory with sensors. In the Canon chip, each pixel in the 4046 by 2496 array has its own built-in charge-based memory cell. They’ve used an engineering trick to improve the image quality by effectively increasing the exposure time within each frame. Typically, the image sensor dumps its bucket of electrons into the memory cell once per frame. This transfer is called an accumulation. The Canon pixels can do as many as four accumulations per frame, emptying their charges into the associated memory cell four times. This improves the saturation and dynamic range of the images relative to previous global-shutter CMOS devices operating around the same frame rates. At 30 frames per second, the sensor maintains a dynamic range of 92 dB.
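To see roughly why extra accumulations help, note that dynamic range in decibels is 20·log10 of the ratio between the largest and smallest usable signal. A hedged sketch follows; the electron counts are illustrative assumptions, not numbers from the Canon paper.

```python
import math

# Illustrative sketch of why emptying each pixel into memory several times
# per frame extends dynamic range: N accumulations raise the maximum
# recordable signal by a factor of N. The electron counts below are
# assumptions for illustration, not figures from the Canon paper.

def dynamic_range_db(max_signal_electrons, noise_floor_electrons):
    return 20 * math.log10(max_signal_electrons / noise_floor_electrons)

single = dynamic_range_db(10_000, 5)    # one accumulation per frame
quad = dynamic_range_db(4 * 10_000, 5)  # four accumulations per frame
print(round(quad - single, 2))  # 12.04 dB gained, i.e. 20*log10(4)
```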

This story was corrected on 19 December 2016. It is not certain Shimadzu will incorporate the current research into a product.


Wanted: In-flight Drone Charging, Itty-Bitty Spy Cams, and More

Every year, the U.S. Combating Terrorism Technical Support Office puts out a “Broad Agency Announcement” that describes technologies that it wishes it could purchase, but which don't yet exist. It's a sort of to-do list for technologists and engineers, and it can turbocharge research in these areas.

The agency issued its latest draft in November, and it includes some doozies. Among the items described are a wireless recharging station for drones in flight, a low-power mini-spy camera that can be worn on the body, and a portable scanner that can find tunnel entrances under a floor or behind walls. 

None of these technologies are easy to create, and that’s the point. “I think it is very challenging and that's usually what the BAAs are geared for,” says Albert Titus, a biomedical engineer at the State University of New York at Buffalo. “When you read them the first time, it's kind of, 'Oh my, my.'”


A Smart Contact Lens for Eye Injuries

A smart contact lens fitted with an artificial iris could help people with eye injuries and congenital diseases see better. The lens, described this week at the International Electron Devices Meeting in San Francisco, uses concentric LCDs to mimic the expansion and contraction of the pupil that’s normally controlled by the iris.

The artificial iris is part of a larger project on smart contact lenses led by Herbert De Smet, a professor who works on intelligent sensors at the University of Ghent. De Smet’s group is working on putting many electronic components onto these lenses, including batteries, antennas, control electronics, and chemical sensors.

The lens presented at the San Francisco meeting is aimed at helping about 200,000 people who suffer from problems with the iris, whether due to cancer, an acute injury, or genetics. The iris is the colored part of the eye surrounding the pupil. It contracts under bright light to protect the retina from, say, the rays of the full sun, and expands in low light to help us see better. When the iris is absent or damaged and therefore can’t contract, being out in the sun or just under bright indoor light is painful.

The usual solution is to wear dark sunglasses or a dark contact lens, says Florian De Roose, a researcher at Imec in Leuven, Belgium. But it’s difficult to see in low light when wearing sunglasses. And people with damaged irises may find that daylight is still too bright despite wearing tinted lenses. A contact lens that darkens to block out light and effectively constricts the pupil could help people to see better.

De Smet is collaborating with De Roose and other researchers to make parts for the artificial iris system. De Smet’s group has already integrated liquid crystal cells onto contact lenses; De Roose worked on adding flexible control electronics. On the lens, three concentric LCDs surround a clear central area that sits above the pupil. In bright light, all three LCDs can be activated, causing the artificial iris to contract and narrow the opening. In dim light conditions, all the LCDs are turned off, and the artificial iris expands to let more light in. Around the iris are ten organic solar cells and control electronics containing a driver for each of the three LCD rings.
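The control logic described above can be sketched as a simple threshold rule: the brighter the ambient light, the more of the three concentric LCD rings are darkened, narrowing the artificial pupil. The lux thresholds below are illustrative assumptions, not values from the Ghent/Imec prototype.

```python
# Sketch of the artificial-iris control rule: brighter ambient light
# switches on more of the three concentric LCD rings. The lux thresholds
# are illustrative assumptions, not values from the Ghent/Imec work.

LUX_THRESHOLDS = (100, 1_000, 10_000)  # dim indoor / bright indoor / daylight

def active_rings(ambient_lux):
    """How many of the three LCD rings to darken at a given light level."""
    return sum(ambient_lux >= t for t in LUX_THRESHOLDS)

for lux in (50, 500, 5_000, 50_000):
    print(lux, "lux ->", active_rings(lux), "rings on")
```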

De Roose’s Imec group worked on flexible, low-power driver electronics that take up about 0.75 square millimeters. They used thin-film transistors, based on transparent IGZO, built on a flexible polymer. The control chip is placed at the edge of the lens so that it doesn’t occlude vision. However, the completed chip has a transparency of about 50 percent, so it could be made larger. The system draws 25 microwatts—which the onboard photovoltaics should be able to supply.

So far, all the parts have yet to be integrated. The collaborators have shown that they can build LCDs, solar cells, and drivers on the lens and that the driver can control the LCD; now they have to show that the full system can operate together with the solar cells. In future systems, says De Roose, the photovoltaics will act both as the light-level sensors and the LCD power source. “The beauty of this is, the more light there is available, the more power there will be to drive the LCDs” and to make the iris contract, says De Roose. He also notes that while the group hasn’t focused on aesthetics, organic photovoltaics can be made in colors that could look relatively natural.

The smart contact lens project faces broader challenges. Such lenses must do more than carry workable sensors and display elements. They must also be carefully mechanically engineered. The lenses themselves are stretchy, but the transistors are merely flexible. The researchers will have to account for this mismatch, either by moving to stretchy materials or being very careful about the smart lens architecture. And more importantly, they must ensure that these lenses are safe. One way they’ll do that is by ensuring that the electronic components don’t interfere with the transfer of water and oxygen through the lens to the cornea. Otherwise, the lens could cause infections.

A pair of stars, Alpha Centauri A and Alpha Centauri B, form part of the closest star system to our own.

Self-Healing Transistors for Chip-Scale Starships

Working with the Korea Advanced Institute of Science and Technology (KAIST), NASA is pioneering the development of tiny spacecraft made from a single silicon chip that could slash interstellar exploration times.

On Wednesday at the International Electron Devices Meeting in San Francisco, NASA’s Dong-Il Moon will present new technology aimed at ensuring such spacecraft survive the intense radiation they’ll encounter on their journey.

If a silicon chip were used as a spacecraft, calculations suggest that it could travel at one-fifth of the speed of light and reach the nearest stars in just 20 years. That’s one hundred times faster than conventional spacecraft can manage.
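That claim is easy to sanity-check. The roughly 4.37 light-year distance to the Alpha Centauri system is an assumption added here; the article itself just says "the nearest stars."

```python
# Back-of-envelope check of the travel-time claim. The ~4.37 light-year
# distance to the Alpha Centauri system is an assumption for illustration.

def travel_time_years(distance_ly, speed_fraction_of_c):
    """Non-relativistic trip time as seen from Earth: distance / speed."""
    return distance_ly / speed_fraction_of_c

print(travel_time_years(4.37, 0.2))  # just under 22 years, which the article rounds to 20
```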

Zooko Wilcox on his knees destroying a computer with a power tool as sparks fly.

The Crazy Security Behind the Birth of Zcash, the Inside Story

 “How would you feel about donating your phone to science?”

When Zooko Wilcox posed this question to me in October, what I heard was: Can I take your phone and hand it over to a hacker to riffle through its contents and sniff all over your data like a pervert who’s just opened the top drawer of a lady’s dresser?

At least, that’s how it felt.

“I think I’d rather donate my body,” I said.

What Wilcox really wanted to do with my phone was to run forensic analysis on it in the hopes of determining whether someone was using it to spy on us. Wilcox is the CEO of a company called Zcash, which designed and recently launched a new privacy-preserving digital currency of the same name. On the weekend he asked for my phone, we were both sitting with a two-man documentary film crew in a hotel room stuffed with computer equipment and surveillance cameras.

A secret ceremony was underway. Before the company could release the source code of its digital currency and turn the crank on the engine, a series of cryptographic computations needed to be completed and added to the protocol. But for complex reasons, Wilcox had to prevent the calculations from ever being seen. If they were, it could completely compromise the security of the currency he had built.

Over the course of the two-day event, everything went pretty much as planned. Everyone and everything did just what they were supposed to do, except for my cellphone, which in the middle of the event exhibited behaviors that made no sense at all and which planted suspicions that it had been used in a targeted attack against the currency.

The story of Zcash has already been roughly sketched by me and others. The currency launched 28 October onto the high seas of the cryptocurrency ecosystem with a strong wind of hype pushing violently at its sails. On the first morning that Zcash existed, it was trading on cryptocurrency exchanges for over US $4000 per coin. By the next day, the first round of frenzied feeding had subsided and the price was already below $1000. Now, a month later, you’ll be lucky if you can get $100 for a single Zcash coin. Even in the bubble-and-burst landscape of cryptocurrency trading, these fluctuations are completely insane.

Some hype was certainly warranted. The vast majority of digital currencies out there are cheap Bitcoin imitations. But the same cannot be said of Zcash. The project, which was three years in the making and which combines the cutting edge research of cryptographers and computer scientists at multiple top universities, confronts Bitcoin’s privacy problems head on, introducing an optional layer of encryption that veils the identifying marks of a transaction: who sent it, how much was sent, who received it. In Bitcoin, all of this data is out in the public for anyone to see.

However, with digital currencies, everything is a trade-off, and the improvement in privacy that Zcash brings comes with a risk, one that has gotten much less attention since the currency launched. Obscuring data on the blockchain inevitably complicates the process of verifying the validity of transactions, which in Bitcoin is a simple matter of tracking coins on a public ledger. In Zcash, verifying transactions requires some seriously experimental computation, mathematical proofs called zk-SNARKs that are so hot off the presses that they’ve never been used anywhere else. In order to set up the zk-SNARKs in the Zcash protocol, a human being must create a pair of mathematically linked cryptographic keys. One of the keys is essential to ensuring the proper functioning of the currency, while the other one—and here’s the big risk—can be used to counterfeit new coins.

If it’s not immediately clear how this works, you’re in good company. The number of people who really understand zk-SNARKs, and therefore the Zcash protocol, is probably small enough that you could feed them all with one Thanksgiving turkey. The important thing to get is that, given the current state of cryptographic research, it’s impossible to create a private, reliable version of Zcash without also simultaneously creating the tools for plundering it. Let’s call those tools the bad key.

Prior to launching Zcash, the developers who invented it had to create the bad key, use it to make a set of mathematical parameters for the zk-SNARKs (the good key), then dispose of the bad key before any nefarious individual could get hold of it. And they had to do it all in a way that was both secret enough to be secure yet public enough that anyone who wanted to use Zcash felt well-assured of the technology’s integrity.

The Zcash developers, whose work is funded by over $2 million raised from private investors in the Zcash Company, chose a strategy that relied heavily on the secrecy part of this equation. Nearly everything about the ceremony—where and when it would be held, who would be involved, what software would be used—was kept from the public until a blog post about it was published this afternoon.

Instead of building real-time transparency into the ceremony design, the Zcash team opted to meticulously document the event and save all artifacts that remained after the bad key was destroyed. This evidence is now available for analysis to prove the process went as it was described.

As an extra measure, they decided to invite a journalist to bear witness—me.

Two weeks before the ceremony, I got a vague invite on Signal, an encrypted messaging app, from Wilcox without any specifics about what to expect. A week later he told me where I would have to go. And a week after that—two days before the ceremony—I was told when to arrive. On 21 October, I walked into a coffee shop in Boulder, Colorado, where I met up with Wilcox and a documentary filmmaker who had been hired to get the whole thing on tape. From there we headed to a computer shop in Denver to buy a bunch of equipment and then returned to a hotel in Boulder, where I stayed for the next three days.

The headquarters in Boulder was one of five “immobile” stations, all of which were participating in the ceremony from different cities across the planet. One mobile station was doing its part while making a mad dash across British Columbia. The generation of the keys was decentralized such that each station would only be responsible for creating a fragment of the bad key. For the ceremony, a custom-designed cryptographic algorithm created a full version of the zk-SNARK parameters while keeping the pieces of the bad key segregated, a process that took two days of relaying data back and forth among the six stations.

I’ll hazard an analogy in order to explain more generally how this works: Let’s say you have a recipe and you want to use it to make a single cake that is going to feed everyone in the world and that’s the only cake that anyone is allowed to eat, ever. You have to have a recipe to bake the cake, but you also have to make sure no one can ever make it again. So you split the recipe up into six parts and you design a baking process that allows each participant to add their ingredients and mix them into the batter without the others (or anyone else) seeing what they’re up to. After pulling the cake out of the oven, you burn all the pieces of the recipe.

In this analogy, the recipe is the bad key; the cake is the zk-SNARK parameters; and the person hiding the ingredients and doing all of the mixing is a cryptographic algorithm.

The way this looks in practice is that each station has a computer storing a fragment of the secret. That computer can’t connect to the Internet, has been stripped of its hard drive, and runs off a custom-built operating system. The secret never moves off the computer, but it is used in a series of calculations that are then copied to write-once DVDs and carried to a separate, networked computer that shares the results with the rest of the stations. Each station builds off the results of the station before it in a computational round robin until the process is complete and the software finally spits out a product.

The benefit of dividing up the work in this way is that no one participant can compromise the ceremony. Each fragment of the bad key is worthless unless it is combined with all the others. It cannot even be brought into existence unless all members of the ceremony collude or an attacker successfully compromises all six of the participating stations.
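The "every fragment is worthless alone" property can be illustrated with simple XOR-based secret sharing. To be clear, this is not the ceremony's actual mathematics (the real process was a custom multiparty computation over zk-SNARK parameters); it only shows why all the shares must be combined to recover anything.

```python
import secrets

# Illustrative XOR secret sharing: n-1 shares are pure randomness, and the
# last share is the secret XORed with all of them. Any subset smaller than
# n is statistically indistinguishable from random noise.

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def split(secret, n):
    """Split `secret` into n shares; all n are needed to reconstruct it."""
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    last = secret
    for s in shares:
        last = xor_bytes(last, s)
    return shares + [last]

def combine(shares):
    out = bytes(len(shares[0]))
    for s in shares:
        out = xor_bytes(out, s)
    return out

key = b"toxic waste"  # stand-in for the "bad key"
shares = split(key, 6)
print(combine(shares) == key)      # True: all six fragments recover it
print(combine(shares[:5]) == key)  # False: five fragments reveal nothing useful
```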

As an observer, there was very little I could do to verify the security of the events as they unfolded in front of me. I don’t have the advanced cryptography coursework that would be necessary to audit the software that Wilcox and the other station operators were running. And even if I did, the code had not yet been made available for public review. My role, as I saw it, was simply to be present and make sure the people involved did all the things that they would later tell people they did. I can bear witness to the fact that the computer storing the key fragment was bought new, that the wireless card and hard drive were removed, that while I was watching no attacker sneaked into the hotel room to mess with the equipment, that all of the DVDs were correctly labeled, and that the RAM chips that stored the key fragment were smashed and burned in a fire pit after the ceremony.

I can testify that nothing strange happened. Until it did.

During the ceremony most of the station operators were talking with each other on a Google Hangout. On the evening of the first day, after getting up from a bit of a rest, Wilcox wandered over to the laptop that was running the Google Hangout and began chatting with Peter Van Valkenburgh, a station operator located in Washington D.C. We noticed an echo of the audio coming from across the room and started looking for its source.

The whole place was filled with gadgets. Four security cameras had been hoisted onto poles and aimed at the offline computer to provide 24-hour surveillance in the event of a ninja attack. Another digital camera on a tripod was capturing a wide-angle shot of the room. Both Wilcox and I were geared up with wireless mics. And another mic was secured to the laptop running the Google Hangout.

I went over to a monitor that was set up between the two hotel beds to display the security footage, and at first I thought that was it. Then I looked down at one of the beds and saw my phone lying there. When I picked it up, I immediately realized that the audio was blaring out of its speaker.

 “Morgen, why is your phone playing the audio from our Google Hangout?” asked Wilcox, bemused, curious, and slightly alarmed.

Why indeed. It was especially strange because I had not knowingly connected to the Google Hangout at all during the ceremony. Furthermore, footage of Wilcox’s computer screen shows that I wasn’t listed as a participant.

So, how was my phone accessing the audio?

Without wasting any time, Wilcox began experimenting. While continuing to talk to Van Valkenburgh, he muted the microphone on his Google Hangout session and then turned it back on. When he did that, my phone only picked up Van Valkenburgh’s audio.

Stranger still, when Wilcox re-enabled his hangout microphone, his voice came through my phone with a slight lag—maybe 100-200 milliseconds—indicating that my phone was picking it up from somewhere outside the room, perhaps from a Google Hangout server.

Just as we started to examine my phone, looking at the programs that were running and a few suspicious text messages that I had received a couple of days before the ceremony, the echo abruptly stopped. We quickly put the phone into airplane mode, hoping to preserve whatever evidence remained.

After much negotiating, I surrendered my phone (an archaic Android that was ripe for the hacking) to Wilcox. He has since passed it off to a hacker in San Francisco. Those efforts have produced no evidence about what caused my phone to turn on me, and it’s now on its way to a professional security firm for further analysis.

Unless we find evidence of malware on my phone, the question of how it may have impacted the ceremony is completely hypothetical. Assuming my phone was hacked, who would want to break into the Zcash ceremony? And if an attacker did have full control over my phone, which was powered on and present until the moment it started misbehaving, what could that person do with it?

For answers, I traveled up to Columbia University to the lab of Eran Tromer, a computer scientist at the Zcash company who co-invented its cryptographic protocol. Tromer is at Columbia for a year as a visiting researcher, but his home base is the Tel Aviv University School of Computer Science, where he is a member of the faculty and the director of the Laboratory for Experimental Information Security (LEISec) at the Check Point Institute for Information Security.

A big part of Tromer’s work at LEISec involves investigating side channel attacks. The idea behind side channel attacks is that you don’t have to have direct access to a computer’s data in order to spy on it. Often, you can piece together some idea of what a computer is doing by examining what’s going on with the physical components. What frequencies are humming across the metal capacitors in a laptop? How much power is it pulling from the wall? How is the voltage fluctuating? The patterns in these signals can leak information about a software program’s operation, which, when you’re running a program that you want to keep secret, can be a problem.
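Acoustic and power channels are hard to reproduce in a few lines, but the same principle, observable behavior that depends on secret data, shows up in software timing as well. As a hypothetical illustration (not an attack used against the ceremony), here is a Python sketch in which a naive comparison of a secret token leaks, through its running time, roughly how many leading bytes of a guess are correct:

```python
import hmac
import timeit

SECRET = b"s3cr3t-token-value"

def naive_compare(a: bytes, b: bytes) -> bool:
    """Returns at the first mismatching byte, so its runtime depends
    on how far into the secret a guess happens to match."""
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

bad_guess   = b"XXXXXXXXXXXXXXXXXX"  # wrong from the very first byte
close_guess = b"s3cr3t-token-valuX"  # wrong only at the last byte

# Over many trials, the near-miss takes measurably longer to reject,
# leaking the structure of the secret one byte at a time.
t_bad   = timeit.timeit(lambda: naive_compare(SECRET, bad_guess), number=100_000)
t_close = timeit.timeit(lambda: naive_compare(SECRET, close_guess), number=100_000)

# The defense is a comparison whose runtime is independent of the data:
safe = hmac.compare_digest(SECRET, close_guess)  # False, in constant time
```

The attacks Tromer studies read analogous leaks out of capacitors and power rails rather than clocks, but the lesson is the same: the implementation, not the math, is usually the weak point.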

“My research is about what happens to good, sound, cryptographic schemes when they reach the real world and are implemented on computing platforms that are faulty and leaky at the levels of software and hardware,” says Tromer.

In his lab at Columbia, Tromer opened his laptop and ran a demonstration program that executes several different computations in a loop. He told me to put my ear down close to where the fan was blowing out hot air from the computer’s innards. I leaned over, listened carefully and heard the computer whine ever so slightly over and over.

“What you’re hearing is a capacitor in the power supply, striving to maintain constant voltage to the CPU. Different computations done on the CPU have different power draw, which changes the mechanical forces on the capacitor plates. This causes vibrations, which in turn are transmitted through the air as sound waves that we can capture from afar,” he says.

Tromer started investigating this phenomenon, called “coil whine,” for himself about ten years ago. “I was in a quiet hotel room at a conference. I was working on my laptop and it was making these annoying whining noises whenever I ran some computation. And I thought, let’s see what happens if the computation is actually cryptographic calculation involving a secret key, and how the key affects the emitted noise.”

Tromer and his colleagues spent the next decade trying to use acoustic leakage from computer hardware components to spy on cryptographic algorithms. In 2014, they demonstrated a successful attack in which they were able to steal a decryption key from a laptop by recording and analyzing the sounds it made as it ran RSA decryption software. With a high-tech parabolic microphone, they were able to steal the secret from ten meters away. They were even able to pull off the same attack using the internal microphone on a mobile phone, provided that the device was snuggled up close to the computer.

However, for various reasons Tromer doesn’t think anyone could have used the same strategy with my phone. For one thing, the coil whine in modern computers occurs at higher frequencies than in the laptop he demonstrated—in a range that is typically outside what a mobile phone, which is designed for the lower frequencies of the human voice, can detect.

“It seems extremely unlikely that there would be exploitable signals that can be captured by a commodity phone, placed in a random orientation several feet away from a modern computer,” he says. “It is not completely unthinkable. There might be some extremely lucky combination. But it would be a very long shot, and at a high risk of detection, for an adversary to even try this, especially since the ceremony setup gave them very little time to tailor attacks to the specific hardware and software setting.”

Moreover, the attacks that Tromer has demonstrated are not passive. In order to collect a useful signal, you have to amplify it by sending challenges to the software that you are attacking. The challenges force the software to repeat computations. In order to do this, you have to know and have studied the code that the computer is running.

The software that was running during the Zcash key generation ceremony was all custom built specifically for that occasion and was intentionally kept from the public until the ceremony was over. The choice was controversial, and the approach strays from that of other similar ceremonies. (For example, the DNSSEC ceremony, which generates the digital signatures that secure top-level domain names, is conducted far more transparently and is publicly audited in real time.)

Before flying to Colorado, I contacted Bryan Ford, a computer science professor who directs the Decentralized and Distributed Systems Laboratory at the École Polytechnique Fédérale de Lausanne in Switzerland. He was troubled by the decision to keep the details of the Zcash ceremony secret. In a series of Twitter direct messages he told me:

“I understand the crypto principles that the parameter-generation is supposed to be based on well enough to know that nothing *should* need to be kept secret other than the critical secret parts of the parameter keys that eventually get combined to produce the final public parameters. If they think the ceremony needs to be kept secret, then...something’s wrong.”

By keeping the details of the ceremony software secret, the Zcash team limited their security audit to just a handful of people inside the company, but they may also have made it more difficult for an attacker to make the kinds of preparations that would be necessary to mount a successful side channel attack.

Even if someone did get a look at the source code in advance, Wilcox says it wouldn’t be the end of the world because secrecy was not the primary defense. According to him, one of the best aspects of the ceremony design was the use of multiple parties. It wouldn’t be enough to pull recordings off the computer in Colorado. An attacker would have to successfully record a side channel at each station. And because Wilcox left many of the security details up to the personal discretion of each station operator, the craftwork that would go into designing six unique side channel attacks would cost a huge amount in both time and money.

At one of the stations it may even have been impossible. Peter Todd, one of the ceremony participants, ran all of his computations on a laptop encased in a tin foil-lined cardboard box, while driving across Canada. He then burned his compute node to a crisp with a propane torch. “It was my goal to outdo every other station in Canadian cypherpunk glory,” says Todd, who also happens to be one of Zcash’s most outspoken critics.

If someone did attempt a side channel attack with the strategies Tromer has demonstrated in his lab, then there would likely be evidence of it in the trove of forensic artifacts that the ceremony produced. Among those items are all of the write-once DVDs that provide a record (authenticated by cryptographic hashes) of what computations were being relayed between the stations in the ceremony. Tromer’s techniques require direct interaction with the software and those manipulations would make their way onto the public record.

At no point did the incident with my phone stop the ceremony. Nor did Wilcox seem terribly concerned that it posed a serious threat. “We have super great security. I’m not worried about preventing some kind of attack. But I’m very interested in figuring it out, or experimenting, or extracting more evidence,” said Wilcox. “They’re very far from winning. So far from winning.”

And I’m curious too. Right now my phone is somewhere, I know not where, awaiting its strip down. Even if it wasn’t used to topple a privacy-guaranteeing digital currency—which, judging from everything I’ve learned, would have been a technological miracle—it’s still quite likely that someone was on it listening to me. Who? Why? For how long? If anything, this experience has deepened my respect for the people who are trying to make it easier to keep our private information private. And at the very least, I’ve learned a lesson: when you get invited to a super-secret cryptography ceremony, leave your phone at home.

A white Starry Beam antenna mounted on a pole for a beta test in Boston, Massachusetts

Startup Says Beaming Millimeter Waves Over the Air Will Make It a Star in Ultra-Fast Wireless Broadband

Standing on the flat roof of a data center at an undisclosed location in Boston, a shivering Chet Kanojia gestures toward a sleek white box about the size of a piece of carry-on luggage. This is the proprietary base station that the seasoned startup founder believes will change the way the world receives its Internet and liberate frustrated customers from the iron grip of legacy providers such as Comcast and Time Warner.

The white box mounted on a pole before us is called a Starry Beam. Only about a dozen of these custom base stations exist in the world right now. The team at Kanojia’s newest startup, named Starry, has spent the past 20 months perfecting the base station’s design. Its performance so far on this nondescript rooftop has persuaded Kanojia that the Internet of the future will not be delivered through expensive fiber optic cables laid in the ground, but beamed over the air using high-frequency millimeter waves.

Standing beneath the Starry Beam, Kanojia points past a spate of warehouses lined with leafy trees to an apartment complex about a kilometer away. There, jutting out from the window of an apartment that Starry has rented, is a white spherical device called a Starry Point. Starry Beams broadcast millimeter waves to Starry Points, which convert them to lower frequencies that flood the home so users can stream ultra-high-definition 4K TV shows to their hearts’ content.

This method, Kanojia believes, can offer much faster service to customers for far less money. In January, he shared that vision at Starry’s launch party in New York City. The company has been quiet ever since, but executives now say their first beta, which has been underway since late August, has confirmed their basic premise—that millimeter waves can deliver ultra-fast broadband speeds up to 1 gigabit per second to customers over the air.

Based on these early results, Starry anticipates that user speeds on its yet-to-be-built network will average 200 to 300 megabits per second, as fast as any broadband connection available today. For comparison, the average U.S. broadband connection delivers download speeds of just 55 megabits per second.

Right now, Starry’s beta is only measuring the performance of this original Starry Beam that serves a handful of users. In the first quarter of 2017, the company will launch an open beta and build its test network out to a half dozen sites capable of serving several hundred users. Starry has also received permission from the U.S. Federal Communications Commission to run tests in 14 other cities including New York, Dallas, Seattle, San Francisco, and Chicago.

Starry CTO Joe Lipowski says the startup doesn’t plan to publish the results of the beta, which makes it hard for anyone to independently evaluate the company’s claims. Starry has issued no press releases about its progress, and Kanojia has also kept the details of his fundraising under wraps. “The less people know about our performance, the better it is for us,” he says.

That attitude has left outsiders wondering what to think of the company’s prospects in such a highly competitive market. “On the surface, the technology sounds like it's sufficient to do what they need it to do,” says Teresa Mastrangelo, a longtime wireless analyst with Broadbandtrends LLC. “We haven’t really seen anything at a big scale. I'll be curious to see how it goes when we're looking at tens of thousands of subscribers.”

If the company can successfully scale, Starry could rewrite the story of what it means to provide high-speed Internet service to homes and businesses. Millimeter waves are high-frequency radio waves that occupy a section of the electromagnetic spectrum that has never been used for consumer technologies. While WiFi, Bluetooth, and cellular carriers have operated on frequencies below 6 gigahertz, Starry is currently testing its technology at 38.2 GHz and 38.6 GHz (where waves are much shorter in length), with future plans to broadcast at 37 GHz and 40 GHz.

Millimeter waves offer several advantages over the lower-frequency waves that deliver cellular data on 4G LTE networks, and even over the fiber that pipes broadband Internet service into homes. First, there is a lot more open bandwidth in the millimeter-wave range than at lower frequencies crowded with signals from smartphones, microwaves, and WiFi devices. And Starry thinks sending the Internet over the air to consumers will be much cheaper than digging up the ground to lay cables.

In fact, Kanojia estimates that Starry can build out a wireless network that costs only $25 for every home it serves in areas with a population density of at least 1,500 homes per square mile. Installing fiber networks typically costs $2,500 per home. Kanojia thinks the company can make money with market penetration as low as 3 to 5 percent, whereas fiber deployments sometimes require up to 65 percent penetration to be profitable.
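Those figures imply a hundredfold difference in build cost per home passed, which is where the much lower break-even penetration comes from. A quick back-of-envelope check in Python, using only the numbers quoted above:

```python
# Figures quoted in the article; the comparison itself is simple arithmetic.
homes_per_sq_mile    = 1500
cost_per_home_starry = 25     # USD, wireless build-out per home passed
cost_per_home_fiber  = 2500   # USD, fiber build-out per home passed

starry_cost_per_sq_mile = homes_per_sq_mile * cost_per_home_starry  # $37,500
fiber_cost_per_sq_mile  = homes_per_sq_mile * cost_per_home_fiber   # $3,750,000

# If break-even penetration scales with build cost, a 100x cheaper network
# needs roughly 1/100th the take rate, consistent with 3-5% vs. ~65%.
cost_ratio = fiber_cost_per_sq_mile / starry_cost_per_sq_mile  # 100.0
```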

One factor that will likely work in Starry’s favor is the range and agility of its Starry Beams. Kanojia says these base stations can deliver superfast Internet service to any customer within 1.5 kilometers who also falls within “near-line-of-sight” of a Starry Beam. That’s an important finding because millimeter waves are often presumed to perform best at shorter distances when there is a clear path between a base station and the end user—and such a direct route can be difficult to find in cities. Millimeter waves can’t easily penetrate windows or buildings, or maneuver around objects like traditional cellular signals can. They are also prone to degrade over longer distances when passing through foliage or rain.

To work around that, Starry equipped each Starry Beam with four active phased arrays, which are rows of tiny antenna elements that cooperate to point and amplify signals in precise directions. With these arrays, a base station can transmit signals more rapidly and with more precision than traditional antennas. In practical terms, this means the Starry network can serve Starry Point receivers mounted on the sides of a building from the same base station that serves those in front by bouncing signals off of buildings and other reflective surfaces. “Our measurements have shown that there’s tremendous reflections,” Lipowski says. Even under what they call “extreme non-line-of-sight” conditions, they’ve delivered data rates of 200 Mb/s to beta users.
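Starry hasn't published its array design, but the steering math behind any uniform linear phased array is standard: to tilt the beam off boresight by an angle theta, each element is fed a progressive phase shift of 2πd·sin(θ)/λ. A sketch at one of Starry's test frequencies, with half-wavelength element spacing as an illustrative assumption:

```python
import math

freq_hz = 38.2e9             # one of Starry's test frequencies
c = 3.0e8                    # speed of light, m/s
wavelength = c / freq_hz     # ~7.85 mm: why these are "millimeter" waves
d = wavelength / 2           # assumed half-wavelength element spacing

def element_phase(n: int, theta_deg: float) -> float:
    """Phase (radians) applied to element n of a uniform linear array
    to steer the beam theta_deg away from boresight."""
    return 2 * math.pi * d * n * math.sin(math.radians(theta_deg)) / wavelength

# Steering 20 degrees off boresight: each successive element leads its
# neighbor by about 61.6 degrees of phase.
step_deg = math.degrees(element_phase(1, 20.0) - element_phase(0, 20.0))
```

Changing those per-element phases electronically is what lets a base station redirect its signal toward a reflection path with no moving parts.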

Based on these results, Kanojia thinks Starry can provide broadband service with a deployment model similar to existing LTE networks: renting space on existing rooftop cell towers through companies such as the American Tower Corporation. To cover all of Boston, which measures about 230 square kilometers, Kanojia figures the company will need to install three or four Starry Beams at 20 to 30 sites. Each box will support about 1,000 users and boast throughput of 5 Gb/s, for a total of 15 to 20 Gb/s per site. They expect this rate will improve to 45 to 50 Gb/s per site in 2017, once the company upgrades its equipment to meet a new wireless standard known as 802.11ax.
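Those capacity figures can be sanity-checked with a little arithmetic; taking the midpoints of the quoted ranges is my own assumption:

```python
city_area_km2 = 230           # Boston, as quoted
sites = 25                    # midpoint of the 20-30 sites quoted
beams_per_site = 3.5          # midpoint of 3-4 Starry Beams per site
users_per_beam = 1000
beam_throughput_gbps = 5

area_per_site_km2 = city_area_km2 / sites                     # 9.2 km^2 per site
site_throughput_gbps = beams_per_site * beam_throughput_gbps  # 17.5, within the 15-20 quoted
users_per_site = beams_per_site * users_per_beam              # 3,500 users

# If every user drew data simultaneously, each would get only:
per_user_mbps = site_throughput_gbps * 1000 / users_per_site  # 5.0 Mb/s
# The 200-300 Mb/s averages therefore depend on statistical multiplexing:
# at any instant, only a small fraction of users are actively pulling data.
```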

Though Starry says it has cleared some of the biggest technical hurdles that millimeter waves pose for delivering high-speed Internet over the air, it must still find the right pricing model to bring the service to market. “There's no doubt that one could make a system work at 1.5 kilometer range at 37 GHz. In fact, that's a pretty modest range,” says John Naylon, CTO at Cambridge Broadband Networks Limited, which operates several millimeter wave networks throughout the U.S. “The issues are going to be economic.”

Mastrangelo, the analyst, says that based on competitors’ rates, Starry would need to price its broadband plan below $100 a month, and ideally between $65 and $85 a month. Unfortunately, Starry’s heavy reliance on custom-built hardware means that its base stations are much more expensive than off-the-shelf models.

Meanwhile, plenty of other wireless providers are rushing to develop their own gigabit solutions. Though Google has more or less abandoned its costly fiber deployments, it recently purchased a company called Webpass that provides wireless broadband to entire buildings by installing rooftop antennas. (Starry offers a similar option for landlords who want to hook up their properties.) Verizon and AT&T have both said that they will launch trials for delivering over-the-air broadband in 2017. Mastrangelo warns that if Starry doesn’t act quickly, the start-up could fall behind.

“If they had been able to come out with a service when they first unveiled it in January, they would have definitely had a huge head start and probably positioned themselves to be an acquisition for somebody,” she says. “But their timing is not fantastic at this stage.”

Jonathan Wells, president of the wireless consulting firm AJIS LLC, says even if Starry can scale and solve the complications of serving hundreds of users at once through phased arrays without causing interference, competition could quickly undercut their plans.

“I think Starry may well be the first there with the technology and if they are successful, they'll get snapped up by Verizon or AT&T,” Wells says. “But I think offering a service that is competitive with Verizon and AT&T is incredibly hard.”

Kanojia says Starry will ultimately compete with its gigabit rivals by providing exceptional customer service, rather than focusing only on high speeds. The company expects to double in size from roughly 100 employees to 200 before the end of next year; among them will be its first batch of customer representatives. But while the Starry team has already proven it can deliver speed, they may find that providing top-notch customer response is more of an art than a science.

The Canadian startup Maluuba has developed deep-learning datasets to train AI on language comprehension and dialogue

Deep Learning Startup Maluuba's AI Wants to Talk to You

Apple’s personal assistant Siri is more of a glorified voice recognition feature of your iPhone than a deep conversation partner. A personal assistant that could truly understand human conversations and written texts might actually represent an artificial intelligence capable of matching or exceeding human intelligence. The Canadian startup Maluuba hopes to help the tech industry achieve such a breakthrough by training AI to become better at understanding languages. The key, according to Maluuba’s leaders, is building a better way to train AIs.

Raytheon Phaser looks like a tan trailer with a flat dish on an arm at top and a rectangle at a 45-degree angle.

Raytheon Sets Phasers to Drone Destruction with Directed Energy Weapon Test

There are all kinds of creative ways of dealing with rogue drones: Radio jamming. Other drones with nets. Trained eagles. None of these is really designed to handle military drones, however, and large, fast-moving UAVs are still a potential threat, especially if more than one is coming at you at once. It's no surprise that the U.S. Army has been developing solutions for this potential threat—we're not sure what they're working on now, but as of late 2013, Raytheon was successfully testing a long-range, high-power directed microwave weapon capable of taking out swarms of drones in milliseconds.
