Tech Talk


3 Ways To Bridge The Digital Divide

What will it take to bring the next billion people online? These days, the answer has as much to do with smart policy as with technical expertise. This week in Washington D.C., policy experts worked alongside engineers at a meeting (hosted in part by the IEEE Internet Initiative) intended to sketch a picture of what such a transition might look like around the world.

Companies such as Google and Facebook would like to know, and so would government leaders struck by the Internet’s power as an economic engine. More than half the world’s population, or about 4.2 billion people, do not have regular access to the Internet, according to the latest report published last fall by the U.N. Broadband Commission.

Last year, the U.S. State Department announced the Global Connect Initiative, which aims to bring 1.5 billion people online by 2020. As part of that effort, some of the ideas discussed this week will be presented on Thursday to finance ministers during a high-level meeting at the World Bank led by U.S. Secretary of State John Kerry.

Experts emphasized that there is no single technology or network structure that makes sense for every community. However, they offered a few good starting points for any country looking to bolster its number of Internet users:

1. Permit unlicensed use of white space.

White space is a term for TV and radio frequencies that aren’t currently in use for existing channels. Originally, these extra frequencies were packed between channels as a sort of buffer in order to prevent interference. But companies have since found ways to operate on these channels without causing any interruption to neighboring programs.

Furthermore, a global transition to digital television and radio from analog has freed up an even broader swath of spectrum. Digital signals can transmit on adjacent channels without causing a disruption to either. Since rural areas tend to have access to fewer existing channels in the first place, they would have even more leftover spectrum.

New devices including smartphones, tablets, and computers that know how to detect unused spectrum can use it to transmit wireless broadband signals, also known as “WhiteFi” or “Super Wi-Fi.” These frequencies are especially useful because they can carry a lot of data over long distances and reach indoors. Tech companies including Google, Microsoft, Intel, Dell, and HP faced off against broadcasters to support early efforts to reuse white space for this purpose, and launched some of the first tests for new devices capable of doing it.

Now, enthusiasm for WhiteFi is picking up across the world. A national demonstration project in the United States conducted in public libraries has since spread to Finland, Malaysia, and the Philippines. Separately, Kenya has also experimented with it in two rural communities while Microsoft and Google recently led trials in South Africa. The Indian Institute of Technology has tested the technology in 13 villages and hopes to eventually serve many more.

2. Adopt a “dig once” mentality.

Whenever a company wants to install a new optical fiber cable to provide better Internet access to a house or community, it must first hire bulldozers and a construction crew to dig a path to the new destination. If multiple companies want to deploy fiber to the same area at different times, they might wind up digging the same route again.  

It’s easy to understand why this process is expensive and disruptive to locals. Experts at this week’s meeting say a much easier and cheaper approach would be for governments to require road construction crews to lay a single conduit alongside each new road as they are building it, through which all future fiber optic cables could be threaded. International development banks could do the same for the projects they fund. Experts stressed the value of these “dig once” policies; the U.S. Federal Highway Administration has said that this way of doing things can reduce the cost of deploying broadband by 90 percent.

This idea is gaining some traction, at least in the United States. The U.S. Departments of Commerce and Agriculture promoted it in a report published last fall. Around the same time, a lawmaker proposed a bill to implement it for all federal highway projects. However, the “dig once” policy is still not fully incorporated into federal, state, or local requirements and has yet to take hold elsewhere in the world.

3. Develop local content.

One of the most consistent ideas to emerge during this week’s meeting was that simply providing technical tools for Internet access isn’t sufficient. To welcome the next billion users, companies and technologists need to engage deeply with local communities to determine if and how they intend to use this access. That way, said the experts, networks can be built out in ways that best suit those purposes. In other words, responding to actual demand for the Internet is as important as devising new schemes to offer it.

One key part of that response is producing local content that is relevant to potential new users in their native languages. Many governments have begun to offer online services for employment, taxes, or licenses, which is one way to generate local content. Developers are also seeing success with local sites and apps that help people share with each other in a particular region. 

“You want to provide Internet access, but what do the end users really need?” said Dilip Krishnaswamy, an IBM researcher based in Bangalore, India. “Maybe they don’t care about the presidential election as much as they want to connect with each other.” India is a good example of the humongous potential demand for local material—it’s home to 1.2 billion people who speak 22 major languages.

All this new content must also be designed to work on devices that are available and popular in that area, rather than the latest smartphones used in Europe or the United States. During the meeting, experts at one table discussed obstacles to Internet use in Africa. They mentioned the ongoing challenge of simply charging devices in many parts of the continent. In response, someone tossed out the idea of hosting a hackathon devoted wholly to developing apps that consume as little power as possible.  

Editor’s note: This story was updated on April 15 to change “IEEE Internet Society” to “IEEE Internet Initiative.”


Software Rules Tax Preparation, But at What Cost?

It’s mid-April, which means it’s the end of tax season in America again, when those who haven’t yet filed their income taxes scramble to beat the impending deadline. This year, like every year before it, more of those filers than ever will use software to help prepare their taxes.

It’s been thirty years since the Internal Revenue Service began embracing technology in a big way: In 1986 the agency piloted a program for electronic filing. The initial project required an IRS employee to manually turn a modem on each time returns were received, and it could only process certain simple returns. From 25,000 returns in that pilot year, the program grew rapidly: to 4.2 million returns the first year the program went nationwide, in 1990; to 68 million in 2005, when electronic filing surpassed mailed returns; and to over 125 million last year, or more than 85% of all individual returns.

Today, computers are ubiquitous throughout the process of taxation. Since 2010, the IRS no longer mails out 1040 forms—even if you still want to fill out paper forms, the agency expects you to download and print them yourself.

The rise of electronic filing has been mirrored by the growing role and influence of tax prep software. In 2015, over 50 million people filed self-prepared electronic returns, accounting for 1 in 3 individual filings. While more taxpayers still rely on tax professionals, the balance continues to slowly shift toward software-assisted self-filing (in 2006, only 15% of returns were done that way).

In some ways, taxes are a natural domain for computer assistance. Tax legislation can mostly be modeled as a set of rules and criteria that apply under certain conditions. But the problem is that most tax codes were not written with automation in mind, so there’s a lot of work required to translate them into a technical specification. (As my colleague Robert Charette has noted, the Standard Federal Tax Reporter, which explains the U.S. tax code to accountants, has grown to over 70,000 pages.) Not to mention the dozens of state and local tax regulations.

The upfront investment required to build a comprehensive abstraction layer on top of such a large collection of requirements is a significant barrier to entry for new competitors. That partially explains the success of Intuit’s TurboTax, which dominates the consumer market, processing more than twice as many returns as its nearest competitors, H&R Block and TaxAct, combined. Together, the three account for nearly 90% of returns filed electronically by individuals.

There are a number of reasons consumers choose software like TurboTax, with convenience and cost near the top of the list. (Disclosure: I’ve used TurboTax for many years, including this year.) But not everything that’s good for TurboTax is good for its customers, and certainly not for the IRS.

For one thing, TurboTax has a vested interest in making sure the tax code stays complex or becomes even more complex over time. The company has lobbied heavily against initiatives like California’s return-free filing.

There’s also evidence that the sheer scale of TurboTax’s customer base has given the company a wealth of valuable data, allowing it to understand taxes as well as, and sometimes better than, the IRS. That came to light last year when TurboTax was forced to temporarily stop processing state returns after an unprecedented increase in fraudulent returns. A pair of whistleblowers claimed that TurboTax ignored its own internal fraud models, which were more reliable than those at the IRS. Similarly, I suspect that TurboTax has a large enough sample size of data to accurately reverse engineer IRS auditing risk models (which allows the company to confidently offer audit protection for an additional fee).

Finally, there’s a danger to filers dependent on tax-preparation software: The more we rely on software like TurboTax, the more we risk falling into the complacency of the automation paradox, in which we no longer know enough about how taxes work to hold the system accountable or do our own sanity checks. Maybe we would be better off with a simpler underlying protocol than a user-friendly abstraction layer.

In any case, best of luck to those of you who have yet to file!


Bringing Augmented Reality to Real Eyeglasses

If reality isn’t cutting it for you, just hold on; engineers are working on augmenting it. At least, they hope to show you more than what would normally be before your eyes, by adding systems to ordinary eyeglasses that would display images and data to enhance your experience.


Antimatter Starship Scheme Coming to Kickstarter

A spaceship departs Earth on a one-way, 42-year trip to Alpha Centauri. It runs on an antimatter engine that blasts the ship out of the solar system at one-tenth the speed of light. This is not the premise for a new Ridley Scott sci-fi drama but rather the endgame of a crowdfunded spaceship project launching this month.

West Chicago, Ill.-based Hbar Technologies plans a Kickstarter effort to raise US $200,000 for the next design phase of an antimatter-propelled spaceship. The two scientists behind this design effort are a veteran Fermilab particle accelerator scientist and a former Los Alamos National Laboratory physicist and founding director of the U.S. Center for Space Nuclear Research. They originally developed the concept for NASA at the turn of the millennium.

Because of budget cutbacks, the U.S. space agency dropped the antimatter-driven spaceship project in 2004. But the scientists say the plans they’re developing are technically feasible—if admittedly still quite optimistic in terms of the breakthroughs needed to enable antimatter to be stored in a fuel tank.


Could Satellite Messaging Startup Higher Ground Bring Down the 911 System?

Higher Ground might be the only Silicon Valley startup promising not to disrupt its entire industry. This small satellite messaging business is battling claims by telecoms companies that its SatPaq device could interfere with their services, interrupt life-saving emergency calls and even cause major outages across the United States.

IEEE Spectrum can reveal that for the past several years, Higher Ground has been quietly developing SatPaq, a smartphone case with a flip-up antenna that communicates with geostationary satellites. Connecting to a smartphone messaging app via Bluetooth, SatPaq can send and receive text messages and email almost anywhere in the United States, including the wilds of Alaska.

The problem is that SatPaq works in the same C-band microwave frequencies used by CenturyLink and other companies for voice and data communications in rural areas and as part of their national networks. They fear that the widespread use of SatPaqs could result in catastrophic interference.

“[This] is not just potential interference to one or two specific links in a particular location but… potential interference to each and every such link of the network throughout the country,” wrote CenturyLink in a submission to the Federal Communications Commission (FCC). “This seems to be a recipe for disaster.”


Bigelow Space Habitat on Its Way to ISS as NASA Prepares to Blow It Up

While the most interesting piece of cargo on its way to the ISS after SpaceX's successful Falcon 9 launch is almost certainly some intrepid Tokyo Bekana cabbages, a close second has to be the Bigelow Expandable Activity Module (BEAM), a pleasingly round inflatable space habitat that the astronauts are going to attach to the ISS and then blow up to see what happens. If everything goes according to plan, explosions will be minimal, and BEAM will inflate to its maximum curvaceousness. This will prove, over the course of the next two years, that the future of space habitation is something to get pumped up about.


U.S. Court Postpones Decision On .africa Domain Name

On Monday 4 April, a California court cancelled a hearing to determine whether the .africa domain could be released to a South African domain-name registry by the nonprofit Internet Corporation for Assigned Names and Numbers (ICANN). According to ICANN, which issues and manages generic top-level domains on behalf of the global Internet community, the court will issue a ruling at an unspecified future date. The delay prolongs a four-year debate over which of two registries should control the continent’s prized domain. Registries resell domain name rights to registrars such as GoDaddy, which, in turn, sign up Web addresses from customers under that domain.

ZA Central Registry technically won the rights to .africa back in 2013 via ICANN’s official process for delegating geographic domain names. ICANN’s decision was challenged in court by a rival registry called DotConnectAfrica.

The legal battle to determine .africa’s true owner could take months or years to resolve. Though DotConnectAfrica has requested an injunction asking the court to prevent the immediate transfer of .africa to ZA Central Registry, the cancellation of Monday’s hearing is no guarantee that the court won’t still give ZA Central Registry a green light to launch .africa in the meantime.

Neil Dundas, executive director of the organization that backs ZA Central Registry, told IEEE Spectrum in March that if the injunction were dismissed, ICANN could probably issue the .africa domain to ZA Central Registry within two weeks. Then, ZA Central Registry would be required to host a live .africa site for a month-long trial period. The public sale of .africa sites would begin soon after.

The popularity of so-called “not com” domains has exploded in recent years as website owners find the most commonly used extensions (.com and .org) have become too crowded, and the shortest and easiest-to-remember addresses were already taken.

In response, ICANN invited registries to apply to create new domain names for the Internet. Since 2012, the organization has released more than 900 new domain names for public use, including .yoga, .bar, and .viking.


How Livermore Scientists Will Put IBM's Brain-Inspired Chips To The Test

Last week, Dharmendra Modha said goodbye to a computer some six years in the making: a set of 16 interconnected TrueNorth chips built to mimic the ultra-low-energy, highly-parallel operation of the human brain.

On Thursday, a team from IBM Research-Almaden in California hopped in a car and drove the unit some 75 minutes north to the U.S. Department of Energy’s Lawrence Livermore National Laboratory. There, scientists and engineers will evaluate whether the technology could be a useful weapon in their computing arsenal.

It was a big moment for the IBM program, which devised the TrueNorth concept in 2010 and unveiled the first chip in 2014. Developed in collaboration with Cornell University, the TrueNorth chips use traditional digital components to implement a decidedly more brain-like behavior; each 5.4-billion-transistor chip can consume as little as 70 milliwatts (for more on how that could possibly work, see our 2014 story “How IBM Got Brainlike Efficiency From the TrueNorth Chip”).  

Although these were not the first TrueNorth chips to ship, the array is notable, Modha says, because it integrates 16 chips onto a single board, allowing the company to demonstrate that it can “scale up” the approach to larger and larger systems. The entire 16-chip array can require as little as 2.5 watts (other subsystems, such as the communications fabric, add some overhead to that).

Livermore, which has some of the world’s fastest supercomputers and signed a $1 million contract with IBM for the TrueNorth unit, will be exploring how this new technology might play a role in areas such as cybersecurity and physical simulation. 

I was particularly excited to see exascale computing mentioned in the press release announcing the system. Perhaps the biggest looming question for high-performance computer makers is how we’ll reach the exascale—when machines are some 30 times as fast as the fastest supercomputer today—without also creating staggering (and probably infeasibly expensive) utility bills.

But as it turns out, chances are slim that we’ll be simulating nuclear weapons or designing tomorrow’s nuclear reactors on supercomputers composed entirely of chips modeled on the human brain. Although TrueNorth can, in principle, perform any computation, the speed and efficiency of such neuromorphic chips only shines in particular applications such as pattern recognition. Traditional computers will still be with us, Modha says: “What we’re offering is a complementary architecture.” 

Engineers are still sorting out the best way to build an exascale supercomputer, says Brian Van Essen, a computer scientist at Livermore’s Center for Applied Scientific Computing. Heterogeneous computing—which could mix different computing technologies such as CPUs, graphics processing units, FPGAs, and neuromorphic chips—“is definitely one potential path,” he says. But, he adds, “it’s not clear what the final system design is going to look like.”

Van Essen says one area Livermore hopes to explore with the TrueNorth chips is their potential role in large-scale simulation. “As we scale simulations and modeling [of] physical systems up to large sizes, sometimes the simulations can get into an area where the numerics get kind of garbled up,” he says.

He says a team is in the midst of evaluating whether machine learning can be used to detect problems before a simulation crashes and correct for the behavior. Van Essen says that if the approach looks promising, one could envision chips distributed throughout the system that will monitor the progress of a simulation. It would take a “nontrivial amount of horsepower to monitor the system,” Van Essen says, adding that it would be a good application for a low-power technology such as TrueNorth.

If you’re looking to keep track of TrueNorth developments, Dharmendra Modha maintains a detailed blog.

Follow Rachel Courtland on Twitter at @rcourt.


5 Ways Cyber Experts Think the FBI Might Have Hacked the San Bernardino iPhone

Last week, the FBI announced that it had, with the help of a third party, successfully broken into the passcode-protected iPhone 5C owned by San Bernardino shooter Syed Farook. It’s not clear yet whether the FBI found any information useful to its investigation, but the hack brought at least a temporary reprieve to the very public battle between Apple and the FBI over encryption and privacy rights. 

The agency hasn’t named its accomplice nor has it revealed how it gained access to the iPhone’s contents. To shed some light on the possibilities, IEEE Spectrum spoke with nine computer security experts and mobile phone forensics specialists about a few techniques that may have been behind this controversial hack:

1. The easy way in

Perhaps the simplest hack of all would be to exploit a vulnerability in iOS 9, the version of Apple’s operating system installed on Farook’s phone. Several experts including Robert Cunningham, chair of the IEEE Cybersecurity Initiative, and Dudu Mimran, chief technology officer for the Telekom Innovation Laboratories at Ben-Gurion University in Israel, believe this is the most likely approach.

Armed with the right security hole, also called a zero-day exploit, a hacker could potentially switch off functions that thwarted the FBI’s entry. These include a built-in delay that prohibits a user from trying too many incorrect password combinations at once, and an optional setting that prompts an iPhone to erase its memory after 10 failed entries. Once a hole is identified, there are many ways to deploy a bug to take advantage of it. The code can be sent as a malicious text message or by exploiting the driver that connects a charger to a laptop to enable new software to be uploaded to a phone.

As an added bonus, maneuvering via a bug is relatively low risk because these strategies avoid tampering with the iPhone’s physical components (more on that approach later). Joel Bollo, CEO of MSAB, says the vast majority of mobile forensics solutions that his company executes for law enforcement clients are software-based.

So what kind of zero-day may have helped authorities slip in? It’s not entirely clear, but it’s not unreasonable to think that one could exist. There’s a healthy market for uncovering such flaws: The cybersecurity firm Zerodium paid a $1 million bounty last fall to a team that exposed a hole in iOS 9. As Mimran says, “There is no software that is considered bulletproof.”

2.  Trick the OS

Inside the iPhone 5C is an A6 chip that features both processors and RAM, which work together to achieve faster speeds than those that were available in previous models. In order to keep track of passcode attempts, this “system on a chip” also communicates with non-volatile memory stored elsewhere, such as in flash memory.

This setup leads experts to a second theory: that hackers may have circumvented the iPhone’s passcode protection by hijacking operations between the A6 and the non-volatile memory.

Ran Canetti, a computer scientist at Tel Aviv University and head of the school’s Check Point Institute of Information Security, says one way to do this would be to tamper with the physical line of communication that carries password recovery instructions between the two. A knowledgeable hacker could use this line to re-route Apple’s software, which typically receives marching orders from both the phone’s flash and RAM, to an external device. The FBI and its silent partner could’ve used such a device to instruct the software to continue accepting failed passcode attempts until the investigators arrived at the correct one.

“They can basically reset the place where it says, ‘Now you've tried nine times,’” Canetti says. “When the phone asks, ‘How many times have you tried?’ they say—‘No, you’ve only tried one time.’”

With the software rejiggered, the FBI could launch a traditional “brute force” attack, employing a software program to rapidly try password combinations until it arrived at the correct one. Since Farook’s iPhone 5C used a four-digit passcode, a program could run through every one of the 10,000 possible password combinations in a matter of minutes.
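As a back-of-the-envelope sketch of that last step, enumerating the four-digit space is trivial. The `try_passcode` stub below is hypothetical, standing in for however the rejiggered software would accept guesses; the hardcoded secret exists purely for demonstration:

```python
def try_passcode(guess, secret="7291"):
    # Hypothetical stand-in for submitting a guess to the modified phone.
    # The secret is hardcoded here only so the sketch is runnable.
    return guess == secret

def brute_force_pin():
    # Enumerate all 10,000 four-digit combinations, "0000" through "9999".
    for n in range(10_000):
        guess = f"{n:04d}"
        if try_passcode(guess):
            return guess
    return None
```

Even at a sluggish 10 guesses per second, the full space takes well under 20 minutes.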

“That brute force technology isn't very sophisticated,” says Dylan Ayrey, a security engineer with the information security company Praetorian. “You could go on eBay right now and purchase ways to brute force older versions of the iPhone.”

3. Reset (and reset and reset) the memory

One of the most popular theories among crypto-experts, including Gary McGraw, chief technology officer at the software security consulting firm Cigital, is that the FBI hacked the iPhone through a tactic called NAND mirroring. NAND is a form of flash technology used in memory chips for high-capacity and long-term storage.

Within an iPhone, NAND is thought to play a role in erasing a digital key required to unlock an iPhone’s memory after logging 10 failed password attempts. But if someone knows how to circumvent or reset the tally after each attempt, they could help themselves to unlimited tries.

One way to manually do that might be to remove the memory chip that NAND protects and make a digital copy of it. Once the copy is made, a hacker could test out combinations and simply reload the memory back onto the original chip before the 10-attempt limit is reached. iPhone forensics expert Jonathan Zdziarski has said this strategy is a lot like hitting “save” on a video game. If you die (or, in this case, lose your data) you simply go back and pick up where you left off.
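That “save and reload” loop can be sketched in a few lines. `ToyPhone` here is a made-up model of the 10-attempt wipe behavior, not real iPhone firmware; the attack snapshots the NAND state and restores it before the wipe counter is ever reached:

```python
class ToyPhone:
    """Hypothetical model of the 10-attempt wipe; not real iPhone firmware."""
    WIPE_LIMIT = 10

    def __init__(self, secret):
        self.nand = {"secret": secret, "failed_attempts": 0}

    def try_pin(self, guess):
        if self.nand["failed_attempts"] >= self.WIPE_LIMIT:
            raise RuntimeError("memory wiped")
        if guess == self.nand["secret"]:
            return True
        self.nand["failed_attempts"] += 1
        return False

def nand_mirror_attack(phone):
    snapshot = dict(phone.nand)          # "hit save": copy the NAND state
    for n in range(10_000):
        guess = f"{n:04d}"
        if phone.try_pin(guess):
            return guess
        if phone.nand["failed_attempts"] == ToyPhone.WIPE_LIMIT - 1:
            phone.nand = dict(snapshot)  # reload the copy before the wipe
    return None
```

A real attack would, of course, rewrite the physical flash chip rather than a Python dictionary, which is where all the practical difficulty lies.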

Though it’s a crowd favorite among cybersecurity experts, FBI Director James Comey said in a press briefing in March that this approach, also called a replay or reset attack, wouldn’t work on Farook’s phone. But many remain skeptical of Comey’s insistence; shortly after he made that statement, Zdziarski contradicted it with a demonstration of the technique in a blog post.

That’s the post that won Cigital’s McGraw over to this theory, and he’s not the only one. Praetorian’s Ayrey says, “I think that strategy is very likely and I think that's basically the same sneak we would do here.”

4. Tear the whole thing apart  

An iPhone’s memory chips are shrouded in layers of both physical and digital protections to block hackers. To uncover its secrets, hackers must sometimes mount a physical attack in order to bypass certain tamper-resistant features.   

There are a few ways to do this. A hacker could start by heating up the device in order to detach a memory chip. The next step: using acid to remove the surface layers of the chip in an act known as “decapping.” That could be followed up with some precision work with a tiny laser drill for reaching sections of the chip the hacker wants to more closely examine.

Ari Juels, a professor in the Cornell Tech Security Group, says the goal in the Farook case would be to extract the handset’s unique ID, which is a special digital key that Apple assigns to each device during manufacturing and could be used to decode an iPhone’s memory.

Apple said in a white paper published last fall that in order to obtain this key, a hacker would have to mount a “highly sophisticated and expensive physical attack.” This is certainly an option the FBI may have considered, but it runs the risk of obliterating the memory forever if a technician makes even the slightest miscalculation.

“This is a very invasive and expensive and tricky thing to do,” Dan Wallach, a computer security expert at Rice University, warns. “It's a destructive process that has a percentage chance of destroying the device.”

5. Sneak in through the side

A device that is hard at work can offer clues about the information it is handling. These clues include its power consumption, acoustic properties, electromagnetic radiation, or the time it takes for a specific component to complete a task.

In what’s known as a side-channel attack, experts can use specialized tools to monitor these properties and use the data they gather to infer what’s happening inside a device. For example, a hacker could hook up a resistor to the iPhone’s internal circuits and read the amount of energy that flows by with each passcode attempt. Ben-Gurion University’s Mimran likens it to putting your ear up to a safe, listening for a satisfying click as you turn the dial.

While Cunningham of the IEEE Cybersecurity Initiative says a hacker wouldn’t likely be able to read a PIN or passcode through this method, a would-be invader could almost certainly glean details about the size or complexity of the key and the nature of the cryptographic system within.

For example, a passcode retrieval process built on Montgomery multiplication, a technique common in encryption implementations, requires a chip to repeatedly square a large string of numbers. Eventually, it instructs the chip to multiply its result with the last integer used in this massive calculation. Depending on the integers and at what point the chip performs this computation, this process could require more or less energy.
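The textbook illustration of this kind of leak is generic square-and-multiply exponentiation (not Apple’s actual code): the extra multiply occurs only for the 1-bits of a secret exponent, so a simple count of operations, standing in for a time or energy trace, already reveals how many 1-bits the secret contains:

```python
def modexp_with_trace(base, exponent, modulus):
    """Left-to-right square-and-multiply; returns (result, operation count).

    The operation count stands in for a time/energy trace: the extra
    multiply happens only when an exponent bit is 1, so the trace
    depends on the secret exponent.
    """
    result, ops = 1, 0
    for bit in bin(exponent)[2:]:
        result = (result * result) % modulus    # square for every bit
        ops += 1
        if bit == "1":
            result = (result * base) % modulus  # multiply only on 1-bits
            ops += 1
    return result, ops

# Two secrets of the same bit length leave visibly different "traces":
_, ops_sparse = modexp_with_trace(5, 0b10000001, 1009)  # two 1-bits
_, ops_dense = modexp_with_trace(5, 0b11111111, 1009)   # eight 1-bits
```

This is why hardened implementations go to great lengths to make every iteration perform identical work regardless of the key bits.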

Rice University’s Wallach says the best place to start when mounting a side channel attack would be to order specs on the iPhone 5C from a company such as Chipworks or iFixit. These firms specialize in breaking down commercial devices and writing detailed reports about their components, as well as offering their best guesses as to how information flows throughout a device.

But even with a cheat sheet, a side channel attack is a very delicate process given the tiny wires and chips that make up a smartphone’s circuitry and internal components. What’s more, chipmakers have wised up to this strategy, so many now install features that cause a chip to generate electromagnetic noise or maintain a steady power draw no matter what function it’s performing, in order to confuse attackers.


MIT Turns Wi-Fi Into Indoor GPS

Global Positioning System (GPS) satellite technology comes in handy for tracking cruise missiles, doing in-car navigation, and finding secluded restaurants. But step inside an airport, museum, or mall, and you’re often relegated to studying a paper map or asking for directions.

There are positioning systems designed for indoors, but they rely either on GPS-like radio or magnetic beacons, or on mapping the ever-shifting morass of Wi-Fi access points. Such methods have proved expensive to install and difficult to scale. What’s more, these indoor GPS systems are far from accurate enough to let you do cool things like have a robot follow or avoid you.

Now researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed a way for adjacent Wi-Fi devices, including smartphones, to locate each other within centimeters. The technology, called Chronos, relies on making the devices emulate multi-gigahertz wideband radios.

Chronos starts by having two Wi-Fi devices, a transmitter and receiver, hop simultaneously between all 35 frequency bands in the 2.4 gigahertz to 5.8 GHz Wi-Fi range. At each frequency, the rate at which signals accumulate phase naturally varies. The transmitter skips between bands every 2 to 3 microseconds, with the receiver comparing the phase differences at each step. Chronos can then calculate the time of flight of signals—and thus the distance—between the devices.
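The geometry behind that calculation can be sketched with idealized, noise-free numbers (a simplification, not the paper’s algorithm): for a fixed time of flight, the accumulated phase grows linearly with carrier frequency, so the slope of phase versus frequency across the hopped bands yields the distance:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def simulate_phases(distance_m, freqs_hz):
    # Idealized, noise-free unwrapped phase of the direct path at each band.
    tof = distance_m / C                        # time of flight, seconds
    return [2 * math.pi * f * tof for f in freqs_hz]

def estimate_distance(phases, freqs_hz):
    # The slope of phase vs. frequency equals 2*pi*(time of flight).
    slope = (phases[-1] - phases[0]) / (freqs_hz[-1] - freqs_hz[0])
    return (slope / (2 * math.pi)) * C

# 35 bands spanning 2.4 GHz to 5.8 GHz, as in the Wi-Fi sweep:
freqs = [2.4e9 + i * (5.8e9 - 2.4e9) / 34 for i in range(35)]
phases = simulate_phases(3.0, freqs)            # a device 3 meters away
```

In a real measurement each phase is known only modulo 2π and is distorted by multipath; resolving those ambiguities across the bands is the hard part that Chronos actually solves.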

If one of the devices has multiple Wi-Fi antennas, as do most modern smartphones and laptops, Chronos can also calculate the angle between the two devices, and locate them in space. In experiments in everyday environments like an apartment or coffee shop, Chronos was able to localize devices to within 65 cm (or about 10 times the accuracy of GPS) using only off-the-shelf Wi-Fi cards.

The MIT researchers, PhD student Deepak Vasisht and Professor Dina Katabi, envisage Chronos being used to count people in smart homes for lighting control, to offer password-free Wi-Fi in cafés (while excluding freeloaders outside), and for robots to operate safely around humans.

“Because Wi-Fi is widely used and in every cellphone, it would be good to use this amazing technology for as many applications as we can,” Katabi told IEEE Spectrum.

There are some limitations, however. Although Chronos can run on existing Wi-Fi devices using just an app (or a firmware upgrade for an access point), each device has to undergo a one-time distance calibration. And because Chronos takes around one-tenth of a second to sweep all the Wi-Fi bands, its accuracy plunges if the devices are moving relative to one another during this initial setup.

So, do you have to place your cellphone on a counter—or on a table in the food court if you’re at the mall—so it’ll be perfectly still? “Walking is fine, but we’re not talking about somebody in a car,” says Katabi. “However for a drone, it’s actually better if it moves. Because its movement is controlled and you know the speed, you can leverage that information in a feedback loop to boost your results.”

Vasisht and Katabi tested Chronos on an AscTec Hummingbird quadcopter fitted with an Intel 5300 Wi-Fi card and a GoPro camera. The drone was set to stay 1.4 meters from a netbook, shooting photos of the computer as it moved. Chronos was able to keep the drone within just 4 cm of its programmed distance.

The next step for Vasisht and Katabi is to improve the resolution of Chronos even further, and to start building functions such as geo-fencing, which sets virtual boundaries. The researchers are in discussions with MIT about commercializing the technology. If all goes well, using your phone to find the way to your departure gate, with your robotic carry-on following close behind, could be just a few years away.


Tech Talk

IEEE Spectrum’s general technology blog, featuring news, analysis, and opinions about engineering, consumer electronics, and technology and society, from the editorial staff and freelance contributors.
