Tech Talk

U.S. Leads Global Effort to Bring 1.5 Billion People Online by 2020

A global push to create more than a billion new Internet users over the next four years is underway, and leaders this week announced dozens of country-specific projects devoted to improving connectivity. India also officially signed on, joining more than 35 nations committed to expanding public Internet access and working with industry to build connections for rural users.

U.S. Secretary of State John Kerry led a meeting of global finance ministers, company executives, and government representatives on Thursday in Washington, D.C., to promote the U.S. State Department’s Global Connect Initiative, first announced last fall. The initiative has a stated goal of bringing 1.5 billion people online by 2020.

Kerry underscored the program’s ambition and mission by calling it “sort of the international equivalent of Franklin Roosevelt’s electrification program 80 years ago.”


SkinHaptics Uses Ultrasound to Generate Haptic Feedback Through Your Body

In the future that I'm planning on living in, nobody will carry around laptops or cell phones anymore. Instead, electronics will be embedded in wearables: in wristbands, in watches, in rings, in clothing, and eventually, in things like electronic temporary tattoos that you apply directly to your skin. The more embedded the technology gets, the trickier interacting with it becomes, since you're no longer physically holding objects. At the University of Sussex, in England, researchers have developed a system called SkinHaptics that transmits ultrasound straight through your body to generate focused haptic feedback on the surface of your skin.


3 Ways To Bridge The Digital Divide

What will it take to bring the next billion people online? These days, the answer has as much to do with smart policy as with technical expertise. This week in Washington, D.C., policy experts worked alongside engineers at a meeting (hosted in part by the IEEE Internet Initiative) intended to sketch a picture of what such a transition might look like around the world.

Companies such as Google and Facebook would like to know, and so would government leaders struck by the Internet’s power as an economic engine. More than half the world’s population, or about 4.2 billion people, do not have regular access to the Internet, according to the latest report published last fall by the U.N. Broadband Commission.

Last year, the U.S. State Department announced the Global Connect Initiative, which aims to bring 1.5 billion people online by 2020. As part of that effort, some of the ideas discussed this week will be presented on Thursday to finance ministers during a high-level meeting at the World Bank led by U.S. Secretary of State John Kerry.

Experts emphasized that there is no single technology or network structure that makes sense for every community. However, they offered a few good starting points for any country looking to bolster its number of Internet users:

1. Permit unlicensed use of white space.

White space is a term for TV and radio frequencies that aren’t currently in use for existing channels. Originally, these extra frequencies were packed between channels as a sort of buffer in order to prevent interference. But companies have since found ways to operate on these channels without causing any interruption to neighboring programs.

Furthermore, a global transition to digital television and radio from analog has freed up an even broader swath of spectrum. Digital signals can transmit on adjacent channels without causing a disruption to either. Since rural areas tend to have access to fewer existing channels in the first place, they would have even more leftover spectrum.

New devices, including smartphones, tablets, and computers that know how to detect unused spectrum, can use it to transmit wireless broadband signals, an approach also known as “WhiteFi” or “Super Wi-Fi.” These frequencies are especially useful because they can carry a lot of data over long distances and reach indoors. Tech companies including Google, Microsoft, Intel, Dell, and HP faced off against broadcasters in supporting early efforts to reuse white space this way, and they launched some of the first tests of devices capable of doing it.
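To make the channel arithmetic concrete, here is a minimal sketch in Python, with a hypothetical set of occupied channels standing in for what a white-space geolocation database would report. The 6-megahertz channel widths and the 470-megahertz start of UHF channel 14 follow the U.S. band plan.

```python
# Minimal sketch: turn the TV channels occupied near a device (as a
# white-space geolocation database might report them) into the free
# UHF channels a "Super Wi-Fi" radio could transmit on. U.S. TV
# channels are 6 MHz wide; channel 14's band begins at 470 MHz.

def free_uhf_channels(occupied, first=14, last=51):
    """Yield (channel, low_MHz, high_MHz) for each unoccupied channel."""
    for ch in range(first, last + 1):
        if ch in occupied:
            continue
        low_mhz = 470 + (ch - 14) * 6  # lower edge of the 6 MHz channel
        yield ch, low_mhz, low_mhz + 6

# Hypothetical rural area with only four active stations nearby:
occupied = {17, 24, 30, 38}
for ch, lo, hi in free_uhf_channels(occupied):
    print(f"Channel {ch}: {lo}-{hi} MHz free for white-space use")
```

In practice, U.S. white-space devices consult a regulator-approved geolocation database rather than relying on sensing alone, which is why the policy step of permitting unlicensed use matters as much as the hardware.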

Now, enthusiasm for WhiteFi is picking up across the world. A national demonstration project conducted in U.S. public libraries has since inspired similar efforts in Finland, Malaysia, and the Philippines. Separately, Kenya has experimented with the technology in two rural communities, while Microsoft and Google recently led trials in South Africa. The Indian Institute of Technology has tested it in 13 villages and hopes to eventually serve many more.

2. Adopt a “dig once” mentality.

Whenever a company wants to install a new optical fiber cable to provide better Internet access to a house or community, it must first hire bulldozers and a construction crew to dig a path to the new destination. If multiple companies want to deploy fiber to the same area at different times, they might wind up digging the same route again.  

It’s easy to understand why this process is expensive and disruptive to locals. Experts at this week’s meeting said a much easier and cheaper approach would be for governments to require road construction crews to lay a single conduit alongside each new road as they build it, through which all future fiber optic cables could be threaded. International development banks could do the same for the projects they fund. Experts stressed the value of these “dig once” policies; the U.S. Federal Highway Administration has said that this approach can reduce the cost of deploying broadband by 90 percent.

This idea is gaining some traction, at least in the United States. The U.S. Departments of Commerce and Agriculture promoted it in a report published last fall. Around the same time, a lawmaker proposed a bill to implement it for all federal highway projects. However, the “dig once” policy is still not fully incorporated into federal, state, or local requirements and has yet to take hold elsewhere in the world.

3. Develop local content.

One of the most consistent ideas to emerge during this week’s meeting was that simply providing technical tools for Internet access isn’t sufficient. To welcome the next billion users, companies and technologists need to engage deeply with local communities to determine if and how they intend to use this access. That way, said the experts, networks can be built out in ways that best suit those purposes. In other words, responding to actual demand for the Internet is as important as devising new schemes to offer it.

One key part of that response is producing local content that is relevant to potential new users in their native languages. Many governments have begun to offer online services for employment, taxes, or licenses, which is one way to generate local content. Developers are also seeing success with local sites and apps that help people share with each other in a particular region. 

“You want to provide Internet access, but what do the end users really need?” said Dilip Krishnaswamy, an IBM researcher based in Bangalore, India. “Maybe they don’t care about the presidential election as much as they want to connect with each other.” India is a good example of the humongous potential demand for local material—it’s home to 1.2 billion people who speak 22 major languages.

All this new content must also be designed to work on devices that are available and popular in that area, rather than the latest smartphones used in Europe or the United States. During the meeting, experts at one table discussed obstacles to Internet use in Africa. They mentioned the ongoing challenge of simply charging devices in many parts of the continent. In response, someone tossed out the idea of hosting a hackathon devoted wholly to developing apps that consume as little power as possible.  

Editor’s note: This story was updated on April 15 to change “IEEE Internet Society” to “IEEE Internet Initiative.”

Software Rules Tax Preparation, But at What Cost?

It’s mid-April, which means it’s the end of tax season in America again, when those who haven’t yet filed their income taxes scramble to beat the impending deadline. This year, like every year before it, more of those filers than ever will use software to help prepare their taxes.

It’s been thirty years since the Internal Revenue Service began embracing technology in a big way: In 1986 the agency piloted a program for electronic filing. The initial project required an IRS employee to manually turn a modem on each time returns were received, and it could only process certain simple returns. From 25,000 returns in that pilot year, the program grew rapidly: to 4.2 million returns the first year the program went nationwide, in 1990; to 68 million in 2005, when electronic filing surpassed mailed returns; and to over 125 million last year, or more than 85% of all individual returns.

Today, computers are ubiquitous throughout the process of taxation. Since 2010, the IRS no longer mails out 1040 forms—even if you still want to fill out paper forms, the agency expects you to download and print them yourself.

The rise of electronic filing has been mirrored by the growing role and influence of tax prep software. In 2015, over 50 million people filed self-prepared electronic returns, accounting for 1 in 3 individual filings. While more taxpayers still rely on tax professionals, the balance continues to slowly shift toward software-assisted self-filing (in 2006, only 15% of returns were done that way).

In some ways, taxes are a natural domain for computer assistance. Tax legislation can mostly be modeled as a set of rules and criteria that apply under certain conditions. The problem is that most tax codes were not written with automation in mind, so a lot of work is required to translate them into a technical specification. (As my colleague Robert Charette has noted, the Standard Federal Tax Reporter, which explains the U.S. tax code to accountants, has grown to over 70,000 pages.) And that’s to say nothing of the dozens of state and local tax codes.
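To see what that rules-and-criteria framing looks like in code, here is a toy bracket calculator in Python. The brackets and rates are entirely made up for illustration; they are not actual tax law.

```python
# Toy example: tax rules expressed as data that a small evaluator
# applies. The brackets below are hypothetical, not real tax law.

BRACKETS = [              # (upper bound of bracket, marginal rate)
    (10_000, 0.10),
    (40_000, 0.20),
    (float("inf"), 0.30),
]

def tax_owed(taxable_income: float) -> float:
    """Apply each marginal rate to the slice of income in its bracket."""
    owed, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        if taxable_income <= lower:
            break
        owed += (min(taxable_income, upper) - lower) * rate
        lower = upper
    return owed

print(tax_owed(55_000))  # 1,000 + 6,000 + 4,500 = 11500.0
```

The hard part is not the evaluator; it is compiling tens of thousands of pages of prose, exceptions, and phase-outs into tables like `BRACKETS` and keeping them current every year.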

The upfront investment required to build a comprehensive abstraction layer on top of such a large collection of requirements is a high barrier to entry for new competitors. That partially explains the success of Intuit’s TurboTax, which dominates the consumer market, processing more than twice as many returns as its nearest competitors, H&R Block and TaxAct, combined. Together, the three account for nearly 90% of returns filed electronically by individuals.

There are a number of reasons consumers choose software like TurboTax, with convenience and cost near the top of the list. (Disclosure: I’ve used TurboTax for many years, including this year). But not everything that’s good for TurboTax is good for its customers, and certainly not for the IRS.

For one thing, Intuit has a vested interest in making sure the tax code stays complex or becomes even more complex over time. The company has lobbied heavily against initiatives like California’s return-free filing.

There’s also evidence that the sheer scale of TurboTax’s customer base has given Intuit a wealth of valuable data, allowing the company to understand taxes as well as, and sometimes better than, the IRS. That came to light last year when TurboTax was forced to temporarily stop processing state returns after an unprecedented increase in fraudulent filings. A pair of whistleblowers claimed that the company had ignored its own internal fraud models, which were more reliable than those at the IRS. Similarly, I suspect that TurboTax has a large enough sample of data to accurately reverse engineer the IRS’s audit risk models (which would allow it to confidently offer audit protection for an additional fee).

Finally, there’s a danger for filers who depend on tax-preparation software: The more we rely on software like TurboTax, the more we risk falling into the complacency of the automation paradox, in which we no longer know enough about how taxes work to sanity-check the results or hold the system accountable. Maybe we would be better off with a simpler underlying protocol rather than a more user-friendly abstraction layer.

In any case, best of luck to those of you who have yet to file!

Bringing Augmented Reality to Real Eyeglasses

If reality isn’t cutting it for you, just hold on; engineers are working on augmenting it. At least, they hope to show you more than what would normally be before your eyes, by adding systems to ordinary eyeglasses that would display images and data to enhance your experience.


Antimatter Starship Scheme Coming to Kickstarter

A spaceship departs Earth on a one-way, 42-year trip to Alpha Centauri. It runs on an antimatter engine that blasts the ship out of the solar system at one-tenth the speed of light. This is not the premise for a new Ridley Scott sci-fi drama but rather the endgame of a crowdfunded spaceship project launching this month.

West Chicago, Ill.-based Hbar Technologies plans a Kickstarter effort to raise US $200,000 for the next design phase of an antimatter-propelled spaceship. The two scientists behind the effort are a veteran Fermilab particle accelerator scientist and a former Los Alamos National Laboratory physicist who was the founding director of the U.S. Center for Space Nuclear Research. They originally developed the concept for NASA at the turn of the millennium.

Because of budget cutbacks, the U.S. space agency dropped the antimatter-driven spaceship project in 2004. But the scientists say the plans they’re developing are technically feasible—if admittedly still quite optimistic in terms of the breakthroughs needed to enable antimatter to be stored in a fuel tank.


Could Satellite Messaging Startup Higher Ground Bring Down the 911 System?

Higher Ground might be the only Silicon Valley startup promising not to disrupt its entire industry. This small satellite messaging business is battling claims by telecom companies that its SatPaq device could interfere with their services, interrupt life-saving emergency calls, and even cause major outages across the United States.

IEEE Spectrum can reveal that for the past several years, Higher Ground has been quietly developing SatPaq, a smartphone case with a flip-up antenna that communicates with geostationary satellites. Connecting to a smartphone messaging app via Bluetooth, SatPaq can send and receive text messages and email almost anywhere in the United States, including the wilds of Alaska.

The problem is that SatPaq works in the same C-band microwave frequencies used by CenturyLink and other companies for voice and data communications in rural areas and as part of their national networks. They fear that the widespread use of SatPaqs could result in catastrophic interference.

“[This] is not just potential interference to one or two specific links in a particular location but… potential interference to each and every such link of the network throughout the country,” wrote CenturyLink in a submission to the Federal Communications Commission (FCC). “This seems to be a recipe for disaster.”


Bigelow Space Habitat on Its Way to ISS as NASA Prepares to Blow It Up

While the most interesting piece of cargo on its way to the ISS after SpaceX's successful Falcon 9 launch is almost certainly some intrepid Tokyo Bekana cabbages, a close second has to be the Bigelow Expandable Activity Module (BEAM), a pleasingly round inflatable space habitat that the astronauts are going to attach to the ISS and then blow up to see what happens. If everything goes according to plan, explosions will be minimal, and BEAM will inflate to its maximum curvaceousness. This will prove, over the course of the next two years, that the future of space habitation is something to get pumped up about.


U.S. Court Postpones Decision On .africa Domain Name

On Monday 4 April, a California court cancelled a hearing to determine whether the .africa domain could be released to a South African domain-name registry by the nonprofit Internet Corporation for Assigned Names and Numbers (ICANN). According to ICANN, which issues and manages generic top-level domains on behalf of the global Internet community, the court will issue a ruling at an unspecified future date. The delay prolongs a four-year debate over which of two registries should control the continent’s prized domain. Registries resell domain name rights to registrars such as GoDaddy, which, in turn, sign up Web addresses from customers under that domain.

ZA Central Registry technically won the rights to .africa back in 2013 via ICANN’s official process for delegating geographic domain names. ICANN’s decision was challenged in court by a rival registry called DotConnectAfrica.

The legal battle to determine .africa’s true owner could take months or years to resolve. Though DotConnectAfrica has requested an injunction asking the court to prevent the immediate transfer of .africa to ZA Central Registry, the cancellation of Monday’s hearing is no guarantee that the court won’t still give ZA Central Registry a green light to launch .africa in the meantime.

Neil Dundas, executive director of the organization that backs ZA Central Registry, told IEEE Spectrum in March that if the injunction were dismissed, ICANN could probably issue the .africa domain to ZA Central Registry within two weeks. Then, ZA Central Registry would be required to host a live .africa site for a month-long trial period. The public sale of .africa sites would begin soon after.

The popularity of so-called “not com” domains has exploded in recent years as website owners find that the most commonly used extensions (.com and .org) have become too crowded and that the shortest, easiest-to-remember addresses are already taken.

In response, ICANN invited registries to apply to create new domain names for the Internet. Since 2012, the organization has released more than 900 new domain names for public use, including .yoga, .bar, and .viking.

How Livermore Scientists Will Put IBM's Brain-Inspired Chips To The Test

Last week, Dharmendra Modha said goodbye to a computer some six years in the making: a set of 16 interconnected TrueNorth chips built to mimic the ultra-low-energy, highly parallel operation of the human brain.

On Thursday, a team from IBM Research-Almaden in California hopped in a car and drove the unit some 75 minutes north to the U.S. Department of Energy’s Lawrence Livermore National Laboratory. There, scientists and engineers will evaluate whether the technology could be a useful weapon in their computing arsenal.

It was a big moment for the IBM program, which devised the TrueNorth concept in 2010 and unveiled the first chip in 2014. Developed in collaboration with Cornell University, the TrueNorth chips use traditional digital components to implement a decidedly more brain-like behavior; each 5.4-billion-transistor chip can consume as little as 70 milliwatts (for more on how that could possibly work, see our 2014 story “How IBM Got Brainlike Efficiency From the TrueNorth Chip”).  

Although these were not the first TrueNorth chips to ship, the array is notable, Modha says, because it integrates 16 chips onto a single board, allowing the company to demonstrate that it can “scale up” the approach to larger and larger systems. The entire 16-chip array can draw as little as 2.5 watts (supporting systems, such as the communications fabric, add some overhead to that).

Livermore, which has some of the world’s fastest supercomputers and signed a $1 million contract with IBM for the TrueNorth unit, will be exploring how this new technology might play a role in areas such as cybersecurity and physical simulation. 

I was particularly excited to see exascale computing mentioned in the press release announcing the system. Perhaps the biggest looming question among high-performance computer makers is how we’ll reach the exascale, with machines capable of roughly a quintillion operations per second, some 30 times as fast as the fastest supercomputer today, without also creating staggering (and probably infeasibly expensive) utility bills.

But as it turns out, chances are slim that we’ll be simulating nuclear weapons or designing tomorrow’s nuclear reactors on supercomputers composed entirely of chips modeled on the human brain. Although TrueNorth can, in principle, perform any computation, the speed and efficiency of such neuromorphic chips shine only in particular applications, such as pattern recognition. Traditional computers will still be with us, Modha says: “What we’re offering is a complementary architecture.”

Engineers are still sorting out the best way to build an exascale supercomputer, says Brian Van Essen, a computer scientist at Livermore’s Center for Applied Scientific Computing. Heterogeneous computing, which could mix different computing technologies such as CPUs, graphics processing units, FPGAs, and neuromorphic chips, “is definitely one potential path,” he says. But, he adds, “it’s not clear what the final system design is going to look like.”

Van Essen says one area Livermore hopes to explore with the TrueNorth chips is their potential role in large-scale simulation. “As we scale simulations and modeling [of] physical systems up to large sizes, sometimes the simulations can get into an area where the numerics get kind of garbled up,” he says.

He says a team is in the midst of evaluating whether machine learning can be used to detect problems before a simulation crashes and to correct for the behavior. Van Essen says that if the approach looks promising, one could envision chips distributed throughout the system to monitor the progress of a simulation. It would take a “nontrivial amount of horsepower to monitor the system,” Van Essen says, adding that it would be a good application for a low-power technology such as TrueNorth.
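As a hypothetical sketch of the idea (not Livermore’s actual method), the monitoring task might look something like the following: watch a running simulation’s state for early signs of numerical trouble, such as NaNs, infinities, or runaway growth, and flag it before the run crashes.

```python
import numpy as np

def numerics_look_healthy(state: np.ndarray, max_norm: float = 1e12) -> bool:
    """Return False if the state contains NaN/inf or has blown up."""
    if not np.all(np.isfinite(state)):
        return False
    return float(np.linalg.norm(state)) < max_norm

# Toy "simulation": explicit Euler on du/dt = u^2, which blows up in
# finite time, so the monitor should trip well before overflow.
u = np.array([1.0])
dt = 0.05
for step in range(100):
    u = u + dt * u**2
    if not numerics_look_healthy(u):
        print(f"Numerical trouble detected at step {step}: u = {u}")
        break
```

A learned model would replace the fixed threshold with something trained on past runs, but the division of labor is the same: a cheap, always-on check riding alongside an expensive simulation, which is where a chip that draws tens of milliwatts could earn its keep.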

If you’re looking to keep track of TrueNorth developments, Dharmendra Modha maintains a detailed blog.

Follow Rachel Courtland on Twitter at @rcourt.
