Tech Talk

An illustration shows two mouths talking below two blank speech bubbles.

Automatic Speaker Verification Systems Can Be Fooled by Disguising Your Voice

Automatic speaker verification (ASV) systems are sometimes used to grant access to sensitive information and identify suspects in a court of law. Increasingly, they are being baked into consumer devices, such as Amazon’s Echo and Google’s Home, to respond to person-specific commands, such as “play my music” or “read my email.”

But such systems make mistakes when speakers disguise their voices to sound older or younger, according to a new study published in Speech Communication by researchers from the University of Eastern Finland. Earlier research by the same group has shown that some ASV systems can’t distinguish between a professional impersonator and the person they are imitating.  

A photo shows a woman wearing the new HaptX glove, which resembles a large black ski glove with wires attached to it, and an HTC Vive headset.

HaptX Inc Reveals New Haptic Glove for Virtual Reality

In early October, I showed up at an old firehouse on Staten Island for a glimpse into the future of virtual reality. That future depends largely on haptics. Now that we can use VR headsets to transport ourselves to another world, the thinking goes, we need systems to recreate sensations to bring those virtual experiences to life.

I went to Staten Island to meet up with a little-known company that fancies itself “the leader of realistic haptic feedback.” The company—now called HaptX—had promised to let me try out a prototype of its very first product.

A man walks past the exterior of a building labelled TechShop in San Francisco, CA.

TechShop Goes Bankrupt

A little over a decade ago, Jim Newton, who once served as an advisor to the TV show Mythbusters, and his partner Ridge McGhee launched a company called TechShop, opening the first of its for-profit makerspaces in Menlo Park, Calif., the beating heart of Silicon Valley.

Their enterprise was seemingly successful, and the number of TechShops soon grew, with 10 of them operating all around the United States by 2017.

But yesterday TechShop suddenly announced bankruptcy—the Chapter 7 kind. So no re-organization; no second chance to get back in the black. TechShop is kaput.

I was very enthusiastic when a TechShop opened in my neighborhood, the Raleigh-Durham area of North Carolina. I never purchased an annual membership there, but I would sign up for a month at a time when I needed access to machine tools.

TechShop provided the only way (short of having a buddy with an old Bridgeport mill or a South Bend lathe in his garage) that a Sunday machinist like me could get access to such equipment. And I once took a welding class at TechShop just for kicks. So I lamented when the company’s Raleigh-Durham location closed its doors in 2013, and I’m sorry for the many others who are no doubt feeling similar sentiments now that none are to be found anywhere in the United States.

So why did TechShop fail? Isn’t the “maker” movement exploding? Some hints come from a message from TechShop’s CEO, Dan Woods, which was published online by Make magazine. In it, Woods explains that the maker movement is very much a not-for-profit affair, often bankrolled by government or philanthropic organizations. And as a for-profit company, TechShop was normally not eligible to compete for such grant money or continuing subsidies.

Woods describes his company’s effort to pivot, to turn TechShop into some sort of makerspace midwife that would help non-profits, schools, or community governments set up makerspaces of their own. That doesn’t seem any more promising a business plan, at least to me, and I suppose yesterday’s bankruptcy announcement confirms that my leeriness would have been warranted. I’m not savvy enough, though, to speculate about whether Woods and his colleagues could have done anything else to keep TechShop afloat.

You might ask, “What does it matter?” After all, if not-for-profit makerspaces are forming in many places with subsidies from schools and local governments, isn’t that enough? Perhaps, but my limited experience with the makerspaces being established at schools and libraries is that they are makerspaces-lite. They might have computers and 3D printers and perhaps a laser cutter, but look for a bay where you can work on your car or a machine tool you can use to make metal chips fly, and you’ll be hard pressed to find one. So the demise of TechShop does seem to create a big hole in makerdom, one that I’m not so confident that the non-profit sector will rush in to fill.

Schematic drawing of IBM's 20 qubit (left) and 50 qubit (right) systems, illustrating qubit interconnectivity.

IBM Edges Closer to Quantum Supremacy with 50-Qubit Processor

“We have successfully built a 20-qubit and a 50-qubit quantum processor that works,” Dario Gil, IBM’s vice president of science and solutions, told engineers and computer scientists at IEEE Rebooting Computing’s Industry Forum last Friday. The development both ups the size of commercially available quantum computing resources and brings computer science closer to the point where it might prove definitively whether quantum computers can do something classical computers can’t.

“It’s been fundamentally decades in the making, and we’re really proud of this achievement,” said Gil.

More interconnected qubits translate to exponentially more computing power, so companies have been racing to increase the number of qubits in their experimental processors. The 20-qubit machine, made from improved superconducting qubits that operate at a frigid 15 millikelvin, will be made available to IBM clients through the company’s IBM Q program by the end of 2017. The company first made a 5-qubit machine available in 2016, and then a 16-qubit machine earlier this year.

The 50-qubit device is still a prototype, and Gil did not provide any details regarding when it might become available.

The qubits in the new processors are more stable than those in previous generations. Stability is measured in “coherence time,” the average length of time a qubit will stay in a quantum state of superposition before environmental influences cause it to collapse to either a 1 or a 0. The longer the coherence time, the longer the processor has to complete its calculations. The quantum bits in IBM’s 5- and 16-qubit machines averaged 50 and 47 microseconds, respectively, Gil said. The new 20- and 50-qubit machines hit 90 microseconds.
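
One rough way to appreciate what those extra microseconds buy is to divide the coherence time by a typical gate duration, which bounds how many sequential operations fit into a single computation. Here is a minimal sketch in Python, where the 100-nanosecond gate time is an illustrative assumption rather than an IBM figure:

```python
# Rough gate-budget estimate: how many sequential gate operations fit
# inside a qubit's coherence window before decoherence sets in.
# The 100 ns gate duration is an illustrative assumption, not an IBM spec.

GATE_TIME_US = 0.1  # assumed gate duration, in microseconds

machines = [
    ("5-qubit (2016)", 50),      # coherence time in microseconds
    ("16-qubit (2017)", 47),
    ("20- and 50-qubit (new)", 90),
]

for label, coherence_us in machines:
    budget = int(coherence_us / GATE_TIME_US)
    print(f"{label}: roughly {budget} gates before decoherence")
```

By this crude measure, the jump from 47 to 90 microseconds nearly doubles the number of operations that can be squeezed in before a qubit collapses.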

Apart from wanting to achieve practical quantum computing, industry giants, Google in particular, have been hoping to hit a number of qubits that will allow scientists to prove definitively that quantum computers are capable of solving problems that are intractable for any classical machine. Earlier this year, Google revealed plans to field a 49-qubit processor by the end of 2017 that would do the job. But recently, IBM computer scientists showed that it would take a bit more than that to reach a “quantum supremacy” moment. They simulated a 56-qubit system using the Vulcan supercomputer at Lawrence Livermore National Lab; their experiments showed that quantum computers will need to have at least 57 qubits.

“There’s a lot of talk about a supremacy moment, which I’m not a fan of,” Gil told the audience. “It’s a moving target. As classical systems get better, their ability to simulate quantum systems will get better. But not forever. It is clear that soon there will be an inflection point. Maybe it’s not 56. Maybe it’s 70. But soon we’ll reach an inflection point” somewhere between 50 and 100 qubits.

(Sweden is apparently in agreement. Today it announced an SEK 1 billion program with the goal of creating a quantum computer with at least 100 superconducting qubits. “Such a computer has far greater computing power than the best supercomputers of today,” Per Delsing, professor of quantum device physics at Chalmers University of Technology and the initiative’s program director, said in a press release.)

Gil believes quantum computing turned a corner during the past two years. Before that, we were in what he calls the era of quantum science, when most of the focus was on understanding how quantum computing systems and their components work. But 2016 to 2021, he says, will be the era of “quantum readiness,” a period when the focus shifts to technology that will enable quantum computing to actually provide a real advantage.

“We’re going to look back in history and say that [this five-year period] is when quantum computing emerged as a technology,” he told the audience.

A photo of a phone on a table showing an app release note on the screen from the Transit app.

The Strange Art of Writing App Release Notes

If you have an iPhone, go to the App Store and navigate to Updates. You’ll see release notes describing changes developers have made to the newest versions of the apps on your phone.

For the most part, they’re pretty boring:

“Bug fixes and performance updates.”
“Bug fixes.”
“This update contains stability and performance improvements.”
“Update to optimize alert handling.”

But sometimes, a note will stand out. Scrolling through my own recently, I came across this one written for a transportation app called Transit that I often use here in New York City.

Illustration of supercomputing machines as a bar graph

Two Different Top500 Supercomputing Benchmarks Show Two Different Top Supercomputers

The 50th TOP500 semi-annual ranking of the world’s supercomputers was announced earlier today. The topmost positions are largely unchanged from those announced last June, with China’s Sunway TaihuLight and Tianhe-2 supercomputers still taking the #1 and #2 positions, and the Swiss Piz Daint supercomputer still at #3. The only change since June, really, to the handful of computers at the very top of the list is that the one U.S. computer to make the top-five cut, Oak Ridge National Laboratory’s Titan, slipped from #4 to #5, edged out by a Japanese supercomputer called Gyoukou.

The top 10 now look like this:

Top500.org’s November 2017 ranking
Position  Name               Country        Teraflops  Power (kW)
       1  Sunway TaihuLight  China             93,015      15,371
       2  Tianhe-2           China             33,863      17,808
       3  Piz Daint          Switzerland       19,590       2,272
       4  Gyoukou            Japan             19,136       1,350
       5  Titan              United States     17,590       8,209
       6  Sequoia            United States     17,173       7,890
       7  Trinity            United States     14,137       3,844
       8  Cori               United States     14,015       3,939
       9  Oakforest-PACS     Japan             13,555       2,719
      10  K Computer         Japan             10,510      12,660

What’s more interesting to me is not this usual “TOP500” ranking but a second ranking the TOP500 organization has tracked recently using a different software benchmark, called High Performance Conjugate Gradients, or HPCG. This relatively new benchmark is the brainchild of Jack Dongarra, one of the founders of the TOP500 ranking, and Piotr Luszczek (both of the University of Tennessee), along with Michael Heroux of Sandia National Laboratories.

Why was there a need for a new benchmark? The normal ranking is determined by how fast various supercomputers can run something called a LINPACK (or HPL) benchmark. The LINPACK benchmarks originated in the late 1970s and started being applied to supercomputers in the early 1990s. The first TOP500 list, which used a LINPACK benchmark, came out in 1993. Initially, the LINPACK benchmarks charted how fast computers could run certain FORTRAN code. The newer (HPL) benchmarks measure execution time of code written in C.

Experts have long understood that the LINPACK benchmark is biased toward peak processor speed and processor count, missing important constraints like the bandwidth of the computer’s internal data network. And it tests the computer’s ability to solve so-called dense-matrix calculations, which aren’t representative of many “sparse” real-world problems. HPCG was devised to remedy these shortcomings.
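
To see the difference in character between the two workloads, compare a dense direct solve (the kind of linear algebra HPL rewards) with a conjugate-gradient solve on a sparse system (the pattern HPCG is named for). A minimal sketch using NumPy and SciPy, with a small illustrative problem size:

```python
# HPL-style vs. HPCG-style workloads, in miniature.
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

n = 1000
b = np.random.rand(n)

# HPL-style: dense, diagonally dominant system. Nearly every entry
# participates, so raw floating-point throughput dominates.
A_dense = np.random.rand(n, n) + n * np.eye(n)
x_dense = np.linalg.solve(A_dense, b)
print("dense residual:", np.linalg.norm(A_dense @ x_dense - b))

# HPCG-style: sparse tridiagonal system, typical of discretized physics
# problems. Conjugate gradients touches only the nonzero entries, so
# memory bandwidth and communication dominate.
A_sparse = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
x_sparse, info = cg(A_sparse, b)
print("CG converged:", info == 0)
```

On a laptop both solves finish almost instantly; at supercomputer scale, the sparse version spends most of its time waiting on memory and the network rather than doing arithmetic, which is exactly the behavior HPCG was designed to expose.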

And when you rank the current crop of supercomputers according to the newer HPCG benchmark, the picture looks very different:

Top500.org’s November 2017 ranking using the HPCG benchmark
Position  Name               Country        Teraflops
       1  K Computer         Japan                603
       2  Tianhe-2           China                580
       3  Trinity            United States        546
       4  Piz Daint          Switzerland          486
       5  Sunway TaihuLight  China                481
       6  Oakforest-PACS     Japan                385
       7  Cori               United States        355
       8  Sequoia            United States        330
       9  Titan              United States        322
      10  Mira               United States        167

The 10th-ranking computer on the TOP500 list, Fujitsu’s K computer, floats all the way up to #1. And the computer that had been at the top, the Sunway TaihuLight, sinks to the #5 position. Perhaps more important is the drastic difference in performance all of these computers show when you compare results from the two benchmarks.

Take, for example, the Sunway TaihuLight. Its theoretical top speed, known as Rpeak, is 125 petaflops (that’s 125 × 10¹⁵ floating-point operations per second). Judged using the LINPACK benchmark, that computer can manage 93 petaflops, about three-quarters of theoretical performance. But with the HPCG benchmark, it achieves a mere 481 teraflops. That’s just 0.4 percent of the computer’s theoretical performance. So running many problems on the Sunway TaihuLight is like getting into a Dodge Viper, which can in theory go 200 miles per hour [322 kilometers per hour], and never driving it any faster than a Galapagos tortoise.
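
The arithmetic behind that comparison is easy to check; here it is as a quick Python calculation using the figures quoted above:

```python
# Fraction of theoretical peak (Rpeak) the Sunway TaihuLight achieves
# under each benchmark, using the figures from the rankings above.
rpeak_tflops = 125_000   # 125 petaflops theoretical peak
hpl_tflops = 93_015      # LINPACK (HPL) result
hpcg_tflops = 481        # HPCG result

print(f"HPL:  {hpl_tflops / rpeak_tflops:.1%} of peak")   # ~74.4%
print(f"HPCG: {hpcg_tflops / rpeak_tflops:.2%} of peak")  # ~0.38%
```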

So are the LINPACK (HPL) results or the HPCG results more representative of real-world operations? Experts regard them as “bookends,” bracketing the range of performance users of these supercomputers can expect to experience. I don’t have statistics to back me up, but I suspect the distribution is skewed closer to the HPCG side of the shelf. If that’s true, maybe the TOP500 organization should be using HPCG for its main ranking. That would be more logical, I suppose, but I expect the organizers would be reluctant to do that, given people’s hunger for big numbers, now squarely in the petaflop range for supercomputers and soon to flirt with exaflops.

Perhaps supercomputers should just be required to have a warning written in small letters at the bottom of their shiny cabinets: “Object manipulations in this supercomputer run slower than they appear.”

Illustration on a red background of a white and blue computer screen with an atomic symbol glowing on it.

A Data Bus for Quantum Computers

Quantum physicists are now laying the groundwork for a "quantum bus," which can teleport quantum information between the memory and processor components of future quantum computers.

Classical computers switch transistors between one state and another to represent data as ones and zeros. Quantum computers use quantum bits, or qubits, that, because of the surreal nature of quantum mechanics, can be in a state of superposition in which they essentially behave as both one and zero.

The superpositions that qubits adopt let them hold two states at once. If two qubits are quantum-mechanically linked, or entangled, they can hold four states simultaneously; three qubits, eight states; and so on. In theory, a quantum computer with 300 qubits could hold more states than there are atoms in the visible universe. Algorithms can use such entangled qubits to run an extraordinary number of calculations in an instant.
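
The doubling is easy to verify, and it explains the atoms-in-the-universe claim: 2³⁰⁰ comfortably exceeds the commonly cited estimate of about 10⁸⁰ atoms. A short sanity check in Python (the atom count is an order-of-magnitude estimate, not a measured value):

```python
# State-space growth: n entangled qubits span 2**n basis states.
# The ~1e80 atom count is a common order-of-magnitude estimate.
ATOMS_IN_VISIBLE_UNIVERSE = 10**80

for n in (1, 2, 3, 10, 300):
    print(f"{n} qubits -> {2**n:.3e} simultaneous states")

print("300 qubits exceed the atom count:",
      2**300 > ATOMS_IN_VISIBLE_UNIVERSE)
```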

Illustration showing a tactical fighter jet with a high energy laser weapon system blasting something in the distance.

Laser Weapon to Go in Fighter Jet in 2021

Nothing says futuristic 'top gun' like a fighter jet using a high-energy laser to blast enemy missiles out of the sky. That future may be only a few years away. The Air Force Research Laboratory (AFRL) has issued a $26.3-million contract to Lockheed Martin to design, develop, and build a high-energy laser for tests in a tactical fighter jet by 2021. The key technology is an advanced version of a multi-kilowatt fiber laser like the one the Navy tested earlier on the USS Ponce.

Fiber lasers started as a dark horse in the race to develop electrically powered laser weapons, a race that culminated in Northrop Grumman’s demonstration of a 100-kilowatt laboratory laser in 2009. Multi-kilowatt fiber lasers were already in use in industrial machining, but conventional wisdom said that fiber laser output was limited because optical power was concentrated inside the fiber’s tiny light-guiding core. Though that setup maximized how efficiently a fiber laser could convert electrical energy into light and gave good beam quality, it also raised the power density so high that a single fiber couldn’t emit much more than about 10 kilowatts without self-destructing.

At the time, it seemed impractical to combine beams from many fiber lasers; in doing so, you’d sacrifice the beam quality needed to focus power on distant targets. What changed the picture was the development of a high-power version of the wavelength-division multiplexing (WDM) technique used in high-capacity fiber-optic telecommunications. At low power, WDM can merge the outputs of 100 different lasers, each operating in its own narrow slice of the spectrum, into a single fiber core without causing interference. Lockheed has extended what it calls “spectral beam combining” to high-power fiber lasers, starting with a 30-kW system that combines light from about 100 lasers and uses less than half the power of other electric lasers. This year, Lockheed delivered a 60-kilowatt version to the Army for testing in a military tractor-trailer.
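
Because each laser occupies its own wavelength slot, the combined output scales roughly linearly with the channel count. A back-of-the-envelope sketch; the per-fiber figure is inferred from the numbers in this article, not a published Lockheed specification:

```python
# Back-of-the-envelope spectral-beam-combining arithmetic, inferred
# from the article's figures (30 kW from ~100 fiber lasers). Not a
# published Lockheed specification.
total_kw = 30.0
num_channels = 100
per_fiber_w = total_kw / num_channels * 1000
print(f"~{per_fiber_w:.0f} W per fiber channel")              # ~300 W

# Scaling the same architecture to the 60 kW Army system:
channels_for_60kw = 60.0 / (per_fiber_w / 1000)
print(f"60 kW would need ~{channels_for_60kw:.0f} channels")  # ~200
```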

The AFRL contract is part of its Self-protect High Energy Laser Demonstrator, or SHiELD, program. The aim is to test the capabilities of a fighter jet that uses lasers to defend itself against missiles launched from the air or ground. Separately, Northrop Grumman is building a beam control system to fire at targets, and Boeing is integrating the laser into an external pod that will contain it in flight, power it, cool it, and coordinate its actions with the aircraft’s systems.

Meanwhile AFRL and DARPA are also planning airborne tests of a competing, non-fiber laser system. The compact and lightweight electric laser, called HELLADS, was developed by General Atomics.

Cost has long been an issue with laser weapons, but fiber lasers should be affordable because they leverage a very large technology base for fiber-optic communications and industrial lasers, says Rob Afzal, senior fellow of laser weapon systems at Lockheed. “We have to manage weight both for what the pod can accept and for things like the center of gravity,” says Afzal. But he adds that “fitting it into a tight, compact space is the bigger challenge,” as is ruggedizing the laser to withstand the vibration, temperatures, and G forces encountered in a tactical aircraft.

Bitmain's Sophon BM1680 AI chips

Bitcoin’s Biggest Tech Player to Release AI Chips and Computers

By its own reckoning, Bitmain built 70 percent of all the computers on the Bitcoin network. It makes specialized chips to perform the critical hash functions involved in mining and trading bitcoins, and packages those chips into the top mining rig—the Antminer S9.
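
The hash function at the heart of that business is a double pass of SHA-256 over an 80-byte block header; a miner grinds through nonce values until the resulting digest falls below the network’s difficulty target. A minimal Python sketch of the operation Bitmain’s ASICs accelerate, using dummy header bytes and an artificially easy target:

```python
# The core proof-of-work operation Bitcoin mining ASICs accelerate:
# double SHA-256 over an 80-byte block header, repeated across nonces
# until the digest falls below the difficulty target. The header bytes
# and target here are dummy values for illustration, not a real block.
import hashlib

def double_sha256(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

header_prefix = b"\x00" * 76   # version, prev hash, merkle root, time, bits (dummy)
target = 2**240                # artificially easy target for the demo

for nonce in range(2**32):
    header = header_prefix + nonce.to_bytes(4, "little")
    digest = double_sha256(header)
    if int.from_bytes(digest, "little") < target:
        print(f"found nonce {nonce}: digest {digest.hex()}")
        break
```

An ASIC like the one inside the Antminer S9 does essentially nothing but this loop, trillions of times per second, which is why general-purpose hardware long ago stopped being competitive for mining.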

But Bitmain CEO Jihan Wu sees a future beyond blockchains and cryptocurrency. As he told IEEE Spectrum contributing editor Morgen E. Peck in July: “It’s quite personal that I wanted Bitcoin to be successful. But as a company we are not allowed to solely rely on the success of Bitcoin. That’s a thing we cannot afford.”

So Bitmain is applying its Bitcoin playbook to artificial intelligence. On 8 November, Bitmain’s other CEO, Micree Zhan, will detail its new AI chip, the Sophon BM1680, at AIWORLD in Beijing, and the company will begin selling systems based on it.

An illustration shows a mockup of Wiliot's no-battery bluetooth beacon, which looks like a flexible red and white button with Wiliot's logo.

Startup Wiliot Promises No-Battery Bluetooth Beacons in 2019

Thanks to Wi-Fi and Bluetooth, the modern world is awash in 2.4-GHz radiation. A semiconductor startup in Israel, Wiliot, thinks it can use some of that RF energy to free the Internet-of-Things from batteries and other energy storage devices.

“There’s a lot of Bluetooth energy,” says Steve Statler, senior vice president of marketing and business development at Wiliot. “You’re bathed in it.” Add up ordinary Bluetooth traffic, Wi-Fi signals, and those from Bluetooth 5.0, the newest iteration, which can blast signals a kilometer or more, and there’s enough energy that simple beacon tags won’t need any form of energy storage, the company believes.
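
For a rough sense of how much power is actually available, the free-space Friis equation gives the power a small antenna can capture from a nearby transmitter. The sketch below uses illustrative assumptions (a 20-dBm transmitter and unity-gain antennas), not Wiliot’s figures:

```python
# Rough Friis free-space estimate of the 2.4 GHz power available to a
# harvesting tag. Transmit power, antenna gains, and distances are
# illustrative assumptions, not Wiliot's figures.
import math

freq_hz = 2.4e9
wavelength = 3e8 / freq_hz   # ~12.5 cm

p_tx_w = 0.1                 # 20 dBm transmitter (assumed)
g_tx = g_rx = 1.0            # unity-gain antennas (assumed)

for d_m in (1, 3, 10):
    p_rx_w = p_tx_w * g_tx * g_rx * (wavelength / (4 * math.pi * d_m))**2
    print(f"{d_m} m: {p_rx_w * 1e6:.2f} microwatts")
```

The microwatt levels this predicts are tiny, which is why battery-free tags must run extremely simple radios and logic, and why ambient chatter from many Wi-Fi and Bluetooth sources matters.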


Tech Talk

IEEE Spectrum’s general technology blog, featuring news, analysis, and opinions about engineering, consumer electronics, and technology and society, from the editorial staff and freelance contributors.
