Tech Talk


Hong Kong Bitcoin Exchange BitFinex Loses Nearly 120,000 Bitcoins in Hack

Yesterday afternoon, BitFinex, a Bitcoin exchange in Hong Kong, disabled its customer deposit and withdrawal functions and replaced the trading engine on its website with a notification of a major security breach. Later in the day, Zane Tackett, the company’s “Director of Community and Product Development,” took to Reddit (under the username “zanetackett”) to confirm that an attack had occurred and that nearly 120,000 bitcoins had been stolen from individual customer accounts.

This latest hack, which amounts to a loss of around US $72 million, is the biggest plundering of a Bitcoin exchange since 2014 when 850,000 bitcoins disappeared from the books during Mark Karpeles’s tenure as CEO of Mt. Gox. As was the case in 2014, the value of the currency is now crashing. The market price of bitcoin, which had begun to steadily increase at the beginning of the summer, fell 15 percent on news of the BitFinex hack.


How Cognitive Radio Can Help LTE-U and Wi-Fi Users Get Along

With cellular networks slammed by increasing demand, mobile operators are hunting for extra radio waves to carry their customers’ wireless signals. One place they’ve set their sights on is an unlicensed patch of spectrum in the 5 GHz band that is primarily used for Wi-Fi. But mobile and Wi-Fi companies have quarreled for years about whether or not adding LTE users to these frequencies would interfere with Wi-Fi networks.

To remedy the situation, new research from Portugal’s Instituto de Telecomunicações describes a strategy that could help LTE and Wi-Fi users to get along: empower crowded Wi-Fi users to hop over to licensed LTE bands when space gets tight on the unlicensed channels. If this approach succeeds in real-life trials, it could help LTE-U overcome some of the political and technical hurdles it has faced since Qualcomm first proposed it in 2012.

The whole kerfuffle began when Qualcomm and cellular service providers including Verizon and T-Mobile argued that shifting some of their LTE traffic over to unlicensed spectrum during busy periods would add extra capacity to their growing networks, and help them provide better, faster coverage for customers. They called the extended network enabled by this strategy LTE-U, for its use of unlicensed spectrum.

But Wi-Fi and cable providers were not pleased, because this meant LTE users would directly compete for batches of unlicensed spectrum (particularly those at 5 GHz) that have traditionally delivered Wi-Fi. Opponents such as Google and the nonprofit Wi-Fi Alliance worried that adding LTE devices to the same frequencies would interfere with Wi-Fi users. 

One reason for their concern is that Wi-Fi-enabled devices are programmed to share bandwidth with other devices that use the same router. These devices are “polite,” meaning they will throttle their owner’s consumption in order to fit another user onto the network, which is why Wi-Fi networks slow down as more people join.

Such chivalry would work against Wi-Fi users, though, if LTE customers suddenly began switching over to LTE-U and using the same unlicensed spectrum. The fear was that since LTE devices do not have any polite sharing mechanism in place, they would greedily claim bandwidth for themselves and leave courteous Wi-Fi users in the dust. The debate about exactly what this would mean for Wi-Fi users has been ongoing, but opponents generally believe it would cause long waiting times and lower throughput.
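To see why a non-sharing entrant hurts polite devices, a toy model helps. All numbers here are invented for illustration; real contention behavior is far more complex than this caricature:

```python
# Toy model of contention on a shared channel (illustrative numbers only).
# "Polite" Wi-Fi devices divide leftover capacity equally; a "greedy"
# device in this caricature simply claims its share of the channel first.

def share_channel(capacity_mbps, wifi_users, greedy_claims_mbps):
    """Return per-Wi-Fi-user throughput after greedy devices take their cut."""
    leftover = max(capacity_mbps - sum(greedy_claims_mbps), 0)
    return leftover / wifi_users if wifi_users else 0.0

# 100 Mb/s channel, 4 Wi-Fi users, no greedy traffic:
print(share_channel(100, 4, []))        # each gets 25.0 Mb/s

# Same channel after one greedy device grabs 60 Mb/s:
print(share_channel(100, 4, [60]))      # each drops to 10.0 Mb/s
```

The polite devices absorb the entire loss: their throughput falls in direct proportion to whatever the non-sharing device takes.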

Faced with this dilemma, Shahid Mumtaz, a wireless researcher at the Instituto de Telecomunicações, developed an approach to help these two groups live in harmony. His technique allows Wi-Fi users to hop onto a licensed LTE band to send data when unlicensed channels become too busy. Mumtaz and his colleagues called it WiFi-Lic, for its use of licensed spectrum.

The group employed cognitive radio technology, which scans for any available spectrum that is not currently occupied by a user, to enable Wi-Fi users to search licensed LTE bands for free space every millisecond. Once a Wi-Fi user identifies a free space in the licensed band, they hop on and start using it. When they’re finished transferring data, or if they detect that a licensed LTE user needs the space, they hop back off.
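The hop-on/hop-off logic, as described, can be sketched in a few lines. The researchers' actual algorithm isn't spelled out here, so the function below is an illustrative stand-in, with the cognitive-radio sensing results treated as given inputs:

```python
# Sketch of the hop-on/hop-off decision described above. The sensing
# inputs and the 1 ms cadence stand in for whatever the real radio does;
# this is not WiFi-Lic's published algorithm.

UNLICENSED, LICENSED = "unlicensed", "licensed"

def next_band(current_band, licensed_band_free, unlicensed_congested,
              transfer_done):
    """Decide which band to use for the next 1 ms scheduling interval."""
    if current_band == LICENSED:
        # Vacate as soon as the transfer ends or a licensed user returns.
        if transfer_done or not licensed_band_free:
            return UNLICENSED
        return LICENSED
    # On the unlicensed band: hop over only when it is crowded AND the
    # scan finds free space in the licensed band.
    if unlicensed_congested and licensed_band_free:
        return LICENSED
    return UNLICENSED

# Crowded Wi-Fi channel, free LTE spectrum -> hop on:
print(next_band(UNLICENSED, True, True, False))   # "licensed"
# A licensed LTE user reappears -> hop back off:
print(next_band(LICENSED, False, True, False))    # "unlicensed"
```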

The group carried out a simulation on the 5 GHz band to demonstrate how such a strategy would work, featuring 50 users who shared 10 Wi-Fi routers and a single 4G LTE base station. Each user was instructed to send a 0.5 MB data file multiple times in experiments that ran for several days.

In the end, the tests with WiFi-Lic achieved 160 percent more throughput than those without it, and the waiting period for Wi-Fi users was essentially cut to zero. Based on these results, Mumtaz said he would expect any real-world network to roughly double its capacity by implementing WiFi-Lic.

However, Ravi Balakrishnan, a research scientist at Intel Labs who has worked with cognitive radio in the past but was not involved in this project, is skeptical. He says 1 millisecond (which also happens to be the duration of an LTE subframe, or scheduling period) seems like a very short amount of time for a Wi-Fi user to detect free space, communicate its plan to occupy that space, switch over to it, and transmit data.

“If you can do all of this within 1 millisecond, then you can do this, but I find it very challenging for the amount of hardware complexity that's required,” he says, adding that Wi-Fi users would, at the very least, need additional equipment—which would cost more.

Stefan Schwarz, who heads a wireless research lab at Vienna University of Technology in Austria, also points out that WiFi-Lic is only meant for Wi-Fi networks that are set up by an LTE operator, rather than home networks. And right now, it only works for data traffic—not for live streaming.

Mumtaz says he believes more Wi-Fi networks will be set up by LTE operators in the future, and he plans to further validate WiFi-Lic in a test bed this fall. It’s certainly a good time to be asking these questions: Earlier this year, Qualcomm received approval from the U.S. Federal Communications Commission to test its first pieces of LTE-U equipment. And LTE networks are not going to get any less crowded any time soon.


EFF Sues U.S. Government Over DMCA

Last Thursday, lawyers from the Electronic Frontier Foundation filed a lawsuit against the U.S. government on behalf of hardware guru Andrew “bunnie” Huang and computer-security researcher Matthew Green. In a nutshell, the pair allege that parts of the Digital Millennium Copyright Act are unconstitutional. Their objections center on Section 1201 of the DMCA, which makes it illegal to circumvent technical copy-protection schemes or to distribute methods for doing so to others.


Ethereum Blockchain Forks to Return Stolen Funds

Remember that $60 million blockchain heist that made headlines last month? You know, the one that threatened to throw the Ethereum project (the most successful cryptocurrency after Bitcoin) off the tracks? Well, as of today, it has all been magically fixed. On Wednesday, the users, miners, developers, exchanges, and everyone else who matters on the network embraced a fork of the Ethereum software. That effectively confiscated all of the stolen funds and placed them into a new account programmed to automatically reimburse victims of the theft.

The maneuver, which was the focus of much philosophical and technical debate, seems to have worked well enough to call it a success. However, not everyone in the network went along with the fork. There are now two versions of the Ethereum blockchain growing in tandem—one (ETH) with the updates to the stolen funds and one (ETHC) that keeps everything as it was. About 15 percent of miners have continued to mine new blocks on the original Ethereum blockchain. However, none of the major online exchanges are listing or trading the coins generated on the un-forked chain. And so it could be argued that, at this point, those coins have no real value.
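Conceptually, a hard fork works like this: both chains share every block up to the fork point and then diverge. A minimal sketch, with block contents invented for illustration:

```python
# Minimal illustration of a chain split: both chains share every block
# up to the fork height, then diverge. Block contents here are invented.

shared_history = ["block N-1", "block N (last common block)"]

eth  = shared_history + ["block N+1: stolen funds moved to a refund contract"]
ethc = shared_history + ["block N+1: ledger unchanged"]

# The two chains agree on everything before the fork...
assert eth[:2] == ethc[:2]
# ...and disagree on everything after it.
assert eth[-1] != ethc[-1]
```

Because the two histories are incompatible after the fork point, every node, miner, and exchange must pick which version of the ledger it treats as real.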

The forked chain, on the other hand, is performing well on exchanges. The price of Ethereum’s native currency, Ether, crashed right after The DAO—a smart contract-enabled investment fund—was hacked and drained of 3.6 million Ether. It now seems to be making a slow recovery.

A 13-percent increase in the market value of Ether does indeed signal a renewed faith in the overall viability of the Ethereum project. But even as it seems to have been a success, the bailout of The DAO is likely to generate a lot of much-needed discussion. The hard fork Ethereum pulled off this week marks the first time a protocol change has been written and adopted with the explicit goal of confiscating funds. Next week, we will take a look at what kinds of precedents this could set, both in terms of how the Ethereum community makes controversial decisions and what kinds of social intervention users will accept.


Transistors Will Stop Shrinking in 2021, Moore’s Law Roadmap Predicts

After more than 50 years of miniaturization, the transistor could stop shrinking in just five years. That is the prediction of the 2015 International Technology Roadmap for Semiconductors, which was officially released earlier this month. 

After 2021, the report forecasts, it will no longer be economically desirable for companies to continue to shrink the dimensions of transistors in microprocessors. Instead, chip manufacturers will turn to other means of boosting density, namely turning the transistor from a horizontal to a vertical geometry and building multiple layers of circuitry, one on top of another. 

For some, this change will likely be interpreted as another death knell for Moore’s Law, the repeated doubling of transistor densities that has given us the extraordinarily capable computers we have today. Compounding the drama is the fact that this is the last ITRS roadmap, the end to a more-than-20-year-old coordinated planning effort that began in the United States and was then expanded to include the rest of the world.

Citing waning industry participation and an interest in pursuing other initiatives, the Semiconductor Industry Association—a U.S. trade group that represents the interests of IBM, Intel, and other companies in Washington, and a key ITRS sponsor—will now do its own roadmapping work, in collaboration with another industry group, the Semiconductor Research Corporation, to identify research priorities for government- and industry-sponsored programs. Other ITRS participants are expected to continue with a new roadmapping effort under a new name, conducted as part of an IEEE initiative called Rebooting Computing.

These roadmapping shifts may seem like trivial administrative changes. But “this is a major disruption, or earthquake, in the industry,” says analyst Dan Hutcheson, of the firm VLSI Research. U.S. semiconductor companies had reason to cooperate and identify common needs in the early 1990s, at the outset of the roadmapping effort that eventually led to the ITRS’s creation in 1998. Suppliers had a hard time identifying what the semiconductor companies needed, he says, and it made sense for chip companies to collectively set priorities to make the most of limited R&D funding.

But the difficulty and expense associated with maintaining the leading edge of Moore’s Law has since resulted in significant consolidation. By Hutcheson’s count, 19 companies were developing and manufacturing logic chips with leading-edge transistors in 2001. Today, there are just four: Intel, TSMC, Samsung, and GlobalFoundries. (Until recently, IBM was also part of that cohort, but its chip fabrication plants were sold to GlobalFoundries.)

These companies have their own roadmaps and can communicate directly to their equipment and materials suppliers, Hutcheson says. What’s more, they’re fiercely competitive. “They don’t want to sit in a room and talk about what their needs are,” Hutcheson says. “It’s sort of like everything’s fun and games when you start off at the beginning of the football season, but by the time you get down to the playoffs it’s pretty rough.”

“The industry has changed,” agrees Paolo Gargini, chair of the ITRS, but he highlights other shifts. Semiconductor companies that no longer make leading-edge chips in house rely on the foundries that make their chips to provide advanced technologies. What’s more, he says, chip buyers and designers—companies such as Apple, Google, and Qualcomm—are increasingly dictating the requirements for future chip generations. “Once upon a time,” Gargini says, “the semiconductor companies decided what the semiconductor features were supposed to be. This is no longer the case.”

This final ITRS report is titled ITRS 2.0. The name reflects the idea that improvements in computing are no longer driven from the bottom up, by tinier switches and denser or faster memories. Instead, it takes a more top-down approach, focusing on the applications that now drive chip design, such as data centers, the Internet of Things, and mobile gadgets.

The new IEEE roadmap—the International Roadmap for Devices and Systems—will also take this approach, but it will add computer architecture to the mix, allowing for “a comprehensive, end-to-end view of the computing ecosystem, including devices, components, systems, architecture, and software,” according to a recent press release.

Transistor miniaturization was still a part of the long-term forecast as recently as 2014, when the penultimate ITRS report was released. That report predicted that the physical gate length of transistors—an indicator of how far current must travel in the device—and other key logic chip dimensions would continue to shrink until at least 2028. But since then, 3D concepts have gained momentum. The memory industry has already turned to 3D architectures to ease miniaturization pressure and boost the capacity of NAND Flash. Monolithic 3D integration, which would build layers of devices one on top of another, connecting them with a dense forest of wires, has also been an increasingly popular subject of discussion.

The new report embraces these trends, predicting an end to traditional scaling—the shrinking of chip features—by the early 2020s. But the idea that we’re now facing an end to Moore’s Law “is completely wrong,” Gargini says. “The press has invented multiple ways of defining Moore’s Law but there is only one way: The number of transistors doubles every two years.”

Moore’s Law, he emphasizes, is simply a prediction about how many transistors can fit in a given area of IC—whether it’s done, as it has been for decades, in a single layer or by stacking multiple layers. If a company really wanted to, Gargini says, it could continue to make transistors smaller well into the 2020s, “but it’s more economic to go 3-D. That’s the message we wanted to send.” 
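Gargini's one-sentence definition is easy to check numerically; the doubling compounds quickly. A quick back-of-the-envelope sketch, in which the starting count is arbitrary and only the doubling rate comes from his definition:

```python
# Back-of-the-envelope compounding of "transistors double every two years".
# The starting count is arbitrary; only the growth rate comes from the text.

def transistors(start_count, years):
    """Project a transistor count forward, doubling every two years."""
    return start_count * 2 ** (years / 2)

# One unit of density today becomes 32x in a decade:
print(transistors(1, 10))   # 32.0
```

Whether that doubling comes from smaller transistors in one layer or the same transistors stacked in several layers is, by this definition, irrelevant.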

There are other changes on the horizon. In the coming years, before 3-D integration is adopted, the ITRS predicts that leading-edge chip companies will move away from the transistor structure used now in high-performance chips: the FinFET. This device has a gate draped around three sides of a horizontal, fin-shaped channel to control the flow of current. According to the roadmap, chipmakers will leave that in favor of a lateral, gate-all-around device that has a horizontal channel like the FinFET but is surrounded by a gate that extends underneath as well. After that, transistors will become vertical, with their channels taking the form of pillars or nanowires standing up on end. The traditional silicon channel will also be replaced by channels made with alternate materials, namely silicon germanium, germanium, and compounds drawn from columns III and V of the old periodic table.

These changes will allow companies to pack more transistors in a given area and so adhere to the letter of Moore’s Law. But keeping to the spirit of Moore’s Law—the steady improvement in computing performance—is another matter. 

The doubling of transistor densities hasn’t been linked to improvements in computing performance for some time, notes Tom Conte, the 2015 president of the IEEE Computer Society and a co-leader of the IEEE Rebooting Computing Initiative.

For a long time, shrinking transistors meant faster speeds. But in the mid-1990s, Conte says, the extra metal layers that were added to wire up increasing numbers of transistors were adding significant delays, and engineers redesigned chip microarchitectures to improve performance. A decade later, transistor densities were so high that their heat limited clock speeds. Companies began packing multiple cores on chips to keep things moving.

“We’ve been living in this bubble where the computing industry could rely on the device side to do their job, and so the computer industry and the device industry really had this very nice wall between them,” says Conte. “That wall really started to crumble in 2005, and since that time we’ve been getting more transistors but they’re really not all that much better.” 

This crumbling wall was a strong motivation for the IEEE Rebooting Computing Initiative to begin collaborating with the ITRS last year, before the launch of the IRDS. “I like to say we could see the light at the end of the tunnel, and we knew it was an oncoming train,” says Conte.

The initiative held a summit last December that covered a gamut of potential future computing technologies, including new kinds of transistors and memory devices, neuromorphic computing, superconducting circuitry, and processors that use approximate instead of exact answers.

The first international Rebooting Computing conference will be held in October this year; IRDS meetings will coincide with such events, Conte says. The IRDS will still track “Moore’s Law to the bitter end,” Conte explains. But the roadmapping focus has changed: “This isn’t saying this is the end of Moore’s Law,” he says. “It’s stepping back and saying what really matters here—and what really matters here is computing.”


“Hard Fork” Coming to Restore Ethereum Funds to Investors of Hacked DAO

It’s been a little over a month since The DAO, the most visible and well-funded application on the Ethereum network, crashed and burned at the hands of an anonymous hacker. Until its spectacular demise this June, The DAO was a centerpiece of the Ethereum ecosystem. It was intended to operate as a decentralized, automated investment fund, which at one point had over US $150 million worth of the cryptocurrency Ether at its disposal. Had it succeeded, it would have been a shining early example of Ethereum’s potential as a smart contract platform, adding credibility to the thoroughly hyped idea that blockchains are poised to replace our legal, political, and social frameworks with code.

Those dreams were smashed on 17 June, when a hacker exploited a flaw in The DAO and drained it of some 3.6 million Ether. Due to the nature of The DAO’s underlying structure, however, these funds are stuck and cannot be used in a transaction until 28 July, meaning the hacker has no way of cashing out for another nine days.

Provided with this handy deadline, the Ethereum community has spent the last month passionately debating whether it is possible, and more importantly, whether it is advisable, to make investors whole again. In many ways, it is a lose-lose situation. If no action is taken, then there is a good chance that legal action will ensue—initiated either by investors in The DAO or by federal regulators, who are already sniffing at the hem of the blockchain.


How Charles Kao Beat Bell Labs to the Fiber-Optic Revolution

Fifty years ago this month, a 32-year-old Chinese-born research engineer named Charles Kao published a milestone paper that set off the entire field of fiber-optic communications and eventually earned him a share of the 2009 Nobel Prize in Physics. The story of how that 1966 paper came to be is a wonderful example of how the key to a big technological breakthrough can come down to asking the right question.


DARPA Challenge Tests AI as Cybersecurity Defenders

Today’s malicious hackers have an average of 312 days to exploit “zero-day” computer software flaws before human cybersecurity experts can find and fix those flaws. The U.S. military’s main research agency focused on disruptive technologies aims to see whether artificial intelligence can do a better job of finding and fixing such exploits within a matter of seconds or minutes.


Hajj Pilgrims to Get Electronic Bracelets to Prevent a Repeat of 2015 Stampede

A stampede that killed hundreds and perhaps thousands of Hajj pilgrims made the 2015 disaster the deadliest on record for the world’s largest Islamic gathering. Hoping to prevent another accident this September, officials plan to issue electronic bracelets to guide and keep track of the millions of pilgrims expected to visit Islam’s holy sites at Mecca in Saudi Arabia.

The Saudi Press Agency described the water-resistant bracelets as GPS-linked devices containing personal identification and medical records that Saudi officials and security forces could access via smartphone, according to BBC News. Personal information on each pilgrim would include passport numbers and addresses. In addition, Saudi officials have installed 1,000 new surveillance cameras to keep an eye on the pilgrims as they walk along pilgrimage routes and crowd inside holy sites.


For Best Results, Send Molecular Messages Using MIMO

Just as cell phones propagate radio waves to connect users, it’s also possible to transmit messages by emitting molecules. A sender can compose a message by diffusing bursts of a specific type of molecule, so long as a recipient can detect that molecule and interpret the pattern. Using this method, nanodevices could create a digital signaling system within the body, a locale where radio waves are quickly absorbed and where there is little space for bulky antennas.

Neurons and other cells in the body already communicate through the transfer of neurotransmitters, hormones, and other signaling molecules. And researchers have shown in early experiments that molecular digital communication through the air works, but only with a fairly low data rate. Now, a team from Toronto’s York University and Yonsei University in Seoul, South Korea, has found that it can nearly double that data rate by applying multiple-input multiple-output (MIMO) technology.
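The underlying scheme, with bits carried as molecule bursts and MIMO adding parallel streams, can be sketched as on-off keying across multiple emitter-detector pairs. The threshold and molecule counts below are invented for illustration; real molecular detection is far messier:

```python
# On-off keying over parallel molecular channels, sketched. In each time
# slot an emitter either releases a burst of molecules (bit 1) or stays
# silent (bit 0); the detector thresholds the molecule count it measures.
# A 2x2 MIMO link sends two such streams at once, doubling bits per slot.
# The threshold and counts below are invented for illustration.

THRESHOLD = 50  # molecules per slot needed to register a "1"

def decode_stream(counts_per_slot):
    """Threshold one detector's per-slot molecule counts into bits."""
    return [1 if c >= THRESHOLD else 0 for c in counts_per_slot]

def decode_mimo(streams):
    """Decode parallel streams and interleave them into one bitstream."""
    decoded = [decode_stream(s) for s in streams]
    return [bit for slot in zip(*decoded) for bit in slot]

# Two detectors, three time slots each -> six bits in three slots:
rx1 = [120, 10, 95]   # decodes to 1, 0, 1
rx2 = [5, 80, 60]     # decodes to 0, 1, 1
print(decode_mimo([rx1, rx2]))   # [1, 0, 0, 1, 1, 1]
```

The doubling comes purely from parallelism: each extra emitter-detector pair carries an independent stream in the same time slots, provided the detectors can tell the streams apart.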


Tech Talk

IEEE Spectrum’s general technology blog, featuring news, analysis, and opinions about engineering, consumer electronics, and technology and society, from the editorial staff and freelance contributors.
