The deep-learning software driving the modern artificial intelligence revolution has mostly run on fairly standard computer hardware. Tech giants such as Google and Intel have devoted some of their considerable resources to creating more specialized computer chips designed for deep learning. But IBM has taken a more unusual approach: It is testing its brain-inspired TrueNorth computer chip as a hardware platform for deep learning.

Deep learning’s powerful capabilities rely on algorithms called convolutional neural networks that consist of layers of nodes (also known as neurons). Such neural networks can filter huge amounts of data through their “deep” layers to become better at, say, automatically recognizing individual human faces or understanding different languages. These are the types of capabilities that already empower online services offered by the likes of Google, Facebook, Amazon, and Microsoft.
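The filtering at the heart of those networks can be sketched in a few lines. The snippet below is a minimal, illustrative implementation of a single convolutional operation, sliding a small filter over an image to detect a local pattern; real networks stack many such layers, and the filter values here are chosen by hand rather than learned.

```python
# Minimal sketch of the core operation in a convolutional layer:
# slide a small filter over an image and record how strongly each
# patch matches the pattern the filter encodes.
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation, as used in conv layers."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

image = np.array([
    [0, 0, 1, 0],
    [0, 0, 1, 0],
    [0, 0, 1, 0],
    [0, 0, 1, 0],
], dtype=float)
edge_filter = np.array([[-1.0, 1.0]])  # responds to vertical edges
response = conv2d(image, edge_filter)
print(response)  # strong positive/negative values mark the edge
```

In a trained network, many such filters are learned from data, and their stacked responses are what let the deeper layers recognize faces or phonemes.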

In recent research, IBM has shown that such deep-learning algorithms could run on brain-inspired hardware that typically supports a very different type of neural network.

IBM published a paper on its work in the 9 September 2016 issue of the journal Proceedings of the National Academy of Sciences. The research was funded with just under $1 million from the U.S. Defense Advanced Research Projects Agency (DARPA). Such funding formed part of DARPA’s Cortical Processor program, aimed at brain-inspired AI that can recognize complex patterns and adapt to changing environments.

“The new milestone provides a palpable proof of concept that the efficiency of brain-inspired computing can be merged with the effectiveness of deep learning, paving the path towards a new generation of chips and algorithms with even greater efficiency and effectiveness,” says Dharmendra Modha, chief scientist for brain-inspired computing at IBM Research-Almaden, in San Jose, Calif.

IBM first laid down the specifications for TrueNorth and a prototype chip in 2011. So, TrueNorth predated—and was therefore never specifically designed to harness—the deep-learning revolution based on convolutional neural networks that took off starting in 2012. Instead, TrueNorth typically supports spiking neural networks that more closely mimic the way real neurons work in biological brains.

Instead of firing every cycle, the neurons in spiking neural networks must gradually build up their potential before they fire. To achieve precision on deep-learning tasks, spiking neural networks typically have to go through multiple cycles to see how the results average out. That effectively slows down the overall computation on tasks such as image recognition or language processing.
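A toy leaky integrate-and-fire neuron makes this dynamic concrete: the membrane potential accumulates over cycles and emits a spike only when it crosses a threshold. The parameters below are illustrative, not TrueNorth's actual neuron model.

```python
# Toy leaky integrate-and-fire neuron: potential builds up over
# cycles and a spike fires only when it crosses a threshold.
def simulate_lif(inputs, threshold=1.0, leak=0.9):
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current   # integrate with leak
        if potential >= threshold:               # fire and reset
            spikes.append(1)
            potential = 0.0
        else:
            spikes.append(0)
    return spikes

# A constant sub-threshold input needs several cycles before each
# spike, which is why rate-based readouts must average many cycles.
spikes = simulate_lif([0.4] * 10)
print(spikes)
```

With a steady input of 0.4, the neuron fires only every third cycle, illustrating why a spiking network needs multiple cycles to produce a stable averaged result.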

Deep-learning experts have generally viewed spiking neural networks as inefficient—at least, compared with convolutional neural networks—for the purposes of deep learning. Yann LeCun, director of AI research at Facebook and a pioneer in deep learning, previously critiqued IBM’s TrueNorth chip because it primarily supports spiking neural networks. (See IEEE Spectrum’s previous interview with LeCun on deep learning.)

The IBM TrueNorth design may better support the goals of neuromorphic computing that focus on closely mimicking and understanding biological brains, says Zachary Chase Lipton, a deep-learning researcher in the Artificial Intelligence Group at the University of California, San Diego. By comparison, deep-learning researchers are more interested in getting practical results for AI-powered services and products. He explains the difference as follows:

To evoke the cliche metaphor about birds and airplanes, you might say the computational neuroscience/neuromorphic community is more concerned with studying birds, and the machine learning community more interested in understanding aerodynamics, with or without the help of biology. The deep learning community is generally bullish on the benefits of specialized hardware. [Therefore,] the neuromorphic chips don't inspire as much excitement because the spiking neural networks they focus on are not so popular in deep learning.

To make the TrueNorth chip a good fit for deep learning, IBM had to develop a new algorithm that could enable convolutional neural networks to run well on its neuromorphic computing hardware. This combined approach achieved what IBM describes as “near state-of-the-art” classification accuracy on eight data sets involving vision and speech challenges, with accuracy ranging from 65 to 97 percent in the best configurations.

When just one TrueNorth chip was being used, it surpassed state-of-the-art accuracy on just one out of eight data sets. But IBM researchers were able to boost the hardware’s accuracy on the deep-learning challenges by using up to eight chips. That enabled TrueNorth to match or surpass state-of-the-art accuracy on three of the data sets.
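One generic way such a multi-chip boost can work is to run copies of the network on each chip and combine their class scores before picking a label. The averaging scheme below is a standard ensemble sketch, an assumption for illustration; the paper's exact multi-chip arrangement may differ.

```python
# Generic ensemble sketch: sum per-chip class scores, then take the
# class with the highest combined score.
def ensemble_predict(per_chip_scores):
    """per_chip_scores: one list of class scores per chip."""
    n_classes = len(per_chip_scores[0])
    summed = [0.0] * n_classes
    for scores in per_chip_scores:
        for c, s in enumerate(scores):
            summed[c] += s
    return max(range(n_classes), key=lambda c: summed[c])

# Individual chips lean different ways; the combined scores favor
# class 1, which a single chip alone might have missed.
chips = [[0.2, 0.5, 0.3], [0.6, 0.3, 0.1], [0.1, 0.8, 0.1]]
result = ensemble_predict(chips)
print(result)
```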

The TrueNorth testing also managed to process between 1,200 and 2,600 video frames per second. That means a single TrueNorth chip could detect patterns in real time from as many as 100 cameras at once, Modha says. This assumes each camera uses 1,024 color pixels (32 x 32) and streams information at a standard TV rate of 24 frames per second.
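The 100-camera figure follows directly from the stated numbers; a quick back-of-envelope check:

```python
# Back-of-envelope check of the "100 cameras" claim: divide the
# chip's measured frame throughput by one camera's 24 frame/s rate.
frames_per_second_low = 1200
frames_per_second_high = 2600
camera_rate = 24  # frames per second per camera

low = frames_per_second_low // camera_rate    # cameras at the low end
high = frames_per_second_high // camera_rate  # cameras at the high end
print(low, high)
```

At the top end of the measured throughput, 2,600 frames per second divided by 24 frames per second per camera gives 108 cameras, consistent with the roughly 100-camera claim.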

Such results may be impressive for TrueNorth’s first major foray into deep-learning testing, but they should be taken with a grain of salt, Lipton says. He points out that the vision data sets involved relatively easy problems based on small 32 x 32 pixel images.

Still, IBM’s Modha seems enthusiastic about continuing to test TrueNorth for deep learning. He and his colleagues hope to test the chip on so-called unconstrained deep learning, which involves gradually introducing hardware constraints during the training of neural networks instead of constraining them from the very beginning.

Modha also points to TrueNorth’s general design as an advantage over more specialized deep-learning hardware designed to run only convolutional neural networks. That flexibility should allow multiple types of AI networks to run on the same chip.

“Not only is TrueNorth capable of implementing these convolutional networks, which it was not originally designed for, but it also supports a variety of connectivity patterns (feedback and lateral, as well as feed forward) and can simultaneously implement a wide range of other algorithms,” Modha says.

Such biologically inspired chips would probably become popular only if they show that they can outperform other hardware approaches for deep learning, Lipton says. But he suggests that IBM could leverage its hardware expertise to join Google and Intel in creating new specialized chips designed specifically for deep learning.

“I imagine that some of the neuromorphic chipmakers will use their expertise in hardware acceleration to develop chips more focused on practical deep-learning applications and less focused on biological simulation,” Lipton says.


The Spectacular Collapse of CryptoKitties, the First Big Blockchain Game

A cautionary tale of NFTs, Ethereum, and cryptocurrency security

Illustration: Frank Stockton

On 4 September 2018, someone known only as Rabono bought an angry cartoon cat named Dragon for 600 ether—an amount of Ethereum cryptocurrency worth about US $170,000 at the time, or $745,000 at the cryptocurrency’s value in July 2022.

It was by far the highest transaction yet for a nonfungible token (NFT), the then-new concept of a unique digital asset. And it was a headline-grabbing opportunity for CryptoKitties, the world’s first blockchain gaming hit. But the sky-high transaction obscured a more difficult truth: CryptoKitties was dying, and it had been for some time.

The launch of CryptoKitties drove up the value of ether and the number of transactions on the Ethereum blockchain. Even as the game's transaction volume plummeted, the number of Ethereum transactions continued to rise, possibly because of the arrival of multiple copycat NFT games.

That perhaps unrealistic wish becomes impossible once the downward spiral begins. Players, feeling no other attachment to the game than growing an investment, quickly flee and don’t return.

Whereas some blockchain games have seemingly ignored the perils of CryptoKitties’ quick growth and long decline, others have learned from the strain it placed on the Ethereum network. Most blockchain games now use a sidechain, a blockchain that exists independently but connects to another, more prominent “parent” blockchain. The chains are connected by a bridge that facilitates the transfer of tokens between each chain. This prevents a rise in fees on the primary blockchain, as all game activity occurs on the sidechain.
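The lock-and-mint pattern behind such bridges can be modeled in a few lines. The class below is a toy illustration of the accounting a bridge maintains; every name here is hypothetical, not any real bridge's API, and real bridges add signatures, validators, and finality checks.

```python
# Toy model of a lock-and-mint bridge: tokens locked on the parent
# chain back an equal number of tokens minted on the sidechain.
class Bridge:
    def __init__(self):
        self.locked_on_parent = 0   # tokens held by the bridge contract
        self.minted_on_side = 0     # wrapped tokens in circulation

    def deposit(self, amount):
        """Lock on the parent chain; mint the same amount on the side."""
        self.locked_on_parent += amount
        self.minted_on_side += amount

    def withdraw(self, amount):
        """Burn on the sidechain; release from the parent-chain lock."""
        if amount > self.minted_on_side:
            raise ValueError("cannot burn more than was minted")
        self.minted_on_side -= amount
        self.locked_on_parent -= amount

bridge = Bridge()
bridge.deposit(100)
bridge.withdraw(40)
print(bridge.locked_on_parent, bridge.minted_on_side)
```

The invariant is that locked and minted balances stay equal; bridge hacks like the Ronin attack amount to minting or releasing tokens without the matching lock, breaking exactly this accounting.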

Yet even this new strategy comes with problems, because sidechains are proving to be less secure than the parent blockchain. An attack on Ronin, the sidechain used by Axie Infinity, let the hackers get away with the equivalent of $600 million. Polygon, another sidechain often used by blockchain games, had to patch an exploit that put $850 million at risk and pay a bug bounty of $2 million to the hacker who spotted the issue. Players who own NFTs on a sidechain are now warily eyeing its security.

Remember Dragon

The cryptocurrency wallet that owns the near-million-dollar kitten Dragon now holds barely 30 dollars’ worth of ether and hasn’t traded in NFTs for years. Wallets are anonymous, so it’s possible the person behind the wallet moved on to another one. Still, it’s hard not to see the wallet’s inactivity as a sign that, for Rabono, the fun didn’t last.

Whether blockchain games and NFTs shoot to the moon or fall to zero, Bladon remains proud of what CryptoKitties accomplished and hopeful it nudged the blockchain industry in a more approachable direction.

“Before CryptoKitties, if you were to say ‘blockchain,’ everyone would have assumed you’re talking about cryptocurrency,” says Bladon. “What I’m proudest of is that it was something genuinely novel. There was real technical innovation, and seemingly, a real cultural impact.”

This article was corrected on 11 August 2022 to give the correct date of Bryce Bladon's departure from Dapper Labs.

This article appears in the September 2022 print issue as “The Spectacular Collapse of CryptoKitties.”
