Bitcoin’s Biggest Tech Player to Release AI Chips and Computers

Bitmain built the majority of the computing power on the Bitcoin network. Now it wants to expand into deep learning and other AI

Bitmain's Sophon BM1680 AI chips
Photo: Bitmain

By its own reckoning, Bitmain built 70 percent of all the computers on the Bitcoin network. It makes specialized chips to perform the critical hash functions involved in mining bitcoins, and packages those chips into the top mining rig, the Antminer S9.

But Bitmain CEO Jihan Wu sees a future beyond blockchains and cryptocurrency. As he told IEEE Spectrum contributing editor Morgen E. Peck in July: “It’s quite personal that I wanted Bitcoin to be successful. But as a company we are not allowed to solely rely on the success of Bitcoin. That’s a thing we cannot afford.”

So Bitmain is applying its Bitcoin playbook to artificial intelligence. On 8 November, Bitmain's other CEO, Micree Zhan, will detail the company's new AI chip, the Sophon BM1680, at AIWORLD in Beijing, and Bitmain will begin selling systems based on it.

The company claims the chip is specialized for both training and executing deep learning algorithms. The latter task, running a trained network on new data, is known as inference. As Spectrum's David Schneider pointed out earlier this year, training an AI and performing inference require different skills from a processor. In particular, he notes, training typically needs high-precision math, while inference is most efficient at low precision.
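The precision gap Schneider describes can be pictured with a simple post-training quantization sketch. This is an illustrative example in Python/NumPy with made-up weight values, not Bitmain's or anyone else's actual toolchain: weights trained at 32-bit precision are mapped to cheap 8-bit integers for inference, at a small cost in accuracy.

```python
import numpy as np

# Hypothetical trained weights, stored at the 32-bit precision used for training.
weights_fp32 = np.array([0.12, -0.53, 0.87, -0.02, 0.44], dtype=np.float32)

# For inference, map the float range onto 8-bit integers (symmetric quantization).
scale = np.abs(weights_fp32).max() / 127.0
weights_int8 = np.round(weights_fp32 / scale).astype(np.int8)

# At inference time, the cheap int8 values stand in for the originals.
dequantized = weights_int8.astype(np.float32) * scale
max_error = np.abs(dequantized - weights_fp32).max()
print(weights_int8)
print(f"max quantization error: {max_error:.4f}")
```

Rounding to the nearest 8-bit step keeps the error below half a quantization step, which is why inference can tolerate the lower precision while training's small gradient updates generally cannot.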

Google’s Tensor Processing Unit uses 8-bit math for inference. According to its specs, the BM1680 uses 32-bit floating point math. It can perform 2 teraflops (2 trillion floating point operations per second) and typically consumes 25 watts, but can ramp up to 41 W when running flat out.
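Taken at face value, those published figures imply the following throughput-per-watt numbers (simple arithmetic on the specs above, nothing more):

```python
# Published figures for the Sophon BM1680, as stated in the spec sheet.
peak_flops = 2e12      # 2 teraflops
typical_watts = 25.0   # typical power draw
peak_watts = 41.0      # worst-case power draw

# Throughput per watt under typical and worst-case power.
gflops_per_watt_typical = peak_flops / typical_watts / 1e9
gflops_per_watt_peak = peak_flops / peak_watts / 1e9
print(f"{gflops_per_watt_typical:.0f} GFLOPS/W at typical power")
print(f"{gflops_per_watt_peak:.0f} GFLOPS/W at peak power")
```

That works out to 80 GFLOPS per watt at typical draw, the kind of efficiency figure buyers would weigh against GPU-based accelerators.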

Bitmain claims its architecture is similar to Google’s TPU. The architecture is of the general class called systolic, meaning that data flows through its processing cores in waves. “Google TPU is a cool role model among all AI ASIC makers,” Janet Zhao, who works in Bitmain’s high-performance computing group, said in an email. “When we started to develop our own AI chip… in the end of 2015, we adopted the enhanced systolic technology which is similar to Google TPU.”
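The systolic idea, operands marching through a grid of multiply-accumulate cells one step per cycle, can be sketched with a small simulation. This is an illustrative, output-stationary model of how such an array computes a matrix product; it is not Bitmain's or Google's actual design:

```python
import numpy as np

def systolic_matmul(A, B):
    """Simulate an output-stationary systolic array computing A @ B.

    Each cell (i, j) multiplies the operands passing through it and
    accumulates the product. Inputs are skewed in time so that the
    matching elements of A's row i and B's column j meet at cell
    (i, j) on the right cycle -- the "wave" of data the text describes.
    """
    n, k = A.shape
    k2, m = B.shape
    assert k == k2, "inner dimensions must match"
    C = np.zeros((n, m))
    # On cycle t, partial product s = t - i - j arrives at cell (i, j).
    for t in range(n + m + k - 2):
        for i in range(n):
            for j in range(m):
                s = t - i - j
                if 0 <= s < k:
                    C[i, j] += A[i, s] * B[s, j]
    return C

A = np.arange(6).reshape(2, 3)
B = np.arange(6).reshape(3, 2)
print(systolic_matmul(A, B))  # matches A @ B
```

The appeal of the layout is that each cell only ever talks to its neighbors, so the wiring stays short and the array keeps every multiplier busy once the wave fills the grid.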

The Sophon BM1680 is the heart of a card and specialized server that Bitmain will begin selling on 8 November. The Sophon SC1 card is meant as an accelerator for deep learning applications. The Sophon SS1 server contains the SC1 card and is meant for video and image analysis.

The company says that over the chip’s year and a half of development, it has been communicating with the technical and business teams of potential customers, including Baidu, Alibaba, and Tencent.

Zhao says that these companies are concerned about the cost, stability of supply, and power consumption of GPU-based AI accelerators and are looking for a new vendor.
