Bitcoin’s Biggest Tech Player to Release AI Chips and Computers

Bitmain built the majority of the computing power on the Bitcoin network. Now it wants to expand into deep learning and other AI

Bitmain's Sophon BM1680 AI chips
Photo: Bitmain

By its own reckoning, Bitmain built 70 percent of all the computers on the Bitcoin network. It makes specialized chips to perform the critical hash functions involved in mining bitcoins, and it packages those chips into the network's top mining rig, the Antminer S9.

But Bitmain CEO Jihan Wu sees a future beyond blockchains and cryptocurrency. As he told IEEE Spectrum contributing editor Morgen E. Peck in July: “It’s quite personal that I wanted Bitcoin to be successful. But as a company we are not allowed to solely rely on the success of Bitcoin. That’s a thing we cannot afford.”

So Bitmain is applying its Bitcoin playbook to artificial intelligence. On 8 November, Bitmain's co-CEO, Micree Zhan, will detail the company's new AI chip, the Sophon BM1680, at AIWORLD in Beijing, and the company will begin selling systems based on it.

The company claims the chip is specialized for both training deep-learning algorithms and executing them, a task known as inference. As Spectrum's David Schneider pointed out earlier this year, training an AI and performing inference place different demands on a processor. In particular, he notes, training typically needs high-precision math, while inference is most efficient at low precision.
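A toy illustration of why inference tolerates low precision: weights trained in 32-bit floating point can often be rounded down to 8-bit integers with only a small loss of accuracy. The weight values and the symmetric scaling scheme below are illustrative assumptions, not anything published by Bitmain or Google.

```python
import numpy as np

# Hypothetical trained weights in 32-bit floating point (training precision).
weights_fp32 = np.array([0.127, -0.534, 0.901, -0.258], dtype=np.float32)

# Simple symmetric quantization to 8-bit integers (inference precision):
# map the largest-magnitude weight to 127 and scale the rest to match.
scale = np.abs(weights_fp32).max() / 127.0
weights_int8 = np.round(weights_fp32 / scale).astype(np.int8)

# Dequantize and measure how far the 8-bit version drifted.
recovered = weights_int8.astype(np.float32) * scale
max_error = float(np.abs(recovered - weights_fp32).max())

print(weights_int8)                   # [ 18 -75 127 -36]
print(f"max error: {max_error:.4f}")  # small relative to the weight range
```

The 8-bit copy takes a quarter of the memory and bandwidth of the original, which is why inference-oriented chips like Google's TPU favor it.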

Google’s Tensor Processing Unit uses 8-bit math for inference. The BM1680, by contrast, uses 32-bit floating-point math, according to its specs. It can perform 2 teraflops (2 trillion floating-point operations per second), and it typically consumes 25 watts but can ramp up to 41 W when running flat out.
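Those specs imply an energy efficiency that is easy to check with back-of-the-envelope arithmetic (this is a derived figure, not one Bitmain publishes):

```python
# Back-of-the-envelope efficiency from the published BM1680 specs.
peak_flops = 2e12       # 2 teraflops
typical_watts = 25.0    # typical power draw
max_watts = 41.0        # power draw when running flat out

gflops_per_watt_typical = peak_flops / typical_watts / 1e9
gflops_per_watt_max = peak_flops / max_watts / 1e9

print(gflops_per_watt_typical)        # 80.0 gigaflops per watt at typical draw
print(round(gflops_per_watt_max, 1))  # 48.8 gigaflops per watt when flat out
```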

Bitmain claims the Sophon's architecture is similar to that of Google's TPU. It belongs to the general class called systolic, meaning that data flows through its processing cores in waves. “Google TPU is a cool role model among all AI ASIC makers,” Janet Zhao, who works in Bitmain’s high-performance computing group, said in an email. “When we started to develop our own AI chip… in the end of 2015, we adopted the enhanced systolic technology which is similar to Google TPU.”
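A minimal sketch of the systolic idea: operands march through a grid of multiply-accumulate cells in lockstep diagonal waves, so each value is reused as it passes neighboring cells instead of being refetched from memory. This is a toy simulation of an output-stationary array, not Bitmain's or Google's actual design.

```python
import numpy as np

def systolic_matmul(A, B):
    """Toy output-stationary systolic array computing C = A @ B.

    Cell (i, j) accumulates one element of C. Skewing the inputs by
    row and column index means that at time step t the cell sees
    A[i, t - i - j] and B[t - i - j, j], creating a diagonal wavefront
    of data sweeping across the grid.
    """
    n, k = A.shape
    k2, m = B.shape
    assert k == k2, "inner dimensions must match"
    C = np.zeros((n, m))
    for t in range(n + m + k - 2):   # enough steps for the last wave to exit
        for i in range(n):
            for j in range(m):
                s = t - i - j        # which operand pair reaches cell (i, j) now
                if 0 <= s < k:
                    C[i, j] += A[i, s] * B[s, j]
    return C

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])
print(systolic_matmul(A, B))  # [[19. 22.] [43. 50.]], matching A @ B
```

The payoff in silicon is that each cell only ever talks to its neighbors, so the wiring stays short and the multipliers stay busy, which is what makes the design attractive for deep-learning accelerators.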

The Sophon BM1680 is the heart of a card and specialized server that Bitmain will begin selling on 8 November. The Sophon SC1 card is meant as an accelerator for deep learning applications. The Sophon SS1 server contains the SC1 card and is meant for video and image analysis.

The company says that over the past year and a half of development it has been communicating with the technical and business teams of potential customers, including Baidu, Alibaba, and Tencent.

Zhao says that these companies are concerned about the cost, stability of supply, and power consumption of GPU-based AI accelerators and are looking for a new vendor.
