Competing Visions Underpin China’s Quantum Computer Race

Alibaba builds their own qubits, Baidu remains quantum hardware-agnostic

IEEE Spectrum

China and the US are in a race to conquer quantum computing, which promises to unleash the potential of artificial intelligence and give the owner all-seeing, code-breaking powers.

But there is a race within China itself among companies trying to dominate the space, led by tech giants Alibaba and Baidu.

Like their competitors IBM, Google, Honeywell, and D-Wave, both Chinese companies profess to be developing "full stack" quantum businesses, offering access to quantum computing through the cloud coupled with their own suite of algorithms, software, and consulting services.

Alibaba is building solutions for specific kinds of hardware, as IBM, Google, and Honeywell are doing. (IBM's software stack will also support trapped ion hardware, but the company's focus is on supporting its superconducting quantum computers. Honeywell's software partner, Cambridge Quantum, is hardware agnostic, but the two companies' cooperation is focused on Honeywell's trapped ion computer.)

Baidu is different in that it is building a hardware-agnostic software stack that can plug into any quantum hardware, whether that hardware uses superconducting circuits, nuclear magnetic resonance, or ion traps to control its qubits.

"Currently we don't do hardware directly, but develop the hardware interface," Runyao Duan, Baidu's head of quantum computing, told the 24th Annual Conference on Quantum Information Processing earlier this year. "This is a very flexible strategy and ensures that we will be open for all hardware providers."

Quantum computers calculate using the probability that an array of entangled quantum particles is in a particular state at any point in time. Maintaining and manipulating the fragile particles is itself a difficult problem that has yet to be solved at scale. Quantum computers today consist of fewer than 100 qubits, though hardware leader IBM has a goal of reaching 1,000 qubits by 2023.
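The readout principle described above can be sketched with a toy classical simulation in plain Python (numpy only, no quantum SDK; the Bell-state example is our own illustration): a quantum state is a vector of amplitudes, and measuring it yields each basis state with probability equal to the squared amplitude.

```python
import numpy as np

# Two-qubit Bell state (|00> + |11>) / sqrt(2), written as four
# amplitudes over the basis states 00, 01, 10, 11.
state = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

# Born rule: probability of each outcome is |amplitude|^2,
# here 0.5 each for 00 and 11 and zero for 01 and 10.
probabilities = np.abs(state) ** 2
print(probabilities)

# Sampling measurements reproduces those probabilities statistically;
# the entangled qubits always agree (outcome 00 or 11, never 01 or 10).
rng = np.random.default_rng(0)
outcomes = rng.choice(["00", "01", "10", "11"], size=1000, p=probabilities)
print((outcomes == "01").sum(), (outcomes == "10").sum())  # 0 0
```

Maintaining that fragile agreement between qubits on real hardware, rather than in a simulation like this one, is exactly the scaling problem the field is wrestling with.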

But an equally thorny problem is how to use those qubits once they exist. "We can build a qubit. We can manipulate a qubit and we can read a qubit," said Mattia Fiorentini, head of machine learning and quantum algorithms at Cambridge Quantum in London. "The question is, how do you build software that can really benefit from all that information processing power?"

Scientists around the world are working on ways to program quantum computers that are useful, general-purpose, and ready for engineers to use pretty much straight out of the box.

Of course, real large-scale quantum computing remains a relatively distant prospect. Today's quantum cloud services are used primarily to simulate quantum computing on classical machines, though some offer access to small quantum systems, so it's too early to say whether Baidu's strategy will pay off.

In the past, Alibaba worked with the University of Science and Technology of China in Hefei, the capital of central China's Anhui province, which currently has the world's most advanced quantum computer, dubbed Zuchongzhi 2.1 after the famous fifth-century Chinese mathematician and astronomer who calculated pi to seven decimal places. The company is also building quantum computing hardware of its own.

China's most important quantum scientist, Pan Jianwei, has also worked for Alibaba as a scientific advisor. Earlier this year, Pan's team set a new milestone in quantum computation with the 66-qubit Zuchongzhi 2.1. Pan and his team ran a calculation on the device in about an hour and a half that would take the world's fastest classical supercomputer an estimated eight years to complete.

Baidu, meanwhile, has been releasing a series of platforms and tools that it hopes will put it ahead when quantum computers eventually become large enough and stable enough to be practical.

Last year, it announced a new cloud-based quantum computing platform called Quantum Leaf, which it bills as the first cloud-native quantum computing platform in China—a bit of semantics apparently intended to put it ahead of Alibaba's cloud division, which began offering a cloud-based quantum platform with the Chinese Academy of Sciences several years ago.

Unlike Alibaba's platform, Quantum Leaf's cloud programming environment provides quantum-infrastructure-as-a-service.

Baidu's cloud-native quantum computing platform Quantum Leaf provides access to the superconducting quantum processing unit from the Institute of Physics, Chinese Academy of Sciences.

Baidu also released Paddle Quantum, a device-independent platform for building and training quantum neural network models for advanced quantum computing applications. It combines AI and quantum computing using the company's deep learning framework, PaddlePaddle (the name abbreviates PArallel Distributed Deep LEarning), which has 3.6 million developers and can support hyperscale training models with trillions of parameters.

Paddle Quantum, in turn, can be used to develop quantum neural network models for software solutions. Users can then deploy those models on either quantum processing units or simulators through Quantum Leaf.
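What "training a quantum neural network model" means can be illustrated with a deliberately minimal, hand-rolled sketch. This is generic numpy, not Paddle Quantum's actual API, and the one-parameter circuit is our own assumption: a single trainable rotation RY(theta) acting on |0>, trained by gradient descent to minimize the expectation value of Z.

```python
import numpy as np

def expectation_z(theta: float) -> float:
    # RY(theta)|0> = cos(theta/2)|0> + sin(theta/2)|1>, so <Z> = cos(theta).
    return np.cos(theta)

def gradient(theta: float) -> float:
    # Analytic derivative of the loss. Real frameworks obtain this from
    # circuit evaluations, e.g. via the parameter-shift rule.
    return -np.sin(theta)

theta = 0.1                             # start near |0>
for _ in range(200):
    theta -= 0.3 * gradient(theta)      # gradient-descent step

print(round(expectation_z(theta), 3))   # -1.0: the qubit has been driven to |1>
```

The loop converges to theta = pi, flipping the qubit; a real quantum neural network does the same thing with many entangled qubits and many parameters, which is where the quantum hardware (or a simulator) comes in.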

Baidu's quantum activities are largely focused on quantum artificial intelligence, an extension of Baidu's current artificial intelligence activities.

Baidu also offers a "cloud-based quantum pulse computing service" called Quanlse, intended to bridge the gap between hardware and software through sequences of pulses that can control quantum hardware and reduce quantum error, one of the biggest challenges in quantum computing.

"We see an increasing number of demands from universities and companies to use our quantum platform and collaborat[e] on quantum solutions, [which] is an essential part of our quantum ecology," a Baidu spokesperson said.

Quantum computing is expected to accelerate the development of artificial intelligence, both by making models faster and by enabling compute-intensive models that aren't currently possible on classical computers.

The company established a quantum computing institute in 2018 whose research includes classification of quantum data, which opens the door to quantum machine learning. To classify chemical compounds as toxic or non-toxic, for example, data scientists currently use classical means. But because the underlying data—the molecules and their configurations—is quantum data, it would be faster and more accurate to classify that quantum data directly with a quantum computer.

Quantum information is encoded in the probability distribution of qubit states. That probability distribution is reconstructed by collecting samples with classical means, but the number of samples needed grows exponentially as you add qubits.

"The more you add qubits to your quantum system, the more powerful the system, but the more samples you need to take to extract all useful information," says Cambridge Quantum's Fiorentini.
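Fiorentini's point can be made concrete with a small numpy sketch (the qubit counts and shot budget are our own toy numbers): an n-qubit readout is a probability distribution over 2**n basis states, so the number of probabilities to estimate doubles with every added qubit, while any realistic shot budget stays fixed.

```python
import numpy as np

# The state space doubles with each qubit.
for n in (1, 10, 20, 50):
    print(n, "qubits ->", 2 ** n, "basis states")

# With a fixed shot budget, per-outcome estimates degrade quickly: here
# 1,000 shots spread over the 1,024 outcomes of a 10-qubit uniform
# distribution leave each probability estimated from roughly one sample.
rng = np.random.default_rng(0)
n, shots = 10, 1000
true_p = np.full(2 ** n, 1 / 2 ** n)
counts = np.bincount(rng.choice(2 ** n, size=shots, p=true_p), minlength=2 ** n)
worst_rel_err = np.abs(counts / shots - true_p).max() / true_p[0]
print(f"worst per-outcome relative error: {worst_rel_err:.0%}")
```

At 50 qubits the state space exceeds 10**15 outcomes, far beyond any feasible number of shots, which is why extracting information with few samples matters so much.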

Existing methods for quantum classification are impractical because hardware and infrastructure limitations restrict the complexity of the datasets they can handle.

Baidu researchers' new hybrid quantum-classical framework for supervised quantum learning uses what they call the “shadows" of quantum data as a subroutine to extract significant features—where “shadows" here refers to a method for approximating classical descriptions of a quantum state using relatively few measurements of the state.
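A minimal single-qubit sketch can show how the shadows idea works (this follows the general classical-shadows protocol of Huang, Kueng, and Preskill, not Baidu's specific framework; the state, seed, and sample count are illustrative assumptions): measure in a randomly chosen Pauli basis, build an unbiased classical "snapshot" from each one-shot outcome, and average.

```python
import numpy as np

rng = np.random.default_rng(7)

theta = 1.0                                   # |psi> = cos(t/2)|0> + sin(t/2)|1>
true_exp = {"X": np.sin(theta), "Y": 0.0, "Z": np.cos(theta)}

def one_snapshot_z() -> float:
    basis = rng.choice(["X", "Y", "Z"])       # measure in a random Pauli basis
    # Simulate the +/-1 measurement outcome from the state's true statistics.
    outcome = 1 if rng.random() < (1 + true_exp[basis]) / 2 else -1
    # Inverting the measurement channel gives the one-qubit snapshot
    # rho_hat = (I + 3*outcome*B)/2, so the snapshot's estimate of <Z> is
    # 3*outcome when we happened to measure in the Z basis, and 0 otherwise.
    return 3.0 * outcome if basis == "Z" else 0.0

estimate = np.mean([one_snapshot_z() for _ in range(30_000)])
print(abs(estimate - true_exp["Z"]) < 0.05)   # the average converges to <Z>
```

The payoff is that one set of random measurements yields estimates for many observables at once, which is what makes the approach sample-efficient.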

"If we can get all the key information out of the quantum computer with a very small number of samples without sacrificing information, that's significant," says Fiorentini.

Baidu's hybrid quantum-classical framework, meanwhile, sharply reduces the number of parameters, making quantum machine learning models easier and less compute-intensive to train.

In the near term, the company says, it is pursuing more efficient and more powerful classical computing resources that can accelerate its AI applications, from training large-scale models to inference in the cloud or at the edge. In 2018, it developed a cross-architecture AI chip called Kunlun, named for the mountain range on the Tibetan plateau that is the mythological origin of Chinese civilization.

Baidu has produced more than 20,000 14-nm Kunlun chips for use in its search engine, Baidu AI Cloud, and other applications. It recently announced mass production of Kunlun II, which offers two to three times the performance of the previous generation, is built with a leading-edge 7-nm process on Baidu's second-generation cross-platform architecture, and has lower peak power consumption while delivering significantly better performance in AI training and inference. The chip can be applied in multiple scenarios, including in the cloud, on terminals, and at the edge, powering high-performance computing clusters used in biocomputing and autonomous driving.

Why Functional Programming Should Be the Future of Software Development

It’s hard to learn, but your code will produce fewer nasty surprises

You’d expect the longest and most costly phase in the lifecycle of a software product to be the initial development of the system, when all those great features are first imagined and then created. In fact, the hardest part comes later, during the maintenance phase. That’s when programmers pay the price for the shortcuts they took during development.

So why did they take shortcuts? Maybe they didn’t realize that they were cutting any corners. Only when their code was deployed and exercised by a lot of users did its hidden flaws come to light. And maybe the developers were rushed: time-to-market pressures all but guarantee that their software contains more bugs than it otherwise would.
