Can HPE’s “The Machine” Deliver?

A 320-terabyte prototype, expected in 2016, will showcase the company’s bid to reinvent computing

Illustration: Elias Stein

HP never shied away from big names for its computers. There are high-performance servers named Apollo and optimized computing systems called Moonshot. And then there’s The Machine.

When Hewlett-Packard Co.—now split in two—announced The Machine in Las Vegas in 2014, it presented the project as a near-complete overhaul of traditional computer architecture. Gone were the CPU-centric architecture, the slow copper communications, and the messy hierarchy of traditional memory. In their place, specialized computing cores, speedy light-carrying photonic connections, and a massive store of dense, energy-efficient memristor memory. The resulting computer, its designers say, will be efficient enough to manipulate petabyte-scale data sets in an unprecedented fashion, expanding what companies and scientists can accomplish in areas such as graph theory, predictive analytics, and deep learning in a way that could improve our daily lives.

There is nothing small about what HP promised. Now the question is what will come of the initial claims. It seems we’ll soon get a glimpse of the vision, realized in hardware; Hewlett Packard Labs (formerly HP Labs) says it hopes to unveil its first large-scale prototype of The Machine in 2016. The project is now part of Hewlett Packard Enterprise (HPE), which focuses on corporate computing (the second HP spin-off, HP Inc., sells PCs and printers).

The Machine prototype at Hewlett Packard Labs in Palo Alto. Photo: Hewlett Packard Enterprise

That initial computer is a whopper by modern standards: It’s expected to be about the size of a server rack, with 320 terabytes of memory and more than 2,500 CPU cores. But one key Machine component will be missing: the memristor.

The device, proposed as the fourth fundamental circuit element alongside the resistor, capacitor, and inductor, was first demonstrated at HP Labs. HP touted it as an energy-efficient replacement for DRAM, the workhorse memory that sits close to CPUs. Like flash, memristors can act as nonvolatile storage, retaining their data even when powered down.

HP Labs looked at ways to capitalize on these properties after the memristor was announced in 2008. In the end, the device became a key component of a larger overhaul the company calls Memory-Driven Computing. Current computers have different forms of memory, each with different advantages when it comes to cost, power consumption, and speed. Memristors with various capabilities, the thinking went, could eventually replace them all.
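The hierarchy of tradeoffs described above can be sketched as follows. The tiers and order-of-magnitude access times are common industry approximations, not figures from HPE; the point is that each tier trades speed against cost and volatility, and the memristor pitch was a single device family spanning all of them.

```python
# Rough sketch of the conventional memory hierarchy (illustrative,
# order-of-magnitude figures only). Faster tiers are volatile and
# expensive per byte; slower tiers are nonvolatile and cheap.
hierarchy = {
    # tier:        (typical access time, nonvolatile?)
    "SRAM cache":  ("~1 ns",   False),
    "DRAM":        ("~100 ns", False),
    "NAND flash":  ("~100 us", True),
    "Hard disk":   ("~10 ms",  True),
}

for tier, (latency, nonvolatile) in hierarchy.items():
    kind = "nonvolatile" if nonvolatile else "volatile"
    print(f"{tier:11s} {latency:>8s}  {kind}")
```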

In his first public talk about The Machine, in 2014, the then HP Labs director Martin Fink outlined a picture of what this might look like: a computer server with as much as 160 petabytes of data—roughly five times what the Large Hadron Collider produces in the course of a year—all accessible in less than 250 nanoseconds. Such capacity would be huge, says Richard Fichera, an analyst at Forrester Research, and the access time would be on the order of 1,000 times as fast as today's systems allow. But memristors have had teething pains. Although HP previously showed off wafers containing memristors, a commercial offering has yet to emerge.
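A quick sanity check of those figures, assuming the commonly cited rough rate of 30 petabytes of recorded LHC data per year and a fetch time of roughly 250 microseconds from today's disk-backed clusters (both are outside approximations, not numbers from HPE or Forrester):

```python
# Back-of-envelope check of the claims above (illustrative assumptions).
MACHINE_CAPACITY_PB = 160        # Fink's 2014 figure
LHC_OUTPUT_PB_PER_YEAR = 30      # commonly cited approximation
MACHINE_ACCESS_NS = 250          # claimed worst-case access time
CLUSTER_ACCESS_NS = 250_000      # ~250 microseconds, assumed cluster fetch

print(MACHINE_CAPACITY_PB / LHC_OUTPUT_PB_PER_YEAR)  # about 5 years of LHC data
print(CLUSTER_ACCESS_NS / MACHINE_ACCESS_NS)         # the claimed ~1,000x speedup
```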

Memristor Timeline

  1. 2008

    HP Labs announces the development of a fourth fundamental circuit element, the memristor.

  2. 2010

    HP announces a partnership with SK Hynix (then Hynix) to manufacture the memory.

  3. 2011

    HP says it hopes to have memristor products ready by the end of 2013.

  4. 2014

    The Machine is announced. Details on the first prototype emerge the following year.

  5. 2015

    Another memristor partnership, this one with memory maker SanDisk, is announced.

When The Machine’s first prototype arrives, it will carry energy-hungry DRAM in place of memristors. The memory will be a stand-in, the team says, that can be used to emulate later hardware and move development forward. “Everything the Machine team is doing is designed to get us to working, useful Machine prototypes as quickly as possible,” the group wrote to IEEE Spectrum. “We still believe memristors are the best candidate, but rather than waiting until memristors are ready before making prototype Machines, we’ll use more conventional memory to learn about memory fabrics and how operating systems, analytics, and applications should change now rather than waiting.” Memristors, in principle, will be slotted in later, although no time frame has been given for when that might occur.

Grand announcements about products that are still years away are rare in the computer industry. “We don’t need to talk about stuff five years from now,” Juan Loaiza, Oracle senior vice president for systems technology, told Bloomberg Businessweek.

HPE says the project will evolve, with hardware and software building upon one another in a “virtuous cycle.” Commercial offerings, ranging from credit-card size to supercomputer scale, are targeted for 2020, but components of The Machine will be marketed as they become available.

“As a research vehicle, I think it’s a great idea. I just think they sort of painted themselves into a corner with a lot of early and aggressive hype,” says Forrester’s Fichera, who worked on server strategy at HP. “It’s going to be a real challenge for them to pull off some of the things they’re talking about.”

The memristor could have a big impact on success, Fichera says, as it is the one piece of Machine hardware exclusive to the company. “If they could build a huge server with a huge memory space and somehow allow these individual nodes to share that memory more efficiently, that could be a very advantageous architecture,” he says. “Unfortunately, anyone else who wants to chase that same goal is going to have all of the same components to work with.”

But perhaps HPE, which sells servers, will go a different way. Last year, Intel and Micron announced a speedy new form of nonvolatile memory called 3D XPoint, slated to ship in 2016. Jim Handy, an analyst at Objective Analysis, says he suspects those companies could well have convinced the then-undivided HP that their memory is “a better or more timely solution.”

If that’s true, other companies will have access to that same memory technology. With the ample heads-up from HP in 2014, they too could be lacing up their racing shoes.
