Can The Human Brain Project Succeed?

Europe’s ambitious program to simulate the human brain is meeting with some very human resistance

An ambitious effort to build human brain simulation capability is meeting with some very human resistance. On Monday, a group of researchers sent an open letter to the European Commission protesting the management of the Human Brain Project, one of two Flagship initiatives selected last year to receive as much as €1 billion over the course of 10 years (the other award went to a far less controversy-courting project devoted to graphene).

The letter, which now has more than 450 signatories, questions the direction of the project and calls for a careful, unbiased review. Although he’s not mentioned by name in the letter, news reports cited resistance to the path chosen by project leader Henry Markram of the Swiss Federal Institute of Technology in Lausanne. One particularly polarizing change was the recent elimination of a subproject, called Cognitive Architectures, as the project made its bid for the next round of funding.

According to Markram, the fuss all comes down to differences in scientific culture. He has described the project, which aims to build six different computing platforms for use by researchers, as an attempt to build a kind of CERN for brain research, a means by which disparate disciplines and vast amounts of data can be brought together. This is a "methodological paradigm shift" for neuroscientists accustomed to individual research grants, Markram told Science, and that's what he says the letter signers are having trouble with.

But some question the main goals of the project, and whether we're actually capable of achieving them at this point. The program's Brain Simulation Platform aims to build the technology needed to reconstruct the mouse brain, and eventually the human brain, in a supercomputer. Part of the challenge there is technological. Markram has said that an exascale-level machine (one capable of executing 1000 or more petaflops) would be needed to "get a first draft of the human brain", and the energy requirements of such machines are daunting.

Crucially, some experts say that even if we had the computational might to simulate the brain, we're not ready to do it. "The main apparent goal of building the capacity to construct a larger-scale simulation of the human brain is radically premature," signatory Peter Dayan, who directs a computational neuroscience department at University College London, told the Guardian. He called the project a "waste of money" that "can't but fail from a scientific perspective". To Science, he said, "the notion that we know enough about the brain to know what we should simulate is crazy, quite frankly."

This last comment resonated with me, as it reminded me of a feature that Steve Furber of the University of Manchester wrote for IEEE Spectrum a few years ago. Furber, one of the original designers of the ARM processor that now powers most mobile chips, is now stringing a million or so of those low-power processors together to build a massively parallel computer capable of simulating 1 billion neurons, about 1 percent as many as are contained in the human brain.

Furber and his collaborators designed their computing architecture carefully to account for the host of open questions that remain about basic brain operation. General-purpose computers are power-hungry and slow when it comes to brain simulation. Analog circuitry, which is also on the Human Brain Project's list, might better mimic the way neurons actually operate, but, he wrote,

“as speedy and efficient as analog circuits are, they’re not very flexible; their basic behavior is pretty much baked right into them. And that’s unfortunate, because neuroscientists still don’t know for sure which biological details are crucial to the brain’s ability to process information and which can safely be abstracted away.”

The Human Brain Project's website admits that exascale computing will be hard to reach: "even in 2020, we expect that supercomputers will have no more than 200 petabytes." To make up for the shortfall, it says, "what we plan to do is build fast random-access storage systems next to the supercomputer, store the complete detailed model there, and then allow our multi-scale simulation software to call in a mix of detailed or simplified models (models of neurons, synapses, circuits, and brain regions) that matches the needs of the research and the available computing power. This is a pragmatic strategy that allows us to keep building ever more detailed models, while keeping our simulations to the level of detail we can support with our current supercomputers."
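The gist of that strategy, swapping simplified stand-ins for detailed models wherever the hardware can't support full detail, can be sketched in a few lines of Python. To be clear, everything below (the class names, the cost figures, the regions) is invented for illustration and is not the Human Brain Project's actual software:

```python
# Illustrative sketch only: pick per-region model fidelity to fit a compute budget.
# All names and numbers here are hypothetical, not taken from the HBP codebase.

from dataclasses import dataclass

@dataclass
class Region:
    name: str
    neurons: int
    priority: float  # how much detail this region needs for the question at hand

# Made-up per-neuron simulation costs (arbitrary units) for two levels of detail.
COST = {"detailed": 100.0, "simplified": 1.0}

def plan_simulation(regions, budget):
    """Assign 'detailed' or 'simplified' models, spending the budget on
    high-priority regions first and falling back to simplified models elsewhere."""
    plan = {}
    remaining = budget
    for region in sorted(regions, key=lambda r: r.priority, reverse=True):
        detailed_cost = region.neurons * COST["detailed"]
        if detailed_cost <= remaining:
            plan[region.name] = "detailed"
            remaining -= detailed_cost
        else:
            plan[region.name] = "simplified"
            remaining -= region.neurons * COST["simplified"]
    return plan

regions = [
    Region("visual cortex", 5_000_000, priority=0.9),
    Region("hippocampus", 1_000_000, priority=0.7),
    Region("cerebellum", 50_000_000, priority=0.2),
]
# With a budget of 2e9 units, the two high-priority regions get detailed models
# and the cerebellum falls back to a simplified one.
print(plan_simulation(regions, budget=2e9))
```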

That strategy does sound flexible. But, as is par for the course with any ambitious research project, particularly one that demands so much synthesis across disparate fields, it's not yet clear whether it will pay off.

And any big changes in direction may take a while. According to Science, which reached out to the European Commission, the proposal for the second round of funding will be reviewed this year, but the first review of the project itself won't begin until January 2015.

Rachel Courtland can be found on Twitter at @rcourt.
