The Human Brain Project Reboots: A Search Engine for the Brain Is in Sight

The massive €1 billion project has shifted focus from simulation to informatics

Image: Dan Saelinger

The human brain is smaller than you might expect: One of them, dripping with formaldehyde, fits in a single gloved hand of a lab supervisor here at the Jülich Research Center, in Germany.

Soon, this rubbery organ will be frozen solid, coated in glue, and then sliced into several thousand wispy slivers, each just 60 micrometers thick. A custom apparatus will scan those sections using 3D polarized light imaging (3D-PLI) to measure the spatial orientation of nerve fibers at the micrometer level. The scans will be gathered into a colorful 3D digital reconstruction depicting the direction of individual nerve fibers on larger scales—roughly 40 gigabytes of data for a single slice and up to a few petabytes for the entire brain. And this brain is just one of several to be scanned.
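
How do those numbers add up? A back-of-envelope estimate, sketched in Python below, makes the scale concrete. The per-section figure comes from this article; the sectioned extent of the brain is an assumption for illustration.

```python
# Back-of-envelope storage estimate for one 3D-PLI brain scan.
# From the article: 60-micrometer sections, ~40 GB of data per section.
# The sectioned extent of the brain is an assumption for illustration.

BRAIN_EXTENT_MM = 150        # assumed extent being sectioned
SECTION_THICKNESS_UM = 60    # from the article
GB_PER_SECTION = 40          # from the article

sections = int(BRAIN_EXTENT_MM * 1_000 / SECTION_THICKNESS_UM)
raw_tb = sections * GB_PER_SECTION / 1_000

print(f"~{sections:,} sections -> ~{raw_tb:,.0f} TB of raw image data")
# ~2,500 sections -> ~100 TB. Repeated imaging passes, derived fiber
# maps, and scanning several brains push the total toward petabytes.
```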

Neuroscientists hope that by combining and exploring data gathered with this and other new instruments they’ll be able to answer fundamental questions about the brain. The quest is one of the final frontiers—and one of the greatest challenges—in science.

Imagine being able to explore the brain the way you explore a website. You might search for the corpus callosum—the stalk that connects the brain’s two hemispheres—and then flip through individual nerve fibers in it. Next, you might view networks of cells as they light up during a verbal memory test, or scroll through protein receptors embedded in the tissue.

Right now, neuroscientists can’t do that. They lack the hardware to store and access the avalanche of brain data being produced around the world. They lack the software to bridge the gaps from genes, molecules, and cells to networks, connectivity, and human behavior.

“We don’t have the faintest idea of the molecular basis for diseases like Alzheimer’s or schizophrenia or others. That’s why there are no cures,” says Paolo Carloni, director of the Institute for Computational Biomedicine at Jülich. “To make a big difference, we have to dissect [the brain] into little pieces and build it up again.”

That’s why there’s no choice but to move from small-scale investigations to large, collaborative efforts. “The brain is too complex to sit in your office and solve it alone,” says neuroscientist Katrin Amunts, who coleads the 3D-PLI project at Jülich. Neuroscientists need to make the same transition that physicists and geneticists once did—from solo practitioners to consortia—and that transformation won’t be easy.

The Human Genome Project, for instance, was a concerted international effort culminating in a full, searchable human genome—all the pages of the manual for making a human body. Along the way, the project pioneered technologies that have since been used to sequence the genomes of countless species. Today, “genetics” is practically synonymous with “bioinformatics.” Could neuroscientists do the same—employ informatics to create a searchable manual for the brain?

That is the plan of the Human Brain Project, funded in 2013 by the European Commission to the tune of €1 billion. After a rocky, controversial start, the HBP is now building infrastructure that includes high-performance computing, data analytics, and simulation and modeling software. With these tools, postdocs in China and Massachusetts might analyze and collaborate on the same high-resolution data.

There’s nothing glamorous, prestigious, or headline-making about software, hardware, and data curation. Even so, many researchers say this is exactly what the field requires. And the pressure is on to produce, says Simon B. Eickhoff, an imaging specialist on the HBP neuroinformatics team.

“Coming up with something premature, selling it big, and having people go to the website and being disappointed is the very last thing we can do now,” Eickhoff says. “It’s on us now to deliver.”

Neuroscientist Henry Markram, who in 2005 also founded the Swiss Blue Brain Project to simulate a portion of the rat brain, first presented his vision of the Human Brain Project in a 2009 TED talk. His plan was to simulate the several hundred trillion synapses of a human brain on a supercomputer within 10 years.

As soon as the HBP was funded, things got messy. Some scientists derided the aspiration as both too narrow and too complex. Several labs refused to join the HBP; others soon dropped out. Then, in July 2014, more than 800 neuroscientists signed an open letter to the European Commission threatening to boycott HBP projects unless the commission had an independent panel review “both the science and the management of the HBP.”

The commission ordered an overhaul, and a year later an independent panel published a 53-page report [PDF] that criticized the project’s science and governance alike. It concluded that the HBP should focus on goals that can be “realistically achieved” and “concentrate on enabling methods and technologies.” And that’s exactly what HBP has done over the past two years. It has deemphasized simulation, focusing instead on the task of mapping the brain in enormous detail. “After having trouble at the beginning, we are now on a good road,” says Amunts, now the scientific research director of the HBP.

“It’s a big relief,” says Henry Kennedy, research director at the Stem-Cell and Brain Research Institute in Lyon, France, who signed the original letter. “They’ve done what they said they would do in the mediation process.”

Markram, who lost his executive position at HBP, is more subdued about the changes. “I dedicated three years to win the HBP grant, bringing hundreds of scientists and engineers around the focused mission of simulating the brain,” he told IEEE Spectrum. The former project leader remains convinced that simulation is the future of neuroscience, but he expects the revamped HBP will still produce good science. “Building many different models, theories, tools, and collecting all kinds of data and a broad range of experiments serves a community vision of neuroscience very well.”

Today, HBP researchers are getting down to the hard work of proving the viability and importance of what is, in fact, a labyrinth of research platforms, subprojects, and advisory boards across Europe. It’s not the first large neuroscience initiative: In the United States, the nonprofit Allen Institute for Brain Science began in 2003, and the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative launched in 2013. Japan, Israel, and China have also commenced massive brain-research programs.

But the HBP is unique in its focus on informatics.

The guts of the project are neuroinformatics and high-performance analytics and computing efforts. Teams focused on these areas plan to provide software for researchers to access, share, and analyze many types of brain data.

For instance, just as cancer researchers pool the genomes of hundreds of patients (thanks to the Human Genome Project) and then analyze them to detect genetic abnormalities in a certain tumor type, neuroscientists might gather and compare brain scans from dozens of labs to detect connectivity problems in a rare form of epilepsy.

Researchers will be able to collaborate online, as well as access data and tools, including the rodent and human brain atlases. Today, for example, researchers pick and choose cell-density numbers from journal articles. Using the neuroinformatics platform, however, they will be able to systematically pull up cell-density data for, say, the visual cortex at the back of the brain while simultaneously viewing anatomical images of the region, says Timo Dickscheid, group leader for big-data analytics at the Institute of Neuroscience and Medicine at Jülich.  
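
What might such a lookup look like in practice? Here is a minimal sketch of a programmatic query; the endpoint, parameters, and field names are hypothetical stand-ins invented for illustration, not the platform's actual interface.

```python
import requests  # third-party HTTP client

# Hypothetical atlas query. The URL and field names below are invented
# for illustration; they are not the HBP platform's real API.
BASE = "https://example.org/atlas/api"  # placeholder endpoint

resp = requests.get(
    f"{BASE}/cell-density",
    params={"region": "visual cortex", "species": "human"},
)
resp.raise_for_status()

for record in resp.json()["records"]:
    # Each record pairs a density estimate with the study it came from.
    print(record["layer"], record["cells_per_mm3"], record["source"])
```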

A central goal of the HBP is to allow researchers to combine diverse data sets in the hopes of answering key questions about the brain.

Eickhoff’s team has been building a brain atlas based not on old drawings but on imaging data: anatomical scans, connectivity profiles of different brain areas, and results from behavioral experiments during scanning. In their very first experiment, the team upended received wisdom: The dorsal premotor cortex, a region spread across the top of the brain, was thought to be a jack-of-all-trades, involved in motor preparation, spatial attention, and other things. By combining data on brain connectivity and human behavior, the researchers found that the region is a mosaic of distinct elements, with each one performing a different job.

Age and Memory: The hippocampus plays an important role in memory and is one of the brain regions most affected by senile dementia. New imaging methods show that this region’s internal structure differs between young adults (middle) and older adults (right). Images: Simon B. Eickhoff/Jülich Research Center

It’s not just the collection of new kinds of data that will enable such refinements but also the collating of existing data sets that are now incompatible. “The data is there, but it’s up to us to curate it,” Dickscheid says.

He and his colleagues are developing a multilevel atlas of the human brain—a “Google Brain,” if you will. Users will be able to zoom in or out on any particular area of the brain, from the folds of the brain’s outer gray matter to the electrified paths of individual neurons and back again.

Among the technical challenges in developing such a tool is that of spatially anchoring data across scales. Say you’re studying the amygdala, a structure buried deep in the temporal lobe thought to process emotions. With such anchoring, you could zoom down from the amygdala’s almond-shaped outline (based on MRI data) to the micrometer scale of individual neurons (produced by two-photon fluorescence microscopy).
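
One standard ingredient of such anchoring is a chain of coordinate transforms between reference spaces. The numpy sketch below shows the idea with a single affine transform from millimeter-scale MRI space into a micrometer-scale microscopy grid; the matrix and coordinates are invented for illustration, not taken from any real atlas.

```python
import numpy as np

# Map a point from a millimeter-scale MRI reference space into the
# pixel grid of a micrometer-scale microscopy section. The matrix and
# coordinates are invented for illustration.
A = np.array([
    [1000.0,    0.0,    0.0, 12000.0],  # scale mm -> um, then translate
    [   0.0, 1000.0,    0.0,  8000.0],
    [   0.0,    0.0, 1000.0,  3000.0],
    [   0.0,    0.0,    0.0,     1.0],
])

point_mri_mm = np.array([23.5, -4.2, -17.8, 1.0])  # homogeneous coords
x_um, y_um, z_um, _ = A @ point_mri_mm
print(f"microscopy coordinates: ({x_um:.0f}, {y_um:.0f}, {z_um:.0f}) um")
```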

Right now the greatest push in the development of the atlas is the classification and indexing of data. The seahorse-shaped hippocampus, a hub for memory consolidation, is one such problem area, Dickscheid notes. Researchers haven’t agreed on exactly where it begins and ends, and there are close to 20 different ways in the literature to name the various parts of the hippocampus. Those sorts of issues obviously need to be sorted out before an authoritative brain atlas can be created.
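
In software terms, that curation step amounts to mapping every label found in the literature onto one canonical parcellation. The toy sketch below shows the shape of the problem; the synonym sets are a tiny invented sample, not a real neuroanatomical ontology.

```python
# Map literature labels onto canonical region names. The synonym sets
# here are a tiny invented sample, not a real ontology.
CANONICAL = {
    "CA1": {"ca1", "cornu ammonis 1", "hippocampus ca1 field"},
    "DG": {"dg", "dentate gyrus", "fascia dentata"},
}

def canonicalize(label: str) -> str:
    """Return the canonical name for a literature label, if known."""
    key = label.strip().lower()
    for canonical, synonyms in CANONICAL.items():
        if key in synonyms:
            return canonical
    raise KeyError(f"unrecognized label: {label!r}")

print(canonicalize("Fascia dentata"))  # -> DG
```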

The HBP’s brain-simulation systems are also being equipped with tools to span the levels of the brain. Carloni leads a molecular-simulation group attempting to simulate molecular events at the level of two communicating neurons, such as protein interactions that occur when a memory is formed. To do so at a molecular level—which has not been previously done, for lack of technology, says Carloni—requires quantum mechanics and an arsenal of supercomputing-based simulation tools.

On the other end of the simulation spectrum, Markus Diesmann and an international consortium are working to improve the Neural Simulation Tool (NEST), which Diesmann and Marc-Oliver Gewaltig created in 1994, when they were graduate students, for brain-scale mathematical models. In 2013, Diesmann and his colleagues used NEST to model a second’s worth of activity in a network 1 percent the size of that of a human brain—roughly 1.73 billion nerve cells connected by 10.4 billion synapses. The experiment utilized 82,944 processors on the K supercomputer, in Japan.
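
NEST’s front end is Python, and a toy model conveys the flavor of such simulations, if not their scale. The sketch below uses a few hundred neurons rather than billions; model names follow NEST 3, and all parameters are invented for illustration.

```python
import nest  # NEST's Python interface (PyNEST)

nest.ResetKernel()

# A toy network: 500 leaky integrate-and-fire neurons driven by noise.
neurons = nest.Create("iaf_psc_alpha", 500)
noise = nest.Create("poisson_generator", params={"rate": 8000.0})
recorder = nest.Create("spike_recorder")  # "spike_detector" in NEST 2.x

nest.Connect(noise, neurons, syn_spec={"weight": 10.0})
nest.Connect(
    neurons, neurons,
    conn_spec={"rule": "fixed_indegree", "indegree": 50},  # random recurrence
    syn_spec={"weight": 1.0, "delay": 1.5},
)
nest.Connect(neurons, recorder)

nest.Simulate(1000.0)  # one second of biological time, in milliseconds
print("spikes recorded:", nest.GetStatus(recorder, "n_events")[0])
```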

“It showed us that this is doable,” says Diesmann, “but it also showed us the limitations.” For one, neuronal-network simulations remain orders of magnitude away from the whole human brain. Plus, the supercomputer took 40 minutes to simulate 1 second of biological time, a slowdown of roughly 2,400 to 1 that makes it impractical to re-create processes such as learning and memory, which unfold over hours or days. Yet thanks to the high-performance computing teams at the HBP, Diesmann is now developing faster algorithms for the simulations. He says a new and improved NEST should be available within two years.

Brain simulation could open up unimagined territory in both neuroscience and computing. For example, a human brain and a mouse brain consist of similar building blocks—a small piece of the tissue looks much the same under a microscope. But the human brain easily has 1,000 times as many neurons. So, what might a brainlike system that’s 10 times as large as a human brain be able to do?

Pilot testers are working on four of the HBP’s subprojects to ensure that the new tools are ones neuroscientists will actually want to use. The cautionary tale here is the United Kingdom’s CARMEN (Code Analysis Repository and Modelling for E-Neuroscience) portal, launched in 2008 for researchers to store and share electrophysiology data sets. Few researchers used it, and the information available through the portal quickly became out of date. “The plan in the HBP is to have people working on the infrastructure, but they are surrounded by neuroscientists to keep the infrastructure construction on track,” says Diesmann.

And let’s not forget one of the HBP’s commercial aspirations, which is to use insights from the brain to improve computing. “In the end, we want to develop new products out of this,” says Alois C. Knoll, chair of robotics and embedded systems at the Technical University of Munich.

The HBP’s neuromorphic computing efforts, for instance, expand on two previous brain-inspired computing tools, SpiNNaker and BrainScaleS [see “The Brain as Computer: Bad at Math, Good at Everything Else,” in this issue], which try to emulate the high-speed, low-energy computation of the brain.

The neurorobotics team, led by Knoll, is working to develop a cloud-based simulation environment where researchers can program and control virtual robots with nervous systems modeled on brains. Today, such a task is prohibitively complex, but using the simulation web interface, a user could decide on a robot body (mouse, please), pick an environment (a maze), choose a task for the robot (escape!), connect the robot to a brain (a theoretical neural network model, maybe, or a reconstruction based on experimental data), and run the experiment.
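
In code, such an experiment description might read something like the sketch below. The `nrp` module and everything in it are invented for illustration; this is not the platform’s actual interface.

```python
import nrp  # hypothetical client library, invented for illustration

# Assemble a virtual experiment: body, world, task, and brain model.
experiment = nrp.Experiment(
    body="mouse",                       # pick a robot body
    environment="maze",                 # pick an environment
    task="escape",                      # choose the task
    brain=nrp.NetworkModel("point_neurons"),  # or a data-driven reconstruction
)

results = experiment.run(duration_s=60)  # run on the cloud back end
print(results.summary())
```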

“Essentially, we are virtualizing robotics research,” says Knoll. He hopes such tools will speed up the development of new robots by “orders of magnitude.”

In March 2016, the HBP made prototypes of six of its tool platforms publicly available online. Some scientists look forward to using them: “Our work will sit quite happily on one of their platforms,” says Kennedy, who is not currently involved in the HBP.

Others are reserving judgment. People have tried and failed to develop resources for storing and sharing data in the neurosciences many times, says Alexandre Pouget, a computational neuroscientist at the University of Geneva, who is not involved in the HBP and who signed the public letter criticizing the project in 2014. “Trying to do it across all of neuroscience might be too ambitious, but we shall see.”

Electronic Brain in a Box: The JuQueen supercomputer is currently the workhorse at the Jülich Research Center. In the future, brain scientists will be able to work in real time, using more advanced machines; this should lead to a virtual but biologically realistic brain in silico. Photo: Jülich Research Center

Sheer computing muscle is one thing that won’t be a problem, says Boris Orth, the head of the High Performance Computing in Neuroscience division at the Jülich Supercomputing Center. Orth walks between the monolithic black racks of the JuQueen supercomputer, his ears muffled against the roar of cooling fans. This is one of the big machines that HBP researchers are using today. Jülich recently commissioned JURON and JULIA, two pilot supercomputers designed with extra memory, to help neuroscientists interact with a simulation as it runs.

Such systems could be eventually used to satisfy Markram’s original vision—a virtual but biologically realistic brain modeled in silico. But for now, Orth and his team are just trying to get neuroscientists to use the supercomputers. “It’s a challenge sometimes to really understand each other when we discuss requirements,” says Orth. “People think something should be very easy to do, when in fact it’s not.”

Still, many are taking advantage of the facilities, at Jülich and elsewhere. The simulation team led by Diesmann has been successfully running NEST on the Jülich supercomputers for years. The neurorobotics team is running its visualizations on a small server farm in Geneva, but it plans to move soon to the Swiss National Supercomputer Center in Lugano.

The HBP is scheduled to end in 2023, 10 years after it began. “We don’t think brain research will be over when the Human Brain Project comes to an end,” says Knoll with a laugh. And Amunts, Knoll, and others hope the project will have a life after death: The scientific directorate of the project has submitted an application to make the HBP an independent legal entity capable of gathering funds on its own.

If that were to happen, the HBP could become a brick-and-mortar hub for advanced neuroscience research—a CERN for the brain, if you will. Maybe then, like the discovery of the Higgs boson or the creation of antimatter, we’ll begin to make long-anticipated discoveries about ourselves.
