
The Allen Institute Takes A Look At How We See

Christof Koch and neuroscientists at the Allen Institute will lead a $300 million effort to map visual processing in the mouse brain.

What's the difference between seeing my mother's face and imagining it when she's gone? Why are there moments when, for the life of me, I can't remember what she looks like? When I see a flat picture of her, does my brain fill in bits that the camera didn't capture? How many features of her face would someone have to change for me not to recognize her?

The good news is that there are scientists who are trying to answer these questions. The bad news: most of them are actually looking at only a tiny part of the picture; they're doing it in labs that are spread out across the planet; and they're working under the aegis of universities and foundations whose primary goal is to keep the money flowing. In this environment, competition often trumps collaboration.

Since 2003, the Allen Institute for Brain Science has embraced a different model: collect a squad of really smart people under one roof and have them all work on the same problem. Starting this year, they'll begin looking at how mammals see.

The approach has certainly worked for them in the past. In 2004, IEEE Spectrum called the Allen Institute's Brain Atlas a winner, and eight years later, it looks like we got it right. The project, which mapped the expression patterns for 20,000 genes in the mouse and human brain, has become a powerful, publicly accessible tool for neuroscientists in every field. "It’s absolutely fantastic," says Christof Koch, the recently appointed chief scientist (slash philosopher, consciousness-columnist, and color-maven) at the institute.
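
That public tool is still queryable: the atlas data are served through a web API at api.brain-map.org. As a minimal sketch, assuming only Python's standard library and an arbitrary example gene (Bdnf), listing the mouse in-situ hybridization experiments behind the expression maps might look like this:

```python
# Minimal sketch: list Allen Brain Atlas mouse in-situ hybridization
# experiments for one gene via the public RMA query API. The gene
# (Bdnf) and the fields printed here are illustrative choices.
import json
import urllib.request

API = "http://api.brain-map.org/api/v2/data/query.json"
criteria = (
    "model::SectionDataSet,"
    "rma::criteria,genes[acronym$eq'Bdnf'],"
    "products[abbreviation$eq'Mouse']"
)

with urllib.request.urlopen(f"{API}?criteria={criteria}") as resp:
    payload = json.load(resp)

# Each record describes one imaging experiment; fields vary by model,
# so unknown ones are read defensively with .get().
for dataset in payload.get("msg", []):
    print(dataset["id"], dataset.get("plane_of_section_id"))
```

Each returned record identifies one experiment whose section images and expression data can then be fetched through the API's related endpoints.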

With a US $300 million gift from Paul G. Allen, the Institute's namesake, at his disposal, Koch will lead an effort to decode how the brain processes visual information, an initiative he described in an interview for Nature. For the last 25 years, Koch has studied consciousness at the California Institute of Technology (occasionally sharing his insights with this magazine). He sees clear overlap between that work and the Allen Institute's mission. As he explained in Nature, "We've used every technique there is in volunteers and patients. But to understand consciousness, we need to be able to image the activity of millions of individual neurons at the same time."

To gain a better understanding of how mammals see, the institute will use emerging optogenetic techniques, in which targeted neurons are genetically engineered to express light-sensitive proteins so that researchers can switch those cells on and off with pulses of light. The tools have finally come far enough, Koch explains, that experiments can produce data indicating causation rather than just correlation.
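
To make the causation point concrete, here is a toy simulation (an illustrative sketch with arbitrary parameters, not anything from the institute's actual experiments): a light-gated current drives a simple leaky integrate-and-fire model neuron, so it fires only while the experimenter's light is on.

```python
# Toy illustration of the optogenetic logic: a light-gated current is
# injected into a leaky integrate-and-fire neuron only while the
# "laser" is on, so the experimenter controls firing directly
# (causation) instead of merely observing it. Parameters are arbitrary.
import numpy as np

dt, t_max = 0.1, 200.0            # time step and duration, ms
t = np.arange(0.0, t_max, dt)
light_on = (t > 50) & (t < 150)   # a 100 ms light pulse

v, v_rest, v_thresh, tau = -70.0, -70.0, -55.0, 10.0  # mV, mV, mV, ms
i_opto = 20.0                     # light-gated drive, mV equivalent
spikes = []

for step, now in enumerate(t):
    drive = i_opto if light_on[step] else 0.0
    v += dt * (-(v - v_rest) + drive) / tau
    if v >= v_thresh:             # spike, then reset to rest
        spikes.append(now)
        v = v_rest

print(f"{len(spikes)} spikes; all during the light pulse:",
      all(50 < s < 150 for s in spikes))
```

The real technique rests on the same reasoning: because the experimenter controls the input, any resulting activity can be attributed to it.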

But why vision? Why not smell, or taste, or touch? Koch says you could study any of the senses and come away having learned something about consciousness. What we feel, how we form language, how we learn: it's all informed by sensation. But vision is the most direct, and therefore probably the easiest to study in animals. "You open your eyes and you see something," explains Koch. Researchers also have ways to play tricks with vision, the same ones that magicians use, which helps when designing experiments.

For this next push, the Allen Institute plans to assign its researchers to task forces (what they are calling "brain observatories"), with some looking at anatomy, others taking electrical recordings, and still others building computer models of the mouse cortex. But all of them will work next to each other and with each other. "We want to focus all these observatories on one piece of organic matter," says Koch.

With $300 million and that kind of mentality, it's as close as neuroscience gets to shock and awe.

Are You Ready for Workplace Brain Scanning?

Extracting and using brain data will make workers happier and more productive, backers say

A photo collage shows a man wearing an EEG headset while looking at a computer screen. Illustration: Nadia Radic

Get ready: Neurotechnology is coming to the workplace. Neural sensors are now reliable and affordable enough to support commercial pilot projects that extract productivity-enhancing data from workers’ brains. These projects aren’t confined to specialized workplaces; they’re also happening in offices, factories, farms, and airports. The companies and people behind these neurotech devices are certain that they will improve our lives. But there are serious questions about whether work should be organized around certain functions of the brain, rather than the person as a whole.

To be clear, the kind of neurotech that’s currently available is nowhere close to reading minds. Sensors detect electrical activity across different areas of the brain, and the patterns in that activity can be broadly correlated with different feelings or physiological responses, such as stress, focus, or a reaction to external stimuli. These data can be exploited to make workers more efficient—and, proponents of the technology say, to make them happier. Two of the most interesting innovators in this field are the Israel-based startup InnerEye, which aims to give workers superhuman abilities, and Emotiv, a Silicon Valley neurotech company that’s bringing a brain-tracking wearable to office workers, including those working remotely.
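
To give a sense of what "broadly correlated" means in practice, here is a hedged sketch (synthetic data and standard scientific-Python tools; this is not Emotiv's or InnerEye's actual pipeline): products in this space typically start from band-power features, such as relative alpha power computed from the EEG with Welch's method.

```python
# Sketch of a basic EEG band-power feature: relative alpha-band
# (8-12 Hz) power from one channel, estimated with Welch's method.
# The data are synthetic, and any "focus" reading built on such a
# number is a heuristic, not a validated measure of mental state.
import numpy as np
from scipy.signal import welch

fs = 256                                   # sample rate, Hz
t = np.arange(0, 10, 1 / fs)               # 10 s of signal
rng = np.random.default_rng(0)
# Synthetic EEG: a 10 Hz alpha rhythm buried in broadband noise.
eeg = np.sin(2 * np.pi * 10 * t) + rng.normal(0.0, 1.0, t.size)

freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
band = (freqs >= 8) & (freqs <= 12)
alpha_ratio = psd[band].sum() / psd.sum()
print(f"relative alpha power: {alpha_ratio:.2f}")
```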
