The Quest to Measure the Brain's Response to Urban Design

"It's the holy grail for architects."

(Zachary Tyler Newton, courtesy Van Alen Institute)

So there I was, walking down familiar cobbled streets in Brooklyn’s Dumbo neighborhood, trying not to feel self-conscious even though I resembled a character in a 1990s sci-fi rendering of The Future. Not only was I wearing an EEG device that looked like a bulky phone headset, with a sensor positioned squarely in the middle of my forehead. I was also dutifully walking, as instructed, at a “museum pace,” and maintaining what we had been told to think of as “robot torso.” That meant that when I saw something that interested me – a display in a shop window, a piece of graffiti, a wan and shivering model being photographed in the middle of the street – I was supposed to turn my whole body to look at it C-3PO–style, pointing the iPod Touch in my hand in the direction of my gaze.

I felt a little ridiculous. But people in a trendy New York neighborhood like Dumbo are careful not to act surprised at anything weird-looking. That would be uncool. So no one paid much attention to me or to the dozens of other headset-wearing people wandering slowly and robotically around the neighborhood that same chilly afternoon.

Anyway, I told myself, it was all for science. Or at least for art.

Participants in the Van Alen Institute's urban landscape brain mapping project roam the streets of New York's Dumbo neighborhood. (Zachary Tyler Newton, courtesy Van Alen Institute).

We were participating in a little experiment trying to answer the question, “How does the brain respond to the city?” The headsets recorded second-by-second readings of our brain waves, streaming them via Bluetooth to an app on the iPod. The resulting gigabyte of data, gathered from about 50 participants, will be aggregated into a visualization to be presented May 13 at ISSUE Project Room in Brooklyn. It’s part of the Van Alen Institute’s multiyear “Elsewhere: Escape and the Urban Landscape” project.
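For the technically curious, the recording side of the experiment reduces to something quite simple: log one reading per second, per participant. Here is a minimal Python sketch of that loop; the field names, the 0–100 score scale, and the fake data source are my own illustrative assumptions, since neither the NeuroSky stream format nor Cloud Lab's app was described in that much detail.

```python
import csv
import time

def fake_headset_stream(seconds=5):
    """Stand-in for the headset's Bluetooth feed. The real device's
    output format isn't described here, so these fields are assumptions."""
    for _ in range(seconds):
        yield {"attention": 55, "meditation": 40}  # placeholder 0-100 scores
        time.sleep(1)  # one reading per second

def record_walk(stream, path="walk_log.csv"):
    """Append one timestamped row per reading to a CSV log."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "attention", "meditation"])
        for reading in stream:
            writer.writerow([time.time(), reading["attention"], reading["meditation"]])

record_walk(fake_headset_stream())
```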

One of the main people behind the brain-imaging experiment is Mark Collins, an architect, programmer, and professor with Columbia University’s Cloud Lab. Collins and his colleagues have been playing around with the rapidly evolving and increasingly mobile technology that allows us to monitor our brain waves, hoping to harness the resulting data to better comprehend how human beings interact with their urban and architectural environments. Most recently, they’ve been working with the relatively inexpensive technology we all were wearing in Dumbo, EEG biosensors from a company called NeuroSky.

Such “brain-computer interfaces,” or BCIs, will potentially allow designers to see the effect of their work on the people who use it in a radically new way. “It’s the holy grail for architects, who are trying to be empathetic and really understand what people’s experience is,” says Collins. Along with his colleague Toru Hasegawa, the director of Cloud Lab, Collins has been trying to figure out good ways to do that, even as BCI technology changes from month to month. “It’s an incredible moment in the history of technology,” says Collins. “We thought architects should project themselves into that.” Things are moving so quickly in the world of wearable computing devices and biosensors, he acknowledges, that it may be impossible to stay ahead. “We’re all playing catch-up,” he says. “It’s maybe not even something to be caught up to.”

The NeuroSky device we were using in Dumbo takes readings of the brain’s electrical activity as it is transmitted to the body’s surface – specifically, in this case, the forehead. An algorithm then takes those readings of beta, theta, delta, and other waves, and summarizes them into two general states – attentive and meditative. The idea behind the visualization will be to “spray” this data onto a 3D map of the neighborhood we walked around and see what it reveals about the mental state of the experiment’s participants as they moved through space.
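To make the "spraying" concrete, here is a rough Python sketch of how per-second summary scores might be labeled and binned onto a map grid before visualization. The Reading fields, the attentive-versus-meditative tie-break, and the majority-vote rule per map cell are all my own assumptions for illustration, not NeuroSky's or Cloud Lab's actual method.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    """One second of data: where the walker was, plus the two summary scores."""
    timestamp: float  # seconds since the walk began
    lat: float        # position, e.g. from the handheld device
    lon: float
    attention: int    # 0-100 summary score (assumed scale)
    meditation: int   # 0-100 summary score (assumed scale)

def dominant_state(r: Reading) -> str:
    """Label a reading by whichever summary score is stronger."""
    return "attentive" if r.attention >= r.meditation else "meditative"

def bin_by_location(readings, precision=4):
    """Round coordinates into rough map cells, then give each cell
    the majority label of the readings that fell inside it."""
    cells = {}
    for r in readings:
        key = (round(r.lat, precision), round(r.lon, precision))
        cells.setdefault(key, []).append(dominant_state(r))
    return {key: max(set(labels), key=labels.count) for key, labels in cells.items()}

# Three seconds of (made-up) readings at one spot near the Manhattan Bridge:
walk = [
    Reading(0.0, 40.7033, -73.9881, 70, 30),
    Reading(1.0, 40.7033, -73.9881, 65, 40),
    Reading(2.0, 40.7033, -73.9881, 20, 80),
]
print(bin_by_location(walk))  # {(40.7033, -73.9881): 'attentive'}
```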

“We’re creating a new kind of camera,” Collins told the participants before we set out “into the wild” to start recording our reactions. “It’s a camera for mental activity. We wanted to really train that mental camera on a specific environment. Each and every one of you is a pixel in our digital camera.”

Participants receive training at the Van Alen Institute before heading out to gather data on their brain waves. (Zachary Tyler Newton, courtesy Van Alen Institute).

The researchers selected a several-block-square area of Dumbo because the neighborhood contains several different types of urban settings in a kind of microcosm. There are big, loud pieces of infrastructure, such as the Manhattan Bridge; narrow cobbled streets with boutiques and galleries; a public waterfront park; and quieter residential and office blocks. Not all of us walked every part of the area under consideration. Different groups took different routes, all with the help of guides to keep us on track, prevent us from tripping, and troubleshoot any hardware or software problems.

An earlier experiment, with a single user walking around Lincoln Center, yielded a data visualization that the team is using as a prototype. Collins told me that it reflected an interesting result: when the subject was in parts of the Lincoln Center plaza that are more open to the city’s streets, he recorded more “meditative” brain waves; when he was in the more enclosed and architecturally circumscribed, ultramodernist part of the campus, his response was more attentive.

Collins says that by its nature, the Dumbo data-gathering effort was not a rigorous scientific experiment, but more of a large-scale art project. The conditions were obviously anything but laboratory-controlled. “We had to embrace the noise,” he says. “In a sense, we’re embracing everything [neuroscience researchers] are trying to remove.” Still, he says, when he discusses his work with neuroscientists, “at first they’re laughing. And then they’re saying, ‘Hey, what’s happening here?’”

There are numerous technological challenges, but it’s only a matter of time, Collins predicts, before readings from the next, more sophisticated generations of BCI devices reliably give us “another layer of data” to consider when we design cities or neighborhoods or when we make decisions about the urban experiences we want to pursue. Ultimately, he imagines data visualizations that users will be able to explore in order to learn about the texture of urban experience, whether as creators or consumers.

Interacting with the devices that generate that data, Collins thinks, will become increasingly common, just as wearing a fitness tracker is no longer unusual. But BCIs require more active participation from users than do devices that monitor heart rate or movement. Presumably, devices like Google Glass could make “participating” easier, but wearing them can make you conspicuous in a not-so-great way.

A data visualization of Cloud Lab's Lincoln Center experiment.

The other day in Dumbo, I avoided looking at the app display where my readings were showing up while I walked, for fear of skewing the results (or getting hit by a car). So I wasn’t able to see whether I was in meditative mode or attentive mode at any given point – for instance, when I watched a well-groomed King Charles spaniel pee all over its own fluffy leg, much to the consternation of its equally well-groomed owner. I did get an Excel spreadsheet of my own data, but on top of not being a neuroscientist, I am terrible with spreadsheets. So I’m looking forward to seeing whether Collins and his team can make sense of all those data points with a good visualization.

I think he’s probably right when he says that interacting with BCIs is going to be a more common part of life in the near future, even if the backlash to the most ostentatious mobile computer, Google Glass, continues. We may not have Her-style affairs with our devices, but we may find ourselves bonding with them in ways we don’t expect.

“It’s beyond sci-fi how these things are evolving together,” says Collins. “It’s not a human becoming like a computer or a computer becoming like a human. It’s a participatory framework between the two, and each becomes a little more like the other.”

The “How Does the Brain Respond to the City?” tech demo and conversation will take place Tuesday, May 13, from 7 pm to 9 pm at ISSUE Project Room, 22 Boerum Place, Brooklyn, NY. Ticket information here.

About the Author

  • Sarah Goodyear has written about cities for a variety of publications, including Grist and Streetsblog. She lives in Brooklyn.