“It’s beyond sci-fi how these things are evolving together,” says Collins. “It’s not a human becoming like a computer or a computer becoming like a human. It’s a participatory framework between the two, and each becomes a little more like the other.”
Collins and his colleagues have been experimenting with the rapidly evolving and increasingly mobile technology that allows us to monitor our brain waves, hoping to harness the resulting data to better comprehend how human beings interact with their urban and architectural environments. Most recently, they’ve been working with the relatively inexpensive technology we were all wearing in Dumbo: EEG biosensors from a company called NeuroSky.
Such “brain-computer interfaces,” or BCIs, could allow designers to see the effect of their work on the people who use it in a radically new way. “It’s the holy grail for architects, who are trying to be empathetic and really understand what people’s experience is,” says Collins. Along with his colleague Toru Hasegawa, the director of Cloud Lab, Collins has been trying to figure out good ways to do that, even as BCI technology changes from month to month. “It’s an incredible moment in the history of technology,” says Collins. “We thought architects should project themselves into that.” Things are moving so quickly in the world of wearable computing devices and biosensors, he acknowledges, that it may be impossible to stay ahead. “We’re all playing catch-up,” he says. “It’s maybe not even something to be caught up to.”
The NeuroSky device we were using in Dumbo takes readings of the brain’s electrical activity as it is transmitted to the body’s surface – specifically, in this case, the forehead. An algorithm then takes those readings of beta, theta, delta, and other waves, and summarizes them into two general states – attentive and meditative. The idea behind the visualization is to “spray” this data onto a 3D map of the neighborhood we walked around and see what it reveals about the mental state of the experiment’s participants as they moved through space.
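The article doesn’t describe Cloud Lab’s actual software, but the “spray” step it mentions – tagging each walker’s attention score with a location and aggregating the readings over the neighborhood – can be sketched roughly as follows. This is a minimal illustration, assuming hypothetical `(lat, lon, attention)` samples on NeuroSky’s 0–100 attention scale and a simple grid-averaging scheme; the function name and parameters are invented for the example.

```python
from collections import defaultdict

def spray_onto_grid(samples, cell_size=0.001):
    """Average per-walker attention readings into map-grid cells.

    samples: iterable of (lat, lon, attention) tuples, where attention
        is a 0-100 score of the kind NeuroSky's headsets report.
    cell_size: grid resolution in degrees (0.001 is roughly 100 m).
    Returns {(cell_lat, cell_lon): mean attention} for visualization.
    """
    sums = defaultdict(float)
    counts = defaultdict(int)
    for lat, lon, attention in samples:
        # Snap each reading to a grid cell, like a pixel on the map.
        cell = (round(lat / cell_size), round(lon / cell_size))
        sums[cell] += attention
        counts[cell] += 1
    return {cell: sums[cell] / counts[cell] for cell in sums}

# Hypothetical readings from walkers passing through two spots in Dumbo.
walk = [
    (40.7033, -73.9881, 60),
    (40.7033, -73.9881, 80),
    (40.7041, -73.9873, 40),
]
heatmap = spray_onto_grid(walk)  # two cells: one averaging 70, one 40
```

A real version would also smooth across neighboring cells and keep the attentive and meditative channels separate, but the core idea is the same: each participant contributes point samples, and the aggregate becomes the “photograph” of the neighborhood’s mental activity.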
“We’re creating a new kind of camera,” Collins told the participants before we set out “into the wild” to start recording our reactions. “It’s a camera for mental activity. We wanted to really train that mental camera on a specific environment. Each and every one of you is a pixel in our digital camera.”