What’s the Big Idea?
David Eagleman’s Laboratory for Perception is located on the ground floor of Baylor College of Medicine, but the vibe is more creative think tank than clinical academic enclave. The walls are enamelled in dry-erase paint and marked up with impromptu sketches, arrows, and words like SYNESTHESIA spelled out in childish block letters. Coffee mugs are covered with to-do lists. There’s not a single white coat in sight.
As director of the lab, Eagleman hopes to gain insight into how the brain creates reality, and his methods are as unconventional as the space in which he and his team work.
To conduct the first-ever systematic study of synesthesia, they created an online test for the neurological condition (which causes people to perceive numbers, letters, and sometimes pieces of music as inherently colored, e.g. “one is green, two is pink, three is a purplish-blue”). The test went viral and gave them instant access to a global pool of synesthetes. To determine whether time slows down during life-threatening events or only seems to, Eagleman and his research assistant strapped themselves into harnesses and jumped off the roofs of tall buildings.
What’s the Significance?
If Descartes had built a theme park, it would probably look a lot like this. But the presiding philosophy of the Laboratory for Perception is ultimately informed more by the possibilities of the future than by the past. Eagleman is fascinated by the idea that we could integrate technology into human biology to enhance our sensory perception of the world, broadening and deepening our reality.
“As it stands now, as biological creatures, we only see a very small strip of what’s going on,” he said in a recent interview with Big Think. “Take electromagnetic radiation: there’s a little strip of that that we can see… [but] the rest of the spectrum — radio waves, television, cell phone signals, gamma rays, x-rays — is invisible to us because we don’t have biological receptors for it. CNN is passing through your body right now and you don’t know it because you don’t have the right receptors for it.”
Eagleman and his team are currently at work on a vibratory vest that delivers sensory information through the skin rather than through the usual channels (the eyes or ears). The technology could enable deaf people to hear: a microphone picks up the auditory stream, which becomes a matrix of vibrations on the skin, sending electrical signals to the brain that represent auditory information.
“If it sounds crazy that you would ever be able to understand all these signals through your skin, remember that all the auditory system is doing is taking signals and turning them into electrical signals in your brain,” he says. “It doesn’t matter how you get those data streams there.” In the future, other kinds of data could be fed into the vest, meaning that people could walk around unconsciously perceiving the weather report.
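To make the idea concrete, here is a minimal sketch of how an audio stream might be turned into a frame of motor intensities for such a vest. This is purely illustrative: the motor count, frequency-band layout, and scaling are assumptions for the example, not details of Eagleman’s actual device.

```python
import numpy as np

def audio_to_vibration_frame(samples, n_motors=16):
    """Map one chunk of audio samples to per-motor vibration intensities.

    Toy version of the sensory-substitution idea: split the sound into
    frequency bands and let each band's average energy drive one motor.
    All parameters here are illustrative assumptions, not the lab's design.
    """
    spectrum = np.abs(np.fft.rfft(samples))      # magnitude spectrum of the chunk
    bands = np.array_split(spectrum, n_motors)   # one frequency band per motor
    energies = np.array([band.mean() for band in bands])
    peak = energies.max()
    if peak == 0:                                # silence: all motors off
        return np.zeros(n_motors)
    return energies / peak                       # normalize intensities to [0, 1]

# Example: a pure 440 Hz tone (0.1 s at an assumed 8 kHz sample rate)
# concentrates its energy in the low-frequency motors.
t = np.arange(0, 0.1, 1 / 8000)
frame = audio_to_vibration_frame(np.sin(2 * np.pi * 440 * t))
```

The point of the sketch is the pipeline, not the parameters: any data stream that can be reduced to a small vector of intensities per time step could, in principle, be delivered to the skin the same way.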
There’s no limit to the possibilities, and nature provides neuroscientists with a constant source of inspiration: “Snakes see in the infrared range and honey bees see in the ultraviolet range. There’s no reason why we can’t start building devices to see that and feed it directly into our brains.”