Sam Wang is an associate professor in the Department of Molecular Biology and the Princeton Neuroscience Institute. Wang grew up in California and studied physics at the California Institute of Technology. Seeking[…]

Sam Wang says Google presents a trade-off between rapid access to knowledge and knowledge retention.

Sam Wang: I don’t know the answer to the question, “Is Google making us stupid?” That’s a funny one.

I have to admit that I’m kind of stodgy. I am a big fan of learning factual knowledge and having those facts at the ready. And I think that’s important.

And one thing I have very much noticed is that this is something that is less emphasized in my students, and I think that’s going to continue in the future.

Getting back to the concept of fluid intelligence and g: modern society for the last hundred years has increasingly rewarded reasoning skills and the manipulation of knowledge, as opposed to the collection of knowledge.

I think that Facebook and Google are just the most recent manifestations of that increasing trend. My own view is that I think there’s a lot of value in the learning of factual knowledge. And I think that that is something that’s being lost.

But the thing that’s gained is that at the press of a key, you can get facts and start to manipulate them and turn them into something new. I think there’s a large gain there, and I don’t know how to trade off those two things.

 

Sam Wang: Videogames have the potential to improve cognitive function.

There’s a researcher, Daphne Bavelier, who is very interested in the effects of videogames. So, for instance, one kind of thing that seems to occur is that videogames improve hand-eye coordination. And that’s a fairly obvious thing.

Depending on the type of videogame, say, for instance, a complex game like SimCity, that can lead to improvements in task switching. It seems likely, although I don’t think it’s been exactly demonstrated, that videogames might improve multitasking abilities. There are limits to multitasking. Basically, multitasking, in most cases, leads to decreased performance in both tasks, but at the same time, it’s something that’s potentially trainable by videogames.

Question: Can computers outmatch the human brain?

Sam Wang: I think that what [Ray] Kurzweil does is take existing discoveries, whether in computer science or neuroscience, and extrapolate them to the maximum. So if you look at “The Singularity Is Near,” he does a very good job of describing current neuroscientific discoveries by my friends and colleagues and describing what’s been done. And then he takes those and drives them up to the maximum possible interpretation.

So maybe we can observe a single synaptic connection being formed or breaking. That’s where the state of the art is. But that doesn’t mean that we can map an entire brain or copy a brain. That will be a massive technological challenge. So that’s just one example of being a little bit too optimistic.

Another practical problem is that computational power is increasing tremendously, and he calls upon that idea. But the fact of the matter is that energy efficiency isn’t increasing quite enough to keep pace. And so, for instance, he likes to cite the year 2020 as a time when we can have something about the size of a brain that will compute as well as a brain does. But it looks to me, looking at those trends, like the amount of power such a thing would consume would be about as much as is consumed by all of Washington, D.C. So there’s a practical issue there.

Now, if brain function is ever replicated in any meaningful way, it’s probably going to rely heavily on the idea that we’re going to start understanding the heuristics and the non-algorithmic principles underlying how the brain works: looking at real brains, for instance, looking at how we use emotions to guide decisions, or looking at the shortcuts that actual brains use to make decisions and to assign value to things. Those things are very much the meat and potatoes of current neuroscience and where neuroscience is headed.

So it’s possible to imagine using those principles to make better computing devices. But as for replacing our brains by the year 2020, let’s just say I’ll be willing to place a very large wager against that.

 

Recorded on: April 24, 2009.

 

