Your Brain Is Vulnerable to Hacking. Companies Are Exploiting That.
We can't seem to resist frequent rewards, which is why slot machines and social media are both so addictive. What's more, they're designed that way, purposefully, to keep you coming back.
Tristan Harris is a design thinker, philosopher and entrepreneur.
Called “the closest thing Silicon Valley has to a conscience” by The Atlantic, Tristan Harris was a Design Ethicist at Google and is now a leader in Time Well Spent, a movement to align technology with our humanity. Time Well Spent aims to heighten consumer awareness of how technology shapes our minds, empower consumers with better ways to use technology, and change business incentives and design practices to align with humanity’s best interests.
Tristan is an avid researcher of what influences human behavior, beliefs and interpersonal dynamics, drawing on insights from sleight of hand magic and hypnosis to cults and behavioral economics. Currently he is developing a framework for ethical influence, especially as it relates to the moral responsibility of technology companies.
His work has been featured on PBS NewsHour, The Atlantic Magazine, ReCode, TED, 1843 Economist Magazine, Wired, NYTimes, Der Spiegel, NY Review of Books, Rue89 and more.
Previously, Tristan was CEO of Apture, which Google acquired in 2011. Apture enabled millions of users to get instant, on-the-fly explanations across a publisher network of a billion page views per month.
Tristan holds several patents from his work at Apple, Wikia, Apture and Google. He graduated from Stanford University with a degree in Computer Science, focused on Human Computer Interaction, while dabbling in behavioral economics, social psychology, behavior change and habit formation in Professor BJ Fogg’s Stanford Persuasive Technology lab. He was rated #16 in Inc Magazine’s Top 30 Entrepreneurs Under 30 in 2009.
You can read his most popular essay: How Technology Hijacks People’s Minds – from a Magician and Google’s Design Ethicist.
Tristan Harris: One thing we don't talk about is that—it's sort of hard to talk about this—our minds have these kinds of back doors.
There's kind of—if you're human and you wake up and you open your eyes there is a certain set of dimensions to your experience that can be manipulated.
When I was a kid I was a magician, and you learn all about these limits: that short-term memory is about this long and there's different reaction times, and if you ask people certain questions in certain ways you can control the answer. And this is just the structure of being human. To be human means that you are persuadable in every single moment.
I mean the thing about magic, as an example, is that magic works on everybody—sleight of hand, right?
It doesn't matter what language you speak, it doesn't matter how intelligent you are, it's not about what someone knows. It's about how your mind actually works.
So knowing this, it turns out that there's this whole playbook of persuasive techniques that actually I learned when I was at the Stanford Persuasive Technology Lab and that most people in Silicon Valley in the tech industry learned as ways of getting your attention.
So one example is: we are all vulnerable to social approval. We really care what other people think of us. So for example, when you upload a new profile photo of yourself on Facebook, that's a moment where our mind is very vulnerable to knowing, “what do other people think of my new profile photo?”
And so when we get new likes on our profile photo, Facebook—knowing this—could actually message me and say, “oh, you have new likes on your profile photo.” And it knows that we'll be vulnerable to that moment because we all really care about when we're tagged in a photo or when we have a new profile photo.
And the thing is that the technology companies control the dial for when and how long your profile photo shows up in other people's newsfeeds. So they can orchestrate it, for example, so that other people end up liking your profile photo spread out over a delayed period of time, so that you end up coming back more frequently to see what the new likes are.
And the problem is that they don't do this because they're evil, they do it because, again, they're in this race for our attention.
And we should also ask: is that necessarily such a bad thing if they're orchestrating it so that other people like my photo? I mean that might feel good to me.
So we have to have a new conversation about, as these technology companies use these techniques, these vulnerabilities in our minds, when is that actually aligned and good for us? When is that ethical? When is that honest? When is that fair? And when is that dishonest and unfair? Because they're actually manipulating our minds in a way that doesn't add up to our spending our time well on the screen.
Well, so another vulnerability in our mind is something called a variable schedule of rewards, and that's like a slot machine in Las Vegas. It turns out that slot machines make more money in the United States than baseball, movies and theme parks combined.
People become addicted to slot machines, I think, two to three times faster than to any other kind of gambling in a casino. So it's insane. And why is that?
Because it's very simple: you just pull a lever, and sometimes you get a reward and sometimes you don't. And the more random it is and the more variable it is the more addictive it becomes.
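The slot-machine mechanic described above can be sketched as a tiny simulation. This is a minimal illustration, not anything from the transcript: the payout probability and session length are made-up numbers, chosen only to show that each pull is an independent, unpredictable gamble.

```python
import random

def play_session(pulls, reward_probability=0.3, seed=42):
    """Simulate a variable-ratio reward schedule: each pull pays off
    with the same fixed probability, so the player can never predict
    which pull will be rewarded. A fixed seed makes the run repeatable."""
    rng = random.Random(seed)
    rewards = 0
    for _ in range(pulls):
        # Every "check" is a fresh, independent gamble -- the
        # unpredictability itself is what keeps people pulling.
        if rng.random() < reward_probability:
            rewards += 1
    return rewards
```

On a fixed schedule (say, a guaranteed reward every tenth pull) the outcome is predictable; it is the randomness and variability, not the size of the reward, that the speaker identifies as what makes the pattern addictive.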
And the thing is that this turns our phone into a slot machine, because every time we check our phone we're playing the slot machine to see, “What did I get?”
Every time that we check our email, we're playing the slot machine to see, “What did I get? Did I get invited to an interview at Big Think or did I just get another newsletter?”
Or if you're on a dating app like Tinder, each swipe is playing the slot machine to see, “Did I get a match?”
And the problem is that this dynamic, these variable schedule rewards or this slot machine mechanic, is so powerful that it's the best thing at addicting people and putting you in the zone.
One of the original designers of the Facebook newsfeed told me that the thing that made the newsfeed work at the very, very, very beginning back in 2006 was in part a hardware innovation.
And I looked at her and I said, "What do you mean?" And she said it was actually the scroll wheel on a mouse, because with the scroll wheel your hand never has to leave its resting position—you just scroll to see the next thing. Before that, you had to click the down button or move your mouse and drag the scroll bar to move down the page. With a scroll wheel, or two fingers on a trackpad, your hand never has to leave its resting position, and it's more like a slot machine: you can just keep playing, just like in Vegas where the button is right there. In fact, they changed it in Vegas—it used to be a lever, and now it's just a button—because they found it's easier to get people to keep playing that way.
So how much of what's on our phones works this way? When we use our technology, Instagram is a slot machine: what's going to come next in the feed? Snapchat is a slot machine: each time you see the red notification and you don't know what's behind it, you're playing the slot machine when you tap on it to see “what did I get?”
And so it's sprinkled all throughout these products because it's a very compelling way of getting people's attention.
Casinos, magicians, and the makers of social media platforms all know something about you: your mind is very vulnerable to influence. Just as the magician relies on limitations in your short-term memory or visual acuity to accomplish sleight of hand, software engineers leverage the limits of your mind to make their products addictive. From the sonorous ping of mobile phones to Facebook's highly nuanced algorithm, product makers understand that frequent reward is what keeps you coming back. And just like with slot machines, the easier those rewards are to access, the more frequently we'll want them.