
Personal Autonomy Is Evaporating. Should We Care?

Once upon a time, a car was an industrial machine you climbed in and drove around. Today, it's also a tracking and nudging machine that second-guesses you for your own good. It reminds you to fasten your seat belt and makes sure you don't lock yourself out, and it contains a "black box," much like a jetliner's, that records direction, speed, seatbelt position and other details. Soon, cars will go beyond giving advice, and drive themselves (already legal in three states). Eventually they will be so good at driving that (as Gary Marcus has noted) it will be illegal and immoral for you to take the wheel. Meanwhile the "Internet of Things" will have knitted that future car into a network of other devices and apps—the retailers that sell you stuff, the house whose thermostats, smoke detectors and appliances know you inside and out. (That's the "conscious home" that Nest is working toward, as its founder wrote Monday in a post on the startup's acquisition by Google.)

Personal tech is just the shiny edge of a broader change. We are heading quickly toward an "Other Knows Best" world, in which everything and everybody second-guesses you for your own good. That may be a world of easier shopping and friction-free government, better health and safer lives (thank you, surveillance cameras). It will certainly be a world of sharply reduced personal autonomy.

Autonomy is often described as personal self-government, or "the condition of being self-directed," as the philosopher Marina Oshana has put it. But after your car drives itself, your refrigerator tracks the milk and egg supply, City Hall imperceptibly nudges you towards the "right choice," your favorite stores tell you what you want to buy, and you have outsourced your willpower to apps and wearable gadgets that tell you to eat salad and go to the gym—what is left for you to direct? The scope of self-government is shrinking, and that is going to alter people's relationship with the state, with business, and with each other.

Already, businesses, with more knowledge about you on their servers than could ever fit in your own head, are using that data to figure out what you might buy before you think about it. And those businesses also have a bead on your innate biases and predilections—those habits of thought and action that you can't help—which they'll use, again, to get you to spend (as a consumer) and to cooperate (as an employee). Because we're greatly influenced by other people in our social networks, for example, Facebook may make sure you know that a pal of yours likes a restaurant's FB page, or use your photo in an ad displayed to one of your friends. Because we can be motivated by reactions we aren't aware of, Campbell's soup included biometric measurements of consumers in its research for a label re-design.

And the "Other Knows Best" world is also, of course, a place where government uses the same techniques as business to get you to save water, recycle, pay your taxes on time and engage in other "pro-social" behaviors with less pain to you and less cost to the authorities. That's the promise (or peril, if you prefer) of the boom in "nudge"-type regulations.

Now, there is nothing particularly new or sinister about a car that won't let you kill yourself, or a company that would like you to buy its fine product, or a government that really wants you to quit smoking. Organizations have been trying to change people's behavior for as long as there have been organizations. The great change now underway involves means, not goals. Car safety and advertising and government regulations in 1980 were appeals to the conscious mind: Buckle your seat belt because it could save your life; buy our product because it will leave your breath minty fresh; pay your taxes or we're coming after you. Consider what we say, oh rational citizen, and then decide.

The defining trait of "other knows best" techniques is that they work outside awareness. They go around the conscious mind. For example: Old-school analysis of consumers worked with data those consumers decided to provide (would you please fill out this attitude survey? check the box for race and gender). New-school "Big Data" analytics works with information people don't even know they're sending—patterns in their Facebook "likes" and tweets, habitual paths tracked by cell phone towers. Old-school government mandates offered a tax credit, or threatened a fine, for particular behaviors. New-school "choice architecture" aims to get you to do the right thing, whether or not you think about it.

To notice all this is not to subscribe to the right-wing fantasy that a "nanny state" is conspiring to take away your autonomy. First, there is no black-hatted or helicoptered villain out there with designs on people's freedom. The justification for the tax nudge is the same as that of the Facebook marketing and the self-driving car. It works, it helps people, it accomplishes goals that older tools achieve less well. It's not a conspiracy.

Second, the advent of autonomy-reducing technologies isn't confined to governments and giant corporations. Individuals use these techniques and technologies, too. Who wouldn't want to know if a potential hire had been arrested, or said bizarre things on Twitter? Even as we are monitored by those who seek to predict our behavior, we also monitor others (with apps, with nanny cams). For example, Verizon now offers its customers a "new tool to help parents set boundaries for children," called FamilyBase. For $5 a month, it gives parents a complete report on all activity on their children's phones—calls, texts, apps downloaded, time spent talking and the times of conversations. Few are the parents who high-mindedly say they don't want, and shouldn't have, such information.

We who resent being spied upon by the state also endorse the state spying on other people. (The rule seems to be: I, in my glorious individuality, am unpredictable, but please do use Big Data analytics on those other people to predict who will try to blow up a plane next year.)

Then, too, we use these autonomy-limiting techniques on ourselves. Hundreds of thousands of people have tacitly accepted that they aren't nearly as thoughtful or hardheaded as they'd like to think. They are gladly outsourcing their self-control and decision-making to gadgets and applications that nudge them to eat "right," get exercise, save money and so on.

All of these developments undermine "the principle that in deciding what is good and what is bad for a given individual, the ultimate criterion can only be his own wants and his own preferences," as the economist John Harsanyi defined autonomy. Big Data predictions and psychological nudging undermine the idea that your consciously declared desires are paramount. Outsourcing choices to apps and gadgets gives the lie to the notion that your wants and preferences are consistent over time (if they were, why do you need an app to make you take the stairs?). And so all these developments mean the scope of personal autonomy—the range of choices you are expected to make for yourself—is shrinking and will continue to shrink. The car that won't let you drive is a great emblem of the world to come.

Over the four years that I've been writing here about human irrationality, the ground has shifted. It's no longer news that a human being doesn't behave like the coherent, consistent, self-aware, me-first person of classical economic models. And I don't hear so often anymore that behavioral economics is just a bunch of oddball results with no central theory (not since the publication of Daniel Kahneman's Thinking, Fast and Slow, which is to the behavioral approach what The Wealth of Nations was to classical economics). Today, I hear a great deal more talk about applications. Goodbye, "gee whiz, people aren't so rational"; hello, "here is how we can harness irrational behavior to get people to use mosquito nets." I don't celebrate this development, nor do I deplore it. But I want to point out that its effect is to undermine the idea that each of us is the ultimate authority on, and in, our own lives.

So in 2014 the focus of this blog is shifting to the question of personal autonomy. What is it? Is it worth saving? And if we can't save personal autonomy, what do we do about those features of democracy (elections, civil rights, equality, civil and criminal trials) that seem to depend on it?

I hope the topic will resonate with you, and that you'll keep me apprised of any "other knows best" moments you have as you deal with life at work, school, the doctor's office, the store and other places where we are seeing these changes play out. I'm eager to hear about your experiences, so please decide of your own free will to discuss them in the comments, or drop me an email.

Illustration: "Just what do you think you're doing, Dave?" Left, the all-knowing computer HAL-9000 from 2001: A Space Odyssey; Right, a Nest thermostat. From the Facebook page "Nest Thermostat vs Hal 9000."

Follow me on Twitter: @davidberreby
