
Harvard Law Professor Jonathan Zittrain pulls back the digital curtain.

Question: Is privacy an illusion?

Jonathan Zittrain: I devote an entire chapter of the book, right before the end, to privacy. And I do think that privacy is in pretty great danger. As many others have observed, we tend to get used to it. When the first people had to walk through airport metal detectors, it was the most intrusive thing you could imagine. And now we find ourselves half disrobing and not even giving it a second thought as we walk through. What strikes me most, though, in 2008, are the ways in which the invasions of privacy around the corner are not coming from the usual suspects. They're not coming from big corporations. They're not coming from government, even though those are the ones we're used to, respectively, gathering information about us so they can target ads or give us credit, or government wanting to figure out who has pot in the car or something. Instead, I see the threats coming from our fellow citizens, who have very cheap cameras in their mobile phones, or video recorders, or microphones. Teaching a class now, I have no idea if the class, as it is happening, is being streamed onto the internet for people to watch. It could easily happen, thanks to the wonderful infrastructure we have of really cheap sensors, hooked up to great networks, with fast processors.

Now Posner, as I understand it, and as I've read from him, basically sees privacy as a sort of irrational desire anyway. I suppose if people have it, he might end up respecting it. But from what I've read, he has tended to favor the government having pretty good license to snoop, particularly in anti-terrorism matters. And if it started to abuse that power, surely there'd be a whistleblower, and we'd all find out, and then there'd be a Watergate. I'm not nearly as sanguine as he is that that would happen so nicely every time a government is empowered to snoop.

But more importantly, I think, again, the risks are coming from the general public, and from the database of pictures generated by the world's tourists, appearing on Flickr or Facebook. Facebook, I think, has over 2½ billion photos in it right now. Those will soon be sorted by exact location, down to the GPS recording, by timestamp, and by who's in the photograph. Facial recognition technologies will make it much easier to say who's in the photo, even if you, as the person taking it, don't know. And at that point, I can start asking questions like, "Where was John Smith over the past two weeks? Where did tourists manage to catch him on film?" or, "Show me everybody going in and out of this Planned Parenthood clinic as the photos reveal it." That's a level of privacy invasion that I don't think we've hit yet, and that causes trouble. In Chapter 9, I try to say more about how we might take the sting out of it.

Question: How can privacy be protected?

Jonathan Zittrain: Well, the solutions basically -- just as the solutions in other parts of the book -- try to look at that special quadrant in the bottom left, to ask, before we pass a law, a law that says what? You're not allowed to take pictures? You're not allowed to identify somebody in a picture? You can take a picture and identify them, but you can't use it for certain purposes? There are all sorts of drawbacks to trying to have the government come in and attach a clamp where we want to stop the flow of data because it's invading privacy. I think it's difficult for the firm to help, because the invasion is happening down here as multiple people are taking pictures and aggregating them. So instead, I'm interested in the solutions over here, and I think there are a couple of examples of community-based solutions that can work.

Early in the development of the web, people noticed that web search engines were coming to life, and they were crawling the entire web, indexing it. Then people would go to the search engine and ask what was where. If you put something on the web then, it generally meant it was going to get into that index. But what if you wanted to put something up publicly so you could hand out a link to it, but you'd prefer it not be in Google? You could try to sue Google on a theory of, "You're not allowed to crawl my website," or something. But instead, what happened was, a couple of people got together -- informal, unchartered -- and came up with a standard called robots.txt. If you go to most websites, at www.website.com/robots.txt, you'll see instructions from the owner of the website to robots, like those of Google, saying, "You're allowed to go in this directory, and not in that one." But it's not a legal assertion of right. There's not much law on this at all, certainly not in 1994 when it was invented. Instead, the idea is: I'm just asking. And here's an easy way for you to respect my wishes if you want. So the ball's in your court, Google. If you wouldn't mind, I know you can get to this directory, but here, in a very simple, consistent way, I've asked you not to. Google respects robots.txt. So do Microsoft's MSN and Yahoo. All the major search engines respect it, because basically it's just the right thing to do.

I would like to see a similar way of packaging and relaying a request about our own privacy, as what we would call metadata, attached to the data that expresses part of our identity. So if I'm in a photo, I'd like to be able to say, "All right, clever computer, you've identified me as being in that photo, even unbeknownst to the photographer, and I didn't even realize the picture was being taken. But now that you know who I am, check this location to see if I have expressed a preference for photos taken of me in public. Have I said, 'Please consult with me if you want to use it'? Have I said, 'I don't care; do whatever you want'? Have I said nothing? Have I said, 'I am a paranoid, privacy-protecting person; I'd really prefer that you not use this photo in a very public way'?" If there were a way for me to just express that view, and have it associated with data that bears especially on me, it would put the ball into the court of every person who's going to touch that photo later, to make a decision as to whether to contact me and whether to respect my wishes. I'm not sure they always would, and I'm not sure they always should. But to have an internet that has a social dimension, as well as just the technical dimension of, "Here's a cool photo. I'll copy and paste it into my PowerPoint presentation, because I don't know who these people are, but it's a perfect photo for what I want" -- I think it would be great to give people the opportunity, as they are remixing culture and putting bits out onto the net, and doing cool things, to have some sense of the people whose lives they're touching when they do it, and to have a moment to connect with them, and decide whether they want to respect whatever wishes they glean. I think a chance to decide what's ethical, and whether to be ethical, could work in many, many cases, just as robots.txt has worked in many, but not all, cases to deflect a confrontation that would otherwise arise.
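For readers curious about the convention Zittrain describes: a robots.txt file is just plain text that a site owner posts at the root of a website, and whether a crawler honors it is entirely up to the crawler. Below is a minimal sketch using Python's standard-library urllib.robotparser; the site URL, crawler name, and rules shown are placeholders for illustration, not anything drawn from the interview.

    # Illustrative robots.txt contents (the site owner's request, not a legal rule):
    #   User-agent: *
    #   Disallow: /private/
    #   Allow: /
    import urllib.robotparser

    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")  # placeholder site
    rp.read()  # fetch and parse the owner's instructions

    # A well-behaved crawler asks before fetching; nothing forces it to obey.
    print(rp.can_fetch("MyCrawler", "https://www.example.com/private/page.html"))
    print(rp.can_fetch("MyCrawler", "https://www.example.com/index.html"))

The point the sketch illustrates is the one Zittrain makes: the file expresses a wish, and respecting it is a norm rather than a legal obligation.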

Recorded on: March 8, 2008