Internet policy expert Rebecca MacKinnon discusses the social possibilities and challenges presented by new digital technologies.
Rebecca MacKinnon: We have a situation, I think, with the internet right now where a lot of people are using it to bring attention to abuses that happen in a lot of places, and to organize. We’re seeing in the US the whole Occupy movement. People are using the internet to protest against what they see as injustice. On the other side of the political spectrum, people in the Tea Party and so on have been using the internet to organize and demand change in different ways. So it does give a lot more power to people who are at the edges, who aren’t powerful to begin with, and that’s a very good thing and something we definitely want to encourage. What we need to make sure is that when censorship is happening on the internet, people know about it, . . . when surveillance is happening on the internet, people know who’s doing it, and they can hold the people who are censoring and surveilling accountable.
You know, the point of having a free and open internet is that the global internet stays interconnected, so that people in any one place can access information published online from anywhere, and so that people around the world can organize together without running up against barriers to finding each other, to collaborating in different ways, and so on.
In a lot of ways the products, services, and platforms offered by tech companies have been very empowering. People have been using Facebook around the world to organize against dictators. People have been using Twitter to get information out from protests and demonstrations when the media is being censored, and that’s really important. But these companies need to do a bit more to make sure that they are not being manipulated by governments and that they are not inadvertently contributing to people’s rights being violated in a lot of different places.
So with a company like Facebook . . . Facebook is making some decisions about its rules, its internal rules, that sometimes play out quite negatively for activists. For instance, Facebook has a policy that requires people to use their real names, and so even though a lot of activists in a lot of countries are using Facebook to organize demonstrations and protests and to spread information about what their government is doing, Facebook’s requirement that people use their real names often puts them at risk.
One concrete example: in Egypt before the Arab Spring, a group of activists created an anti-torture page organizing demonstrations against police brutality, and they were on Facebook using fake names because they were afraid of becoming victims of torture themselves. But on the night before a huge demonstration they were planning in the fall of 2010, their Facebook page suddenly went down, because it had been brought to the attention of Facebook administrators that the people running the page were not using their real names. So this is one example of Facebook’s internal rules ending up hurting the people who are most at risk.
There have also been a number of examples where privacy policies have changed suddenly. And in quite a number of countries now, when a demonstrator gets arrested, the first thing their interrogator does is demand the passwords to all their social media accounts, and if they don’t want to hand them over, they are sometimes tortured for their passwords.
So it’s true that Facebook and all of these social networks were not set up for the purpose of dissent. They were set up for social networking, for people to meet each other and network with their high school friends and whatever else, but the reality is that people around the world are using these platforms for very political purposes. And when you ask them, “Well, if you don’t like Facebook’s policies, why don’t you just use something else?” they say, “Well, we have to use Facebook, because if you want to reach the biggest audience, that’s where the biggest audience is.”
So this is part of the problem, and this is why I argue, and why a growing number of people are arguing, that these companies do have human rights responsibilities, and that they need to consider the risks to their users, especially their most vulnerable users, when deciding what kinds of policies they’re going to put in place and how they’re going to shape their features, because there are life-and-death situations out there.
Directed / Produced by Jonathan Fowler & Elizabeth Rodd