What’s the Big Idea?
What’s more offensive than crushed heads and mangled limbs? Exposed female nipples, according to Facebook’s criteria for deleting user content, published for the first time on Gawker two weeks ago.
The documents were leaked by a former employee of oDesk, the California-based firm that handles outsourced content moderation for both Facebook and Google. They contain a set of strict guidelines for enforcing Facebook’s vague Statement of Rights and Responsibilities, which had heretofore eluded public scrutiny.
A cheat sheet of rules for moderators lists violent speech as unacceptable, but says “crushed heads, limbs, etc are ok as long as no insides are showing.” Blood and deep flesh wounds are also ok. Users may upload as many pictures of marijuana as they want, unless it is clear that they’re selling, buying, or growing it.
This laissez-faire attitude doesn’t extend to sexual content, however. Moderators are instructed to delete images portraying “naked ‘private parts,’” any sexual activity — “even if naked parts are hidden from view by hands, clothes or other objects” — and “female nipple bulges,” “cartoon/art included.” (Male nipples get a pass, but J. Lo’s Oscar dress is out.) Pixelated or black-barred nudity is also banned, as are pictures of sex toys, camel toes, and breast-feeding.
The contrast between Facebook’s stringent policing of “private parts” and its permissiveness towards depictions of violence led Gawker editor Adrian Chen to dub the oDesk moderators “Facebook’s anti-porn brigade.” The Guardian attributed the “odd prejudices against sex” to Facebook’s American roots. Rowan Davies, a campaign consultant for parenting site Mumsnet, called on mothers to post pictures of themselves breastfeeding in open defiance of the rules, which Davies (half-jokingly) referred to as a “Papal Index” for the digital age. The thing is: she’s got a point.
What’s the Significance?
Facebook is a company, but it’s not just a company. Like Google before it, it has become — quite intentionally — a default form of online navigation in a world where the internet is a precious resource. Its ubiquity as the largest social network on Earth gives it the power to mediate and define what we share on the web, while its newly public status means that it is accountable to shareholders, not to users. The question is, why is Facebook allowed to harbor “odd prejudices,” or any prejudices at all? Who decides that pictures of weed are allowed, but images of breastfeeding aren’t? In other words, is Facebook censoring us?
What’s unclear is whose values are being reflected in Facebook’s guidelines for content moderation. Are they shaped by what Facebook executives want, or by what they think we want? And how long will that matter? In 2003, after appearing before the Harvard Ad Board to defend FaceMash (the precursor to Facebook), Mark Zuckerberg famously told the Harvard Crimson, “I’m not willing to risk insulting anyone.” Now that one out of every ten people in the world has a Facebook profile, the chance of insulting any one of its users is far greater, while the stakes for the company are far lower.
Perhaps what we need to reckon with is the fact that a handful of people own what has now become public space. We see our friends as our network and our network as something we build, but of course, Facebook owns anything posted to Facebook. We think of Facebook as a free service, but of course, advertising still accounts for 85% of its revenue. Again, what’s being exchanged is not money, not exactly, but consent: your data becomes the property of Facebook. They promise to use it nicely, but ultimately, these guidelines are one more reminder that regulation of the community — and of Facebook — is in the hands of Facebook, not its users.
Image courtesy of Tomislav Pinter / Shutterstock.com