Should Facebook be regulated with laws instead of fines?

The fine Facebook just paid was huge, but many in tech say it wasn't nearly enough to protect your data.

  • Last week, Facebook was hit with a $5 billion fine for mismanaging user data.
  • It is the second-largest fine the FTC has ever handed down.
  • Many in tech argue this was a mere slap on the wrist and that stronger regulation is needed.

You might have heard about Facebook's $5 billion fine last week. It was levied by the Federal Trade Commission (FTC) for Facebook's mismanagement of data during the Cambridge Analytica scandal and its failure to follow its own privacy policies.

The fine is the second-largest ever dished out by the FTC and comes along with a series of restrictions on how Facebook will do business. These penalties were worked out as a settlement that kept the issue from going to court.

However, while this might seem to be a significant win for those who want more regulation and oversight of tech companies, many are arguing this is nothing of the sort.

Why some think a ten-figure fine was a slap on the wrist

While the fines and restrictions the FTC decreed might seem stiff, they are weaker than they appear. For one thing, Facebook made a ton of money this quarter; $5 billion is chump change. The restrictions on how Facebook does business also aren't as thorough as you'd imagine.

While the company will create a new committee to oversee privacy decisions, this won't fundamentally change how it goes about collecting personal data and selling access to it to advertisers. Many in the tech world are criticizing the settlement, calling it anything from a slap on the wrist to a payoff to head off further investigation.

Tony Romm at the Washington Post reported that Facebook dictated the terms of the settlement, offering to pay the ten-figure fine partly to win other concessions from the FTC. Professor Siva Vaidhyanathan of the University of Virginia argued that the settlement, which includes forgiveness for any misconduct Facebook executives took part in, is a useless gesture that won't produce any progress on privacy.

Alex Stamos, Facebook's former chief security officer, went further. Pointing to clauses in the deal that allow the company to cut off relationships with third-party apps, he argued they might benefit the company in the long run, saying "Facebook paid the FTC $5B for a letter that says, 'You never again have to create mechanisms that could facilitate competition.'"

The FTC's decision wasn't unanimous, either: the two commissioners appointed by Democratic presidents dissented. In her dissent, Rebecca Slaughter argued that the fine was too small to deter future mismanagement of data and privacy rights, and that the FTC should have investigated Facebook's top executives.

Fellow commissioner Rohit Chopra largely agreed, saying:

"The proposed settlement does little to change the business model or practices ... [and] imposes no meaningful changes to the company's structure or financial incentives."

If all these experts agree that the punishment was too light, why wasn't it larger?

While some of what Facebook did might ostensibly have been illegal, the laws covering these issues are weak, limited, and somewhat ambiguous. That the FTC backed down once Facebook signaled it was willing to go to court demonstrates this. Much of the FTC's explicit statutory power over online privacy comes from the Children's Online Privacy Protection Act, which doesn't even cover adults' information; for everything else, it must fall back on its general authority over "unfair or deceptive" practices.

At the end of the day, the FTC can only act in ways prescribed by law; if it goes too far beyond existing legislation, the courts can strike its rulings down. Perhaps this is why, writing in Bloomberg, columnist Matt Levine makes an excellent argument for new laws to remove these ambiguities and clarify the rules around online privacy:

"But the more important thing that is going on here is that some people — at the FTC and elsewhere — thought that the FTC should mandate big structural changes in how Facebook works, "the way they collect data in the first place," and Facebook did not. This is not a legal or factual dispute; it's a policy dispute, and it has nothing to do with the questions of what Facebook did wrong or how responsible Zuckerberg was. Even if Facebook has an airtight case that everything it did with the Cambridge Analytica data was totally allowed under existing law and regulation and consent decrees, the FTC — and "privacy watchdogs," and Congress, and you — might nonetheless want to require changes in how Facebook collects data.

"The United States actually, as a country, has a mechanism to do that. It is called 'passing a law.' Some members of Congress, the body charged with writing legislation, could write a bill saying 'social media companies can't collect data in the following bad ways,' or whatever, and then the other members of Congress could debate it, and when they (and the president) agreed it would be passed and become law, and then there would be new restrictions on the way that Facebook collects data. And if Facebook violated them it would pay even bigger fines, or it would be shut down, or its executives would go to jail, or whatever the law said."

While he leaves the exact nature of this new law up to the imagination, it is clear he envisions something rather strict.

We are in a brave new world of tech regulation and government oversight. Many of the laws we have for governing the systems we use to interact with one another are outdated, and this ruling, while not insubstantial, demonstrates it. Whether we will find the political will to fix these issues before the next privacy scandal breaks is anybody's guess.