
Weapons of Math Destruction: How Big Data Destroys Lives

When companies hide their algorithms from public view, the data can be used for destructive purposes, warns Cathy O'Neil in her new book.
A picture shows binary code reflected from a computer screen in a woman's eye on October 22, 2012. (Leon Neal/AFP/Getty Images)

A few weeks ago I went with my fiancée to buy a new car. While figuring out which model would be most economical, I reminded her to factor in an increase in insurance rates, something that had happened to me the previous year. The salesman said that might not be true.


It turns out he was right. A former auto insurance salesman himself, he told us that rates depend on zip code: companies factor in the driving records of everyone in a neighborhood, so your personal driving record is only partly consequential. While my move from Mar Vista to Palms meant an increase, my fiancée's move from Venice resulted in a decrease.
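To make that zip-code logic concrete, here is a minimal, purely hypothetical sketch. The function, weights, and rates below are invented for illustration; insurers keep the real formula hidden, which is exactly the problem the book describes.

```python
# Hypothetical sketch (not any insurer's actual formula): a premium that
# weights the zip code's aggregate claims history more heavily than the
# driver's own record, which is what the salesman described.

def estimated_premium(base_rate: float,
                      zip_claim_rate: float,
                      personal_claim_rate: float,
                      zip_weight: float = 0.7) -> float:
    """Blend neighborhood and personal risk into a yearly premium.

    zip_claim_rate / personal_claim_rate: assumed risk multipliers
    (1.0 = average). zip_weight: assumed share of the blend given to
    the neighborhood; the real weighting is the insurer's secret.
    """
    blended_risk = zip_weight * zip_claim_rate + (1 - zip_weight) * personal_claim_rate
    return base_rate * blended_risk

# The same driver pays different premiums in different zip codes,
# even though the personal record never changes:
print(estimated_premium(1200, zip_claim_rate=1.15, personal_claim_rate=0.9))  # higher-claims zip
print(estimated_premium(1200, zip_claim_rate=0.95, personal_claim_rate=0.9))  # lower-claims zip
```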

Sounds like a reasonable balance, right? Hardly. We're the victims of an invisible algorithm, something data scientist Cathy O'Neil spends an entire book discussing in Weapons of Math Destruction. A math geek by nature, O'Neil became disillusioned with her lifelong passion's applications in Big Data while working for a hedge fund during the 2008 economic collapse.

The crash made it all too clear that mathematics, once my refuge, was not only deeply entangled in the world’s problems but also fueling many of them.

My insurance issue seems benign compared to many of the inequalities and injustices O'Neil addresses. To return to that industry, however: she discusses how credit scores, the product of an industry itself fueled by deception and corruption, affect unsuspecting drivers in insidious ways.

For example, adult drivers in Florida with clean records but poor credit scores were shown to pay $1,522 more than drivers with good credit and a drunk-driving conviction on their record. Whether or not you've paid your phone bill can have more impact on your auto insurance than getting hammered and sitting behind the wheel. If this seems unfair, it is, and the problems are only getting worse.

Credit scores are used by nearly half of American employers to screen potential employees. With the rise of automated résumé readers, qualified candidates may never be considered by human eyes because of the slightest infraction. Yet bad credit should not be a prison sentence. Many factors contribute to a lapse in bill payment, including another area invisible algorithms affect: health insurance. One crippling medical bill can very well result in punishment in the eyes of creditors and employers.

It's the invisibility, the dehumanization by numbers, that's the real problem. The qualifying subtleties of an interview (facial expressions, vocal inflections, gestures, and, perhaps most importantly, a logical explanation for why one's credit score is not optimal) are never weighed by a system that reads only numerical data.

Without feedback, however, a statistical engine can continue spinning out faulty and damaging analysis while never learning from its mistakes … Instead of searching for the truth, the score comes to embody it.

As an example, O'Neil tells the story of Sarah Wysocki. In 2009, the Washington, D.C. school district implemented one such system, IMPACT, to weed out ineffective teachers. Wysocki was beloved by parents, but her IMPACT evaluation score placed her in the bottom 5 percent in the second year of measurement. She was among the 206 teachers let go that year.

What such scoring systems do not take into account, O'Neil writes, are the nuanced factors of education. As with corporations, the statistical machine seeks constant improvement, in the same way shareholders demand perpetual profits. Yet teachers have different classes each year; one might instruct honors students one year and special education children the next. All the algorithm sees are test results.

Another teacher in the book received a score of six out of a hundred under a similar rating system. The following year he received a ninety-six. While there's always room for improvement, a system that produces such a wide disparity for a senior instructor is obviously ineffective. He was not alone on this absurd grading curve.

Day by day, the rhythms of our lives are being automated. O'Neil has a special dislike for the algorithms policing systems use to monitor crime. They create a self-perpetuating feedback loop targeting low-income minority neighborhoods, which leads to confirmation bias: of course that's where the problems are. Kids caught with nickel bags receive jail time while bankers siphoning billions from unwitting customers are immune to prosecution.

While critical of the systems in place, O'Neil reminds us that it does not have to be so. Math can be a tool of construction as well as destruction. For example, an algorithm could show whether it's more beneficial to pay your phone bill or your electricity bill during a tight month, based on how each would affect your credit score. Not sexy, but realistic.
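A minimal sketch of what such a consumer-side tool might look like, assuming invented placeholder rules (real credit-scoring formulas are proprietary, so the fields and grace periods below are hypothetical):

```python
# Hypothetical sketch, not O'Neil's algorithm: rank bills by assumed
# credit-score risk when you can't pay everything this month.
from dataclasses import dataclass

@dataclass
class Bill:
    name: str
    amount_due: float
    reports_to_bureaus: bool       # assumption: some utilities only report once a debt hits collections
    days_until_reported_late: int  # assumed grace period before a missed payment is reported

def payment_priority(bills: list[Bill]) -> list[Bill]:
    """Pay first the bills that are reported to credit bureaus, and among
    those, the ones that would be reported late the soonest."""
    return sorted(bills, key=lambda b: (not b.reports_to_bureaus, b.days_until_reported_late))

tight_month = [
    Bill("electricity", 110.0, reports_to_bureaus=False, days_until_reported_late=60),
    Bill("phone", 85.0, reports_to_bureaus=True, days_until_reported_late=30),
]

for rank, bill in enumerate(payment_priority(tight_month), start=1):
    print(f"{rank}. pay {bill.name} (${bill.amount_due:.0f})")
```

The specific heuristic matters less than the direction of the tool: here the math serves the consumer rather than the institution.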

She calls for data scientists to take a digital Hippocratic Oath, which asks them to consider the enormous impact algorithms have on the population. She also wants companies to "open the hood" so their methods are not hidden from public view.


Open-source, numerically honest platforms are beneficial from both consumer and social standpoints. O'Neil invokes Mitt Romney's 47 percent comment: the presidential candidate believed himself to be in a room of like-minded elites, unaware that the staff might not share his values. When everyone's cell phone is a video camera, politicians can no longer have separate talking points for separate audiences, something Hillary Clinton is being reminded of thanks to WikiLeaks.

Asking companies to peer behind the numbers is asking them to make an ethical judgment: Is it more important to maximize profits at inhumane costs, or to take a slight financial hit to serve the greater good? Of course each company will answer differently for a host of reasons. As long as that's the case, we'll never know whether their weapons are constructive or destructive. For now, the latter is too often true. As O'Neil warns, democracy itself is the wager.

Derek Beres is working on his new book, Whole Motion: Training Your Brain and Body For Optimal Health (Carrel/Skyhorse, Spring 2017). He is based in Los Angeles. Stay in touch on Facebook and Twitter.

