Weapons of Math Destruction: How Big Data Destroys Lives

When companies hide algorithms from public view, data can be put to destructive purposes, warns Cathy O'Neil in her new book.

A picture shows binary code reflected from a computer screen in a woman's eye on October 22, 2012. (Leon Neal/AFP/Getty Images)

A few weeks ago I went with my fiancée to buy a new car. While figuring out which model would be most economical, I reminded her to factor in an increase in insurance rates, something that had happened to me the previous year. The salesman said that might not be true.


Turns out he was right. A former auto insurance salesman himself, he told us that rates depend on zip code. Companies factor in the driving records of everyone in the neighborhood; your personal driving record is only partly consequential. So while my move from Mar Vista to Palms meant an increase for me, my fiancée's move from Venice resulted in a decrease.

A fair trade-off, then? Hardly. We're the victims of an invisible algorithm, something data scientist Cathy O'Neil spends an entire book examining in Weapons of Math Destruction. A math geek by nature, O'Neil became disillusioned with her lifelong passion's applications in Big Data while working for a hedge fund during the 2008 economic collapse.

The crash made it all too clear that mathematics, once my refuge, was not only deeply entangled in the world’s problems but also fueling many of them.

My insurance issue seems benign compared to the many issues of inequality and injustice O'Neil addresses. To return to that industry, however, she discusses how credit scores, products of an industry fueled by deception and corruption, affect unsuspecting drivers in insidious ways.

For example, drivers in Florida with clean records but bad credit scores were shown to pay $1,522 more than drivers with similar records save for a drunken-driving conviction. Whether or not you've paid your phone bill can have more impact on your auto insurance than getting hammered and sitting behind the wheel. If this seems unfair, it is, and the problem is only getting worse.

Nearly half of American employers use credit scores to screen potential employees. With the rise of automated résumé readers, qualified candidates may never be considered by human eyes because of the slightest infraction. Yet bad credit should not be a prison sentence. Many factors can contribute to a lapse in bill payment, including another area invisible algorithms affect: health insurance. One crippling medical bill can easily result in punishment in the eyes of creditors and employers.

It's the invisibility, the dehumanization by numbers, that's the real problem. Qualifying subtleties during an interview—facial expressions, vocal inflections, gestures, and, perhaps most importantly, a logical explanation for why one's credit score is not optimal—are never weighed in a system that reads only numerical data.

Without feedback, however, a statistical engine can continue spinning out faulty and damaging analysis while never learning from its mistakes … Instead of searching for the truth, the score comes to embody it.

As an example, O'Neil tells the story of Sarah Wysocki. In 2009, the Washington, D.C. school district implemented one such system to weed out ineffective teachers. Wysocki was beloved by parents, but her IMPACT evaluation score placed her in the bottom 5 percent during the second year of statistical measuring. She was among the 206 teachers let go that year.

What such scoring systems fail to take into account, O'Neil writes, are the nuanced factors of education. As with corporations, the statistical machine seeks constant improvement, the same way shareholders demand perpetual profits. Yet teachers face different classes each year; one might instruct honors students one year and special-education children the next. All the algorithm sees are test results.

Another teacher in the book received a score of six out of a hundred under a similar rating method. The following year he received a ninety-six. While there's always room for improvement, a system that swings so wildly for a veteran instructor is obviously unreliable. He was not alone on this absurd grading curve.

Day by day, the rhythms of our lives are being automated. O'Neil has a special dislike for the algorithms policing systems use to monitor crime. They create a self-perpetuating feedback loop targeting low-income minority neighborhoods, which in turn feeds confirmation bias: of course that's where the problems are. Kids caught with nickel bags receive jail time while bankers siphoning billions from unwitting customers are immune from prosecution.

While critical of the systems in place, O'Neil reminds us that it does not have to be this way. Math can be a tool of construction as well as destruction. For example, an algorithm could show whether it's more beneficial to pay your phone bill or your electricity bill during a tight month, based on how each would affect your credit score. Not sexy, but realistic.

She calls for data scientists to take a digital Hippocratic Oath, one that asks them to consider the enormous impact algorithms have on the population. She also wants companies to “open the hood” so their methods are not hidden from public view.

Open-source and numerically honest platforms are beneficial from both consumer and social standpoints. O'Neil invokes Mitt Romney's 47 percent comment. The presidential candidate believed himself to be in a room of like-minded elites, ignorant that the staff might not share his values. When everyone's cell phone is a video camera, politicians can no longer have separate talking points for separate audiences, as Hillary Clinton is being reminded thanks to WikiLeaks.

Asking companies to peer behind the numbers is asking them for an ethical consideration: Is it more important to maximize profits at inhumane costs, or to take a slight financial hit to serve the greater good? Of course each company will answer differently, for a host of reasons. As long as that's the case, we'll never know whether their weapons are constructive or destructive. For now, the latter is too often true. As O'Neil warns, democracy itself is the wager.

--

Derek Beres is working on his new book, Whole Motion: Training Your Brain and Body For Optimal Health (Carrel/Skyhorse, Spring 2017). He is based in Los Angeles. Stay in touch on Facebook and Twitter.
