Algorithms Feel Like Science, but Are Full of Human Error

Algorithms now help decide who gets hired and quietly collect data about you. You should have the right to know what they're saying about you.


Algorithms are capable of fascinating things. Just by hearing the sound of your voice, one can allegedly tell whether you're a trustworthy person. Luis Salazar, CEO of Jobaline, the company behind the algorithm, sees it as an unbiased way of hiring. “Math is blind,” he told NPR. Salazar says the algorithm analyzes the architecture of a potential hire's voice to determine whether it has all the right qualities.

Zeynep Tufekci of the University of North Carolina at Chapel Hill says that view rests on a common misconception. She explained to NPR:

“The fear I have is that every time this is talked about people talk about it as if it's math or physics; therefore, some natural-neutral world, and they're programs. They're complex programs. They're not, like laws of physics or laws of nature; they're created by us. We should look into what they do, and not let them do everything. We should make those decisions explicitly.”

Algorithms are made by people and, like people, they make mistakes. There are plenty of cases where programs and products have not worked properly for everyone: the Apple Watch misreading darker-pigmented or tattooed skin, or photo services auto-tagging darker-skinned people as apes. A slip-up can also happen simply because the underlying data is imperfect, which is how high-paying job ads end up being shown to men far more often than to women. There may even be a fault on the programmer's part, who unknowingly injects biases into the code.

But far more worrying are the profiles being built about you, the individual. Algorithms are how advertisers and companies do business these days, and somewhere out there is a data profile on you. These profiles are built from what you do around the web: the search queries you type in, the online stores you shop at, and so on. The problem is that you don't know what they say about you. This issue can manifest itself in two ways:

When algorithms are used to personalize your experience

The former comes in the form of the filter bubble effect, where a site encapsulates each user in a personalized informational echo chamber of things they already agree with — not the best for knowledge growth. It's an experiment you can run right now, if you wish: ask two friends to Google something like “Obama” or “Egypt” and see what results pop up first. The results tend to be different. Eli Pariser, author of The Filter Bubble: What the Internet Is Hiding from You, explained the detrimental effects of this process in his 2011 TED Talk.
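Conceptually, the mechanism is simple enough to sketch in a few lines. The toy Python below is purely illustrative, not how any real search engine ranks results; the canned result list and the names `rank_for` and `user_history` are invented for this sketch. It simply boosts results that overlap with words a user has clicked on before, so two users typing the same query see different orderings.

```python
# Toy illustration of a "filter bubble": a naive personalized ranker that
# boosts results overlapping with words the user has clicked on before.
# Everything here (the canned results, rank_for, user_history) is invented
# for this sketch and does not reflect any real search engine.
from collections import Counter

RESULTS = {
    "egypt": [
        "Egypt travel deals and Nile cruises",
        "Egypt protests latest political news",
        "Ancient Egypt pyramids and pharaohs",
    ],
}

def rank_for(query: str, user_history: list[str]) -> list[str]:
    """Order canned results by word overlap with the user's click history."""
    interests = Counter(
        word for page in user_history for word in page.lower().split()
    )
    return sorted(
        RESULTS[query],
        key=lambda result: -sum(interests[w] for w in result.lower().split()),
    )

# Same query, different histories, different front-page result.
print(rank_for("egypt", ["cheap travel deals", "best cruise reviews"]))
print(rank_for("egypt", ["political news digest", "protest coverage live"]))
```

Both users typed the same query, but the travel-minded one sees the tourism link first while the news-minded one sees the protest coverage first: the bubble in miniature.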

When an algorithm gets something wrong

The latter comes in the form of misunderstandings — some big, some small, but none you want on your permanent profile. As Aeon's Frank Pasquale found, “[o]ne woman was falsely accused of being a meth dealer by a private data broker, and it took years for her to set the record straight — years during which landlords and banks denied her housing and credit.”

For these reasons, and many others, there should be some kind of algorithmic accountability — a way for users to challenge mistakes in the system. The better option, though, would be to stop the data collection altogether. If you want to reduce the data collected on you, start by using a search engine that doesn't track your queries, like DuckDuckGo. If you really want to mess with these businesses, go one step further and try browsing anonymously with Tor. Vote with your data or, in this case, by not providing it.
