Algorithms Feel Like Science, but Are Full of Human Error

Algorithms decide who gets hired and collect data about you. You should have the right to know what they're saying about you.


Algorithms are capable of fascinating things. Just by hearing the sound of your voice, one can allegedly tell if you’re a trustworthy person. Luis Salazar, CEO of Jobaline, the company behind the algorithm, sees it as an unbiased way of hiring. “Math is blind,” he said to NPR. Salazar says the algorithm analyzes the architecture of a potential hire's voice to find out if it has all the right qualities.

Zeynep Tufekci from the University of North Carolina at Chapel Hill says this assessment is a common misconception. She explained to NPR:

“The fear I have is that every time this is talked about people talk about it as if it's math or physics; therefore, some natural-neutral world, and they're programs. They're complex programs. They're not, like laws of physics or laws of nature; they're created by us. We should look into what they do, and not let them do everything. We should make those decisions explicitly.”


Algorithms are made by people and are capable of making mistakes. There are plenty of cases where programs and products have not worked properly for everyone: the Apple Watch failing to read the heart rates of people with darker skin or tattoos, and photo sites auto-tagging darker-skinned people as apes. A slip-up can also happen simply because of imperfect data, causing, for example, ads for high-paying jobs to be shown to men far more often than to women. The fault may even lie with the programmer, who can unknowingly inject biases into the code.
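To see how "blind math" can still reproduce a skew, here is a minimal sketch with entirely hypothetical numbers: a naive ad-serving rule that learns from a historically imbalanced click log will simply mirror that imbalance, with no malice anywhere in the code.

```python
# Toy illustration with made-up data: a rule "learned" from skewed history
# reproduces the skew. The imbalance below reflects who was shown the ad
# in the past, not who is qualified for the job.
from collections import Counter

# Hypothetical click log for a high-paying job ad
past_clicks = ["man"] * 90 + ["woman"] * 10

counts = Counter(past_clicks)
total = sum(counts.values())

def serve_probability(group):
    """Naive rule: show the ad in proportion to past clicks from each group."""
    return counts[group] / total

print(serve_probability("man"))    # 0.9
print(serve_probability("woman"))  # 0.1
```

The point of the sketch is that nothing in it is "biased code" in the obvious sense; the bias lives entirely in the data the rule was trained on, which is why audits of the inputs matter as much as audits of the program.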

But far more worrying are the profiles being built about you, the individual. Algorithms are how advertisers and companies do business these days, and somewhere out there is a data profile on you. These profiles are built from what you do around the web: the search queries you type, the online stores you shop at, and so on. The problem is that you don't know what they say about you. This issue can manifest in two ways:

When algorithms are used to personalize your experience

Personalization comes in the form of the filter bubble effect, where a site encapsulates each user in a personalized informational echo chamber of things they already agree with — not the best for knowledge growth. It's an experiment you can do right now, if you wish: ask two friends to Google something like “Obama” or “Egypt” and see what results pop up first. The results tend to be different. Eli Pariser, author of The Filter Bubble: What the Internet Is Hiding from You, explained the detrimental effects of this process in his 2011 TED Talk.

When an algorithm gets something wrong

Mistakes come in the form of misunderstandings — some big, some small, but none you want on your permanent profile. As Aeon's Frank Pasquale found, “[o]ne woman was falsely accused of being a meth dealer by a private data broker, and it took years for her to set the record straight — years during which landlords and banks denied her housing and credit.”

For these reasons, and many others, there should be some kind of algorithmic accountability — a way for users to challenge mistakes in the system. The better option, though, would be to stop data collection altogether. If you want to reduce the data collected on you, start by using search engines that don't track your queries, like DuckDuckGo. If you really want to mess businesses up, go one step further and try browsing anonymously with Tor. Vote with your data or, in this case, by not providing it.

Photo courtesy of JACQUES DEMARTHON / Getty Staff
