
IBM promises no more facial recognition software

The computing giant exits the space due to ethical concerns.

Image source: PaO_STUDIO/Shutterstock
  • IBM sent a letter to Congress stating it will no longer research, develop, or sell facial recognition software.
  • AI-based facial recognition software remains widely available to law enforcement and private industry.
  • Facial recognition software is far from infallible, and often reflects its creators' bias.

In what strikes one as a classic case of shutting the stable door long after the horse has bolted, IBM's CEO Arvind Krishna has announced the company will no longer sell general-purpose facial recognition software, citing ethical concerns, in particular the technology's potential for use in racial profiling by police. The company will also cease researching and developing the technology.

While laudable, this announcement arguably arrives about five years later than it might have, as numerous companies sell AI-based facial recognition software, often to law enforcement. Anyone who uses Facebook or Google also knows all about this technology, as we watch both companies tag friends and associates for us. (Facebook recently settled a lawsuit regarding the unlawful use of facial recognition for $550 million.)

It's worth noting that no company other than IBM has offered to cease developing and selling facial recognition software.

IBM


Image source: Tada Images/Shutterstock

Krishna made the announcement in a public letter to Senators Cory Booker (D-NJ) and Kamala Harris (D-CA), and Representatives Karen Bass (D-CA), Hakeem Jeffries (D-NY), and Jerrold Nadler (D-NY). Democrats in Congress are considering legislation to ban facial-recognition software as reported abuses pile up.

IBM's letter states:

"IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values and Principles of Trust and Transparency. We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies."

Prior to exiting facial recognition entirely, IBM had a mixed record. The company scanned nearly a million Creative Commons images from Flickr without the owners' consent. On the other hand, it released a public data set in 2018 in an attempt at transparency.

Skewed identification


Image source: Best-Backgrounds/Shutterstock

Privacy issues aside — and there definitely are privacy concerns here — the currently available software is immature and prone to errors. Worse, it often reflects the biases of its programmers, who work for private companies with little regulation or oversight. And since commercial facial recognition software is sold to law enforcement, the frequent identification errors and biases are dangerous: They can ruin the lives of innocent people.

The website Gender Shades offers an enlightening demonstration of the kinds of inaccuracies to which facial recognition is prone. Put together by Joy Buolamwini and Timnit Gebru in 2018, the page doesn't reflect the most recent iterations of the software it tests from three companies: Microsoft, the now-presumably-late IBM Watson, and Face++. Nonetheless, it's telling. All three programs classified men's faces significantly more accurately than women's. And when gender (treated as binary for the test) was considered alongside skin color, the results were genuinely troubling for the bias they reflected, with error rates highest for darker-skinned women.

Amazon's Rekognition is the facial recognition software most frequently sold to law enforcement, and an ACLU test run in 2018 revealed it to be similarly error-prone: It incorrectly matched 28 members of Congress to mugshots in a public database of 25,000 arrest photos.

Update, 6/11/2020: Amazon today announced a 12-month moratorium on law-enforcement use of Rekognition, expressing the company's hope that Congress will in the interim enact "stronger regulations to govern the ethical use of facial recognition technology."

In 2019, a federal study by the National Institute of Standards and Technology reported empirical evidence of age, gender, and racial bias in the 189 facial recognition algorithms it analyzed. Members of certain demographic groups were up to 100 times more likely to be misidentified than others. The study is ongoing.
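Disparity figures like NIST's come down to simple arithmetic: a false match rate (the fraction of impostor comparisons a system wrongly accepts) is computed per demographic group, and the groups are compared as a ratio. A minimal sketch of that calculation, using made-up illustrative numbers rather than NIST's actual data:

```python
def false_match_rate(false_matches: int, impostor_comparisons: int) -> float:
    """Fraction of non-matching face pairs the system wrongly accepts as a match."""
    return false_matches / impostor_comparisons

# Hypothetical per-group results (illustrative, not NIST data):
# both groups were tested with 100,000 impostor comparisons.
group_a_fmr = false_match_rate(5, 100_000)    # 0.00005
group_b_fmr = false_match_rate(500, 100_000)  # 0.005

# The headline disparity is the ratio between groups.
disparity = group_b_fmr / group_a_fmr
print(disparity)  # 100.0 — i.e., "100 times more likely to be misidentified"
```

The same software, run against different populations, can thus produce error rates that differ by two orders of magnitude, which is exactly the kind of gap the federal study documented.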

Facial rec's poster child


Image source: Gian Cescon/Unsplash

The company most infamously associated with privacy-invading facial recognition software has to be Clearview AI, about which we've previously written. The company scraped over 3 billion images from social media, without the posters' permission, to develop software it sells to law enforcement agencies.

The ACLU sued Clearview AI in May 2020 for engaging in "unlawful, privacy-destroying surveillance activities" in violation of Illinois' Biometric Information Privacy Act. The organization wrote to CNN, "Clearview is as free to look at online photos as anyone with an internet connection. But what it can't do is capture our faceprints — uniquely identifying biometrics — from those photos without consent." The ACLU's complaint alleges: "In capturing these billions of faceprints and continuing to store them in a massive database, Clearview has failed, and continues to fail, to take the basic steps necessary to ensure that its conduct is lawful."

The longer term

Though it undoubtedly sends a chill down the spine, the onrush of facial recognition technologies — encouraged by the software industry's infatuation with AI — suggests that we can't escape being identified by our faces for long, legislation or not. Advertisers want to know who we are, law enforcement wants to know who we are, and as our lives revolve ever more decisively around social media, many will no doubt welcome technology that automatically brings us together with friends and associates old and new. Concerns about the potential for abuse may wind up taking a back seat to convenience.

It's been an open question for some time whether privacy is even an issue for those who've grown up surrounded by connected devices. These generations don't care so much about privacy because they — realistically — don't expect it, particularly in the U.S. where very little is legally private.

IBM's principled stand may ultimately prove more symbolic than consequential.

