A new interactive documentary "How Normal Am I?" helps reveal the shortcomings of facial recognition technology.
- The website is part of SHERPA, a European Union-funded "project which analyses how AI and big data analytics impact ethics and human rights."
- The interactive documentary uses your webcam to analyze your face, predicting metrics like age, attractiveness, gender, body mass index and life expectancy.
- Despite the shortcomings of facial recognition, there's currently no set of national laws regulating the use of the technology by governments or private companies.
An interactive facial recognition experience<p>Want to see for yourself how well these systems work? Check out a new interactive mini-documentary called <a href="https://www.hownormalami.eu/" target="_blank">"How Normal Am I?"</a>, created by Tijmen Schep, a technology critic and privacy designer. </p><p>The documentary is part of <a href="https://www.project-sherpa.eu/" target="_blank">SHERPA</a>, a European Union-funded "project which analyses how AI and big data analytics impact ethics and human rights." To experience it, you'll need to grant the website permission to access your webcam, though "no personal data is collected." (You can also access the website and then disconnect your computer from the internet; it should still work fine.)<br></p>
hownormalami.eu<p>"How Normal Am I?" uses facial recognition to predict your age, attractiveness, <a href="https://www.sherpapieces.eu/overview/predicting-your-bmi-from-a-just-photo-a-github-safari" target="_blank">body mass index</a>, life expectancy and gender. Don't get upset if you get a low attractiveness or a high age score: Tilting your head, moving closer to the camera, or just running the program a second time can produce different results.<br></p><p>And that's sort of the point: If facial recognition technology is unreliable on a broad range of measures, to what extent should governments and the private sector be using it? Even if it does become reliable, to what extent should governments be allowed to use it on citizens?</p>
The future of facial recognition technology<p>In a <a href="https://www.pewresearch.org/internet/2019/09/05/more-than-half-of-u-s-adults-trust-law-enforcement-to-use-facial-recognition-responsibly/" target="_blank">2019 Pew Research Center survey</a>, a majority of U.S. respondents said it's acceptable for law enforcement agencies to use facial recognition to scan for threats in public spaces. However, far fewer said it's acceptable for advertisers to use facial recognition to do things like analyze how people respond to commercials in real time.<br></p><p>What could change how facial recognition operates in the U.S. is a set of national laws, which currently don't exist. (However, some states and <a href="https://bigthink.com/politics-current-affairs/facial-recognition" target="_self">cities do regulate the technology</a>.) There are currently more than a <a href="https://www.cnn.com/2020/06/13/tech/facial-recognition-policy/index.html" target="_blank">dozen bills</a> addressing facial recognition technology, ranging from legislation that would outlaw warrantless use of facial recognition to bills that would ban federal agencies from <a href="https://www.congress.gov/bill/116th-congress/house-bill/3875/text?q=%7B%22search%22%3A%5B%22%5C%22facial+recognition%5C%22+-uyghur%22%5D%7D&r=2&s=8" target="_blank" rel="noopener noreferrer">using it altogether</a>.</p>
The system is basically facial recognition technology, but for cars.
- Some police departments use automatic license plate readers to track suspects.
- A company called Flock Safety is now allowing police departments to opt in to TALON, a national network that shares data on car movements.
- Privacy advocates are concerned about the potential for errors and abuse.
Map tracking the car movements of a murder suspect in Alabama.
Flock Safety<p>Flock Safety says its cameras help police solve more crimes. The company <a href="https://www.flocksafety.com/flock-safety-resources" target="_blank" rel="dofollow">website</a> notes that "70% of crime involves a vehicle" and law enforcement agencies say "a license plate is the best piece of evidence to track leads and solve crimes."</p><p>But critics of Flock Safety have raised concerns over the potential for errors and abuse. In August, for example, <a href="https://gizmodo.com/cops-terrorize-black-family-but-blame-license-plate-rea-1844602731" target="_blank">police in Colorado held a family at gunpoint</a> after a license plate reader flagged a car as stolen. It turned out to be the wrong vehicle.</p><p>With TALON, police would also have unprecedented information about the movements of citizens. It's not hard to see how this data could be abused. Think, for example, of the Florida police officer who used the Driver and Vehicle Information Database (D.A.V.I.D.) to get women's contact information so he could <a href="https://www.mercurynews.com/2019/03/11/police-in-florida-allege-officer-used-database-to-gets-dates/" target="_blank" rel="noopener noreferrer dofollow">ask them out on dates</a>.</p>
Flock Safety<p>It's currently unclear how many police departments plan to join TALON. But like the advent of facial recognition technologies, the spread of automatic license plate reader technology highlights how mass surveillance isn't always driven by the state.</p><p style="margin-left: 20px;">"We often think of dystopian surveillance as something that's imposed by an authoritarian government," Evan Greer, deputy director of the digital rights group Fight for the Future, told <a href="https://www.cnet.com/news/license-plate-tracking-for-police-set-to-go-nationwide/?utm_source=reddit.com" target="_blank">CNET</a>. "It's clearer every day that there is an enormous threat posed by privately owned and managed surveillance regimes, which will be weaponized by the rich and powerful to protect not just their wealth but the exploitative system that helped them amass it."</p>
New documents confirm that the government agency—one of many—has been buying location data from a tracking company.
- Documents reveal that the Secret Service used Locate X as part of a social media tracking package.
- The service "allows investigators to draw a digital fence around an address or area, pinpoint mobile devices that were within that area, and see where else those devices have traveled, going back months."
- Other agencies that have used this service include Immigration and Customs Enforcement, Customs and Border Protection, the Coast Guard, and the Drug Enforcement Administration.
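Locate X's internals are proprietary and undisclosed, but the "digital fence" described in the documents is, at its core, a spatial filter over commercially collected location pings: flag every device seen inside a geographic boundary, then pull that device's entire movement history. A minimal sketch of that general idea in Python, using invented ping data (the data format, function names, and radius-based fence are all assumptions for illustration):

```python
import math
from collections import defaultdict

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def devices_in_fence(pings, center, radius_km):
    """IDs of devices with at least one ping inside the circular fence."""
    lat0, lon0 = center
    return {d for d, lat, lon, _ in pings
            if haversine_km(lat0, lon0, lat, lon) <= radius_km}

def history_for(pings, device_ids):
    """Every ping (anywhere, any time) for the flagged devices."""
    hist = defaultdict(list)
    for d, lat, lon, t in pings:
        if d in device_ids:
            hist[d].append((t, lat, lon))
    return dict(hist)

# Invented pings: (device_id, lat, lon, timestamp)
pings = [
    ("A", 40.7128, -74.0060, "2020-01-01T10:00"),   # inside the fence
    ("A", 40.7306, -73.9352, "2020-01-02T09:00"),   # elsewhere, next day
    ("B", 34.0522, -118.2437, "2020-01-01T10:05"),  # never near the fence
]
flagged = devices_in_fence(pings, center=(40.7128, -74.0060), radius_km=1.0)
print(flagged)  # {'A'}
print(history_for(pings, flagged))  # device A's full trail, not just the fenced ping
```

The key point the sketch makes concrete: one ping inside the fence is enough to expose a device's travel history going back months, which is exactly what privacy advocates object to.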
Young players walk through the city centre of Hanover while holding their smartphones and playing "Pokemon Go" on July 15, 2016 in Hanover, Germany.
Photo by Alexander Koerner/Getty Images<p>The Secret Service is not Babel Street's only client, either. Others <a href="https://www.inputmag.com/tech/cbp-ice-the-secret-service-are-reportedly-locate-x-to-track-people-through-their-apps" target="_blank">include</a> Immigration and Customs Enforcement, Customs and Border Protection, the Coast Guard, and the Drug Enforcement Administration.</p><p>The Secret Service has <a href="https://www.protocol.com/government-buying-location-data" target="_blank">apparently used</a> Locate X to identify credit card skimming thieves. <a href="https://www.protocol.com/government-buying-location-data" target="_blank" rel="noopener noreferrer dofollow">According to</a> former employees of location-based companies that provide data to Babel Street, "the sale of personal location data from commercial firms to the government is more widespread and has been going on longer than previously known."</p><p>In 1985, Neil Postman wrote about Orwell's "1984" prophecy coming and going. In "Amusing Ourselves to Death," the cultural critic noted that Orwell got a lot right, but with "Brave New World," Aldous Huxley may have been the real winner when it comes to understanding the future. </p><p style="margin-left: 20px;">"What Orwell feared were those who would ban books. What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one. Orwell feared those who would deprive us of information. Huxley feared those who would give us so much that we would be reduced to passivity and egoism. Orwell feared that the truth would be concealed from us. Huxley feared the truth would be drowned in a sea of irrelevance. Orwell feared we would become a captive culture. Huxley feared we would become a trivial culture."</p><p>There's no need for a secret government plot to microchip us. Tech companies sold chips that we willingly bought; governments are simply purchasing the rights to locate them. 
That's not a conspiracy. We're staring straight into the truth every single day. </p><p>--</p><p><em>Stay in touch with Derek on <a href="http://www.twitter.com/derekberes" target="_blank">Twitter</a>, <a href="https://www.facebook.com/DerekBeresdotcom" target="_blank" rel="noopener noreferrer dofollow">Facebook</a> and <a href="https://derekberes.substack.com/" target="_blank" rel="noopener noreferrer dofollow">Substack</a>. His next book is</em> "<em>Hero's Dose: The Case For Psychedelics in Ritual and Therapy."</em></p>
A new study explores how wearing a face mask affects the error rates of popular facial recognition algorithms.
- The study measured the error rates of 89 commercial facial recognition technologies as they attempted to match photos of people with and without masks.
- Wearing a mask raised the algorithms' error rates to between 5 and 50 percent.
- The researchers said they expect facial recognition technology to get better at recognizing people wearing masks. But it's not clear that that's what Americans want.
NIST digitally applied mask shapes to photos and tested the performance of face recognition algorithms developed before COVID appeared. Because real-world masks differ, the team came up with variants that included differences in shape, color and nose coverage.
Credit: B. Hayes/NIST<p>But not all masks thwarted the software equally. For example, black masks led to higher error rates than blue masks (though the researchers said they weren't able to completely explore how color affected the software). Error rates were also higher when people wore wide masks (as opposed to rounder ones) that covered most of the nose.</p><p style="margin-left: 20px;">"With the arrival of the pandemic, we need to understand how face recognition technology deals with masked faces," said Mei Ngan, a NIST computer scientist and an author of the report. "We have begun by focusing on how an algorithm developed before the pandemic might be affected by subjects wearing face masks. Later this summer, we plan to test the accuracy of algorithms that were intentionally developed with masked faces in mind."</p><p>The researchers said they expect facial-recognition software will get better at recognizing people wearing masks.</p><p style="margin-left: 20px;">"But the data we've taken so far underscores one of the ideas common to previous FRVT tests: Individual algorithms perform differently," Ngan said.</p>
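The headline metric in NIST's FRVT reports is the false non-match rate (FNMR): the fraction of genuine, same-person comparisons that an algorithm wrongly rejects. A toy illustration of how that number is computed, with invented similarity scores and an invented decision threshold (real algorithms, scores, and thresholds differ):

```python
def fnmr(genuine_scores, threshold):
    """False non-match rate: fraction of genuine (same-person)
    comparisons whose similarity score falls below the threshold."""
    misses = sum(1 for s in genuine_scores if s < threshold)
    return misses / len(genuine_scores)

# Invented similarity scores for same-person photo pairs,
# comparing unmasked probes vs. digitally masked probes.
unmasked = [0.91, 0.88, 0.95, 0.90, 0.87, 0.93, 0.89, 0.92, 0.94, 0.86]
masked   = [0.72, 0.55, 0.81, 0.49, 0.77, 0.60, 0.83, 0.52, 0.70, 0.66]

threshold = 0.65  # decision threshold tuned for unmasked faces
print(fnmr(unmasked, threshold))  # 0.0 -> every genuine pair matches
print(fnmr(masked, threshold))    # 0.4 -> masked faces are rejected far more often
```

The sketch shows why masks hurt: occluding the nose and mouth drags genuine-pair scores toward the threshold, so a cutoff calibrated on full faces starts rejecting real matches.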
American opinion on facial recognition<p>But do Americans even want better facial recognition technology? The answer depends on who's deploying the software. A <a href="https://www.pewresearch.org/internet/2019/09/05/more-than-half-of-u-s-adults-trust-law-enforcement-to-use-facial-recognition-responsibly/" target="_blank">2019 survey from Pew Research Center</a> found that 56 percent of Americans would trust law enforcement to use facial recognition technology responsibly, while 59 percent said it's acceptable for officials to use the software to monitor public spaces for threats.</p><p>Americans are more wary of trusting the private sector with facial recognition. For example, 36 percent of respondents said they'd trust technology companies to use the software responsibly, while only 16 percent said they'd trust advertisers to do the same.</p>
(Photo by Steffi Loos/Getty Images)<p>No matter how Americans feel about facial recognition, it's probably here to stay. After all, the FBI already has a database of more than <a href="https://nymag.com/intelligencer/2019/11/the-future-of-facial-recognition-in-america.html" target="_blank">641 million facial images</a>, many of which simply come from publicly accessible social media posts. And even though cities like San Francisco have banned the technology, police across the country are using it with increasing frequency.</p><p>Georgetown Law School's Center on Privacy and Technology <a href="https://www.perpetuallineup.org/findings/deployment" target="_blank">estimates</a> that "more than one in four of all American state and local law enforcement agencies can run face recognition searches of their own databases, run those searches on another agency's face recognition system, or have the option to access such a system."</p>
Innovative uses of blockchain tech, data trusts, algorithm assessments, and cultural shifts abound.
- A study published last year by the Pew Research Center found that most Americans distrust the federal government, and there's plenty of evidence to suggest that the situation has yet to improve.
- Governments have more access than ever to our private information, which creates an inherent tension between using data for the public good and respecting citizens' privacy rights.
- As emerging technologies mature, it will become clearer to the public which models most effectively help governments deliver the transparency they've committed to.