The attack on the Capitol forces us to confront an existential question about privacy.
- The insurrection attempt at the Capitol was captured by thousands of cell phones and security cameras.
- Many protesters have been arrested after their identities were reported to the FBI.
- Surveillance experts warn about the dangers of using facial recognition to monitor protests.
Brad Templeton: Today's Surveillance Society is Beyond Orwellian<span style="display:block;position:relative;padding-top:56.25%;" class="rm-shortcode" data-rm-shortcode-id="06eb4a2ab19b644f5a3c0bf35ac2f42b"><iframe type="lazy-iframe" data-runner-src="https://www.youtube.com/embed/awFrWxfDA30?rel=0" width="100%" height="auto" frameborder="0" scrolling="no" style="position:absolute;top:0;left:0;width:100%;height:100%;"></iframe></span><p>There's the <a href="https://www.thecut.com/2021/01/capitol-rioter-larry-rendall-brock-identified-to-fbi-by-ex.html" target="_blank">ex-wife of a retired Air Force lieutenant colonel</a> whose neck gaiter was pulled down; the <a href="https://www.inquirer.com/news/nation-world/capitol-insurrectionists-losing-jobs-social-media-identification-20210108.html" target="_blank">patriotic cohort of Internet detectives</a> crowd-sourcing information for the FBI; the director of the infamous pseudoscience film, "Plandemic," <a href="https://conspirituality.net/transmissions/plandemics-mikki-willis-joins-praises-violent-capitol-mob/" target="_blank">praising the "patriots" that breached the building</a> moments after he left the siege himself; and that unemployed actor who regularly attended QAnon events leaving the most public trail imaginable, and who is <a href="https://www.bbc.com/news/world-us-canada-55606044" target="_blank">currently in custody</a> facing serious charges.</p><p>Fish in barrels, all of them. What of the remaining thousands? </p><p>This privacy discussion is not new. Arthur Holland Michel, founder and co-director of the Center for the Study of the Drone at Bard College, <a href="https://bigthink.com/technology-innovation/gorgon-stare-surveillance" target="_self">warned Big Think in 2019</a> about the dangers of surveillance technology—specifically, in this case, a camera known as Gorgon Stare. </p><p style="margin-left: 20px;">"Say there is a big public protest. 
With this camera, you can follow thousands of protesters back to their homes. Now you have a list of the home addresses of all the people involved in a political movement. If on their way home you witness them committing some crime—breaking a traffic regulation or frequenting a location that is known to be involved in the drug trade—you can use that surveillance data against them to essentially shut them up. That's why we have laws that prevent the use of surveillance technologies because it is human instinct to abuse them. That's why we need controls."</p><p>Late last year, University of Miami students <a href="https://bigthink.com/technology-innovation/facial-recognition-software" target="_self">pushed back against school administrators</a> using facial recognition software for potentially insidious means—a protest not limited to that campus. Can you equate students refusing to attend classes during a pandemic with armed insurrectionists attempting to overturn the results of a democratic election? Not even close. More to the point, however, we should leave political leanings out of the equation when deciding who we think should be monitored.</p>
Protesters enter the U.S. Capitol Building on January 06, 2021 in Washington, DC. Congress held a joint session today to ratify President-elect Joe Biden's 306-232 Electoral College win over President Donald Trump.
Credit: Win McNamee/Getty Images<p>Shortly after the siege, the New Yorker's Ronan Farrow <a href="https://www.newyorker.com/news/news-desk/an-air-force-combat-veteran-breached-the-senate" target="_blank">helped reveal the identity</a> of the aforementioned lieutenant colonel while conservatives <a href="https://www.washingtonpost.com/nation/2021/01/07/antifa-capitol-gaetz-trump-riot/" target="_blank">claimed the riots were actually antifa</a>—a conspiracy theory that's <a href="https://apnews.com/article/virus-outbreak-race-and-ethnicity-suburbs-health-racial-injustice-7edf9027af1878283f3818d96c54f748" target="_blank">been peddled before</a>. Politics simply can't be avoided in this age. Still, Albert Fox Cahn, founder of the Surveillance Technology Oversight Project, doesn't believe the insurrection attempt <a href="https://www.wired.com/story/opinion-the-capitol-attack-doesnt-justify-expanding-surveillance/" target="_blank">justifies an uptick in facial recognition technology</a>.</p><p style="margin-left: 20px;">"We don't need a cutting-edge surveillance dragnet to find the perpetrators of this attack: They tracked themselves. They livestreamed their felonies from the halls of Congress, recording each crime in full HD. We don't need facial recognition, geofences, and cell tower data to find those responsible, we need police officers willing to do their job."</p><p>The New Orleans City Council recently <a href="https://thelensnola.org/2020/12/18/new-orleans-city-council-approves-ban-on-facial-recognition-predictive-policing-and-other-surveillance-tech/" target="_blank">banned similar surveillance technologies</a> due to fears that they would unfairly target minorities. San Francisco was the <a href="https://www.nytimes.com/2019/05/14/us/facial-recognition-ban-san-francisco.html" target="_blank" rel="noopener noreferrer">first city to outright ban facial recognition</a> nearly two years ago.
Cahn's point is that the FBI shouldn't be using AI to cover for the government's failure to protect the Capitol. Besides, the insurrectionists outed themselves on their own social media feeds. </p><p>Once Pandora's box cracks open, it's hard to close the lid again. Naomi Klein detailed the corporate takeover of New Orleans after Hurricane Katrina in "The Shock Doctrine." Real estate brokers, charter school companies, and government agencies didn't cause the flood, but they certainly profited from it. The fear is that companies like Clearview AI, which saw a <a href="https://www.nytimes.com/2021/01/09/technology/facial-recognition-clearview-capitol.html" target="_blank">26 percent spike in usage of its facial recognition service</a> following the attack, will be incentivized to expand, and that police departments will deploy such technology however they choose.</p><p>Cahn comes to a similar conclusion: don't expose American citizens to the "anti-democratic technology" known as facial recognition. New Yorkers had to endure subway backpack checks for nearly a decade after 9/11; this slope is even slipperier. </p>As the US braces for <a href="https://www.cnn.com/2021/01/11/politics/fbi-bulletin-armed-protests-state-us-capitol/index.html" target="_blank" rel="noopener noreferrer">further "armed protests"</a> in all 50 states over the coming week, phones need to keep capturing footage. Bystanders need to remain safe, of course. But if last week was any indication, the insurrectionists have difficulty distinguishing social media from real life. Their feeds should reveal enough.<p>--</p><p><em>Stay in touch with Derek on <a href="http://www.twitter.com/derekberes" target="_blank">Twitter</a> and <a href="https://www.facebook.com/DerekBeresdotcom" target="_blank" rel="noopener noreferrer">Facebook</a>. 
His most recent book is</em> "<em><a href="https://www.amazon.com/gp/product/B08KRVMP2M?pf_rd_r=MDJW43337675SZ0X00FH&pf_rd_p=edaba0ee-c2fe-4124-9f5d-b31d6b1bfbee" target="_blank" rel="noopener noreferrer">Hero's Dose: The Case For Psychedelics in Ritual and Therapy</a>."</em></p>
A heated debate is occurring at the University of Miami.
- Students say they were identified with facial recognition technology after a protest at the University of Miami; campus police claim this isn't true.
- Over 60 universities nationwide have banned facial recognition; a few colleges, such as USC, regularly use it.
- Civil rights groups have called on the University of Miami to hold an open forum on the issue.
Arthur Holland Michel: The Future of Surveillance Technology<span style="display:block;position:relative;padding-top:56.25%;" class="rm-shortcode" data-rm-shortcode-id="8c330ab8c4df396f5313be796c0d96da"><iframe type="lazy-iframe" data-runner-src="https://www.youtube.com/embed/hIC-kaYcq34?rel=0" width="100%" height="auto" frameborder="0" scrolling="no" style="position:absolute;top:0;left:0;width:100%;height:100%;"></iframe></span><p>Americans don't always agree with that assessment, especially on college campuses. Over 60 universities—Harvard, MIT, and UCLA are on the list—have banned facial recognition. Of the few schools that use it, USC lets students enter their rooms via face scans; the software also ensures intruders cannot access buildings.</p><p>These are great uses of this technology. You could argue it's how any progress with our devices should work: in service of people. The problem, of course, is that those in power don't tend to stop when they have a little taste of the possibilities.</p><p>The University of Miami is the <a href="https://www.forbes.com/sites/rachelsandler/2020/10/27/human-rights-groups-call-on-the-university-of-miami-to-ban-facial-recognition/#a11c8bf2965a" target="_blank">latest school</a> to be embroiled in a battle over facial recognition. The ACLU of Florida, joined by 21 other groups, requested that the university hold an open forum so that students can express their concerns. A piece of their letter is below. </p><p>The call to action followed a September incident in which students <a href="https://www.miaminewtimes.com/news/university-of-miami-tracked-protesters-with-video-surveillance-11712139" target="_blank">protested</a> returning for in-person classes during the pandemic. The students, concerned about their health, predominantly wore face masks. Still, a number of them were identified, leading to concerns that facial recognition was used. 
Campus police denied it—the chief even claimed the tech "doesn't work," though that notion <a href="https://www.cnn.com/2020/08/12/tech/face-recognition-masks/index.html" target="_blank">has been refuted</a>—yet civil liberties groups are worried that an invasion of privacy occurred.</p><p>Lia Holland, a member of the digital rights nonprofit <a href="https://www.fightforthefuture.org/news/2020-10-27-20-human-rights-organizations-call-on-university-of-miami-to-ban-facial-recognition-and-meet-f6f2119fd41b/" target="_blank">Fight for the Future</a>, wants answers from school administrators. </p><p style="margin-left: 20px;">"UMiami is struggling to answer to their creepy surveillance practices, and clarify whether they are using their own facial recognition system, or Florida's state facial recognition database."</p>
Credit: Pixel Shot / Adobe Stock<p>The police chief in question, David Rivero, claims overhead surveillance cameras provided identification at the protest. Yet speaking of another case involving facial-recognition software, he's <a href="https://www.miaminewtimes.com/news/university-of-miami-tracked-protesters-with-video-surveillance-11712139" target="_blank">on the record stating</a>, "We were able to [easily] identify and arrest him. We've [detected] a few bad guys that way."</p><p>The letter sent to the Board of Administrators <a href="https://www.fightforthefuture.org/news/2020-10-27-20-human-rights-organizations-call-on-university-of-miami-to-ban-facial-recognition-and-meet-f6f2119fd41b/" target="_blank">includes the following demands</a>: </p><ol><li>Issue a campus-wide policy banning non-personal use of facial recognition technology, and issue a statement that you have done so.</li><li>Immediately schedule an open forum with students and faculty/staff to discuss community concerns and clarify how student activists who participated in First Amendment protected protest activities were identified by campus police.</li><li>Immediately schedule a meeting with the UMiami Employee Student Alliance (UMESA) to address their COVID-19 safety concerns, the subject of the original protest.</li></ol><p>There's no doubt facial-recognition technology has a place in law enforcement. Victims of unsolved crimes are relieved when the perpetrators are brought to justice, regardless of the means. As Michel writes, some police forces are already surveilling large regions of their districts using the Gorgon Stare, a camera system flown on airplanes. Cameras are ubiquitous, and that's not going to change. </p>As a society, we need honest discussions regarding the application of surveillance. 
Nearly every citizen in China has <a href="https://www.cnet.com/news/in-china-facial-recognition-public-shaming-and-control-go-hand-in-hand/" target="_blank">already been logged</a> by facial recognition software, which has led to human rights abuses. The stated intentions of American police may be pure, but we all know which road is paved with good intentions.
A new interactive documentary "How Normal Am I?" helps reveal the shortcomings of facial recognition technology.
- The website is part of SHERPA, a European Union-funded "project which analyses how AI and big data analytics impact ethics and human rights."
- The interactive documentary uses your webcam to analyze your face, predicting metrics like age, attractiveness, gender, body mass index and life expectancy.
- Despite the shortcomings of facial recognition, there's currently no set of national laws regulating the use of the technology by governments or private companies.
An interactive facial recognition experience<p>Want to see for yourself how well these systems work? Check out a new interactive mini-documentary called <a href="https://www.hownormalami.eu/" target="_blank">"How Normal Am I?"</a>, created by Tijmen Schep, a technology critic and privacy designer. </p><p>The documentary is part of <a href="https://www.project-sherpa.eu/" target="_blank">SHERPA</a>, a European Union-funded "project which analyses how AI and big data analytics impact ethics and human rights." To experience it, you'll need to grant the website permission to access your webcam, though "no personal data is collected." (You can also access the website and then disconnect your computer from the internet; it should still work fine.)<br></p>
hownormalami.eu<p>"How Normal Am I?" uses facial recognition to predict your age, attractiveness, <a href="https://www.sherpapieces.eu/overview/predicting-your-bmi-from-a-just-photo-a-github-safari" target="_blank">body mass index</a>, life expectancy and gender. Don't get upset if you get a low attractiveness or a high age score: Tilting your head, moving closer to the camera, or just running the program a second time can produce different results.<br></p><p>And that's sort of the point: If facial recognition technology is unreliable on a broad range of measures, to what extent should governments and the private sector be using it? Even if it does become reliable, to what extent should governments be allowed to use it on citizens?</p>
The future of facial recognition technology<p>In a <a href="https://www.pewresearch.org/internet/2019/09/05/more-than-half-of-u-s-adults-trust-law-enforcement-to-use-facial-recognition-responsibly/" target="_blank">2019 Pew Research Center survey</a>, a majority of U.S. respondents said it's acceptable for law enforcement agencies to use facial recognition to scan for threats in public spaces. However, far fewer said it's acceptable for advertisers to use facial recognition to do things like analyze how people respond to commercials in real time.<br></p><p>What could change how facial recognition operates in the U.S. is a set of national laws, which currently don't exist. (Some states and <a href="https://bigthink.com/politics-current-affairs/facial-recognition" target="_self">cities do regulate the technology</a>, however.) More than a <a href="https://www.cnn.com/2020/06/13/tech/facial-recognition-policy/index.html" target="_blank">dozen pending bills</a> address facial recognition technology, ranging from legislation that would outlaw warrantless usage of facial recognition, to banning federal agencies from <a href="https://www.congress.gov/bill/116th-congress/house-bill/3875/text?q=%7B%22search%22%3A%5B%22%5C%22facial+recognition%5C%22+-uyghur%22%5D%7D&r=2&s=8" target="_blank" rel="noopener noreferrer">using it altogether.</a></p>
The system is basically facial recognition technology, but for cars.
- Some police departments use automatic license plate readers to track suspects.
- A company called Flock Safety is now allowing police departments to opt in to a national network called TALON, which shares data on car movements.
- Privacy advocates are concerned about the potential for errors and abuse.
Map tracking the car movements of a murder suspect in Alabama.
Flock Safety<p>Flock Safety says its cameras help police solve more crimes. The company <a href="https://www.flocksafety.com/flock-safety-resources" target="_blank" rel="dofollow">website</a> notes that "70% of crime involves a vehicle" and law enforcement agencies say "a license plate is the best piece of evidence to track leads and solve crimes."</p><p>But critics of Flock Safety have raised concerns over the potential for errors and abuse. In August, for example, <a href="https://gizmodo.com/cops-terrorize-black-family-but-blame-license-plate-rea-1844602731" target="_blank">police in Colorado held a family at gunpoint</a> after a license plate reader flagged a car as stolen. It turned out to be the wrong vehicle.</p><p>With TALON, police would also have unprecedented information about the movements of citizens. It's not hard to see how this data could be abused. Think, for example, of the Florida police officer who used the Driver and Vehicle Information Database (D.A.V.I.D.) to get women's contact information so he could <a href="https://www.mercurynews.com/2019/03/11/police-in-florida-allege-officer-used-database-to-gets-dates/" target="_blank" rel="noopener noreferrer dofollow">ask them out on dates</a>.</p>
Flock Safety<p>It's currently unclear how many police departments plan to join TALON. But like the advent of facial recognition technologies, the spread of automatic license plate reader technology highlights how mass surveillance isn't always driven by the state.</p><p style="margin-left: 20px;">"We often think of dystopian surveillance as something that's imposed by an authoritarian government," Evan Greer, deputy director of the digital rights group Fight for the Future, told <a href="https://www.cnet.com/news/license-plate-tracking-for-police-set-to-go-nationwide/?utm_source=reddit.com" target="_blank">CNET</a>. "It's clearer every day that there is an enormous threat posed by privately owned and managed surveillance regimes, which will be weaponized by the rich and powerful to protect not just their wealth but the exploitative system that helped them amass it."</p>
New documents confirm that the government agency—one of many—has been using a tracking company.
- Documents reveal that the Secret Service used Locate X as part of a social media tracking package.
- The service "allows investigators to draw a digital fence around an address or area, pinpoint mobile devices that were within that area, and see where else those devices have traveled, going back months."
- Other agencies that have used this service include Immigration and Customs Enforcement, Customs and Border Protection, the Coast Guard, and the Drug Enforcement Administration.
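The "digital fence" described above reduces to a simple spatial query: given a log of device location pings, return every device seen inside an area during a time window. Here is a minimal sketch of that core idea in Python; the device IDs, coordinates, and timestamps are hypothetical, and a circular fence with a haversine distance check stands in for whatever geometry a real product uses.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0


@dataclass(frozen=True)
class Ping:
    device_id: str
    lat: float
    lon: float
    timestamp: int  # Unix seconds


def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))


def devices_in_fence(pings, center_lat, center_lon, radius_km, start_ts, end_ts):
    """IDs of devices with at least one ping inside the circular 'fence'
    during the time window -- the core of a geofence query."""
    return {
        p.device_id
        for p in pings
        if start_ts <= p.timestamp <= end_ts
        and haversine_km(p.lat, p.lon, center_lat, center_lon) <= radius_km
    }


# Hypothetical pings near the U.S. Capitol (38.8899 N, 77.0091 W).
pings = [
    Ping("phone-a", 38.8901, -77.0094, 1_609_959_600),  # inside the fence, in window
    Ping("phone-b", 38.9500, -77.0500, 1_609_959_600),  # several km away
    Ping("phone-a", 38.9072, -77.0369, 1_609_970_000),  # same device, after the window
]

inside = devices_in_fence(pings, 38.8899, -77.0091, 0.5, 1_609_950_000, 1_609_965_000)
print(inside)  # {'phone-a'}
```

The second capability in the quote, seeing "where else those devices have traveled," is then just filtering the full ping log by the returned IDs, which is what makes a months-long commercial location dataset so revealing.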
Young players walk through the city centre of Hanover while holding their smartphones and playing "Pokemon Go" on July 15, 2016 in Hanover, Germany.
Photo by Alexander Koerner/Getty Images<p>Besides, the Secret Service is not Babel Street's only client. Others <a href="https://www.inputmag.com/tech/cbp-ice-the-secret-service-are-reportedly-locate-x-to-track-people-through-their-apps" target="_blank">include</a> Immigration and Customs Enforcement, Customs and Border Protection, the Coast Guard, and the Drug Enforcement Administration.</p><p>The Secret Service has <a href="https://www.protocol.com/government-buying-location-data" target="_blank">apparently used</a> Locate X to identify credit card skimming thieves. <a href="https://www.protocol.com/government-buying-location-data" target="_blank" rel="noopener noreferrer dofollow">According to</a> former employees of location-based companies that provide data to Babel Street, "the sale of personal location data from commercial firms to the government is more widespread and has been going on longer than previously known."</p><p>In 1985, Neil Postman wrote about Orwell's "1984" prophecy coming and going. In "Amusing Ourselves to Death," the cultural critic noted that Orwell got a lot right, but with "Brave New World," Aldous Huxley may have been the real winner when it comes to understanding the future. </p><p style="margin-left: 20px;">"What Orwell feared were those who would ban books. What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one. Orwell feared those who would deprive us of information. Huxley feared those who would give us so much that we would be reduced to passivity and egoism. Orwell feared that the truth would be concealed from us. Huxley feared the truth would be drowned in a sea of irrelevance. Orwell feared we would become a captive culture. Huxley feared we would become a trivial culture."</p><p>There's no need for a secret government plot to microchip us. Tech companies sold chips that we willingly bought; governments are simply purchasing the rights to locate them. 
That's not a conspiracy. We're staring straight into the truth every single day. </p><p>--</p><p><em>Stay in touch with Derek on <a href="http://www.twitter.com/derekberes" target="_blank">Twitter</a>, <a href="https://www.facebook.com/DerekBeresdotcom" target="_blank" rel="noopener noreferrer dofollow">Facebook</a> and <a href="https://derekberes.substack.com/" target="_blank" rel="noopener noreferrer dofollow">Substack</a>. His next book is</em> "<em>Hero's Dose: The Case For Psychedelics in Ritual and Therapy."</em></p>