Should law enforcement be using AI and cell phone data to find rioters?
The attack on the Capitol forces us to confront an existential question about privacy.
- The insurrection attempt at the Capitol was captured by thousands of cell phones and security cameras.
- Many protesters have been arrested after their identities were reported to the FBI.
- Surveillance experts warn about the dangers of using facial recognition to monitor protests.
Brad Templeton: Today's Surveillance Society is Beyond Orwellian
<span style="display:block;position:relative;padding-top:56.25%;" class="rm-shortcode" data-rm-shortcode-id="06eb4a2ab19b644f5a3c0bf35ac2f42b"><iframe type="lazy-iframe" data-runner-src="https://www.youtube.com/embed/awFrWxfDA30?rel=0" width="100%" height="auto" frameborder="0" scrolling="no" style="position:absolute;top:0;left:0;width:100%;height:100%;"></iframe></span><p>There's the <a href="https://www.thecut.com/2021/01/capitol-rioter-larry-rendall-brock-identified-to-fbi-by-ex.html" target="_blank">ex-wife of a retired Air Force lieutenant colonel</a> whose neck gaiter was pulled down; the <a href="https://www.inquirer.com/news/nation-world/capitol-insurrectionists-losing-jobs-social-media-identification-20210108.html" target="_blank">patriotic cohort of Internet detectives</a> crowd-sourcing information for the FBI; the director of the infamous pseudoscience film, "Plandemic," <a href="https://conspirituality.net/transmissions/plandemics-mikki-willis-joins-praises-violent-capitol-mob/" target="_blank">praising the "patriots" that breached the building</a> moments after he left the siege himself; and that unemployed actor who regularly attended QAnon events leaving the most public trail imaginable, and who is <a href="https://www.bbc.com/news/world-us-canada-55606044" target="_blank">currently in custody</a> facing serious charges.</p><p>Fish in barrels, all of them. What of the remaining thousands? </p><p>This privacy discussion is not new. Arthur Holland Michel, founder and co-director of the Center for the Study of the Drone at Bard College, <a href="https://bigthink.com/technology-innovation/gorgon-stare-surveillance" target="_self">warned Big Think in 2019</a> about the dangers of surveillance technology—specifically, in this case, a camera known as Gorgon Stare. </p><p style="margin-left: 20px;">"Say there is a big public protest. With this camera, you can follow thousands of protesters back to their homes. Now you have a list of the home addresses of all the people involved in a political movement. If on their way home you witness them committing some crime—breaking a traffic regulation or frequenting a location that is known to be involved in the drug trade—you can use that surveillance data against them to essentially shut them up. That's why we have laws that prevent the use of surveillance technologies because it is human instinct to abuse them. That's why we need controls."</p><p>Late last year, University of Miami students <a href="https://bigthink.com/technology-innovation/facial-recognition-software" target="_self">pushed back against school administrators</a> using facial recognition software for potentially insidious means—a protest not limited to that campus. Can you place students refusing to attend classes during a pandemic with armed insurrectionists attempting to change the results of a democratic election? Not even close. More to the point, however, we should leave political leanings out of the equation when deciding who we think should be monitored. </p>Protesters enter the U.S. Capitol Building on January 06, 2021 in Washington, DC. Congress held a joint session today to ratify President-elect Joe Biden's 306-232 Electoral College win over President Donald Trump.
Credit: Win McNamee/Getty Images
<p>Shortly after the siege, The New Yorker's Ronan Farrow <a href="https://www.newyorker.com/news/news-desk/an-air-force-combat-veteran-breached-the-senate" target="_blank">helped reveal the identity</a> of the aforementioned lieutenant colonel while conservatives <a href="https://www.washingtonpost.com/nation/2021/01/07/antifa-capitol-gaetz-trump-riot/" target="_blank">claimed the rioters were actually antifa</a>—a conspiracy theory that's <a href="https://apnews.com/article/virus-outbreak-race-and-ethnicity-suburbs-health-racial-injustice-7edf9027af1878283f3818d96c54f748" target="_blank">been peddled before</a>. Politics simply can't be avoided in this age. Still, Albert Fox Cahn, founder of the Surveillance Technology Oversight Project, doesn't believe the insurrection attempt <a href="https://www.wired.com/story/opinion-the-capitol-attack-doesnt-justify-expanding-surveillance/" target="_blank">justifies an expansion of facial recognition technology</a>.</p>
<p style="margin-left: 20px;">"We don't need a cutting-edge surveillance dragnet to find the perpetrators of this attack: They tracked themselves. They livestreamed their felonies from the halls of Congress, recording each crime in full HD. We don't need facial recognition, geofences, and cell tower data to find those responsible, we need police officers willing to do their job."</p>
<p>The New Orleans City Council recently <a href="https://thelensnola.org/2020/12/18/new-orleans-city-council-approves-ban-on-facial-recognition-predictive-policing-and-other-surveillance-tech/" target="_blank">banned similar surveillance technologies</a> due to fears that they would unfairly target minorities. San Francisco was the <a href="https://www.nytimes.com/2019/05/14/us/facial-recognition-ban-san-francisco.html" target="_blank" rel="noopener noreferrer">first city to outright ban facial recognition</a> nearly two years ago. Cahn's point is that the FBI shouldn't be using AI to cover for the government's failure to protect the Capitol. Besides, the insurrectionists outed themselves on their own social media feeds.</p>
<p>Once Pandora's box cracks open, it's hard to force what escaped back inside. Naomi Klein detailed the corporate takeover of New Orleans after Hurricane Katrina in "The Shock Doctrine." Real estate brokers, charter school companies, and government agencies didn't cause the flood, but they certainly profited from it. The fear is that companies like Clearview AI, which saw a <a href="https://www.nytimes.com/2021/01/09/technology/facial-recognition-clearview-capitol.html" target="_blank">26 percent spike in usage of its facial recognition service</a> following the attack, will be emboldened, as will the police departments that use such technology for whatever ends they choose.</p>
<p>Cahn comes to a similar conclusion: don't expose American citizens to the "anti-democratic technology" known as facial recognition. New Yorkers had to endure subway backpack checks for nearly a decade after 9/11; this slope is even slipperier.</p>
<p>As the US braces for <a href="https://www.cnn.com/2021/01/11/politics/fbi-bulletin-armed-protests-state-us-capitol/index.html" target="_blank" rel="noopener noreferrer">further "armed protests"</a> in all 50 states over the coming week, phones need to keep capturing footage. Bystanders need to remain safe, of course. But if last week was any indication, the insurrectionists have difficulty distinguishing between social media and real life.
Their feeds should reveal enough.</p>
<p>--</p>
<p><em>Stay in touch with Derek on <a href="http://www.twitter.com/derekberes" target="_blank">Twitter</a> and <a href="https://www.facebook.com/DerekBeresdotcom" target="_blank" rel="noopener noreferrer">Facebook</a>. His most recent book is</em> "<em><a href="https://www.amazon.com/gp/product/B08KRVMP2M?pf_rd_r=MDJW43337675SZ0X00FH&pf_rd_p=edaba0ee-c2fe-4124-9f5d-b31d6b1bfbee" target="_blank" rel="noopener noreferrer">Hero's Dose: The Case For Psychedelics in Ritual and Therapy</a>."</em></p>
Scientists urge UN to add 'neuro-rights' to Universal Declaration of Human Rights
Neuroscientists and ethicists want to ensure that neurotechnologies remain benevolent.
- Columbia University neuroscience professor Rafael Yuste is advocating for the UN to adopt "neuro-rights."
- Neurotechnology is a growing field encompassing technologies that influence higher brain activity.
- Ethicists fear that these technologies will be misused, and that abuses of privacy and even of consciousness could follow.
Rafael Yuste - a neuroscientist exploring the ethics of neural identity
<span style="display:block;position:relative;padding-top:56.25%;" class="rm-shortcode" data-rm-shortcode-id="853ecb9790a2b78013033f31643b5c06"><iframe type="lazy-iframe" data-runner-src="https://www.youtube.com/embed/hh_ePuww5-c?rel=0" width="100%" height="auto" frameborder="0" scrolling="no" style="position:absolute;top:0;left:0;width:100%;height:100%;"></iframe></span><p>The ethical question is not new. For example, the debate over embryonic stem cells raged for years. Promises of trait enhancement concerned people who thought scientists would play the role of a god. While that debate has mostly died down, the use of neurotechnology by militaries and tech companies—particularly concerning privacy—will be contentious for decades. </p><p>Cognitive liberty is a term assigned to those who believe every individual must be allowed to maintain their own agency. An extension of the concept of freedom of thought, cognitive liberty is <a href="https://www.webcitation.org/666ICKKEl?url=http://www.cognitiveliberty.org/faqs/faq_general.htm" target="_blank" rel="noopener noreferrer">defined</a> as "the right of each individual to think independently and autonomously, to use the full power of his or her mind, and to engage in multiple modes of thought," as written by neuroethicist Dr. Wrye Sententia and legal theorist Richard Glen Boire. </p><p>The challenges to cognitive liberty include <em>privacy</em>, which they argue must encompass the domain of inner thought; <em>autonomy</em>, so that thought processes remain the province of the individual; and <em>choice</em>, provided that the individual is not harming others. </p><p>Yuste believes the U.N.'s declaration, which was created in the wake of World War II in 1948, needs immediate revision. Deep brain stimulation is already an FDA-approved procedure. Whereas social media creates its own addiction and mental health problems, a sense of agency still exists. When tech has the capability to get "<a href="https://www.reuters.com/article/us-global-tech-rights-idUSKBN28D3HK" target="_blank" rel="noopener noreferrer">under the skull and get at our neurons</a>," as Johns Hopkins professor of neurology and neuroscience, John Krakauer, says, a sense of urgency exists. </p><p>For Yuste it's completely a matter of agency—and liberty. </p><p style="margin-left: 20px;">"This is the first time in history that humans can have access to the contents of people's minds. We have to think very carefully about how we are going to bring this into society."</p><p>--</p><p><em>Stay in touch with Derek on <a href="http://www.twitter.com/derekberes" target="_blank">Twitter</a> and <a href="https://www.facebook.com/DerekBeresdotcom" target="_blank" rel="noopener noreferrer">Facebook</a>. His new book is</em> "<em><a href="https://www.amazon.com/gp/product/B08KRVMP2M?pf_rd_r=MDJW43337675SZ0X00FH&pf_rd_p=edaba0ee-c2fe-4124-9f5d-b31d6b1bfbee" target="_blank" rel="noopener noreferrer">Hero's Dose: The Case For Psychedelics in Ritual and Therapy</a>."</em></p>Amazon devices have colonized homes. 'Smart neighborhoods' may be next
Here's why you may want to opt out of Amazon's new shared network.
- Speaking at Web Summit 2020, David Limp, the senior vice president of Devices and Services at Amazon, suggested that the company aims to build smart devices that operate across entire neighborhoods.
- Amazon recently began rolling out Sidewalk, which aims to create a shared, intermediate-range network powered by Amazon devices.
- Sidewalk, which lets nearby devices access your Wi-Fi, raises numerous privacy and security concerns.
Amazon Sidewalk
<p>Sidewalk, which began rolling out in the U.S. in November, aims to create a shared network powered by Amazon devices. It's something like a community of Wi-Fi hotspots to which compatible devices can automatically connect.</p>
<p>Amazon Echo and Ring devices would serve as so-called Sidewalk Bridges that "share a small portion of your internet bandwidth which is pooled together to provide these services to you and your neighbors," Amazon writes.</p>
<p>Say you have a security camera in your detached garage that's not working because it's just out of your router's range. With Sidewalk, it'd be able to siphon off some of the bandwidth shared by your neighbor's devices (if they enable Sidewalk). But that also means you'd be offering up some of your own bandwidth (<a href="https://www.amazon.com/Amazon-Sidewalk/b?ie=UTF8&node=21328123011" target="_blank">500 MB monthly maximum</a>) to any compatible devices within about a half-mile.</p>
<p>Amazon frames Sidewalk as a sort of first step toward creating smart neighborhoods that utilize the so-called internet of things:</p>
<p style="margin-left: 20px;">"When your device is connected to the Sidewalk network, you can connect to Alexa and offer more experiences for customers like keeping track of the things they care about most, whether it's their puppy, their keys, or their bicycle. Later this year, Tile devices will be supported on the Sidewalk network, enabling customers to keep track of their things by asking Alexa questions like 'Alexa, where are my keys?'"</p>
Amazon
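<p>To make those mechanics concrete, here is a minimal, purely illustrative Python sketch of the policy Amazon describes above: a bridge relays traffic for a nearby device only while its owner has Sidewalk enabled, the device is within range, and the shared data stays under the 500 MB monthly cap. The class and method names are invented for illustration; this is not Amazon's software, protocol, or API.</p>

```python
# Illustrative sketch only -- not Amazon's implementation.
from dataclasses import dataclass

MONTHLY_CAP_BYTES = 500 * 1024 * 1024   # Amazon's stated 500 MB monthly maximum
MAX_RANGE_MILES = 0.5                    # coverage of roughly half a mile

@dataclass
class SidewalkBridge:
    """Toy model of an Echo or Ring device acting as a Sidewalk Bridge."""
    enabled: bool = True                 # owners can opt out of sharing entirely
    bytes_shared_this_month: int = 0

    def relay(self, packet_bytes: int, distance_miles: float) -> bool:
        """Relay a neighbor device's packet if opt-in, range, and the cap allow it."""
        if not self.enabled:
            return False                 # opted out: share nothing
        if distance_miles > MAX_RANGE_MILES:
            return False                 # device is beyond the bridge's reach
        if self.bytes_shared_this_month + packet_bytes > MONTHLY_CAP_BYTES:
            return False                 # monthly bandwidth allowance exhausted
        self.bytes_shared_this_month += packet_bytes
        return True

# Example: a neighbor's garage camera 0.3 miles away sends a 2 KB status update.
bridge = SidewalkBridge()
print(bridge.relay(packet_bytes=2048, distance_miles=0.3))   # True
bridge.enabled = False                   # the owner disables Sidewalk
print(bridge.relay(packet_bytes=2048, distance_miles=0.3))   # False
```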
<p>When Amazon first announced Sidewalk in 2019, CEO Jeff Bezos <a href="https://www.geekwire.com/2019/amazon-ceo-jeff-bezos-sees-new-opportunity-yard-sidewalk-beyond/" target="_blank">told</a> reporters the system would fill a gap in wireless connectivity:</p>
<p style="margin-left: 20px;">"It's a completely new way of thinking about intermediate-range wireless. There are a lot of things where Bluetooth is way too short-range, WiFi is way too high power, and so to have something that's still low-power, but that has much longer range is really a gap in the marketplace. [...] People don't even realize yet how important that intermediate range is going to be."</p>
<p>On the neighborhood level, this could prove convenient for people with compatible devices, as Jonathan Collins, research director at ABI Research, told <a href="https://cities-today.com/what-amazons-new-neighbourhood-network-means-for-smart-cities/" target="_blank">Cities Today</a>:</p>
<p style="margin-left: 20px;">"Amazon Sidewalk has the potential to deliver low-cost, low-data, citywide mesh connectivity. It could certainly benefit from adoption within smart-city projects. For example, integrating Sidewalk into parking meters would bring low-cost network access for the city, but it will also provide more mesh nodes to strengthen network coverage and capacity."</p>
Security concerns
<p>Sidewalk raises no shortage of red flags in terms of security and privacy.</p>
<p style="margin-left: 20px;">"The initial concern is really about what all the devices connected to Sidewalk collect," Forrester security analyst Jeff Pollard <a href="https://www.cnet.com/news/amazon-sidewalk-extends-your-network-but-security-is-already-in-question/" target="_blank">told</a> CNET. "If use cases like home automation or IoT devices make use of this technology, they generate telemetry data. Connected devices -- especially in the home -- give vast amounts of information about your behaviors and activities, which could also go to Amazon with this connectivity."</p>
Illustration from Amazon's Sidewalk security white paper
Amazon
<p>Anticipating concerns, Amazon published a <a href="https://m.media-amazon.com/images/G/01/sidewalk/privacy_security_whitepaper_final.pdf" target="_blank">white paper</a> describing how Sidewalk uses three layers of encryption, doesn't reveal users' identities, and makes "it difficult for anyone, including Amazon, to piece together [a user's] activity history over time."</p>
<p>Difficult, but not impossible. Amazon said it's automatically enabling Sidewalk on Echo and Ring devices, though users can opt out of the system. If you'd rather not serve as a guinea pig in Amazon's new shared network, here's how to disable it:</p>
Andrew Yang backs California’s data privacy campaign
"Our data should be ours no matter what platforms and apps we use," Yang said.
- In November, Californians will vote on Proposition 24, which aims to expand data privacy laws in the state.
- Proposition 24 aims to strengthen the California Consumer Privacy Act, which went into effect this year.
- However, some privacy advocates say Proposition 24 doesn't go far enough, and in some cases actually erodes the CCPA.
Critiques of Prop. 24
<p>Still, some advocates, including the ACLU of California, the Consumer Federation of California, and the Electronic Frontier Foundation (EFF), say even these additions to the CCPA don't go far enough.</p>
<p>Calling it a "mixed bag of partial steps backwards and forwards," the EFF <a href="https://www.eff.org/deeplinks/2020/07/why-eff-doesnt-support-cal-prop-24" target="_blank">said</a> it wouldn't support Proposition 24 because (to name a few reasons) it:</p>
<ul>
<li>Would expand "<a href="https://www.eff.org/deeplinks/2019/02/payoff-californias-data-dividend-must-be-stronger-privacy-laws" target="_blank" rel="noopener noreferrer">pay for privacy</a>" schemes by allowing a company to withhold discounts unless consumers in loyalty clubs allow it to harvest certain data. This could lead to a society of privacy "haves" and "have-nots," wrote the EFF.</li>
<li>Fails to establish an "opt-in" model of data collection. Under the CCPA, consumers have to opt out of collection, which places the burden on consumers to protect privacy. "Privacy should be the default," the EFF wrote.</li>
<li>Would expand the power of companies to refuse a consumer's request to delete their data.</li>
</ul>
(Photo by Scott Eisen/Getty Images)
<p>As for Yang? It's unclear what the former presidential hopeful, whose campaign was based in part on data privacy, thinks of these critiques. But in a recent interview with <a href="https://www.ksro.com/2020/09/02/interview-andrew-yang-on-prop-24/" target="_blank" rel="noopener noreferrer">KSRO</a>, Yang said the U.S. lags far behind European nations in terms of data privacy laws, and that Proposition 24 would be a huge step towards our data dignity. He added that other states beyond California would likely follow suit if the proposal passes.</p>
The Data Dividend Project
<p>Yang is also spearheading the <a href="https://www.datadividendproject.com/" target="_blank">Data Dividend Project</a>, a "movement dedicated to establishing and enforcing data property rights and to getting you compensated when companies monetize your data." The project, which operates under the laws established by the CCPA, aims to tax tech companies when they use consumer data, and to support new data privacy legislation across the country. (Some critics have <a href="https://www.vice.com/en_us/article/935358/andrew-yangs-data-dividend-isnt-radical-its-useless" target="_blank" rel="noopener noreferrer">questioned the efficacy of the project</a>.)</p>
<p>In an op-ed about his data dividend proposal published in the <a href="https://www.latimes.com/opinion/story/2020-06-23/andrew-yang-data-dividend-tech-privacy" target="_blank" rel="noopener noreferrer">Los Angeles Times</a>, Yang wrote:</p>
<p style="margin-left: 20px;">"If Congress and other states adopt legislation like the CCPA, millions more would be able to band together with even greater bargaining power to hold tech companies accountable and, ultimately, demand that they share some of the revenue generated from consumers' personal data."</p>
Police can track cars nationwide with new license plate surveillance network
The system is basically facial recognition technology, but for cars.
- Some police departments use automatic license plate readers to track suspects.
- A company called Flock Safety is now allowing police departments to opt in to a national network, which shares data on car movements.
- Privacy advocates are concerned about the potential for errors and abuse.
Map tracking the car movements of a murder suspect in Alabama.
Flock Safety
<p>Flock Safety says its cameras help police solve more crimes. The company <a href="https://www.flocksafety.com/flock-safety-resources" target="_blank" rel="dofollow">website</a> notes that "70% of crime involves a vehicle" and that law enforcement agencies say "a license plate is the best piece of evidence to track leads and solve crimes."</p>
<p>But critics of Flock Safety have raised concerns over the potential for errors and abuse. In August, for example, <a href="https://gizmodo.com/cops-terrorize-black-family-but-blame-license-plate-rea-1844602731" target="_blank">police in Colorado held a family at gunpoint</a> after a license plate reader flagged their car as stolen. It turned out to be the wrong vehicle.</p>
<p>With TALON, Flock Safety's national data-sharing network, police would also have unprecedented information about the movements of citizens. It's not hard to see how this data could be abused. Think, for example, of the Florida police officer who used the Driver and Vehicle Information Database (D.A.V.I.D.) to get women's contact information so he could <a href="https://www.mercurynews.com/2019/03/11/police-in-florida-allege-officer-used-database-to-gets-dates/" target="_blank" rel="noopener noreferrer dofollow">ask them out on dates</a>.</p>
Flock Safety
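<p>As a purely hypothetical illustration of how that kind of error can happen, the sketch below contrasts a naive hotlist check that matches only the plate characters with one that also requires the issuing state and vehicle type to agree before raising an alert. The data and function names are invented; they do not describe Flock Safety's or any agency's actual system.</p>

```python
# Hypothetical illustration of an ALPR false positive -- not any vendor's real system.

# Invented hotlist entry: a stolen motorcycle registered in another state.
HOTLIST = {
    ("ABC1234", "MT"): {"vehicle": "motorcycle", "status": "stolen"},
}

def naive_match(plate_text: str) -> bool:
    """Flags a vehicle if its plate characters appear anywhere in the hotlist."""
    return any(plate == plate_text for plate, _state in HOTLIST)

def careful_match(plate_text: str, state: str, vehicle_type: str) -> bool:
    """Flags a vehicle only if plate, issuing state, and vehicle type all agree."""
    entry = HOTLIST.get((plate_text, state))
    return entry is not None and entry["vehicle"] == vehicle_type

# A family's SUV registered in a different state happens to share the same characters.
print(naive_match("ABC1234"))                 # True  -> a wrongful stop
print(careful_match("ABC1234", "CO", "SUV"))  # False -> no alert
```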
<p>It's currently unclear how many police departments plan to join TALON. But like the advent of facial recognition technologies, the spread of automatic license plate reader technology highlights how mass surveillance isn't always driven by the state.</p>
<p style="margin-left: 20px;">"We often think of dystopian surveillance as something that's imposed by an authoritarian government," Evan Greer, deputy director of the digital rights group Fight for the Future, told <a href="https://www.cnet.com/news/license-plate-tracking-for-police-set-to-go-nationwide/" target="_blank">CNET</a>. "It's clearer every day that there is an enormous threat posed by privately owned and managed surveillance regimes, which will be weaponized by the rich and powerful to protect not just their wealth but the exploitative system that helped them amass it."</p>