Harvard's Cass Sunstein: Algorithms can correct human biases
A tool that can slowly build a better world.
- Algorithms help drive the modern world.
- Algorithms reflect human biases, but some — as Harvard's Cass Sunstein notes — can be built to help correct our biases.
- If you build the right algorithm, you might help build a better world.
Algorithms are part of the engine that drives the modern world. When you search for something on Google, you're relying on a search engine defined by a specific algorithm. When you see what you see on your news feed on Facebook, you're not looking at something that comes to you naturally; you're looking at something defined by a specific algorithm.
There has been pushback recently against the idea that algorithms make our world easier (a claim central to the way algorithms are usually discussed). Some of the pushback is philosophical. Some of it comes from an immediately practical place.
The practical pushback takes the form of an article appearing in October of this year noting that Amazon scrapped an AI recruiting tool it was using after finding it was biased against women. Another article, from ProPublica, noted that an algorithm used to predict whether a criminal defendant in the United States was liable to re-offend was racially biased.
Part of the reason some algorithms run into trouble is that there are numerous mathematical ways to define the concept of 'fair', and not every system is built with enough flexibility to account for all the different ways in which 'fairness' can be defined. Consider how one system assessed potential child abuse in the Pittsburgh region, as flagged in an article in Nature: "And, for reasons that are still not clear, white children that the algorithm scored as at highest risk of maltreatment were less likely to be removed from their homes than were black children given the highest risk scores."
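The tension between competing fairness definitions is easy to demonstrate. Below is a minimal sketch, using entirely invented data, showing that one common definition (demographic parity: equal rates of positive decisions across groups) can be satisfied while another (equal opportunity: equal true-positive rates) is violated by the very same set of decisions.

```python
# Hypothetical decision records: (group, predicted_positive, actual_positive).
# All data here is invented for illustration.
decisions = [
    ("A", 1, 1), ("A", 1, 1), ("A", 0, 0), ("A", 0, 0),
    ("B", 1, 1), ("B", 1, 0), ("B", 0, 1), ("B", 0, 1),
]

def positive_rate(group):
    """Demographic parity: share of the group receiving a positive decision."""
    rows = [r for r in decisions if r[0] == group]
    return sum(r[1] for r in rows) / len(rows)

def true_positive_rate(group):
    """Equal opportunity: share of actual positives correctly flagged."""
    rows = [r for r in decisions if r[0] == group and r[2] == 1]
    return sum(r[1] for r in rows) / len(rows)

for g in ("A", "B"):
    print(g, positive_rate(g), true_positive_rate(g))
```

Here both groups receive positive decisions at the same rate (0.5), so demographic parity holds, yet group A's actual positives are always caught while group B's are caught only a third of the time. A system built around one definition can look fair by that metric and unfair by another.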
But that doesn't mean there isn't anything positive at work amid all this — there is, and Harvard Kennedy School professor Cass Sunstein recently released a paper to that effect, arguing that "algorithms can overcome the harmful effects of cognitive biases." That said, it's worth noting a strange moment in the paper, where Sunstein writes that "... the decisions of human judges, with respect to bail decisions, show neither disparate treatment nor disparate impact. As far as I am aware, there is no proof of either." There is apparent evidence to the contrary, in an article published in the Quarterly Journal of Economics, whose authors note that "estimates from Miami and Philadelphia show that bail judges are racially biased against black defendants, with substantially more racial bias among both inexperienced and part-time judges."
The Quarterly Journal of Economics paper complicates the assertion that an algorithm would simply be making an already race-blind decision-making process ('no proof of either') more efficient on the basis of predicted criminality. It likewise complicates the notion that an algorithm more or less reproduces what a judge considering bail produces, only a bit more efficiently, with less crime and the like.
But this doesn't necessarily undercut what Sunstein points out about the particular algorithm described in a paper put out under the auspices of the National Bureau of Economic Research:
1. "Use of the algorithm could maintain the same detention rate now produced by human judges and reduce crime by up to 24.7 percent. Alternatively, use of the algorithm could maintain the current level of crime reduction and reduce jail rates by as much as 41.9 percent ... thousands of people could be released, pending trial, without adding to the crime rate."
2. " … judges release 48.5 percent of the defendants judged by the algorithm to fall in the riskiest 1 percent. Those defendants fail to re-appear in court 56.3 percent of the time. They are rearrested at a rate of 62.7 percent. Judges show leniency to a population that is likely to commit crimes," treating "high-risk defendants as if they are low-risk when their current charge is relatively minor" while treating "low-risk people as if they are high-risk when their current charge is especially serious."
3. "If the algorithm is instructed to produce the same crime rate that judges currently achieve, it will jail 40.8 percent fewer African-Americans and 44.6 percent fewer Hispanics. It does this because it detains many fewer people, focused as it is on the riskiest defendants."
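The pattern in point 2, judges weighting charge severity over predicted risk, can be illustrated with a toy simulation. All numbers below are invented; this is a sketch of the ranking logic at a fixed detention rate, not the NBER paper's actual model.

```python
# Hypothetical defendants: (predicted_risk, charge_is_serious).
# Every value here is invented for illustration.
defendants = [
    (0.9, False), (0.8, False), (0.6, True),
    (0.3, True),  (0.2, True),  (0.1, False),
]

DETAIN = 3  # hold the detention rate fixed: 3 of 6 are detained

# "Judge" heuristic: detain serious-charge defendants first
by_charge = sorted(defendants, key=lambda d: d[1], reverse=True)
# Algorithm: detain the highest predicted risks first
by_risk = sorted(defendants, key=lambda d: d[0], reverse=True)

def expected_crime(ranking):
    """Sum of predicted risk among those released (past the cutoff)."""
    return sum(risk for risk, _ in ranking[DETAIN:])

print(expected_crime(by_charge))  # charge-based ranking releases high-risk people
print(expected_crime(by_risk))    # risk-based ranking releases low-risk people
```

With the same number of people detained, the charge-based ranking releases the two highest-risk defendants (expected crime 1.8 in these invented units) while the risk-based ranking releases only the lowest-risk ones (0.6). That is the mechanism behind the paper's claim that crime can fall without detaining more people.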
These are seemingly clear results in support of the thesis: there is a bias that can be corrected to achieve a better result. Even with the complicating results found among inexperienced and part-time judges in Miami and Philadelphia, we can see that some judges interpret 'noise' as signal. An algorithm can help provide a necessary measure of clarity, and, potentially, justice.