When we limit the clash of ideas, we ultimately hinder progress for the entire society.
- Pluralism is the idea that different people, traditions, and beliefs not only can coexist in the same society but should, because society benefits from the vibrant workshopping of ideas.
- Cancel culture is a threat to a liberal society because it seeks to shape the available information rather than seek truth.
- As Chandran Kukathas, a professor at Singapore Management University, argues, tolerating those ideas does not mean merely putting up with them but engaging with them in an open spirit.
Machine learning is a powerful and imperfect tool that should not go unmonitored.
- When you harness the power and potential of machine learning, there are also some drastic downsides that you've got to manage.
- When you deploy machine learning, you face the risk that it will be discriminatory, biased, inequitable, exploitative, or opaque.
- In this article, I cover six ways that machine learning threatens social justice and reach an incisive conclusion: The remedy is to take on machine learning standardization as a form of social activism.
Here are six ways machine learning threatens social justice<img type="lazy-image" data-runner-src="https://assets.rebelmouse.io/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8yNDUyMDgxNC9vcmlnaW4uanBnIiwiZXhwaXJlc19hdCI6MTY0MzM0NjgxOH0.zHvEEsYGbNA-lnkq4nss7vwVkZlrKkuKf0XASf7A7Jg/img.jpg?width=980" id="05f07" class="rm-shortcode" data-rm-shortcode-id="a7089b6621166f5a2df77d975f8b9f74" data-rm-shortcode-name="rebelmouse-image" />
Credit: metamorworks via Shutterstock<p><strong>1) Blatantly discriminatory models</strong> are predictive models that base decisions partly or entirely on a protected class. Protected classes include race, religion, national origin, gender, gender identity, sexual orientation, pregnancy, and disability status. By taking one of these characteristics as an input, the model's outputs – and the decisions driven by the model – are based at least in part on membership in a protected class. Although models rarely do so directly, there is <a href="https://www.youtube.com/watch?v=eSlzy1x6Fy0" target="_blank">precedent</a> and <a href="https://www.youtube.com/watch?v=wfpNN8ASIq4" target="_blank">support</a> for doing so.</p><p>This would mean that a model could explicitly penalize, for example, black defendants for being black. So, imagine sitting across from a person being evaluated for a job, a loan, or even parole. When they ask you how the decision process works, you inform them, "For one thing, our algorithm penalized your score by seven points because you're black." This may sound shocking and sensationalistic, but I'm literally describing what the model would do, mechanically, if race were permitted as a model input. </p><p><strong>2) Machine bias</strong>. Even when protected classes are not provided as a direct model input, we find, in some cases, that model predictions are still inequitable. This is because other variables end up serving as proxies for protected classes. This is <a href="https://coursera.org/share/51350b8fb12a5937bbddc0e53a4f207d" target="_blank" rel="noopener noreferrer">a bit complicated</a>, since it turns out that models that are fair in one sense are unfair in another. 
</p><p>For example, some crime risk models succeed in flagging both black and white defendants with equal precision – each flag tells the same probabilistic story, regardless of race – and yet the models falsely flag black defendants more often than white ones. A crime-risk model called COMPAS, which is sold to law enforcement across the US, falsely flags white defendants at a rate of 23.5% and black defendants at 44.9%. In other words, black defendants who don't deserve it are <a href="https://coursera.org/share/df6e6ba7108980bb7eeae0ba22123ac1" target="_blank" rel="noopener noreferrer">erroneously flagged almost twice as often</a> as white defendants who don't deserve it.</p><p><strong>3) Inferring sensitive attributes</strong>—predicting pregnancy and beyond. Machine learning predicts sensitive information about individuals, such as sexual orientation, whether they're pregnant, whether they'll quit their job, and whether they're going to die. Researchers have shown that it is possible to <a href="https://youtu.be/aNwvXhcq9hk" target="_blank" rel="noopener noreferrer">predict race based on Facebook likes</a>. These predictive models deliver dynamite.</p><p>In a particularly extraordinary case, officials in China use facial recognition to <a href="https://www.nytimes.com/2019/04/14/technology/china-surveillance-artificial-intelligence-racial-profiling.html" target="_blank" rel="noopener noreferrer">identify and track the Uighurs, a minority ethnic group</a> systematically oppressed by the government. This is the first known case of a government using machine learning to profile by ethnicity. One Chinese start-up valued at more than $1 billion said its software could recognize "sensitive groups of people." Its website said, "If originally one Uighur lives in a neighborhood, and within 20 days six Uighurs appear, it immediately sends alarms" to law enforcement.</p>
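<p>The machine-bias tension described above – a model that is equally precise for two groups yet falsely flags one group far more often – can be sketched in a few lines of Python. The confusion-matrix counts below are hypothetical, chosen only to reproduce the pattern; they are not the actual COMPAS data.</p>

```python
def precision(tp, fp):
    """Of those flagged, what fraction were flagged correctly?"""
    return tp / (tp + fp)

def false_positive_rate(fp, tn):
    """Of those who did NOT reoffend, what fraction were flagged anyway?"""
    return fp / (fp + tn)

# Hypothetical counts per group: (true positives, false positives, true negatives)
groups = {
    "Group A": (60, 40, 100),  # flagged more often overall
    "Group B": (30, 20, 150),
}

for name, (tp, fp, tn) in groups.items():
    print(f"{name}: precision={precision(tp, fp):.2f}, "
          f"false positive rate={false_positive_rate(fp, tn):.2f}")

# Both groups have precision 0.60 -- each flag "tells the same
# probabilistic story" -- yet Group A's false positive rate (0.29)
# is more than double Group B's (0.12).
```

<p>This is why "fair" needs a precise definition: the model is calibrated by one metric (precision parity) and discriminatory by another (false-positive-rate parity), and with different base rates across groups you generally cannot satisfy both at once.</p>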
Recourse: Establish machine learning standards as a form of social activism<p>To address these problems, take on machine learning standardization as a form of social activism. We must establish standards that go beyond nice-sounding yet vague platitudes such as "be fair", "avoid bias", and "ensure accountability". Unless precisely defined, these catchphrases are subjective and do little to guide concrete action. Unfortunately, such broad language is fairly common among the principles released by many companies. In so doing, companies protect their public image more than they protect the public.<br></p><p>People involved in initiatives to deploy machine learning have a powerful, influential voice. These relatively small numbers of people mold and set the trajectory for systems that automatically dictate the rights and resources that great numbers of consumers and citizens gain access to.</p><p>Famed machine learning leader and educator Andrew Ng drove it home: "AI is a superpower that enables a small team to affect a huge number of people's lives... Make sure the work you do leaves society better off."</p><p>And Allan Sammy, Director, Data Science and Audit Analytics at Canada Post, clarified the level of responsibility: "A decision made by an organization's analytic model is a decision made by that entity's senior management team."</p><p>Implementing ethical data science is as important as ensuring a self-driving car knows when to put on the brakes.</p><p>Establishing well-formed ethical standards for machine learning will be an intensive, ongoing process. For more, <a href="https://youtu.be/ToSj0ZkJHBQ" target="_blank">watch this short video</a>, in which I provide some specifics meant to kick-start the process.</p>
A new study shows how growing up in poverty negatively impacts children neurologically.
- Children in poor neighborhoods exhibit abnormal activation of motivational circuits in their brains.
- The neurological impact increases the likelihood of criminal behavior and substance abuse later in life.
- Researchers suggest focusing on shaping the environment to set up the child for success.
We wouldn't want to live without it, so how can we create art that's durable?
- You cannot kill the arts. This is particularly true of poetry, which does well in a world of social media because its short form is easy to digest.
- Measuring success in art can be tricky, though. Impact and influence can be felt immediately, but how does art find that everlasting durability?
- Philanthropy can encourage and enable art, and as a result, potentially lengthen its lifespan. If we can find ways to measure art in its own terms, we can effectively give a platform to new voices who complete the cultural picture.
A new report calls on physics and astronomy departments to double the number of degrees awarded to black students by 2030.
- A new report calls for doubling the number of undergrad degrees awarded to black students in physics and astronomy by 2030.
- In the United States, black students earned a total of 223 bachelor's degrees in physics and just 10 in astronomy in 2018.
- The report found that unsupportive environments in physics and astronomy departments and systemic financial challenges faced by black students contributed to the underrepresentation of black students.
Black underrepresentation in the sciences<p>In the last 25 years, the total number of bachelor's degrees earned in the United States has shot up from 1.1 million to 2.1 million. The number earned across all fields by African Americans has more than doubled, going from 85,856 to 193,567. Yet in physics and astronomy, this growth has not been proportional. While the number of undergraduate degrees earned in those fields has surged, research from the Statistical Research Center at the American Institute of Physics (AIP) shows that only 4% go to black students. In 2018, black students earned a total of 223 degrees in physics and just 10 in astronomy. Obviously, black students have the same levels of ambition, intellect, and talent to obtain those degrees as any other group. So what's going on?</p><p>A <a href="https://www.aip.org/diversity-initiatives/team-up-task-force" target="_blank">two-year investigation</a> by the AIP's National Task Force to Elevate African American Representation in Undergraduate Physics and Astronomy (TEAM-UP) discovered that two big challenges contributed to the perpetual underrepresentation of black students in the fields of physics and astronomy. One was a less-than-supportive environment in the departments, and the other was systemic personal financial challenges faced at higher rates by black students. </p><p>The investigation drew on student and faculty surveys, interviews with black students and department chairs, and visits to physics department sites. Drawing on those findings, the report lays out recommendations to facilitate the sweeping cultural changes necessary to increase black representation in physics and astronomy. The TEAM-UP goal is to at least double the number of African Americans who earn bachelor's degrees in those fields by 2030. 
</p><p>"The physics community is looking very deeply at itself," Shirley Malcom, a senior adviser at the American Association for the Advancement of Science, <a href="https://physicstoday.scitation.org/doi/10.1063/PT.3.4405" target="_blank">said to <em>Physics Today</em></a>. "We were born into a society that tends not to value black people. Let's get over that and change behaviors that keep African Americans from thriving in our colleges and universities and contributing to the advancement of physics."</p>
Fostering a sense of belonging<img type="lazy-image" data-runner-src="https://assets.rebelmouse.io/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8yMjY2ODc4Ni9vcmlnaW4uanBnIiwiZXhwaXJlc19hdCI6MTYzNzQ2MjAxOH0.scAJPobtSLMvSgbr-Jyrjo7evqlmJ-pL4i0T3oxXju8/img.jpg?width=1245&coordinates=0%2C42%2C0%2C42&height=700" id="d11a4" class="rm-shortcode" data-rm-shortcode-id="838240897b96eb0cae3fce5ed5948d65" data-rm-shortcode-name="rebelmouse-image" alt="Black students raising their hands at a leadership conference" />
The College of DuPage hosted a Black Student Leadership Conference in 2016.
Photo Credit: COD Newsroom / Flickr<p>According to the report, physics and astronomy departments can foster a sense of belonging for black students by promoting values of inclusion and by assuring black students that they are valued and expected to succeed. The report suggests that departments take extra steps to ensure that black students feel comfortable in 'common areas' and are encouraged to take on leadership positions on campus. It also advises that faculty members adopt methods to combat racist behavior and more subtle <a href="https://www.vox.com/2015/2/16/8031073/what-are-microaggressions" target="_blank">microaggressions</a> toward black students.</p><p>The report also outlined recommendations for alleviating financial burdens for black students. In 2016, the median net worth of white families ($171,000) was nearly 10 times that of black families ($17,600). As a result, black students disproportionately end up having to juggle financial or medical stresses while in school. To address these strains, TEAM-UP calls for the creation of a $50-million fund that would support marginalized students in physics and astronomy. Half of that fund would be used to directly support at least 150 minority physics and astronomy students. Each student would receive around $8,000 to help pay for their education. The rest would be used to help departments implement campus resources like counseling, extra lecturers, and new programs. </p>