Why democratizing AI is absolutely crucial

Without regulations, implicit bias could shape artificial intelligence into a nightmare for some.

KAREN PALMER: One of the key themes in the subtext of the narratives I create is democratizing artificial intelligence, and looking at the lack of AI governance and AI regulation, and the consequences and implications of that, which is what my Perception iO project reflects in the user experience. This is a really big deal, girls and boys out there. This is a really big deal. You may have heard of email and the internet, which were with the government and the military for decades before they came to us, the people. There are serious consequences if we, the people, don't have access to these really powerful forms of technology, or are not involved in their development. There's something called implicit bias, which basically means that everybody carries biases shaped by their upbringing, their livelihood, and their experiences in life.

And if you're a developer or a designer, you tend to subconsciously program implicit bias into what you're doing. The consequences of implicit bias in a technology like AI are catastrophic. So I'm going to give an example. There's a system called COMPAS, which supports judges as they're sentencing a criminal. This is an AI system, and it has been shown to produce harsher outcomes for people of color and Black people than for white people. There's also a similar system in the UK which has been shown to produce longer sentences for working-class people. The artificial intelligence which supports the judges in these decisions is designed by private organizations and corporations. The data in this AI has no regulation governing it; basically, a commercial entity has created this, given it to judges, and it is affecting people's lives, making them worse for people of color and Black people on a daily basis.

As a Black woman working in storytelling and technology, I think it's very important to bring this type of conversation to the fore. For other developers and academics in this area, that's not a priority; they have other narratives they want to bring. So part of democratizing AI, creating regulation and governance, is to me essential, and that's why, when you experience my stories and my immersive experiences, this is their context. Because for lots of people, this is just too heavy. They want to watch The Voice, they want to watch X Factor. This is way too heavy shit. But if you're in an immersive experience and you're feeling this emotion and seeing the consequences viscerally, maybe it can connect with you in your gut in a different way. So that's why I create experiences to bring these things to people in a way which is accessible to them, in a language and an experience which they understand.

  • Implicit biases are feelings and ideas subconsciously attributed to a group or culture based on learned associations and experiences. Everyone has them, but they become dangerous when transferred to a powerful technology like AI.
  • By keeping the development of artificial intelligence private, we are risking building systems that are intrinsically biased against certain groups.
  • Governance and regulations are necessary to ensure that artificial intelligence remains as neutral as possible.


Catacombs of Paris: The city of darkness finds its new raison d'être

Ancient corridors below the French capital have served as its ossuary, playground, brewery, and perhaps soon, air conditioning.

Credit: Inspection Générale des Carrières, 1857 / Public domain
Strange Maps
  • People have been digging up limestone and gypsum from below Paris since Roman times.
  • They left behind a vast network of corridors and galleries, since reused for many purposes — most famously, the Catacombs.
  • Soon, the ancient labyrinth may find a new lease of life, providing a sustainable form of air conditioning.

Baby's first poop predicts risk of allergies

Meconium contains a wealth of information.

Surprising Science
  • A new study finds that the contents of an infant's first stool, known as meconium, can predict with a high degree of accuracy whether they'll develop allergies.
  • A metabolically diverse meconium, which indicates the initial food source for the gut microbiota, is associated with fewer allergies.
  • The research hints at possible early interventions to prevent or treat allergies just after birth.

Asteroid impact: NASA simulation shows we are sitting ducks

Even with six months' notice, we can't stop an incoming asteroid.

Credit: NASA/JPL
Surprising Science
  • At an international space conference, attendees took part in an exercise that imagined an asteroid crashing into Earth.
  • With the object first spotted six months before impact, attendees concluded that there was insufficient time for a meaningful response.
  • There are an estimated 25,000 near-Earth objects potentially threatening our planet.

Big Think: Will AI ever achieve true understanding?

If you ask your maps app to find "restaurants that aren't McDonald's," you won't like the result.

Credit: GABRIEL BOUYS via Getty Images
Mind & Brain
  • The Chinese Room thought experiment is designed to show how understanding something cannot be reduced to an "input-process-output" model.
  • Artificial intelligence today is becoming increasingly sophisticated thanks to learning algorithms but still fails to demonstrate true understanding.
  • All humans demonstrate computational habits when we first learn a new skill, until this somehow becomes understanding.