How will AI shape the future of storytelling?
Can technology act as a feedback loop for human emotions?
Karen Palmer is the Storyteller from the Future. She is an award-winning international artist and TED speaker. She creates immersive film experiences at the intersection of film, A.I. technology, gaming, immersive storytelling, neuroscience, consciousness, implicit bias, and the parkour philosophy of moving through fear. She is the creator of RIOT, an emotionally responsive film, which uses facial recognition and A.I. technology to navigate through a dangerous riot.
KAREN PALMER: My name is Karen Palmer and I'm the storyteller from the future. And I've come back to enable people to survive what is to come through the power of storytelling. So I create films that watch you back using artificial intelligence and facial recognition. As people watch my films, the narrative branches in real time depending on their emotional response. Therefore, they can become conscious of their subconscious behavior. And if they want, they can learn to kind of neurologically reprogram themselves by going into the film more than once, changing their emotions, and changing the narrative of the film.
Perception iO, perception input/output, is the second film in my artificial intelligence trilogy. Perception iO puts the participant in the future world of law enforcement. You see, the future of law enforcement is about artificial intelligence. But someone has to program the artificial intelligence, and in this case it's going to be you. So you are going to be generating what is, in effect, training data for the future of law enforcement. You will watch a series of two films from the perception of law enforcement coming into a situation which is chaotic. Both films will have a lead character, but the only difference is that the lead in one film will be Black and in the other film will be white. And as you watch the film and the action unfolding, your emotions will determine how the officer will respond to the person.
So if you deem the person is someone that needs assistance, maybe you will call for backup. If you deem that the person is someone that is a threat, maybe you may arrest them or maybe you may shoot them. This experience is to make you aware of your own implicit bias, because the only difference between these two characters is their color. And I also want to make the participant aware of how artificial intelligence is built. It's not built by a computer. It's built by a person. The film makes you conscious of how your emotions affect the narrative of the film, but it also makes you aware of how your emotions affect the narrative of your life.
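The mechanism Palmer describes — an emotion classifier watching the viewer, feeding a narrative state machine at each decision point — can be sketched in a few lines. This is a minimal illustrative sketch, not code from RIOT or Perception iO; the emotion labels, scene names, and aggregation rule are all assumptions.

```python
# Hypothetical sketch of emotion-driven branching: per-frame emotion labels
# (e.g., from a facial-recognition model) are aggregated over a scene, and
# the dominant emotion selects the next narrative branch.

# Illustrative mapping from detected emotion to the next scene.
BRANCHES = {
    "calm": "officer_deescalates",
    "fear": "officer_calls_backup",
    "anger": "officer_escalates",
}

def next_scene(emotion: str) -> str:
    """Pick the next scene from the viewer's dominant emotion."""
    return BRANCHES.get(emotion, "officer_deescalates")  # default branch

def run_decision_point(emotion_readings: list) -> str:
    """Aggregate per-frame emotion labels over a scene, then branch."""
    dominant = max(set(emotion_readings), key=emotion_readings.count)
    return next_scene(dominant)

# "fear" dominates these readings, so the film branches to the backup scene.
print(run_decision_point(["calm", "fear", "fear", "anger", "fear"]))
```

Rewatching and changing your emotional response changes which branch fires — which is the "feedback loop" Palmer is after.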
I used to direct music videos and TV commercials about a decade ago, and I became very aware of the power and the influence of even linear film. I'd style one of the dancers in a video, then I'd go on the street and see people wearing that style. And I really felt a great responsibility in what I was doing. So much so that I kind of came away from the music industry, because I very much wanted to explore the true power and potential of digital media. And I wanted to find a way not to project an image or representation or ideology onto someone, but to use digital media as more of a feedback loop, so it could enable you to discover your true potential.
- Technology will change the way that humans tell and experience stories in the future.
- Palmer presents an idea for AI film that watches the viewer and changes the narrative based on their emotional responses to chaotic events.
- By acting as a feedback loop, the AI will make viewers aware of their implicit bias and conscious of their subconscious behaviors.
- Palmer's latest interactive installation, Perception iO, is on view at Cooper Hewitt, Smithsonian Design Museum in New York City.
A pragmatic approach to fixing an imbalanced system.
- Intentional or not, certain inequalities are inherent in a digital economy that is structured and controlled by a few corporations that don't represent the interests or the demographics of the majority.
- While concern and anger are valid reactions to these inequalities, UCLA professor Ramesh Srinivasan also sees them as an opportunity to take action.
- Srinivasan says that the digital economy can be reshaped to benefit the 99 percent if we protect laborers in the gig economy, get independent journalists involved with the design of algorithmic news systems, support small businesses, and find ways that groups that have been historically discriminated against can be a part of these solutions.
Is there a way for more human-centered algorithms to prevent potentially triggering interactions on social media?
- According to a 2017 study, 71% of people reported feeling better (rediscovery of self and positive emotions) about 11 weeks after a breakup. But social media complicates this healing process.
- Even if you unfriend, block, or unfollow an ex-partner, social media algorithms can still create upsetting encounters with them, or with reminders of the relationship that once was.
- Researchers at the University of Colorado Boulder suggest that a "human-centered approach" to creating algorithms can help the system better understand the complex social interactions we have with people online and prevent potentially upsetting encounters.
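The gap the researchers point to is that a block only suppresses one account, while reminders arrive indirectly — through tags, mutual friends, or resurfaced memories. A minimal sketch of a more human-centered visibility check, in which all field names and rules are illustrative assumptions rather than any platform's actual API:

```python
# Hypothetical feed-filtering step: instead of only honoring explicit
# blocks, also suppress indirect reminders of a muted person (e.g., posts
# they are tagged in). Field names and rules are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    tagged: list = field(default_factory=list)

def visible(post: Post, muted: set) -> bool:
    """Hide posts authored by, or tagged with, a muted person."""
    if post.author in muted:
        return False
    if any(person in muted for person in post.tagged):
        return False
    return True

feed = [
    Post(author="friend", tagged=["ex"]),  # indirect reminder: hidden
    Post(author="ex"),                     # direct post: hidden
    Post(author="friend"),                 # unrelated post: shown
]
print([p.author for p in feed if visible(p, {"ex"})])
```

A real human-centered system would go further — inferring relationship context rather than relying on an explicit mute list — but the core design choice is the same: filter on the person across all the ways they can surface, not just on their account.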