Maslo: This free A.I. app by ex-Google developers could be your new bestie
Two ex-Googlers release a new phone app that uses A.I. to become a safe, understanding listener.
For getting things done, we’ve got Siri, Alexa, Google Assistant, and Cortana—your mileage may vary as to just how useful and/or responsive any of them are. But what about an A.I. companion designed to support you spiritually by helping you strengthen mindfulness? A couple of ex-Google employees have recently released Maslo, a “voice journal with personality and smarts” that “interacts with empathy and playfulness.” “We encourage personal curiosity and believe in growing into identities that are bigger versions of who we are now,” say the developers, Ross Ingram and Cristina Poindexter. Could Maslo become your digital bestie?
We’ve been down this road before. Replika, an interview-based A.I. companion app released in 2016, claims to become ever more friend-worthy and ever more like you. Let’s hope that last part isn’t true, because it’s prone to some pretty loopy interactions.
Is Replika listening? Gotta wonder.
Maslo, a free download for iOS and Android, takes a far less ambitious approach than an app like Replika. In so doing, it may actually be more useful. It certainly responds more sensibly.
What using Maslo is like
Maslo is fundamentally a voice journal with A.I. analysis.
The app operates in two modes. In the first, you tap on Maslo’s large, undulating orb and it displays a question meant to help you think of what you’d like to record as a journal entry—you don’t actually need to answer the question. If the prompt isn’t doing it for you, you can tap the icon beneath it to cycle through similar queries. Maybe you don’t even need a prompt. In any event, you tap the orb and a 60-second countdown begins, during which you speak your journal entry. Tap when you’re done, and Maslo’s A.I. runs your audio recording through a neural net that analyzes its content for patterns, spotting words that express feelings as well as references to people, places, and things. For a few moments it displays “Thinking…,” then offers an emotional summary of your entry along with a corresponding emoticon. You give it a thumbs up or down. Ingram and Poindexter say the quality of the app’s insights will evolve as it gets to know you better.
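Maslo hasn’t published how its analysis works, but the described behavior—spotting feeling words and returning a mood summary with an emoticon—can be illustrated with a toy lexicon-based sketch. Everything here (the word list, the mood categories, the function name) is invented for illustration; the real app reportedly uses a neural net.

```python
from collections import Counter

# Toy mood lexicon -- purely illustrative; Maslo's actual model is a neural net
MOOD_WORDS = {
    "happy": "joy", "excited": "joy", "grateful": "joy",
    "sad": "sadness", "lonely": "sadness",
    "angry": "anger", "frustrated": "anger",
    "worried": "fear", "anxious": "fear",
}

MOOD_EMOJI = {"joy": "😊", "sadness": "😢", "anger": "😠", "fear": "😟"}


def summarize_entry(transcript: str) -> str:
    """Tally mood words in a journal transcript and name the dominant mood."""
    tokens = [w.strip(".,!?") for w in transcript.lower().split()]
    moods = Counter(MOOD_WORDS[t] for t in tokens if t in MOOD_WORDS)
    if not moods:
        return "neutral 🙂"
    top_mood, _ = moods.most_common(1)[0]
    return f"{top_mood} {MOOD_EMOJI[top_mood]}"


print(summarize_entry("I felt anxious and worried today, but also grateful."))
```

A real system would work from a speech-to-text transcript and a learned model rather than a hand-written word list, but the input/output shape—free-form speech in, one-line emotional summary out—matches what the article describes.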
The second mode is invoked by tapping the little list icon in the lower left of the screen. Maslo presents you a word-cloud that reveals the frequency with which you use certain words. It also displays an emoticon tally of the number of mood-related words you’ve used. You can tap Voice Recordings to display a list of, and play back, your journal entries.
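The word-cloud view boils down to counting word usage across your recorded entries. A minimal sketch of that tally, with invented sample entries and a hypothetical `word_frequencies` helper:

```python
import re
from collections import Counter


def word_frequencies(entries: list[str], top_n: int = 5) -> list[tuple[str, int]]:
    """Count word usage across journal entries, as a word cloud might."""
    words = re.findall(r"[a-z']+", " ".join(entries).lower())
    return Counter(words).most_common(top_n)


entries = ["Work was stressful today", "Today I felt calm after work"]
print(word_frequencies(entries, top_n=3))
```

The mood-related emoticon tally the app shows would be the same idea applied to a restricted vocabulary of feeling words.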
It’s a very simple sort of interaction. For this first iteration of Maslo, the user experience is about building trust, an attempt to “build this sense of companionship between machine and the user so that it is this safe space,” Ingram tells TechCrunch. To keep it safe, most of the processing occurs on the phone itself. As one might surmise, the app’s name is a nod to Maslow’s hierarchy of human needs and is similarly meant as a tool for personal growth. Poindexter explains what the Maslo user experience is meant to accomplish: "We really want to reflect back to people what they’re saying… [Maslo] holds up a mirror… it’s a sounding board and doesn’t necessarily give you the answers but shows you what you might already know.”
The birth of Maslo
Ross Ingram and Cristina Poindexter.
Ingram and Poindexter had been working at Google when they met at a birthday party. Coder and marketer Ingram had joined Google from robotics firm Sphero, developer of the toy version of Star Wars’ BB-8. At the time, he was working in Google’s Advanced Technologies and Projects group. Poindexter had just helped launch Google Assistant for the Pixel phone and on Google Home. The Yale-educated sociologist had joined Google to address her concerns about the potentially destructive influence of technology in our lives, but she was burned out and thinking of fleeing to a farm in Italy.
A few weeks later they reconnected in the Coffee Lab on Google’s campus and found themselves talking about ways to make technology more personal. According to Ingram, “[Poindexter] understood the psychology that drove our love of technology,” a result of her work with Google Assistant. She writes in a blog post, “A lot of these interactions were non-utility queries. There was this need to go in and help people on a deeper level… I have a background in sociology and I look at it from a users’ perspective of what do people need. A lot of these interactions were mulling things over and needing a place to express them.”
Most importantly, Poindexter “was inspired to do more to make technology helpful," says Ingram. The two soon decided to leave Google, move to Los Angeles, and begin work on what is now Maslo.
It’s time for technology to mature. When it engages with us on deeper human topics, we’ll know it has. That’s what we’re building at Maslo: technology that grapples with the existential and that understands the psychological, because our generation will mature hand-in-hand with technology. If it doesn’t relate with us on these levels, we won’t either. So instead of stigmatizing those philosophical, psychological, and existential questions as cliche, our technology dives right in to help us answer the meatiest and most perplexing questions for ourselves. What do we want to do with our lives? How do we identify? Where do we belong? What makes us happy? — Cristina Poindexter
Ingram says Maslo develops a “platonic” relationship with users mixed with, as Business Insider puts it, “a hint of intimacy.” Poindexter insists, ”We're not saying Maslo is therapy by any sense,” but, “It can be therapeutic.” She says the app may eventually go further, helping the user grow by asking more penetrating questions such as, "I noticed you've felt this way lately. Do you want to talk about why it bothers you?"
Poindexter predicts in TechCrunch, “There are going to be different classes of machines that interact and relate to humans on different levels. We are seeing thousands of people using machines for assistant-based things… we know that where this is going we’re going to start talking more to whatever you want to call them… and Alexa won’t help you figure out if you need help.”
“It’s the way we define an assistant versus a companion,” says Ingram. “Assistants help things get done in the external world and companions are going to help us get things done in our internal world.”