Say What? Chatbots Can Create Their Own Non-Human Language to Communicate

Facebook researchers have found that dialog agents being trained to negotiate will create their own non-human language to be more effective. What does this mean for the future of language?

Let's just hope the chatbots are only gossiping behind our backs.


You know that slight anxiety you feel when two people are negotiating in a language you don't understand? Well, it turns out that chatbots can create their own non-human language to communicate back and forth. This was reported by researchers at Facebook Artificial Intelligence Research (FAIR), who were developing "dialog agents" with the newfound capability of negotiation.

To communicate more efficiently with other bots, the agents learned to create their own simple language.

"To go beyond simply trying to imitate people, the FAIR researchers instead allowed the model to achieve the goals of the negotiation. To train the model to achieve its goals, the researchers had the model practice thousands of negotiations against itself, and used reinforcement learning to reward the model when it achieved a good outcome. To prevent the algorithm from developing its own language, it was simultaneously trained to produce humanlike language."-Deal or no deal? Training AI bots to negotiate 

Is This a Big Deal?

On its face, it seems logical that chatbots would create their own language. If one of the major reasons human language evolved was so we could more effectively convey our desires, it makes sense that bots aiming for efficiency would shortcut human imitation. At the same time, the concept of a non-human language further erodes the uniqueness we may feel as humans (given how central language has been to society's development).

The findings by the Facebook researchers come on the heels of research from OpenAI, which in March 2017 reported that bots could create their own language. As in the FAIR report, the OpenAI researchers found that bots developed their own language in the course of reinforcement learning.

The researchers from OpenAI set out to "teach AI agents to create language by dropping them into a set of simple worlds, giving them the ability to communicate, and then giving them goals that can be best achieved by communicating with other agents. If they achieve a goal, then they get rewarded. We train them using reinforcement learning and, due to careful experiment design, they develop a shared language to help them achieve their goals."

In the video accompanying OpenAI's post, two agents created one-word phrases to carry out simple tasks, while more challenging tasks involving three agents led to multiple-word sentences. Rewarding concise communication led to the development of a larger vocabulary. The researchers noted, however, that the bots had a tendency to collapse whole sentences into single-word utterances (which wouldn't be desirable for an interpretable language).

Reinforcement Learning and the Creation of a Bot Language

Reinforcement learning is a form of trial and error, in which the bots keep track of what receives a reward and what doesn't. If the bot, or the "dialog agent" in the case of Facebook's research, receives a reward, it learns to continue that behavior. Agents learn to modify their communication output to maximize the reward. As FAIR's report puts it, "[d]uring reinforcement learning, the agent attempts to improve its parameters from conversations with another agent."
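To make that trial-and-error idea concrete, here is a minimal sketch, not FAIR's model or code: a toy agent samples a canned utterance, observes a reward for the negotiation outcome, and nudges its preferences toward whatever was rewarded. The utterances, the ToyDialogAgent class, and the stand-in reward function are all assumptions made purely for illustration.

```python
import random
from collections import defaultdict

# Illustrative utterances only; a real negotiation agent generates free-form text.
UTTERANCES = ["i want the books", "you take the hats", "deal", "no deal"]

class ToyDialogAgent:
    def __init__(self, learning_rate=0.1):
        self.learning_rate = learning_rate
        # Preference score per utterance; higher means more likely to be chosen.
        self.preferences = defaultdict(float)

    def choose(self):
        # Sample an utterance with probability proportional to an exponentiated preference.
        weights = [2.718 ** self.preferences[u] for u in UTTERANCES]
        total = sum(weights)
        r, acc = random.random() * total, 0.0
        for u, w in zip(UTTERANCES, weights):
            acc += w
            if r <= acc:
                return u
        return UTTERANCES[-1]

    def update(self, utterance, reward):
        # Trial and error: reinforce utterances that led to a good outcome.
        self.preferences[utterance] += self.learning_rate * reward

def fake_negotiation_reward(utterance):
    # Stand-in for the negotiation outcome; closing a deal is rewarded most.
    return 1.0 if utterance == "deal" else 0.1

agent = ToyDialogAgent()
for _ in range(1000):
    u = agent.choose()
    agent.update(u, fake_negotiation_reward(u))

# After training, the most rewarded utterance has the highest preference.
print(sorted(agent.preferences.items(), key=lambda kv: -kv[1]))
```

The real FAIR system optimizes the parameters of a neural dialog model rather than a lookup table of preferences, but the feedback loop is the same: act, observe a reward, and adjust toward behavior that pays off.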

The bot language came about because, in bot-to-bot communication, a shared language of their own was more efficient and more rewarding than mimicking human language.

"[T]he researchers found that updating the parameters of both agents led to divergence from human language as the agents developed their own language for negotiating."-Deal or no deal? Training AI bots to negotiate 

 

So is it bad that the agents diverged from human language? It is if the goal is to mimic human language and to retain the ability to decipher bot-to-bot communication. Then again, we may soon have bot translators.

===

Want to connect? Reach out @TechEthicist and on Facebook. Exploring the ethical, legal, and emotional impact of social media & tech. Co-host of the upcoming show, Funny as Tech.
