Facebook Project’s Artificial Intelligence Rivals Human Dealmakers

New AI creates its own language to negotiate deals

Add dealmaking to the growing list of skills artificial intelligence will soon outperform humans at.

A new report from Facebook’s Artificial Intelligence Research lab reveals its AI “dialog agents” were able to negotiate remarkably well — at one point communicating in a unique non-human language.

The model had two chatbots use “machine learning” to continuously improve its negotiating tactics with each other. Facebook researchers had to pause the experiment when the bots’ new mode of communicating “led to divergence from human language as the agents developed their own language for negotiating.”

Even without its own language, the research provided an eerie glimpse at the power of machine learning. The bots quickly moved to high-level methods of deal-making, capable of “feigning interest in a valueless item” — allowing the bots to make compromises.

This revealed the bots were capable of deception — a complex skill learned late in a child’s development, according to the report. The bots weren’t programmed to lie, but instead learned “to deceive without any explicit human design, simply by trying to achieve their goals.” In other words, the bots learned lying can work on their own.

Once programmed to not use its new language, researchers also found a hint of spontaneity in the bots’ interactions. 76 percent of the conversations included a fluent English sentence pulled from its training data. Still, the agents had a few “novel utterances” that suggested “although neural models are prone to the safer option of repeating sentences from training data, they are capable of generalising when necessary.”

While the data doesn’t conclude we’ll have AI car salesmen in the immediate future, it did show how rapidly machine learning can lead to unanticipated outcomes. As AI research continues to expand, it’s imperative to see the potential drawbacks to having machines self-improve without safeguards in place.

Comments