Minions of Mark Zuckerberg taught AI to play the real-world game of “Diplomacy.” Niccolo Machiavelli would be proud. The field of Artificial Intelligence just got a whole lot scarier. Meta, the parent company mothership for Facebook, has a new “artificial intelligence model, named Cicero after the Roman statesman, that demonstrates skills of negotiation, trickery and forethought.” It wins more often than not, through ruthlessness, intimidation and trickery. The really scary part is that “this type of technology could be used to concoct smarter scams that extort people or create more convincing deep fakes.”
Would you like to play a game of ‘Diplomacy?’
The Hitchhiker’s Guide to the Galaxy has a few words to say on the subject of Diplomacy. The history of warfare, it relates, can be subdivided into the phases of “retribution, anticipation, and diplomacy.” The guide helpfully provides examples. “Thus, retribution: ‘I’m going to kill you because you killed my brother.‘ Anticipation: ‘I’m going to kill you because I killed your brother.‘ And diplomacy: ‘I’m going to kill my brother and then kill you on the pretext that your brother did it.‘”
Cicero has mastered those concepts quite well. “Meta let Cicero play 40 games of Diplomacy with humans in an online league, and it placed in the top 10 percent of players.”
Keyboard artists over at Meta are grinning from ear-to-ear after their Cicero AI model demonstrated proficient “skills of negotiation, trickery and forethought.”

They’re thrilled to announce that “more frequently than not, it wins at Diplomacy, a complex, ruthless strategy game where players forge alliances, craft battle plans and negotiate to conquer a stylized version of Europe.” This latest evolution in artificial intelligence has potential for extensive real world application.
Cicero is no mere chatbot. It can “trick humans into thinking it was real.” Meta confirms it “can invite players to join alliances, craft invasion plans and negotiate peace deals when needed.” It amazed it’s creators.
“The model’s mastery of language surprised some scientists and its creators, who thought this level of sophistication was years away.” It’s real skill at diplomacy comes from the way it learned to master the arts of deception and intrigue. “If it’s talking to its opponent, it’s not going to tell its opponent all the details of its attack plan.”

Ability to withhold information
The key to Diplomacy, either in the game or real life, is the “ability to withhold information.” Cicero seems to enjoy it. Along with speed of light calculating ability to “think multiple steps ahead of opponents and outsmart human competitors.” That, the team admits, “sparks broader concerns.” They realize that this “type of technology could be used to concoct smarter scams that extort people or create more convincing deep fakes.”
When Kentaro Toyama, a professor and artificial intelligence expert at the University of Michigan, read Meta’s paper on the subject, he freaked out. “It’s a great example of just how much we can fool other human beings. These things are super scary.” It’s easy to see how they “could be used for evil.” Since Cicero’s code “is open for the public to explore,” Anyone “could copy it and use its negotiation and communication skills to craft convincing emails that swindle and extort people for money.”
Robots are already beating humans in head-to-CPU competition. “In 2019, Facebook created an AI that could bluff and beat humans in poker.” AI generated art “has been able to trick experienced contest judges, prompting ethical debates.”

The advances are snowballing. Diplomacy skills are just the latest layer of the onion on top of “advances in natural language processing and sophisticated algorithms that can analyze large troves of text.”
Cicero actually has a split personality. One to play the game of Diplomacy and guide all the “strategic reasoning.” That’s the module which “allowed the model to forecast and create ideal ways to play the game.” A separate module puts it into words. “The other guided dialogue, allowing the model to communicate with humans in lifelike ways.”
They trained it well by feeding it “troves of text data from the internet, and on roughly 50,000 games of Diplomacy played online at webDiplomacy.net, which included transcripts of game discussions.” If someone “trained the language model on data such as diplomatic cables in WikiLeaks,” experts warn, “you could imagine a system that impersonates another diplomat or somebody influential online and then starts a communication with a foreign power.“


