Simon Parkin at MIT Technology Review: “In any fictionalized universe, the distinction between playful antagonism and earnest harassment can be difficult to discern. Name-calling between friends playing a video game together is often a form of camaraderie. Between strangers, however, similar words assume a different, more troublesome quality. Being able to distinguish between the two is crucial for any video-game maker that wants to foster a welcoming community.
Spirit AI hopes to help developers support players and discourage bullying behavior with an abuse detection and intervention system called Ally. The software monitors interactions between players—what people are saying to each other and how they are behaving—through the available actions within a game or social platform. It can detect both verbal harassment and nonverbal provocation—for example, one player stalking another’s avatar or abusing reporting tools.
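Spirit AI has not published Ally's internals, but the monitoring described above maps naturally onto an event-stream watcher. Below is a minimal Python sketch of that idea; the event kinds, names, and thresholds (`PlayerEvent`, `InteractionMonitor`, `follow_limit`) are illustrative assumptions, not Ally's actual API.

```python
from dataclasses import dataclass

# Illustrative sketch only. It mirrors just the behaviors the article lists:
# chat monitoring, avatar stalking, and abuse of reporting tools.

@dataclass
class PlayerEvent:
    actor: str         # player performing the action
    target: str        # player on the receiving end
    kind: str          # "chat", "move_near", or "report"
    payload: str = ""  # chat text, report reason, etc.

class InteractionMonitor:
    """Watch a game's event stream for verbal and nonverbal signals."""

    def __init__(self, follow_limit: int = 30, report_limit: int = 5):
        self.follow_counts: dict[tuple[str, str], int] = {}
        self.report_counts: dict[str, int] = {}
        self.follow_limit = follow_limit
        self.report_limit = report_limit

    def observe(self, event: PlayerEvent) -> list[str]:
        """Return any flags this event raises."""
        flags: list[str] = []
        if event.kind == "chat":
            # Verbal channel: hand the text to a language classifier
            # (sketched in the next example) rather than flagging here.
            flags.append("needs_language_review")
        elif event.kind == "move_near":
            # Nonverbal channel: persistently shadowing one player's
            # avatar can amount to stalking.
            key = (event.actor, event.target)
            self.follow_counts[key] = self.follow_counts.get(key, 0) + 1
            if self.follow_counts[key] > self.follow_limit:
                flags.append("possible_stalking")
        elif event.kind == "report":
            # A flood of filings abuses the moderation tools themselves.
            self.report_counts[event.actor] = self.report_counts.get(event.actor, 0) + 1
            if self.report_counts[event.actor] > self.report_limit:
                flags.append("possible_report_abuse")
        return flags
```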
“We’re looking at interaction patterns, combined with natural-language classifiers, rather than relying on a list of individual keywords,” explains Ruxandra Dariescu, one of Ally’s developers. “Harassment is a nuanced problem.”
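Ally's actual model is proprietary, but the approach Dariescu describes can be sketched: a natural-language classifier's contextual score is combined with interaction-pattern features, with no keyword list anywhere. Function names and weights below are assumptions for illustration.

```python
def classifier_score(message: str) -> float:
    """Stand-in for a trained natural-language classifier.

    A real system would run a model that estimates the probability a
    message is abusive in context; this stub just returns a placeholder.
    """
    return 0.0

def harassment_risk(message: str, prior_interactions: int, recent_flags: int) -> float:
    """Blend language and behavioral evidence into a single risk score.

    The key point: the same words score lower between players with a long
    shared history (likely banter) than between strangers.
    """
    text_risk = classifier_score(message)
    # Long-time partners earn some benefit of the doubt; strangers earn none.
    familiarity_discount = min(prior_interactions / 100, 0.5)
    # Recent nonverbal flags (stalking, report abuse) raise the score.
    behavior_boost = min(recent_flags * 0.1, 0.4)
    return max(0.0, min(1.0, text_risk * (1 - familiarity_discount) + behavior_boost))
```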
When Ally identifies potentially abusive behavior, it checks whether the potential abuser and the other player have had previous interactions. Where Ally differs from existing moderation software is that, rather than simply sending an alert to the game’s developers, it can send a computer-controlled virtual character to check in with the player—one that, through Spirit AI’s natural-language tools, is able to converse in the game’s tone and style (see “A Video-Game Algorithm to Solve Online Abuse”)….(More)”.
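The intervention flow the article describes (check the pair's history first, then intervene in-game rather than only alerting developers) might be wired together as below. Everything beyond that flow is assumed: the threshold, the stub helpers, and the NPC's line. The `harassment_risk` scorer is reused from the previous sketch.

```python
RISK_THRESHOLD = 0.7  # assumed cutoff, not a published Spirit AI value

def lookup_prior_interactions(actor: str, target: str) -> int:
    """Stub: a real system would query the pair's interaction history."""
    return 0

def dispatch_checkin_npc(player: str, opening_line: str) -> None:
    """Stub: spawn a computer-controlled character to speak to the player."""
    print(f"[NPC -> {player}] {opening_line}")

def notify_developers(actor: str, target: str, risk: float) -> None:
    """Stub: the traditional alert path, kept as a fallback."""
    print(f"[alert] {actor} -> {target}, risk={risk:.2f}")

def handle_flagged_message(actor: str, target: str, message: str) -> None:
    prior = lookup_prior_interactions(actor, target)
    risk = harassment_risk(message, prior_interactions=prior, recent_flags=1)
    if risk < RISK_THRESHOLD:
        return  # likely banter between familiar players; take no action
    # Check in with the targeted player through an in-game character that
    # speaks in the game's own tone, instead of only paging moderators.
    dispatch_checkin_npc(target, "Hey, everything all right over here?")
    notify_developers(actor, target, risk)
```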