Presented by Cohere
How do you build a more inviting and inclusive gaming community? In this VB On-Demand event, AI/ML experts from Cohere and Google Cloud dive into managing player-generated content at scale to increase retention, foster positivity, and build a fun gaming community.
Watch here for free on demand!
Building and nurturing strong communities is critical in a crowded gaming market, says David Wynn, head of solutions consulting at Google Cloud for Games. More than 10,000 new games were released on Steam last year alone, and that record looks set to be broken again this year. As studios and publishers compete for players’ attention, they’re discovering that communities can make the experience stickier and more meaningful to their player base. But that’s only true if the community doesn’t fall prey to the toxicity that can plague so many online spaces.
“Building community helps bring out basic human aspects of talking to friends, sharing experiences, and building relationships,” says Wynn. “If that works as part of the gaming experience you’re trying to create, it becomes all the more important to make sure you’re designing it right, and that it’s a good experience for everyone involved.”
The challenges are the fundamental ones ingrained in human interaction in any crowded arena, replete with a diversity of experiences spanning race, gender, class, religion and more. Add to that the wide range of differences in how people like to interact, expect to interact and are encouraged to interact, Wynn says, all of which together create the community around a game or title.
“People will bring their own experiences, perspectives and potential challenges to the community. Even if we create virtual worlds, people still come from here, and they bring everything they experience here with them,” he says. “We can, through tools and the knowledge that others have already built up, create experiences that change the way they interact. The multiplicity and the scale are both things that studios and publishers need to consider, because the world is coming at us fast. As much as we would like to think we can build our own islands, people have come from somewhere, and they bring that with them.”
What can go wrong is unique to each title: how the community experience is shaped to serve your objectives, how complex an experience you design, and how invested your players get all directly affect your moderation and intervention styles. A frowning face can mean a bad day; it could also be an indication of a larger, more insidious trend, or a signal that a new layer of moderation is needed.
Adding AI to the content moderation mix
It used to be that the interventions available when a community turned toxic were limited, both in theory and in practice. A moderator or admin could apply the banhammer if they decided behavior was unacceptable, provided they saw it at the right time, or it was reported at the right time. Or certain types of words could be blocked with simple string substitution, so a message shows four asterisks instead of an F-bomb. These are effective tools for getting the message across, but they’re a fairly blunt approach, difficult to fine-tune and virtually impossible to scale.
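To make that “blunt approach” concrete, here is a minimal Python sketch of the kind of string substitution described above; the blocklist and sample message are illustrative placeholders, not anything from a real game.

```python
# A minimal sketch of blunt, string-substitution moderation: listed words
# are masked with asterisks. Blocklist and message are placeholders.
import re

BLOCKED_WORDS = {"darn", "heck"}  # stand-ins for an actual blocklist

def mask_profanity(message: str) -> str:
    """Replace each blocked word with asterisks of the same length."""
    def _mask(match: re.Match) -> str:
        return "*" * len(match.group(0))

    pattern = re.compile(
        r"\b(" + "|".join(re.escape(w) for w in BLOCKED_WORDS) + r")\b",
        flags=re.IGNORECASE,
    )
    return pattern.sub(_mask, message)

print(mask_profanity("That was a heck of a round"))  # "That was a **** of a round"
```

The limitation is visible right in the code: every new slur, misspelling or euphemism needs another blocklist entry, which is exactly why this approach is hard to fine-tune and doesn’t scale.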
Natural language processing (NLP), AI and machine learning based models have enabled significantly more sophisticated interventions, with classification that is far more readily available. Whether your moderation team is overworked or your usual methods are producing false positives, these algorithms allow community owners to spot issues before they start, and they can do it at scale.
“AI takes resources, effort and attention to train, but once trained it is resource-efficient to run, and at scale it opens up a whole new avenue for identifying the behaviors we want to minimize or reinforce,” says Wynn. “It’s also creating new types of interventions, whether through chatbots or through interesting types of augmentation that isn’t just if-else string substitution.”
AI/ML can also analyze broader patterns, not just in text but also in voice communication via speech transcripts, to identify behaviors such as griefing or giving other players a hard time. It’s the kind of thing that needs to be reliably identified in these virtual environments so it can be addressed or mitigated quickly.
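As a rough illustration of that pattern-level analysis, the hedged Python sketch below aggregates per-player toxicity scores across a session’s transcript. The speech-to-text step is assumed to happen upstream, and `score_fn` stands in for whatever classifier you plug in; the thresholds are arbitrary defaults, not recommendations from the speakers.

```python
# A hedged sketch of pattern-level moderation over voice-chat transcripts.
# The transcription step and the toxicity scorer are assumptions, not part
# of any specific product named in the article.
from collections import defaultdict
from statistics import mean

def flag_repeat_offenders(transcript, score_fn, threshold=0.7, min_utterances=5):
    """Flag players whose toxicity is a sustained pattern, not a one-off.

    transcript: iterable of (player_id, utterance) pairs from an upstream
    speech-to-text step (not shown here).
    score_fn: any callable mapping an utterance to a toxicity score in [0, 1],
    e.g. a hosted NLP classifier.
    """
    scores = defaultdict(list)
    for player_id, utterance in transcript:
        scores[player_id].append(score_fn(utterance))
    return [
        player for player, s in scores.items()
        if len(s) >= min_utterances and mean(s) > threshold
    ]
```

Looking at averages over many utterances is one simple way to separate a single frowning-face moment from the larger, more insidious trend the article describes.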
“None of this is new. I’m sure people were figuring out how to make Pong boring to play against when it first came out,” says Wynn. “But what you see with the new AI/ML models being developed and published is that you don’t have to be a data scientist to translate these big language models into something that actually works for your game, even if you’re a smaller studio or trying to do it yourself. Instead, you have an API from someone like Cohere that you can just grab, start using right away and start seeing the benefit.”
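Wynn’s point about grabbing an API maps roughly to the sketch below: a handful of labeled examples plus a hosted classification endpoint. The endpoint URL, request fields, response shape and model behavior follow Cohere’s public classify API as documented at the time of writing, but treat them, along with the example messages and environment variable, as assumptions to verify against the current docs rather than a definitive integration.

```python
# A hedged sketch of calling a hosted classification API to label chat
# messages. Endpoint, payload shape and response fields are assumptions
# based on Cohere's public classify API docs; verify before relying on them.
import os
import requests

API_KEY = os.environ["COHERE_API_KEY"]  # assumed environment variable name

def classify_messages(messages):
    """Label each chat message as toxic / not toxic via a hosted classifier."""
    payload = {
        # A few labeled examples let the hosted model adapt to this
        # community's own vernacular and policies (few-shot classification).
        "examples": [
            {"text": "gg everyone, great match", "label": "not toxic"},
            {"text": "nice clutch, well played", "label": "not toxic"},
            {"text": "uninstall the game, you are worthless", "label": "toxic"},
            {"text": "everyone on this team is trash, quit now", "label": "toxic"},
        ],
        "inputs": list(messages),
        # A model name may also be required; see Cohere's docs for options.
    }
    resp = requests.post(
        "https://api.cohere.ai/v1/classify",
        json=payload,
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=30,
    )
    resp.raise_for_status()
    return [c["prediction"] for c in resp.json()["classifications"]]

# Example: route anything predicted "toxic" to a human moderator queue.
# flags = classify_messages(["you played great", "go uninstall, loser"])
```

The appeal for a smaller studio is that the example set above, drawn from its own moderation history, is the main thing it has to supply; the model training and serving stay on the provider’s side.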
To learn more about identifying the patterns that are causing communities to sour, the AI/ML solutions available to everyone, the most effective ways to implement them, and more, don’t miss this VB On-Demand event.
Watch here for free on demand!
Agenda
- Tailoring resources to your community’s unique language and policies
- Increasing the ability to understand the nuance and context of human language
- Using language AI that learns as toxicity evolves
- Significantly accelerating the ability to identify toxicity at scale
Presenters
- David Wynn, Head of Solutions Consulting, Google Cloud for Games
- Mike Lavia, Enterprise Sales Lead, Cohere
- Dean Takahashi, Lead Writer, GamesBeat (Moderator)