Media Entertainment Tech Outlook | Thursday, November 24, 2022
Building and nurturing strong communities is crucial in a crowded game market, says David Wynn, head of solutions consulting at Google Cloud for Games.
FREMONT, CA: According to David Wynn, head of solutions consulting at Google Cloud for Games, creating and maintaining robust communities is essential in a saturated games market. More than 10,000 new games launched on Steam alone last year, and this year's total appears set to surpass that record.
As they compete for players' attention, studios and publishers are discovering that communities can make the experience stickier and more meaningful for their player base. But that only holds if those communities avoid being contaminated by the negativity that permeates so many online spaces.
Any crowded space that brings together a diversity of experiences, including those shaped by race, gender, class, religion, and other factors, carries inherent difficulties that must be overcome. A game's community is also shaped by wide variation in how people prefer to participate, expect to interact, and are motivated to engage.
Adding AI to the Content Moderation Mix
Until recently, few interventions were available once a situation turned toxic. If moderators or administrators decided a particular behaviour was inappropriate, they could reach for the banhammer, provided they observed the behaviour or received a report in time. Alternatively, certain words could be blocked by simple string substitution, rendering an F-bomb as four asterisks. These tools get the message across, but they are blunt, hard to customise, and nearly impossible to scale.
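The string-substitution approach described above can be sketched in a few lines. This is a minimal illustration, not any real moderation product's filter; the block list and function name are invented for the example.

```python
import re

# Stand-in block list; a real filter would maintain a much larger,
# regularly updated list (and still miss creative misspellings).
BLOCKED = ["darn", "heck"]

_pattern = re.compile(
    r"\b(" + "|".join(re.escape(w) for w in BLOCKED) + r")\b",
    re.IGNORECASE,
)

def mask_profanity(message: str) -> str:
    """Replace each blocked word with asterisks of the same length."""
    return _pattern.sub(lambda m: "*" * len(m.group(0)), message)

print(mask_profanity("What the heck was that?"))  # -> What the **** was that?
```

The brittleness is easy to see: the filter only catches exact words on its list, which is precisely the customisation and scaling problem the article describes.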
Models based on natural language processing (NLP), artificial intelligence (AI), and machine learning now enable far more sophisticated treatment, with much richer classifications readily available. These models let community owners identify issues before they escalate, and do so at scale, even when the moderation team is overworked or traditional approaches produce false positives.
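To show the shape of this classification approach, here is a toy bag-of-words Naive Bayes toxicity scorer in pure Python. The training examples, labels, and smoothing constant are all invented for illustration; a production system would train a far stronger model on a large labelled corpus.

```python
from collections import Counter
import math

# Tiny invented training set: label 1 = toxic, 0 = acceptable.
TRAIN = [
    ("you played really well today", 0),
    ("great match thanks everyone", 0),
    ("you are trash uninstall now", 1),
    ("everyone on this team is garbage", 1),
]

def train(examples):
    """Count word occurrences per class."""
    counts = {0: Counter(), 1: Counter()}
    totals = {0: 0, 1: 0}
    for text, label in examples:
        for word in text.split():
            counts[label][word] += 1
            totals[label] += 1
    return counts, totals

def toxicity_score(text, counts, totals, vocab_size=100):
    """Log-probability ratio of toxic vs. acceptable, with add-one
    smoothing; positive scores lean toxic."""
    score = 0.0
    for word in text.split():
        p_tox = (counts[1][word] + 1) / (totals[1] + vocab_size)
        p_ok = (counts[0][word] + 1) / (totals[0] + vocab_size)
        score += math.log(p_tox / p_ok)
    return score

counts, totals = train(TRAIN)
print(toxicity_score("you are garbage", counts, totals) > 0)  # -> True
```

Unlike a word filter, the model generalises from labelled examples rather than matching an exact list, which is what makes this family of approaches scale to large communities.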
AI does require resources, effort, and attention to train, but it is exceptionally resource-efficient to run at scale, opening up a whole new avenue for recognising the behaviours a community wants to minimise or amplify. It also enables novel forms of intervention, such as chatbot-driven prompts or richer augmentation techniques that go beyond simple if/else text substitution.
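One way such interventions can move beyond substitution is a graduated response policy driven by a model's score. The thresholds, action names, and the idea of a chatbot "nudge" below are assumptions sketched for illustration, not a specific product's behaviour.

```python
def choose_intervention(toxicity: float, prior_strikes: int) -> str:
    """Map a model's toxicity score (0..1) and a player's history
    to an escalating action, rather than a single blunt ban."""
    if toxicity < 0.3:
        return "allow"
    if toxicity < 0.7:
        # A chatbot nudge: remind the player of community norms
        # before anything punitive happens.
        return "nudge"
    if prior_strikes < 2:
        return "mute"
    return "escalate_to_moderator"

print(choose_intervention(0.5, 0))  # -> nudge
print(choose_intervention(0.9, 3))  # -> escalate_to_moderator
```

The design point is that a continuous score supports proportionate responses, whereas if/else text substitution only supports one.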
Beyond individual messages, AI/ML can also examine broader communication patterns, including voice transcriptions, to identify activities such as griefing and other forms of player-on-player conflict. These are behaviours that need to be identified accurately so they can be swiftly remedied or reduced.
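Pattern-level detection of this kind can be sketched as a sliding window over flagged events: instead of judging one message, the system looks for one player repeatedly targeting another. The event shape, window size, and threshold here are illustrative assumptions.

```python
from collections import deque, Counter

def detect_targeting(events, window=10, threshold=3):
    """events: iterable of (sender, target) pairs already flagged as
    hostile by a per-message model. Returns the pairs that recur
    often enough within the sliding window to suggest sustained
    griefing rather than a one-off outburst."""
    recent = deque(maxlen=window)
    alerts = set()
    for sender, target in events:
        recent.append((sender, target))
        if Counter(recent)[(sender, target)] >= threshold:
            alerts.add((sender, target))
    return alerts

events = [("a", "b"), ("c", "d"), ("a", "b"), ("a", "b"), ("e", "f")]
print(detect_targeting(events))  # -> {('a', 'b')}
```

The same windowed structure applies whether the flagged events come from text chat or from voice transcriptions fed through the same classifier.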
Copyright © 2025 Media and Entertainment Tech Outlook. All rights reserved.