Game Career Guide is part of the Informa Tech Division of Informa PLC

The Hidden Dangers Of Shoddy Moderation And Toxicity
by Janne Huuskonen

Sponsors and brands don't want to associate with toxic communities

As player numbers continue to swell, gaming has become the perfect environment for advertisers to reach wider audiences than ever. Courting high-profile partnerships and collaborating with well-known brands has become a big part of the business of games, from in-game fashion items and movie launches to exclusive limited-time events. Who wouldn't want to wield Thanos' Infinity Gauntlet in Fortnite?

But without the right moderation strategy in place, brands won't want to collaborate with a game that is well known for toxicity, as the association could tarnish their reputation.

This is even more pressing in competitive gaming, where, much like in traditional sports, sponsorship is a key revenue stream: as much as 50% of Esports revenue comes from it. Activision Blizzard's ongoing struggles also appear to have led T-Mobile to pull its sponsorship of the Call of Duty and Overwatch professional Esports leagues, and depending on the outcome of the active lawsuit against the company, others could follow. So it's vital that the major platforms in Esports adopt the right moderation tools to protect their interests and appeal to the widest possible range of brands, without toxic players, fans or even bots putting off potential business partners.

That's not to say that major titles haven't been trying to stem the flow of hate speech in their communities. Industry leaders are grappling with how to better moderate and manage them: Discord acquired AI moderation company Sentropy, and Unity followed suit with its acquisition of voice chat moderator OTO, showing that calls for change are being heard. But many companies still seem unsure of how to deal with the content management problem.

Moderation needs to keep pace with the ways we now play games

So how can game publishers make sure their communities are a safe place for players?

Right now, moderation, if it's in place at all, is likely to be a combination of basic content filters, banned word lists and human moderators. Many games and platforms don't even have this, and instead rely on players themselves to report toxicity. It's no wonder so many gamers report rising harassment and toxicity when the approach to moderation is so woefully unprepared.
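To see why banned word lists alone are so easy to defeat, consider a minimal sketch of this kind of filter. The word list and messages below are illustrative only, not drawn from any real game:

```python
import re

# Illustrative banned-word list; real lists are far longer but work the same way.
BANNED_WORDS = {"noob", "trash", "loser"}

def passes_filter(message: str) -> bool:
    """Return True if the message contains no banned word (naive exact match)."""
    tokens = re.findall(r"[a-z]+", message.lower())
    return not any(token in BANNED_WORDS for token in tokens)

print(passes_filter("you are trash"))   # False: caught by the list
print(passes_filter("you are tr4sh"))   # True: trivial obfuscation slips through
```

A single swapped character defeats the exact-match check, which is exactly the cat-and-mouse game players play with these filters; that gap is what pushes moderation toward more robust, context-aware approaches.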

We are now in an era of games as a service, where Fortnite, Minecraft and Roblox can each count hundreds of millions of players across multiple platforms, and console owners can stream their games to a mobile phone or smart TV. The way we moderate games and safeguard players must fit these models, and the only way that can be done is to use automation, AI and other technologies that can cope with the scale and complexity of the games industry as it is today and will be tomorrow.
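One common way to combine automation with scarce human moderators at this scale is a tiered pipeline: cheap automated checks run on every message, high-confidence cases are auto-actioned, and only the ambiguous middle reaches a human. The sketch below assumes this architecture; the `toxicity_score` stub and the thresholds are hypothetical placeholders standing in for a real trained model and tuned values:

```python
from dataclasses import dataclass, field
from typing import List

BLOCK_THRESHOLD = 0.9   # auto-block above this score (hypothetical tuning)
REVIEW_THRESHOLD = 0.5  # send to a human between the two thresholds

def toxicity_score(message: str) -> float:
    """Placeholder for an ML classifier; a real system would call a trained model."""
    hostile_markers = ("hate", "kill", "idiot")
    hits = sum(marker in message.lower() for marker in hostile_markers)
    return min(1.0, hits / 2)

@dataclass
class ModerationPipeline:
    human_review: List[str] = field(default_factory=list)

    def handle(self, message: str) -> str:
        score = toxicity_score(message)
        if score >= BLOCK_THRESHOLD:
            return "blocked"                 # auto-action at high confidence
        if score >= REVIEW_THRESHOLD:
            self.human_review.append(message)
            return "queued"                  # humans only see the grey area
        return "allowed"

pipeline = ModerationPipeline()
print(pipeline.handle("good game everyone"))  # allowed
print(pipeline.handle("you idiot"))           # queued for human review
```

The design point is the triage itself: however the score is produced, routing only borderline cases to people is what lets a small moderation team cover a player base of hundreds of millions.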
