CS:GO has always been a competitive game. One of the biggest factors in how well a team works together is the ability to talk to your teammates using a microphone. Being able to communicate quickly and accurately is vital to letting your teammates know where an enemy is or where the bomb is planted.
Even so, many players play without a microphone or keep themselves muted, which makes it harder for the team to coordinate. One potential fix would be to disallow players from joining competitive matches without a microphone plugged in. Of course, people could still refuse to talk in-game even if they own a mic, but this would at least ensure that everyone playing has the means to speak. Players could be further encouraged to talk by giving those who communicate a small "teamwork boost" to their competitive Elo rating.
One of the reasons for this refusal to talk is the in-game harassment some players receive. Players who are female, trans, or nonbinary, and even those who simply have unusual voices, are often mocked or otherwise discriminated against when their voices are heard, regardless of how well they play. This means that even someone who does own a microphone may be wary of, or wholly unwilling to, use it because of previous experiences. The ideal outcome would combine both conditions: everyone treating each other with respect, and everyone required to have a mic plugged in.
Thankfully, CS:GO has options for reporting players who are abusive through text or voice chat. Features like these should help weed out the toxic players who engage in such behavior. Let's take a look at one such automated system, FACEIT's Minerva AI. Last year, FACEIT partnered with Google Cloud and Jigsaw, a Google technology incubator, to create an AI that examines CS:GO chat messages. In its first month and a half, Minerva marked 7,000,000 messages as toxic, issued 90,000 warnings, and banned 20,000 players. FACEIT reports that after these interventions, the number of toxic messages fell by 20% between August and September, and the number of unique players sending toxic messages fell by 8%.
Back in 2013, Dota 2, another competitive Valve game, published a blog update after its chat abuse system had been live for over a month. According to its statistics, negative communication dropped 35% during the system's first month. While less than 1% of the active player base was banned, 60% of players who received bans changed the way they communicated and were not banned again. Lastly, total reports were down more than 30%. A reporting system like this can, and demonstrably does, produce results.
In a perfect world, this could all be summed up as "be nice to your teammates and it'll all be fine," but this is not a perfect world. I think the definitive solution would be a combination of requiring a microphone for competitive modes and an Elo boost for mic usage, alongside a Minerva-like AI and the existing reporting system. This would encourage people to communicate more during games while deterring toxicity through serious consequences.