Bullies in gaming are a universal problem. Whether games instill violence is debatable, but bullying in gaming isn't an 'if'; it's a matter of 'when'. Every gamer has faced bullying at least once.
Has toxicity in the gaming community taken a turn for the worse? Read on to understand our view on Toxicity in Gaming from our experts at Affine Gaming CoE.
Toxicity is an enigma that haunts games
While there have been several studies on the causes and effects of cyberbullying, bullying has always gone hand in hand with video games.
A recent study revealed that 80% of multiplayer gamers (across games including DOTA 2, Valorant, Grand Theft Auto, Call of Duty, and CS:GO, to name a few) faced toxicity in the form of online harassment.
Bullying was at its peak around the time MMOs and MMORPGs became mainstream. Player-vs-player and co-op games that required team effort were gaining significant traction amongst gamers.
Gaming is a competitive environment for many players, and adrenaline runs high. While a decent number of members in the gaming community are cool-headed, the majority resort to bullying.
Bullying victims generally face verbal harassment in multiplayer lobbies and in-game chats. According to ResearchGate & ScienceDaily, cyberbullying victims are 1.9 times more likely to commit suicide.
Multiplayer lobbies are still one of the most infamous virtual playgrounds for the bullying culture in gaming. New players are constantly subject to abuse due to their lack of skills, amongst many other reasons.
While it is impossible to pinpoint the origin of this toxic culture, various studies suggest that groups rewarding or accepting players who practice discrimination is one driver of toxic culture amongst multiplayer gamers.
Cyberbullying is a bully's playground. Safely hidden behind the internet, bullies face fewer repercussions. Instant gratification, combined with never seeing the impact of bullying on the victim, makes this a dangerous affair.
The bully has a layer of anonymity behind the privacy of the internet, which accelerates the toxic culture. Women and members of the LGBTQ community are the primary victims of bullying in gaming, at 40% and 37% respectively.
Fortnite, Overwatch, and GTA Online lobbies carry forward this trend of cyber abuse at the hands of 'griefers.' These are just a few of the well-known bullying lobbies, which means there are many others where bullying is commonplace.
The scary part of cyberbullying is that only 38% of the victims report their experiences to their parents.
Some bullies even go to the extent of personally stalking their victims both online and offline. We've all heard about illegal swatting incidents. Swatting is a hoax call to emergency services reporting dangerous activity at the intended victim's address, often resulting in the dispatch of a SWAT team to the reported location. One known incident resulted in the victim getting killed.
A recent study in the USA revealed that 2020 saw a seven percent increase in bullying.
One might ask what preventive measures game developers have taken to block bullying.
Game developers have a special weapon to tackle bullying.
The 'mute and report' feature is probably the most used by online gamers. That's pretty much all a cyberbullying victim can do from their end.
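At its core, the mute-and-report mechanic is simple bookkeeping: a per-player mute list that filters incoming chat, plus a report queue for moderators. A minimal sketch follows; the data shapes and function names are illustrative assumptions, not any specific game's implementation.

```python
# Minimal sketch of a 'mute and report' mechanic.
# All names and data shapes here are illustrative assumptions.
from collections import defaultdict

mutes = defaultdict(set)   # player -> set of players they've muted
reports = []               # queued reports awaiting moderator review

def mute(player: str, target: str) -> None:
    """Add target to player's mute list; their chat is no longer delivered."""
    mutes[player].add(target)

def report(player: str, target: str, reason: str) -> None:
    """File a report for moderators to review."""
    reports.append({"by": player, "against": target, "reason": reason})

def can_hear(listener: str, speaker: str) -> bool:
    """Chat from a muted speaker is suppressed for that listener only."""
    return speaker not in mutes[listener]

mute("alice", "griefer42")
report("alice", "griefer42", "verbal harassment")
print(can_hear("alice", "griefer42"))  # False: alice muted griefer42
print(can_hear("bob", "griefer42"))    # True: bob has not
```

Note that muting is purely client-side relief: the griefer keeps talking to everyone else, which is why the report queue matters.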
In 2017, the Fair Play Alliance was formed by key gaming organizations including Microsoft, Intel, Epic, and Blizzard, along with 200 other companies. They formulated a unique approach to tackling bullying by rewarding fair players and communities where bullying wasn't a norm.
Blizzard even went so far as to publicly name and shame 18,000 Overwatch players, banning them from the game.
Developers like Rockstar have taken measures to prevent bullying in in-game chats: explicit language is automatically filtered and censored. The online versions of Grand Theft Auto and Red Dead Redemption implement this method.
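The basic mechanism behind this kind of automatic censoring is straightforward: match messages against a blocklist and mask the hits. The sketch below assumes a placeholder word list and an asterisk-masking rule for illustration; it is not Rockstar's actual filter, which would be far larger and handle evasions like leetspeak.

```python
# Illustrative sketch of an in-game chat profanity filter.
# The word list and masking rule are assumptions, not any shipped filter.
import re

BLOCKED_WORDS = {"noob", "trash", "loser"}  # placeholder blocklist

_PATTERN = re.compile(
    r"\b(" + "|".join(map(re.escape, BLOCKED_WORDS)) + r")\b",
    re.IGNORECASE,
)

def censor(message: str) -> str:
    """Replace each blocked word with same-length asterisks, case-insensitively."""
    return _PATTERN.sub(lambda m: "*" * len(m.group(0)), message)

print(censor("You are trash, NOOB"))  # -> You are *****, ****
print(censor("good game"))            # -> good game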
Ubisoft saw tremendous success with its multiplayer co-op game, Rainbow Six Siege. Bullying, voice-chat abuse, and sexist and racist threats ensued, partially owing to the game's competitive nature. Ubisoft resorts to half-hour bans on such players, which frankly isn't even the bare minimum.
But all this isn't enough. Game developers and publishers could tackle the toxic-culture nightmare amongst their players if they wanted to.
Why, then, are there no promising efforts? Because toxicity doesn't hit their bottom line.
Developers must face the heat
Rockstar rakes in astronomical revenue from its eight-year-old Grand Theft Auto title. It is even releasing an enhanced edition for the PlayStation 5, much to fans' protest.
GTA Online lobbies are a stage for bullies, or griefers in this case, who harass even peaceful players minding their own business. Reporting is an option, and players often get suspended. However, bullies often use multiple accounts and so aren't seriously affected.
Rockstar could fix this problem if it wanted to, but it doesn't, because the problem doesn't affect its bottom line.
The Call of Duty franchise has always been a double-edged sword for Activision. It brings in an insane amount of revenue but has always been a hotspot for toxic gamer culture.
Recently, Activision was in the news for discriminatory workplace practices and sexual harassment allegations and will undergo a federal investigation. The US Securities and Exchange Commission has served subpoenas to several top executives over an alleged 'frat boy' culture that tolerated toxic behavior.
Government regulations can help here. Strict laws and penalties against developers can strengthen the fight against bullying in games. But regulations must be well thought out, implemented, and moderated for maximum efficiency.
Forward-looking businesses must prioritize their reputation to tackle the toxicity enigma. Revenue is important, but it's the responsibility of every business to create a space void of toxicity for employees and gamers alike.
Online predators inhabit audio chatrooms.
Many people use Discord instead of in-game chat for increased privacy. Adding only people you know is another way to avoid abusive, unwanted interactions with strangers. But Discord comes with its own baggage: the community is humongous, and moderation varies from server to server.
Predators target teenage gamers under the guise of fellow gamers and groom them. These predators easily bypass the filters and stay well within community guidelines to avoid detection.
Add voice chat to the equation, and the plot thickens.
Real-time speech analysis for moderating online bullying is challenging. At this year's Game Developers Showcase, Intel showcased Bleep, an AI-powered speech recognition technology that gives the user the option to enable filters across categories like hate speech, xenophobia, and misogyny.
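The user-facing part of such a system is essentially a set of per-category toggles applied to whatever the speech recognizer transcribes. The sketch below is a loose, hypothetical model of that idea: the category names echo Bleep's, but the keyword lookup is a stand-in assumption; a real system would classify transcribed speech with ML models, not a word list.

```python
# Hypothetical sketch of user-selectable filter categories, loosely
# modeled on opt-in toggles like Bleep's. Keyword lists and the lookup
# approach are illustrative assumptions, not Intel's implementation.
from dataclasses import dataclass, field

CATEGORY_KEYWORDS: dict[str, set[str]] = {
    "hate_speech": {"slur_a", "slur_b"},  # placeholders
    "misogyny":    {"insult_a"},
    "xenophobia":  {"insult_b"},
}

@dataclass
class FilterSettings:
    """Categories this user has opted into bleeping."""
    enabled: set = field(default_factory=set)

    def should_bleep(self, word: str) -> bool:
        # Only words belonging to an enabled category get bleeped.
        return any(
            word.lower() in CATEGORY_KEYWORDS.get(cat, set())
            for cat in self.enabled
        )

settings = FilterSettings(enabled={"hate_speech"})
print(settings.should_bleep("slur_a"))    # True: category enabled
print(settings.should_bleep("insult_a"))  # False: misogyny toggle is off
```

The toggle design is exactly what makes selective filtering controversial: a user can choose to keep hearing some categories of abuse, which the article questions below.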
Selective filtering is a questionable aspect of this technology, but the bigger question is whether it will work on languages other than English.
Gaming has blown up recently and has worldwide users across multiple platforms, partially due to the pandemic. Linguistic limitations further hinder tackling cyberbullying practices across in-game and chat lobbies.
Players in countries and regions where English isn't the spoken language still engage in bullying, and even sophisticated technologies like these can fail to prevent cyberbullying there.
Technologies such as Bleep are a step in the right direction, and developers can closely work with such organizations to implement preventive measures against bullying.
Cyberbullying isn't a problem with a simple solution; it isn't like flipping a switch. Factors including, but not limited to, environment, school, home, and relationships with parents and siblings can shape how a person behaves online.
Stereotyping is again part of the problem. Teenagers are the usual suspects for bullying, but the average gamer is 34 years old. There have been many instances where adults have targeted, and continue to target, underage teens under the pretense of friendship on platforms like Discord.
There isn’t a standard profile template that fits cyberbullies and online perpetrators.
Parental control can help
Prevention is a long-term process. Parents taking part in the child’s gaming activities is a good start. It can provide a sense of support to the kids, and the parent(s) can act as a non-intrusive moderator and induce awareness.
Parental control can reduce the chances of kids stumbling on inappropriate content or behavior. Steam offers a feature called 'Family View' that limits which games each user can play. It is, however, not an ideal long-term solution but more of a preventive measure.
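Conceptually, a Family View-style restriction is just an allowlist check at launch time: restricted accounts may only start titles a parent has approved. The sketch below is an assumption-laden illustration of that idea, not Steam's actual interface or data model.

```python
# Illustrative sketch of a Family View-style parental allowlist.
# Account names, titles, and the data shape are illustrative assumptions.
FAMILY_ALLOWLIST: dict[str, set[str]] = {
    # restricted account -> titles a parent has approved
    "kid_account": {"Portal 2", "Stardew Valley"},
}

def can_play(account: str, game: str) -> bool:
    """Unrestricted accounts can launch anything; restricted accounts
    can only launch games on their allowlist."""
    if account not in FAMILY_ALLOWLIST:
        return True
    return game in FAMILY_ALLOWLIST[account]

print(can_play("kid_account", "Grand Theft Auto V"))  # False: not approved
print(can_play("kid_account", "Portal 2"))            # True: on the allowlist
print(can_play("parent_account", "Grand Theft Auto V"))  # True: unrestricted
```

An allowlist (rather than a blocklist) is the safer default here: new or unknown titles stay blocked until a parent explicitly approves them.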
Developers and gamers also must take measures and ensure a friendly virtual environment for all gamers.
While many developers and tech firms are coming up with anti-bullying measures, awareness is also imperative. Features to block toxic users and filter hateful text have made a world of difference in recent times.
Developers still have a lot to do in terms of reducing toxic player behavior.
Anti-bullying measures, coupled with systematic awareness against cyberbullying and bullying in general from an early age, are the long-term solution. It's improbable that bullying can be completely eradicated, as bullying methods evolve with time. Constant learning and monitoring, coupled with prompt action with the help of technology, seem the only way forward. Indeed, this is a long and arduous journey for the gaming community.