The executive producer of Riot Games’ Valorant has spoken out about harassment in the game. Anna Donlon took to social platform X to address concerns raised in the community after a prominent streamer received a threat of physical violence during a Valorant session.
Valorant exec speaks out
In the post, Donlon said, “I’m sorry for not posting sooner, but please know I haven’t missed a word. This has been the top thing on my mind (a lot of our minds) since yesterday.”
Valorant News responded to the post, saying, “Just know that a lot of people truly enjoy, and love the game you and your team have created. Some just wanna feel safe, and respected while doing so.”
The streamer in question was Taylor Morgan. She shared the abusive language and threats on her X page, alongside a desperate plea for support from Riot:
I have never made a more desperate plea that what I am about to say right now. @riotgames @RiotSupport I need you guys to fucking do something. I am an incredibly strong person and I have been streaming for a very, very long time. But absolutely nothing prepares you for someone… pic.twitter.com/Gr77uBsBrT
— TaylorMorgan (@TaylorMorganS_) May 13, 2024
Donlon acknowledged the severity of online harassment and abuse, but explained that she wanted to lead with action before sharing her fuller thoughts.
She said, “It’s important to me that we lead with action first, so until we’d actually pressed the right buttons and made some necessary internal changes, I didn’t want to tweet out empty condolences when it’s on us to do the hard work here. I want to share some of my thoughts on the topic of player behavior in gaming at some point, but want to pull those thoughts together first.”
Online harassment in gaming
Harassment and abuse in games are problems developers have sought to stamp out using multiple approaches. Temporary bans and account shutdowns can be applied, but capturing voice-chat harassment and attributing it to a particular account remains difficult.
We reported that Activision used artificial intelligence (AI) to root out over 2 million toxic voice chats in Call of Duty. Activision said of the landmark figure: “More than 2 million accounts have seen in-game enforcement for disruptive voice chat, based on the Call of Duty Code of Conduct.”
It remains to be seen whether Riot will adopt a similar approach to detecting and responding to threatening or abusive behavior in Valorant, but we await further action from Donlon and Riot on an issue they have publicly stated they take seriously.
Image: Riot Games.