Steam, Minecraft, Roblox and Fortnite risk "becoming onramps to abuse, extremist violence, radicalisation or lifelong harm", claim Australian government


The companies behind all four have been asked what they're doing to tackle such things

A bunch of Roblox characters avoiding a lava floor. Image credit: Roblox

Valve, Epic Games, Microsoft, and the Roblox Corporation have all been issued transparency notices by the Australian government's eSafety commissioner, with the body seeking to learn what steps are being taken to keep kids safe on Steam, Fortnite, Minecraft and Roblox. The Australian government say this step has been taken because, without action, all four platforms risk "becoming onramps to abuse, extremist violence, radicalisation or lifelong harm".

In a press release, Australia's eSafety commissioner Julie Inman Grant - an ex-global director of privacy and internet safety at Microsoft - wrote that predatory adults "target children through grooming or embedding terrorist and violent extremist narratives in gameplay."

"We’ve seen numerous media reports about grooming taking place on all four of these platforms as well as terrorist and violent extremist-themed gameplay," she continued. "This includes Islamic State-inspired games and recreations of mass shootings on Roblox, as well as far right groups recreating fascist imagery in Minecraft.

"Media reports have also pointed to games in Fortnite gamifying the horrific events of the WWII Jasenovac concentration camp and the January 6th US Capitol Building riots, while Steam is reportedly a hub for a number of extreme-right communities."

So, the Australian government want to make sure all four companies "take meaningful steps to prevent their services becoming onramps to abuse, extremist violence, radicalisation or lifelong harm".

I've asked Valve, Epic Games, Microsoft, and the Roblox Corporation for comment. It's worth noting that some of these companies have already taken steps in the face of criticism over these issues. For example, Roblox Corp have in recent years moved to limit children aged under 13 from accessing social hangouts and unrated games, and brought in selfie-based "facial age estimation technology", with the goal of keeping young players safer.

We'll see if any new measures come out of this, but given that the Australian government are seeking transparency above all else, the most likely outcome is the companies simply sending them lengthy rundowns of all the stuff they've already put in place (or plan to put in place going forwards) to tackle grooming and extremism.
