Commonwealth asks to meet Roblox after child grooming reports. Here’s what we know

Roblox has been called to an urgent meeting with the federal government in the wake of reports that children are being groomed by predators and exposed to sexually explicit material on the popular online gaming platform.

Communications Minister Anika Wells has written to Roblox, saying Australian parents have deep concerns over the platform’s safety measures.

Roblox, which is not covered by the government’s under-16s social media ban, sought an exemption after committing last year to improve its safety measures and engaging with the eSafety Commission on the changes.

“Despite this, the issues appear to persist,” Ms Wells’ letter to the platform reads.

“This is untenable, and these issues are of deep concern to many Australian parents and carers.

“I therefore seek an urgent meeting with you to discuss the steps Roblox is taking to improve safety outcomes and experiences on its platform.”

The eSafety Commission has also written to the platform, saying it will test the implementation of safety measures put in place last year.

What are the concerns about Roblox?

Ms Wells highlighted two reports in her letter to Roblox, which alleged the platform was being used by predators to groom children and expose them to “graphic and gratuitous user-generated content”, including sexually explicit material and content relating to suicide.

Ms Wells has asked the platform to explain what measures it has put in place to protect users, especially children, from harm.

The government has also asked the Classification Board to review whether the PG rating for Roblox remains appropriate.

“The safety of children online is non-negotiable,” she said in a statement.

Anika Wells said the reports highlighted the need for a digital duty of care, which would place the onus on digital platforms to keep users, particularly children, safe.

“The reports we’ve been hearing about children being exposed to graphic content on Roblox, and predators actively using the platform to groom young people are horrendous. Something must be done — now.

“These sorts of harms show why we need a digital duty of care, which will place the onus on digital platforms to proactively keep their users, particularly children, safe.”

What safety measures has the platform put in place?

Last September, Roblox said it had put in place nine commitments following consultation with the eSafety Commissioner.

It said the commitments were to ensure the platform was compliant with the Online Safety Act.

Roblox’s measures include:

  • Making accounts for users aged under 16 private by default.
  • Introducing tools to prevent adult users from contacting under-16s without parental consent.
  • Switching off key features, such as direct chat and “experience chat” within games, by default for children in Australia until the user has gone through age estimation.
  • Enabling chat for under-16s once they have completed age estimation, while still preventing them from chatting with adults.
  • Controls allowing parents to disable chat for 13 to 15-year-old users.
  • Voice chat not allowed between adults and 13 to 15-year-olds, alongside prohibiting the function entirely for under-13s.

By the end of 2025, Roblox informed the regulator it had delivered on the commitments, including requiring users to be verified as 17 years or older to access certain games.

The eSafety Commissioner, Julie Inman Grant, said she notified the platform last week that the regulator would be testing its compliance with commitments.

“We remain highly concerned by ongoing reports regarding the exploitation of children on the Roblox service, and exposure to harmful material,” Ms Inman Grant said.

Julie Inman Grant said the eSafety Commission’s tests on Roblox would give the regulator first-hand insight into the platform’s compliance measures.

The eSafety Commission said subject to the outcome of the tests, it might take further action against Roblox under the Online Safety Act, which includes penalties of up to $49.5 million.

New codes under the legislation, which come into effect on March 9, focus on age-restricted material such as online pornography, high-impact violence and self-harm.

The codes also require online gaming services to prohibit, and take proportionate action against, the non-consensual sharing of intimate images, the grooming of children, and sexual extortion.

The eSafety Commissioner said she would assess Roblox’s compliance under the new standards.

Which apps are included in Australia’s social media ban?

Banned: TikTok

TikTok is used to create, share and discover short videos, owned by Chinese tech company ByteDance.

Before the ban, the eSafety Commission said there were about 200,000 Australian users aged 13 to 15, among a total of almost 10 million Australian accounts.

While TikTok has its own minimum age of 13, the regulator has found it is one of the most popular platforms for users aged between eight and 12 as well.

The platform’s Australia policy lead, Ella Woods-Joyce, said TikTok would comply with the ban, but warned it could have unintended consequences.

“Experts believe a ban will push younger people into darker corners of the internet where protections don’t exist,” Ms Woods-Joyce said.

Banned: Instagram

Instagram was the most used app among Australian teenagers aged 13 to 17, with more than a million monthly active users in this age cohort, according to the eSafety Commission.

The platform is owned by Meta, which also owns Facebook, WhatsApp, Threads and Messenger.

Instagram said its “teen accounts” were automatically applied to users aged 13 to 17, which came with built-in limits on who can contact them and filters on “sensitive content”.

Users in this age cohort also received notifications prompting them to leave the app after 60 minutes of use in one day.

Despite these measures, Instagram is under Australia’s social media ban for under-16s.

Banned: Snapchat

Snapchat was also among the most popular apps for young people, with more than a million of its 8.3 million Australian users aged 17 or under.

Snapchat is a messaging app that allows users to send images, videos and texts that are only available for a short period once they are opened.

Users can also choose to share their location with friends on Snap Map.

Snapchat sought to allow underage users to download and archive their data before their accounts were disabled.

Banned: YouTube

YouTube has been one of the most popular online platforms for young Australians, with more than 643,000 users aged 17 and under.

The regulator found it was the top platform for users aged between eight and 12, with more than two-thirds of those surveyed picking it as their platform of choice.

The Australian government was planning to exempt YouTube from its social media ban, but later backflipped on this decision.

Rachel Lord from YouTube Australia and New Zealand said there was substantial evidence that YouTube is widely used in classrooms and supported by parents.

“YouTube is not a social media platform; it is a video streaming platform with a library of free, high-quality content, and TV screens are increasingly the most popular place to watch,” she said.

YouTube Kids, a filtered version of the platform that allows parents to create accounts for children under 12, is still allowed.

Banned: Facebook

While Snapchat, Instagram, TikTok and YouTube dominated for young social media users, Facebook still had an estimated 455,000 Australian users aged between 13 and 17.

The platform is owned by Mark Zuckerberg’s Meta and already has a minimum age of 13.

Meta policy director Mia Garlick recently told a Senate hearing that the company would comply with the ban but was still solving “numerous challenges” to identify teenagers’ accounts.

Facebook’s private messaging service Messenger is still allowed.

Banned: Twitch

Streaming platform Twitch was added to the list of banned apps after the eSafety Commission found it has the sole or significant purpose of online social interaction.

“Twitch is a platform most commonly used for live streaming or posting content that enables users, including Australian children, to interact with others in relation to the content posted,” a statement posted to the eSafety website said.

Twitch is mainly used by gaming and eSport players to broadcast their gameplay with audio commentary, but it’s also used to share and broadcast music, live sports, and food programs, according to the regulator.

Banned: X

Formerly known as Twitter and owned by billionaire Elon Musk, X falls under the list of banned platforms for under-16s.

While it wasn’t among the most popular apps for young users, the eSafety Commission has concerns about the prevalence of “online hate” on the platform.

In June 2023, the regulator said it had received more complaints about online hate on X in the past 12 months than any other platform, saying X had “dropped the ball on tackling hate”.

Banned: Reddit

Message board Reddit, the seventh-most-visited site in the world, is on the list of age-restricted platforms.

The platform restricts mature content until a user declares they are 18 or over; however, there is no age verification system in place, according to the eSafety Commission.

The platform has said it would comply with the ban, but disagreed with the “scope, effectiveness and privacy implications” of the law.

Banned: Kick

Along with Reddit, Kick was a late addition to the government’s list of age-restricted platforms.

Kick is an Australian competitor to the video live streaming platform Twitch, where users can watch live video streams covering games, music and gambling.

The eSafety Commission had earlier suggested Twitch, along with gaming site Roblox, could also fall under the ban; Twitch has since been added to the list.

Banned: Threads

Threads is a microblogging platform similar to X, which requires users to have an Instagram account for access.

As Instagram is banned for under 16s, Threads also falls under the list of banned platforms.

What can parents do to protect their children on the platform?

The eSafety Commissioner has put together a safety guide.

It covers online safety issues, including parental controls, child grooming, and protecting children from sexual abuse online.

The regulator recommends parents and carers:

  • Be aware of who their child socialises with offline, and who they know online.
  • Keep communication open and calm so their children come to them when something does not feel right, even if a mistake has been made.
  • Encourage their child to avoid sites or apps designed for adults, such as some social media, dating, online chat or gaming sites where they may be contacted by older teens or adults.
  • Check what restrictions exist on the apps their child uses, and manually change them if needed. Some apps restrict adults from messaging children who do not follow them.
  • Ensure their child’s account cannot be viewed publicly. Make accounts private or change privacy settings so they can control who can look at their photos and posts.
  • Ensure personal information, such as full names, phone numbers, addresses or schools, is not shared.
  • Discuss the risks of sharing location data on apps, and help turn off automatic location sharing.

The regulator also has guidelines on how to talk about online child sexual abuse with children under 12 and children aged between 13 and 18.
