Australia announced on Tuesday that social media companies will not be required to verify the ages of all users under its ban on under-16s using the platforms.
However, platforms such as Facebook, Snapchat, TikTok, and YouTube are expected to take “reasonable steps” to prevent children from accessing their apps.
While Australia has been at the forefront of global efforts to curb online harm, the current legislation provides few details on how the ban will be enforced, raising concerns among experts that it could end up being largely symbolic.
Social media companies have also criticized the laws as “vague,” “problematic,” and “rushed.”
Communications Minister Anika Wells said that platforms would need to take “reasonable steps” to detect and deactivate underage users.
“We cannot control the ocean, but we can police the sharks. Today, we’re showing the world how this can be done,” she said.
“There is no excuse for social media platforms to fail in meeting their obligations under the new laws.”
Tuesday’s long-awaited regulatory guidelines state that social media companies will need to adopt a “multilayered” approach to age verification.
However, Julie Inman Grant, the head of Australia’s online regulator, the eSafety Commission, acknowledged that there is “no one-size-fits-all solution” for enforcing the world-first legislation.
“By taking a layered or ‘waterfall’ approach, platforms can manage risks associated with any errors in age estimation,” Inman Grant said.
“Platforms won’t have to verify the age of every Australian user to comply.”
The eSafety Commission will be able to fine social media companies up to Aus$49.5 million for failing to comply with the rules.
An independent study commissioned by the Australian government found this month that age checking can be done “privately, efficiently and effectively”, though it noted that no single solution would fit all contexts.
The regulator has also introduced a number of rules taking effect in Australia in the coming months to protect children from “lawful but awful” content, including online pornography and AI chatbots capable of sexually explicit conversations.
This week, gaming giant Roblox Corp agreed to curb the risk of adults grooming children on its platform in Australia.