The legislation amends the existing Online Safety Act 2021 and defines an “age-restricted user” as a person under the age of 16. However, it does not name the specific platforms that will be subject to the ban.
The legislation defines an “age-restricted social media platform” as including services where:

- the “sole purpose, or a significant purpose” of the service is to enable “online social interaction” between people
- people can “link to, or interact with” others on the service
- people can “post material” on the service.
The legislation also notes that some services are “excluded”, though again it does not name specific platforms. For example, while services enabling “online social interaction” are covered by the ban, those providing only “online business interaction” are not.
If social media platforms do not take “reasonable steps” to stop under 16s from having accounts, they could face fines of up to A$50 million.
As well as having to verify the age of people wanting to create an account, tech companies will also need to verify the age of existing account holders – regardless of their age. This will be a significant logistical challenge.
There are a few options social media platforms might pursue:

- using credit cards as a proxy for age, linked to a person’s app store account
- using facial recognition technology.
The government has also promised to impose a “digital duty of care” on tech companies, requiring them to regularly conduct thorough risk assessments of the content on their platforms and to respond to consumer complaints, removing potentially harmful content where necessary.
Social media platforms should be safe spaces for all users. They provide valuable information and community engagement opportunities to people of all ages. The onus is now on the tech companies to restrict access for youth under 16. However, the work needed to keep all of us safe, and to hold the tech companies accountable for the content they provide, is only just beginning.
Q: What is the penalty for non-compliance?
A: Social media platforms that do not take “reasonable steps” to stop under 16s from having accounts could face fines of up to A$50 million.
Q: How will tech companies verify age?
A: There are a few options, including using credit cards linked to a person’s app store account as a proxy for age, or using facial recognition technology.
Q: What is the digital duty of care?
A: The government has promised to impose a “digital duty of care” on tech companies, requiring them to regularly conduct thorough risk assessments of the content on their platforms and to respond to consumer complaints, removing potentially harmful content where necessary.
Q: What is the purpose of the legislation?
A: The purpose of the legislation is to restrict access to social media platforms for youth under 16, to ensure their online safety and well-being.