The legislation amends the Online Safety Act 2021 and defines an “age-restricted user” as a person under the age of 16. However, it does not name specific platforms that will be subject to the ban.
The legislation defines an “age-restricted social media platform” as a service that enables “online social interaction” between users. It notes that some services are “excluded”, though again it does not name specific platforms: a service built around “online business interaction”, for example, would fall outside the ban.
If social media platforms do not take “reasonable steps” to stop under 16s from having accounts, they could face fines of up to A$50 million.
As well as verifying the age of people wanting to create an account, tech companies will need to verify the age of existing account holders – regardless of how old they are. This will be a significant logistical challenge.
There are a few options social media platforms might pursue, including using credit cards as a proxy linked to a person’s app store account, or facial recognition technology.
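As a purely illustrative sketch of how a platform might combine such signals – all names, fields, and the conservative decision rule here are hypothetical, not anything prescribed by the legislation – an age-gate check could look like:

```python
from dataclasses import dataclass

MINIMUM_AGE = 16  # threshold set by the legislation


@dataclass
class VerificationSignal:
    """One piece of age evidence, e.g. a card check or a face-based estimate."""
    source: str          # e.g. "credit_card", "facial_estimate"
    estimated_age: int   # age implied by this signal


def may_hold_account(signals: list[VerificationSignal]) -> bool:
    """Allow an account only if every available signal puts the user at 16 or over.

    A real platform would have to weigh signal reliability, appeals, and
    error rates; this sketch simply takes the most conservative reading.
    """
    if not signals:
        return False  # no evidence of age: treat as under the threshold
    return min(s.estimated_age for s in signals) >= MINIMUM_AGE


# Example: a card check implies an adult, but facial estimation says 15,
# so the conservative rule blocks the account.
signals = [VerificationSignal("credit_card", 18),
           VerificationSignal("facial_estimate", 15)]
print(may_hold_account(signals))  # prints False
```

The point of the sketch is that each verification method yields an age estimate with its own failure modes (shared credit cards, inaccurate facial estimates), so any real system must decide how to resolve conflicting signals.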
The government promised to impose a “digital duty of care” on tech companies, requiring them to regularly conduct thorough risk assessments of the content on their platforms, and to respond to consumer complaints by removing potentially harmful content.
Social media platforms should be safe spaces for all users. They provide valuable information and community engagement opportunities to people of all ages. The onus is now on the tech companies to restrict access for youth under 16. However, the work needed to keep all of us safe, and to hold the tech companies accountable for the content they provide, is only just beginning.
Q: What is the penalty for non-compliance?
A: Social media platforms that do not take “reasonable steps” to stop under 16s from having accounts could face fines of up to A$50 million.
Q: How will tech companies verify age?
A: There are a few options, including using credit cards as a proxy linked to a person’s app store account or facial recognition technology.
Q: What is the digital duty of care?
A: The government promised to impose a “digital duty of care” on tech companies, requiring them to regularly conduct thorough risk assessments of the content on their platforms, and to respond to consumer complaints by removing potentially harmful content.
Q: What is the purpose of the legislation?
A: The purpose of the legislation is to restrict access to social media platforms for youth under 16, to ensure their online safety and well-being.