What’s in the Final Bill?
The legislation amends the existing Online Safety Act 2021 and defines an “age-restricted user” as a person under the age of 16. However, it does not name the specific platforms that will be subject to the ban.
What are the Key Provisions?
The legislation defines an “age-restricted social media platform” as including services where:
- The “sole purpose, or a significant purpose” is to enable “online social interaction” between people
- People can “link to, or interact with” others on the service
- People can “post material”, or
- It falls under other conditions as set out in the legislation.
The legislation notes that some services are “excluded”, but again does not name specific platforms. For example, while services enabling “online social interaction” fall within the ban, services enabling only “online business interaction” do not.
What’s the Penalty for Non-Compliance?
If social media platforms do not take “reasonable steps” to stop under 16s from having accounts, they could face fines of up to A$50 million.
What’s Next for Tech Companies?
As well as having to verify the age of people wanting to create an account, tech companies will also need to verify the age of existing account holders – regardless of their age. This will be a significant logistical challenge.
How Will Tech Companies Verify Age?
There are a few options social media platforms might pursue:
- One option might be to use credit cards linked to a person’s app store account as a proxy for age.
- Another option is facial recognition technology, which is among the strategies the government is trialling to verify age both for social media platforms (under 16) and for online pornography (under 18).
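Whatever verification method a platform adopts, the end result is the same: a decision about whether a person falls under the relevant age threshold. The sketch below is a hypothetical illustration of that final gate, assuming a verified date of birth has already been obtained by one of the methods above; the threshold values (16 for social media, 18 for online pornography) come from the article, but the function names and structure are invented for illustration.

```python
from datetime import date
from typing import Optional

# Thresholds mentioned in the article: under-16 ban for social media,
# under-18 restriction for online pornography.
AGE_THRESHOLDS = {"social_media": 16, "online_pornography": 18}

def age_on(dob: date, today: date) -> int:
    """Age in whole years on a given date, accounting for whether the
    birthday has occurred yet this year."""
    return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))

def is_age_restricted(dob: date, service: str, today: Optional[date] = None) -> bool:
    """True if the person is below the minimum age for the given service."""
    today = today or date.today()
    return age_on(dob, today) < AGE_THRESHOLDS[service]

# Example: a 15-year-old is restricted from social media; a 16-year-old is not.
print(is_age_restricted(date(2010, 1, 1), "social_media", today=date(2025, 6, 1)))   # True
print(is_age_restricted(date(2008, 12, 31), "social_media", today=date(2025, 1, 1)))  # False
```

The hard part in practice is not this comparison but obtaining a trustworthy date of birth in the first place, which is exactly what the options listed above attempt to solve.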
What About the Digital Duty of Care?
The government promised to impose a “digital duty of care” on tech companies, requiring them to conduct regular, thorough risk assessments of the content on their platforms and to respond to consumer complaints by removing potentially harmful content.
Conclusion
Social media platforms should be safe spaces for all users. They provide valuable information and community engagement opportunities to people of all ages. The onus is now on the tech companies to restrict access for youth under 16. However, the work needed to keep all of us safe, and to hold the tech companies accountable for the content they provide, is only just beginning.
FAQs
Q: What is the penalty for non-compliance?
A: Social media platforms that do not take “reasonable steps” to stop under 16s from having accounts could face fines of up to A$50 million.
Q: How will tech companies verify age?
A: There are a few options, including using credit cards as a proxy linked to a person’s app store account or facial recognition technology.
Q: What is the digital duty of care?
A: The government promised to impose a “digital duty of care” on tech companies, requiring them to conduct regular, thorough risk assessments of the content on their platforms and to respond to consumer complaints by removing potentially harmful content.
Q: What is the purpose of the legislation?
A: The purpose of the legislation is to restrict access to social media platforms for youth under 16, to ensure their online safety and well-being.