On 23 January 2022, the Online Safety Act 2021 came into effect. The Act bolsters and expands existing online safety laws by providing a clear set of responsibilities for online service providers and users. It is enforced by the eSafety Commissioner, who has powers to block access to violent or abhorrent material, monitor complaints and administer penalties.
The Act is significant as it broadens the scope and nature of online regulations. It applies to:
- Social Media Services
- Relevant electronic services
- Designated internet services
- Internet search engine services
- App distribution services
- Hosting services
- Internet carriage services
- End users of these services
Basic Online Safety Code
Social Media Services are subject to the most regulation under the Act; however, there is legislative scope for other service providers to be regulated under additional codes. All providers are now required to comply with the Basic Online Safety Code, which states that providers should take reasonable steps to minimise the risk of harm to end users.
Such steps include:
- Ensuring there are restrictions on children accessing the service if the service is adult in nature
- Creating a clear and accessible complaints service that allows users to report cyberbullying and any breach of the Terms of Service, and responding to those complaints within 48 hours
- Preventing anonymous accounts from using the service for unlawful or harmful purposes
You can read a detailed summary of the Basic Online Safety Code here.
The eSafety Commissioner
In addition, the eSafety Commissioner can now restrict access to content through four new notices.
If the Commissioner, through their own investigation or via a complaint, finds content that is harmful, abusive or abhorrent, they can issue a:
- Removal Notice: requiring a provider to take all reasonable steps necessary to remove the material.
- Blocking Notice: Internet service providers can be required to block access to material that depicts, incites or instructs abhorrent violent conduct, if the material is likely to cause significant harm to the community.
- App Removal Notice: App distribution service providers may need to prevent users from downloading an app that permits the posting of certain material.
- Link Deletion Notice: Internet search engine providers may be required to cease offering a link to certain material.
Providers have 24 hours to respond to these notices. Providers that fail to comply may face formal warnings, infringement notices, enforceable undertakings, injunctions and civil penalties. Civil penalties range up to $111,000 for individuals and $555,000 for corporations.
What should I do as a Service Provider?
- We recommend all service providers and platforms read up on the Basic Online Safety Code and implement its requirements
- Ensure you have an adequate and efficient complaints system that allows you to respond to complaints within 48 hours.
- Develop policies addressing abuse and cyberbullying within your Terms of Service
- Have procedures in place to respond to a notice from the eSafety Commissioner within 24 hours