The Online Safety Act 2023 (the "Act") is a new piece of legislation that aims to enhance internet safety and tackle online harms. The legislation passed through various stages in both the House of Lords and the House of Commons before receiving Royal Assent and becoming law on 26th October 2023. The Act introduces several key measures that media companies and content creators should be aware of.

What the Online Safety Act 2023 covers:

The main purpose of the Act is to hold large social media platforms and search engines more accountable for keeping their users, especially children, safe online. Key areas covered by the Act include:

  • Requiring companies to remove illegal content such as child sexual abuse material;
  • Preventing the spread of misinformation and disinformation;
  • Protecting users from cyberbullying and trolling;
  • Limiting users' exposure to inappropriate or harmful content; and
  • Requiring companies in scope of the regulations to carry out regular risk assessments and put clear user redress mechanisms in place, with much greater transparency around algorithmic decision-making processes.

What does it mean for media businesses?

The requirements set out in the Act fall predominantly on major platforms such as Facebook, TikTok and Google. However, any media business that hosts user-generated content or enables user interactions could still be in scope and should assess how the Act's duties apply to it.

It will be important for all media companies to have robust community guidelines and moderation policies in place. Having these policies drafted or reviewed by media lawyers is recommended to ensure they comply with the Act. Some key considerations include:

  • Setting out very clearly what content is and is not acceptable;
  • Having fast processes for reviewing/removing content that violates policies;
  • Allowing users to easily report problematic content or accounts;
  • Being transparent about decisions to remove or restrict content; and
  • Promoting age assurance and identity verification to better protect younger users. Media companies should look closely at their age-gating and parental control options.

By taking proactive steps to enhance online safety, media businesses can demonstrate responsibility while also future-proofing against compliance burdens. With child protection and ethical online operation under greater scrutiny than ever, both in law and in the press, the Act marks a significant step towards greater accountability for digital safety.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.