Online Safety Act 2023: Media Businesses Beware

The Online Safety Act 2023 (the “Act”) is a new piece of legislation that aims to enhance internet safety and tackle online harms. The legislation passed through various stages in both the House of Commons and the House of Lords before receiving Royal Assent on 26 October 2023, at which point it officially became law. The Act introduces several key measures that media companies and content creators should be aware of.

What the Online Safety Act 2023 covers:

The main purpose of the Act is to hold large social media platforms and search engines more accountable for keeping their users, especially children, safe online. Key areas covered by the Act include:

  • Requiring companies to remove illegal content such as child sexual abuse material;
  • Preventing the spread of misinformation and disinformation;
  • Protecting users from cyberbullying and trolling;
  • Limiting users’ exposure to inappropriate or harmful content; and
  • Requiring in-scope companies to carry out regular risk assessments, maintain clear user redress mechanisms, and be far more transparent about their algorithmic decision-making processes.

What does it mean for media businesses?

The requirements set out in the Act apply predominantly to major platforms such as Facebook, TikTok and Google. However, any media business that hosts user-generated content or enables user interaction could still fall within the scope of these requirements.

It will be important for all media companies to have robust community guidelines and moderation policies in place. Having these policies drafted or reviewed by media lawyers is recommended to ensure they comply with the Act. Some key considerations include:

  • Setting out very clearly what content is and is not acceptable;
  • Having fast processes for reviewing and removing content that violates those policies;
  • Allowing users to easily report problematic content or accounts;
  • Being transparent about decisions to remove or restrict content; and
  • Promoting age assurance and identity verification to better protect younger users. Media companies should also look closely at their age-gating and parental control options.

By taking proactive steps to enhance online safety, media businesses can demonstrate responsibility while also future-proofing against compliance burdens. With child protection and ethical online operation more important than ever, not only in the law but also in the press, the Act marks a significant step in the right direction when it comes to accountability around digital safety.

If you would like to find out more information or have any questions concerning the topics discussed, please contact our commercial team.

Olivia Larkin
Solicitor, Commercial

This reflects the law and market position at the date of publication and is written as a general guide. It does not contain definitive legal advice, which should be sought in relation to a specific matter.
