The Online Safety Act has now received Royal Assent, triggering a paradigm shift in the operation of the UK online landscape. We consider the scope of the Act and the impact it could have on businesses operating online.
What has happened?
On Thursday 26 October 2023, following extensive debate, the Online Safety Act (the Act) became law. The Act has been described as “game-changing” by Technology Secretary Michelle Donelan, who said it supports the government’s mission of making the UK the “safest place in the world to be online.”
The Act takes a zero-tolerance approach to protecting children from online harm and places legal responsibility on tech companies to prevent and rapidly remove illegal and harmful content, such as terrorist material and revenge pornography.
Whilst the Act has been lauded by some, it has been criticised by others, chiefly on the basis that implementing its requirements will undermine the privacy and security of users in the UK.
What are the key requirements of the Online Safety Act?
The Act ultimately increases the responsibilities of online entities such as social media platforms and search engines, which will now need to ensure that illegal content is quickly removed and to actively prevent children from accessing harmful content.
The Act imposes duties on certain online service providers, namely:
- user-to-user services: internet services where content is generated directly by one user and can be encountered by other users; and
- search services: internet services that are, or include, a search engine (i.e. a service that enables users to search more than one website or database).
This is a significant change. Previously, social media companies were typically expected only to react to such content, removing it once it had been flagged to them; they were not expected to proactively seek out and manage harmful content. The Act places world-first legal duties on social media platforms to, among other things:
- remove illegal content quickly or prevent it from appearing in the first place, including content promoting self-harm;
- prevent children from accessing harmful and age-inappropriate content, including pornographic content, content that promotes, encourages or provides instructions for suicide, self-harm or eating disorders, and content depicting or encouraging serious violence or bullying; and
- be more transparent about the risks and dangers their platforms pose to children (including a requirement to carry out risk assessments).
Ofcom will have a significant role to play in ensuring that the above duties are adhered to, including the power to fine platforms up to £18 million or 10% of their global annual revenue, whichever is greater.
What has the response been to the Act?
As the bill approached Royal Assent, concerns were raised about the Act’s impact on free speech and freedom of expression, as well as about user privacy. A particularly controversial aspect of the Act is Ofcom’s power to require messaging services to scan messages for illegal content.
In response, WhatsApp and Signal threatened to withdraw their services from the UK, stating that they could not access or view messages without destroying existing privacy protections for all users. The government acknowledged that end-to-end encryption (as used by WhatsApp and Signal) cannot currently be circumvented without violating users’ privacy, and said that scanning would only be required once ‘feasible technology’ had been developed that satisfies minimum standards of privacy and accuracy. At present, that technology does not exist.
It remains to be seen what this means for the future of encrypted messaging platforms in the UK and whether the Act will have a chilling effect on such services.
What is the impact and what happens next?
There are concerns that the costs to comply with the Act could be disproportionately high for smaller businesses, and that tech giants could pass their costs of compliance down the supply chain to the SMEs that use their services. Initially, Ofcom is likely to focus its attention on the larger entities operating in this area.
Many social media platforms, such as TikTok and Snapchat, have already taken a more stringent approach regarding underage users, and it is anticipated that others will follow suit once they have had time to digest the provisions of the Act. Each platform will need to consider what compliance looks like for it on a case-by-case basis.
The Act designates Ofcom as the online safety regulator, and Ofcom has now begun work on tackling illegal content: its first consultation, on guidance and codes of practice for illegal harms, was published on 9 November 2023 as part of its three-phase approach to implementing the Act.
What the Act (and its enforcement) will look like in practice will depend greatly on the codes and guidance that Ofcom issues in the coming months. Online businesses that fall within the scope of the Act should keep a close eye on Ofcom’s publications to understand what actions they will need to take to ensure compliance with the Act.
For further information or to discuss any of the issues raised, please get in touch with any of the key contacts below.
For further information, please contact:
Michael Howard, Hill Dickinson
michael.howard@hilldickinson.com