The impact of the Internet on everyone’s lives is evident. From staying in touch with loved ones to doing business, pursuing education, and accessing healthcare, it’s a key driver of change. However, despite all of its benefits, the Internet has its drawbacks.
Globally, there are more than five billion Internet users as of July 2022. What does this mean? Most of the world’s population is exposed to all kinds of online content. With the Internet and social media platforms becoming common channels for harmful material and illegal activity, users’ online safety is at risk.
Though online safety is a growing concern worldwide, only a few governments have started to take it seriously. That’s why the World Economic Forum initiated a global coalition to address harmful online content, which aims to unite members of the public and private sectors to share best practices for new online safety regulations.
The UK is among the few countries responding to online harm: its government has tabled the Online Safety Bill to regulate online and social media platforms. With this new regulatory framework, Parliament is trying to make sure that the Internet is safe for users. Read on for a closer look at the UK’s Online Safety Bill.
What does the UK Online Safety Bill do?
The UK’s Online Safety Bill develops a new regulatory regime to tackle illegal and harmful content on the Internet. It requires user-to-user and search services to perform safety duties and protect individuals from certain types of online harm.
Social media platforms, search engines, apps, and sites that fail to do so will be held accountable by the regulator. As the communications regulator in the UK, Ofcom has a core duty under the Bill. It’s responsible for ensuring Internet service providers use appropriate systems and processes to keep users safe.
Ofcom will also have the authority to fine tech companies that fail to comply with the regulatory requirements. Non-compliant sites could face fines of up to 10% of their annual global turnover. In the most serious cases, they could be blocked and forced to improve their practices.
Moreover, the Bill proposes to impose a duty of care on various Internet services. Listed below are some of the most significant responsibilities it establishes.
Address illegal content
All user-to-user and search services have obligations regarding illegal content. They’re required to take proactive measures to protect users from illegal material online. The Bill defines illegal content as words, images, speech, or sounds that amount to a criminal offense.
Besides terrorism and child sexual exploitation and abuse, the following is a list of priority offenses under the Bill.
- Assisting suicide
- Fear or provocation of violence
- Public order offenses, harassment, and stalking
- Hate crime on the grounds of race, religion, or sexual orientation
- Drug and weapons dealing
- Fraud and financial crime
- Exploiting prostitutes for gain
- Assisting illegal immigration
Companies must build safety into their design processes to ensure that no illegal content appears on their platforms. In practice, this may mean deploying moderation systems or algorithms designed to detect material that amounts to an offense. They also have a duty to mitigate the risk of their services being used to share such illegal material.
The Bill also requires companies to moderate illegal material appearing on their platforms. Effective systems must be in place to minimize the time illegal content remains visible, and it must be swiftly taken down. These duties apply to all in-scope firms, even those offering services as generic as legal transcription services.
Address content harmful to children
Services that are likely accessible to children have an additional duty regarding online content that may be harmful to children. The Online Harms Consultation Response specifies that these might include violent and pornographic material. Posts promoting self-harm or eating disorders may also be considered detrimental to children.
Accordingly, providers must take proportionate steps to protect children from such harmful material. For example, if they publish or host pornographic content on their services, they will be required to prevent minors from accessing it. Examples of mitigating such risks include using age verification or other age assurance measures and consistently applying terms and conditions.
Which content is most damaging to children, and which platforms must take particular care, will only become clear once secondary legislation is passed.
Address content harmful to adults
Providers of Category 1 services (the largest, highest-risk platforms) have additional obligations regarding content that’s harmful to adults. Categories of harmful content accessed by adults are likely to include abuse, harassment, and the promotion of self-harm and eating disorders.
There’s no general duty to take down such content or to minimize the risk of harm from it. However, mitigation options include the following:
- Taking down harmful content
- Restricting users’ access to it
- Imposing limits on recommending and promoting such material
Additionally, Category 1 platforms must consistently enforce their terms and conditions and clarify what’s acceptable. They will also have to explain to users how their services handle such harmful content.
What’s next for the UK Online Safety Bill?
Nearly a year after the first draft of the bill was published in May 2021, the Bill moved closer to becoming law when the UK government formally introduced it to Parliament, with several material amendments, in March 2022. However, it was put on hold after UK Prime Minister Boris Johnson resigned in July 2022. The Bill’s remaining stages are scheduled for December 5, 2022.
The regulator, Ofcom, expects the Bill to pass by early 2023. However, the Bill requires the Secretary of State to make secondary legislation, so many of its provisions may only take effect later.
Wrapping Up
There’s still much to discuss about the Online Safety Bill. While it makes its way through the UK’s Parliament, services can already prepare for its enactment. They can conduct internal and informal risk assessments to keep their users safe. They can also take note of existing government guidance and get involved in Ofcom’s consultations.