Legal briefing

The Online Safety Bill – what is it about and what does it do?

Overview

The UK Government published its long-awaited draft Online Safety Bill in May 2021. The Bill creates a new legal framework for identifying and removing illegal and harmful content from the internet. The aim is to prevent harm to individuals in the UK. Those covered will face fines and other sanctions for non-compliance.

A pre-legislative scrutiny joint committee has since been established by the House of Lords and the House of Commons to review the Bill and has recently opened a call for evidence.

Who does the Bill apply to?

The Bill applies to providers of:

  • User-to-User Services – internet services that allow users to upload and share user-generated content, such as large social media platforms, online marketplaces, online forums and gaming sites; and
  • Search Services – search engines that enable users to search multiple websites and databases.

(together these are referred to in the Bill as "Regulated Services").

Territorial scope

To fall within the scope of the Bill, a User-to-User Service or Search Service must have links with the UK. A service has such links if, for example, it has a significant number of UK users, or UK users form one of its target markets (if not the only one). A service also has links with the UK if it is capable of being used in the UK by individuals and there are reasonable grounds to believe there is a material risk of significant harm to individuals in the UK arising from that use – for example, harmful content present on a User-to-User Service, or harmful content that may be encountered in or via the search results of a Search Service.

Content must amount to 'Regulated Content' in order to fall within the scope of the Bill. The Bill lists exemptions from 'Regulated Content' – essentially content considered to be low risk, such as email-only, SMS-only and MMS-only messaging services, internal business networks such as intranets, and user-generated content considered to have limited scope for damage, such as the comments sections of newspaper and media websites.

Duties created by the Bill

The Bill imposes significant new responsibilities on operators of Regulated Services to monitor the content that users publish and share on their platforms, and to act where that content is illegal or harmful to others. For example, operators will have a duty to carry out risk assessments in relation to content that is illegal or harmful to children (and, in some cases, harmful to adults) and to monitor and moderate such content.

Service providers will also be classed as Category 1 or Category 2. Ofcom will designate which services fall into which category; the largest social media companies are likely to fall into Category 1, with most other providers in scope classed as Category 2.

Additional duties will apply to those services designated as Category 1.

Duties of care for all regulated services

All regulated services will have the following duties:

  • to carry out and maintain illegal content risk assessments
  • to take steps to mitigate and manage risks of harm caused by illegal content
  • to protect freedom of expression and privacy
  • to provide a reporting and redress mechanism for users (such as take down requests from users)
  • to keep clear and transparent records to evidence compliance.

Where a service is likely to be accessed by children, the provider must also comply with additional duties relating to content that is harmful to children.

Those services which are classified as Category 1 services will have yet further duties, including carrying out risk assessments in relation to harmful content (not just illegal content), a duty to protect adults' online safety, a duty to protect the free expression of journalistic content and content of democratic importance, and additional reporting obligations.

Service providers will also have to make judgement calls about a great deal of content, since the definitions of 'illegal content' and, where applicable, content that is "harmful" to children and adults are subjective and not precisely defined. For example, content is harmful to children or adults where the service provider has "reasonable grounds to believe that the nature of the content is such that there is a material risk of it having, or indirectly having, a significant adverse physical or psychological impact on a child [or adult] of ordinary sensibilities…"

Codes of practice

Ofcom has been appointed as the regulator responsible for upholding and enforcing the Bill. Part of its role will be to prepare codes of practice for providers of Regulated Services, setting out recommended steps for complying with their duties.

These codes will be important as they will set out more detail about how service providers are expected to carry out their duties of care in practice, and service providers will have to justify any departure from them. However, we anticipate that it may take some time for such codes to materialise, since the Bill includes a duty for Ofcom to consult with those with vested interests when preparing the codes.

Enforcement powers

Sanctions for non-compliance are potentially onerous. Ofcom will have a range of enforcement powers at its disposal, including:

  • Fines for non-compliance: Ofcom will have the power to fine service providers that fail to comply with their duty of care to users, up to the higher of £18 million or 10% of qualifying worldwide turnover.
  • Criminal sanctions: the Government reserves the right at a later date to impose criminal sanctions on senior managers of non-compliant service providers, where they fail to comply with requests by Ofcom for information/co-operation.
  • Running interference: Ofcom will be able to disrupt a service provider's business operations by applying to court for the grant of restriction and access orders blocking access to non-compliant sites and platforms, and restricting the ability of those which provide ancillary services (such as advertisers and payment providers) to operate via a non-compliant site.

When does the Bill come into force? Next steps

The pre-legislative scrutiny joint committee is required to report back by 10 December this year, and the Bill is expected to be introduced to Parliament in the spring of next year. Even then, there is no clear date for entry into force. Framework provisions, such as definitions and interpretation, will commence once the Bill completes its passage through Parliament and receives Royal Assent; other provisions, including the duties the Bill creates, will only come into force by way of secondary legislation, and it is not clear when that will be made.

Nevertheless, given the extent and cost of the policy and operational changes that will be needed, the platforms, sites and search engines likely to be affected – in particular the larger, better-known sites – will no doubt be busy scrutinising just what the Bill will require of them, and how they can address its requirements at a practical level. Furthermore, the UK is not alone in looking to impose greater responsibility on tech companies for the content on their sites: at the end of last year, the EU unveiled plans for its own Digital Services Act package, aimed at overhauling the digital market, which includes, among other measures, a greater onus on hosting providers and platforms to deal effectively with content notified to them as illegal.

Get in touch

  • Lora Abagero – Trainee
  • Vivien Halstead