Delegated Legislation Committee
The UK Parliament debated the draft Online Safety Act 2023 regulations, focusing on categorizing online services into three risk levels: Category 1, 2A, and 2B. Concerns were raised about the exclusion of “small but risky” platforms from the highest risk category, potentially leaving vulnerable users at risk. Despite opposition, the regulations passed, emphasizing the need for swift implementation to enhance online safety. The government promised to continue reviewing the legislation’s effectiveness and to tackle smaller high-risk sites through Ofcom’s dedicated taskforce.
Summary
- The Draft Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025 were debated in the UK Parliament. These regulations aim to categorize online services based on their user numbers and functionalities, which affects the level of regulatory duties they must follow.
- Category 1 services, which will carry the most stringent duties, are platforms with more than 34 million monthly active UK users, or more than 7 million where the service also offers certain functionalities, such as content recommendation and re-sharing. These services will have additional responsibilities such as transparency reporting, protecting journalistic and democratic content, and giving users more control over the content they see.
- Category 2A targets high-risk search engines, focusing on the potential harm to individuals from illegal or child-harmful content, while Category 2B covers other high-risk and high-reach platforms.
- The regulations were criticized for not addressing the risks posed by “small but risky” services that do not meet the user thresholds. Critics argue that these platforms can pose significant dangers, such as promoting self-harm or targeting minorities with harmful content.
- The government defended the regulations, explaining that the Secretary of State’s hands are tied by the Act, which was amended during its passage to focus on user numbers and dissemination speed rather than inherent risks. They emphasized that illegal content duties and child safety codes apply to all services, including smaller ones.
- Concerns were also raised about livestreaming functionalities, which critics believe should automatically place a service into Category 1 due to the potential for real-time harm and child exploitation.
- The government acknowledged Ofcom’s role in reviewing small but risky services and gave assurances that the regulatory framework would be regularly reviewed and potentially updated.
- The regulations passed by 10 votes to 3, reflecting divided opinion on the adequacy of the current approach to online safety.
Divisiveness
The session exhibited significant disagreement, centered primarily on the interpretation and implementation of the Online Safety Act 2023 as it concerns the categorization of online services. The disagreement was robust and extended across multiple points of contention, but the majority of members ultimately supported the motion, indicating vocal but not polarized opposition.
- Disagreement on Categorization Thresholds: There was intense disagreement about the thresholds set for categorizing online services into Categories 1, 2A, and 2B. Several members, such as Martin Wrigley, Kirsty Blackman, and Sir Jeremy Wright, argued that the thresholds (7 million and 34 million monthly users) were set too high and would exclude smaller but potentially harmful services. This was a central point of contention, with some members feeling that the thresholds did not reflect the intent of Parliament to focus on risk rather than just size.
- Interpretation of the Act: There was significant discord on the interpretation of the Online Safety Act itself. The Minister, Feryal Clark, maintained that the Secretary of State was constrained by the Act’s requirement to focus on the number of users for categorization, while others, such as Kirsty Blackman and Sir Jeremy Wright, argued that the Act allowed for more flexibility, citing specific sections like Schedule 11, which they believed enabled consideration of functionalities and other factors.
- Concerns about Small but Risky Services: Members expressed frustration over the perceived failure to adequately address smaller platforms that could pose higher risks. For example, Robin Swann and Sir Jeremy Wright highlighted the dangers posed by platforms promoting self-harm, suicide, and targeted abuse, arguing that these sites should be included in Category 1 despite their smaller size.
- Implementation and Enforcement: There were worries about how the Act would be enforced, particularly against small, high-risk services. Martin Wrigley raised concerns about the effectiveness of Ofcom’s regulatory approach and the lack of enforcement action against such sites. The Minister responded by explaining Ofcom’s capabilities and the existence of a dedicated taskforce, but this did not fully quell the concerns.
- Voting Outcome: Despite the strong disagreements voiced by multiple members, the vote resulted in a 10 to 3 majority in favor of the motion. This indicated that while there was considerable debate and opposition, the majority of the Committee was willing to support the regulations as drafted, possibly due to the urgency expressed by the Minister to implement the Act swiftly.
Overall, the session demonstrated significant disagreement, but the decision to pass the regulations suggests that the opposition, while vocal, was not extensive enough to defeat the motion. The divisiveness rating of 4 reflects the intensity and number of disagreements voiced, while acknowledging that the motion was ultimately approved, suggesting underlying areas of agreement or acceptance.