After years of development, the UK's Online Safety Bill has cleared its final hurdles and is entering into force, sparking very mixed reactions. The bill aims to hold social media companies accountable, but its scope and ambition have expanded over the years.
Enforced by Ofcom, the UK's telecommunications regulator, it requires companies – large and small – to remove illegal content and prevent children from viewing harmful content. Its remit has gradually grown to cover other offences too, from cyberflashing to animal cruelty to online fraud.
“Our common-sense approach will deliver a better future for Britons, by ensuring that what is illegal offline is also illegal online,” said Michelle Donelan, Secretary of State for Science, Innovation and Technology. “It prioritizes the protection of children and allows us to catch keyboard criminals and crack down on the heinous crimes they seek to commit.”
Companies that fail to comply with the law face fines of up to £18 million (€20 million) or 10% of their global annual turnover, whichever is greater – potentially billions of euros in the case of the largest platforms.
The bill has been challenged at every stage, and its final version will do little to allay concerns. Perhaps the thorniest issue is encryption, with the bill giving Ofcom the power to issue notices to force companies to scan private messages for illegal content.
Earlier this month the government appeared to backtrack somewhat on this issue, with Lord Parkinson of Whitley Bay making a statement.
“When deciding whether to issue a CSAM search notice, Ofcom will work with the service to identify reasonable and technically feasible solutions to address the risk of child sexual exploitation and abuse, including by drawing on the evidence of a report from a skilled person,” he said. “If there is no suitable technology that meets these requirements, Ofcom cannot require its use.”
The bill has been welcomed by many, from Which?, a consumer group that campaigned for the inclusion of scam adverts, to charities such as the National Society for the Prevention of Cruelty to Children (NSPCC).
“Technology companies can now seize the opportunity to adopt security by design,” says Sir Peter Wanless, chief executive of the NSPCC.
The decision to weaken the requirement for tech companies to break encryption should prevent Signal and WhatsApp from disappearing from the UK in the near future.
However, some rights groups are still not satisfied.
“Although the UK government has admitted that it is not possible to securely scan all of our private messages, it has given Ofcom the powers to force tech companies to do so in the future,” says James Baker, head of campaigns at the Open Rights Group.
“These are powers more suited to an authoritarian regime than a democracy, and could harm journalists and whistleblowers, as well as victims of domestic violence, parents and children who want to protect their communications from predators and online harassers,” he adds.
Joe Mullin, senior policy analyst at the Electronic Frontier Foundation, said: “If regulators claim the right to require the creation of dangerous backdoors in encrypted services, we expect encrypted messaging services to keep their promises and withdraw from the UK if the UK government compromises their ability to protect users.”
The new requirement that scanning must be “technically feasible” allows Ofcom to postpone its confrontation with end-to-end encryption, probably indefinitely.
But Will Cathcart, head of WhatsApp, said in a tweet: “The fact is that scanning everyone’s messages would destroy privacy as we know it. This was as true last year as it is today. @WhatsApp will never break our encryption and remains vigilant against threats to do so.”
This article was originally published on forbes.fr.