In a controversial move, Meta, the parent company of Facebook, has come under fire from the UK government over its decision to implement automatic encryption for all messages on its Facebook and Messenger platforms. Home Secretary James Cleverly called the move a “significant step back” for child safety, expressing concern that the encryption could empower child sexual abusers.
Meta’s decision to introduce end-to-end encryption means that only the sender and receiver of messages will have access to the content, rendering it inaccessible to anyone else, including law enforcement. Cleverly emphasized that the lack of appropriate safety measures in this encryption rollout could impede the efforts of police and the National Crime Agency (NCA) to bring offenders to justice.
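The principle described above can be illustrated with a toy sketch. The snippet below is not Meta’s actual protocol (Messenger’s rollout builds on far more sophisticated cryptography, including the Signal protocol); it is a minimal one-time-pad example showing the core idea that a party without the shared key, such as a server relaying the message, learns nothing about its content.

```python
import secrets

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR each byte of the message with the shared key (a one-time pad):
    # without the key, the ciphertext reveals nothing about the content.
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

message = b"meet at noon"
# In a real system the two devices derive this via a key exchange;
# here it is simply generated and assumed to be known only to them.
shared_key = secrets.token_bytes(len(message))

ciphertext = encrypt(shared_key, message)  # all a relay server ever sees
recovered = decrypt(shared_key, ciphertext)
```

Only the holder of `shared_key` can turn the ciphertext back into the message, which is why law enforcement, and Meta itself, cannot read end-to-end encrypted conversations in transit.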
The new features, which include encrypted calls, are immediately available to users. However, Meta anticipates that it will take some time to extend end-to-end encryption to over 1 billion users as the default option. During this transition, users will be prompted to set up a recovery method to restore their messages.
Child safety advocates, including the NSPCC’s Sir Peter Wanless and Susie Hargreaves from the Internet Watch Foundation, have strongly criticized Meta’s move. Wanless accused Meta of turning a blind eye to crimes against children, and Hargreaves expressed outrage, urging other platforms not to follow Meta’s example.
Meta’s Head of Messenger, Loredana Crisan, defended the decision, highlighting the added layer of security provided by end-to-end encryption. Crisan stated that this security measure protects the content of messages and calls from the moment they leave the sender’s device until they reach the recipient’s device. Notably, Instagram, another Meta-owned platform, will not be subject to end-to-end encryption at this time, although Meta previously announced that the change would apply to Instagram after completing the Messenger upgrade. WhatsApp, also owned by Meta, already utilizes end-to-end encryption for its conversations.
Robust safety measures
The debate over encryption has spilled into the UK’s Online Safety Act, with a provision allowing the communications watchdog, Ofcom, to order messaging services to use “accredited technology” to detect and remove child sexual abuse material. Privacy campaigners have raised concerns that this provision could threaten end-to-end encryption. Messaging apps have even threatened to withdraw from the UK market over the dispute. The government clarified that the regulator would only intervene if content scanning was technically feasible and met minimum standards of privacy and accuracy.
In response to the criticism, a Meta spokesperson asserted that the company had implemented “robust” safety measures, such as preventing adults over 19 from messaging teenagers who do not follow them. The ongoing clash between privacy concerns and child safety advocacy continues to shape the future landscape of social media and messaging platforms.