Facebook’s parent company, Meta, has said that end-to-end encryption across its messaging apps will not arrive until 2023. The technology ensures that only the sender and receiver can read a message; neither law enforcement nor Meta itself can access its contents.
Child advocacy organizations and lawmakers, however, have cautioned that the change could make it harder for police to investigate child abuse. Private messaging, according to the National Society for the Prevention of Cruelty to Children (NSPCC), is “the front line of child sexual abuse”.
Priti Patel, the UK Home Secretary, has also criticized the technology, stating earlier this year that it might “severely impede” law enforcement efforts to combat illegal activities, such as online child abuse.
Privacy vs. Protection
End-to-end encryption protects data by scrambling it as it travels between phones and other devices. In general, the only way to read a message is to gain physical access to an unlocked device that sent or received it. Meta’s popular messaging app WhatsApp already uses the technology, but the rest of the company’s products do not.
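WhatsApp’s implementation is built on the Signal protocol, but the core idea can be shown in a few lines. Below is a minimal sketch, not Meta’s actual implementation, using the open-source PyNaCl library: each person keeps a private key on their own device, and anything relayed in between is unreadable ciphertext.

```python
from nacl.public import PrivateKey, Box

# Each party generates a keypair; the private half never leaves the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts using her private key and Bob's public key.
alice_box = Box(alice_private, bob_private.public_key)
ciphertext = alice_box.encrypt(b"only Bob can read this")

# A relay server (or anyone tapping the wire) sees only random-looking bytes.

# Bob decrypts with his private key and Alice's public key.
bob_box = Box(bob_private, alice_private.public_key)
assert bob_box.decrypt(ciphertext) == b"only Bob can read this"
```

Because the keys live only on the two endpoints, the platform operator cannot decrypt the traffic it carries, which is exactly what makes the technology contentious in this debate.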
Last year, the NSPCC sent Freedom of Information requests to 46 police forces in England, Wales, and Scotland, asking for a breakdown of the platforms used to commit sexual offenses against children.
According to the responses, more than 9,470 instances of child sexual abuse images and online child sex offenses were recorded. Over a third of the incidents occurred on Instagram and 13 percent on Facebook and Messenger, with relatively few on WhatsApp.
This has raised concerns that Meta’s plans to extend encryption to the widely used Facebook Messenger and Instagram direct messages will hide the bulk of offenders from detection.
According to the NSPCC, encrypting messages by default could make it easier for child abuse images to spread and for online grooming to go undetected.
Proponents counter that encryption protects users’ privacy and prevents both governments and malicious hackers from spying on them. Meta’s CEO, Mark Zuckerberg, made those arguments himself when he announced Facebook’s encryption plans.
Additional Preventative Measures
Antigone Davis, Meta’s global head of safety, explained that encryption was being delayed until 2023 because the company wanted to “get this right”.
The company had previously said the transition would take place no later than 2022.
Ms. Davis said, “As a company that connects billions of people around the world and has built industry-leading technology, we’re determined to protect people’s private communications and keep people safe online.”
She also outlined several additional preventative measures that the company had already implemented, including:
- Using “proactive detection technology” that scans for suspicious patterns of activity, such as a user who repeatedly creates new profiles or messages large numbers of people they don’t know (a simplified sketch of this kind of heuristic follows this list).
- Educating young people with in-app tips on how to avoid unwanted interactions.
- Placing under-18 users in private or “friends only” accounts by default, and barring adults from messaging them unless they are already connected.
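Meta has not published how its detection technology works, so the following is purely a hypothetical sketch of the kind of metadata-based heuristic the description suggests. It inspects account behavior, never message content, which is why this style of screening can keep running even after messages are encrypted end to end. All field names and thresholds here are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class AccountActivity:
    # Hypothetical metadata signals; Meta's real inputs are not public.
    account_age_days: int
    profiles_created_recently: int   # fresh profiles tied to the same person
    messages_to_non_contacts: int    # first messages sent to strangers
    total_messages: int

def is_suspicious(a: AccountActivity) -> bool:
    """Flag an account for human review using metadata only.

    Illustrative rules: an account that churns through new profiles,
    or that overwhelmingly messages strangers, gets flagged.
    """
    churns_profiles = a.profiles_created_recently >= 3 and a.account_age_days < 30
    stranger_ratio = (
        a.messages_to_non_contacts / a.total_messages if a.total_messages else 0.0
    )
    mostly_strangers = stranger_ratio > 0.8 and a.messages_to_non_contacts > 50
    return churns_profiles or mostly_strangers

# A week-old account with several new profiles that has messaged 120
# strangers out of 130 total messages would be queued for review.
print(is_suspicious(AccountActivity(7, 4, 120, 130)))  # True
```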
The NSPCC’s head of child safety online strategy, Andy Burrows, welcomed Meta’s delay. He said: “They should only go ahead with these measures when they can demonstrate they have the technology in place that will ensure children will be at no greater risk of abuse. More than 18 months after an NSPCC-led global coalition of 130 child protection organizations raised the alarm over the danger of end-to-end encryption, Facebook must now show they are serious about the child safety risks and not just playing for time while they weather difficult headlines.”