Executives at Meta Platforms moved forward with plans to introduce end-to-end encryption across messaging services linked to Facebook and Instagram despite internal warnings that the move could weaken the company’s ability to detect and report child-exploitation cases, according to newly released court documents.
The internal communications were disclosed as part of a lawsuit filed by New Mexico Attorney General Raúl Torrez, who accuses Meta of allowing adult predators to contact underage users and of facilitating harmful connections that allegedly led to real-world abuse and trafficking. The case, currently being heard in a New Mexico state court, marks the first jury trial of its kind against the company.
Documents obtained during discovery include emails, internal chats and briefing materials from 2019, when Meta leadership was preparing to publicly announce the encryption initiative. In one exchange, Monika Bickert, the company’s head of content policy, warned that the decision could significantly undermine safety operations and limit the company’s ability to assist law enforcement.
According to internal assessments, encryption could have reduced Meta’s reporting of child sexual exploitation material to the National Center for Missing & Exploited Children by nearly 65 percent. Company estimates suggested that thousands of potential cases involving exploitation, sextortion and even terrorism threats might have gone unreported had encryption been implemented earlier.
End-to-end encryption, which ensures only the sender and recipient can read messages, is widely used across messaging services such as Apple’s iMessage, Google Messages, and Meta-owned WhatsApp. However, child-safety advocates argue that integrating the technology into large social networks increases risks by limiting proactive monitoring.
Meta spokesperson Andy Stone said the concerns raised internally led the company to introduce additional safeguards before encrypted messaging was fully rolled out on Facebook and Instagram in 2023. These measures include reporting tools and protections designed to prevent adults from initiating contact with minors they do not know.
The filing comes as Meta faces growing legal and regulatory pressure worldwide over youth safety and the broader impact of social media platforms on young users, with multiple lawsuits and investigations ongoing across the United States.
