Meta Platforms has faced significant scrutiny regarding the safety of children on its social media platforms, with CEO Mark Zuckerberg and Instagram head Adam Mosseri recently testifying in a trial in New Mexico. The case pits Meta against the state's attorney general, Raúl Torrez, who accuses the company of prioritizing profits and user engagement over the safety of minors, claiming that it knowingly allowed predators to exploit children on Facebook and Instagram.

In his deposition, Zuckerberg acknowledged the harsh reality of operating a platform with billions of users, stating, “If you’re serving billions of people, the unfortunate reality is that some very small percent of them are going to be criminals.” He emphasized that while the company aims to combat harmful behavior, preventing all exploitation is unattainable.

The trial, which started in February and is expected to last approximately seven weeks, has revealed troubling statistics. Attorneys for the state presented evidence indicating that as many as 500,000 children received inappropriate communications on Instagram each day in 2020. The court heard that an algorithm designed to suggest new connections was a main driver of exploitation cases and partly responsible for that figure. Despite these findings, Meta has insisted it has made substantial investments in safety technology and that the welfare of users is a priority, pointing to protective measures for teens implemented in 2024.

The internal challenges of policing harmful interactions have also been brought to light. Testimony revealed that even after stricter settings for teen accounts were implemented, gaps remained, with instances in which accounts belonging to minors were still recommended to adults flagged as potential violators. Furthermore, the decision to introduce end-to-end encryption for Facebook Messenger in 2023 has drawn criticism from child safety advocates, who argue that it hampers the detection of illegal activity, including the sharing of child sexual abuse material.

Mosseri pointed out that Meta has built systems to identify accounts that exhibit potentially suspicious behavior and prevent those accounts from engaging with younger users. He noted that in 2025 alone, these efforts helped identify millions of accounts on Facebook and Instagram that posed risks to minors, which were subsequently restricted from interacting with them.

As the trial progresses, it highlights the complexities of managing user safety at such scale. While Meta’s leaders acknowledge the potential for abuse, they assert that their commitment to improving safety features continues. The proceedings underscore the tension at the heart of the case: how to protect vulnerable users while balancing the privacy and safety concerns inherent in digital communication platforms.

This case illuminates the broader challenge faced by social media companies in safeguarding children in an ever-evolving online environment. As the trial unfolds, it is clear that the ongoing dialogue about child safety on social media will play a critical role in shaping future policies and technological innovations aimed at preventing exploitation while preserving user rights.
