Jury Condemns Meta On Child Safety. Investors Buy The Dip
Azul Cibils Blaquier

A New Mexico jury ordered Meta to pay $375 million after finding the company willfully violated state consumer protection law by misleading users about the safety of Facebook, Instagram, and WhatsApp and enabling child sexual exploitation on its platforms. It’s the first jury verdict of its kind against the company.
Meta’s stock rose 5% in after-hours trading.
The seven-week trial centered on evidence that Meta knew its platforms were being used by predators and chose not to implement basic safeguards. New Mexico’s investigation included undercover agents who created accounts posing as children and documented the sexual solicitations that followed, along with Meta’s response, or lack of one. Internal company documents presented at trial showed employees had warned about widespread exploitation. One internal discussion revealed that Zuckerberg’s 2019 decision to make Facebook Messenger end-to-end encrypted by default would eliminate the company’s ability to report approximately 7.5 million instances of child sexual abuse material to law enforcement. Meta rolled out the encryption anyway.
The jury found Meta liable on all counts, including “unfair and deceptive” and “unconscionable” trade practices. Jurors awarded the maximum penalty of $5,000 per violation across thousands of violations.
The $375 million is less than a fifth of the roughly $2.1 billion New Mexico sought. For a company valued at $1.5 trillion, it’s roughly what Meta generates in revenue every 14 hours. Perhaps unsurprisingly, the market viewed this as a rounding error.
But the legal consequences extend beyond the dollar amount. The next phase of the case will seek additional financial penalties and court-mandated changes to Meta’s platforms, including effective age verification, removal of predators, and protections for minors in encrypted communications. Meta said it “respectfully disagrees” and will appeal.
New Mexico is the first state to win a jury verdict, but it won’t be the last. Meta faces similar lawsuits from dozens of state attorneys general and hundreds of school districts across the country over how its platforms affect children’s mental health. Those suits accuse Meta of designing features like infinite scroll and auto-play video specifically to maximize engagement among minors, despite internal evidence that those features were driving depression, anxiety, and self-harm.
The legal reckoning could be coming for all tech companies. Last week, three Tennessee teenagers filed a class action lawsuit against Elon Musk’s xAI, alleging that Grok’s image generation tools were used to create sexually explicit images and videos of them from their real photos.
Researchers at the Center for Countering Digital Hate estimated Grok generated approximately 23,000 sexualized images of children in an 11-day period alone, roughly one every 41 seconds. The lawsuit alleges xAI knew this would happen and released the product anyway.
xAI is facing simultaneous investigations in the U.S., EU, UK, France, Ireland, and Australia.
Two different companies, two different technologies, the same failure: building products they knew would be used to exploit users, children among them, and choosing to ship them anyway. Could New Mexico’s Meta verdict be a turning point?
Well, given that Meta’s stock rose 5% after the verdict, the market’s take is clear: $375 million isn’t accountability. It’s a cost of doing business.