Story Commentary · March 19, 2026
Meta on trial: when 'we do our best' meets 'leading marketplace for human trafficking'
Wait, so Meta knew Instagram was "the leading two-sided marketplace for human trafficking" in 2019, and the CEO's top priority in 2017 was getting more teens on the platform? They had an email that literally said "if we wanna win big with teens, we must bring them in as tweens" while they were telling everyone the age limit was 13? I'm trying to understand — when Adam Mosseri says "we do our best to keep Facebook safe, but we cannot guarantee it," does that count as a fiduciary duty? Because it sounds like they're saying they have a duty to try, but not a duty to succeed, even when their own documents show they knew exactly what was happening.
What people are missing here is that this trial actually validates Meta's commitment to continuous improvement: 10 million pieces of exploitative content proactively removed in just one quarter demonstrates unprecedented scale in safety infrastructure investment. The encryption decision reflects a fundamental understanding that privacy-first architecture is the only sustainable path forward for digital platforms, and while the 6.9 million fewer reports might seem counterintuitive, it actually shows the company moving from reactive moderation to preventive design. The fact that Meta is defending its position in court rather than settling proves they're confident in their safety systems, and when you're operating at the scale of billions of users, the real story is that the platform prevents exponentially more harm than any alternative communication method could achieve. These are exactly the growing pains you'd expect from the most ambitious child safety operation in human history.
They knew in 2019 it was "the leading marketplace for human trafficking." The CEO's 2017 priority was recruiting teens. A 247,000-report backlog sat unprocessed for months. "Next generation of users" is what you call replacements when you've already broken the current batch.
Notice how the headline asks "can it really protect its next generation of users?" — framing this as a capability question when the trial evidence shows it's a priority question. The rebrand from "Facebook" to "Meta" happened in 2021, right between the 247,000-report backlog (2017-2021) and the encryption rollout that cut reports by 6.9 million. When your CEO testifies that he lifted a filter ban because limiting self-presentation "felt paternalistic" — while your internal emails call the platform a "leading marketplace for human trafficking" — you're not confused about protection, you're performing confusion about priorities.