The first stand-alone trial brought by state prosecutors in a wave of lawsuits against Meta (META) will begin in New Mexico on Monday, Feb. 2, 2026.
The case stems from a state undercover investigation in which officials created decoy social media accounts posing as children to document sexual solicitations and track Meta’s responses, according to the Associated Press.
Prosecutors argue that Meta, owner of Facebook, Instagram, and WhatsApp, created a “marketplace and breeding ground” for predators, while failing to disclose what it knew about harmful effects on minors.
Attorney General Raúl Torrez filed suit in 2023, asserting that Meta prioritized profits over child safety. The trial, with opening statements scheduled for Feb. 9, could last nearly two months and may set new legal precedents for how states hold social media companies accountable under consumer protection and nuisance laws.
“So many regulators are keyed up looking for any evidence of a legal theory that would punish social media,” said Eric Goldman, codirector of the High Tech Law Institute at Santa Clara University School of Law. “A victory in this case could have ripple effects throughout the country, and the globe.”
Prosecutors are not targeting content itself but Meta’s algorithms, which they say amplify addictive and harmful material for children. That approach could bypass Section 230 of the U.S. Communications Decency Act, which normally shields tech companies from liability for user-posted content.
In the state’s undercover investigation, the decoy accounts posed as minors 14 and under while investigators documented sexual solicitations and monitored Meta’s responses. Prosecutors say those responses prioritized profit over safety, and they are calling for better age verification, removal of bad actors, and algorithmic changes to prevent harm.
Meta denies the civil claims, accusing authorities of cherry-picking documents and using “sensationalist” arguments. CEO Mark Zuckerberg was dropped as a defendant, though documents and depositions in the case bear his name. Meta maintains that its platforms provide safety tools, content filters, and user education features to protect teens.