Wednesday, March 25, 2026 - Meta Platforms has been ordered to pay $375 million after a jury found it violated consumer protection laws by misleading users about the safety of its platforms and enabling child sexual exploitation.
The verdict, delivered in New Mexico, marks the first time a
jury has ruled against the company on such claims. The case was brought by the
state’s attorney general, who accused Meta of falsely presenting Facebook,
Instagram, and WhatsApp as safe for children while failing to address harmful
content and exploitation risks.
The jury found Meta liable for engaging in unfair and
deceptive trade practices under state consumer protection law.
New Mexico Attorney General Raúl Torrez described the ruling
as a landmark decision, saying it represents a victory for families affected by
the company’s actions and sends a strong message to major technology firms.
Meta, however, said it disagrees with the verdict and plans
to appeal, maintaining that it continues to invest in safety measures and works
to remove harmful content from its platforms.
The case stemmed from a 2023 undercover investigation by the
attorney general’s office, which revealed serious gaps in Meta’s content
moderation systems. Investigators created fake accounts posing as users under
14 and reported that these accounts were quickly exposed to explicit material
and contacted by adults seeking similar content.
According to the state, Meta continued to assure the public
that its platforms were safe for children despite internal evidence
highlighting risks related to exploitation and mental health. The lawsuit also
alleged that features such as infinite scroll and auto-play videos were
deliberately designed to maximise engagement, even when they contributed to
addictive behaviour among younger users.
The jury found the company committed 75,000 violations,
awarding $5,000 per violation, which resulted in the $375 million penalty.
The ruling comes amid increasing global scrutiny of social
media platforms and growing efforts by governments to strengthen protections
for children online. Countries such as Australia,
Indonesia, and Denmark are already moving to impose
stricter age restrictions on social media use, reflecting a broader push to
address concerns around child safety, mental health, and online exploitation.
