A New Mexico jury has ordered Meta Platforms Inc. to pay $375 million in civil penalties after finding the social media giant failed to adequately protect children from sexual predators and misled users about platform safety. The landmark verdict represents one of the most significant legal rulings against a social media company over online child protection concerns.
The civil trial, which began with opening arguments on February 9 in a Santa Fe courthouse, concluded Tuesday after six weeks of testimony. New Mexico Attorney General Raúl Torrez brought the case against Meta, alleging the company violated state consumer protection laws through inadequate safeguards on Facebook and Instagram.
After deliberations that began Monday, jurors found that Meta willfully violated the state's Unfair Practices Act. The verdict followed prosecutors' arguments that Meta prioritized user engagement and growth over child safety measures.
Undercover Investigation Reveals Safety Gaps
The case stemmed from a 2023 undercover operation conducted by the New Mexico Attorney General's office, which created a fake social media profile of a 13-year-old girl. According to Attorney General Torrez, the account "was simply inundated with images and targeted solicitations" from child abusers.
During the trial, investigators testified about conducting a broader sting operation using test accounts designed to simulate young users. Court filings revealed these accounts quickly received explicit messages, sexual images, and inappropriate offers from adults, including proposals involving pornography and large financial payments.
State attorney Linda Singer told jurors during closing arguments that "the safety issues presented in this case were not accidents. They were the result of corporate decisions that prioritized growth and engagement over the safety of children."
The investigation led to several arrests connected to online exploitation cases, according to authorities.
Former Meta Researcher Testifies on Platform Risks
Key testimony came from former Meta safety researcher Arturo Béjar, who described personal experience with platform safety issues. Béjar testified that his teenage daughter received inappropriate messages shortly after opening an Instagram account, highlighting vulnerabilities in the platform's user protection systems.
"The product is very effective at connecting people with similar interests," Béjar told the court. "If someone's interest is children, the system can connect them with children."
Béjar's testimony focused on how Meta's recommendation algorithms could unintentionally facilitate connections between predators and minors. His insights as a former company insider provided crucial evidence about internal awareness of safety vulnerabilities.
Internal Documents Reveal Scale of Problem
Court documents presented during the trial revealed internal Meta communications that warned executives about widespread harmful activity on the platforms. According to the evidence, company estimates suggested hundreds of thousands of potential exploitation incidents occur daily across Facebook and Instagram.
These internal communications demonstrated that Meta leadership was aware of the scale of predatory behavior on their platforms but allegedly failed to implement adequate protective measures. The documents became central to prosecutors' arguments that Meta's safety failures were deliberate rather than accidental.
Prosecutors argued that Meta did not properly enforce its stated minimum age requirement of 13 and failed to prevent harmful content and predatory behavior from reaching minors on the platforms.
Meta Plans Appeal Despite Safety Investment Claims
Meta announced plans to appeal the decision. In a statement following the ruling, a company spokesperson said, "We respectfully disagree with the verdict and will appeal."
The company emphasized its ongoing safety investments, stating, "We work hard to keep people safe on our platforms. We are clear about the challenges of identifying bad actors online, and we will continue improving our systems."
During the trial, Meta's legal team argued that the company has developed advanced automated detection tools and committed significant resources to online safety initiatives. They contended that Meta employs tens of thousands of workers dedicated to removing harmful content and protecting users across its platform ecosystem.
Broader Legal Implications for Tech Industry
The New Mexico verdict is part of a broader wave of legal challenges facing major technology companies over social media platform safety. Similar lawsuits are pending in other states, with California currently pursuing cases against Meta and Google's YouTube over claims related to social media addiction and youth mental health impacts.
Legal experts suggest the ruling could significantly influence future litigation involving Big Tech companies and may accelerate efforts to implement stricter regulation of social media platforms, particularly regarding minor protection requirements.
The case represents a shift in how courts evaluate corporate responsibility for user safety on digital platforms, potentially establishing new precedents for holding technology companies accountable for harmful content and predatory behavior.
Appeal Process and Industry Impact
Meta's planned appeal is expected to take months to move through the court system. The final outcome could have wide-ranging implications for the technology industry, particularly for companies operating social media platforms with significant numbers of minor users.
Industry observers note that the $375 million penalty represents a substantial financial consequence that could influence how other technology companies approach child safety measures and content moderation policies.
The case also highlights growing public and regulatory concern about social media platform safety, particularly regarding vulnerable user populations. As similar cases proceed in other jurisdictions, technology companies may face increasing pressure to demonstrate more robust safety measures and transparent reporting about platform risks.
The verdict comes amid broader discussions about social media regulation and corporate responsibility for online safety, with lawmakers and advocates calling for stronger protections for minors using digital platforms.
Photo credit: New Mexico Attorney General's Office
