For years, parents have watched their children disappear into their phones — scrolling for hours, losing sleep, withdrawing from the world around them — and wondered whether anyone would ever hold the companies responsible. This week, two juries answered that question with a resounding yes.
In back-to-back verdicts that sent shockwaves through Silicon Valley, juries in California and New Mexico handed Meta and YouTube historic defeats. A Los Angeles jury found both companies liable for deliberately designing addictive platforms that harmed a young woman’s mental health, awarding $6 million in damages. One day earlier, a New Mexico jury ordered Meta to pay $375 million for failing to protect children from predators on Instagram and Facebook.
These verdicts are not just legal headlines. They represent a seismic shift in how the law treats the technology your children use every day — and they could open the door for thousands of families to seek justice.
What Happened in These Cases
The Los Angeles case centered on a young woman identified in court documents only by her initials, KGM. Her attorneys argued that Instagram and YouTube were deliberately engineered to be addictive — that features like infinite scroll, autoplay, push notifications, and algorithmic recommendation systems were not accidents of design but intentional choices made to maximize the time young users spent on the platforms, regardless of the consequences to their mental health.
The jury agreed. It found both Meta and Google (YouTube’s parent company) negligent and awarded $6 million in damages — $3 million in compensatory damages and $3 million in punitive damages. Meta was held responsible for 70 percent of the liability, with YouTube accounting for the remaining 30 percent.
The New Mexico case took a different but equally damning angle. The state’s attorney general, Raúl Torrez, sued Meta in 2023 after an undercover investigation revealed that a fake profile set up to look like a 13-year-old girl was immediately inundated with predatory messages and sexually explicit content. The jury found Meta liable on all counts — including willfully engaging in “unfair and deceptive” and “unconscionable” trade practices — and ordered the company to pay $375 million.
The Legal Theory That Changed Everything
What makes these cases so significant is the legal theory at their core. For years, tech companies have argued that they are protected by Section 230 of the Communications Decency Act, which generally shields platforms from liability for content posted by their users. These lawsuits bypassed that defense entirely.
Instead of arguing that Meta or YouTube should be held responsible for specific posts or messages, the plaintiffs argued that the platforms themselves are defectively designed products — like a car with faulty brakes or a children’s toy with a choking hazard. The claim is not about what appears on the screen. It is about the engineering decisions behind the screen: the algorithms that feed vulnerable teenagers an endless stream of harmful content, the design patterns that exploit developing brains, and the corporate decisions to prioritize engagement metrics over child safety.
This is the exact same legal framework that brought down Big Tobacco decades ago. Tobacco companies did not lose in court because they sold cigarettes. They lost because they knew their products were addictive and harmful, marketed them to young people anyway, and actively concealed the evidence of the damage they were causing. The parallels to Big Tech are now impossible to ignore.
Why Parents Should Pay Attention Right Now
If you are a parent whose child has struggled with social media addiction, anxiety, depression, self-harm, eating disorders, or online exploitation, these verdicts matter for three reasons.
First, a jury has now confirmed what many parents have long suspected. These platforms are not neutral tools. They are engineered to capture and hold your child’s attention using techniques that their own designers know can cause psychological harm. Internal documents that emerged during the trial showed that these companies were aware of the damage and chose profits over safety.
Second, the legal landscape has fundamentally shifted. Before this week, no jury had found a major social media company liable for addiction-related harm under a product defect theory. That barrier is now broken. The precedent has been set, and it will make it significantly easier for families across the country to bring similar claims.
Third, there are approximately 2,000 similar cases pending in federal court. These verdicts will influence settlement negotiations and trial strategies in every single one of them. If your child has been harmed, the window to take action is open — and it may not stay open forever.
The Tobacco Comparison Is Not Hyperbole
When people hear the comparison between Big Tech and Big Tobacco, it can sound like exaggeration. It is not. The pattern is strikingly similar.
In the 1950s and 1960s, tobacco companies funded their own research, discovered that nicotine was addictive and that smoking caused cancer, and then buried the findings. They marketed to teenagers. They designed their products to maximize addiction. When the lawsuits started, they called them frivolous. They said personal responsibility was the issue, not product design. They fought in court for decades.
Eventually, the evidence became undeniable. Juries started siding with plaintiffs. A trickle of verdicts became a flood. The result was a $206 billion Master Settlement Agreement that reshaped an entire industry.
We are watching the same story unfold with social media. Internal documents from Meta have shown that the company’s own researchers warned that Instagram was toxic for teenage girls. Those warnings were downplayed or ignored. Mark Zuckerberg himself took the stand during the Los Angeles trial — a move that many legal analysts believe may have hurt Meta’s defense more than it helped.
The tobacco industry once seemed untouchable too. It no longer does. Big Tech is following the same path.
What the Evidence Revealed About These Companies
One of the most striking aspects of these trials was the volume of internal evidence that surfaced during discovery. Internal research documents, employee communications, and corporate presentations showed that both Meta and Google were well aware of the harm their platforms were causing to young users.
Meta’s own researchers had concluded that Instagram made body image issues worse for one in three teenage girls. Internal presentations acknowledged that the platform’s recommendation algorithms were pushing vulnerable users toward increasingly harmful content. Despite these findings, the company chose not to implement meaningful safety changes because doing so would have reduced user engagement — and advertising revenue.
Mark Zuckerberg’s decision to testify during the Los Angeles trial was a gamble that many legal observers believe backfired. Facing direct questioning about what he knew and when he knew it, his responses did little to convince jurors that Meta had acted responsibly. The jury’s decision to award punitive damages — damages specifically designed to punish wrongful conduct — suggests they found the company’s behavior not just negligent but intentionally harmful.
In New Mexico, the evidence was equally damaging. The state’s undercover investigation demonstrated in real time how quickly a child’s account could be targeted by predators. Within hours of creating a profile designed to look like a 13-year-old, investigators were receiving sexually explicit messages and friend requests from adult accounts with histories of predatory behavior. Meta’s safety systems failed to flag or prevent any of it.
What About the Appeals?
Meta has already indicated it plans to appeal both verdicts. That is expected. Large corporations almost always appeal unfavorable jury verdicts, and the appeals process can take years.
However, an appeal does not erase the verdict. It does not change the fact that a jury of ordinary citizens heard the evidence and decided that these companies acted negligently and deceptively. Appeals courts review legal questions — whether the judge applied the law correctly, whether evidence was properly admitted — but they do not re-weigh the facts. The jury’s findings of fact carry enormous weight.
More importantly, the appeal process does nothing to stop the 2,000 additional cases working their way through the courts. If anything, these verdicts will accelerate the pace of litigation and increase the pressure on tech companies to settle.
What Families Can Do Now
If your child has experienced harm that you believe is connected to social media use — whether that means addiction, depression, anxiety, self-harm, exposure to predatory behavior, eating disorders, or suicidal ideation — you may have legal options that did not exist before this week.
The most important step is to speak with an attorney who understands this area of law. Every case is different, and the facts of your child’s experience matter. But the legal framework for holding these companies accountable is now firmly in place, and families deserve to know their rights.
CONTACT THE WATSON FIRM
If your child has been harmed by social media, we want to hear your story.
Schedule a consultation to discuss your family’s legal options.
Watch our full video breakdown of both verdicts:
youtube.com/@WatsonWins
Disclaimer: This article is for informational purposes only and does not constitute legal advice. Every case is unique, and past results do not guarantee future outcomes. If you believe your child has been harmed, please consult with a qualified attorney to discuss your specific situation.