Juries Slam Meta and YouTube: Addiction, Mental Crisis, and Endangering Minors Exposed in Landmark Lawsuits
Juries in California and New Mexico have found Meta (Facebook, Instagram) and YouTube liable for the mental health crises and endangerment of young users through platform design. Specifically, a New Mexico jury found Meta liable for failing to protect minors from sexual abuse, awarding $375 million. A California jury also assigned Meta 70% of the fault for a young woman's depression and anxiety, awarding $3 million.
Multiple voices argue the platforms were intentionally engineered to exploit youth. 'MicroWave' stated that Meta and YouTube designed their platforms to hook young users without concern for their well-being. Furthermore, 'xiao' noted that in New Mexico, Meta's algorithms allegedly directed adults toward content posted by teenage users while the company concealed internal risk findings. Reports also detail claims of negligence for failing to warn users about the dangers of Instagram and YouTube.
The weight of the legal findings points to a consensus: major tech platforms designed their systems with systemic disregard for user mental health and safety. The primary fault line centers on the algorithmic promotion of content and the alleged prioritization of engagement over child protection.
Key Points
#1 Meta found liable for endangerment of minors
A New Mexico jury awarded $375 million against Meta, citing failure to protect children from predators.
#2 Platform design cited for mental distress
A California jury found Meta and Google liable for a young woman’s depression and anxiety, assigning Meta 70% fault.
#3 Accusations of addictive design
'MicroWave' reported that Meta and YouTube knowingly designed their platforms to be addictive to young users, leading to a $3 million award.
#4 Algorithmic failure to protect youth
'xiao' claimed Meta's algorithms steered adults toward content posted by teenage users while the company hid known internal risks.
#5 Negligence in warning users
A Los Angeles jury found Meta and YouTube negligent for not adequately warning users about the dangers of their own products.
Source Discussions (4)
This report was synthesized from the following Lemmy discussions, ranked by community score.