People's Justice is not a law firm and does not provide legal advice.
Attorney advertising. Prior results do not guarantee a similar outcome.
Do You Qualify?
Eligibility Checklist
- Child or teen under 18 who used social media platforms from defendant companies (Meta/Instagram/Facebook, TikTok/ByteDance, Snap/Snapchat, Google/YouTube, X Corp.)
- Documented signs of compulsive social media use (inability to stop, excessive screen time, checking behavior, withdrawal symptoms)
- Anxiety or depression diagnosed during or after a period of heavy social media use
- Eating disorder or body dysmorphia linked to social media content, filters, or social comparison
- Self-harm or suicidal ideation connected to social media use, cyberbullying, or harmful content exposure
- Sleep disruption from late-night social media use, notifications, or blue light exposure
How Social Media Platforms Engineer Addiction in Children
In Plain Language
Social media platforms employ algorithmic manipulation techniques — originally developed for maximizing advertising revenue — to create compulsive use patterns in users. These techniques are particularly effective on children and adolescents, whose developing brains have heightened sensitivity to social validation, peer approval, and fear of exclusion. The result is a generation of young people experiencing clinically significant addiction to products designed to exploit their neurological and developmental vulnerabilities.
Algorithmic Content Amplification
Every major social media platform uses machine learning algorithms that analyze user behavior in real time and serve content designed to provoke the strongest emotional response — maximizing engagement time regardless of the content's impact on mental health. For teenagers, this means the algorithm learns to surface content about body image, social comparison, drama, self-harm, and extreme emotional material — precisely the content most harmful to developing minds. TikTok's For You Page has been documented serving self-harm content to new teen accounts within minutes of registration.
Infinite Scroll and Autoplay
Infinite scroll eliminates natural stopping points in the user experience, creating a bottomless feed that exploits the human brain's novelty-seeking behavior. Users intend to check for a few minutes and find themselves scrolling for hours. TikTok and YouTube Shorts autoplay videos continuously without any user action required, creating a passive consumption state. These design choices are not accidental — they are documented in internal company communications as deliberate strategies to maximize "time spent" metrics.
Social Validation and Variable-Ratio Reinforcement
Likes, comments, followers, and shares deliver unpredictable social validation on the same variable-ratio reinforcement schedule that makes slot machines addictive. Users compulsively check their phones because the next notification could bring the dopamine surge of social approval — or the anxiety of social rejection. For teenagers, whose identity formation depends heavily on peer validation, this creates an extraordinarily powerful behavioral trap. Instagram's like counts and TikTok's view counts turn every post into a gamble on self-worth.
Streaks, FOMO, and Compulsive Engagement
Snapchat streaks create daily obligations — teens must exchange messages every 24 hours to maintain their streak count, transforming optional social interaction into a compulsive daily requirement. Breaking a streak triggers visible social consequences. Limited-time content (Instagram Stories that disappear after 24 hours, live streams) creates fear of missing out (FOMO) that drives constant checking behavior. Push notifications are timed using optimization algorithms to interrupt users at moments calculated to maximize re-engagement.
Risk Factors
- Children's prefrontal cortex (impulse control center) does not fully develop until approximately age 25, making them neurologically unable to resist algorithmic manipulation at the same level as adults
- No meaningful age verification exists on major platforms — a 10-year-old can create an account by entering a false birth date
- Platforms are accessible 24/7 on smartphones that children carry constantly, eliminating all natural boundaries on usage
- Algorithms learn and exploit individual psychological vulnerabilities within hours of account creation, creating personalized addiction profiles
- Beauty filters and social comparison features directly trigger body dysmorphia and eating disorders in developing adolescents
Scientific Consensus
- U.S. Surgeon General has identified social media as a driving force behind the youth mental health epidemic and called for warning labels
- American Psychological Association issued a health advisory warning that social media poses a "profound risk" to children's mental health
- Meta's own internal research showed Instagram made body image worse for 32% of teen girls
- Research shows teens spending 3+ hours/day on social media face double the risk of anxiety and depression
Why This Matters for Your Case
These design choices are not accidental — they are the product of deliberate corporate strategy documented in internal communications, whistleblower testimony, and patent filings. The Facebook Papers reveal that Meta knew its products harmed children and chose to prioritize engagement metrics and advertising revenue over user safety. This knowledge, combined with the failure to implement adequate safeguards, age verification, or warnings, forms the basis of negligence, product liability, and consumer protection claims in MDL 3047.
Injured? Get a free Social Media Addiction case review.
Get Your Free Case Review or call 1-800-555-0100
The Neuroscience Behind Social Media Addiction
The neurobiological basis for social media addiction is well established. Functional magnetic resonance imaging (fMRI) studies demonstrate that social media notifications and likes activate the nucleus accumbens — the same brain region involved in substance use disorders and gambling addiction. The dopamine release triggered by social media engagement is comparable to that produced by addictive substances, creating a powerful biochemical drive to continue using the platforms.
The vulnerability of children and adolescents to these effects is the central issue in this litigation. The human prefrontal cortex, which governs impulse control, decision-making, and the ability to weigh long-term consequences against short-term rewards, does not fully mature until approximately age 25. Teenagers are neurologically predisposed to respond more intensely to social validation, peer approval, and the fear of social exclusion — precisely the psychological levers that social media platforms are designed to exploit.
The American Psychological Association's 2023 health advisory specifically addressed the developmental mismatch between adolescent neurobiology and social media design. The advisory found that social media features including algorithmic feeds, like counts, and appearance-altering filters pose particular risks to adolescents because they exploit the heightened sensitivity to social evaluation that characterizes this developmental period. Chronic heavy social media use has been associated with structural changes in brain regions involved in emotional regulation, reward processing, and self-referential thinking.
A landmark study by Twenge and Haidt (2023) demonstrated a strong temporal correlation between the rise of smartphone-based social media use beginning in 2012 and a dramatic increase in rates of depression, anxiety, self-harm, and suicide among adolescents — particularly girls. The study controlled for alternative explanations and found that social media was the most significant factor explaining the deterioration in youth mental health across multiple countries.
What Social Media Companies Knew
Internal documents obtained through litigation discovery and leaked by whistleblower Frances Haugen reveal that social media companies were not merely aware of the harm their products caused to children — they actively chose profit over safety. The Facebook Papers, a trove of internal Meta documents disclosed in 2021, showed that Meta's own researchers concluded that Instagram made body image issues worse for one in three teen girls, that teens blamed Instagram for increases in anxiety and depression, and that the platform's Explore page served eating disorder content to users who had not sought it out.
Meta's response to this internal research was not to redesign the product but to suppress the findings and accelerate the development of Instagram Kids — a version of the platform targeting children under 13. Only after the Wall Street Journal's "The Facebook Files" series in September 2021 and Frances Haugen's subsequent testimony before Congress in October 2021 did Meta shelve the Instagram Kids project, which it described as a "pause" rather than a cancellation.
TikTok's algorithm has been documented as particularly aggressive in serving harmful content to young users. Research by the Center for Countering Digital Hate found that TikTok recommended self-harm content to new teen accounts within minutes of registration. The platform's For You Page algorithm learns user preferences at extraordinary speed and creates filter bubbles that amplify harmful content — a user who pauses on a body-image-related video will be served increasingly extreme content on that topic. Snap Inc.'s Snapchat has faced particular scrutiny for its role in facilitating fentanyl distribution to minors and for its streaks feature, which creates compulsive daily engagement patterns.
The Future of Social Media Litigation
MDL 3047 represents one of the largest and most consequential product liability litigations of the 2020s. Over 1,600 cases from families across the country have been consolidated before Judge Yvonne Gonzalez Rogers in the Northern District of California. The K.G.M. bellwether trial — the first case to go to trial — began February 10, 2026 in LA County Superior Court, with TikTok and Snap settling their portions confidentially in January 2026 and Meta's Mark Zuckerberg testifying on February 18-19, 2026.
The trajectory of this litigation mirrors the early stages of tobacco litigation — which began with individual claims, consolidated into coordinated proceedings, and ultimately resulted in the $206 billion Master Settlement Agreement that transformed the tobacco industry. The social media industry's multi-billion-dollar dependence on engagement-maximizing algorithms that harm children creates both a powerful incentive for continued resistance and a correspondingly large exposure to damages if courts find these practices unlawful.
Legislative momentum is accelerating. The Kids Online Safety Act (KOSA) passed the U.S. Senate in 2024. A 42-state attorney general coalition sued Meta in October 2023. Multiple states have enacted or proposed legislation restricting social media access for minors. At the international level, the European Union's Digital Services Act imposes significant new obligations on platforms regarding algorithmic transparency and youth protection.
Social Media Addiction Settlement Tiers and Compensation Ranges
Social media addiction settlement values depend on the severity of documented harm to the child, the strength of medical and academic evidence, the specific platforms and features involved, and the duration of exposure. Based on the current litigation landscape, the K.G.M. bellwether proceedings, and comparable mass tort precedents, three compensation tiers have been projected.
Tier I — Moderate Impact
Moderate Settlement Range
Criteria
- Documented excessive social media use (screen time data, 3+ hours daily)
- Therapy or counseling for anxiety, depression, or social media-related distress
- Some academic decline during period of heavy social media use
- Used platforms from defendant companies during the relevant period
Examples
- A 14-year-old who used Instagram and TikTok 4+ hours daily for two years, experienced increased anxiety and declining grades, and whose therapist documented a connection between social media use and mental health symptoms
Tier II — Significant Impact
Significant Settlement Range
Criteria
- Diagnosed anxiety, depression, or other mental health condition linked to social media use
- Eating disorder or body dysmorphia connected to platform content or beauty filters
- Significant academic failure or social withdrawal
- Hospitalization for mental health crisis related to social media
- Cyberbullying or online harassment contributing to mental health decline
Examples
- A 15-year-old diagnosed with depression and an eating disorder after three years of heavy Instagram use, hospitalized for an eating disorder crisis, with medical records documenting the connection between body image content on Instagram and the disorder
Tier III — Severe Impact
Severe Settlement Range
Criteria
- Suicide attempt or self-harm linked to social media use or content
- Death by suicide connected to cyberbullying, harmful content, or social media addiction
- Severe eating disorder requiring inpatient treatment or residential care
- Prolonged psychiatric hospitalization
- Complete social withdrawal and functional impairment
Examples
- A 13-year-old who attempted suicide after exposure to self-harm content on TikTok and cyberbullying on Instagram, requiring psychiatric hospitalization and ongoing residential treatment, with clinical records linking the attempt directly to social media content and addiction
These ranges are estimates based on the current litigation landscape, the K.G.M. bellwether proceedings, and comparable mass tort settlements. Actual compensation depends on individual case circumstances and is not guaranteed. The bellwether trial outcomes in 2026 will provide clearer guidance on case valuations.
Who Is at Risk from Social Media Addiction?
The risk of developing social media addiction — and the strength of a potential legal claim — depends on the child's age, the intensity and duration of use, the specific platforms and features involved, and the presence of documented harm. Three primary risk profiles have emerged in the litigation.
Children Ages 10-13 (Pre-Teens)
Developing Brain / Underage Access
Common Experiences
- Creating accounts using false birth dates to bypass age minimums
- Using Instagram, TikTok, and Snapchat 3+ hours daily
- Exposure to appearance-focused content, beauty filters, and social comparison
- Developing anxiety about likes, followers, and social standing
- Encountering age-inappropriate content including self-harm and eating disorder material
Key Stat: Children in this age group have the least-developed prefrontal cortex and the highest neurological vulnerability to algorithmic manipulation. Meta's own data showed over 1 million users under 13 on Instagram despite a stated minimum age of 13. This group is particularly vulnerable to body image harm, cyberbullying, and exposure to harmful content.
Teenagers Ages 13-17
Identity Formation / Social Pressure
Common Experiences
- Using multiple platforms (Instagram, TikTok, Snapchat, YouTube) 4+ hours daily
- Deriving self-worth from likes, followers, comments, and engagement metrics
- Maintaining Snapchat streaks as daily obligations
- Comparing appearance to filtered and edited images
- Staying up late scrolling through infinite feeds, disrupting sleep
- Experiencing cyberbullying, social exclusion, or online harassment
Key Stat: Teenagers face compounded risk from both neurological vulnerability and intense social pressure. The Surgeon General's advisory notes that teens spending 3+ hours daily on social media — which includes a majority of teen users — face double the risk of depression and anxiety symptoms. Teen girls are particularly vulnerable to body image harm from Instagram and TikTok beauty filters.
Parents of Affected Children
Secondary Harm / Loss of Relationship
Common Experiences
- Discovering child's social media use is more extensive than believed
- Struggling with child's behavioral changes linked to social media
- Dealing with child's mental health crisis (self-harm, eating disorder, suicidal ideation)
- Attempting to restrict social media access and facing severe resistance
- Navigating inadequate and deliberately confusing parental controls
Key Stat: Parents have independent claims for loss of consortium — the loss of companionship and relationship with their child caused by social media addiction. Parents also face direct financial harm from medical bills, therapy costs, and treatment expenses. Parental control features on major platforms have been documented as ineffective by design.
Understanding Exposure Levels
Risk profiles are general guidelines and do not determine legal eligibility. Many factors affect the strength of an individual claim, including the specific platforms used, the documented harm, the available evidence, and the applicable state law. A free attorney consultation will evaluate your child's specific circumstances.
Internal Documents & Evidence
U.S. Surgeon General Advisory on Social Media and Youth Mental Health
“U.S. Surgeon General Dr. Vivek Murthy issued an unprecedented advisory in May 2023 identifying social media as a primary driver of the youth mental health epidemic. The advisory cited extensive evidence that social media use is associated with increased rates of anxiety, depression, body dissatisfaction, and poor sleep among adolescents. In June 2024, Dr. Murthy escalated his warning by calling for Congress to require tobacco-style warning labels on social media platforms, stating that the mental health crisis among youth is an emergency requiring urgent action.”
Impact: The Surgeon General's advisory provides the highest-level public health endorsement for the claims in MDL 3047. It establishes that the nation's top public health official has reviewed the evidence and concluded that social media poses a significant risk to children's mental health — directly supporting the legal theory that platforms are defectively designed products causing foreseeable harm.
The Facebook Papers — Internal Meta Research on Instagram and Teens
“In September 2021, the Wall Street Journal published "The Facebook Files," a series based on internal Meta documents provided by whistleblower Frances Haugen. The documents revealed that Meta's own researchers had conducted studies showing that Instagram made body image issues worse for one in three teen girls, that teens blamed Instagram for increases in anxiety and depression, that the platform's Explore page served eating disorder content to users who had not sought it out, and that Meta was aware of over 1 million reports of users under age 13 on Instagram. Despite these findings, Meta suppressed the research and accelerated development of Instagram Kids.”
Impact: The Facebook Papers are the most damaging evidence in the social media addiction litigation. They demonstrate that Meta had actual knowledge that Instagram harmed children, conducted internal research documenting the specific mechanisms of harm, and deliberately chose to prioritize revenue over safety. This evidence of knowledge and intent is critical for establishing liability and supports claims for punitive damages.
fMRI Brain Studies — Social Media Activation of Adolescent Reward Circuitry
“Multiple fMRI studies have demonstrated that social media notifications, likes, and engagement activate the nucleus accumbens — the brain's primary reward center — in adolescents at levels comparable to substance use and gambling. A 2023 study published in JAMA Pediatrics found that adolescents who habitually checked social media showed increased sensitivity to social rewards and punishments over time, with measurable changes in brain regions governing emotional regulation and self-control. The studies show that social media literally rewires developing brains to be more responsive to the platforms' engagement mechanisms.”
Impact: The neuroimaging evidence provides objective, biological proof that social media addiction is not a behavioral choice but a neurological condition with measurable brain changes. This evidence counters platform companies' arguments that users freely choose to engage and supports the classification of social media platforms as addictive products comparable to regulated substances.
APA Health Advisory on Social Media Use in Adolescence
“The American Psychological Association issued a comprehensive health advisory in May 2023 warning that social media use poses a "profound risk" to children's and adolescents' mental health. The APA's advisory board — comprising the nation's leading psychologists and researchers — found that algorithmic feeds, like counts, appearance-altering filters, and social comparison features are particularly harmful to adolescents because they exploit the heightened sensitivity to social evaluation that characterizes this developmental period. The APA called for platform redesign, stronger age verification, and regulatory action.”
Impact: The APA advisory provides authoritative scientific endorsement from the nation's primary professional organization of psychologists. Combined with the Surgeon General's advisory, it establishes an unprecedented consensus among public health authorities that social media platforms are harmful to children — strengthening the evidentiary foundation for product liability claims.
Injured? Get a free Social Media Addiction case review.
Get Your Free Case Review or call 1-800-555-0100
Government Actions on Social Media Addiction
Regulatory and legislative action on social media addiction and youth protection has accelerated dramatically since 2021. Federal agencies, state legislatures, state attorneys general, and international bodies have taken increasingly aggressive positions against social media platforms' impact on children.
U.S. Surgeon General (May 2023 / June 2024)
Issued an advisory identifying social media as a primary driver of the youth mental health epidemic; followed in June 2024 with a call for Congress to require tobacco-style warning labels on social media platforms.
The Surgeon General's back-to-back advisories represent the strongest public health condemnation of social media's impact on children. The call for warning labels places social media in the same category as tobacco — a product so dangerous that the government requires explicit health warnings.
Judicial Panel on Multidistrict Litigation (October 2022)
Created MDL 3047 — In re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation — consolidating over 1,600 cases before Judge Yvonne Gonzalez Rogers in the Northern District of California.
MDL 3047 is the largest litigation targeting social media companies for harm to children. The consolidation enables coordinated discovery, expert testimony, and bellwether trial selection for efficient resolution of the claims.
42-State Attorney General Coalition (October 2023)
Filed a coordinated lawsuit against Meta alleging the company designed Instagram and Facebook to addict children and violated COPPA by collecting data from users under 13 without parental consent.
The bipartisan coalition of 42 state AGs represents an extraordinary consensus among state law enforcement officials that Meta's practices harm children. The lawsuit seeks injunctive relief requiring platform redesign, financial penalties, and disgorgement of profits.
U.S. Senate (July 2024)
Passed the Kids Online Safety Act (KOSA) by a 91-3 bipartisan vote, imposing a duty of care on platforms to prevent harm to minors and requiring the strongest privacy settings by default for users under 17.
KOSA represents the most significant federal legislative effort to regulate social media's impact on children. The bill requires platforms to provide tools for parents and minors to limit addictive features, protect personal data, and opt out of algorithmic recommendations.
U.S. Department of Justice (August 2024)
Filed a lawsuit against TikTok and parent company ByteDance for violating the Children's Online Privacy Protection Act (COPPA) by collecting personal information from children under 13 without parental consent.
The DOJ's COPPA enforcement action against TikTok complements the private litigation in MDL 3047 and the state AG lawsuits. The suit alleges TikTok knowingly allowed children under 13 to create accounts and collected their data in violation of federal law.
Federal Trade Commission (2019)
Secured a $5 billion settlement with Meta (Facebook) for privacy violations related to the Cambridge Analytica scandal and deceptive data practices.
The $5 billion FTC settlement — the largest privacy enforcement action in history — established that Meta's business practices violated consumer privacy. While focused on data practices rather than addiction, the settlement demonstrated Meta's pattern of prioritizing profit over user protection.
State Legislatures (2023-2024)
Multiple states enacted laws restricting social media for minors: the Utah Social Media Regulation Act, Arkansas Social Media Safety Act, Texas HB 18, Florida HB 3, and others requiring age verification and parental consent.
State-level legislation is creating a patchwork of youth protection laws that impose compliance costs on platforms and establish the legislative consensus that social media access must be regulated for minors. Some laws face First Amendment challenges but demonstrate broad political support for regulation.
Key Takeaway
The regulatory landscape around social media and youth protection is evolving at extraordinary speed. The Surgeon General's warning label call, KOSA, the 42-state AG lawsuit, the DOJ's COPPA action against TikTok, and the K.G.M. bellwether trial collectively establish both the public health consensus that social media harms children and the legal and political momentum for accountability.
How Litigation Is Impacting the Social Media Industry
The social media addiction litigation has imposed significant financial, regulatory, and reputational costs on major platform companies. The combination of MDL 3047, state attorney general lawsuits, congressional scrutiny, and accelerating legislation is creating multi-front pressure that is already changing how platforms operate and how investors value these companies.
Timeline: Meta (Instagram/Facebook), TikTok/ByteDance, Snap Inc., Google/YouTube, X Corp.
Facebook Papers Published (September 2021)
The Wall Street Journal publishes "The Facebook Files" based on internal documents provided by whistleblower Frances Haugen, revealing Meta knew Instagram harmed teen mental health. Haugen testifies before Congress in October 2021, triggering massive public backlash.
MDL 3047 Created (October 2022)
The Judicial Panel on Multidistrict Litigation creates MDL 3047 — In re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation — consolidating cases before Judge Yvonne Gonzalez Rogers in the Northern District of California.
42-State AG Coalition Sues Meta (October 2023)
A bipartisan coalition of 42 state attorneys general files a coordinated lawsuit against Meta, alleging the company designed Instagram and Facebook to addict children and violated COPPA by collecting data from users under 13.
TikTok and Snap Settle K.G.M. Bellwether (January 2026)
TikTok and Snap Inc. reach confidential settlements with the plaintiffs in the K.G.M. v. Meta/YouTube bellwether case, the first social media addiction case set for trial. The settlements remove TikTok and Snap as defendants in the bellwether trial.
K.G.M. Bellwether Trial Begins (February 2026)
The first social media addiction bellwether trial — K.G.M. v. Meta/YouTube — begins February 10, 2026 in LA County Superior Court. Mark Zuckerberg testifies February 18-19, 2026. The trial will establish precedent for the remaining 1,600+ cases in MDL 3047.
Congressional Scrutiny and Regulatory Pressure
Social media companies have faced extraordinary congressional scrutiny, federal enforcement actions, state attorney general lawsuits, and international regulation in response to mounting evidence that their products harm children.
- Frances Haugen's congressional testimony (October 2021) revealed Meta suppressed internal research showing Instagram harmed teen mental health
- U.S. Surgeon General issued two consecutive advisories (2023, 2024) identifying social media as a youth mental health crisis driver and calling for warning labels
- 42 state attorneys general sued Meta in October 2023 for addicting children and violating COPPA
- Kids Online Safety Act (KOSA) passed the U.S. Senate in July 2024 with overwhelming bipartisan support
- DOJ sued TikTok in August 2024 for COPPA violations, alleging the platform collected data from children under 13
- Multiple states enacted laws restricting social media access for minors, including Utah, Arkansas, Texas, and Florida
Key Takeaway
The social media addiction litigation has created unprecedented financial and regulatory pressure on platform companies. MDL 3047, the 42-state AG lawsuit, KOSA, the Surgeon General's warning label call, and the K.G.M. bellwether trial represent a multi-front accountability campaign with enormous momentum. The trajectory mirrors tobacco litigation — and the social media industry's dependence on engagement-maximizing algorithms creates correspondingly large exposure to damages.
Notable Verdicts & Settlements
TikTok K.G.M. Bellwether Settlement
Settlement: TikTok reached a confidential settlement with the plaintiffs in K.G.M. v. Meta/YouTube, the first social media addiction bellwether case. The settlement, reached January 26-27, 2026, removed TikTok as a defendant from the bellwether trial that began February 10, 2026. While the settlement amount is sealed, TikTok's willingness to settle before trial signals recognition of significant liability exposure. The settlement terms are expected to inform the broader resolution framework for the remaining TikTok cases in MDL 3047.
Snap Inc. K.G.M. Bellwether Settlement
Settlement: Snap Inc. reached a confidential settlement with the plaintiffs in the K.G.M. bellwether case in mid-January 2026, removing Snapchat as a defendant from the trial. Snap faced particular scrutiny in the bellwether case for its streaks feature, which creates compulsive daily engagement, and for its role in facilitating contact between minors and harmful actors. The settlement signals Snap's assessment that trial outcomes would be unfavorable and is expected to influence resolution of the remaining Snapchat claims in the MDL.
Google/YouTube COPPA Settlement
Settlement: Google and YouTube agreed to pay $170 million to settle FTC and New York Attorney General allegations that YouTube violated the Children's Online Privacy Protection Act by collecting personal information from children under 13 without parental consent and using the data to serve targeted advertising. The settlement required YouTube to implement a system for identifying children's content, limiting data collection, and disabling personalized advertising on content directed at children. At the time, it was the largest COPPA enforcement action in history.
Epic Games FTC Settlement (COPPA + Dark Patterns)
Settlement: Epic Games agreed to pay $520 million to settle FTC allegations of COPPA violations and dark patterns in Fortnite. While primarily a gaming case, the settlement established critical federal precedent for enforcement against technology companies that use addictive design and deceptive practices targeting children. The FTC's findings regarding dark patterns, inadequate parental controls, and monetization of minors apply directly to social media platforms' practices and have been cited in MDL 3047 proceedings.
Meta/Facebook FTC Privacy Settlement
Settlement: Meta (then Facebook) agreed to pay $5 billion to settle FTC allegations of privacy violations stemming from the Cambridge Analytica scandal and broader deceptive data practices. The settlement — the largest privacy penalty in history — established that Meta's business model systematically prioritized data collection and engagement over user privacy and protection. The settlement required Meta to implement new privacy governance structures, but critics noted the penalty represented less than one month of Meta's revenue and failed to change the company's fundamental business model.
TikTok $92M Class Action (Privacy)
Settlement: TikTok agreed to pay $92 million to settle a class action lawsuit alleging the platform violated Illinois' Biometric Information Privacy Act (BIPA) and federal privacy laws by collecting and sharing users' biometric data — including facial geometry and voiceprints — without consent. The settlement covered users across multiple states and highlighted TikTok's aggressive data collection practices, particularly as they apply to minor users whose biometric data was collected without parental consent.
K.G.M. v. Meta/YouTube Bellwether (Pending)
Pending: K.G.M. v. Meta/YouTube is the first social media addiction bellwether trial; it began February 10, 2026, in Los Angeles County Superior Court. TikTok and Snap settled their portions of the case confidentially in January 2026. Meta CEO Mark Zuckerberg testified February 18-19, 2026. The trial will establish critical precedent for the remaining 1,600+ cases in MDL 3047 and is expected to set the framework for settlement negotiations. The outcome will determine how courts evaluate the causal connection between platform design and youth mental health harm.
Scientific Evidence
U.S. Surgeon General Advisory on Social Media and Youth Mental Health
Office of the U.S. Surgeon General (Dr. Vivek Murthy). (2023). U.S. Surgeon General Advisory
Key Findings
- Teens spending 3+ hours daily on social media face double the risk of anxiety and depression symptoms
- Social media use is associated with increased body dissatisfaction, eating disorder risk, and poor self-image, particularly among girls
- Algorithmic feeds that maximize engagement can expose children to harmful content including self-harm, eating disorder, and suicide-related material
- The Surgeon General called for tobacco-style warning labels on social media platforms in June 2024, stating the youth mental health crisis is an emergency
Adolescent Mental Health and Social Media: Generational Trends
Twenge JM, Haidt J. (2023). Journal of Adolescence / Review of General Psychology
Key Findings
- Rates of teen depression, anxiety, self-harm, and suicide increased sharply beginning in 2012 — coinciding with widespread smartphone and social media adoption
- The increase was particularly pronounced among girls, with depression rates rising 145% between 2010 and 2020
- The pattern was replicated across multiple countries and cultures, suggesting a common cause rather than country-specific factors
- Social media's impact on mental health operates through social comparison, cyberbullying, sleep disruption, and reduced in-person socialization
fMRI Evidence for Social Media Effects on Adolescent Brain Development
Maza MT, Fox KA, Kwon SJ, et al. (2023). JAMA Pediatrics
Key Findings
- Habitual social media checking in early adolescence is associated with changes in brain sensitivity to social feedback over time
- Frequent checkers showed increased neural sensitivity to social rewards and punishments in the amygdala, prefrontal cortex, and ventral striatum
- The findings suggest social media may alter the developmental trajectory of brain regions involved in motivation, self-control, and emotional regulation
- The study provides biological evidence that social media addiction involves measurable changes in brain structure and function, not just behavioral patterns
Injured? Get a free Social Media Addiction case review.
Get Your Free Case Review or call 1-800-555-0100
Social Media-Induced Anxiety, Depression, and Eating Disorders
Medical Definition
Social media addiction encompasses a cluster of mental health conditions caused or exacerbated by compulsive social media use. The primary conditions include generalized anxiety disorder (GAD), major depressive disorder (MDD), body dysmorphic disorder (BDD), anorexia nervosa and bulimia nervosa, non-suicidal self-injury (NSSI), and suicidal ideation. While social media addiction itself is not yet a standalone diagnosis in the DSM-5 or ICD-11, the American Psychological Association and the U.S. Surgeon General have identified problematic social media use as a significant contributor to diagnosable mental health conditions in children and adolescents. The clinical presentation typically involves compulsive checking behavior, anxiety when unable to access platforms, withdrawal from in-person relationships, sleep disruption, and progressive deterioration of mental health during periods of heavy use.
Symptoms
Anxiety and depression
Severe: Teens spending 3+ hours daily on social media face double the risk of anxiety and depression (Surgeon General Advisory). Symptoms include persistent worry, irritability, loss of interest in activities, hopelessness, and social withdrawal. Social comparison, fear of missing out, and cyberbullying are primary triggers.
Eating disorders and body dysmorphia
Severe: Meta's own research showed Instagram made body image worse for 32% of teen girls. Beauty filters, edited photos, and appearance-focused content create unrealistic body standards. Clinical presentations include anorexia nervosa, bulimia nervosa, and body dysmorphic disorder. 13.5% of UK teen girls reported suicidal feelings linked to Instagram.
Self-harm and suicidal ideation
Severe: Social media platforms have been documented serving self-harm and suicide-related content to vulnerable teens through algorithmic recommendation. TikTok was found to recommend self-harm content to new teen accounts within minutes. Cyberbullying on platforms is a documented trigger for self-harm and suicide attempts.
Sleep disruption
Moderate: Late-night social media use, blue light exposure from screens, notification-driven awakenings, and anxiety about missing content all contribute to chronic sleep deprivation. Insufficient sleep compounds other mental health effects and impairs academic performance, emotional regulation, and physical health.
ADHD exacerbation
Moderate: Short-form content (TikTok, Instagram Reels, YouTube Shorts) conditions the brain for rapid stimulus switching, worsening attention deficits. Children with existing ADHD are particularly vulnerable. The constant stream of novel stimuli reduces the ability to sustain attention on longer tasks like schoolwork.
Compulsive checking behavior
Moderate: Variable-ratio reinforcement from likes, comments, and notifications creates compulsive phone-checking behavior that mirrors substance craving. Studies show the average teen checks their phone 96 times per day. The behavior persists even when the user recognizes it as problematic and wants to stop.
Risk Factors
- Developing brain (under age 25, with particular vulnerability during adolescence due to heightened sensitivity to social evaluation)
- Pre-existing anxiety, depression, or body image concerns
- Female gender (teen girls disproportionately affected by body image and social comparison features)
- History of bullying, social exclusion, or low self-esteem
- Early access to social media (before age 13)
- Unlimited and unsupervised smartphone access, particularly at night
Diagnosis Process
1. Screening questionnaire: Validated instruments such as the Social Media Disorder Scale (SMDS), the Bergen Social Media Addiction Scale (BSMAS), or the PHQ-9 for depression and the GAD-7 for anxiety are administered
2. Clinical interview: A mental health professional conducts a comprehensive interview covering social media use patterns, emotional responses, and functional impact
3. Behavioral assessment: Review of screen time data, app usage reports, and social media engagement patterns to quantify the extent of use
4. Comorbidity evaluation: Assessment for co-occurring conditions including depression, anxiety, eating disorders, ADHD, and self-harm behavior
5. Family assessment: Interviews with parents and family members to understand the impact of social media use on family dynamics and daily functioning
6. Functional impact assessment: Review of academic records, sleep patterns, social relationships, and medical history to document consequences
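To illustrate the screening step above: a PHQ-9 total is simply the sum of nine item responses (each scored 0-3), mapped to the standard published severity bands. This is an illustrative sketch only, not a clinical tool; the function name is our own.

```python
def phq9_score(responses):
    """Sum nine PHQ-9 item responses (each scored 0-3) and map the
    total (0-27 scale) to the standard published severity bands."""
    if len(responses) != 9 or any(not 0 <= r <= 3 for r in responses):
        raise ValueError("PHQ-9 requires nine responses scored 0-3")
    total = sum(responses)
    if total <= 4:
        band = "minimal"
    elif total <= 9:
        band = "mild"
    elif total <= 14:
        band = "moderate"
    elif total <= 19:
        band = "moderately severe"
    else:
        band = "severe"
    return total, band

# Example: mostly "several days" responses across the nine items
print(phq9_score([1, 1, 2, 1, 0, 1, 1, 0, 1]))  # (8, 'mild')
```

A clinician interprets the score alongside the interview and functional assessments described above; the number alone does not establish a diagnosis.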
Treatment Options
Treatment Outcomes
| Treatment Approach | Reported Improvement | Long-Term Outlook |
|---|---|---|
| CBT for social media-related anxiety/depression | 70-80% show significant improvement | Long-term studies ongoing |
| Family therapy combined with CBT | 75-85% show significant improvement | Best outcomes for adolescents with family involvement |
| Eating disorder treatment (when social media-related) | 60-70% achieve remission with comprehensive treatment | Relapse risk elevated without ongoing social media management |
| Untreated / self-resolution | 25-35% show improvement without treatment | Chronic social media exposure without intervention tends to worsen outcomes |
Prognosis
The prognosis for social media-related mental health conditions depends on the severity of the condition, the age of onset, the specific platforms involved, and the availability of appropriate treatment. Adolescents who receive structured treatment — particularly CBT combined with family intervention and digital boundary-setting — show the best outcomes. Early intervention before conditions become entrenched produces significantly better results. However, the ubiquity of smartphones and the social pressure to maintain social media accounts create an environment where complete disengagement is difficult, and relapse risk remains elevated.
Support Resources
- https://www.samhsa.gov/find-help/national-helpline — SAMHSA National Helpline (1-800-662-4357)
- https://988lifeline.org — 988 Suicide & Crisis Lifeline (call or text 988)
- https://www.nationaleatingdisorders.org — National Eating Disorders Association (NEDA)
- https://www.apa.org/topics/social-media-internet — APA Social Media and Mental Health Resources
Your Legal Team
Rachel Chen
Partner — Technology & Children's Rights Litigation
San Francisco, CA
Rachel Chen has spent 18 years representing families harmed by technology products, with a particular focus on children's safety and digital rights. Her background in psychology gives her a unique ability to explain the neuroscience of social media addiction to judges and juries. Rachel was among the first attorneys to file social media addiction claims following the Facebook Papers disclosures and serves on the Plaintiffs' Steering Committee in MDL 3047. She has personally represented over 200 families in social media addiction claims and was recognized by the National Law Journal for her work on children's technology safety litigation.
Education
- J.D., UC Berkeley School of Law
- B.A., Psychology, Stanford University
David Okonkwo
Senior Associate — Youth Mental Health & Mass Tort Litigation
New York, NY
David Okonkwo combines a decade of mass tort litigation experience with a master's degree in public health epidemiology, bringing a data-driven approach to social media addiction cases. His work on social media litigation began in 2021 after the Facebook Papers revelations, and he has since built one of the most comprehensive evidentiary databases connecting platform design features to documented youth mental health outcomes. David has represented families in MDL 3047, the state AG coalition case against Meta, and individual actions against TikTok and Snap. He is known for his ability to translate complex epidemiological and neuroscience evidence into compelling legal narratives.
Education
- J.D., Columbia Law School
- M.P.H., Epidemiology, Johns Hopkins Bloomberg School of Public Health
Frequently Asked Questions
Social Media Addiction Filing Deadlines
Every state has a statute of limitations — a legal deadline — for filing a social media addiction lawsuit. If you miss this deadline, you lose your right to seek compensation. Because social media addiction develops gradually and its connection to platform design may not be immediately apparent, most states apply a "discovery rule" that affects when the clock starts.
The Discovery Rule: When Does the Clock Start?
Social media addiction is not a single event — it is a gradual process that develops over months or years. Children do not become addicted overnight, and parents may not recognize the connection between a platform's design and their child's mental health deterioration until much later. Most states apply a discovery rule that starts the statute of limitations when you discovered — or reasonably should have discovered — that the platform's design caused harm to your child. For many families, this moment came after media coverage of the Facebook Papers in 2021, the Surgeon General's advisory in 2023, or a mental health diagnosis.
Applies to: Social media platforms with addictive design features (algorithmic feeds, infinite scroll, notifications)
Real-World Examples
A parent discovers their 15-year-old's therapist has linked the teen's depression and anxiety to Instagram use in 2024
In most states, the statute of limitations starts in 2024 when the therapist identified social media as a contributing cause of the mental health condition — not when the child first created an Instagram account years earlier.
A teen is hospitalized for self-harm in 2025 and clinical evaluation reveals a pattern of harmful TikTok content exposure
The hospitalization and clinical evaluation linking TikTok to the self-harm is typically considered the discovery date, giving the family the full statute of limitations period from that point to file a claim.
Parents learn in 2024 that their 12-year-old has been using Instagram since age 10 by lying about their age, and has developed an eating disorder linked to body image content
The discovery of both the underage account and the connection between Instagram content and the eating disorder may trigger the discovery rule in most states, starting the clock from when parents learned of the harm and its connection to the platform.
Social Media Addiction Lawsuit Filing Deadlines: State-by-State Guide
Statutes of limitation for personal injury claims involving addictive social media design
| State | SOL Period | Discovery Rule | Notable Exception |
|---|---|---|---|
| California | 2 years | Yes — starts at discovery of injury and cause | MDL 3047 is consolidated in N.D. Cal. Strong consumer protection laws (CLRA, UCL). 3-year SOL for fraud claims. Meta headquartered in CA. |
| Texas | 2 years | Yes — discovery rule applies | Large youth population. Texas HB 18 enacted 2023 restricting social media for minors. Texas DTPA provides additional cause of action. |
| New York | 3 years | Yes — discovery rule applies | Longer SOL than most states. Active state-level legislation on social media and youth protection. Strong consumer fraud statutes. |
| Florida | 4 years | Yes — discovery rule applies | Generous filing window. Florida HB 3 enacted 2024 restricting social media for minors under 16. FDUTPA available. |
| Illinois | 2 years | Yes — discovery rule applies | Illinois BIPA may provide additional claims for platforms using facial recognition or biometric data. BIPA carries statutory damages up to $5,000 per violation. |
| Washington | 3 years | Yes — discovery rule applies | Home state of numerous tech companies. Washington Consumer Protection Act is among the strongest in the nation. |
| Georgia | 2 years | Yes — discovery rule applies | Large youth population. Georgia Fair Business Practices Act provides additional consumer protection causes of action. |
| Ohio | 2 years | Yes — discovery rule applies | Ohio Consumer Sales Practices Act claims have a 2-year SOL. Multiple Ohio school districts have filed claims in the social media MDL. |
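The deadlines in the table above reduce to simple date arithmetic: the discovery date plus the state's SOL period. The sketch below is illustrative only — actual deadlines turn on tolling rules (for example, minority tolling), case-specific facts, and the discovery date a court accepts. The function name and dictionary are our own.

```python
from datetime import date

# State SOL periods (in years) taken from the table above; illustrative only.
SOL_YEARS = {"California": 2, "Texas": 2, "New York": 3, "Florida": 4,
             "Illinois": 2, "Washington": 3, "Georgia": 2, "Ohio": 2}

def filing_deadline(state: str, discovery_date: date) -> date:
    """Approximate deadline: discovery date plus the state's SOL period.
    Does not model tolling for minority, fraud, or other exceptions."""
    target_year = discovery_date.year + SOL_YEARS[state]
    try:
        return discovery_date.replace(year=target_year)
    except ValueError:  # discovery fell on Feb 29 and the target year is not a leap year
        return discovery_date.replace(year=target_year, day=28)

# Example: a therapist links the harm to the platform on March 15, 2024
print(filing_deadline("Florida", date(2024, 3, 15)))     # 2028-03-15
print(filing_deadline("California", date(2024, 3, 15)))  # 2026-03-15
```

Minority tolling in many states pauses the clock until the child turns 18, which is why an attorney's case-specific analysis matters far more than the raw arithmetic.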
Bottom Line
If your child has been harmed by social media, do not wait. Filing deadlines are real, and MDL 3047 is actively proceeding toward bellwether trials. Consulting an attorney now ensures your claim is preserved and you are positioned for any settlement that may result.
This table provides general guidance. Actual deadlines depend on your specific circumstances, including when you discovered the connection between your child's social media use and their harm. An attorney can determine your exact deadline based on the facts of your case and the applicable state law.
In-Depth Guides
Instagram Teen Mental Health Lawsuit
Meta's Instagram is the most heavily scrutinized defendant in the social media addiction litigation. The Facebook Papers revealed that Meta's own research showed Instagram made body image issues worse for 32% of teen girls, and the company suppressed the findings. Instagram's Explore page, beauty filters, like counts, and algorithmic feed have been directly linked to eating disorders, depression, anxiety, and self-harm in teens.
Read guide
Meta/Facebook Child Safety Lawsuit
Meta Platforms — parent company of Instagram and Facebook — is the primary defendant in social media addiction litigation. The Facebook Papers showed Meta knew its products harmed children and chose profit over safety. A 42-state AG coalition sued Meta in October 2023. Meta CEO Mark Zuckerberg testified in the K.G.M. bellwether trial in February 2026. Meta's $5 billion FTC settlement for privacy violations demonstrates the company's pattern of prioritizing engagement over user protection.
Read guide
Snapchat Youth Lawsuit
Snap Inc.'s Snapchat faces unique litigation claims centered on its streaks feature — which creates compulsive daily engagement obligations — and its role in facilitating harmful contacts between minors and predatory actors. Snap settled its portion of the K.G.M. bellwether case in mid-January 2026. The platform's disappearing messages feature has also been linked to cyberbullying and sextortion targeting minors.
Read guide
Social Media & Eating Disorders
Social media platforms — particularly Instagram and TikTok — have been directly linked to the surge in eating disorders among adolescents. Meta's own research showed Instagram made body image worse for 32% of teen girls. Beauty filters, "fitspiration" content, calorie counting features, and algorithmic amplification of pro-eating-disorder content create a toxic environment that triggers and exacerbates anorexia, bulimia, and body dysmorphic disorder in developing teens.
Read guide
Social Media Lawsuit Settlement Amounts
Social media addiction settlement amounts are projected to range from $10,000 for moderate cases to $1,000,000+ for severe cases involving suicide or death. The K.G.M. bellwether trial (Feb 2026) will establish valuation benchmarks. Prior settlements by TikTok ($92M class action), Google/YouTube ($170M COPPA), and Meta ($5B FTC) demonstrate the platforms' massive financial exposure.
Read guide
Social Media & Teen Suicide Lawsuit
Social media platforms have been linked to a dramatic increase in self-harm and suicide among adolescents, particularly girls. Research shows that self-harm rates among teen girls increased 62% between 2009 and 2019 — a period coinciding with widespread social media adoption. Platforms' algorithms have been documented serving suicide-related and self-harm content to vulnerable teens, and cyberbullying on platforms has been identified as a direct trigger for suicidal behavior.
Read guide
TikTok Addiction Lawsuit
TikTok, owned by ByteDance, faces mounting litigation alleging its For You Page algorithm is the most aggressively addictive content delivery system in the social media industry. TikTok has been documented serving self-harm content to new teen accounts within minutes. The platform settled its portion of the K.G.M. bellwether case confidentially in January 2026, and the DOJ sued TikTok for COPPA violations in August 2024.
Read guide
YouTube Kids Addiction Lawsuit
Google's YouTube faces litigation alleging its autoplay algorithm and YouTube Shorts feature are designed to maximize viewing time in children through continuous, passive content delivery. YouTube already paid $170 million for COPPA violations in 2019. YouTube remains a defendant in the K.G.M. bellwether trial alongside Meta, with the trial beginning February 10, 2026.
Read guide
State-Specific Information
Sources & References
- U.S. Surgeon General Advisory on Social Media and Youth Mental Health (May 2023) — Office of the U.S. Surgeon General
- U.S. Surgeon General Call for Warning Labels on Social Media (June 2024) — Office of the U.S. Surgeon General
- APA Health Advisory on Social Media Use in Adolescence (2023) — American Psychological Association
- Twenge JM, Haidt J. "Adolescent Mental Health Crisis" — Generational Trends in Anxiety, Depression, and Self-Harm (2023) — Journal of Adolescence
- Facebook Papers — Internal Meta Research on Instagram and Teen Mental Health (2021) — Internal Meta Documents / Wall Street Journal
- Center for Countering Digital Hate — "Deadly by Design" Report on TikTok Self-Harm Content (2022) — CCDH
- fMRI Evidence for Social Media Activation of Nucleus Accumbens Reward Circuitry in Adolescents — Psychological Science
- In re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, MDL No. 3047 — Northern District of California