On February 27, 2026, a federal court in Virginia issued a decision with significant implications for state efforts to regulate minors’ use of social media. In NetChoice v. Jay Jones, the U.S. District Court for the Eastern District of Virginia granted a preliminary injunction blocking enforcement of Virginia Senate Bill 854, a statute that would have imposed daily time limits on minors’ access to social media platforms.
The ruling underscores a recurring theme in digital regulation: while states have a legitimate interest in protecting children online, that interest is constrained by the First Amendment’s limits on government regulation of speech.
Supreme Court Precedent: Moody v. NetChoice
The Virginia court’s analysis builds on the Supreme Court’s July 2024 decision in Moody v. NetChoice, LLC, which addressed First Amendment challenges to social media content moderation laws enacted by Florida and Texas. Although the Court did not rule on the constitutionality of either state’s law (holding instead that neither court of appeals had properly analyzed the facial nature of NetChoice’s challenges), it signaled that content moderation by major platforms is likely protected expressive activity and that states cannot force platforms to carry speech they would otherwise remove or deprioritize.
The Court explained that social media platforms exercise protected editorial discretion when they curate, moderate, and organize user-generated content. Writing for the majority, Justice Kagan emphasized that the First Amendment’s core principles “do not vary” with changes in technology, citing Brown v. Entertainment Merchants Association for the proposition that constitutional protections extend to new media just as they did to traditional forms of expression.
While Moody concerned state laws that sought to restrict platforms’ ability to moderate content, its recognition of platforms’ First Amendment interests provided a doctrinal foundation for the Virginia court’s analysis. If platforms possess editorial discretion over how they present content, then laws that compel platforms to limit users’ access to that content necessarily implicate First Amendment concerns—both for the platforms and for the users seeking to receive protected speech.
The Law at Issue
Virginia enacted SB 854 in response to mounting concerns about the effects of social media on youth mental health. The statute required social media platforms to use “commercially reasonable methods” to determine whether a user is under sixteen years old. Once a user was identified as a minor, the platform was required to limit the user’s access to one hour per day, unless a parent affirmatively consented to expand or remove that restriction.
The law contained several content‑based exemptions, covering platforms primarily devoted to news, sports, entertainment, e‑commerce, provider‑curated content, and interactive gaming. Each violation carried a civil penalty of up to $7,500.
The Challenge by NetChoice
NetChoice, a trade association representing major online platforms such as Facebook, Instagram, YouTube, and Reddit, challenged SB 854 on First Amendment grounds, arguing that the statute impermissibly restricts minors’ access to constitutionally protected speech and, in practice, forces platforms to implement age‑verification measures that burden all users, including adults.
The Court’s Analysis
A Content‑Based Restriction on Speech
The court’s analysis turned on a threshold constitutional question: whether SB 854 is content‑neutral or content‑based. That distinction is critical, because content‑based restrictions are subject to strict scrutiny and are presumptively unconstitutional.
The court concluded that SB 854 is content‑based for two independent reasons. First, the statute’s exemptions draw distinctions based on subject matter, treating news, sports, entertainment, and gaming differently from other forms of expression. Under Reed v. Town of Gilbert, such subject‑matter distinctions render a law content‑based regardless of legislative motive.
Second, the court found that the statute favors “provider‑selected” or curated content over user‑generated content. In the court’s view, this preference amounts to a content‑based distinction that advantages certain speakers and modes of expression over others.
Failure Under Strict Scrutiny
Because SB 854 is content‑based, the court applied strict scrutiny, requiring Virginia to show that the law serves a compelling governmental interest and is narrowly tailored to achieve that interest.
The court accepted that Virginia has a compelling interest in protecting children from the potential harms of excessive social media use. The record included evidence that social media can be addictive, that minors are particularly susceptible to engagement‑maximizing design features, and that heavy social media use is associated with increased rates of anxiety, depression, and self‑harm among youth. The court also noted the U.S. Surgeon General’s 2023 advisory identifying children’s social media use as a public health concern.
Nonetheless, the court held that SB 854 is not narrowly tailored. The statute’s age‑verification requirement burdens all users—adults and minors alike—by conditioning access to protected speech on proof of age. The court further emphasized the availability of less restrictive alternatives, including parental‑control tools already offered by platforms, school‑based smartphone restrictions, and public‑education efforts aimed at increasing parental awareness of existing safeguards.
Perhaps most significantly, the court rejected the statute’s default structure: restricting minors’ access to protected speech unless and until a parent intervenes. The court emphasized that minors possess First Amendment rights and that the government may not broadly suppress access to protected expression as a default rule, even in pursuit of child‑protective objectives.
Implications Going Forward
The decision adds to a growing body of case law confronting state attempts to regulate minors’ online activity. Courts have repeatedly acknowledged the seriousness of youth mental‑health concerns while expressing skepticism toward regulations that directly limit access to protected speech or impose broad, user‑wide compliance burdens.
The ruling does not foreclose state action in this area. To the contrary, the court explicitly recognized Virginia’s compelling interest in protecting children online. What it makes clear, however, is that regulatory approaches must be carefully calibrated—particularly where they rely on access restrictions or age‑verification mandates that affect adult users and protected expression.
For online platforms, the decision provides continued relief from legislatively imposed time‑limit regimes, at least for now. For lawmakers, it reinforces a clear lesson: future legislation will need to focus less on blunt access restrictions and more on design‑based, opt‑in, or parental‑empowerment strategies that are less likely to trigger strict scrutiny.
As concern over youth mental health and social media continues to intensify, this case will not be the final word. But it offers a useful roadmap for understanding the constitutional boundaries that any durable regulatory solution must respect.