By Lily Sternberg, JHBL Staff Member

Instead of playing with blocks, kids are handed iPads and computers as early as infancy to keep them occupied. Due to this uptick in screen time for minors, lawmakers are interested in regulating what children see on social media.[1]  In addition to implicating constitutional freedom-of-speech issues, these laws raise a question: at what point do lawmakers have a duty, separate from that of parents, to step in and protect children’s mental health?[2]  Most recently, Florida lawmakers have proposed a complete ban on social media accounts for those under the age of sixteen, legislation that goes far beyond what other states have tried to implement, but previous injunctions against such laws foreshadow a possible defeat for lawmakers.[3]

Congress has considered measures to protect minors’ activity on social media and their privacy rights on such platforms, but it has not acted.[4]  Cases in Ohio and Arkansas both recently resulted in preliminary injunctions because the states’ content-based laws were likely unconstitutional.[5]  In NetChoice, LLC v. Griffin, NetChoice argued that the law’s vagueness violated due process and that its age-verification requirements targeted speech based on its content.[6]  On the vagueness claim, the court reasoned that a law must be precise enough to make clear what facts must be proven to establish a violation.[7]  The court explained that this law fails because it inadequately defines which entities are subject to it.[8]  On the content-based speech claim, the court agreed that strict scrutiny would generally apply; however, for purposes of the injunction, it applied intermediate scrutiny.[9]  The court reasoned that the law would fail even under intermediate scrutiny because it is not narrowly tailored and places a significant burden on adults, who must prove their age to access protected speech.[10]  The Ohio court came to similar conclusions when it issued a preliminary injunction in NetChoice, LLC v. Yost.[11]  Like the Arkansas court, the Ohio court concluded that the law infringed on operators’ and minors’ protected speech and that its vagueness violated due process.[12]

NetChoice’s success could help lawmakers foresee future challenges to such laws and predict what type of regulation would likely be struck down by the Supreme Court. The Supreme Court has acknowledged that minors are entitled to First Amendment protection and that governments may interfere with that right only in narrow circumstances.[13]  It has also noted that punishing third parties, here social media platforms, for displaying protected speech to children simply because their parents may disapprove is not a proper governmental aid to parental authority.[14]  With these laws being blocked at the state level, similar legislation is unlikely to fare better at the federal level; even if the issue reached the Supreme Court, such a law would likely be struck down on similar grounds.[15]  To avoid that outcome, lawmakers must craft a law that is not so overinclusive that it restricts adults’ freedoms, but also not so underinclusive that it fails to achieve the desired effect of protecting children from harmful content.[16]  With little legislation succeeding, lawmakers should direct their efforts toward regulating the content children actually consume rather than their access to the platforms.[17]  To truly protect children’s mental wellbeing, that regulation should include fighting misinformation, controlling propaganda, and addressing the moral ills that influence the way children think and feel.[18]

The clash between the government’s interest in protecting children and children’s own constitutional rights will remain the core issue in regulating minors’ online activity. Although states continue to push legislation restricting minors’ access to social media in the name of their mental health, courts will likely continue to find constitutional violations in these restrictions and block their implementation. Given this signal, lawmakers should aim to find alternative ways to protect children online and craft regulations that make children safer and smarter in their online environment.

Disclaimer: The views expressed in this blog are the views of the author alone and do not represent the views of JHBL or Suffolk University Law School. 

Lily Sternberg is a second-year staff member originally from Chicago, IL. She attended the University of Wisconsin-Madison, where she earned her Bachelor of Business Administration in Management and a certificate in Global Health Studies. Lily is interested in corporate transactional law, including corporate governance and cannabis law.


[1] See generally Social Media and Children 2023 Legislation, NCSL (Jan. 26, 2024), https://www.ncsl.org/technology-and-communication/social-media-and-children-2023-legislation [https://perma.cc/38MZ-4VLB] (listing current legislation relating to children and social media); see also Cristiano Lima-Strong, DeSantis Vetoes Florida bill banning social media for most kids, Wash. Post (Mar. 1, 2024), https://www.washingtonpost.com/technology/2024/03/01/florida-social-media-ban-desantis-veto/ [https://perma.cc/X5FS-RF2L].  DeSantis vetoed the nation’s most restrictive bill yet proposed over concerns that it did not include a parental supervision clause.  Id.; see also Ruth Reader, States getting serious about limiting kids social media exposure, POLITICO (Jan. 13, 2024), https://www.politico.com/news/2024/01/13/kids-online-states-social-media-00135390 (discussing different state actions and the need for federal regulation); see also Pam Rutledge, Why Proposed Social Media Bans Won’t Keep Your Kids Safe, Fielding Graduate University (May 5, 2023), https://www.fielding.edu/why-proposed-social-media-bans-wont-keep-your-kids-safe/ [https://perma.cc/4PBH-YKNX] (explaining how legislative approaches fail to address children’s mental health online).

[2] See Lima-Strong, supra note 1.  The Florida measure is likely to face constitutional challenges like those that have stopped similar laws in other states.  Id.  Federal judges have granted injunctions against such laws, reasoning that they likely violate free speech.  Id.; see also Andrew DeMillo, Judge blocks Arkansas law requiring parental approval for minors to create social media accounts, PBS (Aug. 31, 2023), https://www.pbs.org/newshour/politics/judge-blocks-arkansas-law-requiring-parental-approval-for-minors-to-create-social-media-accounts [https://perma.cc/JZ84-VLUU] (showing the court siding with the First Amendment argument).

[3] See Online Access to Materials Harmful to Minors, Fla. H.B. 3 (2024) (awaiting approval by Governor).  The bill requires social media platforms to prohibit certain minors from creating accounts.  Id.; Lima-Strong, supra note 1.  This is one of the most restrictive bills put forward by the states.  Id.  Preliminary injunctions were previously issued in other states over freedom-of-speech concerns.  Id.; see also Social Media and Children 2023 Legislation, supra note 1 (describing other proposed legislation throughout the country regulating children’s social media activity); NetChoice, LLC v. Griffin, No. 5:23-CV-05105, 2023 U.S. Dist. LEXIS 154571 (W.D. Ark. Aug. 31, 2023).  The court granted NetChoice an injunction against the state law due to its unconstitutional vagueness and its unjustifiably broad suppression of First Amendment rights.  Id. at *38-57; NetChoice, LLC v. Yost, No. 2:24-cv-00047, 2024 U.S. Dist. LEXIS 24129 (S.D. Ohio Feb. 12, 2024).  The court granted an injunction because of the act’s unconstitutional vagueness and because the act is not narrowly tailored, even assuming the state has an interest in protecting minors.  Id. at *16-39.

[4] See Protecting Kids on Social Media Act, S. 1291, 118th Cong. (2023) (introduced Apr. 26, 2023).  The bill would require social media platforms to verify the age of account holders and would limit access to such platforms based on children’s age.  Id.; see also Rebecca Shabad & Liz Brown-Kaiser, Children under 13 would be banned from social media under bipartisan Senate bill, U.S. Senator Brian Schatz of Hawaii (Apr. 26, 2023), https://www.schatz.senate.gov/news/in-the-news/children-under-13-would-be-banned-from-social-media-under-bipartisan-senate-bill [https://perma.cc/6PPM-7W8B] (describing the introduced bill).

[5] See NetChoice, LLC v. Griffin, No. 5:23-CV-05105, 2023 U.S. Dist. LEXIS 154571 (W.D. Ark. Aug. 31, 2023).  If the state’s purpose is to restrict access to constitutionally protected speech on the theory that the speech is harmful to minors, the law is subject to strict scrutiny.  Id. at *47; see also NetChoice, LLC v. Yost, No. 2:24-cv-00047, 2024 U.S. Dist. LEXIS 24129 (S.D. Ohio Feb. 12, 2024).  The court agreed with NetChoice’s argument that the regulation is content based because the act’s many exceptions to the definition of a social media company show the state is targeting companies based on content or viewpoint.  Id. at *49-50.

[6] See NetChoice, LLC v. Griffin, No. 5:23-CV-05105, 2023 U.S. Dist. LEXIS 154571, at *38 (W.D. Ark. Aug. 31, 2023) (discussing the constitutional void-for-vagueness doctrine); id. at *47-57 (explaining content-based speech, why strict scrutiny applies, and how the act violates that standard).

[7] See NetChoice, LLC v. Griffin, No. 5:23-CV-05105, 2023 U.S. Dist. LEXIS 154571, at *41-42 (W.D. Ark. Aug. 31, 2023) (citing United States v. Williams, 553 U.S. 285, 306 (2008)).

[8] See NetChoice, LLC v. Griffin, No. 5:23-CV-05105, 2023 U.S. Dist. LEXIS 154571, at *40 (W.D. Ark. Aug. 31, 2023).  The act fails to define the phrase “primary purpose” for purposes of the statute’s scope.  Id.  The confusion over applicability was clear when state officials gave conflicting answers as to whether certain social media platforms were subject to the law.  Id. at *43.  The act also fails to define what constitutes sufficient proof of a parent’s or legal guardian’s express consent.  Id. at *44.

[9] See NetChoice, LLC v. Griffin, No. 5:23-CV-05105, 2023 U.S. Dist. LEXIS 154571, at *49 (W.D. Ark. Aug. 31, 2023).  The court leaned toward NetChoice’s argument that the restrictions in Act 689 are subject to strict scrutiny.  Id.  It declined to reach that conclusion definitively at this early stage of the proceedings and instead applied intermediate scrutiny.  Id.

[10] See id. at *49.  Even under intermediate scrutiny, the law fails because the act is not narrowly tailored.  Id. at *53-54.  The connection between social media and the claimed harms to children is not well established, and there is no showing that a parent’s involvement in account creation reflects an intent to be involved in the child’s experience on the platform.  Id. at *59-60.

[11] See NetChoice, LLC v. Yost, No. 2:24-cv-00047, 2024 U.S. Dist. LEXIS 24129 (S.D. Ohio Feb. 12, 2024).  NetChoice raised the same claims as in the Arkansas case and additionally argued that the law is unconstitutional because it is impermissibly both overinclusive and underinclusive with respect to minors’ access to First Amendment-protected speech.  Id. at *16.

[12] See id. at *34.  The act is not narrowly tailored to protect children against oppressive contracts, as the Attorney General argued.  Id.  The act is underinclusive in how it regulates access to and dissemination of speech, yet overinclusive in its aid to parental authority.  Id. at *36-37.  Clarity in the law is especially important where free speech is at stake, but this act does not provide precise definitions, which invites arbitrary application.  Id. at *38-39.

[13] See NetChoice, LLC v. Griffin, No. 5:23-CV-05105, 2023 U.S. Dist. LEXIS 154571, at *52-53 (W.D. Ark. Aug. 31, 2023).  Minors have significant First Amendment protection, and only in narrow instances may the government prevent the dissemination of information to them.  Id.  The state’s power is not a free-floating power to restrict the ideas to which children may be exposed.  Id.  Content that is neither obscene nor otherwise subject to legitimate proscription cannot be restricted merely because the legislature thinks it unsuitable.  Id.

[14] See NetChoice, LLC v. Yost, No. 2:24-cv-00047, 2024 U.S. Dist. LEXIS 24129, at *36 (S.D. Ohio Feb. 12, 2024).  The Supreme Court has expressed doubt that filling the gap left by unconcerned parents is a state interest strong enough to clear the constitutional threshold.  Id. at *36-37.  The government cannot restrict speech to children whose parents would not otherwise care about the content they are seeing.  Id. at *37.

[15] See NetChoice, LLC v. Griffin, No. 5:23-CV-05105, 2023 U.S. Dist. LEXIS 154571 (W.D. Ark. Aug. 31, 2023).  NetChoice is likely to succeed on the merits of its vagueness claim.  Id. at *45.  NetChoice is also likely to succeed on its First Amendment claim.  Id. at *63; see also NetChoice, LLC v. Yost, No. 2:24-cv-00047, 2024 U.S. Dist. LEXIS 24129 (S.D. Ohio Feb. 12, 2024) (showing the law’s failure to pass constitutional muster).

[16] See NetChoice, LLC v. Griffin, No. 5:23-CV-05105, 2023 U.S. Dist. LEXIS 154571, at *54-64 (W.D. Ark. Aug. 31, 2023).  The court noted that other courts have struck down similar laws requiring parental consent for the sale of violent video games to minors.  Id. at *54.  A law cannot be so underinclusive that it declares something harmful yet leaves it to a single parent to deem it acceptable.  Id. at *55.  Nor can a law be so overinclusive that it restricts access for minors whose parents do not object.  Id.  The court also observed that the connection between the claimed harms and social media is only weakly supported by the data.  Id. at *56.

[17] See Rutledge, supra note 1.  The goal should not be to restrict children but to teach them to be social-media literate and responsible online.  Id.  None of the laws help children understand the online world.  Id.  Bans overlook unintended consequences, such as making access more appealing or giving parents a false sense of security.  Id.  Making parents the administrators of their children’s accounts ignores the level of privacy children deserve.  Id.; see also NetChoice, LLC v. Griffin, No. 5:23-CV-05105, 2023 U.S. Dist. LEXIS 154571, at *59-60 (W.D. Ark. Aug. 31, 2023) (explaining why parental consent does not solve the problem).  As the court pointed out, a parent’s one-time consent does not mean the parent intends to regulate what the child sees.  Id.

[18] See generally Rutledge, supra note 1.  No restriction will teach kids how to handle societal challenges such as thinking critically and staying safe in a digital world.  Id.  Social media is a social currency that lets children know what is happening in the world.  Id.  Monitoring teens and controlling what they do online does not let them grow and learn; it makes them prisoners of their parents and erodes trust between them.  Id.