By: Kyla Goolsby
The emergency authorization of Pfizer's coronavirus vaccine brought the nation its first ray of hope in tackling the COVID-19 pandemic. Optimism grew again with the subsequent emergency authorization of Moderna's vaccine; however, the complexities of vaccination rollout, combined with the influence of social media platforms, pose significant obstacles to improving public health outcomes. In an effort to share vaccination facts and dismantle frequently shared conspiracy theories, scientists and physicians have taken to the popular platform TikTok, producing quick videos on available COVID-19 vaccines. Other TikTok users are creating videos in opposition, speaking to unconfirmed and likely false post-vaccination symptoms, such as serious injury and death. Nonetheless, TikTok is largely unaccountable for falsities spread by its users due to Section 230 of the Communications Decency Act.
Under Section 230 of the Communications Decency Act, interactive computer service platforms cannot be held responsible for third-party content. This means that, for purposes of liability, TikTok is not legally considered the publisher of content shared via the app. Supporters of Section 230 argue it critically protects the First Amendment's guarantee of free speech by providing meaningful spaces for public statements, while still allowing private platform owners to reasonably restrict commentary by moderating hate speech. They also assert that the Act promotes the very essence of the Internet – accessibility and the transparent exchange of ideas. By this rationale, so long as hate speech is reasonably monitored, platforms may defer to user autonomy in selecting and posting content.
This positive view of Section 230 is not universal, as there is significant bipartisan dissent regarding the Act. Former President Donald Trump sought to narrow Section 230 to impart some responsibility on social media sites for their users’ content. In a perhaps more extreme take, President Joe Biden seeks to repeal Section 230 altogether, stating that the Act “should be revoked because [Facebook] is not merely an internet company. It is propagating falsehoods they know to be false.”
These falsehoods are not unique to Facebook. Since the authorization of life-saving COVID-19 vaccines, TikTok users have used the app to spread conflicting information about the nature of the immunizations. While some claim the vaccines cause miscarriages, heart attacks, and cancers, others declare the outlandish conspiracy that tracking devices are being implanted into Americans through the vaccines' administration. TikTok has since communicated an effort at regulating anti-vaxx statements – albeit one weak in its promises. Rather than committing to taking down every anti-COVID-19-vaxx post, TikTok executive Kevin Morgan merely pledges to caveat these posts with links to Centers for Disease Control and Prevention websites, removing only the most egregious posts. Nevertheless, these videos are toxic clickbait that brings TikTok new users. Whether those users are thirsty for controversy or merely seeking information on the pandemic, TikTok is absolved of the repercussions of the content they publish.
In response to the spread of false information, physicians have taken to TikTok as a means of encouraging COVID-19 vaccination, and it is not a moment too soon. According to a Northeastern University study, adults under 25 are the most likely to believe false claims about COVID-19. Further, Pew Research Center concluded that 36% of Americans ages 18 to 29 received their news from a social media platform. This is particularly troubling given that these groups are also more likely to work frontline jobs, increasing their probability of contracting COVID-19. In order to reach this critical audience, doctors are posting TikToks conveying facts about COVID-19, such as the virus's origins, new strains, vaccine side effects, and the medical justifications for the two-dose vaccine procedure. As a society, we must consider which entities are responsible for ensuring that accurate medical and public health information reaches the public, and who is liable if misinformation prevails.
While it is possible that a successful repeal of Section 230 would discourage the spread of false information on the Internet, it is the responsibility of lawmakers to ensure that any legislation replacing Section 230 of the Communications Decency Act imparts accountability on social media platforms to regulate content that threatens the truth, while still ensuring freedom of speech in the digital age. As it stands, leaving TikTok to regulate anti-vaxx conspiracy theories at its own discretion promotes the posting of dangerous falsities. Our inability to hold interactive computer service providers accountable for the content they host poses a direct threat to public health, indicating it may be time to repeal Section 230.
Student Bio: Kyla Goolsby is a second-year law student at Suffolk University Law School and is pursuing a concentration in Health and Biomedical Law. Kyla is also a staff member for the Journal of High Technology Law.
Disclaimer: The views expressed in this blog are the views of the author alone and do not represent the views of JHTL or Suffolk University Law School.