By Thomas Wood
At its height, nearly 6 million people every week listened to Alex Jones’ rants on his online program, Infowars, about the rising threat of “the globalists,” “Satanists,” and “animal-human hybrids.” Among those millions was the 45th President of the United States, Donald Trump, who appeared on Infowars during the 2016 presidential campaign and told Jones on-air that his reputation was “amazing.” The most infamous aspect of Jones’ Infowars content has been his false claims, repeated since 2012, that the parents of the Sandy Hook mass shooting victims were assisting a government conspiracy designed to justify the enactment of gun control laws throughout the United States. Jones has called the massacre “synthetic,” “completely fake,” and staged “with actors.” While the vast majority of people would recognize these statements as false, Jones successfully used social media sites, including Facebook, Twitter, and YouTube, to recruit new far-right followers to his cause.
Despite Infowars’ success on social media, Alex Jones has faced a cascade of catastrophic events in the past few months, including the mass blocking of his Infowars channels on most social media sites and an onslaught of lawsuits brought by the families of mass shooting victims about whom Jones has lied. On April 16, 2018, the Pozner family, whose child was killed at Sandy Hook, sued Jones in his home state of Texas for defamation, defamation per se, and conspiracy. Last month, six other Sandy Hook families sued Jones, along with Infowars correspondent Owen Shroyer (who also repeatedly claimed on air that Sandy Hook was a false flag operation) and Infowars, LLC, for defamation and defamation per se.
On August 5-6, 2018, nearly six years after Jones first claimed the Sandy Hook shooting was a hoax, Apple, followed by Spotify, Google, YouTube, and Facebook, began taking down Infowars channels from their platforms – not for defamation – but for alleged “hate speech” by Jones. Twitter, the one exception in the group, initially refused to shut down Jones’ content and instead placed his account in a one-week “timeout.” Jack Dorsey, Twitter’s CEO, later reversed that decision and permanently banned Jones’ accounts on September 6, 2018. Arguably the most devastating event for Jones came on August 30, 2018, when Judge Scott Jenkins of the 53rd District Court of Texas rejected Jones’ argument that the Sandy Hook lawsuit filed by the Pozner family failed to meet the elements of defamation, thereby allowing the lawsuit to move forward.
Some commentators have raised the question of whether the social media companies that allowed Jones on their platforms could also be held liable for knowing Jones was spreading defamation and doing nothing to prevent it.[i] This post examines whether individuals defamed by users on social media can bring successful tort claims against social media companies that know the content is defamatory yet fail to remove it within a reasonable period of time.
Applying Traditional Defamation Law to Social Media Platforms
If any of the Sandy Hook parents decided to bring a tort claim against social media companies, they would likely bring a claim for defamation. Generally, the elements of defamation are “(1) a publication that is (2) false, (3) defamatory, and (4) has a natural tendency to injure or that causes special damage.”[ii] Restatement (Second) of Torts §577(1) defines the first element, publication, as “communication intentionally or by a negligent act to one other than the person defamed.” Section 577(2) expands on this initial definition, adding that “[o]ne who intentionally and unreasonably fails to remove defamatory matter that he knows to be exhibited on land or chattels in his possession or under his control is subject to liability for its continued publication.”[iii] Since the English common law, courts have imposed an affirmative duty on parties who knowingly host defamation on their premises to remove it within a reasonable time or face liability for republication.[iv] In Hellar v. Bianco, the Court of Appeal of California held that it is for the jury to determine whether, after learning of the defamatory statement’s existence, the property owner “negligently allowed the defamatory matter to remain for so long a time as to be chargeable with its republication.”[v]
Applying these principles to the Sandy Hook litigation, the families would likely make several arguments to hold companies like Facebook or Twitter liable for defamation. First, social media companies knew Alex Jones’ comments about Sandy Hook were false. Mark Zuckerberg of Facebook and Jack Dorsey of Twitter have offered misguided defenses for refusing to ban Alex Jones: Zuckerberg has suggested that users who spread falsehoods do not do so intentionally, while Dorsey has claimed that social media companies should push the responsibility of evaluating objective truth onto journalists.[vi] However, neither Zuckerberg nor Dorsey has ever tried to argue that Jones’ comments about Sandy Hook were factually accurate.
Second, social media companies, like other property owners, have an affirmative duty to remove posts that they know are defaming people, because this defamation exists on their platform (or “property”).
Third, social media companies breached that affirmative duty by refusing to remove the false posts from their platforms for almost six years, an unreasonably long period of time to leave defamation standing. Finally, because of Facebook, Twitter, and YouTube’s breach, the Sandy Hook parents suffered emotional distress from online and in-person threats, and incurred monetary damages from having to relocate frequently as a result of threats from Infowars and Alex Jones supporters.
As the 9th Circuit Court of Appeals concluded in Barnes v. Yahoo!, Inc., a social media company’s failure to remove defamation from its platform could satisfy the traditional elements of defamation.[vii] However, under current federal law, social media companies are statutorily immune from defamation claims when the false material is created by a website’s user rather than the website itself.
Section 230 of the Communications Decency Act
Section 230 of the Communications Decency Act of 1996, also known as the Cox-Wyden Amendment, provides immunity from defamation claims to interactive computer services when the defamation is created by an information content provider.[viii] An interactive computer service is defined as “any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server…” Social media companies qualify as interactive computer services under the Communications Decency Act because they provide information to multiple users by giving them access to a computer server.[ix] An information content provider, by contrast, is defined as “a person or entity that is responsible… for the creation or development of information provided through the Internet…” Thus, a social media website can be held liable for defamation only if the website itself is responsible for the development or creation of the defamatory material.[x]
The issue of social media liability for a user’s defamation was most recently decided by the California Supreme Court in Hassell v. Bird. In Hassell, a law firm was defamed by a user on Yelp.com. The firm obtained a judgment against the user, as well as an injunction ordering the user and Yelp to remove the offending posts. Yelp refused to remove the posts, arguing the injunction against Yelp was unlawful under the Communications Decency Act.[xi] The California Supreme Court held that because Yelp was an interactive computer service within the meaning of the Cox-Wyden Amendment, and because the firm’s claim involved a user’s defamation, Yelp had no obligation to comply with the trial court’s removal order.[xii]
Finally, it should be noted that one of the original authors of the immunity provision of the Communications Decency Act, Senator Ron Wyden (D-OR), has publicly stated that granting immunity to social media companies that knowingly fail to remove conspiracy theories was not what Section 230 was intended to accomplish. Specifically, Senator Wyden chastised social media CEOs who, in his view, failed to recognize “that an individual endorsing (or denying) the extermination of millions of people, or attacking the victims of horrific crimes or the parents of murdered children, is far more indecent than an individual posting pornography.”[xiii]
Congress can, and perhaps should, amend the language of Section 230 to allow civil claims of negligence or defamation against social media companies, provided the company has knowledge that one of its content providers is spreading defamatory matter. At a bare minimum, parties who successfully prove in court that a user defamed them should be able to obtain an injunction ordering the social media platform to remove the defamatory matter at issue. While social media companies certainly should not be liable for every instance in which an individual posts defamatory material on their websites, a company that knowingly hosts such content is hard to distinguish from a republisher of defamation when the user has the ear of millions of Americans. Allowing defamation to flourish on social media without legal consequences erodes the commitment to objective truth considered so vital that American courts allowed persons to bring defamation lawsuits in the first place.
[i] Karen Tumulty, Why is Twitter Dragging their Feet on Alex Jones? | Morning Joe, Youtube.com (Aug. 8, 2018) https://www.youtube.com/watch?v=Z3jqwWakbPk&feature=youtu.be (“Alex Jones has definitely crossed [the] line and… the issue for Twitter is going to be… potentially this may be something that settles in court and [costs] them a lot of money.”).
[ii] See Taus v. Loftus, 40 Cal. 4th 683, 720 (2007); see also Rest. 2d Torts § 558 (listing elements of defamation as “(a) a false and defamatory statement concerning another; (b) an unprivileged publication to a third party; (c) fault amounting at least to negligence on the part of the publisher; and (d) either actionability of the statement irrespective of special harm or the existence of special harm caused by the publication”).
[iii] Several courts have applied the definition of publication described in Restatement (Second) of Torts §577(2). See Tacket v. GMC, 836 F.2d 1042, 1046 (7th Cir. 1987); Roberts v. McAfee, Inc., 660 F.3d 1156, 1168 (9th Cir. 2011); Nasuti v. Kimball, 2010 U.S. Dist. LEXIS 65764 (D. Mass. 2010); Boston v. Athearn, 329 Ga. App. 890 (2014).
[iv] See Byrne v. Deane, [1937] 1 K.B. 818 (holding club owners liable for knowingly allowing a defamatory statement to be posted on their walls in a position in which it could be read by anyone who entered the club, under the theory that property owners may be liable for republication of another writer’s libel); see also Hellar v. Bianco, 111 Cal. App. 2d 424, 426 (1952) (declaring that “[p]ersons who invite the public to their premises owe a duty to others not to knowingly permit the walls to be occupied with defamatory matter”).
[v] See Hellar, 111 Cal. App. 2d at 426.
[vi] See Kara Swisher, Zuckerberg: The Recode Interview, Recode.net (July 18, 2018), https://perma.cc/HG58-TR8S (stating of holocaust denial, “I’m Jewish, and there’s a set of people who deny that the Holocaust happened. I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong…”). See also Jack Dorsey, Twitter.com, (Aug. 7, 2018), https://perma.cc/S68J-8N2F (arguing Twitter should not be responsible for blocking Jones’ false statements, but that instead “it’s critical journalists… refute such information… so people can form their own opinions.”).
[vii] See Barnes v. Yahoo!, Inc., 565 F.3d 560 (9th Cir. 2009) (holding Yahoo could have committed defamation when the website refused to remove fictitious Yahoo profiles created by plaintiff’s ex-boyfriend, but that Yahoo was immune from such a claim under federal law).
[viii] See 47 U.S.C. § 230(c)(1) (1996) (“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”).
[ix] See Klayman v. Zuckerberg, 753 F.3d 1354 (D.C. Cir. 2014).
[x] See Jones v. Dirty World Entm’t Recordings LLC, 755 F.3d 398, 409 (6th Cir. 2014).
[xi] See Hassell v. Bird, 420 P.3d 776 (Cal. 2018).
[xii] See id. at 788.
Student Bio: Thomas Wood is a second-year student at Suffolk University Law School and staff member of the Journal of High Technology Law (JHTL). He holds a Bachelor of Science in Criminal Justice from the University of Massachusetts Lowell, with a minor in Political Science.
Disclaimer: The views expressed in this blog are the views of the author alone and do not represent the views of JHTL or Suffolk University Law School.