Facebook’s QAnon Ban: Too Little, Too Late?

By: Maura Arnold

Masks are being used to silence children so we cannot hear their cries for help. A D.C. pizzeria is the headquarters of a child trafficking ring led by the Clintons, Barack Obama, Chrissy Teigen, and others. These are just a few of the conspiracy theories QAnon has peddled to its loyal followers.

The far-right conspiracy group began in 2017 with the core belief that well-known politicians and celebrities were engaged in a sex abuse ring and that a “deep state” conspiracy existed to discredit President Trump. While its core beliefs are far-fetched, QAnon draws in large numbers of supporters by appealing to real issues such as protecting children and promoting health and wellness. More recently, QAnon has gained mainstream attention through endorsements from congressional candidates, including Republican Marjorie Taylor Greene.

While QAnon’s claims are baseless, their circulation has very real consequences. In 2019, the FBI warned that QAnon followers posed a domestic terrorism threat, and many incidents of violence have been traced back to believers in the conspiracy. Recently, the FBI arrested a group of men, many of whom support QAnon theories, for plotting to kidnap Michigan Governor Gretchen Whitmer. The group planned to storm the state capitol building, take hostages, and put the Governor on trial for treason.

Social media has been instrumental in growing QAnon’s base of supporters. While the group’s theories began on a popular message board for right-wing extremists, they have since spread to mainstream platforms such as Facebook and Instagram through more palatable initiatives, such as child-safety campaigns. For years, activists have called on Facebook to crack down on QAnon supporters’ promotion of violence, racism, anti-Semitism, and misinformation.

In August of 2020, Facebook finally took its first step toward curbing the spread of conspiracy theories on its platforms. Facebook reported removing over 1,500 groups and pages related to QAnon, but that did not stop the spread of its theories. Many cite a major loophole in Facebook’s restrictions as the reason: the effort targeted only people, groups, and pages that “represented” QAnon. Followers easily found ways around this, blasting out the hashtag #SavetheChildren to attract new recruits without any obvious or direct ties to QAnon. After all, the movement has its roots in the dark corners of the web.

In October of 2020, Facebook acknowledged that it had not gone far enough in eliminating far-right conspiracies from the site and announced a new policy to remove any person, page, or Instagram account that openly identifies with QAnon. This new ban was likely prompted in part by the arrests in the plot to kidnap the Governor of Michigan. Under the new policy, Facebook says it will be more proactive in finding and removing QAnon content rather than relying on users to report it. Facebook has admitted, however, that it will take “days and weeks” to remove QAnon content from its sites.

Many doubt that removing any person, page, or Instagram account associated with QAnon will be any more effective than targeting accounts, groups, and pages that “represented” QAnon. Social media companies walk a fine line between monitoring and eradicating misinformation on their sites and preserving users’ free speech. Even experts on the issue do not have a clear or comprehensive plan for how social media platforms can curb extremists. One popular idea is “for Facebook to stop its automated recommendation systems from suggesting groups supporting QAnon and other conspiracies.” Some have suggested eliminating conspiracies altogether, while others have suggested limiting them and providing information on why they are wrong and dangerous.

On one hand, some could argue that proactively working to eliminate the spread of these conspiracy theories on mainstream social media is a step in the right direction. Even publicly condemning the spread of misinformation helps keep the unwary from falling into a misinformation trap. On the other hand, some could argue that the cat is already out of the bag. Just as it began, the QAnon following could retreat to the dark corners of the web, where it thrives largely unchecked. In 2020, however, QAnon has a much larger following than when it started in 2017, and these followers may disappear into the dark areas of the web, making them more difficult to track and apprehend. While there is an interest in preserving free speech online, there is also an interest in protecting the public from rhetoric that inspires crime and violence. With QAnon, incitement of crime and violence is not just a hypothetical threat but a very real problem.

Student Bio: Maura Arnold is a second-year law student at Suffolk University Law School. She is a staffer on the Journal of High Technology Law. Prior to law school, Maura received a Bachelor of Arts degree in English Literature and Italian Language from the College of the Holy Cross.

Disclaimer: The views expressed in this blog are the views of the author alone and do not represent the views of JHTL or Suffolk University Law School.
