Cracking Down: The U.K. Online Safety Bill Takes Aim at Big Tech

By: Claire Remillard

The age of self-regulation for Big Tech social media appears to be officially coming to an end.  On March 17, 2022, the Online Safety Bill, which would impose new regulatory requirements on tech companies, was introduced in the United Kingdom (“U.K.”) Parliament.  The proposed legislation was conceived in response to tech companies’ failure to control harmful content on social media sites and search engines.  The U.K. internet regulator Ofcom would be responsible for issuing information notices to tech companies to determine whether each site is performing its online safety functions.  The Bill also ensures Ofcom can properly monitor each company through watchdog oversight and interviews.  This provision addresses concerns that current oversight is ineffective because the government lacks the authority to enforce a company’s own promises in its Terms of Service and Community Guidelines.  The legislation will also crack down on user fraud, fraudulent advertising, and the ability of anonymous users to harass others.  Cyber-flashing will also be criminalized under the new Bill.

The Bill places a duty of care on companies, requiring them to remove illegal content as well as “legal but harmful” content.  The legislation goes further by imposing criminal liability on executives of Big Tech social media companies like Twitter, TikTok, Meta, and YouTube if they fail to comply with Ofcom’s information requests.  Executives could be held personally liable and even face jail time as soon as two months after the Bill takes effect.  In addition to criminal liability, fines would also be imposed on companies in the event of a violation.  Fines could reach 10% of a company’s annual global turnover, which could be upwards of $10 billion for a company like Meta based on its 2021 revenue reports.  The U.K. government website lists the ultimate penalty for noncompliance as a complete block of the social media site.

Concern has escalated in the last few years about the negative impacts of social media on mental health, especially for teenagers and children.  Harmful content such as pornography, displays of traumatic violence, and depictions of other illegal conduct is detrimental to children and adults alike.  Big Tech companies have claimed they do the best they can to control harmful content on social media platforms and search engines.  However, the U.K. government website includes examples of harmful content that has remained online unchecked, such as terrorism materials and content encouraging self-harm and eating disorders.  The encouragement of self-harm and eating disorders among teenagers has run rampant internationally, contributing to increased rates of depression and anxiety and even elevated risks of suicide.

Some have already raised concerns about the efficacy of the legislation because of speculation about whether the Bill will be enforceable.  New provisions have been added to the Bill that were not part of the original proposal when the legislation was first conceived over a year ago.  Another issue is what the definition of “legal but harmful” will encompass, and whether the category could sweep in too little or too much online content.  The U.K. government recently defined “harm” as a “reasonably foreseeable risk of a significant adverse physical or psychological impact on individuals.”  Concern remains following this clarification because the definition is arguably still too vague, leaving companies with a difficult task in interpreting what could fall under the category.  Too little regulation has led to the current state of chaos online, but too much regulation could push companies toward fully censoring online content.

However, the U.K. government website maintains that the Bill will actually enhance tech users’ ability to speak freely, with a provision that would allow users to appeal a social media site’s decision to remove a post.  The goal of this provision is to allow people to express themselves freely and to encourage the protection of journalism and political debate on online platforms.  These provisions will require companies to implement the new requirements proactively and will likely necessitate major changes to existing algorithms.  This portion of the legislation is key to the criminal liability of executives because it will require them to disclose how their algorithms work and what measures they take to shield users from harmful content.

This legislation joins other attempts to rein in the unregulated “Wild West” of internet technology.  For example, the European Union (“E.U.”) recently agreed to pass the Digital Markets Act, which is intended to crack down on what is seen as monopolistic behavior by Big Tech companies like Google and Apple.  Antitrust investigations are becoming more and more prevalent for tech giants, including a recent joint investigation by the U.K. and the E.U. into an advertising deal between Google and Meta.  The United States also appears to be attentive to these potential antitrust issues, with a lawsuit already pending in Texas alleging similar antitrust violations.

It is honestly impressive that Big Tech has gotten away with “marking their own homework” for this long.  U.K. Digital, Culture, Media and Sport Secretary Nadine Dorries insists this approach is as justified as requiring seat belts to be worn in cars.  The Secretary went on to say, “[g]iven all the risks online, it’s only sensible we ensure [] basic protections for the digital age.”  All things must come to an end, and many would agree that eradicating the evils targeted by the Online Safety Bill would be a welcome change.  If enforcement is successful, perhaps other Western countries will follow the U.K.’s lead in adding safeguards for online content.

Student Bio: Claire Remillard is a second-year law student at Suffolk University. Claire is a Staffer on the Journal of High Technology Law, the Vice President of the Health and Biomedical Law Society, and the Assistant Coordinator of Mentorship for the Women’s Law Association. Claire received a bachelor’s degree in Cellular and Molecular Biology from West Chester University.

Disclaimer: The views expressed in this blog are the views of the author alone and do not represent the views of JHTL or Suffolk University Law School.

Print Friendly, PDF & Email