TikTok might stop ticking: Ofcom’s new regulation seeks to protect users from harmful online content while aiming to pin down video-sharing platforms.

By: Tianyue Liao

Video-sharing platforms (“VSPs”) could now face fines of up to £250,000 or 5% of turnover, and Ofcom now has the power to force non-compliant companies to shut down their UK operations permanently.  The VSPs currently under regulation are those whose European operations are primarily based in the UK, which includes TikTok, Twitch, Snapchat, and OnlyFans.  As the UK’s media and communications regulator, Ofcom is the first regulator in Europe to establish guidance governing these platforms.  So what does the new Ofcom guidance actually require, and how essential and rigorous is it?

Given the internet’s free and open nature, various VSPs use it as a vehicle for delivering innovative and entertaining content, with little specific regulatory oversight in place.  However, these benefits have come at a price.  New research conducted for Ofcom indicates that 79% of adult internet users have concerns about going online.  Specifically, among UK adult internet users, around seven in ten (69%) report concerns about harmful content or conduct, and around a quarter (26%) say they have personally experienced some form of harm.  VSPs not only encourage addictive behavior but also pose risks to young users.  To illustrate, research conducted for the National Society for the Prevention of Cruelty to Children (“NSPCC”) found that 30% of users under 18 reported recent exposure to violent content or behavior online, while around 20% reported exposure to sexual content and bullying.  A quarter reported having been contacted over social media by adults they did not know, and a third of those contacted were users under 13.  Ofcom therefore plans to take the initiative in addressing growing concerns about the protection of young users online.  Dame Melanie Dawes, Ofcom chief executive, said, “Online videos play a huge role in our lives now, particularly for children.  But many people see hateful, violent, or inappropriate material while using them.  The platforms where these videos are shared now have a legal duty to take steps to protect their users.”

By and large, the Ofcom rules require VSP providers to take measures to protect users from both restricted material and relevant harmful material.  Restricted material refers to: i) videos that have an R18 certificate, a legally restricted classification primarily for explicit videos of consenting sex or strong fetish material involving adults; ii) videos containing material not suitable for British Board of Film Classification (“BBFC”) classification, including but not limited to material that breaches criminal law, appears to pose a risk of harm to individuals or society, depicts illegal drug use, or portrays non-consensual sexual or violent behavior; and iii) material that might impair the physical, mental, or moral development of children under 18.  To gain a better understanding of this last category, Ofcom commissioned research into the risks and harms to children online when using VSPs.  In particular, the following harms are relevant in determining what measures to take to protect children from material that might impair their physical, mental, or moral development: sexual material, including pornography, sexting, grooming, and meeting strangers online; self-injurious content causing physical harm, such as the promotion of eating disorders or suicide; harms to mental health and wellbeing, such as depression, anxiety, or social withdrawal; and aggression, such as hate speech or cyberbullying.  Relevant harmful material, by contrast, is material likely to incite violence or hatred against a group of people on grounds such as sex, race, religion, political affiliation, or disability, as well as material whose dissemination would constitute a criminal offense under laws relating to terrorism, child sexual abuse material, and xenophobia.

On the one hand, some analysts have said the Ofcom guidelines are not “meaningful” and that Ofcom’s regulatory framework is hardly pioneering.  The most obvious argument is that various VSPs are acutely aware of the potentially catastrophic consequences of illegal content, and plenty of them are already taking action to moderate their content.  For example, almost all VSPs have terms and conditions designed to prevent harmful content from being uploaded, and each also has flagging or reporting systems in place.  In other words, the new guidance seems simply to ask VSPs to do what they have already achieved.  On the other hand, the 69-page guidance is extremely detailed and comprehensive, from elucidating harmful materials to determining the appropriate protective measures for each type of VSP.  The key is to look at the systems VSPs have put in place to prevent the spread of harmful content and ask whether enough has been done to effectively protect users.  Even though VSPs are making some modifications to protect users from content related to child sexual abuse, racism, and other illegal and inappropriate material, these measures evidently fall short of Ofcom’s standards.

For instance, it is undeniable that users under 18 are subject to the strictest access control measures.  On the age verification standard, Ofcom indicates that self-declaration tick boxes or disclaimers are not sufficient, as they do not ensure that the user is actually over 18.  Furthermore, a user’s personal information, such as name, address, and date of birth, is not a valid form of age verification, as it is often unauthenticated.  Ofcom instead expects age verification to rely on more robust sources of data.  Specifically, these include biometric analysis (analysis of facial features, fingerprints, and retinal patterns); behavioral analysis (analysis of the time and location of web use); linguistic analysis (analysis of written language structure to estimate age); profiling (analysis of a user’s past online activity or browsing history); third-party attribution, drawing on data held by third-party organizations such as credit card companies; and parental control software and mechanisms.

Unequivocally, Ofcom has issued rigorous and extensively detailed guidance on how these measures should operate in order to eradicate harmful material and protect young users.  In a nutshell, depending on the size and nature of their service, VSPs are required to implement protective measures that meet these standards, covering: terms and conditions; reporting and flagging mechanisms; systems for viewers to rate harmful material; tagging of restricted material; age assurance systems; parental control systems; complaints processes; dispute resolution procedures; and media literacy tools and information.

To further ensure that providers have implemented appropriate measures to mitigate the risk of harmful material, Ofcom plans to stay continually engaged throughout the regime and to conduct supervisory activities in its early stages.  To that end, Ofcom has set out five principles to support implementation, intended as practical suggestions for how compliance could be achieved: measures should be effective, uncomplicated, transparent, fair, and evolving.  Effective measures, such as terms and conditions, should be enforced in a way that achieves the objective of protecting users.  Measures should be easy for all users to understand and engage with, taking into account vulnerabilities such as physical or mental health problems and personal circumstances such as culture, age, or literacy skills.  The intended and actual outcomes of any measures a platform takes should be clear and transparent to users and other interested stakeholders.  Measures should be designed and implemented in a way that does not unduly discriminate between users, introduce bias, or result in inconsistent application.  Finally, measures should be regularly reviewed and updated to keep pace with technological advancements and changing user behavior so that they continue to serve their intended purpose.  Ofcom will also accept complaints from VSP users via an online webform; where it sees high volumes of complaints and a lack of engagement from a provider, it will investigate the issue to determine what sanction should be imposed.

Ofcom must crack down on harmful materials that are still easily accessible on these platforms before it is too late.  Ofcom must also work internationally with other regulators and conduct further research in these areas so that it can carry out stringent enforcement if a VSP provider breaches its obligations.  For now, we can only wait and see what actions VSPs and Ofcom end up taking, which may well involve tackling a much wider range of online harms in the near future.  All in all, these issues require long-term collaboration to figure out the most practicable yet ethical way to monitor the ever-evolving video-sharing industry.

Student Bio:  Tianyue Liao is a second-year law student at Suffolk University Law School. She is a staffer on the Journal of High Technology Law and a member of the Suffolk National Trial Team. Prior to law school, Tianyue received a Bachelor of Arts in International Relations and French from Mount Holyoke College.

Disclaimer: The views expressed in this blog are the views of the author alone and do not represent the views of JHTL or Suffolk University Law School.
