By: Melanie Rosen

In 2017, companies began pulling their advertisements from YouTube after learning that their ads were generating revenue on videos that child predators were using for their own gratification. YouTube responded by shutting down hundreds of accounts that had made predatory comments. However, YouTube assumed that, with the massive amount of content constantly coming in, the companies would quickly forgive it and come back. YouTube was correct: within a few months, the brands that had pulled their advertisements were back on the platform.

YouTube does have community guidelines covering policy and safety, which prohibit nudity or sexual content, harmful or dangerous content, hateful content, violent or graphic content, harassment and cyberbullying, and more. When these guidelines are violated, YouTube takes action ranging from suspending a creator’s privileges to terminating the account. YouTube also allows members of the community to help police the platform by reporting videos or users. With all of these safeguards, one might assume that YouTube has a handle on maintaining the safety of its platform, but that assumption would be incorrect.

It is now 2019, and YouTube is facing the same problem, but this time people are not being as forgiving. On February 17, 2019, Matt Watson posted a video on YouTube exposing how YouTube facilitates the sexual exploitation of children and how that exploitation is being monetized. Watson’s video has gone viral, receiving over three million views and counting, and inspiring the hashtag #YouTubeWakeUp.

Here is how child predators are sexually exploiting children on YouTube and getting away with it. YouTube was established in 2005 and purchased by Google in 2006. Google built a recommendation algorithm that analyzes the type of video content a user searches for and then recommends more videos similar to that content. The concept was designed to keep users on the website for hours and is more commonly known as going down the wormhole. When child predators search for videos of children, the algorithm picks up this data and fills the entire sidebar with videos of little girls. The predators upload these videos of little girls onto their own personal accounts to share with other pedophiles. Once the videos are uploaded, the pedophiles use the comments section to communicate with one another: they leave timestamps marking where they feel the most sexually exciting parts of the videos can be found, they post their own personal contact information, and some leave direct links to child pornography. Even worse, once the wormhole is complete and a predator returns to his account, the recommended section on his home page will consist solely of videos of children.
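The feedback loop described above can be illustrated with a toy model. This is only an illustrative sketch: YouTube's actual recommender is proprietary and vastly more complex, and all video titles and tags below are hypothetical. The point is simply that recommending whatever most resembles a user's watch history rapidly narrows the suggestions toward a single theme.

```python
# Illustrative sketch only: YouTube's real recommendation system is
# proprietary and far more sophisticated. This toy model shows the
# "wormhole" dynamic the post describes: ranking unwatched videos by
# similarity to the watch history pushes every suggestion toward one theme.

def recommend(watch_history, catalog, k=3):
    """Rank unwatched videos by how many tags they share with the history."""
    watched_tags = set()
    for video in watch_history:
        watched_tags |= catalog[video]
    scores = {
        title: len(tags & watched_tags)
        for title, tags in catalog.items()
        if title not in watch_history
    }
    # Keep only videos with some overlap; highest overlap first,
    # ties broken alphabetically so the ordering is deterministic.
    ranked = sorted(
        (t for t in scores if scores[t] > 0),
        key=lambda t: (-scores[t], t),
    )
    return ranked[:k]

# Hypothetical catalog of videos and their topic tags.
catalog = {
    "cooking_basics": {"cooking", "howto"},
    "knife_skills":   {"cooking", "kitchen"},
    "soup_recipes":   {"cooking", "recipes"},
    "travel_vlog":    {"travel", "vlog"},
}

# After watching a single cooking video, every recommendation is more cooking.
print(recommend(["cooking_basics"], catalog))
```

The same mechanism that surfaces more cooking videos for a cook surfaces more videos of children for a predator: the algorithm has no notion of intent, only of similarity.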

Before Watson exposed what he calls the “soft-core pedophilia ring,” companies were making a profit off of these videos through advertisements. When a user clicks on one of these videos, advertisements may pop up alongside it or play before it. The videos make money because companies pay to have their products promoted or mentioned, and the companies in turn make money through the exposure their advertisements receive. Since Watson’s video was released, many major brands, such as Disney, AT&T, Nestle, and Fortnite, have pulled their advertisements from YouTube.

YouTube has responded to this situation by terminating over 400 channels, disabling comments on tens of millions of videos, and reporting illegal comments to law enforcement. However, YouTube has 300 hours of video uploaded every minute, almost 5 billion videos watched every day, and almost 30 million visitors per day. With this much content coming in, YouTube is struggling to find a solution to this pedophile problem. Though the algorithm described above is also supposed to detect inappropriate comments, Google’s solution has only been to disable the comments on those videos, not to remove the videos themselves from YouTube.
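Part of why comment moderation at this scale is so hard can be seen in a deliberately naive sketch. Nothing here reflects Google's actual detection system, which is not public; this hypothetical filter simply flags any comment containing a video timestamp, the signal the predatory comments in this story relied on.

```python
import re

# Hypothetical illustration, not Google's actual system: a naive filter
# that flags any comment containing a video timestamp (e.g. "3:15").
# Timestamps are also a legitimate, common feature of ordinary comments,
# so a rule like this cannot separate predatory intent from normal use --
# one reason a platform may disable comments wholesale rather than
# filter them precisely.

TIMESTAMP = re.compile(r"\b\d{1,2}:\d{2}\b")

def flag_comment(comment):
    """Return True if the comment contains something shaped like a timestamp."""
    return bool(TIMESTAMP.search(comment))

print(flag_comment("great tip at 2:41"))  # flagged, but perfectly innocent
print(flag_comment("nice video!"))        # not flagged
```

The filter catches the signal but cannot judge intent, which is why blunt measures like disabling comments entirely end up being the practical response.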

As of right now, YouTube has announced that it will be suspending commenting on any videos featuring minors that could attract predatory behavior. In the long run, however, this will most likely not affect YouTube’s revenue, and the companies will most likely come back. It is vital for Google to find a real solution to this problem and establish a safe environment for everyone.

Student Bio: Melanie Rosen is a second-year student at Suffolk University Law School and is a staff member of the Journal of High Technology Law. She attended the University of New Hampshire, where she received a Bachelor of Arts in Sociology and Justice Studies, as well as a Master of Arts in Justice Studies.

Disclaimer: The views expressed in this blog are the views of the author alone and do not represent the views of JHTL or Suffolk University Law School.

