“Blood On Their Hands”: Can CEOs and Legislation Fix the Damage To Children?

By: Talya Torres

 

On January 31, 2024, five tech CEOs were subpoenaed by the Senate Judiciary Committee to testify before Congress. The hearing was organized to question the CEOs on their companies’ role in the “online child sexual exploitation crisis.” The individuals subpoenaed were Linda Yaccarino of X (formerly known as Twitter), Mark Zuckerberg of Meta (formerly known as Facebook), Shou Zi Chew of TikTok, Evan Spiegel of Snap, and Jason Citron of Discord.

 

The platforms subpoenaed were all accused of facilitating the sexual exploitation of children and willfully ignoring the harm they caused. While all of the CEOs were called to testify, the leaders of Meta and TikTok, Zuckerberg and Chew, received the most questions and time in the hearing. Lawmakers accused the CEOs of having “blood on their hands” for failing to protect children from sexual predators online, with some lawmakers comparing the companies to cigarette makers. One major statistic cited from the National Center for Missing and Exploited Children showed “skyrocketing growth in financial ‘sextortion,’ in which a predator tricks a minor into sending explicit photos and videos.”

 

Many of the CEOs pushed back against these comments, stating that they had invested billions of dollars to strengthen the safety of children on their platforms and that many of them supported bills to protect children and their privacy online. After Senator Josh Hawley of Missouri argued with Zuckerberg on a number of issues related to the abuse and sexual exploitation of children facilitated through Meta’s sites, Zuckerberg stood and faced the parents of victims, stating that he was “sorry for everything you have all been through,” and that “[n]o one should go through the things that your families have suffered.”

 

The CEOs were not the only ones up for critique, as a portion of the Communications Decency Act was also criticized. Section 230 of the Communications Decency Act gives service providers “broad immunity” from lawsuits arising from users’ posts, stating that no “provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Senator Dick Durbin, the chairman of the Senate Judiciary Committee, argued that Congress has to “look in the mirror” and take the blame for the law’s failure to protect kids online, stating that it now allows “the most profitable industry in the history of capitalism” to operate “without fear of liability for unsafe practices.”

 

After the four-hour hearing ended, no clear resolution to the issues surfaced. One possible resolution brought up during the hearing was the Senate’s proposed bill to expand protection for children, the “Kids Online Safety Act” or KOSA. KOSA would require tech companies to “exercise reasonable care” to avoid “causing or exacerbating problems such as depression, bullying, and harassment” when creating new features, limit who is allowed to talk to “youths” on their accounts, and limit features that keep young users online such as “infinite scrolling.” It also would require apps to have parental controls, would make it easier for “youths” to delete their accounts, and would allow parents to report harm to the companies. Specifically for users younger than 17, services must take “reasonable measures” to “protect” against and “mitigate” various “harms” such as “anxiety, depression, suicide, eating disorders, substance abuse, ‘addiction-like behaviors,’ physical violence, online bullying, harassment, sexual exploitation, and abuse, ‘financial harms,’ and promotion of ‘narcotic drugs,’ tobacco products, alcohol, or gambling.” The vague list of harms is not supplemented with any specific actions that should be taken.

 

There is also the EARN IT Act, which would “roll back Section 230 protections when platforms facilitate content that violates civil and state criminal laws on child sexual exploitation,” and the Stop Child Sexual Abuse Material (CSAM) Act, which works to “create a new cause of action for victims and their families to sue over such material.”

 

The outrage and move toward legislation on all of these issues is understandable: the families of dozens of children who died by suicide because of social media sat in the background throughout the footage of the hearing, holding up large photographs of their children’s faces. Social media is a problem for kids, who are presented with endless scrolling and images of seemingly perfect people 24/7. Yet even as the outrage remains justified, many question whether there is any way to truly protect children from social media. Even if age restrictions are put in place, kids can easily enter a different date of birth to get onto sites and avoid restrictions.

 

Further, when it comes to parental controls, these would only protect children whose parents know how to use the tools. While KOSA has gained support from major backers such as President Biden and the CEOs of X, Microsoft, and Snap, others have raised concerns about its possible chilling effect on constitutionally protected speech. The language of the proposed acts is vague, and the similarly vague duty of care owed to children could end up harming protected speech.

 

 

Student Bio: Talya Torres is a second-year law student at Suffolk University Law School. She is a staff writer on the Journal of High Technology Law and is a member of the Executive Board of the Child and Family Law Association (CAFLA). Talya received a Bachelor of Arts in Journalism from the University of Massachusetts, Amherst.

Disclaimer: The views expressed in this blog are the views of the author alone and do not represent the views of JHTL or Suffolk University Law School.
