From Brussels to Silicon Valley: Decoding the EU Digital Services Act and Its American Potential

By: Sarah McLaughlin

In August 2023, the European Union's Digital Services Act (DSA), adopted the previous year, began applying to the largest online platforms. The DSA takes a two-pronged approach: it aims both to promote a safer digital space that protects the fundamental rights of users and to establish a level playing field that fosters online innovation within the EU. Its objectives encompass enhancing accountability, transparency, and user protection within the digital sphere. Key provisions of the DSA address the liability of online platforms, establish content moderation obligations, impose transparency requirements, and outline enforcement mechanisms. These measures aim to strike a delicate balance between fostering innovation and safeguarding user rights. Under the DSA, any platform with more than 45 million monthly active users in the EU is subject to the law's strictest tier of obligations, meaning sites such as Google, Amazon, YouTube, and all major social networking sites face restrictions on advertising based on sensitive demographic data and on ads targeting children, and must submit to outside auditing and data sharing with authorities.

How each platform will meet the new restrictions is largely left to interpretation, shaped by each platform's purpose and capacity. For example, TikTok is making its "For You" recommendation algorithm optional for European users, meaning those who toggle it off will see general local and global content rather than content targeted specifically at them. Google is expanding its Ads Transparency Center to give users a searchable repository for learning more about the ads a given advertiser has served. Each EU member state is responsible for appointing a Digital Services Coordinator to monitor compliance with the DSA, with the power to open an investigation into any platform suspected of infringement, which can include information requests, interviews, and site inspections. Should a platform be found guilty of infringement, regulators can impose a fine of up to 6% of the platform's global annual turnover and, if the problem persists, a suspension from operating within the member state in question.

In contrast, the regulation of online platforms in the United States takes a patchwork approach, as there is no single law that governs internet privacy. For content regulation most analogous to that covered by the Digital Services Act, the United States legal system relies primarily on the First Amendment's protection of freedom of expression and on Section 230 of the Communications Decency Act. The First Amendment guarantees the right to express oneself freely online without government intervention. Section 230 upholds that principle while going further: it shields platforms, irrespective of their size, from liability for user-generated content and permits them to moderate user speech and content in good faith. Though Section 230, among other regulatory frameworks affecting internet usage, aims to protect users' rights against other users, there is little government oversight in the United States to protect users from the platforms themselves.

In the EU, the DSA influences not only tech giants but also smaller platforms, shaping content moderation practices, data protection standards, and market competitiveness. By safeguarding user rights and tackling harmful content, the DSA aims to foster a safer and more transparent digital environment. In the US, by contrast, regulatory fragmentation poses challenges for compliance and enforcement, while debates over market dominance (most prominently, the far reach of Amazon) arguably underscore the need for a more cohesive regulatory framework. Both the DSA and Section 230 of the Communications Decency Act emphasize the importance of accountability, transparency, and user protection. The two regimes diverge, however, in their approaches to liability, their enforcement mechanisms, and the cultural factors shaping their regulatory frameworks. While the DSA places greater emphasis on platform accountability, imposing obligations on the platforms themselves and on appointed oversight bodies, US regulations prioritize intermediary immunity and direct liability toward individual users instead.

Looking ahead, there are opportunities for cross-border collaboration and harmonization, yet challenges persist in reconciling regulatory disparities. Should the United States adopt similar legislation, it will be interesting to watch how government investigation and reporting of the kind mandated by the DSA's oversight bodies would interact with the freedom of expression guaranteed by the Constitution, especially since many of these major platforms are headquartered in the United States. The United States has historically been "hands off" in regulating citizen activity, an attitude that has become informally enshrined in much political discourse. Public approval aside, further restricting content could have a major financial impact on these companies should the principles of the EU DSA be applied to American users. Some platforms, such as X (formerly Twitter), have reportedly considered withdrawing altogether from countries covered by the DSA rather than expend resources to ensure compliance, opening the possibility of major financial loss.

As digital landscapes continue to evolve, traditionally hands-off governments like the United States may need to consider legislative frameworks similar to the DSA to better monitor the growing online world. If the DSA can be implemented successfully in the EU without major harm to the platforms it covers, that success would make a strong argument for adopting similar regulations in the US without offending First Amendment freedom of speech rights. Amending Section 230 to contemplate platform liability, rather than user liability alone, is a logical next step in protecting American internet users from data-sharing and privacy concerns that were not conceivable a generation prior.

Student Bio: Sarah McLaughlin is a second-year law student at Suffolk University Law School. She is a staff writer on the Journal of High Technology Law and is the current President of the Environmental Law Society. Sarah received a Bachelor of Arts degree in Political Science and Justice Studies from the University of New Hampshire.

Disclaimer: The views expressed in this blog are the views of the author alone and do not represent the views of JHTL or Suffolk University Law School.  
