Conflicts in Writing a New Future with Artificial Intelligence

By: Allison Nickerson

 

Artificial Intelligence (AI) is an unavoidable part of modern life; every time someone asks Siri a question or enters a Google search, they are using AI technology. While often extremely helpful for common tasks, there is increasing concern about just how far AI can go. AI was intended to complete specific tasks limited to the information programmed into the system. Over time, however, improvements in technology expanded the capabilities of AI, allowing these programs to generate new material once given prompts by the user. The increasing use and capabilities worry some artists and industry experts over issues such as privacy and the ability of AI to take human jobs. For many artists, the worry is not only the ability of AI to create, but the fact that their work is being taken without their permission to train the software. This training consists of downloading work from human authors into the AI system so that it learns the style of the work. Training AI with copyrighted materials allows the program to create new works that mimic different types of visual art, music, and writing styles. Currently, there is little legal protection for these artists because there is no comprehensive federal law restricting AI, only laws addressing aspects such as privacy and security.

 

Because AI is only partially regulated, there are holes in the protections for artists and their work. The issue of AI regulation will only worsen as its use expands. The influx of lawsuits regarding the unlicensed use of artists' work and the violation of their exclusive rights is allowing the courts to begin to address this issue. The major question still left to be answered is whether programs like ChatGPT can consume copyrighted material. The decision hinges on whether the court believes the material generated is a derivative work. If the court rules this use is derivative, AI companies likely would not be permitted to train on artists' work, because the right to prepare derivative works belongs to the copyright holder. However, if the use is not derivative, it would likely fall under fair use and be permissible, leaving artists' work unprotected. Many artists are fed up with AI using their work without compensation, and several are taking action. One such artist is writer and comedian Sarah Silverman; she became legally involved upon discovering that her book was being used to train AI programs without her permission. She claimed that her copyrighted materials were taken and used to train ChatGPT in violation of copyright protection. Her work was obtained from "shadow library" sites that AI programmers have used in the past to take materials for program training. Silverman believes her copyright was infringed; however, many AI companies claim their use of copyrighted material falls within fair use. They argue that the product created is different from the original work, and that their only purpose for using copyrighted work is to obtain data.

 

The contention between Silverman and the AI companies culminated in July 2023, when the author filed a class action suit along with two other authors, Christopher Golden and Richard Kadrey. The plaintiffs sued ChatGPT developer OpenAI and Mark Zuckerberg's Meta for copyright infringement over claims that the companies' artificial intelligence models were trained on the authors' work without permission. The authors object to this use because the training dataset retains specific knowledge of their protected work and the models can output similar content. They claim their works were infringed because the AI programs are creating derivative works without permission and in violation of their exclusive rights under the Copyright Act. Both OpenAI and Meta individually filed motions to dismiss. The AI developers argue that in seeking to restrict AI use the authors "misconceive the scope of copyright, failing to take into account the limitations and exceptions (including fair use) that properly leave room for innovations like the large language models now at the forefront of artificial intelligence."

 

With all the impending litigation surrounding the issue of AI fair use, the courts' decisions are vital in determining the future and direction of AI technologies. Without greater protection for artists, their work will continue to be used without their consent and adapted in unintended ways. The courts must look for a moderate approach that allows AI companies to license artists' work for training purposes; doing so would allow artists to receive credit and profits while also allowing the work to be used in AI programming. Licensing in this way is similar to what is already used in various other areas of copyright law, such as music and apps, so this would be an expansion of an already existing concept.

Student Bio: Allison Nickerson is a second-year law student at Suffolk University Law School.  She is a staff writer for the Journal of High Technology Law.  She graduated from North Carolina State University with a Bachelor of Arts in Political Science.  

 

Disclaimer: The views expressed in this blog are the views of the author alone and do not represent the views of JHTL or Suffolk University Law School.
