Chatbots as a Tool for Pro Se Litigants

By: Hayden McGuire

It is hard to dispute that artificial intelligence (“AI”) will have an enormous impact on daily life, and people’s interactions with the legal system are no exception.  Currently, some organizations are pushing the notion of fully automated lawyers.  For instance, the company DoNotPay is soon to pilot its AI assistant, which will give audio instructions on how to respond to the judge during a parking ticket hearing.  Others worry about the ethical implications of attorneys relying on AI to the detriment of their own due diligence.  Regardless of any one person’s position on supplementing or replacing lawyers with AI, it cannot be ignored that there are tools in existence that can be, and are being, used in the legal process.

One implementation of AI is the chatbot, in which the user supplies a query and receives a computer-generated response designed to mimic a human.  Chatbots have become commonplace on many websites in the last few years, acting as automated customer service for the user.  In the legal space, the company LawDroid has produced Copilot, a chatbot marketed as its “AI Legal Assistant [designed] to Research Legal Issues, Help Draft Emails and Letters, Summarize Documents and Keep You Happy.”
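For readers curious what that query-and-response exchange looks like in practice, below is a minimal sketch of a chatbot call, assuming OpenAI’s Python client; the model name, prompts, and question are illustrative assumptions rather than a description of any product mentioned in this post.

```python
# Minimal sketch of a chatbot exchange, assuming OpenAI's Python client
# ("pip install openai") and an API key in the OPENAI_API_KEY environment variable.
# The model name and prompts are illustrative assumptions, not any vendor's actual setup.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # hypothetical model choice for illustration
    messages=[
        {"role": "system", "content": "You explain court procedures in plain language."},
        {"role": "user", "content": "What does it mean to appear 'pro se' in court?"},
    ],
)

# The computer-generated reply, written to read like a human answer.
print(response.choices[0].message.content)
```

A website chatbot wraps this same kind of call in a loop, appending each question and answer to the running conversation, which is what produces the human-like back-and-forth described above.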

A significant value of chatbots in the legal context comes from their ability to simplify complex concepts and respond to clarifying questions.  These abilities of simplification and clarification may offer one method of increasing effective access to the judicial system for self-represented litigants (“SRLs”).  For people not versed in the language of the courts, even completing a form can be a daunting task.  When you factor in cultural or language barriers, that task can quickly become insurmountable.  Using a chatbot as a starting point for researching a legal issue, simplifying a legal concept, or answering specific questions about a form or procedure could prove to be an invaluable method of increasing access to justice.

ChatGPT is a chatbot created by OpenAI that has gained prominence since its public release in November 2022.  ChatGPT has already been used to write a scholarly article: Suffolk University Law School’s Dean Andrew Perlman co-authored a treatise with ChatGPT, titled Implications of Open AI’s Assistant for Legal Services and Society.  In the treatise, Dean Perlman supplied a variety of queries, requesting production of legal documents, asking questions about specific cases, and asking questions about AI’s role within law and society.

Among the questions asked, two showed how the tool has potential for those lacking representation.  First, Dean Perlman requested information and guidance concerning a disagreement over a child’s Individualized Education Program (“IEP”) in Massachusetts, and then he asked for help correcting a Social Security payment amount.  Each of these questions relates to an issue that can have disproportionate impacts on those who likely lack the financial resources to obtain representation.  The special education system favors the wealthy, resulting in those with robust financial resources having far more options and power when negotiating with school systems.  Additionally, considering the extent to which Social Security reduces poverty among the elderly, there is some indication that insufficient payments may push a substantial number of people into or near poverty.

However, the exchange in which Dean Perlman asked ChatGPT for assistance with a disagreement over a child’s IEP makes clear that the information it provided is a valuable starting point, but still incomplete.  ChatGPT responded with three options: request modification of the IEP directly from the school, use third-party mediation to resolve the disagreement, or file a due process complaint with the relevant government department.  At first, this appears to be good information, but when you compare it to MassLegalHelp’s guidance on the same issue, it is clearly incomplete.

A parent who gets their legal advice exclusively through ChatGPT, or another chatbot, is likely to be underinformed.  This could result in a parent making poor decisions and losing the ability to exercise rights that would help secure their child’s access to an adequate education, such as invoking the “stay-put” rule, accepting only part of the drafted IEP, or maintaining an existing IEP.

Since the publication of Dean Perlman’s article, the response provided to the identical query has changed, and arguably has improved slightly.  It now tells the user that the information is incomplete and recommends finding an attorney.  However, the response still makes no mention of “stay-put,” leaving a parent without critical information about their rights in this situation.

The experiences of the public and pro se litigants when attempting to exercise their rights and use the court system could be drastically improved by adapting chatbots like ChatGPT into a tool designed with the pro se litigant in mind.  The bot itself described the potential well:

Query: Write a blog conclusion about the benefits that chatbots could have for self-represented litigants and the public using the courts and exercising their rights.

Response: In conclusion, chatbots have the potential to revolutionize the way that self-represented litigants and the general public interact with the court system.  These AI-powered tools can provide quick and easy access to information, help people navigate the legal process, and even assist in the preparation of legal documents.  By making the court system more accessible and user-friendly, chatbots can empower people to exercise their rights and access justice more effectively.  Furthermore, chatbots can also help reduce the workload of court staff and attorneys, allowing them to focus on more complex and nuanced cases.  Overall, the implementation of chatbots in the court system could be a win-win for everyone involved, making the legal system more efficient and effective for all.

ChatGPT even condensed its response into a tweet when I asked.  “Chatbots can revolutionize court access for self-represented litigants & the public by providing quick & easy info, legal assistance, document preparation & reducing workload for court staff & attorneys.  Empowering people to access justice more effectively #legaltech #chatbot #justice.”

Of course, ChatGPT does not claim to be a legal information provider.  As I engaged with the chatbot, it consistently responded to my questions by recommending that I get an attorney knowledgeable about my jurisdiction and legal issue, which is clearly a good thing.  Still, it cannot be ignored that technology like ChatGPT can, and should, be designed to act as a starting point for self-represented litigants navigating civil legal disputes.  Access to legal information is a key step in promoting access to justice.  Tools like ChatGPT are not perfect, but, when combined with other traditional means of information gathering like law libraries, legal self-help centers, and court staff, they could prove to be an effective method of disseminating legal information.

The distribution and simplification of legal information and procedure for self-represented litigants are key steps in promoting access to justice.  Despite their current shortcomings, chatbots are one avenue that should not be overlooked in this process.


Student Bio: Hayden McGuire is a 2L at Suffolk University Law School and a staffer for the Journal of High Technology Law.  He received a Bachelor of Arts Degree in Politics, Philosophy and Economics from Suffolk University.

Disclaimer: The views expressed in this blog are the views of the author alone and do not represent the views of JHTL or Suffolk University Law School.
