Summary
Artificial intelligence is developing rapidly and spreading throughout the healthcare industry. A newly developed AI-ECG could revolutionize cardiovascular care by lowering costs, increasing accessibility, and detecting heart diseases or conditions earlier. Given this high potential, policymakers must balance the benefits of AI with consumer and legal protection.
By: Kendall Milligan, JHBL Staffer
Introduction
In 2022, the global market for AI in healthcare was worth nearly $15.1 billion, and experts expect it to reach $187.95 billion by 2030.[1] Artificial intelligence is a computerized system that can make predictions, recommendations, or decisions when given human-defined goals.[2] It analyzes data to automatically create models or algorithms, which it then uses to suggest information or actions.[3] Researchers believe AI can revolutionize the global treatment of cardiovascular disease through earlier detection of signs and symptoms.[4]
In pursuit of improved cardiovascular care, the Mayo Clinic developed an AI-enabled electrocardiogram (hereinafter AI-ECG).[5] The AI-ECG can identify heart conditions earlier than humans, is inexpensive to create, and has broad applicability across populations.[6] An electrocardiogram (hereinafter ECG) detects a person’s heartbeat through electrodes that track electrical signals in the heart.[7] The AI-ECG feeds the ECG data into AI algorithms that interpret the results.[8] Researchers currently use Mayo Clinic’s AI-ECG to predict a patient’s likelihood of having a heart disease or condition later in life.[9]
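To make the pipeline concrete, the sketch below shows, at a very high level, how features extracted from an ECG recording might feed into a predictive model. This is a deliberately simplified, hypothetical illustration; the feature (heart rate from R-peak timing), the logistic form, and the weight and bias values are all assumptions for exposition, not Mayo Clinic's actual AI-ECG algorithm.

```python
import math

def extract_heart_rate(r_peak_times_sec):
    """Estimate beats per minute from the times (in seconds) of R-peaks,
    the sharp spikes an ECG records at each heartbeat."""
    intervals = [b - a for a, b in zip(r_peak_times_sec, r_peak_times_sec[1:])]
    mean_rr = sum(intervals) / len(intervals)  # mean beat-to-beat interval
    return 60.0 / mean_rr

def risk_score(heart_rate_bpm, weight=0.05, bias=-4.0):
    """Toy logistic model mapping a single feature to a 0-to-1 'risk' score.
    The weight and bias are made-up numbers for illustration only."""
    return 1.0 / (1.0 + math.exp(-(weight * heart_rate_bpm + bias)))

# A simulated recording with one beat per second, i.e. 60 bpm.
peaks = [0.0, 1.0, 2.0, 3.0, 4.0]
hr = extract_heart_rate(peaks)
score = risk_score(hr)
```

A real AI-ECG model would learn from thousands of features across the full waveform rather than a single hand-picked one, but the basic shape (signal in, features out, prediction from features) is the same.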
Background
The AI-ECG appeals to researchers and developers because of its current ability, potential for further development, low cost, and accessibility.[10] People are often unaware they have a heart condition until they suffer a heart attack, stroke, or cardiac arrest.[11] The AI-ECG’s prediction of a heart issue could lessen unexpected cardiovascular events by alerting physicians of the likely diagnosis and prompting treatment sooner.[12] The AI-ECG is also more accessible than typical monitoring devices because it is very small, portable, and inexpensive to make.[13]
Despite the potential benefits of the AI-ECG, lawmakers must establish safeguards to protect against potential errors and consumer deception.[14] The U.S. Food and Drug Administration (hereinafter FDA) is responsible for the safety and effectiveness of drugs, medical devices, and certain types of medical software.[15] The FDA and Congress addressed developing technologies by creating new guidelines and regulations for AI-enabled medical devices.[16]
There are four main criticisms of the AI-ECG and AI medical devices in cardiology: the “black box,” “overfitting,” slow regulatory processes, and the balancing of developers’ financial interests against patients’ personal interests.[17] “Black box” refers to the tendency of deep learning algorithms to make decisions in ways that humans cannot understand.[18] This lack of interpretability conflicts with clinicians’ evidence-based approach to medicine.[19] Clinicians are hesitant to trust and use AI-enhanced devices, making it difficult to integrate these systems into comprehensive patient assessments.[20] The next problem, “overfitting,” refers to how AI-enhanced devices learn and identify patterns from training data, including irrelevant details.[21] “Overfitting” causes a lack of generalizability and leads to problems when algorithms are later applied to particular areas or populations.[22]
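The extreme case of overfitting described in the sources, where a model only predicts an outcome when a patient's data exactly matches the training data, can be sketched as a model that simply memorizes its examples. This is an illustrative toy, not any real device's algorithm, and the feature values and labels are invented:

```python
def train_memorizer(examples):
    """'Training' is just storing (features, label) pairs in a lookup table,
    an extreme caricature of an overfit model."""
    return {tuple(features): label for features, label in examples}

def predict(model, features):
    """Return the memorized label, or None when the input was never seen."""
    return model.get(tuple(features))

# Hypothetical training data: [heart rate, systolic blood pressure] -> label.
training = [([72, 120], "low risk"), ([95, 160], "high risk")]
model = train_memorizer(training)

predict(model, [72, 120])  # "low risk": this exact patient was seen in training
predict(model, [73, 121])  # None: a nearly identical patient goes unrecognized
```

The memorizer scores perfectly on the data it was trained on yet fails on any new patient, which is the generalizability failure the "overfitting" critique warns about.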
Additionally, because interest in AI is high, large companies consistently create new devices.[23] This rapid development can make it difficult for regulators, particularly the FDA, to craft regulations that keep pace with AI developments.[24] Developers struggle to identify best practices for creating AI devices, which can cause compliance issues.[25] Researchers argue that a slower regulatory pace can lead to inequitable access and lapses in safety and effectiveness.[26] Finally, scholars argue that because AI is a high-profit market, institutions looking to develop AI mechanisms must balance the desire to make a profitable product with the altruistic goal of improving patient care and health outcomes.[27]
Analysis
The rapid development of AI has created a new wave of healthcare with possibilities to provide better care globally.[28] AI has the potential to revolutionize cardiovascular diagnosis and treatment processes.[29] The AI-ECG is a particularly exciting development in cardiology because clinicians could use the small device globally, as well as use its algorithms to analyze ECG data and predict heart conditions.[30] Not only does this help clinicians diagnose before an event like a heart attack occurs, but it also provides services to populations where such care is typically inaccessible.[31]
AI integration into healthcare should also be met with caution.[32] Developers must ensure they create and deploy algorithms in a fair and equitable manner to avoid biases that may disproportionately affect certain populations, including minorities, marginalized groups, and areas where healthcare is less accessible.[33] While the “black box” and “overfitting” may leave clinicians hesitant to implement AI-enhanced devices, accessibility and the balancing of interests could threaten patients’ care.[34] A lack of regulations addressing AI advancements can result in delayed adoption of devices by clinicians and leave developers with little guidance.[35] Without this guidance, developers can create AI devices with improper algorithms.[36] Moreover, balancing the interests of developers and patients is crucial.[37] While developers aim to revolutionize the healthcare sphere, there is also an incentive for return on investment.[38] Interest in monetary gains can lead to the development of only profitable types of AI-enhanced mechanisms, rather than AI that addresses healthcare needs.[39]
Conclusion
The AI-ECG holds immense promise for improving cardiovascular care.[40] Addressing the challenges of the “black box” transparency problem, “overfitting,” regulation, and ethics is essential to realizing its full potential.[41] If improving patient care is not the focus of AI development, AI could make medical services more inaccessible and create distrust between clinicians and patients.[42]
Disclaimer: The views expressed in this blog are the views of the author alone and do not represent the views of JHBL or Suffolk University Law School.
Kendall Milligan is interested in health law and litigation, with a particular focus on mental health. Kendall graduated from Trinity College in 2021 as a double major in Psychology and Human Rights.
__________________________________________________________________________________________________________________________________________
[1] See Gamith Adasuriya & Shouvik Haldar, Next Generation ECG: The Impact of Artificial Intelligence and Machine Learning, 17 Current Cardiovascular Risk Rep. 143, 143 (2023).
[2] See id. (defining artificial intelligence).
[3] See Haider J. Warraich et al., FDA Perspective on the Regulation of Artificial Intelligence in Health Care and Biomedicine, JAMA E1, E1 (2024).
[4] See Adasuriya & Haldar, supra note 1, at 146. Researchers conducted studies in various countries, including Taiwan, Nigeria, and the United Kingdom, which yielded positive results. Id.
[5] See Terri Malloy, Spotlight on Early Detection of 3 Heart Diseases Using ECG-AI, Mayo Clinic (Feb. 22, 2024), https://newsnetwork.mayoclinic.org/discussion/spotlight-on-early-detection-of-3-heart-diseases-using-ecg-ai/ [https://perma.cc/W5KD-DBFH] (notifying public of new AI-ECG).
[6] See Adasuriya & Haldar, supra note 1, at 145. Researchers note AI’s potential to streamline the process of identifying symptoms, pre-conditions, and diagnosing. Id.
[7] See Electrocardiogram (ECG or EKG), Mayo Clinic, https://www.mayoclinic.org/tests-procedures/ekg [https://perma.cc/PLN8-DQFE] (last visited Nov. 22, 2024) (describing ECG); see also Adasuriya & Haldar, supra note 1, at 144 (defining ECG). The ECG records electrical activity of a heart through electrodes which help create a visual representation of the cardiac cycle. Adasuriya & Haldar, supra note 1, at 144.
[8] See Malloy, supra note 5. The AI-ECG utilizes both machine learning and the subfield of machine learning, deep learning. Id. Both forms of AI use algorithms to analyze large amounts of data to identify different patterns and tendencies. Id. In cardiology, physicians may utilize deep learning for disease phenotyping, outcome prediction, and help show best decisions for complex issues. Id.
[9] See id. People often experience a wide variety of symptoms that can be difficult to detect and identify as indicators of a heart condition. Id. Researchers argue the algorithms will aid diagnosis by computing the long list of symptoms and predicting whether someone has a heart condition more accurately than humans. Id.
[10] See id. (highlighting benefits of AI-ECG).
[11] See id. Because a person may experience symptoms that could be signs of a variety of conditions besides heart conditions, people often seek care too late, or clinicians fail to identify the heart condition. Id. This can result in people receiving a diagnosis only after a severe episode such as a heart attack. Id.
[12] See Malloy, supra note 5 (noting AI-enhanced medical devices’ success in diagnosing earlier).
[13] See id. Researchers argue the use of these devices in Nigeria demonstrates their easy accessibility and therefore the high potential positive impact of the medical device. Id.
[14] See Douglas McNair et al., Artificial Intelligence in Healthcare: The Hope, the Hype, the Promise, the Peril 198-99 (Michael Matheny et al. eds., 2022); see also Warraich et al., supra note 2, at E1-E2. Because of AI’s global reach, the FDA and Congress placed regulations to ensure devices comply with global distribution and development standards. Id.
[15] See Adasuriya & Haldar, supra note 1. Since the FDA’s first authorization of a partially AI-enabled medical device in 1995, the agency has approved nearly 1,000 AI-enabled medical devices. Id.
[16] See Warraich et al., supra note 2. In healthcare alone, lawmakers enforce regulations under the FDA, the Federal Food, Drug, and Cosmetic Act, the Health Insurance Portability and Accountability Act, the Federal Trade Commission Act, the Federal Trade Commission’s Health Breach Notification Rule, common law, and state tort law. Id. This discussion will primarily focus on the U.S. Food and Drug Administration’s role in regulating the safety and effectiveness of medical devices, in this case, the AI-ECG. Id. Policymakers must also consider consumer hesitation toward new AI technology. Id. From January 2021 through 2022, the FDA created and finalized guidance documents to help developers remain compliant without having to keep returning to the FDA for approval of newly developed AI mechanisms. Id. at E2. Similarly, Congress created a risk-based approach with different levels of regulation for AI-enabled medical devices. Id. at E3.
[17] See Adasuriya & Haldar, supra note 1, at 148 (identifying “black box” and “overfitting”); see also Warraich et al., supra note 2, at E5 (noting high financial gains in AI).
[18] See Adasuriya & Haldar, supra note 1, at 148. Researchers coined the term “black box” to describe AI’s lack of transparency in how its algorithms identify factors, create patterns from these factors, and how it makes determinations. Id.
[19] See id. (comparing clinician’s approach and “black box”).
[20] See id. Because clinicians are typically trained under an evidence-based approach, a lack of ability to read and understand the data may worry clinicians. Id.
[21] See id. (highlighting that AI devices often incorporate noise and other irrelevant data). Incorporating this data into its pool can lead to incorrect results. Id. For example, incorporating the noise in the room makes it difficult to apply the model across two different offices, making it nearly impossible to generalize across populations. Id.
[22] See Adasuriya & Haldar, supra note 1. Sometimes AI’s algorithms create models that will only predict an outcome if the patient’s data matches exactly to the training data which can cause inaccuracies. Id.
[23] See generally id. (analyzing large companies’ role developing AI).
[24] See id. at 148 (identifying difficulty creating regulations for new devices).
[25] See id. (highlighting developers’ hurdles).
[26] See Warraich et al., supra note 2, at E5. Lack of regulatory overview can lead to AI-enhanced devices producing unreliable results. Id.
[27] See id. For example, scholars theorize that even though AI developments could help provide preventive care for people living in areas lacking health care, this will likely not occur because it is not a profitable venture. Id. Moreover, scholars caution AI developers not to diminish the patient-doctor relationship and interactions. Id.
[28] See id. (theorizing AI’s potential).
[29] See Warraich et al., supra note 2 (articulating AI’s potential benefits in healthcare).
[30] See Malloy, supra note 5 (emphasizing the need for, and benefits of, the AI-ECG).
[31] See Warraich et al., supra note 2, at E5. The AI-ECG’s small size and portability creates a potential for clinicians to travel across populations and test individuals for these conditions. Id.
[32] See id. (comparing high potential with high risks of AI).
[33] See id. (directing developers to consider public interest and inequities in healthcare).
[34] See Adasuriya & Haldar, supra note 1, at 148. Researchers warn that because AI is a highly profitable industry, developers may overlook public or medical interest to create devices in a way that focuses on profits. Id.
[35] See id. If developers are unable to get past outdated regulatory procedures or must continuously go to the FDA for approval, they may not be able to create new devices. Id. This could exacerbate the gap in healthcare provided between communities. Id.
[36] See id. As AI has global applicability, developers must comply with both domestic and international regulations. Id. If the FDA provides regulations more slowly than these devices are developed, the devices may not be in compliance with either domestic or international regulations. Id.
[37] See id. Researchers caution developers to balance their want for return on investments in a booming industry with public interests. Id.
[38] See Warraich et al., supra note 2, at E5 (noting AI as a lucrative business venture).
[39] See id. While the public may want improved healthcare for lower socioeconomic communities, developers could focus on creating profitable devices that do not address this need. Id.
[40] See Malloy, supra note 5 (elaborating on the AI-ECG’s particular benefits in cardiovascular care).
[41] See Adasuriya & Haldar, supra note 1, at 148 (explaining shortfalls of AI); see also Warraich et al., supra note 2, at E5 (predicting potential inadequacies of AI development in healthcare).
[42] See Warraich et al., supra note 2, at E5 (articulating potential impact if developers focus only on profits).