Mindhacker: Security & Privacy Risks of Brain-Computer Interfaces

By: April Garbuz

Before you know it, you could have a computer chip implanted into your brain that can directly control your phone. These computer chips, known as Brain-Computer Interfaces (“BCIs”), work by measuring brain activity and wirelessly translating the neural signals – essentially merging humans and machines. While this technology has been in development for decades, Elon Musk and his company Neuralink are now working to develop a new BCI (“The Link”) for the masses. The Link is stitched by a sewing-machine-like robot into a person’s gray matter using thin electrode-studded wires, which enable the user to send a text message or use an app without lifting a finger. Through advancements like The Link, there could be major breakthroughs in treating neurological conditions such as Alzheimer’s and dementia, and even in repairing spinal cord injuries. Still, this fusion of human and artificial intelligence raises significant privacy concerns and may expose our brains to cyberattacks.

For better or for worse, BCIs make it easier for humans to communicate with computers. This ability can be used in medical technologies, user authentication processes, gaming and entertainment systems, and smartphone-based applications. In the medical field, this capability has been leveraged for over fifteen years, allowing for incredible strides such as enabling a paralyzed patient to operate a prosthetic with his mind.

Of the potential uses, BCI smartphone-based apps likely pose the greatest security risk because they are vulnerable to attacks on the paired smartphone. For example, third-party apps can access private data obtained from BCIs and stored on mobile devices, which can then be illegally transferred by malware to external servers. Even absent malicious developers extracting private user information, third-party BCI application developers can get unrestricted access to data through application programming interfaces (“APIs”). Because app developers have complete discretion over the stimuli that can be shown to the user, a malicious developer could use the app to send any stimuli they choose, collect users’ brain signals as they react, and analyze the signals to extract private information. If third-party apps are granted access to back-end data from The Link, it will be difficult to track exactly what information stays in your brain and on your device, and what information is transmitted to the app’s vendor.
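The attack pattern described above can be made concrete with a short sketch. Everything here is hypothetical: the function names, the stimuli, and the 64-sample signal format are invented for illustration and do not correspond to any real BCI SDK; a real attack would capture an evoked neural response (such as a P300 wave) after each stimulus rather than random noise.

```python
import random

# Hypothetical stimuli a malicious app might flash on screen; reactions to
# familiar vs. unfamiliar items can leak private information.
STIMULI = ["bank logo A", "bank logo B", "familiar face", "stranger's face"]

def read_brain_signal():
    # Stand-in for an unrestricted API call returning a raw neural sample;
    # random noise here, an evoked response in a real attack.
    return [random.gauss(0.0, 1.0) for _ in range(64)]

def harvest(stimuli):
    # Pair each stimulus with the reaction it evoked, building a dataset
    # the attacker can analyze offline to infer private information.
    return {s: read_brain_signal() for s in stimuli}

collected = harvest(STIMULI)  # dict mapping stimulus -> 64-sample response
```

The point of the sketch is how little the attacker needs: if the platform hands raw signals to any app, the "analysis" step happens entirely on the attacker's servers, outside the user's view.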

Further, while Neuralink boasts that The Link will allow users to be “in control” with a Bluetooth connection and operate any mouse or keyboard, it is this Bluetooth capability that makes the device vulnerable to network-level cyberattacks. Notably, cybersecurity experts suggest avoiding Bluetooth for communicating sensitive information, which raises the question – are neural signals too sensitive to transfer via Bluetooth? In addition to the security concerns, Bluetooth-connected computer chips in the brain raise serious privacy concerns. Once a user gives an app permission to access Bluetooth, the app can infer the location of the device based on its proximity to Bluetooth beacons that broadcast one-way Bluetooth signals. These wireless transmissions could be used to track geographical movements, a form of electronic surveillance that will likely lead to unreasonable searches and privacy infringements.
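To illustrate how beacon proximity becomes location data, here is a minimal sketch of the standard log-distance path-loss model used to turn a beacon's received signal strength (RSSI) into an approximate distance. The calibration constants (RSSI at one meter, path-loss exponent) are assumptions; real deployments calibrate both per environment, but the principle holds: a stronger signal means a closer, and therefore locatable, device.

```python
def estimate_distance_m(rssi_dbm, measured_power_dbm=-59, n=2.0):
    """Estimate distance (meters) to a beacon from an RSSI reading.

    measured_power_dbm: assumed RSSI at 1 m (a per-beacon calibration value).
    n: assumed path-loss exponent (~2.0 in free space, higher indoors).
    """
    return 10 ** ((measured_power_dbm - rssi_dbm) / (10 * n))

# A less negative RSSI implies the phone is closer to the beacon.
near = estimate_distance_m(-59)  # about 1 m with these assumed constants
far = estimate_distance_m(-79)   # about 10 m with these assumed constants
```

With distances to several beacons at known positions, an app can triangulate a user's position inside a store or building, which is how Bluetooth beacons enable the movement tracking described above.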

To curb some of the dangers of BCI cyberattacks, experts suggest applying the principle of device segregation, separating network segments based on role and functionality, but this seems unlikely, as Neuralink’s business plan centers on compatibility with Apple iOS and Google Android. Given the serious potential for misuse, raw brainwave signals from BCI devices should not be given directly to apps, in the interest of protecting private data from both attacks and possible mining.

Despite the serious privacy and security issues, Neuralink recently successfully implanted the chip in pigs and hopes to begin human clinical trials in the coming year. With the seeming inevitability of this technology, we must decide who should have access to individuals’ neural data, which purposes neural data can be used for, and what risks are associated with the misuse of that data. Even after undergoing the proper testing and receiving approval from the FDA, Neuralink must ensure that adequate cybersecurity protections are in place before making The Link available to the public.

Student Bio: April Garbuz is a second-year law student at Suffolk University Law School pursuing a concentration in Intellectual Property. She is also a Staff Member of the Journal of High Technology Law and holds a B.S. in Physiology and Neurobiology from the University of Connecticut.

Disclaimer: The views expressed in this blog are the views of the author alone and do not represent the views of JHTL or Suffolk University Law School.

 
