
The Regulation of Emerging Neurotechnologies

Since the Decade of the Brain in the 1990s, billions have been invested in neuroscientific research and in the development of neurotechnologies, which, roughly, either measure and detect neuronal processes (e.g., fMRI) or alter brain processes (e.g., brain stimulation methods). Their advancement, increasingly driven by AI, enables some astonishing medical applications, but it also raises numerous ethical, legal, and some quite intriguing philosophical questions that have given rise to the emerging scholarly field of neuroethics. It is worth noting, however, that brain stimulation methods were studied as early as the late 1950s, sometimes in unethical experiments, and that the EEG was discovered a century ago, in 1923. Grand political visions of "civilizing humanity" by changing people's brains were formulated in the 1970s. Not everything neuro is necessarily novel. In recent years, growing worries about the misuse of neurotechnologies – sometimes fueled by sensational headlines and technological overhype – have, among other things, prompted work on an international soft-law document regulating neurotechnologies at UNESCO, which could serve as a first step toward a future binding international instrument. The regulatory debates concern various questions, several of which I have worked on in international research projects.

Some contemporary debates concern human rights and neurotechnologies. Fears that existing human rights are unable to address the challenges raised by neurotechnologies have prompted calls for novel human rights (sometimes called "neurorights"). The fear has some basis in reality insofar as existing rights often fall short of providing adequate protection to the human mind, a shortcoming that permeates various legal fields, not only human rights law, and that is related to specific legal problems with mental processes (such as problems of proof and causation). Neuroscience and neurotechnologies put the legal protection of the human mind on the table. Apart from that, however, the narrative of the deficiency of existing rights is largely unfounded and even dangerous, as it undercuts the protection afforded by existing rights. Core existing rights form several layers of protection around the human person without leaving substantive gaps, provided they are properly interpreted and applied. What this means and implies are the more interesting and legally relevant questions: how strong are these rights, do they need to be further developed, and under which conditions might they be outweighed by competing rights or interests, e.g., governments' interests in using neurotechnologies for safety and security measures? I argue for a principled – and unconditional – protection of the human mind, or at least some parts of it, in a modernized version of the forum internum. For this debate, you may want to read:

Moreover, many problems – especially those arising from private actors such as large digital companies – are neither situated nor adequately solved at the level of human rights law, as they concern relations between private parties. States should address them at the level of positive domestic law.

One set of questions concerns (mental) privacy and protections against the unwanted detection and measuring of brain processes, which is especially worrisome if it affords inferences about persons' mental states ("mind reading") outside of the medical context. This is partly a matter of data protection law, and one solution in Europe is to place neural data, perhaps with some qualifiers, in the category of specially protected data under the GDPR (as we suggested here). However, as breaches of data protection law are ubiquitous and the GDPR is only weakly enforced, I suggest adopting a novel criminal offense, mainly to deter well-calculated big-data breaches involving brain data. I call this offense mind probing (forthcoming). It would complement another offense protecting the mind that I suggested some years ago:

  • Bublitz/Merkel, Crimes Against Minds: On Mental Manipulations, Harms and a Human Right to Mental Self-Determination. Criminal Law and Philosophy 8, 51–77 (2014) [Link].

Another intriguing aspect is the blending of the human mind, the organic brain, neurotech implants, and the AI software running on them. This assemblage of biological and artificial hardware and software, of code and device, unsettles and transgresses the boundaries of the body, the person, and the artefact. This has legal implications sketched out in a few papers. The first examines the boundaries of the person and whether implants may become part of the body and the person.

  • Bublitz, The body of law: boundaries, extensions, and the human right to physical integrity in the biotechnical age, Journal of Law and the Biosciences, Volume 9, Issue 2, July-December 2022, lsac032 [Link]

Drawing on those findings, the following paper argues that some AI devices become part of the person, including the AI itself. This blending with AI creates the most intimate connection between persons and AI conceivable. To avoid the objectification and commodification of the person, I argue for the vanishing of third-party rights (e.g., copyright) in implants. In that regard, a line of reasoning supporting the sovereignty of the person leads from the abolition of slavery to curbing IP rights in neurotechnologies.

  • Bublitz, Might artificial intelligence become part of the person, and what are the key ethical and legal implications? AI and Society 39, 1095–1106 (2024). [Link]

Finally, whether non-medical consumer neurotech for various purposes – from gaming to human enhancement – should be allowed on the market is still an open question. The EU has recently set conditions for non-medical brain stimulation devices, but whether they are adequate is a matter of debate. More here:

  • Bublitz/Ligthart, The novel EU regulation of non-medical neurotech (forthcoming).

Given the many and still hard-to-ascertain side effects of novel technologies such as smartphones, a cautious and restrictive approach seems preferable, even at the price of slowing innovation. Good reasons speak for a moratorium, at least on non-medical invasive neurotechnologies, which are not yet available but might be ready for the market in the coming years. Regulating them now might be a wise approach.