Scientific advances are rapidly making science-fiction concepts such as mind-reading a reality — and raising thorny questions for ethicists, who are considering how to regulate brain-reading techniques to protect human rights such as privacy.
On 13 July, neuroscientists, ethicists and government ministers discussed the topic at a Paris meeting organized by UNESCO, the United Nations scientific and cultural agency. Delegates plotted the next steps in governing such ‘neurotechnologies’ — techniques and devices that directly interact with the brain to monitor or change its activity. The technologies often use electrical or imaging techniques, and run the gamut from medically approved devices, such as brain implants for treating Parkinson’s disease, to commercial products such as wearables used in virtual reality (VR) to gather brain data or to allow users to control software.
How to regulate neurotechnology “is not a technological discussion — it’s a societal one, it’s a legal one”, Gabriela Ramos, UNESCO’s assistant director-general for social and human sciences, told the meeting.
Advances in neurotechnology include a neuroimaging technique that can decode the contents of people’s thoughts, and implanted brain–computer interfaces (BCIs) that can convert people’s thoughts of handwriting into text [1].
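At its core, a decoder of this kind maps recorded neural activity to an intended output. The Python sketch below illustrates that principle on entirely synthetic data: invented ‘neural’ feature vectors are classified into characters. It is a toy illustration only, not the published method; real handwriting BCIs decode multi-electrode recordings with recurrent neural networks, and every name and number here is an assumption.

```python
# Toy illustration of the core of a BCI text decoder: classify neural
# feature vectors into intended characters. All data here is synthetic;
# real systems decode multi-electrode recordings with recurrent networks.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
chars = list("abc")
n_features = 64          # e.g. firing rates from 64 hypothetical channels

# Pretend each character evokes a characteristic neural pattern plus noise.
templates = {c: rng.normal(size=n_features) for c in chars}

def record_trial(char):
    """Simulate one noisy neural recording for an intended character."""
    return templates[char] + rng.normal(scale=0.5, size=n_features)

# Build a labelled training set, as a real system would during calibration.
X = np.array([record_trial(c) for c in chars for _ in range(100)])
y = np.array([c for c in chars for _ in range(100)])

decoder = LogisticRegression(max_iter=1000).fit(X, y)

# Decode a new "thought" sequence one character at a time.
intended = "cab"
decoded = "".join(decoder.predict(record_trial(c).reshape(1, -1))[0]
                  for c in intended)
print(decoded)  # 'cab', with high probability at this noise level
```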
The field is growing fast — UNESCO’s latest report on neurotechnology, released at the meeting, showed that, worldwide, the number of neurotechnology-related patents filed annually doubled between 2015 and 2020. Investment rose 22-fold between 2010 and 2020, the report says, and neurotechnology is now a US$33-billion industry.
Devices abound
One area in need of regulation is the potential for neurotechnologies to be used to profile individuals and even to manipulate people’s thoughts and behaviour, an Orwellian prospect. Mass-market brain-monitoring devices would be a powerful addition to a digital world in which corporate and political actors already use personal data for political or commercial gain, says Nita Farahany, an ethicist at Duke University in Durham, North Carolina, who attended the meeting.
Policymakers face the challenge of creating regulations that protect against potential harms of neurotechnologies without restricting research into their benefits. And medical and consumer products present distinct challenges, says Farahany.
Products intended for clinical use are largely governed by existing regulations for drugs and medical devices. For instance, a system that monitors the brain activity of people with epilepsy and stimulates their brains to suppress potential seizures is in clinical use [2]. More advanced devices — such as implanted BCIs, which allow people who are paralysed to control various external devices using only their thoughts — are in trials.
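The control principle behind such a responsive system can be stated simply: monitor a brain signal continuously and deliver stimulation only when a seizure-like signature is detected. Below is a minimal sketch of that detect-and-stimulate loop on simulated data; the signal model, detector and threshold are invented for illustration, and clinical devices rely on validated, patient-specific detection algorithms.

```python
# Simplified sketch of a closed-loop "detect and stimulate" cycle, the
# principle behind responsive neurostimulation for epilepsy. The signal,
# detector and threshold below are invented for illustration only.
import numpy as np

rng = np.random.default_rng(1)
THRESHOLD = 3.0          # hypothetical detection threshold
WINDOW = 50              # samples per analysis window

def read_window():
    """Stand-in for reading one window of brain-activity samples."""
    signal = rng.normal(scale=1.0, size=WINDOW)
    if rng.random() < 0.05:                  # occasional seizure-like burst
        signal += rng.normal(loc=5.0, scale=1.0, size=WINDOW)
    return signal

def seizure_signature(window):
    """Crude detector: flag windows with unusually high signal power."""
    return np.sqrt(np.mean(window ** 2)) > THRESHOLD

def stimulate():
    print("stimulation pulse delivered")

for _ in range(200):                         # a real device loops indefinitely
    if seizure_signature(read_window()):
        stimulate()                          # respond only when triggered
```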
But commercial devices are of more pressing concern to ethicists. Companies ranging from start-ups to tech giants are developing wearables for widespread use: headsets, earbuds and wristbands that record different forms of neural activity, giving their manufacturers access to that information.
The privacy of this data is a key issue. Rafael Yuste, a neuroscientist at Columbia University in New York City, told the meeting that an unpublished analysis by the Neurorights Foundation, which he co-founded, found that 18 companies offering consumer neurotechnologies have terms and conditions that require users to give the company ownership of their brain data. All but one of those firms reserve the right to share that data with third parties. “I would describe this as predatory,” Yuste says. “It reflects the lack of regulation.”
The need to regulate commercial devices is urgent, says Farahany, given that the potential market for these products is large and that companies might soon seek to profit from people’s neural data.
Need for neurorights
Another theme at the meeting was how the ability to record and manipulate neural activity challenges existing human rights. Some speakers argued that existing human rights — such as the right to privacy — cover this innovation, whereas others think changes are needed.
Two researchers discussed the idea of ‘neurorights’: rights that protect a person’s neural activity from being accessed or altered by third parties.
“Neurorights can be both negative and positive freedoms,” Marcello Ienca, a philosopher at the Technical University of Munich, Germany, told the meeting. A negative freedom is, for instance, freedom from having one’s brain data accessed without consent. A positive freedom might be people’s ability to access a valuable medical technology equitably.
Yuste and his colleagues propose five main neurorights: the right to mental privacy; protection against personality-changing manipulations; protected free will and decision-making; fair access to mental augmentation; and protection from biases in the algorithms that are central to neurotechnology.
Yuste and Ienca hope that their proposals will inform the debate about brain-reading regulation and the challenges to human-rights treaties.
Concrete change
Countries including Chile, Spain, Slovenia and Saudi Arabia have started developing regulations, and representatives discussed their nations’ work at the meeting. Chile stands out because in 2021 it became the first nation to update its constitution to acknowledge that neurotechnology needs legal oversight.
Carolina Gainza Cortés, Chile’s under-secretary for science and technology, said that the country is developing new legislation, and that lawmakers are discussing how to preserve human rights while allowing research into the technologies’ benefits.
The next step for UNESCO member states will be to vote in November on whether the organization should produce global guidelines on neurotechnology, similar to those it is finalizing for artificial intelligence, which help countries to implement legislation. “My hope is that we move from ethics principles to concrete, legal frameworks,” says Farahany.
“When it comes to neurotechnology, we are not too late,” Farahany told the meeting. “It hasn’t gone to scale yet across society.”