At a 13 July conference, the UN’s Educational, Scientific and Cultural Organization (UNESCO) launched an effort to frame a set of ethical standards around “neurotech,” the use of computers to connect with and analyze the human mind.
Investments in neurotechnology grew 20-fold from 2010 through 2020 to $7.3 billion annually, a UNESCO report found, and the field’s patents during the period rose from 418 to 1,511.
Neurotech investment will grow to $24 billion by the end of 2026, the report predicted.
Increasingly, neurotech will rely on AI as a key tool.
“When you add AI, you’re putting neurotechnology on steroids,” Mariagrazia Squicciarini, primary author of the UNESCO report, said to the Financial Times.
AI can take neurotech beyond diagnosing neurological disorders into the realm of gathering and storing an individual’s neural data, which raises privacy concerns.
“The promise may come at a high cost in terms of human rights and fundamental freedoms, if abused,” Gabriela Ramos, UNESCO’s assistant director-general for social and human sciences, said in comments quoted by the FT.
“Neurotech can affect our identity, autonomy, privacy, sentiments, behaviors, and overall well-being,” she added. “Developments that many thought were science fiction a few years ago are here with us already and are poised to change the essence of what it means to be human.”
In the past year, researchers have claimed in four separate studies to have decoded words and images from the brains of human volunteers without using invasive techniques, Rafael Yuste, director of Columbia University’s Neurotechnology Center, told the UN conference.
The four research projects “incorporated advanced AI models to decode brain data,” Yuste said in a pre-conference FT interview. “The new algorithms will enable you to decode information that is highly sensitive, which makes the protection of mental privacy all the more urgent.”
Almost without exception, the consumer user agreements of North American neurotech companies stipulate that the companies take complete ownership of the data they collect from individuals’ brains, Yuste pointed out.
“We need to protect mental property,” he urged. “Otherwise, companies will start to bank brain data. They may not decode it today but AI will enable them to decode it tomorrow.”
In experiments with mice, Yuste’s own lab has manipulated brain function to cause hallucinations.
“Manipulation of human brain activity is something we should start discussing now,” he added. “It opens the possibility of a new type of human being where part of our mental processing happens outside the body.”
Squicciarini urges “a globally coordinated approach to regulation for neurotechnology, not only in medicine but also in the consumer market.”
TRENDPOST: Regulation of neurotech is likely to come faster than rules governing AI in general. The notion of anonymous corporations “reading your mind” or “downloading your brain” to their private data banks will alarm the public, which, in turn, will prompt politicians to act sooner rather than later.
However, “faster than AI in general” doesn’t necessarily mean “fast.” Neurotech companies will fight any regulation and will hire lobbyists to stymie the process of formulating rules for as long as possible.