
If you could read my mind – wait, can you? – The Health Care Blog

Author: Kim Bellard

One area of health tech that I have avoided writing about for years is brain-computer interfaces (BCIs). Partly because I think they're a little creepy, but more because I increasingly find Elon Musk, whose Neuralink is one of the leaders in the field, even creepier. But an article in The New York Times Magazine by Linda Kinstler set alarm bells off in my head, and I sure hope no one heard them.

Her article, "Big Tech Companies Want Direct Access to Our Brains," doesn't just discuss some of the technological advances in the field, which are admittedly impressive. No, what caught my attention was her larger point: that it's time, indeed past time, for us to start taking the privacy of what goes on in our heads very seriously.

Because we are at or rapidly approaching the moment when those private thoughts of ours will no longer be private.

The ostensible purpose of brain-computer interfaces is often to assist people with disabilities, such as those with paralysis. Being able to move a cursor, or even a limb, could change their lives. BCIs have even allowed some people to speak, and even to see. These are all great use cases, with some track record of success.

BCIs tend to take one of two paths. One uses external signals, such as electroencephalography (EEG) and electrooculography (EOG), to try to decipher what your brain is doing. The other, like Neuralink's, is an implant placed directly in the brain to sense activity. The latter approach has the advantage of more precise readings, but the obvious disadvantage of requiring surgery and wires in the brain.

There is a competition held every four years called Cybathlon, sponsored by ETH Zurich, that "serves as a platform to challenge teams from around the world to develop assistive technologies suitable for daily use by people with disabilities." The Times profile quotes a competitor who used the external-signal method but finished second to a team using implants: "We're not in the same league as the Pittsburghers. They're playing chess, we're playing checkers." He is now considering implants.

Okay, you say. I can protect my mental privacy just by not having implants, right? Not so fast.

A new paper in Science Advances discusses progress on "mind captioning." To wit:

By aligning the semantic features of text with those linearly decoded from human brain activity, we successfully generate descriptive text that represents the visual content experienced during perception and mental imagery… Together, these factors facilitate the direct translation of brain representations into text, resulting in an optimally aligned description of the visual semantic information decoded from the brain. These descriptions are well structured and accurately capture the individual components and their interrelationships without the use of a language network, thus suggesting the existence of fine-grained semantic information outside of this network. Our approach enables comprehensible explanations of internal thoughts, demonstrating the feasibility of brain-to-text communication based on non-verbal thoughts.
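The "linear decoding plus semantic alignment" recipe in that passage can be illustrated with a toy sketch. Everything here is hypothetical and synthetic: made-up captions, random stand-in "semantic features," and simulated "brain activity," where the real study used fMRI recordings and learned text embeddings. The sketch only shows the shape of the idea: fit a linear map from brain signals to semantic features, then pick the description whose semantics align best with what was decoded.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stimuli: each caption gets a stand-in semantic feature vector.
captions = ["a dog chasing a ball", "a person riding a bike", "waves crashing on rocks"]
n_voxels, n_sem = 200, 8
sem_features = rng.normal(size=(len(captions), n_sem))  # stand-in text embeddings
true_map = rng.normal(size=(n_sem, n_voxels))           # assumed semantics -> voxel mapping

def simulate_brain(sem_vec, noise=0.1):
    """Simulate a noisy 'brain recording' linearly driven by semantic content."""
    return sem_vec @ true_map + noise * rng.normal(size=n_voxels)

# Training data: many noisy recordings with known semantic targets.
X = np.vstack([simulate_brain(sem_features[i % len(captions)]) for i in range(300)])
Y = np.vstack([sem_features[i % len(captions)] for i in range(300)])

# Linear decoder via ridge regression: brain activity -> semantic features.
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(n_voxels), X.T @ Y)

# Decode a new recording, then "align": pick the best-matching caption by cosine similarity.
test_recording = simulate_brain(sem_features[0])
decoded = test_recording @ W
sims = sem_features @ decoded / (
    np.linalg.norm(sem_features, axis=1) * np.linalg.norm(decoded))
print(captions[int(np.argmax(sims))])  # the caption whose semantics align best
```

The real system generates free-form text rather than retrieving from a fixed list, and decodes from tens of thousands of voxels, but the core pipeline, linear decoding into a semantic space followed by alignment with text, is the same in spirit.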

Alex Huth, a computational neuroscientist at the University of California, Berkeley, said the model could predict in "surprising detail" what a person was looking at. "It's hard to do. It's amazing how much detail you can get."

“Surprising” is one way to describe it. “Exciting” might be another. However, for some, the first thing that comes to mind may be “scary.”

Mind captioning uses fMRI and artificial intelligence, and the participants are fully aware of what is happening. Nor do the researchers think the technology can accurately tell what people are thinking. "No one has yet proven that you can do this," Professor Huth said.

It’s this “yet” that worries me.

Dr. Kinstler points out that that's not all we need to worry about: "Advances in optogenetics, a scientific technique that uses light to stimulate or inhibit individual genetically modified neurons, could allow scientists to 'write' into the brain, potentially changing human understanding and behavior."

“What’s coming is the integration of artificial intelligence and neurotechnology into our everyday devices,” Nita Farahany, a professor of law and philosophy at Duke University who studies emerging technologies, told Dr. Kinstler. “Basically, what we're looking at is direct brain interaction with artificial intelligence. These things are going to be everywhere. It could mean that your sense of self is essentially overridden.”

Are you worried now?

Dr. Kinstler notes that several countries (not including the United States, of course) have passed neuroprivacy laws. California, Colorado, Montana, and Connecticut have passed neural-data privacy laws, but the Future of Privacy Forum details how each state's law differs, and there's not even consensus on what exactly "neural data" is, much less how best to protect it. As always, the technology is advancing far faster than the regulations.

“While many are concerned about technology that can 'read minds,' such a tool itself does not currently exist, and in many cases non-neural data can reveal the same information,” wrote Jameson Spivack, deputy director of artificial intelligence at FPF. “Thus, focusing too narrowly on 'thoughts' or 'brain activity' may exclude some of the most sensitive and intimate personal characteristics that people want to protect. In finding the right balance, legislators should be clear about the potential uses or outcomes they want to focus on.”

In other words, we don't even have a good definition of the problem yet.

Dr. Kinstler describes how people have been talking about this issue for decades, with little legislative or regulatory progress to show for it. We may be at the point where the debate is no longer academic. Professor Farahany warns that having control over one's own thoughts and feelings "is a prerequisite for any other concept of freedom, because if the scaffolding of thought itself is manipulated, disrupted, interfered with, then any other way in which you exercise your freedom is meaningless because then you are no longer a self-determining person."

In the United States of 2025, that does not seem like an idle threat.

————

In this digital world, we have gradually given up our privacy. Our emails aren't private? Oh, well. Big Tech tracks our shopping behavior? Hey, we might get better offers. Social media mines our data to better manipulate us? Yes, but think of the followers we might gain. Surveillance cameras track our every move? We need them to fight crime!

We complain, but most of us accept these (and other) losses of privacy. When it comes to the possibility of technology reading our thoughts, though, much less directly manipulating them, we cannot afford to be so complacent.

Kim is a former e-marketing executive at a major Blues plan, editor of the late & lamented Tincture.io, and now a regular THCB contributor.
