Thursday, July 15, 2021

Update: Brain-machine interface technology

Brain-machine interface (BMI) technology is an area of long-term personal interest. BMI tech links brains with machines, usually by implanting electrodes into the brain and routing the electrical signals they pick up to computers that analyze those signals and translate them into coherent speech or mind-controlled machine movement. In essence, the technology fuses aspects of consciousness or mind with machines. The point is to allow people who cannot speak or move to do so through machines. Progress in this area is slow and incremental.

Part of the interest in BMI tech is looking for hints about the nature and biological basis of consciousness, and possible insights into the centuries-old mind-body problem. Depending on the expert one listens to, the mind-body problem is either one of the hardest, most complex problems that humans have ever attempted to solve, or it is just a matter of figuring out how to read electrical signals in the brain. Incremental advances in BMI tech strike me as generally falling in the "figuring out how to read signals" category, but maybe we still don't fully understand the problem. Despite several decades of research in this area, BMI tech is still in its infancy. There could still be major surprises along the way.

A recent article in the New England Journal of Medicine describes another incremental advance. Scientists implanted an electrode array into the sensorimotor cortex of a person with anarthria (the loss of the ability to articulate speech). They used the array to record 22 hours of the patient's brain activity while he attempted to say individual words from a set of 50 words. Deep-learning algorithms created computational models for detecting and classifying words from electrical patterns in the recorded cortical activity. Those word models were combined with a natural-language model that generates the probability of each next word given the preceding words in a sequence, and the combination was used to decode full sentences as the patient tried to say them. The electrodes transmitted brain signals to a computer that analyzed them and displayed the intended words on a computer screen.
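To make the decoding idea concrete, here is a minimal sketch, not the study's actual code: a word classifier assigns a probability to each vocabulary word from a window of neural signal, a language model supplies the probability of each word given the words decoded so far, and a beam search combines the two to pick the most likely sentence. All names, probabilities, and the fake "neural" inputs below are invented for illustration.

```python
import math

VOCAB = ["i", "am", "not", "thirsty", "no"]  # stand-in for the 50-word set

def classifier_probs(signal_window):
    """Hypothetical stand-in for the deep-learning word classifier.
    Real systems map neural features to a probability per vocabulary word."""
    # Pretend the signal window directly encodes noisy scores for each word.
    scores = [signal_window.get(w, 0.01) for w in VOCAB]
    total = sum(scores)
    return {w: s / total for w, s in zip(VOCAB, scores)}

def lm_prob(word, history):
    """Hypothetical bigram-style language model: P(word | previous word)."""
    bigrams = {("no", "i"): 0.5, ("i", "am"): 0.8, ("am", "not"): 0.6,
               ("not", "thirsty"): 0.7}
    prev = history[-1] if history else None
    return bigrams.get((prev, word), 0.05)

def decode(signal_windows, beam_width=3):
    """Beam search over sentences: score = sum of log P_classifier + log P_LM."""
    beams = [([], 0.0)]  # (decoded words so far, log-probability)
    for window in signal_windows:
        probs = classifier_probs(window)
        candidates = []
        for words, logp in beams:
            for w in VOCAB:
                score = logp + math.log(probs[w]) + math.log(lm_prob(w, words))
                candidates.append((words + [w], score))
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = candidates[:beam_width]  # keep only the best partial sentences
    return beams[0][0]

# Fake "neural" windows, one per attempted word, biased toward the truth.
windows = [{"no": 5.0}, {"i": 4.0}, {"am": 4.0}, {"not": 3.0, "no": 2.0},
           {"thirsty": 5.0}]
print(" ".join(decode(windows)))  # -> "no i am not thirsty"
```

The key design point is the one the paper's pipeline illustrates: even a noisy classifier can yield usable sentences once a language model's expectations about word order are folded into the search.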

The researchers reported their results as follows: 
We decoded sentences from the participant’s cortical activity in real time at a median rate of 15.2 words per minute, with a median word error rate of 25.6%. In post hoc analyses, we detected 98% of the attempts by the participant to produce individual words, and we classified words with 47.1% accuracy using cortical signals that were stable throughout the 81-week study period.
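For readers unfamiliar with the metric in that quote, word error rate is the word-level edit distance (substitutions, insertions, and deletions) between the decoded sentence and the intended one, divided by the number of intended words. A small self-contained sketch, not from the study:

```python
def word_error_rate(reference, hypothesis):
    """Word-level Levenshtein distance divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edits to turn first i reference words into first j hypothesis words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i                      # i deletions
    for j in range(len(hyp) + 1):
        d[0][j] = j                      # j insertions
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + sub)  # substitution or match
    return d[len(ref)][len(hyp)] / len(ref)

# One wrong word out of five -> 20% word error rate.
print(word_error_rate("no i am not thirsty", "no i am not hungry"))  # 0.2
```

By that measure, the reported 25.6% means roughly one word in four was wrong, which is why the language-model layer described below matters so much.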

[Image: the patient chatting through his BMI set-up]


A New York Times article elaborates on what is going on here.
In nearly half of the 9,000 times Pancho [the patient] tried to say single words, the algorithm got it right. When he tried saying sentences written on the screen, it did even better.

By funneling algorithm results through a kind of autocorrect language-prediction system, the computer correctly recognized individual words in the sentences nearly three-quarters of the time and perfectly decoded entire sentences more than half the time.

“To prove that you can decipher speech from the electrical signals in the speech motor area of your brain is groundbreaking,” said Dr. Fried-Oken, whose own research involves trying to detect signals using electrodes in a cap placed on the head, not implanted.

After a recent session, observed by The New York Times, Pancho, wearing a black fedora over a white knit hat to cover the [electrode] port, smiled and tilted his head slightly with the limited movement he has. In bursts of gravelly sound, he demonstrated a sentence composed of words in the study: “No, I am not thirsty.”

In interviews over several weeks for this article, he communicated through email exchanges using a head-controlled mouse to painstakingly type key-by-key, the method he usually relies on.

The brain implant’s recognition of his spoken words is “a life-changing experience,” he said.

“I just want to, I don’t know, get something good, because I always was told by doctors that I had 0 chance to get better,” Pancho typed during a video chat from the Northern California nursing home where he lives.

Later, he emailed: “Not to be able to communicate with anyone, to have a normal conversation and express yourself in any way, it’s devastating, very hard to live with.”

Context
This is another example of machines being able to read and translate brain signals into some form of coherence that other minds can receive and understand. Past BMI tech accomplishments include mouse-to-mouse communication over the internet about how to navigate a maze to get to food. In that study, one mouse was in Brazil and the other in the US; their brains were linked by signals transmitted from one brain to the internet, and then from the internet to the other brain.

Another BMI increment was getting a fully paralyzed person to successfully fly a modern jet fighter simulator (an F-35, I think) through BMI tech. A US military program attempted, with some success, to send commands from a human brain to a rat brain, an attempt to weaponize the rodents for use in armed conflicts. In another research project, limited human-to-human brain (mind?) communication was accomplished using magnetic pulses that were recorded, converted to electrical signals, and decoded by computers.

Clearly, this technology is still both complex and primitive. The computer algorithms have to teach themselves how to read brain signals, an accomplishment that appears to be beyond the ability of the human mind alone, maybe because the signal-to-noise ratio is too low for humans to work with unaided. Progress just inches forward.

Despite that, I am aware of no limits on how far BMI tech can go. Decoding brain signals takes a lot of computing power and sophisticated programming, but there seems to be enough of both so far. Until some kind of technological brick wall is hit, continued slow progress can be expected for the foreseeable future.

The fascinating question remains unanswered: Is the brain the same thing as the mind (or consciousness, intelligence, or sentience), or is there something more to it? So far, all the BMI data seems to be compatible with brain = mind, or maybe brain + CNS + PNS = mind.[1] Either one would be a possible solution to the old mind-body problem. As data slowly accumulates, the room for anything other than the body being needed to produce mind gets smaller and smaller. Room for a God of the gaps seems to be decreasing as knowledge increases.


Footnote: 
1. CNS = central nervous system; PNS = peripheral nervous system

It is possible that brain + CNS + PNS + all or nearly all other cells and tissues = mind, or maybe even brain + CNS + PNS + all or nearly all other cells and tissues + other people and the environment = mind.

For example, an amputee sometimes feels pain from a limb that has been amputated (phantom limb pain). The brain is heavily and intimately connected to almost the entire body, and it can 'remember' something that just isn't there. And humans are inherently social creatures: social structures and institutions can and do shape or control how we perceive and think about reality. What are the physical-biological-social limits of the mind, e.g., just the brain, or something(s) more than that? Is the question even answerable?

Also note that the brain isn't just neurons. There are other cells there, such as glial cells, that can and do modulate neuron activity, and presumably that affects (is part of?) the mind too.


[Image: the CNS in yellow, the PNS in blue; humans are complicated little machines]

