Arguing for Brain Machine Interfaces
Written for Pi newspaper
Amped up on sci-fi films ranging from epics like Solaris to big-time Hollywood hits like Inception and Pacific Rim, we can find it hard to distinguish fact from fiction. Yet interfacing devices with our brains to manipulate our environment is becoming less of a dream and more a way to capture capabilities otherwise inconceivable: bringing functionality back to damaged bodies and bypassing the need for sweat and sinew.
At the start of the millennium, we appeared to be on the cusp of developing brain machine interfaces (BMIs) not only for amputations, but for severe paralyses such as quadriplegia or amyotrophic lateral sclerosis (ALS). With the prospect of reading neural activity directly from the cortex, the new field of neural engineering took shape.
The pairing is by no means a foreign idea: neuroscience and engineering have long lived in friendly closeness. From Turing’s and John von Neumann’s wartime aspirations of artificial brains to Norbert Wiener’s formulation of cybernetics ("the scientific study of control and communication in the animal and the machine"), information theory and artificial intelligence have long been brought into the gap between biology and mechanics.
Today’s brain machine interfaces (while still extremely limited and generally difficult: read my older post here) range from sensors that pick up body motion (magnetic, infra-red, camera, force sensor, accelerometer, switch) to devices that take signals directly from the brain. We’ve moved from the external electroencephalographic (EEG) nets of the 90s to the surgery-intensive electrocorticography (ECoG) used with epilepsy patients since 2004. Direct interfaces have improved to the point where not only can monkeys control robots, but human patients implanted with microelectrodes can control complex robot arms from a small sample of recorded neurons.
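As a rough illustration of what "taking signals from the brain" involves at the simplest level, here is a minimal sketch (in Python, with illustrative numbers throughout) of detecting imagined movement from a single EEG channel: band-pass filter the 8–30 Hz motor rhythms and flag the drop in band power that accompanies motor imagery. The sampling rate, band edges and threshold are all assumptions for the sake of the example.

```python
# Minimal sketch of one step in an EEG-based interface: isolate the
# 8-30 Hz band (mu/beta rhythms over motor cortex) and flag the drop
# in band power that accompanies imagined movement.
# All numbers here (sampling rate, band edges, threshold) are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # assumed sampling rate in Hz

def band_power(eeg, lo=8.0, hi=30.0):
    """Band-pass filter a single-channel EEG trace and return its mean power."""
    b, a = butter(4, [lo / (FS / 2), hi / (FS / 2)], btype="band")
    filtered = filtfilt(b, a, eeg)
    return np.mean(filtered ** 2)

def detect_intent(eeg_window, baseline_power, drop_ratio=0.6):
    """Flag imagined movement when band power falls below a fraction of
    the resting baseline (so-called event-related desynchronisation)."""
    return band_power(eeg_window) < drop_ratio * baseline_power
```

Real systems add many channels, artefact rejection and a trained classifier on top, but the core idea of turning a continuous brain signal into a discrete control decision is this simple.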
The risks of cranial surgery remain high, however: implanting subdural or epidural ECoG arrays can result in haemorrhage, cerebral infarction, oedema or subdural haematoma. As a result, procedures that keep dangerous sharp instruments away from our prized organ are preferred, such as surgery that reroutes nerves into muscles for amplification and subsequent interfacing. With schemes such as Robots for Humanity, paralysed patients can gain independence by training and operating robots for everyday needs such as shaving and lifting items, and can fly quadrotors to explore their environment using sensors attached to a small part of their body.
Such interfaces monitor some external physical output, as in Stephen Hawking’s speech method: the twitch of a cheek selects letters. Although understandably more laborious than a brain interface, the technology is more reliable and better understood. To illustrate: we can send electrical signals to a brain implant as a ‘feedback’ mechanism, but we can’t be sure the brain will interpret them correctly. We understand skin feedback far better: body interfaces can ‘play back’ pressures, temperatures or textures onto a sensitive area, allowing fine-precision control of robotics. Using such methods, a robot arm can pick up something as heavy as a chair or as delicate as a light bulb.
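To make the cheek-twitch scheme concrete, here is a toy sketch of a single-switch scanning speller: the system cycles through rows of letters, then through letters within the chosen row, and one binary input makes each selection. The real switch sensor is simulated here with keyboard input; the layout and prompts are invented for illustration.

```python
# Toy sketch of a single-switch scanning speller, the scheme behind
# cheek-twitch interfaces like Hawking's: highlight rows, then letters,
# and let a single binary input (the twitch) make each choice.
ROWS = ["ABCDEF", "GHIJKL", "MNOPQR", "STUVWX", "YZ _"]

def wait_for_switch(prompt):
    """Stand-in for the real switch sensor: typing anything counts as a
    twitch, pressing Enter alone counts as 'no twitch'."""
    return input(prompt + " [anything+Enter = select, Enter = skip] ") != ""

def scan_select(options):
    """Cycle through the options until the switch fires; return the choice."""
    while True:
        for option in options:
            if wait_for_switch(f"> {option}"):
                return option

def spell_one_letter():
    row = scan_select(ROWS)        # first pass: pick a row
    return scan_select(list(row))  # second pass: pick a letter in it

if __name__ == "__main__":
    print("Spelled:", spell_one_letter())
```

Two switch activations per letter is slow, which is exactly why such systems lean on word prediction, and why a richer brain-side signal is so appealing.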
With visual and mental interfacing of this kind, new jobs open up in new market niches. Setting aside the spying applications, interfacing with aerial drones (essentially scaled-up quadrotors, similar to those proposed by Amazon’s aerial delivery team) could lead to a host of possibilities: ‘postwomen/men’ delivering to parts of the world that lack roads (a problem affecting roughly 1 billion people each year), or ecological monitoring in sensitive zones.
BMIs are not only for extending functionality or shaping our future; they can also take back what we’ve lost, through acts such as rehabilitation. Stroke patients with hand weakness have been found to improve substantially with BMI assistance. Meanwhile, assistive devices such as speech synthesizers restore the essential act of communicating: whether through a device similar to Hawking’s or via implanted electrodes, they give back a person’s voice.
We’re constantly improving, and growth in engineering is keeping pace with the biology. With electrodes integrated into the speech-related motor cortex, neurites grow over implanted electrode cones, allowing signals to be picked up. Neural decoding for eventual speech synthesis is a learning process for both machine and human, transmitting signals that can be interpreted into words. We have come a long way in a short time, to an age where we realise the scifi dreams of our youth aren’t completely off the mark, but we’re still far from soaring with our imagination.
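For a flavour of what the machine half of that learning process looks like, here is a minimal sketch of a linear decoder: fit a least-squares map from neural firing rates to an output variable (a 2-D velocity here stands in for an articulatory or speech feature). The data are synthetic; real decoders are trained on recorded sessions and use far more sophisticated models.

```python
# Minimal sketch of the 'machine' half of neural decoding: fit a linear
# map from neural firing rates to an output variable.
# The data below are synthetic, generated purely for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_neurons = 500, 30

true_weights = rng.normal(size=(n_neurons, 2))  # hidden tuning of each neuron
rates = rng.poisson(5.0, size=(n_samples, n_neurons)).astype(float)
velocity = rates @ true_weights + rng.normal(scale=2.0, size=(n_samples, 2))

# Least-squares decoder: a weight matrix mapping firing rates to velocity.
decoder, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

decoded = rates @ decoder
r = np.corrcoef(decoded[:, 0], velocity[:, 0])[0, 1]
print(f"Decoded vs. actual correlation: {r:.2f}")
```

The human half of the loop then adapts to the decoder in turn, which is why performance keeps improving with practice.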
Refs (linked in text)
https://www.expert-reviews.com/doi/pdf/10.1586/erd.13.3
https://www.nature.com/nature/journal/v485/n7398/abs/nature11076.html
https://www.willowgarage.com/blog/2011/07/13/robots-humanity
https://onlinelibrary.wiley.com/doi/10.1002/ana.23879/abstract
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2944670/
https://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0008218
https://www.huffingtonpost.co.uk/2013/12/02/amazon-drones-prime-air_n_4370340.html