The earliest computers could not have been further removed from our bodies. These hulking beasts had to be housed in buildings that took up a city block, and their vacuum tubes were about as far a cry from the neurons that make up our nervous system as you could imagine.
Today things are very, very different. Computer chips contain many billions of transistors, the electrical components that let a chip perform computations. Current commercial chips are built with transistor features on the order of 14 nanometers, which is already smaller than many of the cellular components of the brain.
The bottom line is that our electronic computing technology is now at the point where it is small and power-efficient enough to be implanted in a human brain. The challenge now is to get the chips and neurons to talk to each other. This is called a “brain-computer interface” (BCI) and one day soon it may even replace your smartphone or computer.
Enter the Matrix
Cyberpunk fiction and movies like The Matrix have shown us worlds where people have ports on the backs of their necks and plug directly into computer systems. There’s no keyboard, no screen, and no separation between the user’s thoughts and the computer acting on them.
It sounds like science fiction, but the principle is sound. Your brain is an electrochemical device and responds to electrical stimulation. We’ve used direct electrical stimulation in a crude way to map out the different functions of brain areas and to measure and record neural activity.
Speaking the Lingo
One of the key problems with making computers and human brains talk to each other directly is the fact that computers speak in binary code and neurons speak in, well, something else.
The brain is not digital, it’s not electronic, and we have a long way to go before we can really understand what’s going on at a very low level.
This is a pretty healthy area of research, and some amazing breakthroughs have happened in the last few years. Notably, Dr. Yang Dan and her team have done incredible work on neural decoding. In a groundbreaking study all the way back in 1999, her group recorded neural signals from a cat’s visual pathway (the lateral geniculate nucleus) and translated them into digital images.
If we can learn to understand what neurons are actually saying, it means we can create computer hardware and software that can do something with that information. It also means that we can talk back, which opens a whole other set of possibilities.
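To make the decoding idea a bit more concrete, here is a toy sketch of linear stimulus reconstruction, the general approach behind that style of neural decoding work. Everything here is invented for illustration: the "receptive fields," the firing rates, and the tiny 16-pixel "image" are all simulated, not real neural data.

```python
import numpy as np

rng = np.random.default_rng(0)

n_pixels = 16        # a tiny 4x4 "image" flattened to a vector
n_neurons = 40       # simulated recorded neurons

# Assume each neuron responds linearly to the stimulus through a
# receptive field (a huge simplification of real neural coding).
receptive_fields = rng.normal(size=(n_neurons, n_pixels))

stimulus = rng.normal(size=n_pixels)          # the "true image"
rates = receptive_fields @ stimulus           # idealized firing rates
rates += 0.1 * rng.normal(size=n_neurons)     # measurement noise

# Decoding: invert the encoding model by least squares to
# reconstruct the stimulus from the recorded rates alone.
reconstruction, *_ = np.linalg.lstsq(receptive_fields, rates, rcond=None)

error = np.linalg.norm(reconstruction - stimulus) / np.linalg.norm(stimulus)
print(f"relative reconstruction error: {error:.3f}")
```

With more neurons than pixels and only modest noise, the reconstruction comes out close to the original stimulus; real decoding faces nonlinear, noisy neurons and far higher-dimensional stimuli, which is what makes the problem hard.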
Brain-computer interfaces are essential to another type of technology that can change human lives for the better – powered prosthetics. While I have a page dedicated to powered prosthetics, I’ll briefly mention them here too.
Powered prosthetics are artificial limbs that use electronics and electrical power to do their job. Common production models are myoelectric, which means that they take the electrical signals from residual muscles and use them to move the limb itself.
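The basic control logic of a myoelectric limb can be sketched in a few lines: rectify the raw muscle (EMG) signal, smooth it into an amplitude envelope, and trigger a movement when the envelope crosses a threshold. The signal, window size, and threshold below are all made up for illustration, and real controllers are far more sophisticated.

```python
import numpy as np

def emg_envelope(raw_emg, window=50):
    """Rectify the signal and take a moving average to get its envelope."""
    rectified = np.abs(raw_emg)
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

def hand_command(envelope, threshold=0.3):
    """Map envelope amplitude to a simple open/close command."""
    return np.where(envelope > threshold, "close", "open")

# Simulated EMG: quiet baseline, a muscle contraction, then quiet again.
rng = np.random.default_rng(1)
quiet = 0.05 * rng.normal(size=200)
burst = 1.0 * rng.normal(size=200)
raw = np.concatenate([quiet, burst, quiet])

commands = hand_command(emg_envelope(raw))
print(commands[100], commands[300], commands[500])  # open close open
```

Production devices add pattern recognition across multiple electrode sites so that different contractions select different grips, but thresholding an envelope is the core idea.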
That has been a pretty major breakthrough, but it is now becoming possible to take signals from the brain itself to move robotic limbs. In 2012, a quadriplegic woman was fitted with a brain-computer interface system that let her move a robotic arm using only signals from her brain. Since she is paralyzed from the neck down, there are no muscle signals for a myoelectric device to read. Using the system, she can feed herself and perform many other tasks the rest of us take for granted.
That Computer Feeling
What we haven’t quite yet figured out is how to send sensory information back into the mind directly. While scientists have been able to reroute residual nerves from a limb stump to get sensory information, going directly to the brain is not yet an option.
Hear No Evil, See No Evil
There have been some successes in sensory transmission close to, but not quite into, the brain. Cochlear implants are practical devices that bypass the outer and middle ear and directly stimulate the inner mechanisms of hearing. We also now have visual prostheses that bypass the eye’s damaged photoreceptors and stimulate the retina or optic nerve directly.
Of the two, cochlear implants are much further along. Retinal implants still lack visual resolution, but now that the concept has been proven practical, it’s a matter of improving and refining the technology. Between the work that people like Yang Dan have done and the devices being developed to help people with sensory disabilities, it may not be long before sounds and images are sent directly to our auditory and visual cortices.
Making the Connection
Think of how close we are to our modern computing technology. Smartphones are pretty much the standard now. People lean on them heavily for just about everything. Wearable computers have now also started to emerge in the commercial market. Virtual reality and augmented reality headsets can now be bought anywhere. You can get smartwatches and fitness monitors with equal ease.
The next step is for these technologies to move inside our bodies. In the context of this article, they could move into our brains. Imagine that instead of augmented-reality glasses, you one day have a tiny chip inside each eye that overlays digital information on your field of view – or perhaps a chip in your visual cortex itself.
These mind-machine interfaces may spell the end of all other input and output devices such as screens, keyboards, and mice.
Present-day augmented reality technology, such as the Microsoft HoloLens, is already solving the problem of what to display within the visual field; direct brain-computer interfaces will simply provide a new channel to relay that information, bypassing the eyes entirely.
Be the Machine
Another interesting application of this technology may be to make virtual reality as real as anything else. If BCI technology can also send information to the sensory parts of the brain we may gain the ability to slip into a virtual body that moves and feels without our real bodies having to do the same.
Current-generation virtual reality requires that you strap sensors to your body and walk on a treadmill or otherwise control your VR body with your real one. When we dream, however, we are in a sort of neural VR where our real bodies don’t move but our dream bodies can run, jump, and feel in any way they want to. A BCI could use the same mechanism to allow for fully immersive VR, or AR objects that you can feel, smell, and taste.
If you can fully feel a virtual body, why not another physical body? Imagine being able to tele-operate a robotic frame on another planet (with some time delay if you’re still on Earth), at the bottom of the ocean, or inside a disaster zone. What about embodying a car or plane? BCIs might provide the best control system yet for a range of machines, where it really feels like the vehicle is your own body.
Human surgeons of the future may take direct control of robotic surgery systems such as the present-day da Vinci Surgical System. Instead of using a physical haptic-feedback control console, they would use the robot’s limbs just as they use their own.
Into the Brain
Obviously, despite all the amazing things BCIs could do for us, there are more than a few potential pitfalls too.
There is no network-connected system that can’t be hacked; the only thing that changes is how hard that hacking is. Can you imagine having your sight hacked? Being spammed directly to your brain? For BCIs to be safe there has to be some sort of failsafe system in place.
We also still don’t know what the long-term effects of brain implants will be at scale. Current systems give us a pretty good idea, but when the technology becomes mainstream and billions of people have BCIs, new diseases or disorders may emerge as a result.
There’s also the issue of upgrades. How exactly will hardware that’s inside your cranium be accessed? How and when will it be installed? Will some people be stuck with obsolete technology in their heads for the rest of their lives?
Resistance is Futile
I don’t think the widespread use of BCIs is so much a question of “if” but rather of “when”. As soon as the technology is feasible, it will be too useful to ignore. All that remains to be seen is exactly how our brains will be augmented one day – for good or ill.