A brain-computer interface (BCI) allows people to use their thoughts to control not only themselves but the world around them. Every action our body performs begins with a thought, and with every thought comes an electrical signal. These electrical signals can be picked up by a BCI, consisting of an electroencephalograph (EEG) or an implanted electrode, translated, and then sent to the performing hardware to produce the desired action.
BCI is an alternative system built on artificial mechanisms and acts as a bridge between the brain and external devices. The aim of BCI is to convey human intentions to external devices by directly extracting brain signals. Some prototypes can translate brain activity into text or instructions for a computer, and in theory, as the technology improves, we’ll see people using BCIs to write memos or reports at work. In recent years, advances in machine learning (ML) have enabled the development of more advanced BCI spellers, devices that allow people to communicate with computers using their thoughts. In the next few years, we might be able to control our PowerPoint presentation or Excel files using only our brains.
Eventually, the brain and computers will become highly integrated, as BCIs enable bidirectional communication between a brain and an external device. Bidirectional communication generally includes both direct neural readout with feedback and direct neural write-in. By providing direct communication between the brain and external devices, BCI technology will enable new ways for humans to interact with their devices.
Researchers are now taking brain and computer integration to the next level, employing brain-computer interface (BCI) technology for synthetic telepathy, or mind-to-mind communication. People currently use written, verbal, non-verbal, and even emoji communication in their day-to-day activities, but researchers are working toward a future that could bypass all of these to communicate brain-to-brain.
BCIs will be networked together for direct messaging between brains, a form of computer-assisted telepathy. This brain-to-brain communication is enabled by a brain-to-brain interface, or BBI. A brain-to-brain interface records the signals in one person’s brain and then sends these signals through a computer in order to transmit them into the brain of another person. This process allows the second person to “read” the mind of the first or, in other words, have their brain fire in a similar pattern to the original person’s.
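The record → translate → stimulate loop described above can be sketched as a minimal pipeline. Everything below is a hypothetical illustration, not a real BCI API: the class names, the power-threshold decoder, and the stimulation commands are all assumptions made for the sake of the sketch.

```python
# Minimal sketch of a brain-to-brain interface (BBI) loop.
# All names, thresholds, and commands are hypothetical illustrations.

from dataclasses import dataclass
from typing import List

@dataclass
class EEGSample:
    """One windowed EEG reading from the sender (microvolts)."""
    values: List[float]

def decode_intent(sample: EEGSample, threshold: float = 10.0) -> int:
    """Translate raw activity into a 1-bit intent: 1 if the mean
    power of the window exceeds a calibration threshold, else 0."""
    mean_power = sum(v * v for v in sample.values) / len(sample.values)
    return 1 if mean_power > threshold else 0

def stimulate_receiver(bit: int) -> str:
    """Map the decoded bit onto a stimulation command for the
    receiver-side device (e.g. a TMS pulse that evokes a phosphene)."""
    return "TMS_PULSE" if bit == 1 else "NO_PULSE"

# One pass through the loop: sender EEG -> bit -> receiver stimulus.
window = EEGSample(values=[4.0, -5.0, 3.5, -4.5])
bit = decode_intent(window)
command = stimulate_receiver(bit)
print(bit, command)  # prints: 1 TMS_PULSE
```

In a real system the decoding step is a trained classifier rather than a fixed threshold, and the stimulation side is calibrated per person, but the one-bit-per-decision structure is the same one the experiments below rely on.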
We would be able to send messages by our thought processes alone. Facebook chief executive Mark Zuckerberg said that he envisions a world where people — presumably Facebook users — don’t need these types of communication intermediaries. Instead, they’ll communicate brain-to-brain, using telepathy.
Experts predict that this brain-to-computer communication could lead toward collective intelligence. Dr. Michio Kaku, a theoretical physicist, speaks of recent tests conducted by neurologists in which thoughts could be projected as images in digital format. Although these pictures are rather crude, they offer a glimpse of what we are dealing with: the magnitude of what the human mind can perform.
However, there are many challenges before telepathy becomes a reality. So far, we still can’t transmit complex ideas between people, mainly because we still don’t know how the brain encodes complex ideas. Weird as it may sound, science still can’t explain consciousness, or the particular brain cells and their firing patterns that make up each individual thought. This is what’s limiting how far we can push this technology.
Practical direct brain-to-brain communication currently exists at a simple ‘Morse code’ level. It has been demonstrated that a subject using a non-invasive BCI can transmit a thought-generated stimulus over the internet to a second BCI-equipped operator, where the data is translated into a stimulus projected into that subject’s visual cortex.
In one experiment, the data was interpreted by the receiver as flashes of light within the peripheral vision. The first operator could control the transmissions and successfully sent messages to the second through this method. Other civilian institutions have shown that not only intended speech, but covert speech or ‘the inner voice’, could also be detected by BCI, and be translated by software.
At the University of California at Berkeley, a team of cognitive scientists have managed to reconstruct clips of movies their subjects were watching, based solely on measurements of their brainwaves. “You could not see the close-up details,” wrote the theoretical physicist Michio Kaku after watching one of the “movies,” “[but] you could clearly identify the kind of object you were seeing.”
“‘Telepathy’ technology remains so crude that it’s unlikely to have any practical impact,” wrote Mark Harris at the MIT Technology Review. Practical brain-to-brain communication is still many years away, as it requires advances in many technologies, including detailed maps of the brain, efficient encoders of brain thoughts, and wearable computer-to-brain interfaces (CBIs).
Brain-to-brain communication also raises privacy concerns: someone could read your mind, send disturbing or abusive thoughts to you, or, worse still, hack your brain. The military is also interested in applying these new tools for psychological warfare.
Brain-to-Brain Communication Progress
Telepathy in Rats
Back in 2013, the first study in which two brains were successfully joined to collaborate and complete a task was published in Scientific Reports. First, Miguel Pais-Vieira and his colleagues trained rats to perform a basic task: the animals were trained to press one of two levers, with the correct lever signaled with a light. The correct choice gave them access to water. Once the rats could successfully complete this task four out of five times, they were assigned as either the encoder—the one sending signals—or the decoder, the one receiving them.
Scientists at Duke University fitted invasive implants consisting of microelectrode arrays into the brains of each of two lab rats. The two implants were connected via computers: one microelectrode array served as the encoder, transmitting brain impulses from one rat, while the other served as the decoder of those impulses. When one of the rats was taught to press one of two levers, the second rat, which had not been trained, also seemed to know which lever to push after it received neural signals from the first rat via the implant.
Encoder rats were surgically implanted with recording wires that measured activity in the motor areas of their brain, while decoder rats were implanted with stimulating wires in the same area. Each one was kept in a separate container, and only the encoder rats were shown the light signal on the levers. As the encoder rats chose a lever, neurons in their brain started firing.
The BBI recorded this activity, transformed it, and used it to stimulate an equivalent pattern in the brain of the decoder rat. The decoder rat had to correctly press a lever based on this stimulation. (Water was only given if both animals pushed the right lever.) The researchers found that both rats pushed the correct lever 62 percent of the time, better than chance probability.
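Whether 62 percent really beats the 50 percent chance level depends on how many trials were run, which can be checked with a one-sided binomial test. The trial count below is an assumed example for illustration, not the study's actual figure.

```python
# One-sided binomial test: probability of scoring >= 62% correct by
# pure 50/50 guessing. The trial count is an assumption for illustration.
from math import comb

def p_at_least(successes: int, trials: int, p: float = 0.5) -> float:
    """P(X >= successes) for X ~ Binomial(trials, p)."""
    return sum(comb(trials, k) * p**k * (1 - p)**(trials - k)
               for k in range(successes, trials + 1))

trials = 100                      # assumed number of lever-press trials
successes = round(0.62 * trials)  # 62% observed correct
print(p_at_least(successes, trials))  # ~0.01: unlikely to be luck
```

With 100 assumed trials the chance of hitting 62 percent by guessing is about one in a hundred, so a result like this clears the usual significance bar; with only a handful of trials it would not.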
Brain-to-Brain Communication in Humans
In November 2014, the first real-time BBI for humans was developed by Rajesh Rao and colleagues at the University of Washington. The human device was non-invasive, meaning surgery wasn’t required. This device transferred movement signals from the encoder straight to the motor area of the decoder’s brain. In the study, Rao and his team used electroencephalography (EEG), placing recording electrodes on the scalp of the encoding person. The scientists then used transcranial magnetic stimulation (TMS) on the decoding person’s brain, sending little magnetic pulses through their skull to activate a specific region of the brain. This caused the second person to take the action that the first person intended, for example, to press a button.
But, cool as this sounds, there was a major limitation to the study. The decoder wasn’t consciously aware of the signal they received. They weren’t able to actively process the incoming neural information—meaning only movement was transferred, not thoughts. Instead, their hand simply moved when stimulated, as though a puppeteer was controlling their limbs.
Fortunately, a study using BBIs to transfer information between people swiftly followed. The same researchers at the University of Washington then designed a game with pairs of participants, similar to 20 Questions. In the game, the encoder was given an object that the decoder wasn’t familiar with. The goal was for the decoder to successfully guess the object through a series of yes-or-no questions. But unlike in 20 Questions, the encoder responded by looking at one of two flashing LED lights, one signifying yes and the other no. The visual response generated in the encoder’s brain was then transmitted to the visual areas of the decoder’s brain.
Experiments in direct brain-to-brain communication have also shown that stimuli for intended motor function can be transmitted successfully. In one trial, two subjects cooperatively played a video game in which one BCI operator observed the screen while the second, in a different location, operated the game controller. The first operator controlled the hand movements of the second in order to interact with the game, achieving successful signal-transmission-to-motor-response rates of up to 83 percent.
A team of researchers from Starlab in Barcelona, Spain, and from Axilum Robotics in Strasbourg, France, transmitted the words “hola” and “ciao” over the internet: the messages were generated by the brain of a researcher in India and successfully decoded by the brain of another researcher in France, 5,000 miles away.
The researchers utilized completely noninvasive technologies. A standard EEG (electroencephalogram) system from Neuroelectrics served as the brain-computer interface (BCI), converting the sender’s thoughts about moving his hands or feet into a binary message. At the other end, robot-assisted, image-guided transcranial magnetic stimulation (TMS) served as the computer-to-brain interface (CBI). The TMS stimulated a region of the visual cortex, creating flashes of light at the bottom of the receiving researcher’s visual field that allowed him to decode the binary message. The researchers reported an accuracy rate of 85%.
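The word-to-binary step can be illustrated with a toy encoding. The actual experiment used its own scheme; the fixed 5-bit-per-letter mapping below is purely an assumption for illustration, standing in for whatever code the researchers chose.

```python
# Simplified illustration of sending a short word as a binary stream of
# flash / no-flash phosphenes. The 5-bit letter code is an assumption
# for illustration, not the encoding used in the actual study.

ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def encode_word(word: str) -> str:
    """Sender side: each letter -> 5 bits (a=00000 ... z=11001)."""
    return "".join(format(ALPHABET.index(c), "05b") for c in word.lower())

def decode_bits(bits: str) -> str:
    """Receiver side: group the flash (1) / no-flash (0) stream
    back into 5-bit letters."""
    return "".join(ALPHABET[int(bits[i:i + 5], 2)]
                   for i in range(0, len(bits), 5))

msg = encode_word("hola")   # motor-imagery choices become bits
print(msg)                  # prints: 00111011100101100000
print(decode_bits(msg))     # phosphene flashes become text: hola
```

At one bit per motor-imagery decision, even a four-letter word takes twenty separate decisions, which is why these demonstrations are described as operating at a Morse-code level.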
Researchers believe that in the coming decades, it could be used to help stroke victims, extreme paraplegics, and sufferers of ‘locked-in syndrome’ to speak and move again, using their brains to transmit instructions either to other people or to artificial limbs.
The research was partly funded by the European Commission’s Future and Emerging Technologies (FET) program, with a €2.7 million ($3.5 million) grant.
Scientists Have Connected The Brains of 3 People, Enabling Them to Share Thoughts
In October 2019, neuroscientists reported that they had successfully hooked up a three-way brain connection allowing three people to share their thoughts and, in this case, play a Tetris-style game. It works through a combination of electroencephalograms (EEGs), for recording the electrical impulses that indicate brain activity, and transcranial magnetic stimulation (TMS), where neurons are stimulated using magnetic fields.
“We present BrainNet which, to our knowledge, is the first multi-person non-invasive direct brain-to-brain interface for collaborative problem solving,” write the researchers. “The interface allows three human subjects to collaborate and solve a task using direct brain-to-brain communication.” In the experiment set up by the scientists, two ‘senders’ were connected to EEG electrodes and asked to play a Tetris-style game involving falling blocks. They had to decide whether each block needed rotating or not. To do this, they were asked to stare at one of two flashing LEDs at either side of the screen – one flashing at 15 Hz and the other at 17 Hz – which produced different signals in the brain that the EEG could pick up on.
These choices were then relayed to a single ‘receiver’ through a TMS cap that could generate phantom flashes of light in the receiver’s mind, known as phosphenes. The receiver couldn’t see the whole game area, but had to rotate the falling block if a light-flash signal was sent. Across five different groups of three people, the researchers hit an average accuracy level of 81.25 percent, which is decent for a first try. To add an extra layer of complexity to the game, the senders could add a second round of feedback indicating whether the receiver had made the right call. Receivers were able to detect which of the senders was most reliable based on brain communications alone, which the researchers say shows promise for developing systems that deal with more real-world scenarios where human unreliability would be a factor.
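The sender-side trick in BrainNet is a steady-state visual response: staring at an LED flickering at 15 Hz or 17 Hz produces a matching oscillation in the EEG, so the decoder only has to ask which of the two frequencies carries more power. The sketch below illustrates that idea on a synthetic signal; the sample rate, window length, and signal composition are all assumptions for illustration.

```python
# Sketch of telling a 15 Hz gaze from a 17 Hz gaze by comparing EEG
# power at the two LED flicker frequencies. The signal is synthetic;
# sample rate and noise level are assumptions for illustration.
import math

FS = 256          # assumed EEG sampling rate in Hz
DURATION = 2.0    # assumed seconds of signal per decision

def power_at(signal, freq, fs=FS):
    """Single-bin DFT power of `signal` at `freq` Hz."""
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * freq * i / fs)
             for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq * i / fs)
             for i, s in enumerate(signal))
    return (re * re + im * im) / n

def classify(signal):
    """Return the flicker frequency (15 or 17) with more power,
    i.e. which LED the sender was staring at."""
    return 15 if power_at(signal, 15) > power_at(signal, 17) else 17

# Synthetic "EEG": a 17 Hz steady-state response plus slow background.
n = int(FS * DURATION)
eeg = [math.sin(2 * math.pi * 17 * i / FS) +
       0.5 * math.sin(2 * math.pi * 3 * i / FS) for i in range(n)]
print(classify(eeg))  # prints: 17
```

Real pipelines average over electrodes and trials and use harmonics of the flicker frequency as well, but the core decision is this two-frequency power comparison, yielding exactly one bit (rotate or don't) per gaze.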
And while the current system can only transmit one ‘bit’ (or flash) of data at a time, the team from the University of Washington and Carnegie Mellon University thinks the setup can be expanded in the future. “Our results raise the possibility of future brain-to-brain interfaces that enable cooperative problem solving by humans using a ‘social network’ of connected brains,” writes the team.
Military thrust on Telepathy among soldiers
Meanwhile the tools for more invasive—and perhaps more efficient—brain interfacing are developing rapidly. Elon Musk recently announced the development of a robotically implantable BCI containing 3,000 electrodes to provide extensive interaction between computers and nerve cells in the brain. While impressive in scope and sophistication, these efforts are dwarfed by government plans.
One area where BCI technology could prove useful for today’s military personnel is synthetic telepathy among human operators. The Pentagon is spending millions of dollars to develop brain-to-brain communication that may be utilized by soldiers on the battlefield. The U.S. military is developing a chip allowing a high-resolution human/computer interface that would let people connect their brains directly to computers. “The goal is to achieve this communications link in a biocompatible device no larger than one cubic centimeter in size, roughly the volume of two nickels stacked back to back,” a DARPA official said.
In 2008, the US Army gave researchers from UC Irvine, the University of Maryland and Carnegie Mellon University a $4 million grant to create “synthetic telepathy”: a system to compose a message in the brain and send that message to a particular individual or object (like a radio), also just with the power of thought.
In 2009, DARPA’s “Silent Talk” program awarded grants to research institutions to “allow user-to-user communication on the battlefield without the use of vocalized speech through analysis of neural signals,” an application that could greatly facilitate covert communication. An external analysis highlights the potential use of BCI technology to develop shared consciousness within and across units, improve collective awareness of combat challenges, and provide combatants with insights into the perspectives and internal deliberations of multiple operators.
The Defense Advanced Research Projects Agency (DARPA) has been leading engineering efforts to develop an implantable neural interface capable of engaging one million nerve cells simultaneously. While these BCIs are not being developed specifically for brain-to-brain interfacing, it is not difficult to imagine that they could be recruited for such purposes. “By increasing the capacity of advanced neural interfaces to engage more than one million neurons in parallel, DARPA aims to enable rich two-way communication with the brain at a scale that will help deepen our understanding of that organ’s underlying biology, complexity, and function.”
In November 2020, it was reported that the US Army, in collaboration with multiple universities, is researching telepathic brain-signal communication technology for soldiers that could help them silently communicate in the future. Funded by the US Army Research Office, new research successfully separated brain signals that influence action or behaviour from signals that do not. “Using an algorithm and complex mathematics, the team was able to identify which brain signals were directing motion, or behaviour-relevant signals, and then remove those signals from the other brain signals,” C4isrnet reported. Separating brain signals could be the first step towards successfully decoding action-based signals and intentions.
The research programme is led by University of Southern California researchers, along with colleagues in Los Angeles, Berkeley, Duke University, and several UK universities. The researchers aim to discover whether machines can provide feedback to soldiers’ brains, allowing them to take corrective action before something takes place, “a capability that could protect the health of a war fighter”. The US Army is providing $6.25 million in funding over five years for the research. “Here, we’re not only measuring signals, but we’re interpreting them,” said Hamid Krim, a programme manager for the Army Research Office. The researchers have performed successful tests with monkeys. “More work is to be done, as any sort of battle-ready machine-human interface using brain signals is likely decades away,” Krim was quoted as saying. “At the end of the day, that is the original intent mainly: to have the computer actually being in a full duplex communication mode with the brain.”
Brain-to-brain communication demo receives DARPA funding in Jan 2021
Wireless communication directly between brains is one step closer to reality thanks to $8 million in Department of Defense follow-up funding for Rice University neuroengineers. The Defense Advanced Research Projects Agency (DARPA), which funded the team’s proof-of-principle research toward a wireless brain link in 2018, has asked for a preclinical demonstration of the technology that could set the stage for human tests as early as 2022.
“We started this in a very exploratory phase,” said Rice’s Jacob Robinson, lead investigator on the MOANA Project, which ultimately hopes to create a dual-function, wireless headset capable of both “reading” and “writing” brain activity to help restore lost sensory function, all without the need for surgery. MOANA, which is short for “magnetic, optical and acoustic neural access,” will use light to decode neural activity in one brain and magnetic fields to encode that activity in another brain, all in less than one-twentieth of a second.
“We spent the last year trying to see if the physics works, if we could actually transmit enough information through a skull to detect and stimulate activity in brain cells grown in a dish,” said Robinson, an associate professor of electrical and computer engineering and core faculty member of the Rice Neuroengineering Initiative.
“What we’ve shown is that there is promise,” he said. “With the little bit of light that we are able to collect through the skull, we were able to reconstruct the activity of cells that were grown in the lab. Similarly, we showed we could stimulate lab-grown cells in a very precise way with magnetic fields and magnetic nanoparticles.” Robinson, who’s orchestrating the efforts of 16 research groups from four states, said the second round of DARPA funding will allow the team to “develop this further into a system and to demonstrate that this system can work in a real brain, beginning with rodents.”
If the demonstrations are successful, he said the team could begin working with human patients within two years. “Most immediately, we’re thinking about ways we can help patients who are blind,” Robinson said. “In individuals who have lost the ability to see, scientists have shown that stimulating parts of the brain associated with vision can give those patients a sense of vision, even though their eyes no longer work.” The MOANA team includes 15 co-investigators from Rice, Baylor College of Medicine, the Jan and Dan Duncan Neurological Research Institute at Texas Children’s Hospital, Duke University, Columbia University, the Massachusetts Institute of Technology and Yale’s John B. Pierce Laboratory. The project is funded through DARPA’s Next-Generation Nonsurgical Neurotechnology (N3) program.