
Brain-computer interfaces are employed for neuroprosthetics, and now in military and defense to control robotic soldiers or fly planes & UAVs with thoughts.

A new era has arrived in which people can use only their thoughts to control not just themselves, but the world around them. Every action our body performs begins with a thought, and with every thought comes an electrical signal. These electrical signals can be picked up by a Brain-Computer Interface (BCI), such as an electroencephalograph (EEG) or an implanted electrode, translated into commands, and then sent to the performing hardware to produce the desired action.

 

Brain-computer interfaces are being applied in neuroprosthetics, through which paralyzed persons can control robotic arms; in neurogaming, where users control a keyboard, mouse and game characters with their thoughts; in neuroanalysis (psychology); and now in military and defense to control robotic soldiers or fly planes with thoughts. In one DARPA experiment, a quadriplegic first controlled an artificial limb and then flew a flight simulator.

 

Russian scientists have developed what they call the world’s first electric car controlled by the brain, using an innovative ‘telekinesis’ steering system that allows the exchange of information between the brain and the vehicle’s control systems. The ‘neuromobile’, as its creators call it, will allow people with limited mobility to cover long distances without help. Eventually, brain-computer interfaces could let people control augmented reality and virtual reality experiences with their mind instead of a screen or controller. Facebook’s CEO and CTO teased details of this “direct brain interface” technology at the F8 conference.

 

The military is interested in developing mind-controlled weaponry and remotely piloted aircraft, which could make operations and reactions far faster. The troops of tomorrow may be able to pull the trigger using only their minds. Mind-controlled weapons could allow soldiers to understand and act at the speed demanded by cyber warfare, missiles and other fast-moving threats.

 

The US military’s Defense Advanced Research Projects Agency (DARPA) has created a brain-computer interface that enables a person to control everything from a swarm of drones to an advanced fighter jet using nothing but their thoughts and a special brain chip. In 2015, the basic principle of flying a plane using a surgically implanted microchip was demonstrated, but continued development of the brain-computer interface (BCI) has created a two-way connection enabling the pilot to not only send commands to the craft but also to receive signals.

 

Future systems might monitor the user’s nervous system and compensate for stress, fatigue, or injury. “As of today, signals from the brain can be used to command and control… not just one aircraft but three simultaneous types of aircraft,” Justin Sanchez, director of DARPA’s biological technology office, said in September 2018 at the agency’s D60 Symposium in National Harbor, Maryland. “The signals from those aircraft can be delivered directly back to the brain so that the brain of that user [or pilot] can also perceive the environment,” Sanchez said at the symposium, which celebrated DARPA’s 60th birthday. “It’s taken a number of years to try and figure this out.”

https://www.youtube.com/watch?v=iy1HdFqTr6A

Brain-Computer Interface (BCI) technology

A Brain-Computer Interface (BCI) is a setup permitting the control of external devices by decoding brain activity. The user’s intentions, reflected in signals of the central nervous system (CNS), are translated by the BCI system, which returns the desired output: communication via computer or control of an external device. BCI technology relies on the real-time detection of specific neural patterns in order to circumvent the brain’s normal output channels of peripheral nerves and muscles, and thus implement direct mind control of external devices.

 

Electroencephalography (EEG) has been used extensively for decoding brain activity because it is non-invasive, cheap, portable, and has high enough temporal resolution to allow real-time operation. Due to EEG’s poor spatial specificity, however, EEG-based BCIs can require extensive training and multiple trials to decode brain activity, which slows down the operation of the BCI.
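To make the decoding pipeline concrete, the sketch below shows, on synthetic data, the classic EEG chain of band-power features followed by a linear classifier. It is a minimal illustration only; the channel count, sampling rate, frequency band and class labels are assumptions, not details of any system described in this article.

```python
# Minimal EEG decoding sketch: band-power features + linear classifier.
# Synthetic data only; channels, rate, band and labels are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
fs, n_channels, n_trials, n_samples = 250, 8, 200, 500  # 2-second trials

# Fake two-class data: class 1 gets extra 10 Hz ("mu band") power on some channels.
X = rng.standard_normal((n_trials, n_channels, n_samples))
y = rng.integers(0, 2, n_trials)
t = np.arange(n_samples) / fs
X[y == 1, :4, :] += 0.8 * np.sin(2 * np.pi * 10 * t)

# Band-pass 8-12 Hz, then log band-power per channel as the feature vector.
b, a = butter(4, [8 / (fs / 2), 12 / (fs / 2)], btype="band")
Xf = filtfilt(b, a, X, axis=-1)
features = np.log(np.var(Xf, axis=-1))          # shape: (n_trials, n_channels)

clf = LinearDiscriminantAnalysis().fit(features[:150], y[:150])
print("held-out accuracy:", clf.score(features[150:], y[150:]))
```

In a real system the classifier output would be mapped to a device command on every decoding cycle rather than scored offline.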

 

On the other hand, BCIs based on functional magnetic resonance imaging (fMRI) are more accurate owing to its superior spatial resolution and sensitivity to underlying neuronal processes which are functionally localized. However, due to its relatively low temporal resolution, high cost, and lack of portability, fMRI is unlikely to be used for routine BCI.

 

Researchers from Auburn University, UCLA and other institutions have proposed a new approach for transferring the capabilities of fMRI to EEG. It involves simultaneous EEG/fMRI sessions to learn a mapping from EEG to fMRI, followed by a BCI run that uses only EEG data but is driven by fMRI-like features obtained from the previously identified mapping.
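The idea can be sketched in two stages, as below. Ridge regression stands in here for whatever mapping the researchers actually use, and all data are synthetic; this is an assumption-laden illustration of the workflow, not the published method.

```python
# Hypothetical illustration of mapping EEG features to fMRI-like features,
# then running an "EEG-only" classification driven by the mapped features.
import numpy as np
from sklearn.linear_model import Ridge, LogisticRegression

rng = np.random.default_rng(1)
n_trials, n_eeg_feats, n_fmri_feats = 300, 64, 200

eeg = rng.standard_normal((n_trials, n_eeg_feats))
true_map = rng.standard_normal((n_eeg_feats, n_fmri_feats)) * 0.2
fmri = eeg @ true_map + 0.1 * rng.standard_normal((n_trials, n_fmri_feats))
labels = (eeg[:, 0] + eeg[:, 1] > 0).astype(int)     # toy task labels

# Stage 1: simultaneous EEG/fMRI session -> learn an EEG-to-fMRI mapping.
mapper = Ridge(alpha=1.0).fit(eeg[:200], fmri[:200])

# Stage 2: EEG-only BCI run, driven by predicted fMRI-like features.
clf = LogisticRegression(max_iter=1000).fit(mapper.predict(eeg[:200]), labels[:200])
fmri_like = mapper.predict(eeg[200:])
print("accuracy on EEG-only run:", clf.score(fmri_like, labels[200:]))
```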

 

Consumer-grade EEG sensors and BCI systems are now available on the market, free and open-source BCI software is available, and EEG signal-processing algorithms have improved.

 

Japanese, Russian and Chinese researchers develop brain-to-vehicle technology

Nissan has developed its ‘brain-to-vehicle’ (B2V) technology, which it says will redefine “how people interact with their cars” and challenge conventional thinking on the future of autonomous vehicles. “When most people think about autonomous driving, they have a very impersonal vision of the future, where humans relinquish control to the machines,” Nissan executive vice president Daniele Schillaci said. “Yet B2V technology does the opposite, by using signals from their own brain to make the drive even more exciting and enjoyable.”

 

Nissan said it has been researching how to “predict a driver’s actions and detect discomfort” through the use of “brain decoding technology”. “By catching signs that the driver’s brain is about to initiate a movement – such as turning the steering wheel or pushing the accelerator pedal – driver assist technologies can begin the action more quickly,” Nissan said. “This can improve reaction times and enhance manual driving.” “By anticipating intended movement, the systems can take actions – such as turning the steering wheel or slowing the car – 0.2 to 0.5 seconds faster than the driver, while remaining largely imperceptible,” Nissan said.
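The mechanism Nissan describes, detecting movement-preparation brain activity shortly before the driver actually moves, can be illustrated with a simple threshold detector. The sketch below is a toy model with made-up signal values, threshold and timing, not Nissan’s B2V implementation.

```python
# Toy sketch of pre-empting a driver action from movement-preparation activity.
# The signal shape, threshold and lead time are illustrative assumptions.
import numpy as np

fs = 250                                   # samples per second
t = np.arange(0, 3, 1 / fs)                # 3-second window
rng = np.random.default_rng(2)

# Fake "readiness potential": slow negative drift starting 0.5 s before a movement at t = 2 s.
eeg = rng.standard_normal(t.size)
eeg[t > 1.5] -= 20 * (t[t > 1.5] - 1.5)    # build-up before the actual movement

THRESHOLD = -5.0                           # arbitrary detection threshold for the toy example
detect_idx = np.argmax(eeg < THRESHOLD)    # first sample crossing the threshold
movement_time, detect_time = 2.0, t[detect_idx]
print(f"assist triggered at {detect_time:.2f} s, "
      f"{movement_time - detect_time:.2f} s before the driver's movement")
```

With these invented numbers the detector fires a few tenths of a second before the movement, which is the kind of head start Nissan quotes for its driver-assist actions.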

 

B2V technology could also be used to allow drivers to alter the environment in which they are driving, Nissan said. “The technology can use augmented reality to adjust what the driver sees and create a more relaxing environment,” it said.

The mind-controlled full-scale electric car has been designed by specialists from the Russian N.E. Lobachevsky University in Nizhniy Novgorod. The creators of the prototype said they had to develop a system for recording brain signals of different modalities, along with special algorithms that allow the car to decipher commands. The most innovative part of the vehicle is its control system, which uses special algorithms capable of reading different signal modes and transferring them to the machine’s control system.

 

Specifically, the algorithms classify “a set of data on the physical state of the person” received by the control system in order to “discern the mental instructions of the pilot”, as explained by the university, reports RG. In other words, the system determines which manoeuvre the driver intended in a given traffic situation. The driver’s mental commands are then transmitted to the actuators of the car’s control system.
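That chain, classify the driver’s state data into an intended manoeuvre and forward the decision to the actuators, could look roughly like the sketch below. The manoeuvre classes, feature vector, classifier choice and actuator interface are all invented for illustration; the neuromobile’s actual algorithms are not described here.

```python
# Illustrative only: classify driver-state features into an intended manoeuvre,
# then forward the decision to (mock) actuators. All names are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

MANEUVERS = ["keep_lane", "turn_left", "turn_right", "brake"]

# Pretend calibration data: feature vectors derived from brain/physiological signals.
rng = np.random.default_rng(3)
X_train = rng.standard_normal((400, 16))
y_train = rng.integers(0, len(MANEUVERS), 400)
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

def send_to_actuators(maneuver: str) -> None:
    """Stand-in for the vehicle control interface."""
    print(f"actuator command -> {maneuver}")

# One decoding cycle: features in, manoeuvre out, command forwarded.
features = rng.standard_normal((1, 16))
send_to_actuators(MANEUVERS[int(clf.predict(features)[0])])
```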

 

The developers believe that the unique car will go into mass production within three years and will be used primarily by disabled citizens. The creators also stressed that the Russian neuromobile will be not only technologically advanced but also economical and affordable. “A significant advantage is the low cost of 550,000 to 990,000 rubles [$9,000 to $16,000], while the purchase of foreign cars with similar characteristics will cost 50% more.”

 

Researchers at Nankai University in Tianjin, China, are working with Chinese automaker Great Wall Motor to design a car that can be controlled by the mind alone. During the test, the subject, wearing a 16-sensor headset, was able to command the car to accelerate, brake, and open and shut the doors. “There are two starting points of this project. The first one is to provide a driving method without using hands or feet for the disabled who are unable to move freely; and secondly, to provide healthy people with a new and more intellectualized driving mode,” researcher Zhang Zhao told Reuters.

 

According to the researchers, the ultimate plan could be to integrate this technology with driverless cars, so it is more of a complementary service than an alternative to physical driving. Professor Duan Feng, who led the project, told Reuters, “In the end, cars, whether driverless or not, and machines are serving for people. Under such circumstances, people’s intentions must be recognized. In our project, it makes the cars better serve human beings.”

 

A team of researchers at the Free University of Berlin has also explored brain interfaces for steering vehicles. The German team, led by artificial intelligence professor Dr. Raul Rojas, used a headset with electroencephalography (EEG) sensors designed by bioinformatics company Emotiv. The system was able to interpret the driver’s thoughts, such as the desire to turn left, turn right, accelerate or brake, and convert them into computer commands.

 

Facebook’s Typing by brain project

Facebook announced that it’s working on a “typing by brain” project. Regina Dugan, the head of Facebook’s R&D division Building 8, explained to conference attendees that the goal is to eventually allow people to type at 100 words per minute, 5X faster than typing on a phone, with just your mind.

 

The team plans to use optical imaging to scan your brain’s speech center a hundred times per second to detect you speaking silently in your head, and translate it into text. The brain-computer interface will decode signals from the brain’s speech center at the remarkable rate of 100 words per minute.
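For a rough sense of what those targets imply, the short calculation below works out the per-word budget if a system really scanned at 100 times per second and output 100 words per minute. It is a back-of-the-envelope check, not figures released by Facebook.

```python
# Back-of-the-envelope: what 100 words/minute at 100 scans/second implies.
scans_per_second = 100
words_per_minute = 100

words_per_second = words_per_minute / 60          # about 1.67 words every second
scans_per_word = scans_per_second / words_per_second
print(f"{words_per_second:.2f} words/s -> about {scans_per_word:.0f} optical scans per word")
```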

 

“This isn’t about decoding random thoughts. This is about decoding the words you’ve already decided to share by sending them to the speech center of your brain. Our brains produce enough data to stream 4 HD movies every second. The problem is that the best way we have to get information out into the world – speech – can only transmit about the same amount of data as a 1980s modem. We’re working on a system that will let you type straight from your brain about 5x faster than you can type on your phone today,” said Mark Zuckerberg.

 

Facebook has a research agreement with the Johns Hopkins Applied Physics Laboratory (APL) on developing a silent speech interface that will allow users to type 100 words per minute — five times faster than typing on a smartphone — using only their thoughts. Earlier this year, in collaboration with Johns Hopkins Medicine, APL demonstrated the ability to decode semantic information — information about the meanings of words — from neural signals measured using electrodes placed on the surface of the brain in patients undergoing treatment for epilepsy.

 

Researchers recorded electrocorticography (ECoG) signals while patients named objects from 12 different semantic categories, such as animals, foods and vehicles. “By learning the relationship between the semantic attributes associated with objects and the neural activity recorded when patients named these objects, we found that new objects could be decoded with very high accuracies,” said Michael Wolmetz, a cognitive neuroscientist at the Johns Hopkins Applied Physics Laboratory, and one of the paper’s authors. “Using these methods, we observed how different semantic dimensions — whether an object is manmade or natural, how large it typically is, whether it’s edible, for example — were organized in each person’s brain.”
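The approach described, learning how semantic attributes relate to neural activity and then decoding previously unseen objects from those attributes, can be sketched roughly as below. The attribute list, regression model and nearest-neighbour matching step are illustrative assumptions, not the published method, and all data are synthetic.

```python
# Rough sketch of attribute-based ("zero-shot") semantic decoding.
# Attributes, model choice and data are invented for illustration.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(4)
ATTRIBUTES = ["manmade", "typical_size", "edible", "animate"]   # hypothetical dimensions

# Each object has an attribute vector; neural activity is a noisy linear image of it.
n_train_objects, n_electrodes = 40, 60
attr_train = rng.random((n_train_objects, len(ATTRIBUTES)))
mixing = rng.standard_normal((len(ATTRIBUTES), n_electrodes))
neural_train = attr_train @ mixing + 0.3 * rng.standard_normal((n_train_objects, n_electrodes))

# Learn a neural-activity -> attribute mapping on the training objects.
decoder = Ridge(alpha=1.0).fit(neural_train, attr_train)

# Decode a *new* object never seen in training by matching its predicted attributes
# against a candidate list of attribute vectors.
candidates = {"apple": [0.0, 0.2, 1.0, 0.0], "car": [1.0, 0.9, 0.0, 0.0]}
new_attr = np.array(candidates["apple"])
new_neural = new_attr @ mixing + 0.3 * rng.standard_normal(n_electrodes)
pred = decoder.predict(new_neural.reshape(1, -1))[0]

best = min(candidates, key=lambda k: np.linalg.norm(pred - np.array(candidates[k])))
print("decoded object:", best)
```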

DARPA’s Mind Controlled Prosthetic Arm

DARPA’s prosthetic arm is designed to stand in for a real arm, letting the wearer control it with their thoughts via signals from brain cells. The robotic arm is connected by wires to the wearer’s motor cortex (the part of the brain that controls muscle movement) and to the sensory cortex, which registers tactile sensations when you touch things. In essence, the researchers claim, this allows its user to feel things with their robotic hand.

 

In their research, the HAPTIX team is implanting electrodes in a patient’s muscles between the elbow and shoulder, as well as in individual nerve fascicles that correspond to wrist and finger control. According to the release, the researchers are also looking to develop minimally invasive procedures to implant electrodes in the spinal cord. The HAPTIX researchers seek to acquire and decode neural signals that could provide intuitive prosthetic control and restore sensory feedback using these neural interface systems.

 

“We want to re-establish communication between the motor parts of the nervous system and the prosthetic hand through the use of implantable electronics,” Weber said in a press release. The HAPTIX program is in its second phase, which is scheduled to continue through 2018. The third phase is scheduled for 2019, when transradial amputees will be allowed to take home a HAPTIX-controlled system for extended trials outside the laboratory, the press release noted.

 

Researchers at the University of Pittsburgh were able to increase the maneuverability of the mind-controlled robotic arm from seven dimensions (7D) to ten dimensions (10D). The extra dimensions come from four hand movements (finger abduction, a scoop, thumb extension and a pinch), enabling the user to pick up, grasp and move a range of objects much more precisely than with the previous 7D control. This in turn helps paralyzed persons control a robotic arm with a range of complex hand movements. “The ultimate goal for HAPTIX is to create a device that is safe, effective, and reliable enough for use in everyday activities,” explains Doug Weber, the DARPA HAPTIX program manager.
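One common way such high-dimensional control is implemented is a linear decoder that maps a vector of neural firing rates to a command vector, one entry per controlled dimension. The sketch below illustrates that idea on synthetic data; the dimension names and least-squares fit are assumptions, not the Pittsburgh group’s actual decoder.

```python
# Illustrative linear decoder: neural firing rates -> 10-dimensional command.
# Synthetic data; dimension names and the least-squares fit are assumptions.
import numpy as np

DIMS = ["vx", "vy", "vz", "roll", "pitch", "yaw",
        "finger_abduction", "scoop", "thumb_extension", "pinch"]

rng = np.random.default_rng(5)
n_neurons, n_samples = 120, 1000

# Pretend calibration data: firing rates generated from known intended commands.
intended = rng.standard_normal((n_samples, len(DIMS)))
tuning = rng.standard_normal((len(DIMS), n_neurons))
rates = intended @ tuning + 0.5 * rng.standard_normal((n_samples, n_neurons))

# Fit decoder weights by least squares, then decode one new sample of activity.
W, *_ = np.linalg.lstsq(rates, intended, rcond=None)
command = rates[0] @ W
print({d: round(float(c), 2) for d, c in zip(DIMS, command)})
```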

 

California Institute of Technology’s BCI allows Paralyzed Man to Drink Beer on his own

A paralyzed man named Erik Sorto has been able to drink beer, shake hands and even play “rock, paper and scissors,” thanks to a robotic arm controlled solely by his mind.

 

For this experiment, California Institute of Technology neuroscientist Richard Andersen implanted the BCI electrodes in a different area of the brain, the posterior parietal cortex, located at the top of the brain near the back. The parietal cortex is a center of higher-level cognition that processes the planning of movements rather than the details of how movements are executed. An implant in this area allows the goal of an action to be conveyed directly to the robotic limb, producing more natural, fluid motions and reducing the number of neural signals needed to control its movement.

 

The implants differ from those in BrainGate, which placed electrodes in the motor cortex, the part of the brain that directs voluntary physical activity. Since the motor cortex directly controls many different muscles, for any one gesture patients had to painstakingly focus on which muscles to activate for each specific component of the gesture. With these implants the patients could still control a robotic limb, but the movement was delayed and jerky.

Entertainment and gaming

Entertainment and gaming applications have opened the market for nonmedical brain-computer interfaces. Various games have been demonstrated, such as one in which helicopters are flown to any point in a 2D or 3D virtual world. BrainArena allows players to play a collaborative or competitive football game using two BCIs: they can score goals by imagining left or right hand movements.

 

Emotiv EPOC allows one to control the keyboard and mouse of a laptop as well as move characters in games. MUSE is designed to let you control your iPhone or Android device with your mind. With ThynkWare, anyone can use their thoughts to control their smartphones, tablets, home, office, TV, robots, and even clothing.

Mind-controlled telepresence robot

A relatively new field of research is telepresence, which allows a human operator to have an at-a-distance presence in a remote environment via a brain-actuated robot.

 

A telepresence robot developed at the École Polytechnique Fédérale de Lausanne (EPFL) that can be controlled by thought may give people with severe motor disabilities a greater level of independence. Successfully put through its paces by 19 people scattered around Central Europe – nine of whom are quadriplegic and all of whom were hooked up to a brain-machine interface – the robot handled obstacle detection and avoidance on its own while the person controlling it gave general navigation instructions.
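The shared-control arrangement described, coarse navigation commands from the BCI combined with on-board obstacle avoidance, can be sketched as a simple command blender. The repulsion term, gains and command set below are invented for illustration, not EPFL’s actual controller.

```python
# Toy shared control: blend a coarse BCI heading command with obstacle avoidance.
# Gains, sensor model and command set are illustrative assumptions.
import numpy as np

def shared_control(bci_command: str, obstacle_bearings_deg: list[float]) -> float:
    """Return a steering angle (degrees) from a BCI command plus obstacle repulsion."""
    desired = {"left": 30.0, "forward": 0.0, "right": -30.0}[bci_command]
    # Each nearby obstacle pushes the heading away from its bearing (positive = left).
    repulsion = sum(-np.sign(b) * max(0.0, 20.0 - abs(b)) for b in obstacle_bearings_deg)
    return float(np.clip(desired + repulsion, -45.0, 45.0))

# User thinks "forward"; an obstacle sits 10 degrees to the left, so the robot veers right.
print(shared_control("forward", [10.0]))   # -> -10.0 degrees
```

The point of the split is that the user only has to issue occasional high-level intentions, while the robot continuously handles the fine-grained avoidance decisions itself.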

 

 

BNCI Horizon 2020 project

The European Commission has released a roadmap on BCI, the “BNCI Horizon 2020” project, with the objective of providing a global perspective on the BCI field now and in the future. Many of the applications are centered on the needs of people with serious injuries and disabilities. BCIs will give people more awareness of their own biological and mental state, and will also help restore lost function.

 

Hence, interfacing with the brain directly will be at the forefront of both societal and medical evolution. Relevant application areas include social interaction and recreation, occupational safety, quality of life, independent living in old age, and (occupational) rehabilitation. By 2025 there is expected to be a broad range of brain-controlled applications which, according to the BCI roadmap, will be standard in medical treatment and therapy as well as in monitoring personal health. BCI technology has been advancing at such a rapid pace that it is now possible to externally control computers, smartphones, and even vehicles with thought.

 

 
