Memristors: From Neuromorphic Processors to Quantum Computing

The trillions of interconnected devices predicted for the Internet of Things will generate an enormous amount of data that will need processing. Traditionally, data processing in electronics has relied on integrated circuits (chips) featuring vast numbers of transistors – microscopic switches that control the flow of electrical current by turning it on or off. Transistors have shrunk steadily to meet the increasing demands of technology, but they are now reaching their physical limits, with the processing chips that power smartphones, for example, containing an average of five billion transistors.

 

Memristors could hold the key to a new era in electronics. They are smaller and simpler in form than transistors, consume little energy, and can retain data by ‘remembering’ the amount of charge that has passed through them – potentially resulting in computers that switch on and off instantly and never forget. In a conventional computer, logic and memory functions are located in different parts of the circuit, and the connection between memory and processor is the biggest bottleneck for computing speed and power. Because memristors store and process information in the same location, they can get around that bottleneck.

 

They can also store multiple memory states. The University of Southampton has previously demonstrated a memristor technology that can store up to 128 discernible memory states per switch, almost four times more than previously reported. With faster speed, lower power consumption and a higher density of information per volume, memristors offer many advantages over conventional transistors. Memristor chips could eventually be integrated into textiles, windows, even coffee cups and other everyday items.

 

Memristors

Memristors are nanoscale devices with unique properties: a variable resistance and the ability to remember that resistance even when the power is off. A memristor acts much like a resistor, but with one big difference: its resistance changes depending on the amount and direction of the voltage applied, and it retains its most recent resistance when the power supply is turned off, until it is turned on again.
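As a rough illustration of this behavior, the following Python sketch implements the textbook linear ion-drift memristor model; the device parameters and drive waveform are assumed values for illustration, not figures from any device described in this article:

```python
import numpy as np

# Linear ion-drift memristor model (textbook style), with assumed parameters.
R_ON, R_OFF = 100.0, 16e3        # fully-ON / fully-OFF resistance (ohms), assumed
MU_V, D = 1e-14, 1e-8            # ion mobility (m^2/Vs) and film thickness (m), assumed
DT = 1e-4                        # integration step (s)

def simulate(voltage, x0=0.1):
    """Integrate the internal state x (0..1) under an applied voltage waveform."""
    x = x0
    resistances = []
    for v in voltage:
        r = R_ON * x + R_OFF * (1.0 - x)      # instantaneous resistance
        i = v / r                              # Ohm's law
        x += MU_V * R_ON / D**2 * i * DT       # ion drift shifts the doped/undoped boundary
        x = min(max(x, 0.0), 1.0)              # hard window: keep the state physical
        resistances.append(r)
    return np.array(resistances)

t = np.arange(0, 0.1, DT)
drive = 1.2 * np.sin(2 * np.pi * 50 * t)       # bipolar drive voltage
r = simulate(drive)
print(f"R at start: {r[0]:.0f} ohm, R at end of drive: {r[-1]:.0f} ohm")
# x only changes while current flows, so once the voltage is removed the device
# simply holds its last resistance -- the "memory" in memristor.
```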

 

A single memristor can perform the same logic functions as multiple transistors, making memristors a promising way to increase computing power. They could also prove to be a faster, smaller, more energy-efficient alternative to flash storage. Unlike transistors, memristors do not require a silicon layer and are therefore not bound by the same limitations as current microchip manufacturing technology. Memristor-based circuits require fewer transistors, allowing more components (and more computing power) to be packed into the same physical space while using less power.
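The claim that one memristor can do the work of several transistors is commonly illustrated with stateful ‘material implication’ (IMPLY) logic, in which logic values live in the resistance states themselves. The toy model below is a hypothetical software sketch of that idea, not a circuit used by any of the groups mentioned here:

```python
# Stateful IMPLY logic sketch: logic values are stored as memristor resistance states
# (True = low resistance, False = high resistance). IMPLY plus FALSE is functionally
# complete, which is why a few memristors can replace a multi-transistor logic gate.

def imply(p: bool, q: bool) -> bool:
    """q <- p IMPLY q. In hardware this is one conditional SET pulse on memristor q."""
    return (not p) or q

def false() -> bool:
    """Unconditional RESET of a memristor to its high-resistance (False) state."""
    return False

def nand(p: bool, q: bool) -> bool:
    """NAND built from IMPLY and FALSE in three steps (a classic construction)."""
    s = false()            # work memristor cleared to False
    s = imply(p, s)        # s = NOT p
    return imply(q, s)     # result = q IMPLY (NOT p) = NOT (p AND q)

for p in (False, True):
    for q in (False, True):
        assert nand(p, q) == (not (p and q))
print("NAND truth table reproduced with IMPLY + FALSE only")
```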

 

Memristors use ions, in addition to the flow of electrons, to store data. Many scientists, electronics engineers and computer engineers expect the memristor to change the world of electronics and revolutionize computing. It might mark a transition from electrons to ions and the beginning of a new era in electronics sometimes called “ionics”.

 

But despite the keen interest in memristors, scientists have lacked a detailed understanding of how these devices work and have yet to develop a standard toolset to study them. Now, NIST scientists have identified such a toolset and used it to more deeply probe how memristors operate. Their findings could lead to more efficient operation of the devices and suggest ways to minimize the leakage of current.

 

Themis Prodromakis, Professor at the University of Southampton, said: “For decades we have followed the pattern that computers should have separate processor and memory units, but these are now struggling to cope with the masses of data in the public domain. Soon the span of functionality in future Internet of Things (IoT) systems will be much wider than what we know from today’s smartphones, tablets or smart watches.”

 

Inspired by how mammals see, a new “memristor” computer circuit prototype at the University of Michigan has the potential to process complex data, such as images and video, orders of magnitude faster and with much less power than today’s most advanced systems. Faster image processing could have big implications for autonomous systems such as self-driving cars, says Wei Lu, U-M professor of electrical engineering and computer science.

 

Neural Networks

From Deep Blue to AlphaGo, artificial intelligence and machine learning are booming, and neural networks have become a hot research direction. However, due to the size limits of complementary metal–oxide–semiconductor (CMOS) transistors, von Neumann-based computing systems face multiple challenges. As the number of transistors required by neural networks increases, the development of neural networks on von Neumann computers is limited by volume and energy consumption. As the fourth basic circuit element, the memristor shines in the field of neuromorphic computing. New computer architectures based on memristors are widely considered a substitute for the von Neumann architecture and have great potential to meet the challenges of neural networks and the big-data era.

 

This new technology can act like the short-term memory of nerve cells, allowing the creation of computers that operate in a way similar to the synapses in our brains. Memristor arrays can be trained, using learning rules, rather than directly programmed.
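A minimal way to picture ‘training rather than programming’ is a conductance that is nudged up or down by learning pulses instead of being written to an exact value. The sketch below assumes arbitrary conductance limits and pulse sizes and is purely illustrative:

```python
# Sketch: "training" a memristive synapse with potentiation/depression pulses
# rather than writing an exact value. Conductance limits and step size are assumptions.
G_MIN, G_MAX = 0.01, 1.0

def apply_pulse(g, potentiate, step=0.02):
    """One programming pulse nudges the conductance up or down and saturates at the
    device limits -- the weight is shaped by a learning rule, not set directly."""
    g += step if potentiate else -step
    return min(max(g, G_MIN), G_MAX)

g = 0.5
errors = [0.3, -0.1, 0.2, -0.4, 0.1]     # sign of the error decides pulse polarity
for e in errors:
    g = apply_pulse(g, potentiate=(e > 0))
print(f"Synaptic conductance after training pulses: {g:.2f}")
```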

 

Brain-Inspired Robot Controller Uses Memristors for Fast, Efficient Movement

In a paper published in Science Robotics, a team of researchers from the University of Southern California in Los Angeles and the Air Force Research Laboratory in Rome, N.Y., demonstrates a simple self-balancing robot that uses memristors to form a highly effective analog control system, inspired by the functional structure of the human brain.

 

By adding a memristor to an analog circuit with inputs from a gyroscope and an accelerometer, the researchers, led by Wei Wu, an associate professor of electrical engineering at USC, created a completely analog, completely physical Kalman filter to remove noise from the sensor signal. They then used a second memristor to turn that sensor data into a proportional-derivative (PD) controller. Finally, they put the two components together to build an analog system that does much of the work required to keep an inverted-pendulum robot upright far more efficiently than a traditional system. The difference in performance is readily apparent:

 

The memristors substantially reduce the cycle time, so the robot can balance much more smoothly. Specifically, cycle time is reduced from 3,034 microseconds to just 6 microseconds.
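For readers unfamiliar with the two blocks the memristors replace, the conventional digital baseline looks roughly like the sketch below: a sensor-fusion filter turns gyroscope and accelerometer readings into a tilt estimate, which feeds a proportional-derivative controller. This is a generic software stand-in (a complementary filter in place of the Kalman filter) with assumed gains and sensor values, not the analog memristor circuit from the paper:

```python
# Generic digital baseline for a self-balancing robot: sensor fusion + PD control.
# Gains, loop rate and sensor samples are illustrative assumptions.

class TiltEstimator:
    """Complementary filter fusing gyro rate and accelerometer angle (a software
    stand-in for the Kalman filter that the analog memristor circuit implements)."""
    def __init__(self, alpha=0.98):
        self.alpha = alpha
        self.angle = 0.0

    def update(self, gyro_rate, accel_angle, dt):
        predicted = self.angle + gyro_rate * dt            # integrate the gyro
        self.angle = self.alpha * predicted + (1 - self.alpha) * accel_angle
        return self.angle

class PDController:
    """Proportional-derivative controller keeping the pendulum upright (setpoint 0)."""
    def __init__(self, kp=25.0, kd=1.5):
        self.kp, self.kd = kp, kd
        self.prev_error = 0.0

    def update(self, error, dt):
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.kd * derivative      # motor command

estimator, controller = TiltEstimator(), PDController()
dt = 0.005                                                 # 200 Hz digital control loop
for gyro, accel in [(0.10, 0.02), (0.12, 0.03), (0.08, 0.01)]:  # fake sensor samples
    tilt = estimator.update(gyro, accel, dt)
    command = controller.update(0.0 - tilt, dt)
    print(f"tilt={tilt:+.4f} rad  motor={command:+.3f}")
```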

 

“The human brain consists of the cerebrum, the cerebellum, and the brainstem. The cerebrum is a major part of the brain in charge of vision, hearing, and thinking, whereas the cerebellum plays an important role in motion control. Through this cooperation of the cerebrum and the cerebellum, the human brain can conduct multiple tasks simultaneously with extremely low power consumption. Inspired by this, we developed a hybrid analog-digital computation platform, in which the digital component runs the high-level algorithm, whereas the analog component is responsible for sensor fusion and motion control.”

Researchers Unveil Electronics That Mimic The Human Brain In Efficient Learning

In 2020, a team at the University of Massachusetts Amherst discovered, while working to better understand protein nanowires, how to use these biological, electricity-conducting filaments to make a neuromorphic memristor, or “memory transistor,” device. It runs extremely efficiently on very low power, as brains do, to carry signals between neurons. Details are in Nature Communications.

 

As first author Tianda Fu, a Ph.D. candidate in electrical and computer engineering, explains, one of the biggest hurdles to neuromorphic computing – and one that made it seem unreachable – is that most conventional computers operate at over 1 volt, while the brain sends signals, called action potentials, between neurons at around 80 millivolts, many times lower. A decade after the early experiments, memristor voltages had been brought into a range similar to that of conventional computers, but getting below that seemed improbable, he adds.

 

Fu reports that, using protein nanowires developed at UMass Amherst from the bacterium Geobacter by microbiologist and co-author Derek Lovley, he has now conducted experiments in which memristors reached neurological voltages. Those tests were carried out in the lab of electrical and computer engineering researcher and co-author Jun Yao. Yao says, “This is the first time that a device can function at the same voltage level as the brain. People probably didn’t even dare to hope that we could create a device that is as power-efficient as the biological counterparts in a brain, but now we have realistic evidence of ultra-low power computing capabilities. It’s a concept breakthrough and we think it’s going to cause a lot of exploration in electronics that work in the biological voltage regime.”

 

Lovley points out that Geobacter’s electrically conductive protein nanowires offer many advantages over expensive silicon nanowires, which require toxic chemicals and high-energy processes to produce. Protein nanowires are also more stable in water or bodily fluids, an important feature for biomedical applications. For this work, the researchers shear the nanowires off the bacteria so that only the conductive protein is used, he adds. Fu says that he and Yao had set out to put the purified nanowires through their paces, to see what they are capable of at different voltages, for example. They experimented with a pulsing on-off pattern of positive-negative charge sent through a tiny metal thread in a memristor, which creates an electrical switch.

 

They used a metal thread because protein nanowires facilitate metal reduction, changing metal-ion reactivity and electron-transfer properties. Lovley says this microbial ability is not surprising, because wild bacterial nanowires breathe and chemically reduce metals to get their energy, the way we breathe oxygen. As the on-off pulses create changes in the metal filaments, new branches and connections are created in the tiny device, which is 100 times smaller than the diameter of a human hair, Yao explains. This creates an effect similar to learning – new connections – in a real brain. He adds, “You can modulate the conductivity, or the plasticity of the nanowire-memristor synapse, so it can emulate biological components for brain-inspired computing. Compared to a conventional computer, this device has a learning capability that is not software-based.”

 

Fu recalls, “In the first experiments we did, the nanowire performance was not satisfying, but it was enough for us to keep going.” Over two years he saw improvement, until one fateful day when his and Yao’s eyes were riveted by voltage measurements appearing on a computer screen. “I remember the day we saw this great performance. We watched the computer as the current-voltage sweep was being measured. It kept going down and down and we said to each other, ‘Wow, it’s working.’ It was very surprising and very encouraging.” Fu, Yao, Lovley and colleagues plan to follow up this discovery with more research on mechanisms, and to “fully explore the chemistry, biology and electronics” of protein nanowires in memristors, Fu says, plus possible applications, which might include a device to monitor heart rate, for example. Yao adds, “This offers hope in the feasibility that one day this device can talk to actual neurons in biological systems.”

 

 

 

First Programmable Memristor Computer

The first programmable memristor computer—not just a memristor array operated through an external computer—has been developed at the University of Michigan. This is especially important for machine-learning algorithms that deal with lots of data to do things like identify objects in photos and videos, or predict which hospital patients are at higher risk of infection. Already, programmers prefer to run these algorithms on graphics processing units (GPUs) rather than a computer’s main processor, the central processing unit (CPU).

 

It could lead to the processing of artificial intelligence directly on small, energy-constrained devices such as smartphones and sensors. A smartphone AI processor would mean that voice commands would no longer have to be sent to the cloud for interpretation, speeding up response time. “Everyone wants to put an AI processor on smartphones, but you don’t want your cell phone battery to drain very quickly,” said Wei Lu, U-M professor of electrical and computer engineering and senior author of the study in Nature Electronics.

 

“GPUs and very customized and optimized digital circuits are considered to be about 10-100 times better than CPUs in terms of power and throughput,” Lu said. “Memristor AI processors could be another 10-100 times better.” GPUs perform better at machine-learning tasks because they have thousands of small cores running calculations all at once, as opposed to the string of calculations waiting their turn on one of the few powerful cores in a CPU.

 

A memristor array takes this even further. Each memristor is able to do its own calculation, allowing thousands of operations within a core to be performed at once. In this experimental-scale computer, there were more than 5,800 memristors. A commercial design could include millions of them. Memristor arrays are especially suited to machine learning problems. The reason for this is the way that machine learning algorithms turn data into vectors—essentially, lists of data points. In predicting a patient’s risk of infection in a hospital, for instance, this vector might list numerical representations of a patient’s risk factors.

 

Then, machine learning algorithms compare these “input” vectors with “feature” vectors stored in memory. These feature vectors represent certain traits of the data (such as the presence of an underlying disease). If matched, the system knows that the input data has that trait. The vectors are stored in matrices, which are like the spreadsheets of mathematics, and these matrices can be mapped directly onto the memristor arrays.

 

What’s more, as data is fed through the array, the bulk of the mathematical processing occurs through the natural resistances in the memristors, eliminating the need to move feature vectors in and out of the memory to perform the computations. This makes the arrays highly efficient at complicated matrix calculations. Earlier studies demonstrated the potential of memristor arrays for speeding up machine learning, but they needed external computing elements to function.
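The reason the crossbar does this ‘for free’ is Ohm’s law and Kirchhoff’s current law: stored feature vectors become rows of conductances, the input vector is applied as voltages, and the resulting currents are the dot products. The sketch below uses assumed conductance scaling and noise levels purely for illustration:

```python
import numpy as np

# Sketch of why a memristor crossbar computes a matrix-vector product in place:
# each stored feature vector becomes a row of conductances; applying the input as
# voltages makes every row current a dot product (Ohm's law + Kirchhoff's law).
# The conductance range and device noise below are assumptions for illustration.

rng = np.random.default_rng(1)

features = rng.uniform(0.0, 1.0, size=(3, 6))      # 3 stored feature vectors
G = 1e-4 * features                                # mapped to conductances (siemens)
x = rng.uniform(0.0, 1.0, size=6)                  # input vector applied as voltages (V)

ideal = features @ x                               # what a digital MAC unit would compute
currents = (G + 1e-6 * rng.standard_normal(G.shape)) @ x   # analog read with device noise
analog_estimate = currents / 1e-4                  # undo the conductance scaling

print("digital result:", np.round(ideal, 3))
print("analog result: ", np.round(analog_estimate, 3))
print("best-matching feature vector:", int(analog_estimate.argmax()))
```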

 

Building a programmable memristor computer

To build the first programmable memristor computer, Lu’s team worked with associate professor Zhengya Zhang and professor Michael Flynn, both of electrical and computer engineering at U-M, to design a chip that could integrate the memristor array with all the other elements needed to program and run it. Those components included a conventional digital processor and communication channels, as well as digital/analog converters to serve as interpreters between the analog memristor array and the rest of the computer.

 

Lu’s team then integrated the memristor array directly on the chip at U-M’s Lurie Nanofabrication Facility. They also developed software to map machine learning algorithms onto the matrix-like structure of the memristor array. The team demonstrated the device with three bread-and-butter machine learning algorithms:

  • Perceptron, which is used to classify information. The team was able to identify imperfect Greek letters with 100% accuracy (a minimal software sketch of a perceptron follows this list).
  • Sparse coding, which compresses and categorizes data, particularly images. The computer was able to find the most efficient way to reconstruct images in a set and identified patterns with 100% accuracy.
  • Two-layer neural network, designed to find patterns in complex data. This two-layer network found commonalities and differentiating factors in breast cancer screening data and then classified each case as malignant or benign with 94.6% accuracy.
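For reference, a perceptron (the first algorithm in the list above) reduces to a handful of lines in software. On the memristor chip the weight vector would live in the array’s conductances and the dot product would be an analog read; the toy dataset and dimensions below are invented for illustration:

```python
import numpy as np

# Minimal perceptron classifier. On the memristor computer the weights would be
# conductances and the dot product an analog read; here everything is in software,
# trained on an invented, linearly separable toy dataset.
rng = np.random.default_rng(2)

X = rng.uniform(-1, 1, size=(200, 16))             # 200 samples, 16 "pixel" features
true_w = rng.standard_normal(16)
y = np.sign(X @ true_w)                            # linearly separable labels

w = np.zeros(16)
for epoch in range(20):
    for xi, yi in zip(X, y):
        if yi * (w @ xi) <= 0:                     # misclassified -> apply the update rule
            w += yi * xi                           # on hardware: conductance pulses

accuracy = np.mean(np.sign(X @ w) == y)
print(f"training accuracy: {accuracy:.1%}")
```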

 

There are challenges in scaling up for commercial use—memristors can’t yet be made as identical as they need to be and the information stored in the array isn’t entirely reliable because it runs on analog’s continuum rather than the digital either/or. These are future directions of Lu’s group.

 

Lu plans to commercialize this technology. The study is titled, “A fully integrated reprogrammable memristor–CMOS system for efficient multiply–accumulate operations.” The research is funded by the Defense Advanced Research Projects Agency, the center for Applications Driving Architectures (ADA), and the National Science Foundation.

 

Memristors could implement brain-inspired devices to power artificial systems

Artificial neural networks (ANNs) exhibit learning abilities and can perform tasks which are difficult for conventional computing systems, such as pattern recognition, on-line learning and classification. Practical ANN implementations are currently hampered by the lack of efficient hardware synapses; a key component that every ANN requires in large numbers. In the study, published in Nature Communications, the Southampton research team experimentally demonstrated an ANN that used memristor synapses supporting sophisticated learning rules in order to carry out reversible learning of noisy input data.

 

“Plastic synaptic connections are a key computational element of both the brain and brain-inspired neuromorphic systems. Outnumbering neurons by approximately 1,000 to 1 in the human brain, synapses have to perform their main function, namely interconnecting neural cells via an often modifiable coupling strength (a weight), within extremely tight volume and power budgets,” write the authors. Lead author Dr Alex Serb, from Electronics and Computer Science at the University of Southampton, said: “If we want to build artificial systems that can mimic the brain in function and power we need to use hundreds of billions, perhaps even trillions of artificial synapses, many of which must be able to implement learning rules of varying degrees of complexity.

 

“Memristors offer a possible route towards that end by supporting many fundamental features of learning synapses (memory storage, on-line learning, computationally powerful learning rule implementation, two-terminal structure) in extremely compact volumes and at exceptionally low energy costs. If artificial brains are ever going to become reality, therefore, memristive synapses have to succeed.”

 

“We show that we can use voltage timing to gradually increase or decrease the electrical conductance in this memristor-based system,” said Wei Lu, a computer engineer at the University of Michigan. “In our brains, similar changes in synapse conductance essentially give rise to long-term memory.” The Southampton team adds: “Our work establishes such a technological paradigm shift, proving that nanoscale memristors can indeed be used to formulate in-silico neural circuits for processing big data in real-time; a key challenge of modern society.”

 

“We have shown that such hardware platforms can independently adapt to its environment without any human intervention and are very resilient in processing even noisy data in real-time reliably. This new type of hardware could find a diverse range of applications in pervasive sensing technologies to fuel real-time monitoring in harsh or inaccessible environments; a highly desirable capability for enabling the Internet of Things vision.”

 

Acting like synapses in the brain, the metal-oxide memristor array was capable of learning and re-learning input patterns in an unsupervised manner within a probabilistic winner-take-all (WTA) network. This is extremely useful for enabling the low-power embedded processors (needed for the Internet of Things) that can process big data in real time without any prior knowledge of the data.
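A probabilistic winner-take-all network of the kind described here can be sketched in a few lines of software; the network size, learning rate and input patterns below are assumptions, and a real metal-oxide array would add device non-idealities that this toy model ignores:

```python
import numpy as np

# Toy winner-take-all (WTA) network: each output "neuron" owns a row of synaptic
# weights (conductances); the winner for an input softly pulls its row toward that
# input, so the array clusters noisy patterns without labels or supervision.
rng = np.random.default_rng(3)

prototypes = np.eye(4).repeat(2, axis=1)            # 4 underlying 8-element patterns
W = rng.uniform(0.2, 0.8, size=(4, 8))              # synaptic weights

for _ in range(500):
    clean = prototypes[rng.integers(4)]
    noisy = np.clip(clean + 0.2 * rng.standard_normal(8), 0, 1)
    scores = W @ noisy
    winner = int(np.argmax(scores + 0.05 * rng.standard_normal(4)))  # probabilistic WTA
    W[winner] += 0.05 * (noisy - W[winner])          # only the winning row learns

for i, p in enumerate(prototypes):
    print(f"pattern {i} -> neuron {int(np.argmax(W @ p))}")
```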

 

Next-gen computing: Memristor chips that see patterns over pixels

“The tasks we ask of today’s computers have grown in complexity,” Lu said. “In this ‘big data’ era, computers require costly, constant and slow communications between their processor and memory to retrieve large amounts of data. This makes them large, expensive and power-hungry.”

 

But like neural networks in a biological brain, networks of memristors can perform many operations at the same time, without having to move data around. As a result, they could enable new platforms that process a vast number of signals in parallel and are capable of advanced machine learning. Memristors are good candidates for deep neural networks, a branch of machine learning, which trains computers to execute processes without being explicitly programmed to do so.

 

“We need our next-generation electronics to be able to quickly process complex data in a dynamic environment. You can’t just write a program to do that. Sometimes you don’t even have a pre-defined task,” Lu said. “To make our systems smarter, we need to find ways for them to process a lot of data more efficiently. Our approach to accomplish that is inspired by neuroscience.”

 

A mammal’s brain is able to generate sweeping, split-second impressions of what the eyes take in. One reason is that it can quickly recognize different arrangements of shapes. Humans do this using only a limited number of neurons that become active, Lu says. Both neuroscientists and computer scientists call the process “sparse coding.”

 

“When we take a look at a chair we will recognize it because its characteristics correspond to our stored mental picture of a chair,” Lu said. “Although not all chairs are the same and some may differ from a mental prototype that serves as a standard, each chair retains some of the key characteristics necessary for easy recognition. Basically, the object is correctly recognized the moment it is properly classified—when ‘stored’ in the appropriate category in our heads.”

 

Lu’s next-generation computer components use pattern recognition to shortcut the energy-intensive process conventional systems use to dissect images. In this new work, he and his colleagues demonstrate an algorithm that relies on a technique called “sparse coding” to coax their 32-by-32 array of memristors to efficiently analyze and recreate several photos.

 

Similarly, Lu’s electronic system is designed to detect the patterns very efficiently—and to use as few features as possible to describe the original input. In our brains, different neurons recognize different patterns, Lu says. “When we see an image, the neurons that recognize it will become more active,” he said. “The neurons will also compete with each other to naturally create an efficient representation. We’re implementing this approach in our electronic system.”
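Sparse coding, as used here, means representing an input with as few active dictionary elements as possible. The sketch below uses a generic iterative shrinkage-thresholding (ISTA) solver with assumed sizes and sparsity penalty; it is a software illustration of the principle, not the memristor implementation:

```python
import numpy as np

# Generic sparse coding sketch (ISTA): represent an input x with few active atoms of
# a dictionary D, mirroring "only a few neurons become active" for a familiar image.
# Dictionary size, sparsity penalty and step size are illustrative assumptions.
rng = np.random.default_rng(4)

D = rng.standard_normal((32, 64))
D /= np.linalg.norm(D, axis=0)                      # 64 normalized dictionary atoms
x = D[:, [3, 17, 42]] @ np.array([1.0, -0.7, 0.5])  # input built from just 3 atoms

def ista(x, D, lam=0.1, steps=200):
    """Iterative shrinkage-thresholding: a gradient step on the reconstruction error,
    then soft-thresholding, which drives most coefficients exactly to zero."""
    a = np.zeros(D.shape[1])
    L = np.linalg.norm(D, 2) ** 2                   # Lipschitz constant of the gradient
    for _ in range(steps):
        a = a - (D.T @ (D @ a - x)) / L
        a = np.sign(a) * np.maximum(np.abs(a) - lam / L, 0.0)
    return a

a = ista(x, D)
print("active coefficients:", np.flatnonzero(np.abs(a) > 1e-3))
print("reconstruction error:", float(np.linalg.norm(D @ a - x)))
```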

 

The researchers trained their system to learn a “dictionary” of images. Trained on a set of grayscale image patterns, their memristor network was able to reconstruct images of famous paintings and photos and other test patterns. If their system can be scaled up, they expect to be able to process and analyze video in real time in a compact system that can be directly integrated with sensors or cameras.

 

Russian Scientists Test New Material for Neurocomputers

Russian scientists have tested a new material for neurocomputers that can store and process data in a way similar to human brain neurons. The scientists have proposed new materials in which the bipolar effect of resistive switching (BERS) can be realized. Significantly, these materials could serve as the basis for developing a memristor-based computer that stores and processes data much as neurons in the human brain do.

 

The BERS phenomenon is currently a popular area of research around the world, in both the fundamental and the applied sciences. It can be used for developing nonvolatile two-terminal memory cells, as well as for the memristor, the fourth fundamental circuit element. Memristors could serve as the basis for a new approach to data processing, so-called membrane computing, in which short-term (RAM) and long-term (hard drive) memories are handled by elements that behave like neurons in the human brain.

 

Resistive switching occurs when, under an external electric field, a material’s conductivity changes by several orders of magnitude, realizing two metastable states: a high-resistance state and a low-resistance state. If the switching depends on the direction of the electric field, the effect is called bipolar. The results of the research were published in Materials Letters.
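A minimal behavioral model makes the definition concrete: the cell sits in one of two metastable resistance states and switches only when the applied voltage exceeds a polarity-dependent threshold. The thresholds and resistance values below are assumed for illustration:

```python
# Minimal behavioral model of bipolar resistive switching (BERS): the device toggles
# between a high-resistance state (HRS) and a low-resistance state (LRS) only when the
# applied field exceeds a threshold, and the switching direction follows the field's sign.
# Threshold and resistance values are assumptions for illustration.

R_HRS, R_LRS = 1e5, 1e3      # metastable resistance states (ohms)
V_SET, V_RESET = +1.0, -1.0  # polarity-dependent switching thresholds (volts)

def step(state, v):
    """Return the new state after applying voltage v to a bipolar switching cell."""
    if v >= V_SET:
        return "LRS"          # positive field forms the conductive path
    if v <= V_RESET:
        return "HRS"          # negative field ruptures it
    return state              # sub-threshold voltages just read the cell

state = "HRS"
for v in [0.3, 1.2, 0.2, -0.4, -1.1, 0.0]:
    state = step(state, v)
    r = R_LRS if state == "LRS" else R_HRS
    print(f"V={v:+.1f} V -> {state} ({r:.0e} ohm)")
```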

 

The physical mechanism of the switching itself depends on the type of material. It may involve the formation of conductive channels via the migration of metal ions, the formation of Schottky barriers, metal–insulator phase transitions, and other processes.

 

The scientists are from the Solid-State Physics and Nanosystems Department at the Institute of Laser and Plasma Technology of the National Research Nuclear University MEPhI (Moscow Engineering Physics Institute), and were working in cooperation with researchers from the Russian Academy of Sciences’ Solid-State Physics Institute and the Institute of Microelectronics Technology and High Purity Materials.

 

The National Research Nuclear University MEPhI (Moscow Engineering Physics Institute) is currently searching for new materials that can demonstrate BERS. Earlier it was found that BERS can be observed in systems with a strong electron correlation, e.g., materials with large magnetoresistance and high-temperature superconductors.

 

Eventually, the scientists decided in favor of epitaxial films that form on the surface of a single-crystalline strontium titanate substrate (epitaxy is the regular, ordered growth of one crystalline substance on another). The researchers showed that these films can be used to create memristors for a new generation of computers.

 

“The innovation in this research lies in applying lithography, which allows the technology for miniaturizing resistive memory elements to be developed,” commented Andrei Ivanov, Associate Professor of the Solid-State Physics and Nanosystems Department at MEPhI.

 

Integrated Photonic Memristor Gives AI Applicability in Quantum Computing

By developing a photonic quantum memristor, researchers at the University of Vienna, the National Research Council (CNR) and the Polytechnic University of Milan may have found a way to link artificial intelligence (AI) and quantum computing. The researchers engineered a device that behaves like a memristor while encoding and transmitting quantum information and acting on quantum states.

 

 

References and Resources also include

https://spectrum.ieee.org/automaton/robotics/robotics-hardware/braininspired-robot-controller-uses-memristors-for-fast-efficient-movement?utm_source=dlvr.it&utm_medium=facebook&fbclid=IwAR35VshLzT82KJs88KYkiEqafKCwZJF_-yRXZHb7-9YgocrBQwim87LzbAI

 

 

 
