Robots have already become an indispensable part of our lives. They have revolutionized auto manufacturing, making plants safer and products more reliable while reducing the number of people involved in the process. However, even inside a modern auto plant, robots have not been able to replace the human touch in every area. People are better than robots at manipulating complex shapes and threading them together, exactly the skills required to attach parts to engines. Quality control is another example: while robots with sensors can test spot welds, people can run their hands over the surface of the metal body, feeling for imperfections.
Currently, most robots are relatively rigid machines that make unnatural movements. Soft robots differ from their traditional counterparts in some important ways: they have little or no hard internal structure, and instead use a combination of muscle-like actuation and deformation to grasp things and move about. Rather than using motors, cables or gears, soft robots are often animated by pressurized air or liquids. In many cases, soft robotic designs mimic natural, evolved biological forms, which is why they are also called bio-inspired robots. This, combined with their soft exteriors, can make soft robots more suitable for interaction with living things, or even for use as human exoskeletons.
For humans, touch plays a vital role when we move our bodies. Touch, combined with sight, is crucial for tasks such as picking up objects – hard or soft, light or heavy, warm or cold – without damaging them. In the field of robotic manipulation, in which a robot hand or gripper has to pick up an object, adding the sense of touch could remove uncertainties in dealing with soft, fragile and deformable objects.
Medical applications are now a main driver behind the demand for flexible and robust force sensing. For example, smart skin could be used to restore sensory feedback to patients with skin damage or peripheral neuropathy (numbness or tingling). It could also be used to give prosthetic hands basic touch-sensing ability.
Soft robotic arms with a human touch could also come in handy for carrying wounded soldiers without causing further injury. “We have lost medics throughout the years because they have the courage to go forward and rescue their comrades under fire. With the newer technology, with the robotic vehicles we are using even today to examine and to detonate IEDs [improvised explosive devices], those same vehicles can go forward and retrieve casualties,” said Major General Steve Jones, commander of the Army Medical Department Center. Evacuating casualties was only one of the roles for robots in battlefield medicine that Jones discussed. Another is delivering medical supplies to dangerous areas, supporting troops operating behind enemy lines.
However, providing a sense of touch has proved quite a challenge. Quantifying touch in engineering terms requires not only precise knowledge of the amount of external force applied to a touch sensor, but also the force’s exact position, its angle, and how it will interact with the object being manipulated. Then there is the question of how many of these sensors a robot would need. Understanding the physical mechanisms of touch sensing in the biological world provides great insight when it comes to designing the robotic equivalent, a smart skin. But a significant barrier to the development of smart skin is the electronics required: a robot skin might need to contain hundreds or even thousands of touch sensors, a challenging engineering task.
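To make the quantities above concrete, here is a minimal sketch of what a single touch reading might look like in software. All names and numbers are invented for illustration; a real smart skin would stream thousands of such readings from a grid of sensors.

```python
import math
from dataclasses import dataclass

@dataclass
class TouchEvent:
    """One hypothetical touch reading: magnitude, position, and angle."""
    magnitude_n: float   # total applied force, in newtons
    x_mm: float          # contact position on the skin patch
    y_mm: float
    angle_deg: float     # angle between the force vector and the surface normal

    def normal_force(self) -> float:
        """Component pressing into the surface."""
        return self.magnitude_n * math.cos(math.radians(self.angle_deg))

    def shear_force(self) -> float:
        """Component sliding along the surface (relevant to slip detection)."""
        return self.magnitude_n * math.sin(math.radians(self.angle_deg))

touch = TouchEvent(magnitude_n=2.0, x_mm=12.5, y_mm=3.0, angle_deg=30.0)
print(round(touch.normal_force(), 3))  # 1.732
print(round(touch.shear_force(), 3))   # 1.0
```

Splitting the force into normal and shear components is one reason the angle matters: the same 2 N applied at a steeper angle presses less and slides more, which changes how a grasp should respond.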
Recently, researchers from MIT and Harvard developed a scalable tactile glove and combined it with artificial intelligence. Sensors distributed uniformly over the hand can be used to identify individual objects, estimate their weight, and explore the typical tactile patterns that emerge while grasping them. The researchers created a glove with 548 sensors assembled on a knitted fabric containing a piezoresistive film (whose electrical resistance changes under pressure or strain) connected by a network of conductive thread electrodes.
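The identification step can be sketched in miniature. This is not the MIT/Harvard code (which feeds 548-sensor frames to a neural network); it is a hedged illustration of the same idea using a handful of made-up pressure values and a nearest-neighbor match.

```python
import math

def distance(a, b):
    """Euclidean distance between two flattened pressure maps."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(reading, reference_maps):
    """Return the label of the stored pressure map closest to `reading`."""
    return min(reference_maps, key=lambda label: distance(reading, reference_maps[label]))

# Invented reference pressure maps recorded while grasping known objects.
reference_maps = {
    "mug":    [0.9, 0.8, 0.1, 0.7, 0.2],
    "ball":   [0.5, 0.5, 0.5, 0.5, 0.5],
    "pencil": [0.1, 0.9, 0.1, 0.0, 0.0],
}

print(classify([0.85, 0.75, 0.15, 0.65, 0.25], reference_maps))  # mug
```

The principle is the same at full scale: each grasp produces a spatial pressure pattern, and objects are recognized by how closely a new pattern matches patterns seen before.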
Scientists develop robot that can feel
A group of roboticists in the Department of Biomedical Engineering at the Georgia Institute of Technology in Atlanta has developed a robot arm that moves and finds objects by touch. In a paper published in the International Journal of Robotics Research, the Georgia Tech group described a robot arm that was able to reach into a cluttered environment and use “touch,” along with computer vision, to complete exacting tasks.
Dr. Kemp said that, using digital simulations and a simple set of primitive robot behaviors, the researchers were able to develop algorithms that gave the arm qualities seeming to mimic human behavior. For example, the robot was able to bend, compress and slide objects. Also, given parameters designed to limit how hard it could press on an object, the arm was able to pivot around objects automatically.
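The force-limiting behavior can be sketched as a simple control loop. Everything here is invented for illustration, not the Georgia Tech controller: the arm keeps advancing toward a goal, but backs off as soon as the sensed contact force exceeds a cap, so it settles against an obstacle instead of crushing it.

```python
FORCE_LIMIT_N = 5.0  # hypothetical maximum allowed contact force
STEP_MM = 1.0        # hypothetical motion per control tick

def advance(position_mm, sensed_force_n):
    """One control tick: move forward only while contact force is safe."""
    if sensed_force_n >= FORCE_LIMIT_N:
        return position_mm - STEP_MM  # retreat to relieve pressure
    return position_mm + STEP_MM      # free space or light contact: keep going

def simulated_force(position_mm):
    """Toy contact model: force grows linearly once the arm passes 10 mm."""
    return max(0.0, (position_mm - 10.0) * 2.0)

pos = 0.0
for _ in range(40):
    pos = advance(pos, simulated_force(pos))
print(pos)  # settles near where the contact force reaches the limit
```

A real compliant arm does this mechanically as well as in software: the springs at the joints absorb contact forces faster than any control loop could react.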
The arm was designed to essentially have “springs” at its joints, making it “compliant,” a term roboticists use for components that are more flexible and less precise than conventional robotic mechanisms. Compliance has become increasingly important as a new generation of safer robots has emerged. The robot also has a fabric-based artificial “skin” equipped with force and thermal sensors that can sense pressure or touch, enabling the home-care robot to lightly touch different materials and identify them.
“These environments tend to have clutter,” said Charles C. Kemp, director of the Healthcare Robotics Lab at Georgia Tech. “In a home, you can have lots of objects on a shelf, and the robot can’t see beyond that first row of objects.” The combination of sensors can help the home-care robot tell the difference between, for example, wood and metal. As IEEE Spectrum notes, the technique mimics the way human skin uses thermal conductivity to classify different materials.
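The thermal trick works because a skin patch warmed slightly above ambient cools much faster against a good conductor like metal than against wood. A minimal sketch, with entirely illustrative thresholds:

```python
def classify_material(temp_drop_per_s):
    """Guess the material from how fast a heated sensor cools on contact.

    The thresholds below are invented for illustration; a real system
    would calibrate them against measured thermal responses.
    """
    if temp_drop_per_s > 0.5:      # rapid heat loss: good conductor
        return "metal"
    if temp_drop_per_s > 0.05:     # moderate heat loss
        return "wood"
    return "fabric/foam"           # insulator: barely cools

print(classify_material(1.2))   # metal
print(classify_material(0.2))   # wood
```

This is the same cue humans use when metal feels “colder” than wood at the same room temperature: it is not colder, it just drains heat from the skin faster.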
New robot has a human touch
A group led by Robert Shepherd, assistant professor of mechanical and aerospace engineering and principal investigator of the Organic Robotics Lab, has published a paper describing how stretchable optical waveguides act as curvature, elongation and force sensors in a soft robotic hand.
“Most robots today have sensors on the outside of the body that detect things from the surface,” said doctoral student Huichan Zhao, lead author of “Optoelectronically Innervated Soft Prosthetic Hand via Stretchable Optical Waveguides.” “Our sensors are integrated within the body, so they can actually detect forces being transmitted through the thickness of the robot, a lot like we and all organisms do when we feel pain, for example.”
Optical waveguides have been in use since the early 1970s for numerous sensing functions, including tactile, position and acoustic sensing. Fabrication was originally a complicated process, but the advent over the last 20 years of soft lithography and 3-D printing has led to the development of elastomeric sensors that are easily produced and incorporated into soft robotic applications.
Shepherd’s group employed a four-step soft lithography process to produce the core (through which light propagates) and the cladding (the outer surface of the waveguide), which also houses the LED (light-emitting diode) and the photodiode. The more the prosthetic hand deforms, the more light is lost through the core. That variable loss of light, as detected by the photodiode, is what allows the prosthesis to “sense” its surroundings.
“If no light was lost when we bend the prosthesis, we wouldn’t get any information about the state of the sensor,” Shepherd said. “The amount of loss is dependent on how it’s bent.” The group used its optoelectronic prosthesis to perform a variety of tasks, including grasping and probing for both shape and texture. Most notably, the hand was able to scan three tomatoes and determine, by softness, which was the ripest.
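The sensing principle reduces to a calibration problem: measure the received optical power, compute the loss relative to the unbent baseline, and map that loss back to a bend angle. The numbers below are invented, assuming a simple linear calibration for illustration.

```python
import math

BASELINE_UW = 100.0  # hypothetical optical power with the waveguide straight (microwatts)

def light_loss_db(received_uw):
    """Optical loss relative to the unbent waveguide, in decibels."""
    return 10.0 * math.log10(BASELINE_UW / received_uw)

def estimate_bend_deg(received_uw, deg_per_db=12.0):
    """Map loss to bend angle with a hypothetical linear calibration factor."""
    return deg_per_db * light_loss_db(received_uw)

print(round(estimate_bend_deg(100.0), 1))  # 0.0 -- no loss, no bend
print(round(estimate_bend_deg(50.0), 1))   # 36.1 -- half the light lost
```

In practice the loss-versus-curvature curve need not be linear; what matters, as Shepherd notes, is that it is a known, monotonic relationship, so every photodiode reading corresponds to one state of the sensor.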
This work was supported by a grant from the Air Force Office of Scientific Research, and made use of the Cornell NanoScale Science and Technology Facility and the Cornell Center for Materials Research, both of which are supported by the National Science Foundation.
Soft Robotic Fingers Recognize Objects by Feel
Daniela Rus and her team at the Distributed Robotics Lab at MIT’s CSAIL have created bendable, stretchable robotic fingers made of silicone rubber that can lift and handle objects as thin as a piece of paper and as delicate as an egg. Rus incorporated “bend sensors” into the silicone fingers so that they send back information on the location and curvature of the object being grasped. The robot can then pick up an unfamiliar object and compare the data to existing clusters of data points from past objects.
“By embedding flexible bend sensors into each finger, we got an idea of how much the finger bends, and we can close the loop from how much pressure we apply,” says Katzschmann. “In our case, we were using a piston-based closed pneumatic system.” Currently, the robot can acquire only three data points from a single grasp, so its algorithms cannot distinguish between objects that are very similar in size. The researchers hope that further advances in sensors will someday enable the system to distinguish between dozens of diverse objects.
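The matching step described above can be sketched with invented values: each grasp yields one bend reading per finger, and an unfamiliar object is assigned to the stored cluster whose centroid lies closest to the new reading.

```python
import math

# Hypothetical cluster centroids of bend-sensor readings from past grasps
# (one value per finger, three fingers per grasp).
clusters = {
    "egg":      [0.30, 0.32, 0.29],
    "paper":    [0.05, 0.04, 0.06],
    "soda_can": [0.55, 0.57, 0.54],
}

def identify(grasp):
    """Match a new three-finger grasp to the nearest stored cluster."""
    return min(clusters, key=lambda label: math.dist(grasp, clusters[label]))

print(identify([0.31, 0.30, 0.33]))  # egg
```

With only three numbers per grasp, clusters for similar-sized objects overlap heavily, which is exactly why the team wants denser sensing before the system can tell apart dozens of objects.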