Materials engineering has reached a significant milestone with the development of an artificial skin capable of replicating the complex perceptual abilities of human skin. The innovation, conceived by a team of researchers at the University of Cambridge, gives robots a sense of touch, allowing them to distinguish stimuli such as pressure and temperature and to detect damage. Published in Science Robotics, the advance is no longer a science fiction concept but a concrete reality with profound technological implications.
The technology behind multimodal perception
Conventional electronic skins typically rely on a separate sensor for each type of stimulus, an approach prone to interference between signals and to mechanical fragility.
The distinctive element of the Cambridge team's innovation lies in the simplicity and effectiveness of the material adopted: a single sensing layer made from a hydrogel, a gelatinous substance that is soft, elastic, and electrically conductive, and that can be molded into any shape. This inherent flexibility allows the electronic skin to be wrapped around complex structures such as robotic hands, giving it exceptional adaptability.
This skin's ability to "feel" stems from electrical impedance tomography (EIT), a technique that allowed the researchers to read signals from a network of 863,040 conductive pathways within the hydrogel membrane.
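As a rough intuition for how a single conductive sheet can yield so many distinct readings, the sketch below shows the adjacent-drive measurement pattern commonly used in EIT: current is injected between each pair of neighbouring electrodes in turn, while voltages are read across every other neighbouring pair. The electrode count and drive scheme here are illustrative assumptions, not details from the published work.

```python
# Minimal sketch of an adjacent ("neighbouring") EIT drive pattern.
# The number of electrodes is a made-up example, not the paper's layout.
N = 32  # hypothetical electrodes placed around the hydrogel boundary

def adjacent_drive_pattern(n_electrodes: int):
    """Yield (current_pair, voltage_pair) combinations for one EIT frame.

    Current is injected between each pair of neighbouring electrodes in turn,
    and voltage is read across every other neighbouring pair, giving
    n * (n - 3) independent readings per frame.
    """
    for i in range(n_electrodes):
        current_pair = (i, (i + 1) % n_electrodes)
        for j in range(n_electrodes):
            voltage_pair = (j, (j + 1) % n_electrodes)
            # Skip voltage pairs that share an electrode with the driving pair.
            if set(current_pair) & set(voltage_pair):
                continue
            yield current_pair, voltage_pair

measurements = list(adjacent_drive_pattern(N))
print(len(measurements))  # 32 * (32 - 3) = 928 readings per frame
```

Each full cycle of such readings forms one "frame" describing the electrical state of the whole sheet, and local changes in conductivity (a press, a cut, a temperature shift) alter many readings at once.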
The sensory information, though complex and overlapping, is made usable through data-driven techniques: a machine learning model is trained to recognize and separate the different signals, enabling the skin to identify at least six distinct types of multimodal stimuli.
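The sketch below illustrates what that data-driven step can look like in principle: a classifier maps a frame of impedance readings to a stimulus label. The synthetic data, the label set, and the choice of model (a random forest from scikit-learn) are assumptions for illustration only, not the authors' actual pipeline.

```python
# Illustrative sketch: train a classifier to label frames of impedance
# readings. Data, labels, and model choice are hypothetical stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Hypothetical stimulus classes the skin should tell apart.
LABELS = ["light_touch", "pressure", "heating", "cooling", "prick", "cut"]

# Stand-in dataset: one fixed-length impedance frame per sample
# (928 features, matching the toy drive pattern sketched above).
n_per_class, n_features = 200, 928
X = np.vstack([
    rng.normal(loc=k, scale=1.0, size=(n_per_class, n_features))
    for k in range(len(LABELS))
])
y = np.repeat(np.arange(len(LABELS)), n_per_class)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))

# At run time, each new frame of readings is classified the same way.
frame = rng.normal(loc=2, scale=1.0, size=(1, n_features))
print("predicted stimulus:", LABELS[clf.predict(frame)[0]])
```

In the published work the mapping from readings to stimuli is learned from real measurements on the hydrogel; the point of the sketch is simply that a single stream of overlapping electrical signals can be decoded into distinct stimulus types once a model has been trained on labeled examples.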
From robotic hands to sensitive systems
To demonstrate the versatility of the approach, the hydrogel was molded into the shape and size of an adult human hand. In tests, the skin was able to infer environmental conditions from variations in temperature or pressure, localize human touch, and generate proprioceptive data.
This approach opens new directions for the information-driven design of single-layer skins in sensitive systems. The potential impact is enormous: advances in robotics, more capable prosthetics, and richer human-machine interaction, bringing us ever closer to a future in which robots can "feel" the world around them with unprecedented sensitivity.