London, Jun 3 (IANS): A team of researchers, led by an India-origin scientist, has developed an electronic skin that can learn from feeling "pain" and could help create a new generation of smart robots with human-like sensitivity.
The team from the University of Glasgow developed the artificial skin with a new type of processing system based on synaptic transistors, which mimic the brain's neural pathways in order to learn.
In earlier generations of electronic skin, the input data from touch sensors would be sent to a computer to be processed.
In the new e-skin, described in the journal Science Robotics, a circuit built into the skin instead acts as an artificial synapse, reducing the input to a simple spike of voltage whose frequency varies with the level of pressure applied to the skin, speeding up the process of reaction.
The team used the varying output of that voltage spike to teach the skin appropriate responses to simulated pain, which would trigger the robot hand to react. By setting a threshold of input voltage to cause a reaction, the team could make the robot hand recoil from a sharp jab in the centre of its palm.
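To illustrate the principle described above in software terms only (the team's actual system does this in printed hardware), a minimal sketch might encode pressure as a spike frequency and trigger a recoil once that frequency crosses a set threshold. All function names and numbers below are illustrative assumptions, not details from the paper.

```python
# Illustrative sketch only: a software analogue of the synapse-like response
# described above. The real e-skin does this in printed hardware; all values
# and names here are hypothetical.

def pressure_to_spike_rate(pressure, max_pressure=10.0, max_rate_hz=100.0):
    """Map applied pressure to a spike frequency (harder press -> faster spikes)."""
    pressure = max(0.0, min(pressure, max_pressure))
    return max_rate_hz * pressure / max_pressure

def should_recoil(pressure, threshold_rate_hz=60.0):
    """Trigger a recoil reaction once the spike rate crosses the threshold."""
    return pressure_to_spike_rate(pressure) >= threshold_rate_hz

if __name__ == "__main__":
    for p in (1.0, 5.0, 9.0):  # light touch, firm press, sharp jab
        rate = pressure_to_spike_rate(p)
        print(f"pressure={p:.1f} -> {rate:.0f} Hz spikes, recoil={should_recoil(p)}")
```

In this toy version, only the sharp jab exceeds the threshold and produces a recoil, mirroring how the researchers set an input-voltage threshold to make the robot hand pull away from a jab to its palm.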
"What we've been able to create through this process is an electronic skin capable of distributed learning at the hardware level, which doesn't need to send messages back and forth to a central processor before taking action. Instead, it greatly accelerates the process of responding to touch by cutting down the amount of computation required," said Professor Ravinder Dahiya, of the University's James Watt School of Engineering.
"We believe that this is a real step forward in our work towards creating large-scale neuromorphic printed electronic skin capable of responding appropriately to stimuli," he added.
To build an electronic skin capable of a computationally efficient, synapse-like response, the researchers printed a grid of 168 synaptic transistors made from zinc-oxide nanowires directly onto a flexible plastic surface. Then, they connected the synaptic transistors to the skin sensors on the palm of a fully articulated, human-shaped robot hand.
When the sensor is touched, it registers a change in its electrical resistance -- a small change corresponds to a light touch, while a harder touch creates a larger change in resistance. This input is designed to mimic the way sensory neurons work in the human body.
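As a rough software analogue of that sensing step, purely to illustrate the resistance-to-touch mapping described above, one could imagine something like the following; the baseline resistance and scaling are invented for the example.

```python
# Illustrative only: convert a change in sensor resistance into a touch
# intensity, mimicking the behaviour described above. Values are invented.

BASELINE_OHMS = 1000.0  # hypothetical resting resistance of one skin sensor

def touch_intensity(measured_ohms, baseline_ohms=BASELINE_OHMS):
    """Return a 0..1 touch intensity from the relative change in resistance:
    a small change reads as a light touch, a large change as a hard press."""
    change = abs(measured_ohms - baseline_ohms) / baseline_ohms
    return min(change, 1.0)

print(touch_intensity(1050.0))  # small change -> light touch (0.05)
print(touch_intensity(1800.0))  # large change -> hard press (0.8)
```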
In other words, the robot hand learned to move away from a source of simulated discomfort through a process of onboard information processing that mimics how the human nervous system works.