'Mini brains' help robots recognise pain


Tuesday, 27 October, 2020



Using a brain-inspired approach, scientists from Nanyang Technological University, Singapore (NTU Singapore) have developed a way for robots to have the artificial intelligence (AI) to recognise pain and to self-repair when damaged.

Described in the journal Nature Communications, the NTU system has AI-enabled sensor nodes to process and respond to ‘pain’ arising from pressure exerted by a physical force. It also allows a robot to detect and repair its own damage when ‘injured’, without the need for human intervention.

Currently, robots use a network of sensors to generate information about their immediate environment. For example, a disaster rescue robot uses camera and microphone sensors to locate a survivor under debris and then pulls the person out, guided by touch sensors on its arms. A factory robot working on an assembly line uses vision to guide its arm to the right location and touch sensors to determine whether the object is slipping when picked up.

Today’s sensors typically do not process information themselves but send it to a single large, powerful central processing unit, where learning occurs. As a result, existing robots are usually heavily wired, which can lead to delayed response times. They are also susceptible to damage that requires maintenance and repair, which can be lengthy and costly.

The NTU approach instead embeds AI into the network of sensor nodes, connected to multiple small, less powerful processing units that act like ‘mini brains’ distributed on the robotic skin. This means learning happens locally, and the robot’s wiring requirements and response time are reduced five- to tenfold compared with conventional robots, the scientists say. In addition, combining the system with a type of self-healing ion gel material means that the robots, when damaged, can recover their mechanical functions without human intervention.
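
To picture how local processing cuts the traffic to a central unit, here is a minimal software sketch. It is purely illustrative and not the NTU implementation — the class names, thresholds and event format are all assumptions for the example. Each sensor node classifies its own pressure readings and forwards only sparse ‘touch’ or ‘pain’ events to a lightweight coordinator, rather than streaming raw data to one central processor.

```python
# Minimal sketch of the distributed idea only; thresholds, class names and the
# event format are assumptions for illustration, not the NTU system.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class SensorNode:
    """A skin patch with its own small processor: readings are classified locally."""
    node_id: int
    pain_threshold: float = 0.7    # assumed normalised pressure treated as 'pain'
    touch_threshold: float = 0.2   # assumed level below which contact is ignored
    events: List[str] = field(default_factory=list)

    def process(self, pressure: float) -> Optional[str]:
        """Handle one reading locally; return an event only when something matters."""
        if pressure >= self.pain_threshold:
            event = f"node {self.node_id}: pain ({pressure:.2f})"
        elif pressure >= self.touch_threshold:
            event = f"node {self.node_id}: touch ({pressure:.2f})"
        else:
            return None                      # nothing is sent upstream
        self.events.append(event)
        return event


class RobotSkin:
    """A thin coordinator that only sees the sparse events the nodes choose to send."""

    def __init__(self, num_nodes: int) -> None:
        self.nodes = [SensorNode(i) for i in range(num_nodes)]

    def sense(self, readings: List[float]) -> List[str]:
        reported = []
        for node, pressure in zip(self.nodes, readings):
            event = node.process(pressure)
            if event is not None:
                reported.append(event)
        return reported


if __name__ == "__main__":
    skin = RobotSkin(num_nodes=4)
    # One frame of readings, one value per node (0 = no contact, 1 = hard press).
    print(skin.sense([0.05, 0.30, 0.90, 0.10]))
    # Only two events cross the 'wire'; the raw stream never leaves the nodes.
```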

“For robots to work together with humans one day, one concern is how to ensure they will interact safely with us,” said Associate Professor Arindam Basu, co-lead author of the study. “For that reason, scientists around the world have been finding ways to bring a sense of awareness to robots, such as being able to ‘feel’ pain, to react to it and to withstand harsh operating conditions. However, the complexity of putting together the multitude of sensors required, and the resultant fragility of such a system, are a major barrier to widespread adoption.

“Our work has demonstrated the feasibility of a robotic system that is capable of processing information efficiently with minimal wiring and circuits. By reducing the number of electronic components required, our system should become affordable and scalable. This will help accelerate the adoption of a new generation of robots in the marketplace.”

To teach the robot how to recognise pain and learn from damaging stimuli, the research team fashioned memtransistors — ‘brain-like’ electronic devices capable of memory and information processing — to serve as artificial pain receptors and synapses. Through lab experiments, the team demonstrated how the robot was able to learn to respond to injury in real time. They also showed that the robot continued to respond to pressure even after damage, confirming the robustness of the system.
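
As a loose software analogy for an element that combines memory with processing — it does not model the actual memtransistor physics, and the threshold values and sensitisation rule are assumptions — the toy receptor below lowers its pain threshold each time it registers damage, so a stimulus that was initially below threshold can be recognised as ‘pain’ after an earlier injury.

```python
# Toy analogy only: a 'pain receptor' whose internal state stands in for device
# memory. All numbers and the adaptation rule are assumptions for illustration.
class ToyPainReceptor:
    def __init__(self, threshold: float = 0.8, sensitisation: float = 0.1) -> None:
        self.threshold = threshold          # initial intensity counted as 'pain'
        self.sensitisation = sensitisation  # how much each painful event lowers it
        self.min_threshold = 0.3            # floor so the receptor never saturates

    def stimulate(self, intensity: float) -> bool:
        """Return True if the stimulus registers as pain; remember painful events."""
        painful = intensity >= self.threshold
        if painful:
            # The 'memory' effect: past damage makes future damage easier to detect.
            self.threshold = max(self.min_threshold,
                                 self.threshold - self.sensitisation)
        return painful


if __name__ == "__main__":
    receptor = ToyPainReceptor()
    for intensity in (0.85, 0.75, 0.75, 0.40):
        print(f"intensity {intensity:.2f} -> pain={receptor.stimulate(intensity)}, "
              f"threshold now {receptor.threshold:.2f}")
    # The 0.75 stimuli register as pain only because the first injury lowered
    # the threshold, illustrating memory and processing in one element.
```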

When ‘injured’ with a cut from a sharp object, the robot quickly loses mechanical function. But the molecules in the self-healing ion gel begin to interact, causing the robot to ‘stitch’ its ‘wound’ together and to restore its function while maintaining high responsiveness.

“The self-healing properties of these novel devices help the robotic system to repeatedly stitch itself together when injured with a cut or scratch, even at room temperature,” said Rohit Abraham John, first author of the study. “This mimics how our biological system works, much like the way human skin heals on its own after a cut.

“In our tests, our robot can ‘survive’ and respond to unintentional mechanical damage arising from minor injuries such as scratches and bumps, while continuing to work effectively. If such a system were used with robots in real-world settings, it could contribute to savings in maintenance.”

Associate Professor Nripan Mathews, co-lead author of the study, concluded, “Conventional robots carry out tasks in a structured programmable manner, but ours can perceive their environment, learning and adapting behaviour accordingly. Most researchers focus on making more and more sensitive sensors, but do not focus on the challenges of how they can make decisions effectively. Such research is necessary for the next generation of robots to interact effectively with humans.

“In this work, our team has taken an approach that is off the beaten path, by applying new learning materials, devices and fabrication methods for robots to mimic the human neurobiological functions. While still at a prototype stage, our findings have laid down important frameworks for the field, pointing the way forward for researchers to tackle these challenges.”

The research team is now looking to collaborate with industry partners and government research labs to enhance their system for larger scale applications.
