Vision-guided artificial skin gives robots a sense of touch
Japanese researchers have constructed a 3D vision-guided artificial skin for robots that enables tactile sensing with high performance, opening doors to applications in medicine, health care and industry. Their work has been published in IEEE Transactions on Robotics.
Robots can be found in a wide variety of roles in medicine, rehabilitation, agriculture and marine navigation. Since many of these roles require human contact, robots are expected to become adept at interacting with humans in a safe and intelligent manner. One way to accomplish this goal is by endowing robots with the ability to perceive touch; accordingly, attempts have been made to develop artificial ‘skins’ that provide tactile sensing and allow robots to be more aware of their surrounding environment. However, the endeavour remains challenging.
“The main challenge lies in mimicking the inherent complexity of natural skin structure that has a particularly high density of mechanoreceptors with specialised functions such as sensing pressure, vibrations, temperature and pain,” said Associate Professor Van Anh Ho from the Japan Advanced Institute of Science and Technology (JAIST). “All approaches so far have only focused on developing a skin-like structure with a matrix of different sensors without considering the bulk of wires, electronic components and the risk of damage from frequent contact.”
Working together with JAIST doctoral student Lac Van Duong, Prof Ho developed a high-performance, vision-based artificial sensing system that is low cost, has a relatively simple structure and is scalable. Named TacLINK, the system can process tactile information and even determine contact force and contact geometry upon interacting with the surroundings.
TacLINK’s structure is essentially a transparent acrylic tube (serving as a rigid bone frame) covered by a continuous soft artificial skin with a sensing area of about 500 cm². The researchers fabricated the artificial skin from silicone rubber for its high elasticity and smoothness; moreover, the material could be inflated to change its form and stiffness. Instead of embedding sensors or electronic components inside the skin, they printed an array of markers on its surface to track its deformation, greatly reducing bulk, cost and the risk of damage.
The vision system consisted of two co-axial cameras arranged to form a stereo camera that tracked the 3D displacement of the markers on the inner wall of the skin. In addition, researchers employed a finite element model (FEM) to estimate the structural stiffness of the skin. By combining the data from both these sources, they were able to reconstruct the contact geometry and contact force distribution simultaneously. Unlike in previous studies, this method worked for multiple contact points.
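The core idea can be illustrated with a simple linear-elasticity sketch. This is not the authors’ code: it assumes a hypothetical stiffness matrix from a toy 1D spring-chain FEM, whereas TacLINK uses a full 3D FEM of the silicone skin. Given the 3D marker displacements recovered by the stereo cameras, nodal contact forces follow from the stiffness relation f = K·u.

```python
import numpy as np

def spring_chain_stiffness(n_nodes, k=1.0):
    """Assemble the stiffness matrix of (n_nodes - 1) identical springs
    in series -- a toy stand-in for the skin's FEM stiffness matrix."""
    K = np.zeros((n_nodes, n_nodes))
    for i in range(n_nodes - 1):
        # Each spring contributes the standard 2x2 element stiffness.
        K[i:i+2, i:i+2] += k * np.array([[1.0, -1.0], [-1.0, 1.0]])
    return K

# Hypothetical marker displacements (mm), as would be recovered by
# stereo tracking of the markers on the skin's inner wall: a single
# indentation centred on the middle marker.
u = np.array([0.0, 0.5, 1.0, 0.5, 0.0])

K = spring_chain_stiffness(len(u), k=2.0)
f = K @ u  # estimated nodal contact forces
print(f)   # -> [-1.  0.  2.  0. -1.]
```

Because the relation is linear per node, a distributed load across many markers resolves into per-node forces in one matrix product, which is what lets this approach handle multiple simultaneous contact points.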
Prof Ho is hopeful about the creation of a future generation of touch-sensing-enabled robotic devices, saying, “The artificial skin used in our study can be easily fabricated by the casting method and can, therefore, be implemented on other parts of robots, such as fingers, legs, chests and heads, and even for smart prosthetics for humans, allowing a disabled person to perceive sensations the same way as a normal human.
“In addition, it can also be used to design various sensory devices in medicine, health care and industry. In fact, it is especially suited for the development of robotic systems in the post-COVID era to enable remote service with robotic avatars.”