Robotland Bookstore

Tuesday, October 02, 2012

Robots with Skin

Prof. Dan Popa and his Next Generation Robotics research group at the University of Texas at Arlington are investigating fundamental design questions for multi-functional robotic skin sensors. Funded by the NSF, the researchers will investigate how to optimize the placement of robotic skin sensors on assistive robotic devices, how to have robot and human "learn" to use the skin sensors efficiently, and how to quantitatively assess the impact of this assistive technology on humans. The approach is to design and fabricate integrated micro-scale sensors in conjunction with iterative simulation and experimental studies of the physical human-robot interaction enabled by this technology.
Credit: Vladimir Lumelsky, NASA GSFC
Co-robots of the future will share their living spaces with humans and, like people, will wear sensor skins and clothing that must be interconnected, fitted, cleaned, repaired, and replaced. Beyond aesthetic purposes that increase societal acceptance, these sensorized garments will also enhance robot perception of the environment and enable extraordinary levels of safety, cooperation, and therapy for humans. The proposed research will unlock near-term as well as unforeseen applications of robotic skin with broad applicability, especially to home assistance, medical rehabilitation, and prosthetics.

Credit: RoboSKIN
A European consortium behind the RoboSKIN project, led by Prof. Giorgio Cannata of the University of Genova, is investigating and developing a range of new robot capabilities based on tactile feedback provided by robotic skin covering large areas of the robot body. Until now, a principled investigation of these topics has been limited by the lack of tactile sensing technologies enabling large-scale experimental activities, since skin technologies and embedded tactile sensors have mostly been demonstrated only at the prototype stage. The new capabilities will improve the ability of robots to operate effectively and safely in unconstrained environments, as well as their ability to communicate and cooperate with each other and with humans.

German researchers at the Excellence Cluster CoTeSys at the Technical University Munich (TUM) have developed small hexagonal plates which, when joined together, provide a tactile-sensitive skin for autonomous robots. This will not only help robots navigate better in their environments, it will also enable robot self-perception for the first time. A single robotic arm has already been partially equipped with the sensors, demonstrating that the concept works.
The video below shows the newly developed tactile module in comparison with a human hand. On the back side, the local controller is visible next to the four combined power and data ports. On the front side, one BMA150 3-axis accelerometer (black box, middle), six PT1000 temperature sensors (blue boxes), and four GP2S60 proximity sensors (outer black boxes) are visible. In the second part, the KUKA robotic arm reacts to the different modalities of a single sensor: proximity, acceleration, and temperature.
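To make the multi-modal idea concrete, here is a minimal sketch of how readings from one such hexagonal module might be represented and checked against per-modality reaction thresholds. The sensor counts follow the description above (one 3-axis accelerometer, six temperature sensors, four proximity sensors), but the data structure, threshold values, and function names are purely hypothetical illustrations, not the actual CoTeSys/TUM controller logic.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical reaction thresholds; the real controller's values are not published here.
PROXIMITY_THRESHOLD = 0.5   # normalized reflectance from a GP2S60-style sensor (0..1)
ACCEL_THRESHOLD = 2.0       # m/s^2 deviation from rest (BMA150-style 3-axis reading)
TEMP_THRESHOLD = 30.0       # degrees Celsius (PT1000-style sensor)

@dataclass
class TactileModuleReading:
    """One sample from a single hexagonal tactile module."""
    accel: List[float]        # 3-axis acceleration, m/s^2 (gravity assumed removed upstream)
    temps: List[float]        # six temperature sensors, deg C
    proximity: List[float]    # four proximity sensors, normalized 0..1

def triggered_modalities(r: TactileModuleReading) -> List[str]:
    """Return the modalities whose readings exceed their reaction thresholds."""
    events = []
    if any(p > PROXIMITY_THRESHOLD for p in r.proximity):
        events.append("proximity")
    # Euclidean magnitude of the acceleration deviation
    if sum(a * a for a in r.accel) ** 0.5 > ACCEL_THRESHOLD:
        events.append("acceleration")
    if any(t > TEMP_THRESHOLD for t in r.temps):
        events.append("temperature")
    return events

reading = TactileModuleReading(
    accel=[0.1, 0.0, 2.5],
    temps=[22.0] * 6,
    proximity=[0.1, 0.7, 0.0, 0.0],
)
print(triggered_modalities(reading))  # prints ['proximity', 'acceleration']
```

A per-module classification like this is one plausible way the arm in the video could decide how to react; the actual system presumably fuses readings from many modules across the skin.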
