Image credit: Vladimir Lumelsky, NASA GSFC
Researchers at the CoTeSys cluster at the Technical University Munich (TUM) have developed small hexagonal plates which, when joined together, provide a touch-sensitive skin for autonomous robots. This will not only help robots navigate better in their environments, but will also enable robot self-perception for the first time. A single robotic arm has already been partially covered with the sensor modules, demonstrating that the concept works.
The video below shows the newly developed tactile module next to a human hand for scale. On the back, the local controller is visible alongside the four combined power and data ports. On the front are one BMA150 3-axis accelerometer (black box, center), six PT1000 temperature sensors (blue boxes), and four GP2S60 proximity sensors (outer black boxes). In the second part of the video, a KUKA robotic arm reacts to the different modalities of a single module: proximity, acceleration, and temperature.
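To make the module's multi-modal layout concrete, here is a minimal sketch of how one such skin cell's readings might be represented and checked against per-modality thresholds. The class, field names, and threshold values are all illustrative assumptions; only the sensor counts (one 3-axis accelerometer, six temperature sensors, four proximity sensors) come from the article, and the actual CoTeSys firmware is not publicly specified here.

```python
from dataclasses import dataclass, field

@dataclass
class TactileModule:
    """Hypothetical model of one hexagonal skin module.
    Sensor counts follow the article: one BMA150 3-axis accelerometer,
    six PT1000 temperature sensors, four GP2S60 proximity sensors."""
    accel: tuple = (0.0, 0.0, 0.0)                               # m/s^2 per axis
    temps: list = field(default_factory=lambda: [22.0] * 6)      # degrees C
    proximity: list = field(default_factory=lambda: [0.0] * 4)   # normalized 0..1

def triggered_modalities(m, accel_thresh=2.0, temp_thresh=30.0, prox_thresh=0.5):
    """Return which modalities exceed simple illustrative thresholds
    (the thresholds are made up for this sketch)."""
    events = []
    if max(abs(a) for a in m.accel) > accel_thresh:
        events.append("acceleration")
    if max(m.temps) > temp_thresh:
        events.append("temperature")
    if max(m.proximity) > prox_thresh:
        events.append("proximity")
    return events

# Example: a tap on the module plus a nearby object
m = TactileModule(accel=(0.1, 3.5, 0.0), proximity=[0.0, 0.8, 0.1, 0.0])
print(triggered_modalities(m))  # ['acceleration', 'proximity']
```

In a real skin, each module's local controller would perform this kind of per-modality event detection before forwarding data over the shared power/data ports, keeping network traffic low as the number of modules grows.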