Show HN: Put this touch sensor on a robot and learn super precise tasks

Original link: https://any-skin.github.io


The paper introduces 'AnySkin', a novel tactile sensor that aims to improve upon previous sensors by addressing issues of flexibility, durability, and ease of replacement. Compared to vision or proprioception, touch has been underutilized in robotics. The designers took inspiration from 'ReSkin' but made improvements such as separating the sensing hardware from the skin interface, making integration as easy as attaching a phone case. Moreover, unlike many current solutions, policies learned with this device transfer across different sensor instances (cross-instance generalization). Three main contributions are presented: a simpler manufacturing process and design tool for creating adhesive-free, robust, interchangeable magnetic tactile sensors; a characterization of slip detection and policy learning using the AnySkin sensor; and finally, a demonstration that models trained on one AnySkin apply to other AnySkins, contrasted against established tactile systems like DIGIT and ReSkin.


While tactile sensing is widely accepted as an important and useful sensing modality, its use pales in comparison to other sensory modalities like vision and proprioception. AnySkin addresses the critical challenges of versatility, replaceability, and data reusability, which have so far impeded the development of an effective solution.

Building on the simple design of ReSkin and decoupling the sensing electronics from the sensing interface, AnySkin simplifies integration, making it as straightforward as putting on a phone case and connecting a charger. Furthermore, AnySkin is the first sensor with cross-instance generalizability of learned manipulation policies.

This work makes three key contributions: first, we introduce a streamlined fabrication process and a design tool for creating an adhesive-free, durable, and easily replaceable magnetic tactile sensor; second, we characterize slip detection and policy learning with an AnySkin sensor; and finally, we demonstrate the generalizability of models trained on one instance of AnySkin to new instances, comparing against popular existing tactile solutions like DIGIT and ReSkin.
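To make the slip-detection contribution concrete, here is a minimal, hypothetical sketch of a threshold-based slip detector over a magnetometer stream. It assumes an AnySkin-style skin reporting 15 channels per frame (5 magnetometers × 3 axes); the channel count, window size, and threshold are illustrative assumptions, and the paper itself uses learned models rather than this hand-tuned baseline.

```python
# Hedged sketch: naive slip detection from a magnetometer stream.
# Assumption: each frame is a list of 15 flux readings (5 sensors x 3 axes).
# Slip tends to produce abrupt changes in the magnetic signature, so we
# flag frames where the windowed mean frame-to-frame change is large.

def frame_delta(prev, curr):
    """L2 norm of the change between two consecutive frames."""
    return sum((c - p) ** 2 for p, c in zip(prev, curr)) ** 0.5

def detect_slip(frames, window=3, threshold=0.5):
    """Return one boolean per frame: True where recent deltas suggest slip."""
    deltas = [0.0] + [
        frame_delta(frames[i - 1], frames[i]) for i in range(1, len(frames))
    ]
    flags = []
    for i in range(len(frames)):
        recent = deltas[max(0, i - window + 1): i + 1]
        flags.append(sum(recent) / len(recent) > threshold)
    return flags
```

For example, a stream of identical frames produces no slip flags, while an abrupt jump in one channel trips the detector for roughly one window's worth of frames afterward. A learned classifier, as in the paper, would replace the fixed threshold with features trained on labeled slip events.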
