Paper Title
Enhancing Generalizable 6D Pose Tracking of an In-Hand Object with Tactile Sensing
Paper Authors
Paper Abstract
When manipulating an object to accomplish complex tasks, humans rely on both vision and touch to keep track of the object's 6D pose. However, most existing object pose tracking systems in robotics rely exclusively on visual signals, which hinder a robot's ability to manipulate objects effectively. To address this limitation, we introduce TEG-Track, a tactile-enhanced 6D pose tracking system that can track previously unseen objects held in hand. From consecutive tactile signals, TEG-Track optimizes object velocities from marker flows when slippage does not occur, or regresses velocities using a slippage estimation network when slippage is detected. The estimated object velocities are integrated into a geometric-kinematic optimization scheme to enhance existing visual pose trackers. To evaluate our method and to facilitate future research, we construct a real-world dataset for visual-tactile in-hand object pose tracking. Experimental results demonstrate that TEG-Track consistently enhances state-of-the-art generalizable 6D pose trackers in synthetic and real-world scenarios. Our code and dataset are available at https://github.com/leolyliu/TEG-Track.
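The abstract's core idea can be illustrated with a minimal sketch: when the tactile contact is not slipping, gel markers move rigidly with the object, so their mean flow approximates the object's velocity; when slippage is detected, a learned regressor takes over; the resulting velocity then drives a first-order kinematic pose update. All names below (`estimate_velocity`, `integrate_pose`, the planar pose representation, and the stub slippage model) are illustrative assumptions, not TEG-Track's actual interfaces; the real system uses a trained slippage estimation network and a full geometric-kinematic optimization over 6D poses.

```python
def estimate_velocity(marker_flow, dt, slipping, slip_model=None):
    """Estimate in-hand object velocity from tactile marker flow.

    marker_flow: list of (dx, dy) displacements of tactile gel markers
                 between two consecutive frames.
    Without slippage, markers move rigidly with the object, so the mean
    marker displacement divided by dt approximates the object velocity.
    With slippage, defer to a learned model (a stub here; TEG-Track
    trains a slippage estimation network for this case).
    """
    if slipping:
        return slip_model(marker_flow, dt)
    n = len(marker_flow)
    vx = sum(dx for dx, _ in marker_flow) / (n * dt)
    vy = sum(dy for _, dy in marker_flow) / (n * dt)
    return (vx, vy)


def integrate_pose(pose, velocity, dt):
    """First-order kinematic update of a planar pose (x, y, theta).

    A simplification of the geometric-kinematic scheme: the estimated
    velocity propagates the previous pose to the next frame, where a
    visual tracker would then refine it.
    """
    x, y, theta = pose
    vx, vy = velocity
    return (x + vx * dt, y + vy * dt, theta)
```

For example, a uniform marker flow of 1 mm per 0.1 s frame (with no slippage) yields a 10 mm/s velocity estimate, which shifts the pose prior accordingly before visual refinement.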