Paper Title

Event-based Visual Tracking in Dynamic Environments

Paper Authors

Irene Perez-Salesa, Rodrigo Aldana-Lopez, Carlos Sagues

Abstract

Visual object tracking under challenging conditions of motion and light can be hindered by the capabilities of conventional cameras, prone to producing images with motion blur. Event cameras are novel sensors suited to robustly perform vision tasks under these conditions. However, due to the nature of their output, applying them to object detection and tracking is non-trivial. In this work, we propose a framework to take advantage of both event cameras and off-the-shelf deep learning for object tracking. We show that reconstructing event data into intensity frames improves the tracking performance in conditions under which conventional cameras fail to provide acceptable results.
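
The abstract describes a two-stage pipeline: event data are first reconstructed into intensity frames, which are then fed to an off-the-shelf frame-based deep learning tracker. The snippet below is a minimal sketch of that idea, not the authors' implementation: it substitutes a simple event-accumulation placeholder for the learned reconstruction network the paper relies on, and the names `reconstruct_frame`, `run_tracker`, and the toy detector are illustrative assumptions.

```python
# Minimal sketch of the events -> intensity frames -> off-the-shelf tracker
# pipeline described in the abstract. The reconstruction step is a crude
# accumulation placeholder; a learned reconstruction network would replace it.
import numpy as np

def reconstruct_frame(events, height, width):
    """Accumulate events (x, y, polarity, t) into a rough intensity-like frame."""
    frame = np.zeros((height, width), dtype=np.float32)
    for x, y, p, _t in events:
        frame[y, x] += 1.0 if p > 0 else -1.0
    # Normalize to [0, 255] so a conventional frame-based detector can consume it.
    frame -= frame.min()
    if frame.max() > 0:
        frame *= 255.0 / frame.max()
    return frame.astype(np.uint8)

def run_tracker(frame, detector):
    """Apply an off-the-shelf frame-based detector/tracker to the reconstructed frame."""
    return detector(frame)  # e.g., bounding boxes from a pretrained model

if __name__ == "__main__":
    # Synthetic events: a cluster of positive-polarity events around (x=60, y=40).
    rng = np.random.default_rng(0)
    xs = rng.integers(55, 65, size=200)
    ys = rng.integers(35, 45, size=200)
    events = [(int(x), int(y), 1, 0.0) for x, y in zip(xs, ys)]

    frame = reconstruct_frame(events, height=128, width=128)
    # Stand-in detector returning a fixed box; a real pipeline would plug in
    # a pretrained object detector or tracker here.
    boxes = run_tracker(frame, detector=lambda f: [("object", 55, 35, 65, 45)])
    print(boxes)
```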
