Event-based 6-DoF object tracking with distance field reaching 130 Hz


This is the project page for the paper published in IEEE Robotics and Automation Letters (RA-L).

In this work, we present an event-based 6-DoF object tracking method using distance fields. By leveraging the high temporal resolution of event cameras, our system achieves a tracking speed of 130 Hz, making it suitable for high-speed robotic manipulation and visual servoing tasks.



Paper


[RA-L'26] Yufan Kang, Ryoichi Ishikawa, Guillaume Caron, Takeshi Oishi. Event-based 6-DoF object tracking with distance field reaching 130 Hz. IEEE Robotics and Automation Letters (RA-L), 2026.

Video


[Video Coming Soon]

Citation

BibTeX:

@article{kang2026event,
  title={Event-based 6-DoF object tracking with distance field reaching 130 Hz},
  author={Kang, Yufan and Ishikawa, Ryoichi and Caron, Guillaume and Oishi, Takeshi},
  journal={IEEE Robotics and Automation Letters},
  year={2026},
  publisher={IEEE}
}

Plain Text:

Y. Kang, R. Ishikawa, G. Caron, and T. Oishi, "Event-based 6-DoF object tracking with distance field reaching 130 Hz," IEEE Robotics and Automation Letters (RA-L), 2026.

Abstract / Description

This research focuses on the challenge of real-time 3D object tracking using event-based vision. Unlike frame-based methods, our approach processes asynchronous events to update the object's 6-DoF pose with minimal latency. We utilize a distance field representation to efficiently match events with the object model, achieving robust tracking at 130 Hz on standard hardware.
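The core idea of matching events against a distance field can be illustrated with a minimal 2D sketch. Everything below is illustrative, not the paper's implementation: the brute-force field construction stands in for a proper distance transform, the function names are hypothetical, and a real tracker would minimize the alignment error over the full 6-DoF pose rather than just evaluate it.

```python
import math

def build_distance_field(contour, width, height):
    # Brute-force distance field: for every pixel, the Euclidean distance
    # to the nearest point on the projected model contour. (A real system
    # would precompute this efficiently, e.g. with a distance transform.)
    return [[min(math.hypot(x - cx, y - cy) for cx, cy in contour)
             for x in range(width)] for y in range(height)]

def event_alignment_error(field, events):
    # Sum of distance-field lookups at the event pixels: small when events
    # fall on the projected object contour, large otherwise. A 6-DoF tracker
    # would minimize this quantity with respect to the pose parameters.
    return sum(field[y][x] for x, y in events)

# Toy example: a few contour points and events on / off the contour.
contour = [(2, 2), (2, 5), (5, 2), (5, 5)]
field = build_distance_field(contour, 8, 8)

print(event_alignment_error(field, [(2, 2), (5, 5)]))  # 0.0 (on contour)
print(event_alignment_error(field, [(0, 7), (7, 0)]))  # > 0 (off contour)
```

Because each event is a single pixel lookup, the error (and its gradient with respect to pose, via the field's spatial derivatives) can be updated asynchronously per event, which is what makes high-rate tracking feasible.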

Dataset (5events16)

We provide the 5events16 dataset, containing the event-based sequences (ROS bags) and 3D object models used in our experiments. Each bag file contains raw event data, camera calibration, and ground truth poses.

3D Models (.ply):

ROS Bags (.bag):