Event-based cameras provide a new visual sensing model by detecting changes in image intensity asynchronously at each pixel, effectively removing the notion of a fixed frame rate. This opens the possibility for visual-inertial localization and mapping in extremely high-speed and high-dynamic-range situations where traditional cameras fail. This frame-less mode of operation, however, precludes direct intensity-gradient computation and necessitates new techniques for feature tracking and visual odometry. Our work proposes event-based features by grouping events into small spatiotemporal windows whose duration is determined by the length of the optical flow. Our feature tracking method alternates between probabilistic data association of events to features and optical flow computation via expectation maximization (EM) of a translation model over these associations. To enable long feature tracks, we also compute an affine deformation with respect to the initial feature point and use the resulting residual as a measure of persistence. We propose a visual-inertial odometry algorithm that fuses the event-based features with inertial measurements to provide 6-DOF camera state estimates at a rate proportional to the camera velocity. The inferred trajectory is used to reduce the dimensionality of affine template matching during feature tracking, while events from the previous time step are used to reduce the complexity of optical flow estimation.
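To make the EM alternation concrete, below is a minimal sketch of the E-step/M-step loop for a single feature, assuming an isotropic Gaussian association model with a hand-picked sigma and a pure-translation flow; it omits the multi-feature data association, outlier handling, and affine template alignment of the full method, and all names and parameters are illustrative rather than the paper's implementation.

```python
import numpy as np

def em_translation_flow(events_xy, events_t, feature_xy, t0,
                        n_iters=10, sigma=2.0):
    """Alternate between soft event-to-feature association (E-step) and a
    weighted least-squares translation/flow update (M-step) for one feature."""
    events_xy = np.asarray(events_xy, dtype=float)   # (N, 2) event pixel coordinates
    dt = np.asarray(events_t, dtype=float) - t0      # (N,) time since window start
    v = np.zeros(2)                                  # initial flow guess (px/s)
    for _ in range(n_iters):
        # E-step: residual of each event under the current translation model;
        # events the model explains well receive high association weight.
        resid = events_xy - feature_xy - dt[:, None] * v
        w = np.exp(-0.5 * np.sum(resid ** 2, axis=1) / sigma ** 2)
        # M-step: translation minimizing the weighted squared residuals,
        # v = sum_i w_i * dt_i * (x_i - f) / sum_i w_i * dt_i^2.
        denom = np.sum(w * dt ** 2) + 1e-12
        v = np.sum(w[:, None] * dt[:, None] * (events_xy - feature_xy), axis=0) / denom
    return v, w

# Toy usage: events generated by a feature moving right at 40 px/s plus noise.
rng = np.random.default_rng(0)
t = rng.uniform(0.0, 0.05, size=200)
xy = np.array([30.0, 30.0]) + np.outer(t, [40.0, 0.0]) + rng.normal(0.0, 0.5, (200, 2))
v_hat, _ = em_translation_flow(xy, t, np.array([30.0, 30.0]), t0=0.0)
print(v_hat)  # close to [40, 0]
```

In the full system the duration of each spatiotemporal window is set so that the feature moves a fixed number of pixels under the estimated flow, and the affine alignment against the initial template supplies the residual used to decide when a track should be dropped.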
@inproceedings{Zhu_EVIO_CVPR17,
  author    = {A. Zhu and N. Atanasov and K. Daniilidis},
  title     = {Event-based Visual Inertial Odometry},
  booktitle = {IEEE Conf. on Computer Vision and Pattern Recognition (CVPR)},
  year      = {2017},
  doi       = {10.1109/CVPR.2017.616}
}
@inproceedings{Zhu_EVFT_ICRA17,
  author    = {A. Zhu and N. Atanasov and K. Daniilidis},
  title     = {Event-based Feature Tracking with Probabilistic Data Association},
  booktitle = {IEEE Int. Conf. on Robotics and Automation (ICRA)},
  year      = {2017},
  doi       = {10.1109/ICRA.2017.7989517}
}