Unsupervised Event-based Learning of Optical Flow, Depth and Egomotion

Alex Zihao Zhu, Liangzhe Yuan, Kenneth Chaney, Kostas Daniilidis
Conference on Computer Vision and Pattern Recognition 2019


In this work, we propose a novel framework for unsupervised learning with event cameras that learns motion information from only the event stream. In particular, we propose an input representation of the events in the form of a discretized volume that maintains their temporal distribution, which we pass through a neural network to predict the motion of the events. This predicted motion is then used to remove the motion blur in the event image, and we propose a loss function, applied to the motion-compensated event image, that measures the remaining blur. We train two networks with this framework, one to predict optical flow and one to predict egomotion and depth, and evaluate them on the Multi Vehicle Stereo Event Camera dataset, along with qualitative results from a variety of scenes.
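
A minimal sketch (NumPy; the function and variable names here are hypothetical) of the discretized event volume described above: each event's polarity is split between the two nearest temporal bins with bilinear weights, so the volume preserves the timing of the events instead of collapsing them into a single image.

    import numpy as np

    def event_volume(xs, ys, ts, ps, H, W, B=9):
        """Accumulate events (integer pixel coordinates xs, ys; timestamps
        ts; polarities ps in {-1, +1}) into a B x H x W volume, with each
        event's polarity bilinearly weighted between adjacent time bins."""
        vol = np.zeros((B, H, W), dtype=np.float32)
        # Normalize timestamps to the bin range [0, B - 1].
        t = (B - 1) * (ts - ts.min()) / max(ts.max() - ts.min(), 1e-9)
        t0 = np.floor(t).astype(int)
        for dt in (0, 1):
            b = np.clip(t0 + dt, 0, B - 1)
            w = np.maximum(0.0, 1.0 - np.abs(t - (t0 + dt)))  # temporal weight
            np.add.at(vol, (b, ys, xs), ps * w)
        return vol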

Realtime Time Synchronized Event-based Stereo

Alex Zihao Zhu, Yibo Chen, Kostas Daniilidis
European Conference on Computer Vision 2018


In this work, we propose a novel event-based stereo method that addresses the problem of motion blur for a moving event camera. Our method uses the velocity of the camera and a range of disparities to synchronize the positions of the events, as if they had all been captured at a single point in time. We represent these events using a pair of novel time-synchronized event disparity volumes, which we show remove motion blur for pixels at the correct disparity in the volume while further blurring pixels at the wrong disparity. We then apply a novel matching cost over these volumes that rewards similarity between them while penalizing blurriness. We show that our method outperforms more expensive, smoothing-based event stereo methods by evaluating on the Multi Vehicle Stereo Event Camera dataset.
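
A minimal sketch (NumPy; names are hypothetical, and rotation is ignored for brevity) of one slice of a time-synchronized event disparity volume: under a candidate disparity, events are propagated to a common reference time using the image flow induced by the known camera velocity. At the correct disparity this deblurs the events; at the wrong disparity it smears them further, which is what the matching cost exploits.

    import numpy as np

    def sync_events(xs, ys, ts, d, t_ref, v, f, baseline, H, W):
        """Propagate events to time t_ref under disparity hypothesis d.
        v = (vx, vy): translational camera velocity (m/s); f: focal length
        in pixels; baseline in meters."""
        depth = f * baseline / max(d, 1e-6)   # depth implied by disparity d
        flow = np.asarray(v) * f / depth      # induced image flow (px/s)
        dt = ts - t_ref
        x_sync = np.round(xs - flow[0] * dt).astype(int)
        y_sync = np.round(ys - flow[1] * dt).astype(int)
        img = np.zeros((H, W), dtype=np.float32)
        ok = (x_sync >= 0) & (x_sync < W) & (y_sync >= 0) & (y_sync < H)
        np.add.at(img, (y_sync[ok], x_sync[ok]), 1.0)
        return img  # sharp at the correct disparity, smeared otherwise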

EV-FlowNet: Self-Supervised Optical Flow Estimation for Event-based Cameras

Alex Zihao Zhu, Liangzhe Yuan, Kenneth Chaney, Kostas Daniilidis
Robotics: Science and Systems 2018
Best Student Paper Finalist (1 of 3)


We present a novel deep learning architecture for predicting optical flow from only events, trained without any ground truth optical flow labels. The network is trained by applying a photoconsistency loss to the grayscale images captured alongside the events on a camera similar to the DAVIS sensor, warped using the flow predictions from the network. In addition, we provide a novel optical flow dataset for event-based cameras, with ground truth flow labels, on which we evaluate our method against the frame-based state of the art.
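
A minimal PyTorch sketch (hypothetical names; the released code linked below is the authoritative implementation) of such a photoconsistency loss: the later grayscale frame is warped back by the predicted flow and compared to the earlier frame with an L1 penalty.

    import torch
    import torch.nn.functional as F

    def photometric_loss(img0, img1, flow):
        """img0, img1: (B,1,H,W) grayscale frames; flow: (B,2,H,W), pixels."""
        B, _, H, W = img0.shape
        ys, xs = torch.meshgrid(torch.arange(H), torch.arange(W), indexing="ij")
        base = torch.stack((xs, ys), dim=0).float().to(flow)  # pixel grid (2,H,W)
        coords = base.unsqueeze(0) + flow                     # sample points in img1
        # Normalize coordinates to [-1, 1], as grid_sample expects.
        gx = 2.0 * coords[:, 0] / (W - 1) - 1.0
        gy = 2.0 * coords[:, 1] / (H - 1) - 1.0
        grid = torch.stack((gx, gy), dim=-1)                  # (B,H,W,2)
        warped = F.grid_sample(img1, grid, align_corners=True)
        return (warped - img0).abs().mean()                   # L1 photoconsistency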

The code for this project can be found here: https://github.com/daniilidis-group/EV-FlowNet.

The Multi Vehicle Stereo Event Camera Dataset: An Event Camera Dataset for 3D Perception

Alex Zihao Zhu, Dinesh Thakur, Tolga Özaslan, Bernd Pfrommer, Vijay Kumar, Kostas Daniilidis
Published in Robotics and Automation Letters 2018 and presented at the International Conference on Robotics and Automation 2018


The Multi Vehicle Stereo Event Camera dataset is a collection of data designed for the development of novel 3D perception algorithms for event-based cameras. Stereo event data is collected from a car, a motorbike, a hexacopter, and a handheld rig, and is fused with lidar, IMU, motion capture, and GPS measurements to provide ground truth pose and depth images. In addition, we provide images from a standard frame-based stereo camera pair for comparison with traditional techniques.

You can find the dataset here: https://daniilidis-group.github.io/mvsec/.

Event-based Visual Inertial Odometry

Alex Zihao Zhu, Nikolay Atanasov, Kostas Daniilidis
Conference on Computer Vision and Pattern Recognition 2017


In this paper, we present a novel visual inertial odometry algorithm using an event-based camera and an IMU. Building on our previous work on event-based feature tracking, we feed the tracked features into a state-of-the-art Extended Kalman Filter framework, the Multi-State Constraint Kalman Filter (MSCKF). The state estimates from the filter are then fed back into the feature tracker to reduce the dimensionality of the affine template matching step, while a new formulation using the events from the previous time step reduces the complexity of the optical flow estimation step. In addition, several outlier rejection steps are introduced to make the filter robust to bad tracks.
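
As one concrete example, here is a minimal sketch (NumPy/SciPy; names are hypothetical) of a standard outlier-rejection step of the kind mentioned above: a chi-square (Mahalanobis) gate that discards any track whose measurement residual is improbably large under the filter's own uncertainty.

    import numpy as np
    from scipy.stats import chi2

    def passes_chi2_gate(r, H, P, R, confidence=0.95):
        """r: measurement residual; H: measurement Jacobian; P: state
        covariance; R: measurement noise covariance."""
        S = H @ P @ H.T + R                      # innovation covariance
        d2 = float(r.T @ np.linalg.solve(S, r))  # squared Mahalanobis distance
        return d2 < chi2.ppf(confidence, df=r.size)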

For more results, please see my YouTube page.

The code for the feature tracking component of this research can be found here: https://github.com/daniilidis-group/event_feature_tracking.

Event-based Feature Tracking with Probabilistic Data Association

Alex Zihao Zhu, Nikolay Atanasov, Kostas Daniilidis
International Conference on Robotics and Automation 2017


We present a novel algorithm for tracking feature points using only the event stream.

The algorithm consists of:

  • A novel optical flow estimation technique over small spatiotemporal windows of events, within which the optical flow is assumed constant. The method models the data association between events generated by the same image point with a soft probability, and solves the data association problem jointly with the optical flow estimation (a sketch of a single iteration of this step follows this list).
  • A template alignment technique with an affine deformation assumption to remove feature drift.
  • A technique to vary the temporal window size in order to maintain constant optical flow within each window.
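
A minimal sketch (NumPy; names are hypothetical, and the method is compressed to a single expectation-maximization iteration) of the soft data-association step from the first bullet: events are warped to the window's start by the current flow, softly associated to template points with Gaussian weights (E-step), and the flow is then re-fit by weighted least squares (M-step).

    import numpy as np

    def em_flow_step(ev_xy, ev_t, template, v, sigma=1.0):
        """ev_xy: (N,2) event positions; ev_t: (N,) times from the window
        start; template: (M,2) feature points; v: (2,) current flow."""
        warped = ev_xy - np.outer(ev_t, v)  # propagate events to t = 0
        # E-step: soft association of each event to each template point.
        d2 = ((warped[:, None, :] - template[None, :, :]) ** 2).sum(-1)
        w = np.exp(-d2 / (2.0 * sigma ** 2))
        w /= w.sum(axis=1, keepdims=True) + 1e-12
        # M-step: re-fit the flow so warped events land on their soft
        # matches, minimizing sum_i ||ev_xy_i - ev_t_i * v - target_i||^2.
        target = w @ template
        num = ((ev_xy - target) * ev_t[:, None]).sum(axis=0)
        return num / max((ev_t ** 2).sum(), 1e-12)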

The code for this research can be found here: https://github.com/daniilidis-group/event_feature_tracking.

Controlling a Robotic Stereo Camera Under Image Quantization Noise

Charlie Freundlich, Yan Zhang, Alex Zihao Zhu, Philippos Mordohai, Michael Zavlanos
International Journal of Robotics Research 2017


We present an active control scheme that moves a robotic stereo camera to minimize the triangulation uncertainty of a set of points, as estimated by a Kalman filter, while maintaining field-of-view constraints.
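
A minimal sketch (hypothetical names) of the quantity such a controller trades off against its field-of-view constraints: propagating pixel quantization noise to first order through the stereo triangulation Z = f * b / d shows that depth uncertainty grows quadratically with depth, which is why viewpoint selection matters.

    def depth_std(depth, focal_px, baseline_m, disparity_std_px=0.5):
        """First-order propagation of disparity noise sigma_d through
        Z = f * b / d gives sigma_Z = Z**2 * sigma_d / (f * b)."""
        return depth ** 2 * disparity_std_px / (focal_px * baseline_m)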


Fast, Autonomous Flight in GPS-Denied and Cluttered Environments

K. Mohta, M. Watterson, Y. Mulgaonkar, S. Liu, C. Qu, A. Makineni, K. Saulnier, K. Sun, A. Zhu, J. Delmerico, K. Karydis, N. Atanasov, G. Loianno, D. Scaramuzza, K. Daniilidis, C. J. Taylor, V. Kumar
Journal of Field Robotics 2017


A report on our work in the DARPA Fast Lightweight Autonomy program, whose goal is to fly a lightweight quadrotor at speeds of up to 20 m/s through challenging outdoor forest and cluttered indoor environments.