Proceedings Paper

Time Lens: Event-based Video Frame Interpolation

Publisher

IEEE Computer Society
DOI: 10.1109/CVPR46437.2021.01589

Funding

  1. Huawei Zurich Research Center
  2. National Centre of Competence in Research (NCCR) Robotics through the Swiss National Science Foundation (SNSF)
  3. European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme [864042]

Abstract

State-of-the-art frame interpolation methods use optical flow for generating intermediate frames, but this can lead to errors in highly dynamic scenarios. Event cameras address this limitation by providing additional visual information in the blind-time between frames.
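As a concrete illustration of how such a sensor reports changes, here is a minimal NumPy sketch of the idealized event-generation model (my own simplification, not code from the paper): a pixel fires an event whenever its log-brightness changes by more than a contrast threshold. The threshold value and the frame-to-frame formulation are assumptions; real event cameras operate asynchronously per pixel with microsecond-level timestamps.

    import numpy as np

    def simulate_events(log_prev, log_curr, contrast_threshold=0.2):
        """Idealized event model: a pixel fires an event when its log-brightness
        changes by more than the contrast threshold; polarity encodes the sign."""
        diff = log_curr - log_prev
        polarity = np.zeros_like(diff, dtype=np.int8)
        polarity[diff >= contrast_threshold] = 1    # brightness increased
        polarity[diff <= -contrast_threshold] = -1  # brightness decreased
        return polarity

    # Example: a dark-to-bright edge appearing in the right half of the image
    prev = np.log(np.full((4, 4), 0.2))
    curr = prev.copy()
    curr[:, 2:] = np.log(0.8)
    print(simulate_events(prev, curr))  # +1 events only where brightness rose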
State-of-the-art frame interpolation methods generate intermediate frames by inferring object motions in the image from consecutive key-frames. In the absence of additional information, first-order approximations, i.e., optical flow, must be used, but this choice restricts the types of motions that can be modeled, leading to errors in highly dynamic scenarios. Event cameras are novel sensors that address this limitation by providing auxiliary visual information in the blind-time between frames. They asynchronously measure per-pixel brightness changes and do this with high temporal resolution and low latency. Event-based frame interpolation methods typically adopt a synthesis-based approach, where predicted frame residuals are directly applied to the key-frames. However, while these approaches can capture non-linear motions, they suffer from ghosting and perform poorly in low-texture regions with few events. Thus, synthesis-based and flow-based approaches are complementary. In this work, we introduce Time Lens, a novel method that leverages the advantages of both. We extensively evaluate our method on three synthetic and two real benchmarks, where we show an improvement of up to 5.21 dB in PSNR over state-of-the-art frame-based and event-based methods. Finally, we release a new large-scale dataset recorded in highly dynamic scenarios, aimed at pushing the limits of existing methods.
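To make the claimed complementarity concrete, the following is a minimal, hypothetical PyTorch sketch of fusing a flow-based (warping) estimate with a synthesis-based estimate through a learned per-pixel blending mask. The helper names, layer sizes, and the simple mask network are my own illustrative assumptions, not the Time Lens architecture; in the paper's setting both branches would additionally be conditioned on the events between the key-frames.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def backward_warp(frame, flow):
        """Warp a (B, C, H, W) frame with a dense (B, 2, H, W) flow field
        using bilinear sampling (the core operation of a flow-based branch)."""
        b, _, h, w = frame.shape
        ys, xs = torch.meshgrid(torch.arange(h), torch.arange(w), indexing="ij")
        base = torch.stack((xs, ys), dim=0).to(frame).unsqueeze(0)   # (1, 2, H, W)
        coords = base + flow
        # Normalize pixel coordinates to [-1, 1] as expected by grid_sample
        gx = 2.0 * coords[:, 0] / (w - 1) - 1.0
        gy = 2.0 * coords[:, 1] / (h - 1) - 1.0
        grid = torch.stack((gx, gy), dim=-1)                         # (B, H, W, 2)
        return F.grid_sample(frame, grid, align_corners=True)

    class BlendFusion(nn.Module):
        """Predict a per-pixel mask that mixes the warped and synthesized frames,
        so each branch can dominate where it is reliable."""
        def __init__(self, channels=3):
            super().__init__()
            self.mask_net = nn.Sequential(
                nn.Conv2d(2 * channels, 16, 3, padding=1), nn.ReLU(),
                nn.Conv2d(16, 1, 3, padding=1), nn.Sigmoid(),
            )

        def forward(self, warped, synthesized):
            alpha = self.mask_net(torch.cat((warped, synthesized), dim=1))
            return alpha * warped + (1.0 - alpha) * synthesized

    key_frame = torch.rand(1, 3, 64, 64)    # previous key-frame
    flow = torch.zeros(1, 2, 64, 64)        # dummy flow to the target timestamp
    synthesized = torch.rand(1, 3, 64, 64)  # dummy output of a synthesis branch
    fused = BlendFusion()(backward_warp(key_frame, flow), synthesized)
    print(fused.shape)                      # torch.Size([1, 3, 64, 64])

The intent of such a blend is that the synthesized estimate can dominate where motion is non-linear and events are dense, while the warped estimate can take over in low-texture regions with few events.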
