Article

Metasurface Generation of Paired Accelerating and Rotating Optical Beams for Passive Ranging and Scene Reconstruction

Journal

ACS Photonics
Volume 7, Issue 6, Pages 1529-1536

Publisher

American Chemical Society
DOI: 10.1021/acsphotonics.0c00354

Keywords

computational imaging; depth sensors; metasurfaces; extended depth of focus; ranging; wavefront coding

Funding

  1. Samsung GRO grant
  2. UW Reality Lab
  3. Facebook
  4. Google
  5. Huawei
  6. Washington Research Foundation Distinguished Investigator Award
  7. National Science Foundation [NNCI-1542101, 1337840, 0335765]
  8. National Institutes of Health
  9. Molecular Engineering & Sciences Institute
  10. Clean Energy Institute
  11. Washington Research Foundation
  12. M. J. Murdock Charitable Trust
  13. Altatech
  14. ClassOne Technology
  15. GCE Market
  16. SPTS
  17. National Science Foundation, Division of Electrical, Communications and Cyber Systems, Directorate for Engineering [1337840]

Abstract

Depth measurements are vital for many emerging technologies with applications in augmented reality, robotics, gesture detection, and facial recognition. These applications, however, demand compact, low-power systems beyond the capabilities of many state-of-the-art depth cameras. While active illumination techniques can enable precise scene reconstruction, they increase power consumption, and stereo systems require extended form factors to separate their viewpoints. Here, we exploit a single, spatially multiplexed aperture of nanoscatterers to demonstrate a solution that replicates the functionality of a high-performance depth camera typically comprising a spatial light modulator, polarizer, and multiple lenses. Using cylindrical nanoscatterers that can arbitrarily modify the phase of an incident wavefront, we passively encode two complementary optical responses to depth information in a scene. The designed metasurfaces simultaneously generate a focused accelerating beam and a focused rotating beam, exploiting the propagation invariance of these wavefronts to produce paired, adjacent images in a single camera snapshot. Compared with conventional depth-from-defocus methods, this technique improves depth precision and depth of field simultaneously. By decoding the captured data in software, our system produces a fully reconstructed image and a transverse depth map, providing an optically passive ranging solution. Our reconstruction algorithm accounts for the field curvature of the metasurface by calculating the change in Gouy phase over the field of view, enabling a fractional ranging error of 1.7%. We demonstrate a precise, visible-wavelength, polarization-insensitive metasurface depth camera with a compact 2 mm² aperture.
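The decoding step described in the abstract maps the measured rotation of the rotating-beam point spread function (PSF) to scene depth. The following Python sketch is a minimal illustration of that idea, assuming a hypothetical calibration in which the two-lobe PSF angle tracks the Gouy phase as an arctangent of defocus; the function names, the moment-based angle estimator, and the calibration constants theta_focus, z_r, and gamma are illustrative assumptions, not the authors' actual reconstruction pipeline.

```python
import numpy as np

def lobe_orientation(psf: np.ndarray) -> float:
    """Estimate the orientation (radians) of a two-lobed rotating PSF from
    the principal axis of its intensity second moments. The estimate is
    ambiguous modulo pi, which bounds the unambiguous ranging interval."""
    y, x = np.indices(psf.shape)
    w = psf / psf.sum()
    cx, cy = (w * x).sum(), (w * y).sum()
    mxx = (w * (x - cx) ** 2).sum()
    myy = (w * (y - cy) ** 2).sum()
    mxy = (w * (x - cx) * (y - cy)).sum()
    return 0.5 * np.arctan2(2.0 * mxy, mxx - myy)

def depth_from_rotation(theta, theta_focus, z_r, gamma):
    """Invert a hypothetical Gouy-phase rotation model,
        theta = theta_focus + gamma * arctan(dz / z_r),
    to recover the defocus distance dz. Here z_r (a Rayleigh-range-like
    scale) and gamma (rotation rate) are assumed calibration constants,
    not values taken from the paper."""
    return z_r * np.tan((theta - theta_focus) / gamma)

# Example: synthesize two Gaussian lobes rotated by 30 degrees, recover
# the angle, then map it to a defocus under the assumed calibration.
yy, xx = np.mgrid[-32:32, -32:32]
rot = np.deg2rad(30.0)
lx, ly = 10.0 * np.cos(rot), 10.0 * np.sin(rot)
psf = (np.exp(-((xx - lx) ** 2 + (yy - ly) ** 2) / 8.0)
       + np.exp(-((xx + lx) ** 2 + (yy + ly) ** 2) / 8.0))
theta = lobe_orientation(psf)
print(np.rad2deg(theta))                     # ~30 degrees
print(depth_from_rotation(theta, theta_focus=0.0, z_r=1.0,
                          gamma=np.deg2rad(90.0)))  # defocus in units of z_r
```

In the paper itself, the rotation-to-depth mapping is calibrated across the field of view to account for field curvature via the Gouy phase; the single arctangent model above is only a stand-in for that calibration.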
