Modern imaging systems rely on cascades of bulky spherical optics to form images with minimal aberrations. While these systems provide high-quality images, their functionality comes at the cost of increased size and weight. One route to reducing a system’s complexity is computational imaging, in which much of the aberration correction and functionality of the optics is shifted to post-processing in software. Alternatively, a designer can miniaturize the optics by replacing them with diffractive optical elements, which mimic the functionality of refractive systems in a far more compact form factor. Meta-optics are an extreme example of such diffractive elements: quasi-periodic arrays of resonant subwavelength optical antennas that impart spatially varying changes on a wavefront. While computational imaging and meta-optics are each promising avenues toward simplifying optical systems, their synergistic combination can further enhance system performance and enable advanced capabilities.
In this talk, I will present a method that combines these two techniques to enable ultrathin optics for full-color, varifocal imaging across the entire visible spectrum, as well as high-precision depth sensing. I will also discuss the use of computational techniques to design meta-optics with exotic behaviors that elude intuition-driven design, and to perform computation directly on incident light, with applications in optical information processing, sensing, and computing. By combining meta-optics with a software backend, we can realize compact imaging systems with unprecedented functionalities, including broadband aberration-free imaging, depth sensing, and optical computing. We believe such hybrid digital-optical systems will create a new research field of “Software Defined Optics”, akin to Software Defined Radio, in which software is used to simplify the hardware.