Program Summary
Detection of photons—the fundamental particles of light—is ubiquitous, but performance limitations of existing photon detectors hinder the effectiveness of applications such as light/laser detection and ranging (LIDAR/LADAR), photography, astronomy, quantum information science, medical imaging, microscopy, and communications. In all of these applications, performance could be improved by replacing classical, analog light detectors with high-performance photon-counting detectors.
Today, the performance of different classes of photon detectors (e.g., semiconductor, superconductor, and biological detectors) varies significantly with respect to the key metrics of timing jitter, dark counts, maximum count rate, bandwidth, efficiency, photon-number resolution, operating temperature, and array/pixel size. Is it possible for a new detector design to combine the best performance characteristics of all other detector technologies in a single device? Is it possible for a new design to far exceed current performance in all metrics simultaneously? Or does a fundamental, physics-based "tradespace" force performance tradeoffs between these metrics? If so, what are the boundaries of this tradespace? Is our current understanding of the theory of photon detection sufficient to answer these questions?
The Fundamental Limits of Photon Detection (Detect) Program will establish the first-principles limits of photon detector performance by developing new models of photon detection in a variety of technology platforms, and by testing those models in proof-of-concept experiments.