More buck-per-shot: Why learning trumps mitigation in noisy quantum sensing

Published on arXiv, 2024

Recommended citation: A. Ijaz et al., arXiv:2410.00197 (2024)


Abstract

Quantum sensing is one of the most promising applications of quantum technologies. However, reaching the ultimate sensitivities allowed by the laws of quantum mechanics is challenging in realistic scenarios where noise is present. While several strategies have been proposed to counter the detrimental effects of noise, they come at the cost of additional measurement shots. Given that shots are a precious resource for sensing, since in principle infinitely many measurements would yield infinite precision, care must be taken to guarantee that any shot not spent directly on sensing still produces a metrological improvement. In this work, we study whether investing shots in error mitigation, inference techniques, or combinations thereof can improve the sensitivity of a noisy quantum sensor on a (shot) budget. We present a detailed bias-variance error analysis for various sensing protocols. Our results show that the costs of zero-noise extrapolation techniques outweigh their benefits. We also find that pre-characterizing a quantum sensor via inference techniques leads to the best performance, provided the sensor is sufficiently stable.
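
As a rough illustration of the shot-budget accounting discussed in the abstract, the toy Monte Carlo sketch below compares the bias-squared, variance, and mean squared error of a phase estimate when a fixed shot budget is either (a) spent entirely at the native noise level and inverted through a pre-characterized signal model, or (b) split across two noise levels for a linear zero-noise extrapolation. This is not the paper's actual analysis or protocols: the Ramsey-style signal model, the decay rate `gamma`, the noise-amplification factor, and all parameter values are assumptions chosen only to make the bias-variance bookkeeping concrete.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Toy model (assumed, not taken from the paper) ----------------------
phi_true = 0.7          # phase to be sensed
gamma = 0.3             # assumed decay rate of the noisy signal
shots = 1000            # total shot budget per estimate
trials = 20000          # Monte Carlo repetitions

def prob(phi, lam):
    """Excited-state probability of a Ramsey-style signal at noise scale lam."""
    return 0.5 * (1.0 + np.exp(-lam * gamma) * np.cos(phi))

def estimate_direct(n_shots):
    """Spend the whole budget at the native noise level and invert the
    pre-characterized signal model (decay gamma assumed known)."""
    p_hat = rng.binomial(n_shots, prob(phi_true, 1.0)) / n_shots
    c = np.clip((2.0 * p_hat - 1.0) / np.exp(-gamma), -1.0, 1.0)
    return np.arccos(c)

def estimate_zne(n_shots, lam_boost=3.0):
    """Split the budget over two noise levels and linearly extrapolate the
    measured probability to zero noise before inverting the ideal signal."""
    half = n_shots // 2
    p1 = rng.binomial(half, prob(phi_true, 1.0)) / half
    p2 = rng.binomial(half, prob(phi_true, lam_boost)) / half
    # Linear extrapolation to lam = 0 from the points at lam = 1 and lam_boost
    p0 = p1 - (p2 - p1) / (lam_boost - 1.0)
    c = np.clip(2.0 * p0 - 1.0, -1.0, 1.0)
    return np.arccos(c)

def summarize(name, estimates):
    """Report the bias-variance decomposition of the mean squared error."""
    bias = estimates.mean() - phi_true
    var = estimates.var()
    print(f"{name:>8s}: bias^2 = {bias**2:.2e}, var = {var:.2e}, "
          f"MSE = {bias**2 + var:.2e}")

summarize("direct", np.array([estimate_direct(shots) for _ in range(trials)]))
summarize("ZNE", np.array([estimate_zne(shots) for _ in range(trials)]))
```

In this toy setting the linear extrapolation inherits both a model bias (the assumed signal decays exponentially in the noise scale, not linearly) and a variance amplified by the extrapolation coefficients and the halved per-point shot count. These are exactly the kinds of costs a bias-variance analysis must weigh against the bias removed by mitigation; the sketch only illustrates that bookkeeping and does not reproduce the paper's results.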