As a promising candidate for exhibiting quantum computational supremacy, Gaussian boson sampling (GBS) is designed to exploit the ease of experimental preparation of Gaussian states. However, sufficiently large and inevitable experimental noise might render GBS classically simulable. In this work, we formalize this intuition by establishing a sufficient condition for approximate polynomial-time classical simulation of noisy GBS, in the form of an inequality between the input squeezing parameter, the overall transmission rate, and the quality of the photon detectors. Our result serves as a nonclassicality test that must be passed by any quantum computational supremacy demonstration based on GBS. We show that, for most linear-optical architectures, where photon loss increases exponentially with the circuit depth, noisy GBS loses its quantum advantage in the asymptotic limit. Our results thus delineate intermediate-sized regimes where GBS devices might considerably outperform classical computers at modest noise levels. Finally, we find that increasing the amount of input squeezing helps evade our classical simulation algorithm, which suggests a potential route to mitigating photon loss.
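To give a sense of the noise model behind such a condition, recall a standard quantum-optics fact (background, not a result specific to this work): a single-mode squeezed vacuum with squeezing parameter $r$ that passes through a pure-loss channel of transmission $\eta$ has quadrature variances

$$V_{\mp} = \eta\, e^{\mp 2r} + (1 - \eta),$$

with the vacuum variance normalized to 1. Loss drags the squeezed variance $\eta\, e^{-2r} + (1 - \eta)$ back toward the vacuum level, and imperfect detectors add further effective noise; roughly speaking, a sufficient condition of the kind described in the abstract asks whether any residual nonclassicality survives this combined noise, with efficient classical simulation becoming possible once it does not.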
               