Universal bound on sampling bosons in linear optics and its computational implications

Man Hong Yung, Xun Gao, Joonsuk Huh

Research output: Contribution to journal › Article › peer-review


Abstract

In linear optics, photons are scattered in a network through passive optical elements, including beam splitters and phase shifters, leading to many intriguing applications in physics, such as Mach-Zehnder interferometry, the Hong-Ou-Mandel effect, and tests of fundamental quantum mechanics. Here we present a fundamental limit on the transition amplitudes of bosons, applicable to all physical linear optical networks. Apart from boson sampling, this transition bound leads to many other interesting applications, including the behavior of Bose-Einstein condensates (BEC) in optical networks, counterparts of the Hong-Ou-Mandel effect for multiple photons, and the approximation of permanents of matrices. In addition, this general bound implies the existence of a polynomial-time randomized algorithm for estimating the transition amplitudes of bosons, which represents a solution to an open problem raised by Aaronson and Hance (Quantum Inf Comput 2012; 14: 541-59). Consequently, this bound implies that computational decision problems encoded in linear optics, prepared and detected in the Fock basis, can be solved efficiently by classical computers to within additive error. Furthermore, our result also leads to a classical sampling algorithm that can be applied to calculate the many-body wave functions and the S-matrix of bosonic particles.
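The abstract alludes to randomized additive-error estimation of matrix permanents, the theme of the Aaronson-Hance open problem it addresses. As background, the sketch below implements the classic estimator of that kind, Gurvits's randomized algorithm (based on Glynn's formula), which the cited work generalizes; it is not the paper's own algorithm or bound, and the function name and sample count are illustrative choices.

```python
import random

def glynn_estimate(A, num_samples=20000, seed=0):
    """Gurvits's randomized estimator for the permanent of an n x n matrix A.

    Sample x uniformly from {-1, +1}^n and average
        (prod_j x_j) * prod_i (A x)_i .
    Each sample is an unbiased estimate of per(A) with magnitude at most
    ||A||^n, so O(1/eps^2) samples estimate per(A) to within additive error
    eps * ||A||^n; for submatrices of unitaries, ||A|| <= 1.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    n = len(A)
    total = 0.0
    for _ in range(num_samples):
        x = [rng.choice((-1.0, 1.0)) for _ in range(n)]
        sample = 1.0
        for xj in x:                       # sign factor prod_j x_j
            sample *= xj
        for row in A:                      # product prod_i (A x)_i
            sample *= sum(a * xj for a, xj in zip(row, x))
        total += sample
    return total / num_samples

# Example: per([[1, 1], [1, 1]]) = 2, and the estimate converges to 2.
print(glynn_estimate([[1.0, 1.0], [1.0, 1.0]]))
```

Expanding the product of row sums, only terms in which every x_j appears an even number of times survive the expectation, which forces the surviving terms to range over permutations; this is why the average is unbiased for per(A).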

Original language: English
Pages (from-to): 719-729
Number of pages: 11
Journal: National Science Review
Volume: 6
Issue number: 4
DOIs
Publication status: Published - 1 Jul 2019

Bibliographical note

Publisher Copyright:
© The Author(s) 2019. Published by Oxford University Press on behalf of China Science Publishing & Media Ltd.

All Science Journal Classification (ASJC) codes

  • General

