
Precise measurement of quantum observables with neural-network estimators

Giacomo Torlai,1,∗ Guglielmo Mazzola,2 Giuseppe Carleo,1 and Antonio Mezzacapo3

1 Center for Computational Quantum Physics, Flatiron Institute, New York, NY 10010, USA
2 IBM Research Zurich, Säumerstrasse 4, 8803 Rüschlikon, Switzerland
3 IBM T.J. Watson Research Center, Yorktown Heights, NY 10598, USA
The measurement precision of modern quantum simulators is intrinsically constrained by the limited set of measurements that can be efficiently implemented on hardware. This fundamental limitation is particularly severe for quantum algorithms where complex quantum observables are to be precisely evaluated. To achieve precise estimates with current methods, prohibitively large amounts of sample statistics are required in experiments. Here, we propose to reduce the measurement overhead by integrating artificial neural networks with quantum simulation platforms. We show that unsupervised learning of single-qubit data allows the trained networks to accommodate measurements of complex observables, otherwise costly using traditional post-processing techniques. The effectiveness of this hybrid measurement protocol is demonstrated for quantum chemistry Hamiltonians using both synthetic and experimental data. Neural-network estimators attain high-precision measurements with a drastic reduction in the amount of sample statistics, without requiring additional quantum resources.

The measurement process in quantum mechanics has far-reaching implications, ranging from the fundamental interpretation of quantum theory [1] to the design of quantum hardware [2]. The advent of medium-sized quantum computers has drawn attention to scalability issues different than control errors or decoherence, which nonetheless hinder the realization of complex quantum algorithms. Coherent and incoherent noise altering quantum states can be corrected in fault-tolerant architectures [3]. In contrast, the fluctuations introduced by a non-ideal measurement protocol lead to intrinsic quantum noise which persists even in a fault-tolerant regime.

The most promising quantum computing platforms, such as superconducting or ion-trap processors, provide access to projective single-qubit non-demolition measurements [4, 5]. Armed with these simple measurements, one is faced with a plethora of quantum simulation algorithms which rely on accurate estimations of specialized observables. For practical purposes, in order to suppress the uncertainty arising from a sub-optimal measurement apparatus, massive amounts of sample statistics need to be generated by the quantum device [6]. Complex estimators are then reconstructed through classical post-processing of single-qubit data. As the measurement precision remains tied to the interface between the quantum and the classical hardware, it becomes critical to develop methods capable of extracting more information from a given measurement dataset [7–12]. Given this, data-driven algorithms can provide a viable path towards improved accuracy and scalability in quantum simulation platforms.

Machine learning has recently shown its flexibility in finding approximate solutions to complex problems in a broad range of physics [13]. In particular, extensive theoretical work has demonstrated the potential of artificial neural networks in the context of quantum many-body physics [14–21]. The same approach has also been employed to enhance the capabilities of various quantum simulation platforms [22–27]. With the increasing stream of quantum data produced in laboratories, it is natural to expect further synergy between machine learning and experimental quantum hardware.

Figure 1. Measurements on quantum hardware with neural-network estimators. (a) A quantum circuit prepares a quantum state Ψ. (b) Single-qubit measurements consisting of a local rotation Û and a projective measurement Π̂. (c) A neural network is trained on the output of the measuring apparatus to discover a representation ψ_λ of the state Ψ that retrieves the expectation value of a target quantum observable Ô. (d) The intrinsic measurement uncertainty is traded for a systematic reconstruction bias, leading to a measurement outcome distribution with lower variance.

∗ gtorlai@flatironinstitute.org
In this Article, we propose to integrate neural networks with quantum simulators to increase the measurement precision of quantum observables. Using unsupervised learning on single-qubit data to learn approximately the quantum state underlying the hardware, neural networks can be deployed to generate estimators free of intrinsic quantum noise. This comes at the cost of a systematic bias from the imperfect quantum state reconstruction. We investigate the trade-off between these two sources of uncertainty for measurements of quantum chemistry Hamiltonians, costly with standard techniques [6]. We show a reduction of various orders of magnitude in the amount of data required to reach chemical accuracy for simulated data. For experimental data produced by a superconducting quantum hardware, we recover energy estimates with a low number of data points. This opens new opportunities for quantum simulation on near-term quantum hardware [28].

Figure 2. Reconstruction of the potential energy surface of the BeH2 molecule (Hartree and Angstrom units). We show, for different dataset sizes M, the comparison between the exact ground state energy E0 (solid lines) and the energies obtained with the neural-network estimator (markers). The shaded regions span one standard deviation for the estimate on the quantum hardware with the standard averaging method using M measurements. In the insets, we show the deviations δ_λ = |E0 − H̄_λ| of the RBM estimators from the exact energies.

NEURAL-NETWORK ESTIMATORS

We examine the task of estimating the expectation value of a generic observable Ô on a quantum state |Ψ⟩ prepared by a quantum computer with N qubits. A direct measurement produces an estimator O ≈ ⟨Ô⟩ with sample variance σ²[O] ≈ ⟨Ô²⟩ − ⟨Ô⟩². This measurement is optimal when |Ψ⟩ is an eigenstate of Ô (i.e. σ²[O] = 0), but requires sample statistics from the observable eigenbasis, typically not available on a quantum computer.

A more flexible measurement protocol can be devised by considering the expansion of the observable Ô in terms of K tensor products of Pauli operators

    \hat{O} = \sum_{k=1}^{K} c_k \hat{P}_k , \qquad \hat{P}_k \in \{\hat{1}, \hat{\sigma}^x, \hat{\sigma}^y, \hat{\sigma}^z\}^{\otimes N} ,    (1)

where c_k are real coefficients. This decomposition allows one to estimate the expectation value from independent measurements of each Pauli operator, only requiring single-qubit data. In contrast to the direct measurement, the final estimator O_qc suffers an increased uncertainty ε_qc = (Σ_k |c_k|² σ²[P_k]/S)^{1/2}, where σ²[P_k] is the sample variance of P̂_k and S is the number of measurements (see Supplementary Material [SM]). The overhead in sample statistics to reduce this uncertainty becomes particularly severe for observables with a large number K of Pauli operators.
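A minimal NumPy sketch of this standard post-processing makes the bookkeeping explicit: each Pauli term is measured independently with S shots, the sample means are combined with the coefficients c_k, and the uncertainty ε_qc is propagated from the per-term variances. All coefficients and shot outcomes below are illustrative placeholders, not data from any molecule.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy expansion O = sum_k c_k P_k with K = 3 Pauli terms (illustrative values).
coeffs = np.array([0.5, -0.3, 0.2])                    # c_k
S = 1000                                               # shots per Pauli term
# outcomes[k, s] = eigenvalue (+/-1) of P_k recorded in shot s (mock data).
outcomes = np.stack([rng.choice([-1, 1], size=S, p=[0.2, 0.8]),
                     rng.choice([-1, 1], size=S, p=[0.5, 0.5]),
                     rng.choice([-1, 1], size=S, p=[0.35, 0.65])])

means = outcomes.mean(axis=1)                          # <P_k> sample means
variances = outcomes.var(axis=1, ddof=1)               # sigma^2[P_k]

O_qc = coeffs @ means                                  # standard estimator O_qc
eps_qc = np.sqrt(np.sum(coeffs**2 * variances) / S)    # propagated uncertainty

print(f"O_qc = {O_qc:.4f} +/- {eps_qc:.4f}")
```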
We overcome this limitation by deploying unsupervised machine learning on single-qubit data to obtain an approximate reconstruction of the quantum state |Ψ⟩ (Fig. 1). We call this reconstruction approximate in the sense that, unlike traditional quantum state tomography [29], we are primarily interested in the more restricted task of recovering measurement outcomes for the observable Ô. We first parametrize a generic many-body wavefunction by an artificial neural network. In a given reference basis |σ⟩ of the many-body Hilbert space (e.g. |σ⟩ = |σ_1^z, ..., σ_N^z⟩, σ_i^z = {0, 1}), the neural network provides a parametric encoding of the amplitudes ψ_λ(σ) = ⟨σ|ψ_λ⟩ into a set of complex-valued weights λ [16]. Specifically, we implement the restricted Boltzmann machine (RBM) [30], a physics-inspired generative neural network currently explored in many areas of condensed matter physics and quantum information [31, 32].
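One standard complex-weight RBM parametrization, with the hidden units traced out analytically, can be sketched as below; the network sizes and random initialization are illustrative rather than the trained networks used in this work, whose precise architecture is described in the SM.

```python
import numpy as np

def rbm_amplitude(sigma, a, b, W):
    """Unnormalized RBM amplitude psi_lambda(sigma) with complex parameters.

    sigma : array of 0/1 visible units (qubit configuration in the z basis)
    a, b  : complex visible / hidden biases
    W     : complex weight matrix, shape (n_hidden, n_visible)
    """
    theta = b + W @ sigma
    # Hidden units are summed out analytically: prod_i 2*cosh(theta_i).
    return np.exp(a @ sigma) * np.prod(2.0 * np.cosh(theta))

# Example: 4 visible qubits, 8 hidden units, small random complex weights.
rng = np.random.default_rng(1)
n_v, n_h = 4, 8
a = rng.normal(scale=0.01, size=n_v) + 1j * rng.normal(scale=0.01, size=n_v)
b = rng.normal(scale=0.01, size=n_h) + 1j * rng.normal(scale=0.01, size=n_h)
W = rng.normal(scale=0.01, size=(n_h, n_v)) + 1j * rng.normal(scale=0.01, size=(n_h, n_v))

print(rbm_amplitude(np.array([0, 1, 1, 0]), a, b, W))
```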
The quantum state reconstruction is carried out by training the neural network on a dataset D of M single-qubit projective measurements, obtained from the target quantum state |Ψ⟩ prepared by the hardware. Using an extension of unsupervised learning [20], the network parameters λ are optimized via gradient descent to minimize the statistical distance between the probability distribution underlying the data and the RBM projective measurement probability. We adopt the standard measure given by the Kullback-Leibler divergence

    C_\lambda = -\frac{1}{M} \sum_{\sigma^b \in D} \log |\psi_\lambda(\sigma^b)|^2 ,    (2)

where σ^b is an N-bit string (σ_1^{b_1}, ..., σ_N^{b_N}) and b are Pauli bases (b_j = {x, y, z}) drawn uniformly from the set of Pauli operators P̂_k appearing in Eq. (1) [SM].
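For a small number of qubits, the cost of Eq. (2) can be evaluated directly by rotating the reference-basis amplitudes into each measurement basis with explicit single-qubit unitaries. The sketch below does this with a normalized toy state; the gradient-descent update and the handling of the unnormalized RBM amplitudes are omitted, and all names are illustrative.

```python
import numpy as np

# Single-qubit rotations mapping the z basis to the measurement basis b_j.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
SDG = np.diag([1.0, -1.0j])
U1 = {"x": H, "y": H @ SDG, "z": np.eye(2)}

def rotated_probabilities(psi_z, bases):
    """Outcome probabilities when the z-basis state psi_z (length 2^N) is
    measured in the local Pauli bases given by `bases`."""
    U = np.array([[1.0]])
    for b in bases:
        U = np.kron(U, U1[b])
    return np.abs(U @ psi_z) ** 2

def kl_cost(psi_z, dataset):
    """C_lambda = -(1/M) sum_{sigma^b in D} log p_lambda(sigma^b), cf. Eq. (2)."""
    logps = []
    for bases, outcome in dataset:          # outcome is a tuple of 0/1 bits
        p = rotated_probabilities(psi_z, bases)
        idx = int("".join(map(str, outcome)), 2)
        logps.append(np.log(p[idx] + 1e-12))
    return -np.mean(logps)

# Example: a two-qubit Bell state and a tiny basis-tagged dataset.
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
data = [(("x", "x"), (0, 0)), (("z", "z"), (1, 1)), (("y", "y"), (0, 1))]
print(kl_cost(psi, data))
```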
Once the optimal parameters are selected according to cross-validation on held-out data, measurements of specialized observables can be performed by the neural network [17]. The expectation value of the quantum observable is simply approximated by the statistical estimator

    O_\lambda = \frac{1}{n_{mc}} \sum_{j=1}^{n_{mc}} \frac{\langle \sigma_j | \hat{O} | \psi_\lambda \rangle}{\langle \sigma_j | \psi_\lambda \rangle} ,    (3)

where {σ_1, ..., σ_{n_mc}} is a set of n_mc configurations drawn from the probability distribution |ψ_λ(σ)|² via Monte Carlo (MC) sampling. Here lies the critical advantage of the neural-network estimator: even though the wavefunction ψ_λ(σ) is reconstructed from single-qubit data generated by the quantum computer, the measurement it produces is not affected by the intrinsic quantum noise. This is in fact equivalent to the direct measurement scheme where data is collected in the eigenbasis of the observable Ô [SM].
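Eq. (3) only requires amplitude ratios, so it can be evaluated with a simple Metropolis sampler and a local estimator for each Pauli string. The sketch below uses a toy two-qubit wavefunction and Hamiltonian in place of a trained RBM, and returns the Monte Carlo error of the mean alongside the estimate; all quantities are illustrative.

```python
import numpy as np

# Pauli matrices in the computational (z) basis.
PAULI = {"I": np.eye(2), "X": np.array([[0, 1], [1, 0]]),
         "Y": np.array([[0, -1j], [1j, 0]]), "Z": np.diag([1.0, -1.0])}

def local_estimator(sigma, psi, pauli_string, coeff):
    """c * <sigma|P|psi> / <sigma|psi> for a single Pauli string P."""
    # A Pauli string connects sigma to exactly one configuration sigma'.
    sigma_p = list(sigma)
    phase = 1.0 + 0.0j
    for j, p in enumerate(pauli_string):
        if p in ("X", "Y"):
            sigma_p[j] ^= 1
        phase *= PAULI[p][sigma[j], sigma_p[j]]
    return coeff * phase * psi(tuple(sigma_p)) / psi(tuple(sigma))

def nn_estimator(psi, hamiltonian, n_qubits, n_mc=2000, seed=0):
    """Monte Carlo evaluation of Eq. (3) with a single-flip Metropolis sampler."""
    rng = np.random.default_rng(seed)
    sigma = (0,) * n_qubits
    local_values = []
    for step in range(10 * n_mc):
        j = rng.integers(n_qubits)
        prop = list(sigma); prop[j] ^= 1; prop = tuple(prop)
        if rng.random() < abs(psi(prop)) ** 2 / abs(psi(sigma)) ** 2:
            sigma = prop
        if step % 10 == 0:                   # thin the Markov chain
            o = sum(local_estimator(sigma, psi, P, c) for c, P in hamiltonian)
            local_values.append(o.real)
    local_values = np.array(local_values[len(local_values) // 10:])   # burn-in
    # Return O_lambda and the MC error of the mean, sqrt(var / n_samples).
    return local_values.mean(), local_values.std(ddof=1) / np.sqrt(len(local_values))

# Example: H = 0.5 Z0 Z1 - 0.2 X0 X1 on a toy (unnormalized) two-qubit amplitude.
toy_psi = lambda s: {(0, 0): 0.8, (1, 1): 0.55, (0, 1): 0.15, (1, 0): 0.15}[s]
print(nn_estimator(toy_psi, [(0.5, "ZZ"), (-0.2, "XX")], n_qubits=2))
```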
Figure 3. Measurement uncertainty of neural-network estimators for molecular systems. The energy units are Hartrees. (a) Statistical variance in the MC sampling of the neural network as a function of the size M of the training dataset. We compare this with the variance σ²[H]_qc = Σ_{k=1}^{K} |c_k|² σ²[P_k] arising from independent estimation of each Pauli operator with S = M/K measurements. (b) Energy measurement distribution on the quantum computer. The red lines bound the chemical accuracy interval E = 1.6 × 10⁻³. The total number of measurements M is set to 64k, 512k, 64k for the LiH, BeH2 (STO-3G basis) and H2 (6-31G basis) molecules respectively. (c) Energy measurement distribution of the neural-network estimator (histogram), on the same number of measurements used in (b). All estimates fall within chemical accuracy from the true value E0. (d) Energy errors induced by imperfect state reconstruction. We compare, for different dataset sizes M, the sample variance of the mean ε²_qc = σ²[H]_qc/S with the variance of the distribution of the neural-network estimator ∆²_λ, calculated from the energy histograms in (c).

RESULTS

We benchmark our technique on molecular Hamiltonians Ĥ, an exemplary test case for complex observables. For these fermionic systems, the number of Pauli operators K in Eq. (1) can grow up to the fourth power in the number of orbitals considered [33]. The resulting fast growth in measurement complexity remains a roadblock for quantum simulations on near-term hardware based on low-depth quantum-classical hybrid algorithms, such as variational quantum eigensolvers [34].

We generate synthetic measurement datasets, sampling from the exact ground state of Beryllium Hydride (BeH2), calculated by exact diagonalization of an N = 6 qubit Hamiltonian. The latter is obtained from a second-quantized fermionic Hamiltonian in the atomic STO-3G basis through a parity transformation and qubit tapering from molecular symmetries [34, 35]. We train a set of RBMs at different inter-atomic separations R using datasets D of increasing size M, and perform measurements of the molecular Hamiltonians H̄_λ. We show in Fig. 2 the neural-network estimators over the entire molecular energy surface. Comparing these measurements with exact energies shows that a relatively good precision can be achieved with as low as M = 10³ (total) measurements. For a given number of measurements M, the neural-network estimator provides better estimates with respect to the conventional estimator H̄_qc.
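Such synthetic datasets can be emulated by drawing, for each sample, one Pauli string from the Hamiltonian expansion and measuring the exact state in the corresponding local bases. The following is a self-contained sketch of this sampling step with a toy two-qubit state standing in for the exact BeH2 ground state; the helper names and the term list are illustrative only.

```python
import numpy as np

H_GATE = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
U1 = {"x": H_GATE, "y": H_GATE @ np.diag([1.0, -1.0j]), "z": np.eye(2)}

def sample_basis_dataset(psi_z, pauli_terms, M, seed=0):
    """Draw M synthetic single-qubit measurements from the exact state psi_z.

    Each sample picks one Pauli string from the Hamiltonian expansion, rotates
    the state into the corresponding local bases (identity factors are read in
    the z basis), and records the (bases, bitstring) pair forming the dataset D.
    """
    rng = np.random.default_rng(seed)
    n = int(np.log2(len(psi_z)))
    dataset = []
    for _ in range(M):
        term = pauli_terms[rng.integers(len(pauli_terms))]
        bases = tuple("z" if p == "I" else p.lower() for p in term)
        U = np.array([[1.0]])
        for b in bases:
            U = np.kron(U, U1[b])
        probs = np.abs(U @ psi_z) ** 2
        idx = rng.choice(len(probs), p=probs / probs.sum())
        bits = tuple(int(x) for x in format(idx, f"0{n}b"))
        dataset.append((bases, bits))
    return dataset

# Example: 10^3 samples from a toy two-qubit "ground state".
psi0 = np.array([0.6, 0.0, 0.0, 0.8])
print(sample_basis_dataset(psi0, ["ZZ", "XX", "XY"], M=1000)[:3])
```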
The higher precision of estimators produced by the neural networks originates from the direct parametrization of the many-body wavefunction, eliminating any intrinsic quantum noise. In turn, the imperfect quantum reconstruction leads to two additional sources of uncertainty: an MC variance of statistical nature, and a systematic bias in the expectation value. In the following, we investigate these noise sources for the BeH2 molecule, as well as Lithium Hydride (LiH) in the STO-3G basis and the Hydrogen (H2) molecule in the 6-31G basis, encoded in N = 4 and N = 8 qubits respectively. We estimate the uncertainty of the measurement with the quantum computer using the exact variance calculated on the ground state wavefunction, with S = M/K measurements per Pauli operator. For all molecules, we consider the geometry at the bond distance.
Figure 4. Probability of reaching chemical accuracy as a function of the total number of measurements M. We compute the probability p(δ < E) to obtain a final energy estimate within chemical accuracy from the exact ground state energy. Plotted are probabilities for the neural-network estimator and the standard one. An upper bound p_Max to the latter is also shown, obtained by setting σ²[H]_qc = (Σ_k |c_k|)².

Figure 5. Molecular energy (Hartrees) of LiH from experimental data generated by a superconducting quantum processor [37], as a function of the interatomic distance (Angstroms). The inset shows the variance ε²_qc from sub-sampling 5 × 10³ measurements out of 2.5 × 10⁶ data points, and the corresponding variance ∆²_λ obtained by separate neural-network reconstructions.

The statistical uncertainty from the MC averaging is given by ε_λ = (σ²[H]_λ/n_mc)^{1/2}, where σ²[H]_λ is the variance of Ĥ on the samples generated by the neural network [SM]. Since the target state (i.e. the ground state) is an eigenstate of the observable, a perfect reconstruction would lead to zero variance. Deviations from the exact ground state set the amount σ²[H]_λ > 0 of statistical uncertainty in the sampling. In Fig. 3a, we show the MC variance for training datasets of increasing size M. Here, we fix the amount of MC samples to n_mc = 10⁵, which is sufficient to make statistical fluctuations negligible. As expected, the MC variance decreases as M grows larger and the quality of the reconstruction improves, with significant reduction compared to the variance σ²[H]_qc obtained from standard post-processing.

The reconstruction error in the neural-network estimator is also affected by finite-size deviations in the training dataset. To understand this contribution, we train a collection of 100 RBMs on independent measurement datasets, and compare the measurement distribution with the one obtained from standard averaging (Fig. 3b). By examining histograms of energies built from these independent reconstructions, we find that for sufficiently large M the distribution of the neural-network estimator sharply peaks and gets close to the exact expectation value, with a positive offset due to the energy variational principle. For a quantitative comparison between the two distributions, we show in Fig. 3d the variance of the mean for the neural-network estimator ∆²_λ (estimated from the histograms). We observe about two orders of magnitude improvement over the uncertainty ε²_qc of the standard estimator. Further systematic errors due to approximate representability have been shown to be negligible for molecular systems of larger sizes [36].

The total uncertainty in the final measurement estimator is a combination of systematic bias and statistical noise. We quantify the combined effect by considering the probability p(δ < E) that the deviation δ = |E0 − H̄| from the ground state energy E0 is smaller than chemical accuracy E. The specific value, which depends on thermal fluctuations at room temperature, is fixed to E = 1.6 × 10⁻³ Ha. A simple calculation leads to p(δ_qc < E) = Erf(E (S/2σ²[H]_qc)^{1/2}) for the standard estimator. We evaluate this probability for the neural-network estimator by independently re-sampling each neural network across the separate training realizations. We show the results in Fig. 4, where we also include an upper bound p_Max often referenced in the literature [6, 9]. We observe drastic improvements of up to three orders of magnitude in the total measurements M required to get to p(δ_λ < E) = 1.
nificant reduction compared to the variance σ 2 [H]qc ob- Finally, we show estimations of molecular energies with
tained from standard post-processing. experimental data obtained in a variational quantum
The reconstruction error in the neural-network esti- eigensolver. We use data from Ref. [37], which consist of
mator is also affected by finite-size deviations in the samples from an approximate ground state preparation of
training dataset. To understand this contribution, we the LiH molecule on superconducting quantum hardware.
train a collection of 100 RBMs on independent mea- In Fig. 5, we plot the energy profile reconstructed by the
surement datasets, and compare the measurement dis- neural network, showing a good agreement using only a
tribution with the one obtained from standard averaging fraction of the total experimental measurements. Note
(Fig. 3b). By examining histograms of energies built from that decoherence determines a discrepancy between the
separate dataset realizations (Fig. 3c), we observe that reconstructed and the measured profile, since our RBM
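To make the dependence on the measurement budget concrete, the minimal sketch below evaluates p(δ_qc < ε_E) = Erf(ε_E √(S / (2σ²[H]_qc))) for the standard estimator. It is not the paper's code: the Pauli coefficients, expectation values, and measurement budgets are illustrative placeholders, the variances use the synthetic-data choice σ²[P_k] = 1 − ⟨P_k⟩², and NumPy/SciPy are assumed.

```python
# Sketch (hypothetical inputs): probability that the standard estimator reaches
# chemical accuracy, with sigma^2[H]_qc = sum_k |c_k|^2 sigma^2[P_k] and S = M/K.
import numpy as np
from scipy.special import erf

eps_E = 1.6e-3                               # chemical accuracy threshold (Ha)
c = np.array([0.5, -0.2, 0.1, 0.3])          # hypothetical Pauli coefficients c_k
P_exp = np.array([0.9, 0.5, 0.7, 0.2])       # hypothetical <P_k> on the target state
var_P = 1.0 - P_exp**2                       # sigma^2[P_k] = 1 - <P_k>^2
K = len(c)
var_H_qc = np.sum(np.abs(c)**2 * var_P)      # single-shot variance of the standard estimator

def p_chemical_accuracy(M):
    S = M / K                                # shots per Pauli term (M = K*S)
    return erf(eps_E * np.sqrt(S / (2.0 * var_H_qc)))

for M in (1e4, 1e6, 1e8):
    print(f"M = {M:.0e}:  p(delta_qc < eps_E) = {p_chemical_accuracy(M):.3f}")
```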
Finally, we show estimations of molecular energies with experimental data obtained in a variational quantum eigensolver. We use data from Ref. [37], which consist of samples from an approximate ground state preparation of the LiH molecule on superconducting quantum hardware. In Fig. 5, we plot the energy profile reconstructed by the neural network, showing good agreement using only a fraction of the total experimental measurements. Note that decoherence determines a discrepancy between the reconstructed and the measured profile, since our RBM makes a pure-state assumption, which is not exactly verified experimentally.
To estimate the uncertainty, we train 50 RBMs on separate datasets obtained by sub-sampling M = 5 × 10³ measurement data points out of the original 2.5 × 10⁶ measurements in [37]. Despite the mixing in the quantum state underlying the measurements, the uncertainties in the neural-network estimators are systematically lower than those of the standard measurement scheme, similarly to what is observed for synthetic data.

DISCUSSION

We have introduced a novel procedure to measure complex observables in quantum hardware. The approach is based on approximate quantum state reconstruction, tailored to retrieve a quantum observable of interest. For the particularly demanding case of quantum chemistry applications, we have provided evidence that neural-network estimators achieve precise measurements with a reduced amount of sample statistics.
An intriguing open question for future research is the systematic understanding of machine-learning-based quantum state reconstruction. For positive wavefunctions, a favorable asymptotic reconstruction scaling has been recently shown [38]. For non-positive states, such as ground states of interacting electrons, recent works addressed representability [36, 39, 40], while much less is known about the reconstruction complexity, leaving open prospects for future studies.
For measurement data generated by the experimental hardware, we have also assumed that the quantum state is approximately pure. When decoherence effects substantially corrupt the state, density-matrix neural-network reconstruction techniques [41, 42] could be employed as an alternative to the algorithm presented here. Generative models other than neural networks [43] could also be explored in this setup.
Finally, the increased measurement precision with lower sample complexity makes neural-network estimators a powerful asset in variational quantum simulations of ground states by hybrid algorithms using low-depth quantum circuits [34, 44]. It is natural to expect integration of generative models in the feedback loop for the quantum circuit optimization. With the ever increasing size of quantum hardware, we envision that machine learning will play a fundamental role in the development of the next generation of quantum technologies.

ACKNOWLEDGEMENTS

We thank J. Carrasquilla, M. T. Fishman, J. Gambetta and R. G. Melko for useful discussions. We thank A. Kandala for the availability of raw experimental data from Ref. [37]. The Flatiron Institute is supported by the Simons Foundation. A.M. acknowledges support from the IBM Research Frontiers Institute. Numerical simulations have been carried out on the Simons Foundation Supercomputing Center. Quantum state reconstruction was performed with the NetKet software [45], and the molecular Hamiltonians were obtained with Qiskit Aqua [46].

[1] Maximilian Schlosshauer, Johannes Kofler, and Anton Zeilinger, "A snapshot of foundational attitudes toward quantum mechanics," Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 44, 222-230 (2013).
[2] A. A. Clerk, M. H. Devoret, S. M. Girvin, Florian Marquardt, and R. J. Schoelkopf, "Introduction to quantum noise, measurement, and amplification," Rev. Mod. Phys. 82, 1155-1208 (2010).
[3] Earl T. Campbell, Barbara M. Terhal, and Christophe Vuillot, "Roads towards fault-tolerant universal quantum computation," Nature 549, 172 (2017).
[4] A. Wallraff, D. I. Schuster, A. Blais, L. Frunzio, R. S. Huang, J. Majer, S. Kumar, S. M. Girvin, and R. J. Schoelkopf, "Strong coupling of a single photon to a superconducting qubit using circuit quantum electrodynamics," Nature 431, 162-167 (2004).
[5] D. J. Wineland, J. C. Bergquist, Wayne M. Itano, and R. E. Drullinger, "Double-resonance and optical-pumping experiments on electromagnetically confined, laser-cooled ions," Opt. Lett. 5, 245-247 (1980).
[6] Dave Wecker, Matthew B. Hastings, and Matthias Troyer, "Progress towards practical quantum variational algorithms," Phys. Rev. A 92, 042303 (2015).
[7] Andrew Jena, Scott Genin, and Michele Mosca, "Pauli Partitioning with Respect to Gate Sets," arXiv e-prints, arXiv:1907.07859 (2019), arXiv:1907.07859 [quant-ph].
[8] Tzu-Ching Yen, Vladyslav Verteletskyi, and Artur F. Izmaylov, "Measuring all compatible operators in one series of a single-qubit measurements using unitary transformations," arXiv e-prints, arXiv:1907.09386 (2019), arXiv:1907.09386 [quant-ph].
[9] William J. Huggins, Jarrod McClean, Nicholas Rubin, Zhang Jiang, Nathan Wiebe, K. Birgitta Whaley, and Ryan Babbush, "Efficient and Noise Resilient Measurements for Quantum Chemistry on Near-Term Quantum Computers," arXiv e-prints, arXiv:1907.13117 (2019), arXiv:1907.13117 [quant-ph].
[10] Pranav Gokhale, Olivia Angiuli, Yongshan Ding, Kaiwen Gui, Teague Tomesh, Martin Suchara, Margaret Martonosi, and Frederic T. Chong, "Minimizing State Preparations in Variational Quantum Eigensolver by Partitioning into Commuting Families," arXiv e-prints, arXiv:1907.13623 (2019), arXiv:1907.13623 [quant-ph].
[11] Ophelia Crawford, Barnaby van Straaten, Daochen Wang, Thomas Parks, Earl Campbell, and Stephen Brierley, "Efficient quantum measurement of Pauli operators," arXiv e-prints, arXiv:1908.06942 (2019), arXiv:1908.06942 [quant-ph].
[12] Andrew Zhao, Andrew Tranter, William M. Kirby, Shu Fay Ung, Akimasa Miyake, and Peter Love, "Measurement reduction in variational quantum algorithms," arXiv e-prints, arXiv:1908.08067 (2019), arXiv:1908.08067 [quant-ph].
[13] Giuseppe Carleo, Ignacio Cirac, Kyle Cranmer, Laurent Daudet, Maria Schuld, Naftali Tishby, Leslie Vogt-Maranto, and Lenka Zdeborová, "Machine learning and the physical sciences," arXiv e-prints (2019), arXiv:1903.10563.
[14] Juan Carrasquilla and Roger G. Melko, "Machine learning phases of matter," Nature Physics 13, 431 (2017).
[15] Lei Wang, "Discovering phase transitions with unsupervised learning," Phys. Rev. B 94, 195105 (2016).
[16] Giuseppe Carleo and Matthias Troyer, "Solving the quantum many-body problem with artificial neural networks," Science 355, 602-606 (2017).
[17] Giacomo Torlai and Roger G. Melko, "Learning thermodynamics with Boltzmann machines," Physical Review B 94, 165134 (2016).
[18] Evert P. L. van Nieuwenburg, Ye-Hua Liu, and Sebastian D. Huber, "Learning phase transitions by confusion," Nature Physics 13, 435 (2017).
[19] Maciej Koch-Janusz and Zohar Ringel, "Mutual information, neural networks and the renormalization group," Nature Physics 14, 578-582 (2018).
[20] Giacomo Torlai, Guglielmo Mazzola, Juan Carrasquilla, Matthias Troyer, Roger Melko, and Giuseppe Carleo, "Neural-network quantum state tomography," Nature Physics 14, 447-450 (2018).
[21] Marin Bukov, Alexandre G. R. Day, Dries Sels, Phillip Weinberg, Anatoli Polkovnikov, and Pankaj Mehta, "Reinforcement learning in different phases of quantum control," Phys. Rev. X 8, 031086 (2018).
[22] Alireza Seif, Kevin A. Landsman, Norbert M. Linke, Caroline Figgatt, C. Monroe, and Mohammad Hafezi, "Machine learning assisted readout of trapped-ion qubits," Journal of Physics B: Atomic, Molecular and Optical Physics 51, 174006 (2018).
[23] Benno S. Rem, Niklas Käming, Matthias Tarnowski, Luca Asteria, Nick Fläschner, Christoph Becker, Klaus Sengstock, and Christof Weitenberg, "Identifying quantum phase transitions using artificial neural networks on experimental data," Nature Physics (2019).
[24] Annabelle Bohrdt, Christie S. Chiu, Geoffrey Ji, Muqing Xu, Daniel Greif, Markus Greiner, Eugene Demler, Fabian Grusdt, and Michael Knap, "Classifying snapshots of the doped Hubbard model with machine learning," Nature Physics (2019).
[25] Giacomo Torlai, Brian Timar, Evert P. L. van Nieuwenburg, Harry Levine, Ahmed Omran, Alexander Keesling, Hannes Bernien, Markus Greiner, Vladan Vuletić, Mikhail D. Lukin, Roger G. Melko, and Manuel Endres, "Integrating Neural Networks with a Quantum Simulator for State Reconstruction," arXiv e-prints, arXiv:1904.08441 (2019), arXiv:1904.08441 [quant-ph].
[26] Yi Zhang, A. Mesaros, K. Fujita, S. D. Edkins, M. H. Hamidian, K. Ch'ng, H. Eisaki, S. Uchida, J. C. Séamus Davis, Ehsan Khatami, and Eun-Ah Kim, "Machine learning in electronic-quantum-matter imaging experiments," Nature 570, 484-490 (2019).
[27] Yi Hong Teoh, Marina Drygala, Roger G. Melko, and Rajibul Islam, "Machine learning design of a trapped-ion quantum spin simulator," arXiv e-prints, arXiv:1910.02496 (2019), arXiv:1910.02496 [quant-ph].
[28] John Preskill, "Quantum Computing in the NISQ era and beyond," Quantum 2, 79 (2018).
[29] K. Banaszek, M. Cramer, and D. Gross, "Focus on quantum tomography," New Journal of Physics 15, 125020 (2013).
[30] David H. Ackley, Geoffrey E. Hinton, and Terrence J. Sejnowski, "A learning algorithm for Boltzmann machines," Cognitive Science 9, 147-169 (1985).
[31] Giacomo Torlai and Roger G. Melko, "Machine learning quantum states in the NISQ era," arXiv e-prints, arXiv:1905.04312 (2019), arXiv:1905.04312 [quant-ph].
[32] Roger G. Melko, Giuseppe Carleo, Juan Carrasquilla, and J. Ignacio Cirac, "Restricted Boltzmann machines in quantum physics," Nature Physics (2019), 10.1038/s41567-019-0545-1.
[33] David Poulin, Matthew B. Hastings, Dave Wecker, Nathan Wiebe, Andrew C. Doherty, and Matthias Troyer, "The Trotter step size required for accurate quantum simulation of quantum chemistry," Quantum Information & Computation 15, 361-384 (2015).
[34] Abhinav Kandala, Antonio Mezzacapo, Kristan Temme, Maika Takita, Markus Brink, Jerry M. Chow, and Jay M. Gambetta, "Hardware-efficient variational quantum eigensolver for small molecules and quantum magnets," Nature 549, 242 (2017).
[35] Sergey Bravyi, Jay M. Gambetta, Antonio Mezzacapo, and Kristan Temme, "Tapering off qubits to simulate fermionic Hamiltonians," arXiv e-prints, arXiv:1701.08213 (2017), arXiv:1701.08213 [quant-ph].
[36] Kenny Choo, Antonio Mezzacapo, and Giuseppe Carleo, "Fermionic neural-network states for ab-initio electronic structure," arXiv:1909.12852 (2019).
[37] Abhinav Kandala, Kristan Temme, Antonio D. Córcoles, Antonio Mezzacapo, Jerry M. Chow, and Jay M. Gambetta, "Error mitigation extends the computational reach of a noisy quantum processor," Nature 567, 491-495 (2019).
[38] Dan Sehayek, Anna Golubeva, Michael S. Albergo, Bohdan Kulchytskyy, Giacomo Torlai, and Roger G. Melko, "The learnability scaling of quantum states: restricted Boltzmann machines," arXiv e-prints, arXiv:1908.07532 (2019), arXiv:1908.07532 [quant-ph].
[39] Di Luo and Bryan K. Clark, "Backflow transformations via neural networks for quantum many-body wave functions," Phys. Rev. Lett. 122, 226401 (2019).
[40] David Pfau, James S. Spencer, Alexander G. de G. Matthews, and W. M. C. Foulkes, "Ab-Initio Solution of the Many-Electron Schrödinger Equation with Deep Neural Networks," arXiv e-prints, arXiv:1909.02487 (2019), arXiv:1909.02487 [physics.chem-ph].
[41] Giacomo Torlai and Roger G. Melko, "Latent space purification via neural density operators," Phys. Rev. Lett. 120, 240503 (2018).
[42] Juan Carrasquilla, Giacomo Torlai, Roger G. Melko, and Leandro Aolita, "Reconstructing quantum states with generative models," Nature Machine Intelligence 1, 155-161 (2019).
[43] Ivan Glasser, Ryan Sweke, Nicola Pancotti, Jens Eisert, and J. Ignacio Cirac, "Expressive power of tensor-network factorizations for probabilistic modeling, with applications from hidden Markov models to quantum machine learning," arXiv e-prints, arXiv:1907.03741 (2019), arXiv:1907.03741 [cs.LG].
[44] C. Kokail, C. Maier, R. van Bijnen, T. Brydges, M. K. Joshi, P. Jurcevic, C. A. Muschik, P. Silvi, R. Blatt, C. F. Roos, and P. Zoller, "Self-verifying variational quantum simulation of lattice models," Nature 569, 355-360 (2019).
[45] Giuseppe Carleo, Kenny Choo, Damian Hofmann, James E. T. Smith, Tom Westerhout, Fabien Alet, Emily J. Davis, Stavros Efthymiou, Ivan Glasser, Sheng-Hsuan Lin, Marta Mauri, Guglielmo Mazzola, Christian B. Mendl, Evert van Nieuwenburg, Ossian O'Reilly, Hugo Théveniaut, Giacomo Torlai, Filippo Vicentini, and Alexander Wietek, "NetKet: A machine learning toolkit for many-body quantum systems," SoftwareX 10, 100311 (2019).
[46] Héctor Abraham, Ismail Yunus Akhalwaya, Gadi Aleksandrowicz, Thomas Alexander, Gadi Alexandrowics, Eli Arbel, Abraham Asfaw, Carlos Azaustre, Panagiotis Barkoutsos, George Barron, Luciano Bello, Yael Ben-Haim, Lev S. Bishop, Samuel Bosch, David Bucher, et al., "Qiskit: An open-source framework for quantum computing," (2019).


SUPPLEMENTARY MATERIAL

In this Supplementary Material, we first discuss the standard technique to perform measurements of generic quantum observables in quantum hardware. Then we describe the representation of the many-body wavefunction with a restricted Boltzmann machine, and the approximate quantum state reconstruction technique. We also provide details on the methods used to generate the data shown in the manuscript.


Measurements in quantum hardware

A generic N-qubit quantum observable can be expressed as a linear combination

    \hat{O} = \sum_{k=1}^{K} c_k \hat{P}_k ,    (4)

where P̂_k ∈ {1̂, σ̂^x, σ̂^y, σ̂^z}^⊗N are tensor products of single-qubit Pauli operators. While direct measurement of eigen-states of the observable Ô is in general not feasible, each Pauli operator P̂_k can be estimated independently on a quantum computer. Once a given quantum state |Ψ⟩ of interest is prepared by the quantum hardware, P̂_k is measured by applying a suitable unitary transformation Û_k (compiled into a set of single-qubit gates) into the eigen-basis of P̂_k, followed by single-qubit projective measurement. By measuring each Pauli operator in the expansion in Eq. (4) independently, an estimate for the observable Ô is retrieved from a dataset D = {D_1, ..., D_K}. Each D_k = {σ_1^k, ..., σ_S^k} is a collection of S projective measurements σ_j^k for the Pauli operator P̂_k, where σ_j^k is an N-bit string single-qubit measurement σ_j^k = {σ_{j,1}^{k_1}, ..., σ_{j,N}^{k_N}} (σ_{j,i}^{k_i} = {0, 1}) in the measurement basis k = (k_1, ..., k_N) (k_i = {x, y, z}). For simplicity, we assume in the following that the same number of measurements S is used for each Pauli term, leading to M = K × S total queries to the quantum hardware. Given the measurement dataset D, the expectation value of each Pauli operator and its variance are provided by the sample estimators

    \bar{P}_k = \sum_{j=1}^{S} \frac{P_{k,j}}{S} ,    (5)

    \sigma^2[P_k] = \sum_{j=1}^{S} \frac{(P_{k,j} - \bar{P}_k)^2}{S-1} ,    (6)

where P_{k,j} = \prod_{i=1}^{N} (-1)^{\sigma_{j,i}^{k_i}} is the result of a single measurement for the k-th Pauli operator. The standard way of building estimators for the observable Ô on quantum computers follows from Eqs. (5) and (6):

    \bar{O}_{qc} = \sum_{k=1}^{K} c_k \bar{P}_k = \sum_{k=1}^{K} \sum_{j=1}^{S} \frac{c_k P_{k,j}}{S} ,    (7)

    \sigma^2[O]_{qc} = \sum_{k=1}^{K} |c_k|^2 \sigma^2[P_k] = \sum_{k=1}^{K} \sum_{j=1}^{S} \frac{|c_k|^2 (P_{k,j} - \bar{P}_k)^2}{S-1} .    (8)

The probability distribution of the measurement outcome Ō_qc is a normal distribution with standard deviation

    \varepsilon_{qc} = \frac{\sigma[O]_{qc}}{\sqrt{S}} = \sqrt{ \sum_{k=1}^{K} \sum_{j=1}^{S} \frac{c_k^2 (P_{k,j} - \bar{P}_k)^2}{S(S-1)} } .    (9)

A simple bound to this estimator is given by [6]

    \varepsilon^2_{qc} \le \varepsilon^2_{\mathrm{Max}} = \frac{\big(\sum_k |c_k|\big)^2}{M} .    (10)

From this expression one can see how the error for this estimator is directly related to the number of Pauli operators K in Eq. (4). Finally, we note that in the numerical simulation with synthetic data we have estimated the variance of the Pauli operators using the expectation value calculated on the exact ground state wavefunction, i.e. σ²[P_k] = 1 − ⟨P̂_k⟩², rather than using the sample variance from Eq. (6).
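As an illustration of the standard post-processing of Eqs. (5)-(10), the minimal sketch below converts a hypothetical shot record of bitstrings into single-Pauli eigenvalues and forms the estimator, its variance, and the bound of Eq. (10). The coefficients, qubit count, and shot counts are placeholders, not values used in the paper.

```python
# Sketch (hypothetical data): standard estimator O_qc and its uncertainty from
# per-Pauli bitstring samples, following Eqs. (5)-(10).
import numpy as np

rng = np.random.default_rng(0)
N, K, S = 4, 3, 1000                             # qubits, Pauli terms, shots per term (M = K*S)
c = np.array([0.4, -0.3, 0.2])                   # hypothetical coefficients c_k
bits = rng.integers(0, 2, size=(K, S, N))        # hypothetical bitstrings sigma_j^k
P = (-1.0) ** bits.sum(axis=2)                   # single-shot eigenvalues P_{k,j} = prod_i (-1)^{sigma_{j,i}}

P_mean = P.mean(axis=1)                          # Eq. (5)
P_var = P.var(axis=1, ddof=1)                    # Eq. (6)
O_qc = np.sum(c * P_mean)                        # Eq. (7)
var_O_qc = np.sum(np.abs(c)**2 * P_var)          # Eq. (8)
eps_qc = np.sqrt(var_O_qc / S)                   # Eq. (9)
eps_max = np.sqrt(np.sum(np.abs(c))**2 / (K*S))  # Eq. (10) upper bound
print(O_qc, eps_qc, eps_max)
```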
Neural-network quantum reconstruction

We propose to overcome the large measurement uncertainty of the standard procedure by using the measurement data to gain access to the quantum state underlying the hardware. Contrary to traditional quantum state tomography [29], we perform an approximate reconstruction based on unsupervised learning of single-qubit projective measurement data using artificial neural networks [20].
We adopt a representation of a pure quantum state based on a restricted Boltzmann machine (RBM), a stochastic neural network made out of two layers of binary units: a visible layer σ describing the qubits and a hidden layer h, used to capture the correlations between the visible units [30]. The two layers are connected by a weight matrix W, and additional fields (or biases) a and d couple to each unit in the two layers. Given a reference basis |σ⟩ = |σ_1, ..., σ_N⟩ for N qubits (e.g. σ_j = σ_j^z), the RBM provides the following (unnormalized) parametrization of the many-body wavefunction:

    \psi_\lambda(\sigma) = \sum_{h} e^{\sum_{ij} W_{ij}\sigma_i h_j + \sum_j d_j h_j + \sum_i a_i \sigma_i}
                         = e^{\sum_i a_i \sigma_i}\, e^{\sum_j \log\cosh\left(\sum_i W_{ij}\sigma_i + d_j\right)} .    (11)

In order to capture quantum states with complex-valued amplitudes, we adopt complex-valued network parameters λ = {a, W, d} [16]. For a detailed description of the neural-network properties in the context of quantum many-body wavefunctions, we refer the reader to recent reviews [31, 32].
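A minimal sketch of the amplitude of Eq. (11) with complex parameters is given below; the parameter values, the number of hidden units, and the example configuration are arbitrary placeholders chosen only for illustration.

```python
# Sketch: unnormalized RBM amplitude psi_lambda(sigma) of Eq. (11) with
# complex-valued parameters lambda = {a, W, d}; sigma is a 0/1 array of length N.
import numpy as np

rng = np.random.default_rng(1)
N, n_h = 4, 4     # visible (qubit) and hidden units; the paper fixes n_h = N
a = 0.01 * (rng.normal(size=N) + 1j * rng.normal(size=N))
d = 0.01 * (rng.normal(size=n_h) + 1j * rng.normal(size=n_h))
W = 0.01 * (rng.normal(size=(N, n_h)) + 1j * rng.normal(size=(N, n_h)))

def log_psi(sigma):
    """log amplitude: sum_i a_i sigma_i + sum_j log cosh(sum_i W_ij sigma_i + d_j)."""
    theta = sigma @ W + d
    return sigma @ a + np.sum(np.log(np.cosh(theta)))

sigma = np.array([1, 0, 1, 1])
print(np.exp(log_psi(sigma)))   # psi_lambda(sigma)
```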
The goal of the quantum state reconstruction is to discover a set of parameters λ such that the RBM wavefunction approximates an unknown target quantum state Ψ on a set of measurement data. In a given measurement basis b = (b_1, ..., b_N) (b_i = {x, y, z}) spanned by |σ^b⟩ = |σ_1^{b_1}, ..., σ_N^{b_N}⟩, the measurement probability distribution is specified by the Born rule P(σ^b) = |Ψ(σ^b)|², where Ψ(σ^b) = ⟨σ^b|Ψ⟩. Maximum likelihood learning of the network parameters corresponds to minimizing the extended Kullback-Leibler (KL) divergence

    C_\lambda = \sum_{b}\sum_{\sigma^b} P(\sigma^b)\,\log\frac{P(\sigma^b)}{p_\lambda(\sigma^b)}
              \approx -\sum_{b}\sum_{\sigma^b} P(\sigma^b)\,\log p_\lambda(\sigma^b) ,    (12)

where the sum Σ_b runs over the informationally-complete set of 3^N bases and Σ_{σ^b} runs over the full Hilbert space. We also omit the entropy contribution Σ_b Σ_{σ^b} P(σ^b) log P(σ^b) since it does not depend on the parameters λ. Note that C_λ ≥ 0, and C_λ assumes its minimum value when P(σ^b) = p_λ(σ^b) ∀ b, σ^b. Consequently, the optimal set of parameters λ* = argmin_λ C_λ can be discovered by iterative updates of the form λ_ω → λ_ω − η G_{λ_ω}, where the learning rate η is the size of the update, G_{λ_ω} = ∂C_λ/∂λ_ω is the gradient of the cost function, and ω = R, I indicates the real or imaginary part of the parameters and gradients.
In practice, the two exponentially large sums in Eq. (12) are reduced using the finite-size training dataset D, with measurement bases b corresponding to the set of Pauli operators P̂_k appearing in the decomposition of the observable in Eq. (4). This leads to the approximate negative log-likelihood (NLL):

    C_\lambda = -\sum_{\sigma^b \in D} \log p_\lambda(\sigma^b)
              = \log\sum_{\sigma} |\psi_\lambda(\sigma)|^2 - |D|^{-1}\sum_{\sigma^b \in D} \log|\psi_\lambda(\sigma^b)|^2 ,    (13)

with |D| the size of the dataset. The RBM wavefunction in the b basis is

    \psi_\lambda(\sigma^b) = \sum_{\sigma} U_{\sigma^b \sigma}\, \psi_\lambda(\sigma) ,    (14)

where Û is the unitary transformation that relates the measurement basis |σ^b⟩ with the reference basis |σ⟩. As we restrict to the Pauli group, Û has a tensor product structure over each qubit, leading to the following matrix representation:

    U_{\sigma^b \sigma} = \prod_{j=1}^{N} \langle \sigma_j^{b_j} | \sigma_j \rangle .    (15)

The gradient of the cost function G_λ = G_{λ_R} + i G_{λ_I} can be calculated analytically:

    G_{\lambda_\omega} = Z_\lambda^{-1}\sum_{\sigma} \frac{\partial}{\partial\lambda_\omega}|\psi_\lambda(\sigma)|^2
                         - |D|^{-1}\sum_{\sigma^b \in D}\frac{\partial}{\partial\lambda_\omega}\log|\psi_\lambda(\sigma^b)|^2
                       = Z_\lambda^{-1}\sum_{\sigma} |\psi_\lambda(\sigma)|^2\,\frac{\partial}{\partial\lambda_\omega}\big[\log\psi_\lambda(\sigma)+\log\psi^*_\lambda(\sigma)\big]
                         - |D|^{-1}\sum_{\sigma^b \in D}\frac{\partial}{\partial\lambda_\omega}\big[\log\psi_\lambda(\sigma^b)+\log\psi^*_\lambda(\sigma^b)\big] .    (16)

Here we have defined the wavefunction normalization Z_λ = Σ_σ |ψ_λ(σ)|². The derivative of the wavefunction in the basis b is:

    \frac{\partial}{\partial\lambda_\omega}\log\psi_\lambda(\sigma^b)
        = \frac{\sum_{\sigma} U_{\sigma^b\sigma}\,\psi_\lambda(\sigma)\,\frac{\partial}{\partial\lambda_\omega}\log\psi_\lambda(\sigma)}
               {\sum_{\sigma} U_{\sigma^b\sigma}\,\psi_\lambda(\sigma)}
        \equiv \langle\!\langle \Phi_{\lambda_\omega}(\sigma) \rangle\!\rangle_{Q^{\sigma^b}_{\lambda}(\sigma)} ,    (17)

where Φ_{λ_ω}(σ) = ∂ log ψ_λ(σ)/∂λ_ω and the average is taken with respect to the quasi-probability distribution Q^{σ^b}_λ(σ) = U_{σ^b σ} ψ_λ(σ). By inserting Eq. (17) into the gradient in Eq. (16) we find

    G_{\lambda_\omega} = 2\,\mathrm{Re}\,\langle\!\langle \Phi_{\lambda_\omega}(\sigma) \rangle\!\rangle_{p_\lambda(\sigma)}
                         - 2\,\big\langle \mathrm{Re}\,\langle\!\langle \Phi_{\lambda_\omega}(\sigma) \rangle\!\rangle_{Q^{\sigma^b}_{\lambda}(\sigma)} \big\rangle_{D} .    (18)

The final gradient G_λ can be written in a compact form by exploiting the holomorphic property of log ψ_λ(σ). Following the definition of Wirtinger derivatives and applying the Cauchy-Riemann conditions,

    \Phi_\lambda(\sigma) = \tfrac{1}{2}\big(\Phi_{\lambda_R}(\sigma) - i\,\Phi_{\lambda_I}(\sigma)\big)
                         = \mathrm{Re}\big[\Phi_{\lambda_R}(\sigma)\big] - i\,\mathrm{Re}\big[\Phi_{\lambda_I}(\sigma)\big] ,    (19)
we can express the gradient as

    G_\lambda = 2\,\Big[\, \langle\!\langle \Phi^*_\lambda(\sigma) \rangle\!\rangle_{p_\lambda(\sigma)}
                - \big\langle \langle\!\langle \Phi^*_\lambda(\sigma) \rangle\!\rangle_{Q^{\sigma^b}_{\lambda}(\sigma)} \big\rangle_{D} \Big] .    (20)

Similarly to a traditional RBM, the gradient breaks down into two components, traditionally called the positive and negative phase, driven respectively by the data and the model [30]. The negative-phase gradient is approximated by a Monte Carlo average

    \langle\!\langle \Phi^*_\lambda(\sigma) \rangle\!\rangle_{p_\lambda(\sigma)} \approx \frac{1}{n}\sum_{i=1}^{n} \Phi^*_\lambda(\sigma_i) ,    (21)

where the configurations {σ_i} are sampled from the distribution p_λ(σ).
on the quasi-probability Qσλ (σ)
b the total number of hidden units equal to the number of
qubits N . Each sample σ b in the training dataset is mea-
U b ψλ (σ)Φ∗λ (σ)
P
sured in a random basis uniformly sampled from the set
Φλ (σ) Qσb (σ) = σPσ σ


, (22) of Pauli operators appearing in the observable decompo-
λ
σ Uσ b σ ψλ (σ)
sition. In practice, only a sub-set of the full training data
which itself contains an intractable summation over the (called mini-batch) is used to compute the gradient for a
exponential size of the Hilbert space. However, the com- single update. The batch size was varied across training
plexity of such expression can be reduced by careful realizations, but it never exceeded 104 . The parameters
choice of measurement bases. In fact, by measuring only updates are carried out using the RMSprop optimizer,
a subset of NU qubits τ = (τ1 , . . . , τNU ) in local bases which introduces adaptive learning rates
different than the reference one, the unitary rotation U
simplifies to η
λ0k = λk − √ Gλ (29)
NU gk +  k
Y bτ Y
Uσ b σ = hστj j |στj i δσb` ,σ . (23)
` `
j=1 `∈τ
/
where the baseline value is set to η = 0.01,  = 10−7 is a
where δα,β is the Kronecker delta. By defining the basis small off-set for numerical stability, and
|si = |s1 , . . . , sNU i spanning the sub-space for the qubits
being acted non-trivially upon by Û (i.e. |sj i = |στj i), gk0 = βgk + (1 − β)Gλ2k (30)
we obtain:
NU
X XY bτ
hO i is the running average of the gradient squared (β = 0.9).
Uσb σ ψλ (σ) = hστj j |στj i hσ`z | ⊗ hs| |ψλ i , During both training and measurement process, the con-
σ s j=1 `∈τ
/ figurations {σj } sampled from the RBM distribution, re-
(24) quired respectively for calculating the negative phase and
which now runs over 2NU terms. The scaling of the re- the neural-network estimator, are obtained using parallel
construction algorithm is then O(2NU n|D|). tempering with 20-25 parallel chains.
Neural-network estimator. Once training is complete,
the RBM is used to perform the measurement of the ob- In order to select the optimal set of RBM parameters
servable Ô. The expectation value of the observable on λ, we split the dataset D and use 90% of data for training
the RBM wavefunction is and the remaining 10% for validation. During training,
we evaluate the NLL on the validation set, and save a
hψλ |Ô|ψλ i X hσ|Ô|ψλ i fixed number of different network parameters generating
= Zλ−1 |ψλ (σ)|2 . (25) the lowest values (usually between 100 and 500). For the
hψλ |ψλ i σ
hσ|ψλ i
case of synthetic data, the best set of parameters λ∗ is
This can be approximated by Monte Carlo sampling, selected (among this set) as the one the generates the
leading to the neural-network estimator lowest energy. For experimental data, we keep the set
that generates the lowest NLL on the validation set. Fi-
nmc
1 X nally, we note that the calculation of the NLL requires
Oλ = Oλ,j (26) the knowledge of the partition function Zλ , which was
nmc j=1
computed exactly in the numerical experiments. In gen-
nmc
X (Oλ,j − Oλ )2 eral, an approximation to Zλ (and thus to NLL) can be
σ 2 [O]λ = , (27) obtained using either parallel tempering or annealed im-
j=1
nmc − 1
portance sampling.

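As a closing illustration of the neural-network estimator of Eqs. (26)-(28), the sketch below evaluates the local estimator O_{λ,j} for a toy observable with diagonal Z terms and a single off-diagonal X term, then forms the Monte Carlo mean and variance, together with the error bar ε_λ = √(σ²[O]_λ / n_mc) quoted in the main text. The amplitude, observable, and sample sizes are placeholders, and exact sampling of the 2^N configurations replaces the parallel tempering used in the paper.

```python
# Sketch (toy sizes): Monte Carlo neural-network estimator of Eqs. (26)-(28).
import numpy as np
from itertools import product

rng = np.random.default_rng(3)
N = 3
log_psi = lambda s: 0.3 * s.sum() - 0.2j * s[0]     # placeholder "trained" amplitude

# toy observable assumed for illustration: O = sum_i c_z[i] Z_i + c_x X_0
c_z, c_x = np.array([0.5, -0.3, 0.2]), 0.4

def local_estimator(sigma):
    """O_{lambda,j} of Eq. (28): sum over the configurations connected to sigma by O."""
    diag = np.sum(c_z * (-1.0) ** sigma)            # Z_i terms are diagonal in the z basis
    flipped = sigma.copy(); flipped[0] ^= 1         # X_0 connects sigma to one flipped state
    return diag + c_x * np.exp(log_psi(flipped) - log_psi(sigma))

# draw n_mc samples from |psi_lambda|^2; here the 2^N amplitudes are enumerated exactly
configs = np.array(list(product([0, 1], repeat=N)))
p = np.exp(2 * np.array([log_psi(s) for s in configs]).real)
p /= p.sum()
n_mc = 10_000
samples = configs[rng.choice(len(configs), size=n_mc, p=p)]

O_loc = np.array([local_estimator(s) for s in samples])
O_lambda = O_loc.mean()                              # Eq. (26)
var_O = O_loc.var(ddof=1)                            # Eq. (27)
print(O_lambda.real, np.sqrt(var_O / n_mc))          # estimate and its error bar eps_lambda
```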