Quantum machine learning (QML) is emerging as an application of quantum
computing with the potential to deliver quantum advantage, but its realisation
for practical applications remains impeded by challenges. Amongst those, a key
barrier is the computationally expensive task of encoding classical data into a
quantum state, which could erase any prospective speed-ups over classical
algorithms. In this work, we implement methods for the efficient preparation of
quantum states representing encoded image data using variational, genetic, and
matrix product state-based algorithms. Our results show that these methods can
approximately prepare states to a level suitable for QML using circuits two
orders of magnitude shallower than a standard state preparation implementation,
obtaining drastic savings in circuit depth and gate count without unduly
sacrificing classification accuracy. Additionally, the QML models trained and
evaluated on approximately encoded data display an increased robustness to
adversarially generated input data perturbations. This partial alleviation of
adversarial vulnerability, made possible because the approximate encoding
"drowns out" adversarial perturbations while retaining the meaningful
large-scale features of the data,
constitutes a considerable benefit for approximate state preparation in
addition to lessening the requirements of the quantum hardware. Our results,
based on simulations and experiments on IBM quantum devices, highlight a
promising pathway for the future implementation of accurate and robust QML
models on complex datasets relevant for practical applications, bringing the
possibility of NISQ-era QML advantage closer to reality.
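
As a rough illustration of the approximate state preparation idea summarised above, the following minimal sketch (not the authors' implementation) variationally prepares an amplitude-encoded toy image state with a shallow hardware-efficient ansatz by minimising the infidelity to the target state. The two-qubit example, the Ry/CNOT ansatz, and the COBYLA optimiser are illustrative assumptions.

    # Minimal sketch: approximate amplitude encoding of a tiny "image" with a
    # shallow variational circuit, optimised for fidelity to the target state.
    import numpy as np
    from scipy.optimize import minimize

    n_qubits = 2
    dim = 2 ** n_qubits

    # Toy 2x2 "image", flattened and normalised -> target amplitude-encoded state.
    image = np.array([0.1, 0.4, 0.8, 0.3])
    target = image / np.linalg.norm(image)

    def ry(theta):
        """Single-qubit Ry rotation matrix."""
        c, s = np.cos(theta / 2), np.sin(theta / 2)
        return np.array([[c, -s], [s, c]])

    # CNOT with control on the first qubit (big-endian ordering).
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=float)

    def ansatz_state(params, n_layers=2):
        """Shallow ansatz: per layer, an Ry on each qubit followed by a CNOT."""
        state = np.zeros(dim)
        state[0] = 1.0
        for layer in params.reshape(n_layers, n_qubits):
            u = np.kron(ry(layer[0]), ry(layer[1]))  # single-qubit rotations
            state = CNOT @ (u @ state)               # entangling gate
        return state

    def infidelity(params):
        """1 - |<target|psi(params)>|^2, the quantity being minimised."""
        return 1.0 - abs(target @ ansatz_state(params)) ** 2

    res = minimize(infidelity, x0=np.random.uniform(0, np.pi, 4), method="COBYLA")
    print("final fidelity:", 1.0 - res.fun)

For this toy case the ansatz is expressive enough to reach near-unit fidelity; for realistic image sizes, the point of the approximate methods discussed in the abstract is to accept a controlled loss of fidelity in exchange for far shallower encoding circuits.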