A non-review of Quantum Machine Learning: trends and explorations
This is a Perspective.

By Vedran Dunjko (LIACS, Leiden University, Niels Bohrweg 1, 2333 CA Leiden, The Netherlands) and Peter Wittek
(Rotman School of Management, University of Toronto, Toronto, Ontario M5S 3E6, Canada, Creative Destruction Lab,
Toronto, Ontario M5S 3E6, Canada, Vector Institute for Artificial Intelligence, Toronto, Ontario M5S 1M1, Canada,
and Perimeter Institute for Theoretical Physics, Waterloo, Ontario N2L 2Y5, Canada).

Published: 2020-03-17, volume 4, page 32


DOI: https://doi.org/10.22331/qv-2020-03-17-32
Citation: Quantum Views 4, 32 (2020)

About this non-review

By mid-2019, both Peter and I had found ourselves numerous times in situations where we were asked to define
what Quantum Machine Learning is (and what it isn’t), or, worse, where we were prompted to divine what the ultimate
approach to it should be. We discussed the topic in passing, only to find that we share essentially identical feelings on the
topic: there is nothing to be gained by constraining what QML is or isn’t, and we certainly have no clue what the future
of QML is guaranteed to bring (although we were both sure it would bring something cool!), as essentially all existing
research lines have genuine potential. This discussion became a blog-type text, with the objective of elaborating on this
perspective of ours, and of providing an entry point to the rich and diverse spectrum of topics QML could be – a listing of
reviews, mostly, with a sprinkle of other promising yet non-mainstream topics – without much fuss, and certainly
without thankless “expert predictions”.

On the advice of colleagues, we sent this, as Peter dubbed it, “non-review” (a somewhat extended blog post, really)
to the Quantum reviewers sometime in September 2019, to see if it would be suitable as a Perspectives article.

At that point Peter had already left for his final expedition to the Himalayas. The article was put on hold while we all
were hoping Peter would still be found, and also later while we were slowly accepting his loss. Peter’s colleagues and
I believe Peter would prefer this note published.

To honour Peter’s original ideas and text, the note is published with no alterations to the original minimal version we
submitted together: neither the changes which could have been made based on useful suggestions from the reviewers and
colleagues, nor other types of updates (including new works which would have influenced parts of the text – the field is
very fluid).

— Vedran Dunjko

See also Quantum’s blog post and testimonials on Peter.

What is quantum machine learning? What are the key issues? As time progresses, these questions are becoming more
difficult to answer. As quantum machine learning rapidly grows, even answering the much simpler question of what the
reference on the topic is seems overly ambitious. While the long-term perspectives of quantum machine learning may still be quite
opaque, the field is bustling, growing, and changing, making it difficult to tame with a single review. This perspective is a
tribute to review articles and books, while also drawing attention to recent trends and exploratory works that are often
overshadowed by the volume of mainstream contributions.

Contents
Introduction
The books
Surveys and review articles
Not perspectives
1. Supervised and unsupervised learning
2. Reinforcement learning and AI aspects
3. Machine learning in (experimental) physics
4. Quantum-inspired machine learning
QML and beyond

Introduction
As time progresses, any attempt to pin down quantum machine learning as a well-behaved young discipline is becoming
increasingly difficult. Quantum machine learning (QML) is not one settled and homogeneous field; partly, this is
because machine learning itself is quite diverse. But the situation is more complicated, due to the respective roles that
quantum computing and machine learning may play in “QML”. For some, QML is all about using quantum effects to perform machine
learning somehow better. For others, it is clearly about utilizing machine learning as the key tool for certain
quantum problems. A more comprehensive view is that quantum machine learning is simply the field exploring the
connections between quantum computing and quantum physics on the one hand, and machine learning and related fields on
the other. Quantum-applied machine learning and quantum-enhanced machine learning are then the two dominant, but
not the only, aspects of quantum machine learning. For instance, quantum-inspired machine learning and quantum-generalized
learning ideas stand out as two very promising research lines, in line with the QML philosophy, but are ultimately not about
speed-ups or applications of ML in quantum experiments.

Quantum-inspired machine learning, for instance, draws inspiration from quantum processing to come up with novel
classical learning models and new ways to train and evaluate them: examples here are the approach of Tang for fast
stochastic linear-algebraic manipulations [1], ideas involving tensor networks as learning models (see the works of Stoudenmire
[2]), and the inspirations behind the projective simulation model of Briegel [3].

Quantum-generalized machine learning generalizes even the basic concepts: just like density matrices generalize classical
notions of information, quantum-generalized machine learning asks what machine learning can look like when the data or
the environments are genuinely quantum objects.
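
As a minimal illustration of this analogy (our notation, not part of the original text): a classical probability distribution embeds as a diagonal density matrix, and quantum-generalized supervised learning replaces labelled classical samples by labelled quantum states,

\rho_{\mathrm{classical}} = \sum_{x} p(x)\, |x\rangle\langle x| ,
\qquad
\{(x_i, y_i)\}_{i=1}^{N} \;\longrightarrow\; \{(\rho_i, y_i)\}_{i=1}^{N} .

A general density matrix also carries off-diagonal coherences, and handling such genuinely quantum inputs (or quantum environments, in the reinforcement learning setting) is exactly what the quantum-generalized setting asks of the learner.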

It is clear that it makes little sense to try to write a comprehensive review of all of quantum machine learning (we state this
as two authors who went down this path only a few years back). So how does one begin exploring quantum machine
learning? Well, there are the old-school methods: books, review papers, and papers; but nowadays we also have YouTube
videos, “awesome” pages on GitHub, blogs, online tutorials, and the like. In this non-review, we remain old-school, and reflect
on recent literature reporting on the developments in QML from our biased perspective.

The books
By now, a few books have emerged which align with quantum machine learning, and which emphasize different aspects of
the prospective field. The first book to carry the title “quantum machine learning” was Quantum Machine Learning: What
Quantum Computing Means to Data Mining, by one of the authors [4]. It was the entry point to quantum-enhanced machine
learning, suitable for readers with a machine learning background. Supervised Learning with Quantum Computers, by Maria
Schuld and Francesco Petruccione [5], is a more recent book also focusing on quantum-enhanced machine learning, and it
places more emphasis on the potential of near-term gate-based architectures.

https://quantum-journal.org/views/qv-2020-03-17-32/?ref=https://githubhelp.com 3/15
3/21/23, 10:17 AM A non-review of Quantum Machine Learning: trends and explorations – Quantum

Aside from the books above, which directly address the core of modern QML, it is worthwhile to keep an eye out for books
that emphasize less mainstream ideas. Principles of Quantum Artificial Intelligence, by Andreas Wichert [6], is technically not
on machine learning, but rather on “problem solving” aspects important for symbolic AI and reasoning. Given the close
relationship between AI and ML, it makes sense to keep this book in mind as well. Quantum Robotics: A Primer on Current
Science and Future Perspectives, by Prateek Tandon, Stanley Lam, Ben Shih, and co-authors [7], places a bit more emphasis on QML
aspects which could feed into robotics, although much of the book is dedicated to more standard quantum
computing and QML ideas.

The available QML books focus mostly on quantum-enhanced supervised learning, and the broad QML field is by now
significantly larger. This brings us to the next level.

Surveys and review articles


To learn more, we have to turn our attention to chunks of QML covered in a number of review articles. Here are a few. The first
one came out at the same time as the first book: the title is An introduction to Quantum Machine Learning, by Maria Schuld,
Ilya Sinayskiy, and Francesco Petruccione [8]. It is a short-and-sweet survey of some of the quantum-enhanced algorithms
known at the time. The same set of authors undertook a review of the long-term research perspectives of quantum neural
networks in The quest for a Quantum Neural Network [9]. At the time, the jury was still out on those perspectives. Modern results in
shallow architectures (see, e.g., the book [5] above) could be seen as a sign of rejuvenation of the field.

If we label the early excitement in QML as the first generation of quantum-enhanced machine learning algorithms, then
the review paper in Nature [10] closed that era. This review focused on quantum linear-algebraic enhancements for machine
learning, but it also signalled the expansion of the field to include machine learning applications in the design and control of
quantum systems: an aspect of quantum-applied machine learning.

The number of original results kept growing rapidly, with more interest in near-term feasibility and various cross-overs
between quantum physics and machine learning, as well as in theoretical foundations. On the theory side, Quantum machine
learning: a classical perspective [11] covered quantum-enhanced machine learning with an algorithmic and complexity-theoretic
emphasis. Srinivasan Arunachalam and Ronald de Wolf wrote A Survey of Quantum Learning Theory [12], dedicated
to quantum probably approximately correct (PAC) learning and related topics. This formal aspect of quantum machine learning is
often underrepresented in the literature, yet it is one of the oldest applications of quantum computing.

One of the authors of this perspective co-authored Machine learning & artificial intelligence in the quantum domain: a review of
recent progress [13], with the goal of comprehensiveness, providing examples of quantum-enhanced, quantum-applied, and
quantum-generalized machine learning and AI. A fresh-out-of-the-oven review covers the hottest QML topic of the last
year, with a self-explanatory title: Parameterized quantum circuits as machine learning models [14]. Quantum neural networks
have also finally achieved a level of maturity, as summarized in Quantum Deep Learning Neural Networks [15].

Looking at other cross-overs, Machine learning and the physical sciences [16] is an excellent review of quantum-applied
machine learning. It is also one of the first reviews dedicated to this thriving topic. Applications of ML in chemistry and
quantum chemistry are perhaps tangential to QML in a narrower sense, but since quantum computing loves quantum
chemistry, we can expect many ideas to cross over. This idea is immortalized in Guest Editorial: Special Topic on Data-Enabled
Theoretical Chemistry [17].

Learning in quantum control: High-dimensional global optimization for noisy quantum dynamics [18] reviews machine learning
ideas applied in quantum control, but also contributes new results; a research-paper/review-paper hybrid, so to speak.

https://quantum-journal.org/views/qv-2020-03-17-32/?ref=https://githubhelp.com 4/15
3/21/23, 10:17 AM A non-review of Quantum Machine Learning: trends and explorations – Quantum

Not perspectives
Having listed a number of books and reviews on the broad spectrum of topics of quantum machine learning, one is now
tempted to put the field in perspective, and there are a number of possible ways to do this. One is to “follow the trends” in
order to trace what the consensus on the “hot topics” of the field has been over time.

As mentioned, quantum machine learning actually has quite a long history, reaching back to the early days of quantum
computing. In our “broad viewpoint” on what the most explored topics were over time, it is easy to argue that in the period
from the early 1990s until the mid-2000s, the research in what is now recognized as “quantum machine learning” had two themes:
quantum computational learning theory, and quantum generalizations of neural networks. The main motivations behind
these approaches can be understood as fundamental in flavour: how does access to pure quantum states encoding classical
distributions enhance learnability, i.e., can learning from quantum states be separated from learning from classical distributions;
in the context of quantum neural networks, the questions ranged from explorations of the possible quantum nature of the human
brain, to finding ways to reconcile the necessary non-linearities in neuronal processing with the fundamental linearity of quantum mechanics.

The early 2000s marked the first quantum-generalized machine learning ideas, but from 2006 onwards there has been a
steady influx of quantum algorithms with a pragmatic objective: using quantum computers to perform some ML
computations faster. This is what we labelled above as the first generation of quantum-enhanced machine learning
algorithms.

The period from the early 2010s brought an explosive rise in this domain, riding on the potential of quantum linear algebra
algorithms and quantum databases which, so we hoped, opened up the possibility of a steady supply of exponential
speed-ups.

From 2015, we had new sparks of ideas proposing more expressive quantum models (so improving the learning
performance, rather than just the straightforward computational complexity), but also the first significant signs of life in what is now
called quantum-applied machine learning. The latter domain had in fact always been present, but it is almost certainly
due to the successes of machine learning that the term “machine learning” became pervasive in such works.

As indicated above, in 2018 a number of shifts occurred. The progress in experimental quantum computing, and in quantum
algorithms designed with restricted circuit-based architectures in mind (VQE, QAOA), inspired the idea that limited quantum
machines may just be the best models (in the sense of parametrized distributions, or as sets of hypothesis functions for
classification) for machine learning. Well, maybe not the best, but certainly models for which the massive body of quantum
supremacy research may actually start to provide hard evidence that they cannot be simulated by a classical
machine, barring complexity-theoretic consequences which would make any honest theoretical computer scientist
blush.

This research line is only further emphasized by the perceived blow that Tang and a number of follow-up works have
delivered: quantum database-based approaches can (in a precise way) be dequantized, which has removed some of the
glitter from the perceived importance of quantum algorithms in this line. However, dequantization does not really mean
“classical algorithms are as good”; it means “we now know the separation is not exponential.” But a high-degree polynomial
separation can be just as good in practice (or even better, depending on the fine-grained parameters swallowed by the big-O
notation, and on the actually relevant instance sizes). More on this later.

Currently, perhaps the hottest trend in quantum-enhanced learning algorithms is playing around with parametrized
quantum circuits, whose parameters are tuned much like the weights of a neural network. This actually brings us back to the
1990s, when the term “quantum neural networks” was used (in a way, even in the very first work on the topic, by Lewenstein in ’94 [19])
to mean precisely that: a circuit whose parameters are tuned to realize a desired mapping. Once fringe, these
works are now front and center, as one of the most promising applications of NISQ architectures. Why? Do you have no idea
how to come up with an algorithm for your restricted quantum machine? No worries: put parameters in, treat it as a
model, it will do the “best it can”, and, who knows, maybe it is the best model ever! These days we hear much less about
the big-data, quantum-database quantum machine learning ideas which were once almost synonymous with QML.
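
To make the “parameters tuned like weights” picture concrete, here is a deliberately tiny sketch of our own (illustrative only, simulated with plain NumPy rather than any particular quantum-software library): a single-qubit circuit in which the datum x is encoded by one rotation, a single trainable rotation plays the role of the weights, and the expectation value of Pauli-Z is the model output.

import numpy as np

def ry(angle):
    # single-qubit RY rotation matrix
    c, s = np.cos(angle / 2.0), np.sin(angle / 2.0)
    return np.array([[c, -s], [s, c]])

Z = np.diag([1.0, -1.0])  # Pauli-Z observable

def predict(x, theta):
    # encode the datum, apply the trainable rotation, read out <Z>
    state = ry(theta) @ ry(x) @ np.array([1.0, 0.0])   # start from |0>
    return state @ Z @ state                            # value in [-1, 1]

def loss(theta, xs, ys):
    return np.mean([(predict(x, theta) - y) ** 2 for x, y in zip(xs, ys)])

def grad(theta, xs, ys, eps=1e-6):
    # finite-difference gradient; on hardware one would typically use the
    # exact parameter-shift rule for the expectation values instead
    return (loss(theta + eps, xs, ys) - loss(theta - eps, xs, ys)) / (2 * eps)

# toy training loop: fit labels y = sign(cos(x)) with the model <Z> = cos(x + theta)
xs = np.linspace(-np.pi, np.pi, 40)
ys = np.sign(np.cos(xs))
theta = 1.5
for _ in range(200):
    theta -= 0.1 * grad(theta, xs, ys)
print("trained theta:", theta, "final loss:", loss(theta, xs, ys))

Real proposals of course use many qubits, layers of entangling gates, and many parameters, but the workflow has the same shape: a fixed data-encoding step, trainable gate parameters, an expectation value read out as the prediction, and a classical optimization loop wrapped around the (here simulated) circuit.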

Each one of these phases of QML had review works which identified “the key” for QML, the biggest questions, and what should
be done next. And it is right that they should do so, as research needs direction. Most such directions were, very rapidly
(especially bearing in mind that, while QML is “hot”, objectively few researchers work in this area), substituted with a new
series of questions. This, we believe, is unavoidable for young fields, which grow faster than they mature.

Right now, QML has a few “obvious” objectives. First and foremost, there is the promise that QML may be the “best
application for quantum computers.” For this we would need to present classes of algorithms with an end-to-end speed-up,
with all the fine print accounted for [20], and with strong evidence of a separation relative to classical algorithms.
Furthermore, these algorithms should be good learning algorithms; a feature that is often put in second place in the
race for quantum-classical separations.

Second, there is the hope that they are also the “best applications for near-term devices”. Combine variational circuits with
the inherent (well, one could potentially hope such a thing may be true) robustness of ML algorithms to noise in the data,
and show that this robustness percolates to the learning algorithm itself. Then you have the goose that lays the golden eggs: a robust,
near-term application of quantum computers that matters. Now, what is missing in this story is any convincing evidence that
these algorithms are actually any good, but we shall have to suffer through this for a while until some new theory gets
developed. This does not sound particularly appealing, but nobody knows why deep learning neural networks work so well
either.

The plan for the future of QML seems solid, but then, as Mike Tyson and Joe Louis put it: “Everybody’s got plans… until they get
hit.” As we learn more, these obvious objectives will certainly change, and, e.g., with the possibility of dequantizations (what
if additive-error-noisy systems can be classically simulated in many cases?), quite abruptly at that.
However, we are now focusing on QML in too narrow a sense. The very topics of QML change as new ideas emerge.

So here is what would be ideal. We should remove the pressure of having a clear perspective of “where the field is going”,
especially since the contours of the field are not set yet. We should remind ourselves that some of the biggest
breakthroughs were achieved by pure curiosity-driven research, and not by ticking off this week’s “target achievements”.
That would be ideal. In reality, modern science, to live, needs publications and goals. This must be acknowledged as well.

We are happy that the vast majority of works follow this trajectory, as long as, time and again, some light is shed on less
acknowledged, curiosity-driven, fringe-topic research. The remainder of this work aims to do just that. The selection of these
works is not based on them being milestones on a well-planned-out roadmap. Rather, they are works we like, and found
inspiring, fun, or visionary.

Below is a non-exhaustive list of some of the more recent papers highlighting QML aspects not covered in detail in the above
reviews. Here, we can start distinguishing the flavours of QML in more precise detail.

1. Supervised and unsupervised learning


Let us start with some newer lines of thought on quantum-enhanced machine learning. For starters, hardware for
continuous-variable quantum computing is emerging, although the paradigm is tricky due to the difficulty of error
correction. Continuous-variable quantum neural networks [21] offers a model in the NISQ, uncorrected era, and it may give the
edge we are after. Bayesian Deep Learning on a Quantum Computer [22] benefits from recent classical results that connect
Bayesian learning to deep learning, and both can benefit from quantum computers, although the paper is tongue-in-cheek,
since it has matrix exponentiation at its core while it shows experimental results on NISQ-era hardware, which are not exactly
compelling.

Nevertheless, quantum computers have an irresistible appeal for training feedforward neural networks; see, for instance,
Quantum algorithms for feedforward neural networks [23]. Quantum Convolutional Neural Networks [24] introduces a novel
model which quantum-generalizes convolutional neural networks, and which may be suitable for problems of learning quantum
states. This paper also fits in the domain of quantum-generalized machine learning.

Sublinear quantum algorithms for training linear and kernel-based classifiers [25] constitutes a long-awaited application of
quantum multiplicative-weights primal-dual ideas in supervised machine learning. Quantum classification of the MNIST dataset
via Slow Feature Analysis [26] proposes classification based on the Quantum Frobenius Distance, which we can also think of
as a kernel function.

2. Reinforcement learning and AI aspects


There have been new developments in theoretical and applied aspects of quantum-enhanced reinforcement learning as
well. Quantum Algorithms for Solving Dynamic Programming Problems [27] proves separations and lower bounds for the
learning of exact optimal policies given quantum access to the transition functions of Markov decision processes. Quantum
gradient estimation and its application to quantum reinforcement learning [28] is a truly excellent master’s thesis in quantum
computing, showing the potential of quantum computing for policy gradient methods.
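
For orientation, the classical objects these works target are standard (our summary, not taken verbatim from [27, 28]): the Bellman optimality equation characterizing optimal value functions in a Markov decision process, and a REINFORCE-style estimator of the kind policy gradient methods rely on,

V^{*}(s) = \max_{a} \Big[ r(s,a) + \gamma \sum_{s'} P(s' \mid s,a)\, V^{*}(s') \Big],
\qquad
\nabla_{\theta} J(\theta) = \mathbb{E}_{\tau \sim \pi_{\theta}} \Big[ \sum_{t} \nabla_{\theta} \log \pi_{\theta}(a_t \mid s_t)\, R(\tau) \Big],

with the quantum proposals assuming some form of quantum (oracle) access to the transition probabilities P, or to the gradient information, respectively.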

Both authors of this perspective are rather fond of old-school AI, as witnessed also in the paper Quantum Enhanced Inference in
Markov Logic Networks [29]. It shows quantum advantages for Gibbs sampling in networks that combine causal networks
and formal deduction, but there are plenty more interesting questions to answer in this domain.

3. Machine learning in (experimental) physics


Next we have some newer lines of thought on machine learning applied to (experimental) physics. Machine learning can of
course be used to help us speed up various types of information-processing tasks, but in Detecting quantum speedup by
quantum walk with convolutional neural networks [30], the authors show that neural networks can detect whether a quantum
algorithm can produce a speed-up in quantum walk scenarios where theoretical bounds are not known. This result is
especially exciting in the context of real-world practical computing, where theoretical worst-case bounds are often less
important than heuristic domain-specific performance.

In a different direction, Machine learning for long-distance quantum communication [31] shows that AI systems based
on reinforcement learning can also be challenged to actually design new quantum communication protocols. Together with
works like [32], where machine learning is tasked with inventing new error-correcting codes, such works push the envelope of
what we may come to expect machines to be capable of.

Switching gears from discovering protocols to unveiling nature itself, in Discovering physical concepts with neural networks
[33], the authors investigate machine-assisted discovery in the physics realm, including inferring the bounds on the
dimensionality of quantum systems.

In a similar but more quantitative sense of machine-assisted research, in Automated discovery of characteristic features of
phase transitions in many-body localization [34], the authors further illustrate that the true breakthroughs will come when
machines discover truly new properties, like new order parameters. This is possible in the unsupervised and weakly
supervised regime, as this paper shows.

We conclude this section with a paper which is on the border of genuine ML applications, but is certainly related: Convex
optimization of programmable quantum computers [35] provides the interesting observation that finding optimal program
states for finite gate arrays to realize a target quantum evolution constitutes a (perhaps unexpectedly) convex optimization
problem. This opens the door to a plethora of classical (in the sense of “being a classic”) optimization methods for “optimal
programming” of programmable quantum circuits.

4. Quantum-inspired machine learning


There has been much movement in quantum-inspired machine learning; although this is a borderline topic for QML, it is
easy to imagine that many results here may inspire new quantum algorithms right back.

A prominent new research line considers using tensor networks in place of neural networks for learning, as illustrated in,
e.g., Supervised Learning with Quantum-Inspired Tensor Networks [2]. This research line is new but also deeply rooted, due to
the intricate mathematical connections between neural nets, tensor networks, and learning problems, and to the significant bodies
of research that have studied some of these aspects. Although this research line is briefly mentioned in the review [16] (we focus on
research not previously covered here), it is a rapidly growing field which will likely deserve its own review
papers.
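
As a rough illustration of what “tensor networks as learning models” means in practice, here is a minimal sketch of our own of the forward pass of such a model (the local feature map follows [2]; the bond dimension, number of sites, and the untrained random tensors are illustrative placeholders, and the DMRG-style sweeping used for training in [2] is not shown):

import numpy as np

def feature_map(x):
    # map each input feature x in [0, 1] to a 2-dimensional local vector,
    # as in the classifier of Stoudenmire and Schwab [2]
    return np.array([np.cos(np.pi * x / 2.0), np.sin(np.pi * x / 2.0)])

def mps_forward(mps, label_tensor, x):
    # contract the matrix-product-state "weights" with the product-state
    # encoding of the input x; returns one score per class
    v = np.ones(1)                              # left boundary vector
    for A, xj in zip(mps, x):                   # A has shape (D_left, 2, D_right)
        v = np.einsum('i,ipj,p->j', v, A, feature_map(xj))
    return v @ label_tensor                     # label_tensor: (D_right, n_classes)

# toy usage with random (untrained) tensors
rng = np.random.default_rng(0)
n_sites, D, n_classes = 8, 4, 3
mps = [rng.normal(size=(1 if i == 0 else D, 2, D)) for i in range(n_sites)]
label_tensor = rng.normal(size=(D, n_classes))
print(mps_forward(mps, label_tensor, rng.uniform(size=n_sites)))

The appeal is that the number of parameters and the cost of this contraction are controlled by the bond dimension D, and that the same objects appear throughout quantum many-body physics, which is precisely the bridge the quantum-inspired line exploits.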

This brings us to the breakthrough results of Ewin Tang, who showed that classical randomized algorithms can achieve
exponential improvements over standard classical approaches for many settings previously reserved for quantum linear
algebra. That is, the gap between classical and quantum algorithms is no longer exponential, but it is critical to note that it is still
a high-degree polynomial separation. The current polynomial degrees render the classical algorithms, in general, insufficiently
efficient for real-world use, whereas the quantum algorithms would work. A first study of the question of the actual real-world
advantages of quantum processing is given in Quantum-inspired algorithms in practice [36].
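
To make the “fine-grained parameters” caveat concrete, here is a deliberately crude back-of-the-envelope comparison (all constants and exponents below are hypothetical placeholders, not the actual complexities of any quantum or dequantized algorithm; see [36] for a real empirical study):

import numpy as np

# hypothetical cost models: both scale polylogarithmically in the data size n,
# but the dequantized classical algorithm is assumed to carry a high-degree
# polynomial dependence on a rank-like parameter k hidden inside the big-O
def quantum_cost(n, k):
    return (np.log2(n) ** 2) * k           # placeholder exponent

def dequantized_cost(n, k):
    return (np.log2(n) ** 2) * k ** 8      # placeholder exponent

for n, k in [(10**6, 10), (10**9, 100)]:
    q, c = quantum_cost(n, k), dequantized_cost(n, k)
    print(f"n={n}, k={k}: quantum ~ {q:.1e}, dequantized ~ {c:.1e}, ratio ~ {c / q:.1e}")

Even though the asymptotic gap is “only” polynomial in k, at moderate values of k the dequantized cost in such a model is already astronomically larger; conversely, for very small k the classical algorithm may be perfectly adequate. Which regime applies in practice is exactly the kind of question [36] starts to examine.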

Regarding the dequantization results themselves, Ewin has a small online review of her own on the topic, so best hear it
from the expert herself.

QML and beyond


This brings us to numerous topics that are not yet, in general, directly included in QML, but we would not be surprised if an
explicit (applied) link emerges presently. Here we list a few interesting examples (and not necessarily the very first papers on
the topic).

In Learning Hidden Quantum Markov Models [37], the authors explore the learning of (quantum) hidden Markov models, and
they explicitly associate their work with the QML domain. Learning hidden Markov models is intimately related to
unsupervised learning and can be used in classification, and the connection of hidden Markov models to machine learning is quite
obvious, as they are special cases of Bayesian networks. Will hidden quantum Markov models be as relevant for quantum machine learning?
Time will tell.

Next we move from learning to elements of meaning, and to natural language processing (which is certainly one of the most
prominent long-term objectives of AI). Here, quantum logicians have been investigating the suitability of the non-commutative
structure of quantum theory to model aspects of natural languages, e.g., Word Vectors and Quantum Logic:
Experiments with negation and disjunction [38] (the field is much larger and older than this single example). Once again,
moving from more fundamentally flavoured research to research with a pragmatic “what can we enhance” hue, we have
recently seen papers providing algorithmic ideas where quantum computing offers an edge in language processing. Examples
include Quantum Algorithms for Compositional Natural Language Processing [39] and Quantum Language Processing [40]. These
developments will no doubt influence ideas in quantum AI.

As a final comment, the entire field of “genuinely quantum” machine learning (where the data itself is quantum) is still
finding its right place and full recognition. Perhaps as quantum technologies mature, and problems of quantum learning
become genuinely practical, the field will crystallize and grow. Although this field is acknowledged in a few of the reviews we
mentioned, the interested reader can find new ideas on where the field may be heading in, e.g., Unsupervised classification of
quantum data [41] (where we move from supervised to unsupervised generalizations). In a related vein, we have already
mentioned the work of Cong et al. [24], where algorithms for the deep-neural-network-like analysis of actual quantum states are
suggested.

In summary, QML is diverse, growing, inclusive, and rich in open questions. We (the authors of this non-review) are
biased towards topics we are interested in, and we are certain we are missing many exciting new ideas that have been
popping up in recent times. Capturing which of all the QML trends will in the end be central is, for the time being, an
impossible task; and, in a way, this is the key message of this note.

Acknowledgements
We would like to thank Sofiene Jerbi, Charles Moussa, and Casper Gyurik for helping us compile the books, reviews, and
articles, and for proofreading the text.

BibTeX data

@article{Dunjko2020nonreviewofquantum,
doi = {10.22331/qv-2020-03-17-32},
url = {https://doi.org/10.22331/qv-2020-03-17-32},
title = {A non-review of {Q}uantum {M}achine {L}earning: trends and explorations},
author = {Dunjko, Vedran and Wittek, Peter},
journal = {{Quantum Views}},
publisher = {{Verein zur F{\"{o}}rderung des Open Access Publizierens in den
Quantenwissenschaften}},
volume = {4},
pages = {32},
month = mar,
year = {2020}
}
References
[1] Ewin Tang. A quantum-inspired classical algorithm for recommendation systems. In Proceedings of the 51st Annual ACM
SIGACT Symposium on Theory of Computing, STOC 2019, pages 217–228, New York, NY, USA, 2019. ACM. 10.1145/3313276.3316310.
https://doi.org/10.1145/3313276.3316310

[2] Edwin Stoudenmire and David J Schwab. Supervised learning with tensor networks. In D. D. Lee, M. Sugiyama, U. V.
Luxburg, I. Guyon, and R. Garnett, editors, Advances in Neural Information Processing Systems 29, pages 4799–4807. Curran
Associates, Inc., 2016.

[3] Hans J. Briegel and Gemma De las Cuevas. Projective simulation for artificial intelligence. Scientific Reports, 2(1), May
2012. 10.1038/​srep00400.
https:/​/​doi.org/​10.1038/​srep00400

[4] Peter Wittek. Quantum Machine Learning: What Quantum Computing Means to Data Mining. Elsevier Science, 2016.

[5] Maria Schuld and Francesco Petruccione. Supervised Learning with Quantum Computers. Springer International
Publishing, 2018.

[6] Andreas Wichert. Principles of Quantum Artificial Intelligence. WORLD SCIENTIFIC, August 2013. 10.1142/​8980.
https:/​/​doi.org/​10.1142/​8980

[7] Prateek Tandon, Stanley Lam, Ben Shih, Tanay Mehta, Alex Mitev, and Zhiyang Ong. Quantum robotics: A primer on
current science and future perspectives. Synthesis Lectures on Quantum Computing, 6(1):1–149, January 2017.
10.2200/S00746ED1V01Y201612QMC010.
https://doi.org/10.2200/S00746ED1V01Y201612QMC010

[8] Maria Schuld, Ilya Sinayskiy, and Francesco Petruccione. An introduction to quantum machine learning. Contemporary
Physics, 56(2):172–185, October 2014. 10.1080/​00107514.2014.964942.
https:/​/​doi.org/​10.1080/​00107514.2014.964942

[9] Maria Schuld, Ilya Sinayskiy, and Francesco Petruccione. The quest for a quantum neural network. Quantum Information
Processing, 13(11):2567–2586, November 2014. 10.1007/​s11128-014-0809-8.
https:/​/​doi.org/​10.1007/​s11128-014-0809-8

[10] Jacob Biamonte, Peter Wittek, Nicola Pancotti, Patrick Rebentrost, Nathan Wiebe, and Seth Lloyd. Quantum machine
learning. Nature, 549(7671):195–202, September 2017. 10.1038/​nature23474.
https:/​/​doi.org/​10.1038/​nature23474

[11] Carlo Ciliberto, Mark Herbster, Alessandro Davide Ialongo, Massimiliano Pontil, Andrea Rocchetto, Simone Severini, and
Leonard Wossnig. Quantum machine learning: a classical perspective. Proceedings of the Royal Society A: Mathematical,
Physical and Engineering Sciences, 474(2209):20170551, January 2018. 10.1098/​rspa.2017.0551.
https:/​/​doi.org/​10.1098/​rspa.2017.0551

[12] Srinivasan Arunachalam and Ronald de Wolf. A survey of quantum learning theory, 2017.

[13] Vedran Dunjko and Hans J Briegel. Machine learning & artificial intelligence in the quantum domain: a review of recent
progress. Reports on Progress in Physics, 81(7):074001, June 2018. 10.1088/​1361-6633/​aab406.
https:/​/​doi.org/​10.1088/​1361-6633/​aab406

[14] Marcello Benedetti, Erika Lloyd, and Stefan Sack. Parameterized quantum circuits as machine learning models, 2019.
10.1088/​2058-9565/​ab4eb5.
https:/​/​doi.org/​10.1088/​2058-9565/​ab4eb5

[15] Abu Kamruzzaman, Yousef Alhwaiti, Avery Leider, and Charles C. Tappert. Quantum deep learning neural networks. In
Lecture Notes in Networks and Systems, pages 299–311. Springer International Publishing, February 2019.

[16] Giuseppe Carleo, Ignacio Cirac, Kyle Cranmer, Laurent Daudet, Maria Schuld, Naftali Tishby, Leslie Vogt-Maranto, and
Lenka Zdeborová. Machine learning and the physical sciences, 2019. 10.1103/​RevModPhys.91.045002.
https:/​/​doi.org/​10.1103/​RevModPhys.91.045002

[17] Matthias Rupp, O. Anatole von Lilienfeld, and Kieron Burke. Guest editorial: Special topic on data-enabled theoretical
chemistry. The Journal of Chemical Physics, 148:241401, June 2018. 10.1063/1.5043213.
https://doi.org/10.1063/1.5043213

[18] Pantita Palittapongarnpim, Peter Wittek, Ehsan Zahedinejad, Shakib Vedaie, and Barry C. Sanders. Learning in quantum
control: High-dimensional global optimization for noisy quantum dynamics. Neurocomputing, 268:116–126, December 2017.
10.1016/​j.neucom.2016.12.087.
https:/​/​doi.org/​10.1016/​j.neucom.2016.12.087

[19] Maciej Lewenstein. Quantum perceptrons. Journal of Modern Optics, 41(12):2491–2501, December 1994.
10.1080/09500349414552331.
https://doi.org/10.1080/09500349414552331

[20] Scott Aaronson. Read the fine print. Nature Physics, 11(4):291–293, April 2015. 10.1038/​nphys3272.
https:/​/​doi.org/​10.1038/​nphys3272

[21] Nathan Killoran, Thomas R. Bromley, Juan Miguel Arrazola, Maria Schuld, Nicolás Quesada, and Seth Lloyd. Continuous-
variable quantum neural networks, 2018. 10.1103/​PhysRevResearch.1.033063.
https:/​/​doi.org/​10.1103/​PhysRevResearch.1.033063

[22] Zhikuan Zhao, Alejandro Pozas-Kerstjens, Patrick Rebentrost, and Peter Wittek. Bayesian deep learning on a quantum
computer. Quantum Machine Intelligence, 1(1-2):41–51, May 2019. 10.1007/​s42484-019-00004-7.
https:/​/​doi.org/​10.1007/​s42484-019-00004-7

[23] Jonathan Allcock, Chang-Yu Hsieh, Iordanis Kerenidis, and Shengyu Zhang. Quantum algorithms for feedforward neural
networks, 2018.

[24] Iris Cong, Soonwon Choi, and Mikhail D. Lukin. Quantum convolutional neural networks, 2018. 10.1038/s41567-019-0648-8.
https://doi.org/10.1038/s41567-019-0648-8

[25] Tongyang Li, Shouvanik Chakrabarti, and Xiaodi Wu. Sublinear quantum algorithms for training linear and kernel-based
classifiers. 2019.

[26] Iordanis Kerenidis and Alessandro Luongo. Quantum classification of the mnist dataset via slow feature analysis, 2018.

[27] Pooya Ronagh. Quantum algorithms for solving dynamic programming problems, 2019.

[28] Arjan Cornelissen. Quantum gradient estimation and its application to quantum reinforcement learning, 2019. MSc
Thesis.

[29] Peter Wittek and Christian Gogolin. Quantum enhanced inference in Markov logic networks. Scientific Reports, 7(1), April
2017. 10.1038/​srep45672.
https:/​/​doi.org/​10.1038/​srep45672

[30] Alexey A. Melnikov, Leonid E. Fedichkin, and Alexander Alodjants. Detecting quantum speedup by quantum walk with
convolutional neural networks, 2019. 10.1088/​1367-2630/​ab5c5e.
https:/​/​doi.org/​10.1088/​1367-2630/​ab5c5e

[31] Julius Wallnöfer, Alexey A. Melnikov, Wolfgang Dür, and Hans J. Briegel. Machine learning for long-distance quantum
communication, 2019.

[32] Thomas Fösel, Petru Tighineanu, Talitha Weiss, and Florian Marquardt. Reinforcement learning with neural networks for
quantum feedback. Physical Review X, 8(3), September 2018. 10.1103/​PhysRevX.8.031084.
https:/​/​doi.org/​10.1103/​PhysRevX.8.031084

[33] Raban Iten, Tony Metger, Henrik Wilming, Lidia del Rio, and Renato Renner. Discovering physical concepts with neural
networks, 2018. 10.1103/​PhysRevLett.124.010508.
https:/​/​doi.org/​10.1103/​PhysRevLett.124.010508

[34] Patrick Huembeli, Alexandre Dauphin, Peter Wittek, and Christian Gogolin. Automated discovery of characteristic
features of phase transitions in many-body localization. Phys. Rev. B, 99:104106, March 2019. 10.1103/PhysRevB.99.104106.
https://doi.org/10.1103/PhysRevB.99.104106

[35] Leonardo Banchi, Jason Pereira, Seth Lloyd, and Stefano Pirandola. Convex optimization of programmable quantum
computers, 2019.

[36] Juan Miguel Arrazola, Alain Delgado, Bhaskar Roy Bardhan, and Seth Lloyd. Quantum-inspired algorithms in practice,
2019.

[37] Siddarth Srinivasan, Geoff Gordon, and Byron Boots. Learning hidden quantum Markov models. In Amos Storkey and
Fernando Perez-Cruz, editors, Proceedings of the Twenty-First International Conference on Artificial Intelligence and
Statistics, volume 84 of Proceedings of Machine Learning Research, pages 1979–1987, Playa Blanca, Lanzarote, Canary
Islands, 09–11 Apr 2018. PMLR.

[38] Dominic Widdows and Stanley Peters. Word vectors and quantum logic: Experiments with negation and disjunction. In
In Proceedings of the 8th Mathematics of Language Conference, pages 141–154, 2003.

[39] William Zeng and Bob Coecke. Quantum algorithms for compositional natural language processing. In Proceedings of
the 2016 Workshop on Semantic Spaces at the Intersection of NLP, Physics and Cognitive Science, SLPCS@QPL 2016,
Glasgow, Scotland, 11th June 2016., pages 67–75, 2016. 10.4204/​EPTCS.221.8.
https:/​/​doi.org/​10.4204/​EPTCS.221.8

[40] Nathan Wiebe, Alex Bocharov, Paul Smolensky, Matthias Troyer, and Krysta M Svore. Quantum language processing,
2019.

[41] Gael Sentís, Alex Monràs, Ramon Muñoz-Tapia, John Calsamiglia, and Emilio Bagan. Unsupervised classification of
quantum data, 2019. 10.1103/PhysRevX.9.041029.
https://doi.org/10.1103/PhysRevX.9.041029

This View is published in Quantum Views under the Creative Commons Attribution 4.0 International (CC BY 4.0) license.
Copyright remains with the original copyright holders such as the authors or their institutions.
