Bayesian deep learning on a quantum computer

@article{Zhao2018BayesianDL,
  title={Bayesian deep learning on a quantum computer},
  author={Zhikuan Zhao and Alejandro Pozas-Kerstjens and Patrick Rebentrost and Peter Wittek},
  journal={Quantum Machine Intelligence},
  year={2018},
  volume={1},
  pages={41--51},
  url={https://api.semanticscholar.org/CorpusID:49554188}
}
This work leverages a quantum algorithm designed for Gaussian processes and develops a new algorithm for Bayesian deep learning on quantum computers, providing at least a polynomial speedup over classical algorithms.
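For orientation, the classical primitive being accelerated here is exact Gaussian process regression, whose bottleneck is an n-by-n linear solve. A minimal classical sketch in plain NumPy (the RBF kernel, length scale, and noise level are illustrative choices, not the paper's):

```python
import numpy as np

def rbf(A, B, length_scale=1.0):
    """Squared-exponential (RBF) kernel matrix between two sets of row vectors."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_posterior_mean(X, y, X_test, noise=1e-2):
    """Exact GP regression posterior mean. The O(n^3) solve of
    (K + noise*I) alpha = y is the step that quantum linear-system
    methods aim to speed up."""
    K = rbf(X, X) + noise * np.eye(len(X))
    alpha = np.linalg.solve(K, y)
    return rbf(X_test, X) @ alpha

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(50, 1))
y = np.sin(X[:, 0])
mu = gp_posterior_mean(X, y, np.array([[0.0]]))
```

The quantum algorithm replaces the cubic-cost solve with quantum linear-algebra subroutines; the model being evaluated is the same.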

Quantum Bayesian Neural Networks

This work proposes a quantum algorithm for Bayesian neural network inference, drawing on recent advances in quantum deep learning, and finds that even for small numbers of qubits the algorithm approximates the true posterior well, while requiring no repeated computations and thus fully realizing the quantum speedup.

Quantum Bayesian computation

This article describes how quantum von Neumann measurement provides quantum versions of machine learning algorithms, such as Markov chain Monte Carlo and deep learning, that are fundamental to Bayesian learning, and applies a quantum FFT algorithm to Chicago house-price data.

Bayesian machine learning for Boltzmann machine in quantum-enhanced feature spaces

A quantum Bayesian learning framework for the restricted Boltzmann machine in quantum-enhanced feature spaces is developed; it achieves an exponential speed-up over classical counterparts and is a promising candidate for achieving quantum supremacy.

Sparse quantum Gaussian processes to counter the curse of dimensionality

Evidence is provided through numerical tests, mathematical error bound estimation, and complexity analysis that the method can address the “curse of dimensionality,” where each additional input parameter no longer leads to an exponential growth of the computational cost.
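As a classical point of comparison, sparse GP methods replace the full n-by-n solve with an m-by-m system over inducing points. A minimal subset-of-regressors (Nyström-style) sketch — the kernel and inducing-point placement are illustrative, not the paper's method:

```python
import numpy as np

def rbf(A, B, ls=1.0):
    """Squared-exponential kernel matrix."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

def sparse_gp_mean(X, y, X_test, Z, noise=1e-2):
    """Subset-of-regressors sparse GP posterior mean. Only the
    m x m system over the inducing points Z is solved, so the cost
    scales as O(n m^2) rather than O(n^3)."""
    Kzz = rbf(Z, Z) + 1e-8 * np.eye(len(Z))   # jitter for stability
    Kxz = rbf(X, Z)
    A = noise * Kzz + Kxz.T @ Kxz             # m x m system
    return rbf(X_test, Z) @ np.linalg.solve(A, Kxz.T @ y)

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X[:, 0])
Z = np.linspace(-3.0, 3.0, 15)[:, None]       # 15 inducing points
mu = sparse_gp_mean(X, y, np.array([[0.0]]), Z)
```

The quantum variant targets the same kind of reduced-rank structure to tame the scaling with input dimension.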

Quantum Statistical Inference

This thesis demonstrates that quantum algorithms can be applied to enhance the computing and training of Gaussian processes, a powerful model widely used in classical statistical inference and supervised machine learning, and presents an analytical toolkit for causal inference in quantum data.

Quantum neural networks form Gaussian processes

This work shows that the outputs of certain models based on Haar random unitary or orthogonal deep QNNs converge to Gaussian processes in the limit of large Hilbert space dimension $d$.

Quantum Speedup of Natural Gradient for Variational Bayes

This work proposes a computationally efficient regression-based method for natural-gradient estimation, with convergence guarantees under standard assumptions, and enables the use of quantum matrix inversion to further speed up VB.

Quantum Machine Learning and Deep Learning: Fundamentals, Algorithms, Techniques, and Real-World Applications

This work provides a bottom-up view of quantum circuits, starting from quantum data representation, quantum gates, fundamental quantum algorithms, and more complex quantum processes, and presents real-world implementations of quantum machine learning and quantum deep learning.

Quantum algorithms for training Gaussian Processes

It is shown that quantum computing can be used to estimate the logarithm of the marginal likelihood of a GP with exponentially improved efficiency under certain conditions.
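The quantity in question, evaluated classically, is log p(y|X) = -(1/2) yᵀKy⁻¹y - (1/2) log|Ky| - (n/2) log 2π with Ky = K + σ²I; both the solve and the log-determinant cost O(n³), which is what motivates a quantum estimate. A classical sketch via Cholesky (the test kernel and data are illustrative):

```python
import numpy as np

def gp_log_marginal_likelihood(K, y, noise=1e-2):
    """log p(y) for a zero-mean GP. The linear solve and the
    log-determinant are both O(n^3) classically via Cholesky."""
    n = len(y)
    L = np.linalg.cholesky(K + noise * np.eye(n))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    logdet = 2.0 * np.log(np.diag(L)).sum()
    return -0.5 * y @ alpha - 0.5 * logdet - 0.5 * n * np.log(2.0 * np.pi)

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))
K = np.exp(-0.5 * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
y = rng.normal(size=30)
lml = gp_log_marginal_likelihood(K, y)
```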

Protocol for Implementing Quantum Nonparametric Learning with Trapped Ions.

Nonparametric learning is able to make reliable predictions by extracting information from similarities between a new set of input data and all samples. Here we point out a quantum paradigm of ...

A Universal Training Algorithm for Quantum Deep Learning

We introduce the Backwards Quantum Propagation of Phase errors (Baqprop) principle, a central theme upon which we construct multiple universal optimization heuristics for training both parametrized ...

A quantum algorithm to train neural networks using low-depth circuits

A low-depth quantum algorithm is presented for training quantum Boltzmann machine neural networks using hybrid classical-quantum variational methods, employing the quantum approximate optimization algorithm as a subroutine to approximately sample from Gibbs states of Ising Hamiltonians.

Classification with Quantum Neural Networks on Near Term Processors

This work introduces a quantum neural network, QNN, that can represent labeled data, classical or quantum, and be trained by supervised learning, and shows through classical simulation that parameters can be found that allow the QNN to learn to correctly distinguish the two data sets.

Quantum Machine Learning in Feature Hilbert Spaces.

This Letter interprets the process of encoding inputs in a quantum state as a nonlinear feature map that maps data to quantum Hilbert space and shows how it opens up a new avenue for the design of quantum machine learning algorithms.

The quest for a Quantum Neural Network

This article presents a systematic approach to QNN research, concentrating on Hopfield-type networks and the task of associative memory, and outlines the challenge of combining the nonlinear, dissipative dynamics of neural computing and the linear, unitary dynamics of quantum computing.

Quantum assisted Gaussian process regression

It is shown that even in some cases not ideally suited to the quantum linear systems algorithm, a polynomial increase in efficiency still occurs, leading to an exponential reduction in computation time in some instances.

Deep Neural Networks as Gaussian Processes

The exact equivalence between infinitely wide deep networks and GPs is derived, and it is found that test performance increases as finite-width trained networks are made wider and more similar to a GP, so that GP predictions typically outperform those of finite-width networks.
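For a single hidden ReLU layer, the limiting GP kernel is known in closed form (proportional to the order-1 arc-cosine kernel of Cho & Saul). A sketch with unit weight variance and no bias terms, which is a simplifying assumption rather than the paper's exact parameterisation:

```python
import numpy as np

def relu_nngp_kernel(X1, X2):
    """Limiting kernel of an infinite-width one-hidden-layer ReLU
    network: E[relu(w.x) relu(w.y)] for standard-normal weights w.
    At theta = 0 this reduces to ||x||^2 / 2."""
    n1 = np.linalg.norm(X1, axis=1)
    n2 = np.linalg.norm(X2, axis=1)
    cos = np.clip(X1 @ X2.T / np.outer(n1, n2), -1.0, 1.0)
    theta = np.arccos(cos)
    return np.outer(n1, n2) / (2 * np.pi) * (np.sin(theta) + (np.pi - theta) * np.cos(theta))

x = np.array([[3.0, 4.0]])            # ||x|| = 5
K = relu_nngp_kernel(x, x)
```

Plugging this kernel into standard GP regression gives the infinite-width network's exact Bayesian predictions without ever instantiating weights.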

Quantum machine learning

The field of quantum machine learning explores how to devise and implement quantum software that could enable machine learning that is faster than that of classical computers.

Gaussian Process Behaviour in Wide Deep Neural Networks

It is shown that, under broad conditions, as the architecture is made increasingly wide, the implied random function converges in distribution to a Gaussian process, formalising and extending existing results by Neal (1996) to deep networks.
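This convergence is easy to check numerically: sample many random one-hidden-layer ReLU networks at a fixed input and compare the output statistics with the limiting Gaussian. A sketch (the width, 1/sqrt(fan-in) scaling, and input are illustrative choices):

```python
import numpy as np

def random_relu_net_outputs(x, width, n_draws, rng):
    """Outputs of n_draws independent random one-hidden-layer ReLU
    nets at a fixed input x, with 1/sqrt(fan_in) weight scaling.
    As width grows, the output distribution approaches
    N(0, ||x||^2 / (2 * len(x)))."""
    d = len(x)
    W1 = rng.normal(size=(n_draws, width, d)) / np.sqrt(d)
    W2 = rng.normal(size=(n_draws, width)) / np.sqrt(width)
    h = np.maximum(W1 @ x, 0.0)          # hidden ReLU activations
    return (W2 * h).sum(axis=1)          # one scalar output per net

rng = np.random.default_rng(1)
x = np.ones(4)                            # ||x||^2 = 4, limiting variance 0.5
outs = random_relu_net_outputs(x, width=200, n_draws=5000, rng=rng)
```

The sample mean and variance of `outs` should sit close to the Gaussian-process limit of 0 and 0.5, matching the convergence-in-distribution result.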
...