
Exponential Error Convergence in Data Classification with Optimized Random Features: Acceleration by Quantum Machine Learning
Random features are a central technique for scalable learning algorithms based on kernel methods. A recent work has shown that an algorithm for machine learning by quantum computer, quantum machine learning (QML), can exponentially speed up sampling of optimized random features, even without imposing the restrictive assumptions on sparsity and low-rankness of matrices that had limited the applicability of conventional QML algorithms; this QML algorithm makes it possible to significantly reduce, and provably minimize, the required number of features for regression tasks. However, a major interest in the field of QML is how widely the advantages of quantum computation can be exploited beyond regression. We here construct a QML algorithm for a classification task accelerated by the optimized random features. We prove that the QML algorithm for sampling optimized random features, combined with stochastic gradient descent (SGD), can achieve a state-of-the-art exponential convergence speed in reducing classification error under a low-noise condition; at the same time, our algorithm with optimized random features exploits the significant reduction in the required number of features to accelerate each SGD iteration and the evaluation of the resulting classifier. These results reveal a promising application of QML: significant acceleration of a leading kernel-based classification algorithm without sacrificing its applicability to a practical class of data sets or its exponential error-convergence speed.
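The classical counterpart of the pipeline described above, random Fourier features approximating an RBF kernel, with a linear classifier trained by SGD on the feature space, can be sketched as follows. This is a minimal illustration on synthetic toy data, not the paper's algorithm: here the features are sampled from the standard (data-independent) Gaussian distribution, whereas the paper's quantum algorithm samples from a data-optimized feature distribution to reduce the number of features M.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data: two Gaussian blobs (hypothetical example data).
n, d = 200, 2
X = np.vstack([rng.normal(-1.0, 0.7, (n // 2, d)),
               rng.normal(+1.0, 0.7, (n // 2, d))])
y = np.hstack([-np.ones(n // 2), np.ones(n // 2)])

# Random Fourier features for the RBF kernel k(x, x') = exp(-|x - x'|^2 / 2):
# phi(x) = sqrt(2/M) * cos(W x + b), with rows of W ~ N(0, I), b ~ U[0, 2*pi].
M = 100                                   # number of random features
W = rng.normal(size=(M, d))
b = rng.uniform(0.0, 2 * np.pi, size=M)
Phi = np.sqrt(2.0 / M) * np.cos(X @ W.T + b)

# Train a linear classifier on the features by SGD on the logistic loss.
w = np.zeros(M)
lr = 0.5
for epoch in range(50):
    for i in rng.permutation(n):
        margin = y[i] * (Phi[i] @ w)
        grad = -y[i] * Phi[i] / (1.0 + np.exp(margin))
        w -= lr * grad

acc = np.mean(np.sign(Phi @ w) == y)
print(f"training accuracy: {acc:.2f}")
```

Each SGD step costs O(M), so any reduction in the number of features M (the quantity the optimized sampling provably minimizes) directly accelerates both training and classifier evaluation.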