
Practical application improvement to Quantum SVM: theory to practice

(2012.07725)
Published Dec 14, 2020 in quant-ph and cs.LG

Abstract

Quantum machine learning (QML) has emerged as an important area for quantum applications, although useful QML applications would require many qubits. Our paper therefore explores the practical application of the Quantum Support Vector Machine (QSVM) algorithm while balancing several practical and technical considerations under the Noisy Intermediate-Scale Quantum (NISQ) assumption. For the quantum SVM under NISQ, we use quantum feature maps to translate data into quantum states and build the SVM kernel from these states, and we compare the result against a classical SVM with radial basis function (RBF) kernels. As data sets become more complex or abstract, classical SVM with typical classical kernels loses accuracy relative to QSVM because it cannot easily separate the different classes. Ideally, QSVM should also provide competitive performance over a broader range of data sets, including "simpler" cases in which smoother decision boundaries are required to avoid model variance issues (i.e., overfitting). To bridge the gap between "classical-looking" decision boundaries and complex quantum decision boundaries, we propose using general shallow unitary transformations to create feature maps with rotation factors that define a tunable quantum kernel, together with added regularization to smooth the separating hyperplane model. We show in experiments that this allows QSVM to perform on par with classical SVM regardless of data-set complexity, and to outperform it on some commonly used reference data sets.
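
The kernel pipeline described in the abstract (data -> quantum states via a feature map -> state-overlap kernel -> SVM) can be illustrated with a small, self-contained sketch. The feature map below (per-feature RY rotations with a tunable scaling factor), the toy data set, and the specific parameter values are illustrative assumptions for this sketch, not the authors' exact shallow-unitary circuit or benchmarks; regularization enters through the standard SVM parameter C, and the classical RBF SVM serves as the baseline comparison.

# Minimal sketch, assuming a simplified angle-encoding feature map
# (per-feature RY rotations with a tunable rotation factor "scale").
# It illustrates the pipeline only: data -> quantum states -> fidelity
# kernel -> SVM, compared against a classical RBF-kernel SVM.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def encode_state(x, scale=1.0):
    """Map a feature vector to a product state of single-qubit RY rotations.
    `scale` plays the role of the tunable rotation factor (an assumption)."""
    state = np.array([1.0])
    for xi in x:
        theta = scale * xi
        qubit = np.array([np.cos(theta / 2), np.sin(theta / 2)])  # RY(theta)|0>
        state = np.kron(state, qubit)  # tensor product over qubits
    return state

def quantum_kernel(X1, X2, scale=1.0):
    """Kernel entries K_ij = |<phi(x_i)|phi(x_j)>|^2 (state-fidelity kernel)."""
    S1 = np.array([encode_state(x, scale) for x in X1])
    S2 = np.array([encode_state(x, scale) for x in X2])
    return np.abs(S1 @ S2.T) ** 2

X, y = make_moons(n_samples=200, noise=0.2, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# "Quantum" SVM: precomputed fidelity kernel; C is the added regularization knob.
K_tr = quantum_kernel(Xtr, Xtr, scale=2.0)
K_te = quantum_kernel(Xte, Xtr, scale=2.0)
qsvm = SVC(kernel="precomputed", C=1.0).fit(K_tr, ytr)
print("quantum-kernel SVM accuracy:", qsvm.score(K_te, yte))

# Classical baseline: RBF kernel.
rbf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(Xtr, ytr)
print("classical RBF SVM accuracy:", rbf.score(Xte, yte))

In this sketch, the rotation scale and the regularization strength C are the two knobs that mirror the abstract's "tunable quantum kernel" and "added regularization": smaller scales and stronger regularization yield smoother, more classical-looking decision boundaries, while larger scales allow more complex quantum boundaries.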
