Bridging Classical and Quantum Machine Learning: Knowledge Transfer From Classical to Quantum Neural Networks Using Knowledge Distillation (2311.13810v2)
Abstract: Quantum neural networks (QNNs), which harness superposition and entanglement, have shown potential to surpass classical methods on complex learning tasks but remain limited by hardware constraints and noisy conditions. In this work, we present a novel framework for transferring knowledge from classical convolutional neural networks (CNNs) to QNNs via knowledge distillation, thereby reducing the need for resource-intensive quantum training and error mitigation. We conduct extensive experiments using two parameterized quantum circuits (PQCs) with 4 and 8 qubits on the MNIST, Fashion-MNIST, and CIFAR-10 datasets. The approach demonstrates consistent accuracy improvements attributable to knowledge distilled from larger classical networks. Through ablation studies, we systematically compare state-of-the-art dimensionality reduction techniques (fully connected layers, center cropping, principal component analysis, and pooling) for compressing high-dimensional image data prior to quantum encoding. Our findings reveal that fully connected layers retain the most salient features for QNN inference, surpassing the other downsampling approaches. Additionally, we examine state-of-the-art data encoding methods (amplitude, angle, and qubit encoding) and identify amplitude encoding as the optimal strategy, yielding superior accuracy across all tested datasets and qubit configurations. Through computational analyses, we show that our distilled 4-qubit and 8-qubit QNNs achieve competitive performance while using significantly fewer parameters than their classical counterparts. Our results establish a promising paradigm for bridging classical deep learning and emerging quantum computing, paving the way for more powerful, resource-conscious models in quantum machine intelligence.
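The two central ingredients named in the abstract can be illustrated with a minimal sketch: the temperature-scaled distillation loss of Hinton et al. (2015), which the framework uses to transfer soft CNN targets to the QNN, and amplitude encoding, which maps a normalized classical vector of length 2^n onto the amplitudes of an n-qubit state. This is a plain-Python illustration of the generic techniques, not the paper's implementation; function names and the temperature value are illustrative assumptions.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    """KL(teacher || student) between temperature-softened distributions,
    scaled by T^2 as in Hinton et al. (2015). Temperature is illustrative."""
    p = softmax(teacher_logits, temperature)  # soft targets from the teacher CNN
    q = softmax(student_logits, temperature)  # predictions of the student QNN
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

def amplitude_encode(x):
    """Amplitude encoding: L2-normalize a real vector of length 2**n so it
    can serve as the amplitude vector of an n-qubit state. Only the
    classical normalization step is shown here; preparing the actual
    circuit would be done with a quantum SDK."""
    norm = math.sqrt(sum(v * v for v in x))
    return [v / norm for v in x]

# Identical teacher and student logits give zero distillation loss.
print(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # 0.0
# A 4-dim vector fits on 2 qubits; its encoded squared amplitudes sum to 1.
state = amplitude_encode([3.0, 0.0, 4.0, 0.0])
print(sum(a * a for a in state))  # 1.0 (up to float error)
```

In practice the student's loss would combine this distillation term with an ordinary cross-entropy on the hard labels, weighted by a mixing coefficient.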