T-count Optimized Quantum Circuits for Bilinear Interpolation (1809.09249v3)

Published 24 Sep 2018 in quant-ph, cs.AR, and cs.ET

Abstract: Quantum circuits for basic image processing functions such as bilinear interpolation are required to implement image processing algorithms on quantum computers. In this work, we propose quantum circuits for the bilinear interpolation of NEQR encoded images based on Clifford+T gates. Quantum circuits for the scale-up operation and the scale-down operation are illustrated. The proposed quantum circuits are optimized for T-count. Circuits based on Clifford+T gates can be made fault tolerant, but the T gate is very costly to implement; as a result, reducing T-count is an important optimization goal. The proposed quantum bilinear interpolation circuits are based on (i) a quantum adder, (ii) a proposed quantum subtractor, and (iii) a quantum multiplication circuit. Further, both designs are compared and shown to be superior to existing work in terms of T-count. The proposed quantum bilinear interpolation circuits for the scale-down operation and for the scale-up operation each achieve a $92.52\%$ improvement in T-count compared to the existing work.
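For context, the classical operation these circuits implement is standard bilinear interpolation: interpolate between the four neighboring pixel values along one axis, then along the other. A minimal classical sketch (not the paper's quantum circuit construction; the function name and argument order are illustrative):

```python
def bilinear_interpolate(p00, p10, p01, p11, dx, dy):
    """Bilinear interpolation of four neighboring pixel values.

    p00, p10, p01, p11 -- pixel values at the corners (x, y),
    (x+1, y), (x, y+1), (x+1, y+1).
    dx, dy -- fractional offsets of the target point within the
    unit cell, each in [0, 1].
    """
    # Interpolate along x for the top and bottom rows,
    # then interpolate between the two rows along y.
    top = p00 * (1 - dx) + p10 * dx
    bottom = p01 * (1 - dx) + p11 * dx
    return top * (1 - dy) + bottom * dy
```

The paper's contribution is realizing this arithmetic (the additions, subtractions, and multiplications above) reversibly over NEQR-encoded pixel values using Clifford+T gates, with the T-count of the adder, subtractor, and multiplier as the optimization target.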

Citations (6)
