
Heavy tails and pruning in programmable photonic circuits (2208.02251v1)

Published 3 Aug 2022 in cs.ET, physics.optics, and quant-ph

Abstract: Developing hardware for high-dimensional unitary operators plays a vital role in implementing quantum computations and deep learning accelerations. Programmable photonic circuits are singularly promising candidates for universal unitaries owing to intrinsic unitarity, ultrafast tunability, and energy efficiency of photonic platforms. Nonetheless, when the scale of a photonic circuit increases, the effects of noise on the fidelity of quantum operators and deep learning weight matrices become more severe. Here we demonstrate a nontrivial stochastic nature of large-scale programmable photonic circuits, namely heavy-tailed distributions of rotation operators, that enables the development of high-fidelity universal unitaries through designed pruning of superfluous rotations. The power law and the Pareto principle for the conventional architecture of programmable photonic circuits are revealed with the presence of hub phase shifters, allowing for the application of network pruning to the design of photonic hardware. We extract a universal architecture for pruning random unitary matrices and prove that "the bad is sometimes better to be removed" to achieve high fidelity and energy efficiency. This result lowers the hurdle for high fidelity in large-scale quantum computing and photonic deep learning accelerators.
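The pruning idea described in the abstract can be sketched numerically: decompose a Haar-random unitary into a sequence of 2x2 rotations, replace the weakest rotations with phase-only elements, and measure the resulting fidelity cost. The sketch below uses a simplified Givens-style decomposition, not the exact Clements mesh analyzed in the paper; the function names, the arcsin "rotation strength" measure, and the 30% pruning quantile are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

def haar_unitary(n):
    """Haar-random unitary via QR of a complex Gaussian matrix."""
    z = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

def givens_decompose(u):
    """Zero the strict lower triangle of u with 2x2 unitary rotations.
    A simplified stand-in for an MZI-mesh decomposition."""
    n = u.shape[0]
    m = u.astype(complex).copy()
    gs, thetas = [], []
    for j in range(n - 1):
        for i in range(n - 1, j, -1):
            a, b = m[i - 1, j], m[i, j]
            r = np.hypot(abs(a), abs(b))
            g = np.eye(n, dtype=complex)
            if r > 1e-12:
                g[i - 1, i - 1] = np.conj(a) / r
                g[i - 1, i] = np.conj(b) / r
                g[i, i - 1] = -b / r
                g[i, i] = a / r
            m = g @ m
            gs.append(g)
            thetas.append(np.arcsin(min(abs(b) / r, 1.0)) if r > 1e-12 else 0.0)
    return gs, np.array(thetas), m  # m is now diagonal (residual phases)

def reconstruct(gs, thetas, d, theta_min=0.0):
    """Rebuild U = G1^dag ... GK^dag D twice: exactly, and with rotations
    weaker than theta_min 'pruned' to their phase-only diagonal part
    (loosely analogous to switching an MZI to a pass-through state)."""
    full, pruned = d.copy(), d.copy()
    for g, th in zip(reversed(gs), reversed(thetas)):
        full = g.conj().T @ full
        if th < theta_min:
            gd = np.diag(np.diag(g) / np.abs(np.diag(g)))
            pruned = gd.conj().T @ pruned
        else:
            pruned = g.conj().T @ pruned
    return full, pruned

def fidelity(u, v):
    """Normalized overlap |tr(U^dag V)| / n, equal to 1 iff V = U up to phase."""
    return abs(np.trace(u.conj().T @ v)) / u.shape[0]

n = 12
u = haar_unitary(n)
gs, thetas, d = givens_decompose(u)
full, pruned = reconstruct(gs, thetas, np.diag(np.diag(d)),
                           theta_min=np.quantile(thetas, 0.3))
print(fidelity(u, full))    # exact reconstruction, ~1.0
print(fidelity(u, pruned))  # fidelity after pruning the weakest 30% of rotations
```

If the rotation-strength distribution is heavy-tailed, as the paper argues for photonic meshes, most of the unitary's "mass" sits in a minority of hub rotations, so pruning the weak tail costs little fidelity while saving active phase shifters.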

Citations (16)


Authors (2)