Output-Constrained Lossy Source Coding With Application to Rate-Distortion-Perception Theory (2403.14849v1)

Published 21 Mar 2024 in cs.IT, cs.LG, and math.IT

Abstract: The distortion-rate function of output-constrained lossy source coding with limited common randomness is analyzed for the special case of squared error distortion measure. An explicit expression is obtained when both source and reconstruction distributions are Gaussian. This further leads to a partial characterization of the information-theoretic limit of quadratic Gaussian rate-distortion-perception coding with the perception measure given by Kullback-Leibler divergence or squared quadratic Wasserstein distance.
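
As a small illustrative sketch (not taken from the paper): in the univariate Gaussian case, both perception measures named in the abstract admit standard closed forms, and the classical output-unconstrained quadratic Gaussian distortion-rate function D(R) = σ²·2^(−2R) gives the baseline that any output-distribution constraint can only worsen. The helper names below are hypothetical, chosen for illustration only.

```python
import numpy as np

def gaussian_kl(mu1, var1, mu2, var2):
    """KL divergence D(N(mu1, var1) || N(mu2, var2)) in nats,
    via the standard closed form for univariate Gaussians."""
    return 0.5 * (np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

def gaussian_w2_squared(mu1, var1, mu2, var2):
    """Squared quadratic (2-)Wasserstein distance between univariate
    Gaussians: (mu1 - mu2)^2 + (sigma1 - sigma2)^2, the closed form
    due to Dowson-Landau and Givens-Shortt."""
    return (mu1 - mu2) ** 2 + (np.sqrt(var1) - np.sqrt(var2)) ** 2

def classical_gaussian_distortion_rate(var, rate_bits):
    """Classical (output-unconstrained) quadratic Gaussian distortion-rate
    function D(R) = sigma^2 * 2^(-2R), with rate in bits per sample."""
    return var * 2.0 ** (-2.0 * rate_bits)

# Example: source N(0, 1), reconstruction N(0, 0.25)
print(gaussian_kl(0.0, 1.0, 0.0, 0.25))         # KL perception measure
print(gaussian_w2_squared(0.0, 1.0, 0.0, 0.25)) # squared W2 perception measure
print(classical_gaussian_distortion_rate(1.0, 1.0))  # D(1 bit) = 0.25
```

These closed forms explain why the Gaussian case is analytically tractable: when source and reconstruction are both Gaussian, the KL and squared Wasserstein-2 perception constraints reduce to simple conditions on means and variances.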
