Vector Flows and the Capacity of a Discrete Memoryless Channel (2312.16472v1)

Published 27 Dec 2023 in cs.IT and math.IT

Abstract: One of the fundamental problems of information theory, since its foundation by Shannon in 1948, has been the computation of the capacity of a discrete memoryless channel, the quantity expressing the maximum rate at which information can travel through the channel. Since no analytical solution is available for the general discrete memoryless channel, several algorithms have been proposed in the literature to approximately compute its capacity. This paper presents a novel approach to computing the capacity, based on a continuous-time dynamical system. Such a dynamical system can indeed be regarded as a continuous-time version of the Blahut-Arimoto algorithm: the update map appearing in the Blahut-Arimoto algorithm is obtained here as a suitable discretization of the presented vector flow, exploiting an analogy with game-theoretical models. Finally, this analogy suggests a high-level hardware circuit design enabling analog computation to estimate the capacity.
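For context, the discrete iteration that the paper reinterprets as a discretized vector flow is the standard Blahut-Arimoto algorithm. The sketch below is a minimal NumPy implementation of that standard iteration, not the paper's continuous-time construction; the channel matrix, tolerance, and bound-based stopping rule are illustrative choices.

```python
import numpy as np

def blahut_arimoto(W, tol=1e-9, max_iter=10_000):
    """Estimate the capacity (in bits) of a discrete memoryless channel.

    W : (m, n) array with W[x, y] = P(y | x); each row sums to 1.
    Returns (capacity_estimate_in_bits, capacity_achieving_input_distribution).
    """
    m, _ = W.shape
    p = np.full(m, 1.0 / m)                   # start from the uniform input law
    for _ in range(max_iter):
        r = p @ W                             # output law r(y) = sum_x p(x) W(y|x)
        with np.errstate(divide="ignore", invalid="ignore"):
            log_ratio = np.where(W > 0, np.log2(W / r), 0.0)
        d = (W * log_ratio).sum(axis=1)       # d(x) = D( W(.|x) || r ), in bits
        lower, upper = p @ d, d.max()         # Arimoto's bounds: lower <= C <= upper
        p = p * np.exp2(d)                    # multiplicative (exponential) update
        p /= p.sum()
        if upper - lower < tol:
            break
    return float(lower), p

# Example: binary symmetric channel, crossover 0.1 -> C = 1 - H2(0.1) ≈ 0.5310 bits
W = np.array([[0.9, 0.1],
              [0.1, 0.9]])
capacity, p_star = blahut_arimoto(W)
print(f"capacity ≈ {capacity:.4f} bits with input distribution {p_star}")
```

Replacing the full multiplicative step above with a small step size yields an Euler-type discretization of a replicator-like flow on the probability simplex, which is roughly the correspondence the abstract alludes to; the precise vector flow and its game-theoretic analogy are defined in the paper itself.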

References (33)
  1. C. E. Shannon, “A mathematical theory of communication,” The Bell System Technical Journal, vol. 27, pp. 379–423, 1948.
  2. Cambridge University Press, 2003.
  3. V. Guruswami, A. Riazanov, and M. Ye, “Arikan meets Shannon: Polar codes with near-optimal convergence to channel capacity,” in Proceedings of the 52nd Annual ACM SIGACT Symposium on Theory of Computing, STOC 2020, (New York, NY, USA), pp. 552–564, Association for Computing Machinery, 2020.
  4. I. Csiszár and G. Tusnády, “Information geometry and alternating minimization procedures,” Statistics & Decisions, Supplement Issue, vol. 1, pp. 205–237, 1984.
  5. S. Arimoto, “An algorithm for computing the capacity of arbitrary discrete memoryless channels,” IEEE Transactions on Information Theory, vol. 18, no. 1, pp. 14–20, 1972.
  6. R. Blahut, “Computation of channel capacity and rate-distortion functions,” IEEE Transactions on Information Theory, vol. 18, no. 4, pp. 460–473, 1972.
  7. G. Matz and P. Duhamel, “Information geometric formulation and interpretation of accelerated Blahut-Arimoto-type algorithms,” in Information Theory Workshop, pp. 66–70, 2004.
  8. Y. Yu, “Squeezing the Arimoto–Blahut algorithm for faster convergence,” IEEE Transactions on Information Theory, vol. 56, no. 7, pp. 3149–3157, 2010.
  9. H. Boche, R. F. Schaefer, and H. V. Poor, “Algorithmic computability and approximability of capacity-achieving input distributions,” IEEE Transactions on Information Theory, vol. 69, no. 9, pp. 5449–5462, 2023.
  10. T. Sutter, Convex programming in optimal control and information theory. Doctoral thesis, ETH Zurich, Zurich, 2017.
  11. M. A. Tope and J. M. Morris, “A PAC-bound on the channel capacity of an observed discrete memoryless channel,” in 2021 55th Annual Conference on Information Sciences and Systems (CISS), pp. 1–6, 2021.
  12. J. Hofbauer and K. Sigmund, Evolutionary games and population dynamics. Cambridge: Cambridge University Press, 1998.
  13. Cham, Switzerland: Springer International Publishing, 2016.
  14. U. Helmke and J. B. Moore, Optimization and Dynamical Systems. London: Springer London, 1994.
  15. R. W. Brockett, “Dynamical systems that sort lists, diagonalize matrices and solve linear programming problems,” Proceedings of the 27th IEEE Conference on Decision and Control, vol. 1, pp. 799–803, 1988.
  16. M. T. Chu and L. K. Norris, “Isospectral flows and abstract matrix factorizations,” SIAM Journal on Numerical Analysis, vol. 25, no. 6, pp. 1383–1391, 1988.
  17. L. Faybusovich, “Dynamical systems which solve optimization problems with linear constraints,” IMA Journal of Mathematical Control and Information, vol. 8, pp. 135–149, 1991.
  18. L. E. Baum and J. A. Eagon, “An inequality with applications to statistical estimation for probabilistic functions of Markov processes and to a model for ecology,” Bull. Am. Math. Soc., vol. 73, no. 3, pp. 360–363, 1967.
  19. L. E. Baum and G. R. Sell, “Growth transformations for functions on manifolds,” Pacific Journal of Mathematics, vol. 27, pp. 211–227, 1968.
  20. G. Palaiopanos, I. Panageas, and G. Piliouras, “Multiplicative weights update with constant step-size in congestion games: Convergence, limit cycles and chaos,” in Advances in Neural Information Processing Systems (I. Guyon, U. V. Luxburg, S. Bengio, H. Wallach, R. Fergus, S. Vishwanathan, and R. Garnett, eds.), vol. 30, Curran Associates, Inc., 2017.
  21. I. Panageas, G. Piliouras, and X. Wang, “Multiplicative Weights Updates as a distributed constrained optimization algorithm: Convergence to second-order stationary points almost always,” in Proceedings of the 36th International Conference on Machine Learning (K. Chaudhuri and R. Salakhutdinov, eds.), vol. 97 of Proceedings of Machine Learning Research, pp. 4961–4969, PMLR, June 2019.
  22. T. M. Cover and J. A. Thomas, Elements of information theory. Wiley-Interscience, second ed., 2006.
  23. E. Arikan, “Channel polarization: A method for constructing capacity-achieving codes,” in 2008 IEEE International Symposium on Information Theory, pp. 1173–1177, 2008.
  24. J. Weibull, Evolutionary game theory. Cambridge, Massachusetts: MIT Press, 1995.
  25. G. Birkhoff and G.-C. Rota, Ordinary Differential Equations. New York: John Wiley & Sons, 1989.
  26. I. M. Bomze, “Evolution towards the maximum clique,” Journal of Global Optimization, vol. 10, pp. 143–164, 1997.
  27. Cambridge University Press, 2004.
  28. S. Arora, E. Hazan, and S. Kale, “The multiplicative weights update method: a meta-algorithm and applications,” Theory Comput., vol. 8, pp. 121–164, 2012.
  29. J. J. Hopfield, “Neurons with Graded Response Have Collective Computational Properties like Those of Two-State Neurons,” Proceedings of the National Academy of Sciences of the United States of America, vol. 81, no. 10, pp. 3088–3092, 1984. Publisher: National Academy of Sciences.
  30. A. Cichocki and R. Unbehauen, Neural Networks for Optimization and Signal Processing. USA: John Wiley & Sons, Inc., 1st ed., 1993.
  31. Wiley Series in Probability and Statistics, Wiley, 1 ed., 2006.
  32. Z. Naja, F. Alberge, and P. Duhamel, “Geometrical interpretation and improvements of the Blahut-Arimoto algorithm,” in 2009 IEEE International Conference on Acoustics, Speech and Signal Processing, pp. 2505–2508, 2009.
  33. M. Pelillo and A. Torsello, “Payoff-monotonic game dynamics and the maximum clique problem,” Neural Comput., vol. 18, no. 5, pp. 1215–1258, 2006.
