
Information erasure lurking behind measures of complexity (0905.2918v2)

Published 18 May 2009 in physics.data-an, cond-mat.stat-mech, and cs.CC

Abstract: Complex systems are found in most branches of science. It is still debated how best to quantify their complexity, and to what end. One prominent measure of complexity (the statistical complexity) has an operational meaning in terms of the amount of resources needed to forecast a system's behaviour. Another (the effective measure complexity, also known as excess entropy) quantifies the mutual information stored in the system proper. We show that for any given system the two measures differ by the amount of information erased during forecasting. We interpret the difference as the inefficiency of a given model. We find a bound on the ratio of the two measures, defined as information-processing efficiency, in analogy to the second law of thermodynamics. This new link between two prominent measures of complexity provides a quantitative criterion for good models of complex systems, namely those with little information erasure.
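The relation described in the abstract can be made concrete on a small worked example. The following sketch (not taken from the paper; the Golden Mean process, its transition matrices, and the variable names are illustrative assumptions) computes the statistical complexity C_mu as the entropy of the causal-state distribution, the excess entropy E from block entropies, and reports their difference (the information erased during forecasting) and the efficiency ratio E / C_mu:

```python
# Illustrative sketch (not from the paper): compare statistical complexity
# C_mu with excess entropy E for the Golden Mean process, a standard
# two-state example. The difference C_mu - E is interpreted as information
# erased during forecasting; E / C_mu is the information-processing efficiency.
import math
from itertools import product

# Labelled transition matrices T[s][i][j] = P(emit symbol s and move i -> j)
# for the Golden Mean process (no two consecutive 0s); causal states A=0, B=1.
T = {
    0: [[0.0, 0.5], [0.0, 0.0]],   # from A: emit 0, go to B
    1: [[0.5, 0.0], [1.0, 0.0]],   # from A: emit 1, stay; from B: emit 1, go to A
}
pi = [2.0 / 3.0, 1.0 / 3.0]        # stationary distribution over causal states

def word_prob(word):
    """Probability of a finite word under the stationary process."""
    v = pi[:]
    for s in word:
        v = [sum(v[i] * T[s][i][j] for i in range(2)) for j in range(2)]
    return sum(v)

def block_entropy(L):
    """Shannon entropy (bits) of length-L blocks."""
    return -sum(p * math.log2(p)
                for w in product((0, 1), repeat=L)
                if (p := word_prob(w)) > 0)

# Statistical complexity: entropy of the causal-state distribution.
C_mu = -sum(p * math.log2(p) for p in pi)

# Entropy rate and excess entropy from block entropies; the estimates are
# exact here because this process is Markov of order 1.
L = 10
h_mu = block_entropy(L + 1) - block_entropy(L)
E = block_entropy(L) - L * h_mu

erasure = C_mu - E        # information erased during forecasting
efficiency = E / C_mu     # at most 1, since E <= C_mu

print(f"C_mu = {C_mu:.4f} bits, E = {E:.4f} bits")
print(f"erasure = {erasure:.4f} bits, efficiency = {efficiency:.3f}")
```

For this process the sketch gives C_mu ≈ 0.918 bits and E ≈ 0.252 bits, so most of the model's stored information is erased rather than transmitted to the future, illustrating the inefficiency the paper quantifies.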

Citations (6)
