
Information Loss in Static Nonlinearities (1102.4794v2)

Published 23 Feb 2011 in cs.IT, math.IT, and nlin.SI

Abstract: In this work, conditional entropy is used to quantify the information loss induced by passing a continuous random variable through a memoryless nonlinear input-output system. We derive an expression for the information loss depending on the input density and the nonlinearity and show that the result is strongly related to the non-injectivity of the considered system. Tight upper bounds are presented that can be evaluated more easily than the information loss itself, whose direct evaluation involves the logarithm of a sum. Application of our results is illustrated on a set of examples.
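The abstract's central idea — that the information loss H(X|Y) is tied to non-injectivity — can be illustrated with a minimal Monte Carlo sketch (not taken from the paper itself). For the non-injective nonlinearity Y = X² with a symmetric input density, the only information destroyed is the sign of X, and since both preimages of each y are equally likely, the loss is exactly 1 bit. The binning scheme and sample sizes below are illustrative choices, not part of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)
y = x ** 2  # memoryless, non-injective nonlinearity

# Bin Y and, within each bin, estimate the entropy of the lost
# information (the sign of X). For a symmetric input density the
# two preimages of each y are equally likely, so H(X|Y) = 1 bit.
edges = np.quantile(y, np.linspace(0, 1, 101))
idx = np.digitize(y, edges[1:-1])

loss = 0.0
for b in range(100):
    mask = idx == b
    p = np.mean(x[mask] > 0)  # P(sign(X) = +1 | Y in bin b)
    if 0 < p < 1:
        # weight each bin's binary entropy by its probability mass
        loss += mask.mean() * (-p * np.log2(p) - (1 - p) * np.log2(1 - p))

print(f"estimated information loss: {loss:.3f} bits")
```

The estimate converges to 1 bit from below, consistent with the intuition that a static nonlinearity loses information exactly where it fails to be injective.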

Citations (12)
