
Abstract

Shannon entropy is the foundation of information theory and has proven effective in many fields, such as communications. Renyi entropy and Chernoff information are two other popular measures of information with wide applications. Mutual information is an effective measure of channel information because it reflects the relation between the output variables and the input variables. In this paper, we reexamine these channel information measures from a big data viewpoint by means of the ACE algorithm. The simulation results show that the decompositions of Shannon and Chernoff mutual information with respect to the channel parameters are almost the same. In this sense, Shannon shakes hands with Chernoff, since they are different measures of the same information quantity. We also propose a conjecture that there is an intrinsic nature of channel information that is decided only by the channel parameters.
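For reference, the standard textbook definitions of the measures named in the abstract are sketched below (the abstract does not state them, and the paper's own notation may differ); here p denotes a discrete distribution, P and Q two distributions on the same alphabet, and X, Y the channel input and output:

H(X) = -\sum_x p(x) \log p(x)   (Shannon entropy)

H_\alpha(X) = \frac{1}{1-\alpha} \log \sum_x p(x)^\alpha,\quad \alpha > 0,\ \alpha \neq 1   (Renyi entropy of order \alpha)

C(P, Q) = -\min_{0 \le \lambda \le 1} \log \sum_x P(x)^\lambda Q(x)^{1-\lambda}   (Chernoff information)

I(X; Y) = \sum_{x,y} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)} = H(Y) - H(Y \mid X)   (Shannon mutual information)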
