Abstract

Information-theoretic measures such as relative entropy and correlation are extremely useful when modeling or analyzing the interaction of probabilistic systems. We survey the quantum generalizations of five such measures and point out some of their commonalities and interpretations. In particular, we find the application of information theory to distributional semantics useful. By modeling the distributional meaning of words as density operators rather than vectors, more of their semantic structure may be exploited. Furthermore, properties of and interactions between words, such as ambiguity, similarity and entailment, can be simulated more richly and intuitively using methods from quantum information theory.
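The core idea above, representing a word as a density operator (a mixture of sense vectors) and comparing words with quantum information measures, can be sketched as follows. This is an illustrative toy, not the paper's implementation; the function names and the two-sense example are assumptions made here for demonstration.

```python
import numpy as np

def density_from_vectors(vectors, weights):
    # Build rho = sum_i p_i |v_i><v_i| from hypothetical sense vectors.
    dim = len(vectors[0])
    rho = np.zeros((dim, dim))
    for p, v in zip(weights, vectors):
        v = np.asarray(v, dtype=float)
        v = v / np.linalg.norm(v)
        rho += p * np.outer(v, v)
    return rho

def von_neumann_entropy(rho):
    # S(rho) = -Tr(rho log rho), computed from rho's eigenvalues.
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

def quantum_relative_entropy(rho, sigma, eps=1e-12):
    # S(rho || sigma) = Tr(rho log rho) - Tr(rho log sigma).
    # eps regularizes sigma's zero eigenvalues, where S diverges.
    er = np.linalg.eigvalsh(rho)
    er = er[er > eps]
    es, Vs = np.linalg.eigh(sigma)
    log_sigma = Vs @ np.diag(np.log(np.clip(es, eps, None))) @ Vs.T
    return float(np.sum(er * np.log(er)) - np.trace(rho @ log_sigma))

# A maximally ambiguous word as an even mixture of two orthogonal
# "senses"; an unambiguous word as a single pure state.
ambiguous = density_from_vectors([[1, 0], [0, 1]], [0.5, 0.5])
unambiguous = density_from_vectors([[1, 0]], [1.0])

print(von_neumann_entropy(unambiguous))  # ~0: no ambiguity
print(von_neumann_entropy(ambiguous))    # ~log 2: maximal two-way ambiguity
print(quantum_relative_entropy(unambiguous, ambiguous))
```

Here von Neumann entropy quantifies a word's ambiguity (zero for a pure state, log 2 for an even two-sense mixture), and relative entropy gives an asymmetric divergence of the kind the abstract suggests for modeling entailment.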
