Potential Conditional Mutual Information: Estimators, Properties and Applications

(1710.05012)
Published Oct 13, 2017 in cs.IT, cs.LG, math.IT, and stat.ML

Abstract

The conditional mutual information I(X;Y|Z) measures the average information that X and Y contain about each other, given Z. This is an important primitive in many learning problems, including conditional independence testing, graphical model inference, causal strength estimation, and time-series problems. In several applications, it is desirable to have a functional purely of the conditional distribution p_{Y|X,Z} rather than of the joint distribution p_{X,Y,Z}. We define the potential conditional mutual information as the conditional mutual information calculated with a modified joint distribution p_{Y|X,Z} q_{X,Z}, where q_{X,Z} is a potential distribution, fixed a priori. We develop k-nearest-neighbor based estimators for this functional, employing importance sampling and a coupling trick, and prove the finite-k consistency of such an estimator. We demonstrate that the estimator has excellent practical performance and show an application in dynamical system inference.
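The abstract states the core definition only in words; written out, it amounts to the following (the symbol I_q and the tilde notation are shorthand introduced here for exposition, not necessarily the paper's own):

```latex
% Potential conditional mutual information: the ordinary CMI evaluated
% under the modified joint distribution \tilde{p}_{X,Y,Z} = p_{Y|X,Z}\, q_{X,Z}.
% The names I_q and \tilde{p} are expository shorthand, not the paper's notation.
I_q(X; Y \mid Z)
  \;=\; \mathbb{E}_{\tilde{p}}\!\left[
      \log \frac{\tilde{p}(X, Y \mid Z)}{\tilde{p}(X \mid Z)\,\tilde{p}(Y \mid Z)}
    \right],
\qquad
\tilde{p}_{X,Y,Z} \;=\; p_{Y \mid X,Z}\, q_{X,Z}.
```

For intuition about the k-nearest-neighbor machinery the abstract refers to, below is a minimal Python sketch of the standard Frenzel-Pompe kNN estimator of ordinary CMI. It is not the paper's estimator (which additionally uses importance sampling and a coupling trick to target the potential distribution q_{X,Z}); the function name and the default k are assumptions made for illustration.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def knn_cmi(x, y, z, k=5):
    """Frenzel-Pompe kNN estimate of I(X;Y|Z) from (n, d_*) arrays.

    A generic baseline only -- NOT the paper's importance-sampled
    potential-CMI estimator; shown to illustrate the kNN machinery.
    """
    xyz = np.hstack([x, y, z])
    xz = np.hstack([x, z])
    yz = np.hstack([y, z])
    n = len(xyz)

    # Distance to each point's k-th nearest neighbour in the joint
    # (x, y, z) space under the max-norm; the nearest "neighbour"
    # returned is the point itself, hence k + 1 and the last column.
    eps = cKDTree(xyz).query(xyz, k=k + 1, p=np.inf)[0][:, -1]

    # Neighbour counts within eps in each marginal space (the -1 drops
    # the query point, which query_ball_point always includes).
    t_xz, t_yz, t_z = cKDTree(xz), cKDTree(yz), cKDTree(z)
    n_xz = np.array([len(t_xz.query_ball_point(xz[i], eps[i], p=np.inf)) - 1
                     for i in range(n)])
    n_yz = np.array([len(t_yz.query_ball_point(yz[i], eps[i], p=np.inf)) - 1
                     for i in range(n)])
    n_z = np.array([len(t_z.query_ball_point(z[i], eps[i], p=np.inf)) - 1
                    for i in range(n)])

    # Digamma-based combination of the counts (Frenzel & Pompe, 2007).
    return digamma(k) - np.mean(digamma(n_xz + 1) + digamma(n_yz + 1)
                                - digamma(n_z + 1))

if __name__ == "__main__":
    # Sanity check: X and Y are conditionally independent given Z,
    # so the estimate should be close to zero.
    rng = np.random.default_rng(0)
    z = rng.normal(size=(2000, 1))
    x = z + 0.5 * rng.normal(size=(2000, 1))
    y = z + 0.5 * rng.normal(size=(2000, 1))
    print(knn_cmi(x, y, z))
```

The paper's contribution can be read as modifying this kind of estimator so that the (x, z) samples are effectively reweighted toward the fixed potential distribution q_{X,Z}, which is where the importance sampling and the coupling trick enter.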
