Emergent Mind

Representation of Context-Specific Causal Models with Observational and Interventional Data

(2101.09271)
Published Jan 22, 2021 in math.ST, math.CO, stat.ME, stat.ML, and stat.TH

Abstract

We consider the problem of representing causal models that encode context-specific information for discrete data using a proper subclass of staged tree models which we call CStrees. We show that the context-specific information encoded by a CStree can be equivalently expressed via a collection of DAGs. As not all staged tree models admit this property, CStrees are a subclass that provides a transparent, intuitive and compact representation of context-specific causal information. We prove that CStrees admit a global Markov property which yields a graphical criterion for model equivalence generalizing that of Verma and Pearl for DAG models. These results extend to the general interventional model setting, making CStrees the first family of context-specific models admitting a characterization of interventional model equivalence. We also provide a closed-form formula for the maximum likelihood estimator of a CStree and use it to show that the Bayesian information criterion is a locally consistent score function for this model class. The performance of CStrees is analyzed on both simulated and real data, where we see that modeling with CStrees instead of general staged trees does not result in a significant loss of predictive accuracy, while affording DAG representations of context-specific causal information.
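The abstract's central notion is context-specific independence: a conditional independence that holds only for particular values of the conditioning variables, which is exactly the information a staged tree (and hence a CStree) can encode but a single DAG cannot. As a hedged illustration (not taken from the paper), the toy distribution below over three binary variables has X3 independent of X1 in the context X2 = 0 but not in the context X2 = 1; in staged-tree terms, the two nodes reached by X2 = 0 would be merged into one stage. All numerical values are made up for the example.

```python
from itertools import product

# Hypothetical joint distribution over binary (X1, X2, X3) exhibiting a
# context-specific independence: X3 is independent of X1 given X2 = 0,
# but depends on X1 given X2 = 1. (Illustrative only; not from the paper.)

p_x1 = {0: 0.5, 1: 0.5}                        # P(X1)
p_x2 = {0: {0: 0.6, 1: 0.4},                   # P(X2 | X1 = 0)
        1: {0: 0.3, 1: 0.7}}                   # P(X2 | X1 = 1)
p_x3 = {
    (0, 0): {0: 0.8, 1: 0.2},
    (1, 0): {0: 0.8, 1: 0.2},   # same row as (0, 0): these nodes share a stage
    (0, 1): {0: 0.5, 1: 0.5},
    (1, 1): {0: 0.1, 1: 0.9},   # differs: no independence in context X2 = 1
}                                # P(X3 | X1, X2)

joint = {
    (a, b, c): p_x1[a] * p_x2[a][b] * p_x3[(a, b)][c]
    for a, b, c in product((0, 1), repeat=3)
}

def cond_p_x3(x1, x2):
    """P(X3 = 1 | X1 = x1, X2 = x2), recovered from the joint."""
    den = joint[(x1, x2, 0)] + joint[(x1, x2, 1)]
    return joint[(x1, x2, 1)] / den

# Context X2 = 0: the conditionals agree, so X3 is independent of X1 here.
assert abs(cond_p_x3(0, 0) - cond_p_x3(1, 0)) < 1e-12
# Context X2 = 1: they differ, so the independence is context-specific.
assert abs(cond_p_x3(0, 1) - cond_p_x3(1, 1)) > 0.1
```

Because the independence holds only in one context, no single DAG over (X1, X2, X3) captures it exactly; the paper's point is that a CStree expresses such structure via a collection of context-indexed DAGs.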
