
Estimating Granger Causality with Unobserved Confounders via Deep Latent-Variable Recurrent Neural Network (1909.03704v1)

Published 9 Sep 2019 in cs.LG and stat.ML

Abstract: Granger causality analysis, one of the most popular time-series causality methods, has been widely used in economics and neuroscience. However, unobserved confounding is a fundamental problem in observational studies that remains unsolved for non-linear Granger causality. Applied work often handles this problem with proxy variables, which can be treated as noisy measurements of the confounder. But proxy variables have been shown to be unreliable because of the bias they may induce. In this paper, we attempt to "recover" the unobserved confounders for Granger causality. We use a generative model with a latent variable to relate the unobserved confounders to the observed variables (the tested variable and the proxy variables). The posterior distribution of the latent variable represents the confounder distribution and can be sampled to obtain estimated confounders. We adopt a variational autoencoder to approximate the intractable posterior, and a recurrent neural network to model the temporal structure of the data. We evaluate our method on synthetic and semi-synthetic datasets. The results show that our estimated confounders outperform the proxy variables for non-linear Granger causality with multiple proxies on the semi-synthetic dataset, but performance on the synthetic dataset and across different proxy noise levels remains poor. Any advice would be greatly appreciated.
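The paper's deep latent-variable model is not reproduced here, but the Granger-causality test it builds on can be sketched in its simplest linear form: fit an autoregression of y on its own history (the restricted model), then refit with x's history added (the full model), and say x Granger-causes y if the added lags substantially reduce the residual sum of squares. The sketch below assumes a one-lag linear model with illustrative coefficients; all names and the simulated data are hypothetical, and the paper's actual method replaces this linear fit with a VAE-plus-RNN model that also recovers latent confounders.

```python
import random

def ols_rss(X, y):
    """Fit ordinary least squares via the normal equations (Gaussian
    elimination with partial pivoting) and return the residual sum of
    squares. Pure-Python, for illustration only."""
    k = len(X[0])
    A = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return sum((yi - sum(bc * xc for bc, xc in zip(beta, row))) ** 2
               for row, yi in zip(X, y))

def granger_rss(x, y):
    """Residual sums of squares of the restricted model (y's own lag only)
    and the full model (y's lag plus x's lag), both with an intercept."""
    target = y[1:]
    restricted = [[1.0, y[t - 1]] for t in range(1, len(y))]
    full = [[1.0, y[t - 1], x[t - 1]] for t in range(1, len(y))]
    return ols_rss(restricted, target), ols_rss(full, target)

# Simulate a series where x drives y with a one-step lag,
# so x Granger-causes y by construction.
random.seed(0)
n = 500
x, y = [0.0], [0.0]
for _ in range(n - 1):
    x.append(0.5 * x[-1] + random.gauss(0, 1))
    y.append(0.5 * y[-1] + 0.8 * x[-2] + random.gauss(0, 1))

rss_r, rss_f = granger_rss(x, y)
# Adding x's history should clearly reduce the unexplained variance in y.
print(f"restricted RSS = {rss_r:.1f}, full RSS = {rss_f:.1f}")
```

The paper's contribution concerns what this test misses: when a hidden confounder drives both x and y, the RSS drop can appear even without a causal link, which is why the authors condition the regression on confounders estimated from proxies via the VAE posterior rather than on the noisy proxies themselves.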

Citations (3)

Authors (1)
