
Exploration of synergistic and redundant information sharing in static and dynamical Gaussian systems (1411.2832v2)

Published 11 Nov 2014 in cs.IT, math.IT, and q-bio.NC

Abstract: To fully characterize the information that two 'source' variables carry about a third 'target' variable, one must decompose the total information into redundant, unique and synergistic components, i.e. obtain a partial information decomposition (PID). However, Shannon's theory of information does not provide formulae to fully determine these quantities. Several recent studies have begun addressing this. Some possible definitions for PID quantities have been proposed, and some analyses have been carried out on systems composed of discrete variables. Here we present the first in-depth analysis of PIDs on Gaussian systems, both static and dynamical. We show that, for a broad class of Gaussian systems, previously proposed PID formulae imply that: (i) redundancy reduces to the minimum information provided by either source variable, and hence is independent of correlation between sources; (ii) synergy is the extra information contributed by the weaker source when the stronger source is known, and can either increase or decrease with correlation between sources. We find that Gaussian systems frequently exhibit net synergy, i.e. the information carried jointly by both sources is greater than the sum of informations carried by each source individually. Drawing from several explicit examples, we discuss the implications of these findings for measures of information transfer and information-based measures of complexity, both generally and within a neuroscience setting. Importantly, by providing independent formulae for synergy and redundancy applicable to continuous time-series data, we open up a new approach to characterizing and quantifying information sharing amongst complex system variables.
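The decomposition described in the abstract can be sketched numerically. The snippet below is an illustrative assumption, not code from the paper: it builds a simple static Gaussian system (two correlated sources X1, X2 and a target Y = X1 + X2 + noise), computes Gaussian mutual informations from covariance determinants, and applies a minimum-mutual-information (MMI) style PID in which redundancy is the minimum single-source information, as in finding (i). All variable names and parameter values (`c`, `noise_var`) are hypothetical choices for the sketch.

```python
import numpy as np

def gaussian_mi(cov, ix, iy):
    """I(X;Y) in nats for jointly Gaussian variables, computed from the
    full covariance matrix as 0.5 * log(|S_xx| |S_yy| / |S_xy|)."""
    ix, iy = list(ix), list(iy)
    s_x = cov[np.ix_(ix, ix)]
    s_y = cov[np.ix_(iy, iy)]
    s_joint = cov[np.ix_(ix + iy, ix + iy)]
    return 0.5 * np.log(np.linalg.det(s_x) * np.linalg.det(s_y)
                        / np.linalg.det(s_joint))

# Hypothetical system: sources X1, X2 with correlation c, target Y = X1 + X2 + N
c = 0.3          # source correlation (assumed value)
noise_var = 1.0  # variance of the additive noise N (assumed value)
cov_x = np.array([[1.0, c], [c, 1.0]])

# Full covariance over (X1, X2, Y):
# Cov(Xi, Y) = Cov(Xi, X1) + Cov(Xi, X2); Var(Y) = sum of cov_x entries + noise_var
cov = np.zeros((3, 3))
cov[:2, :2] = cov_x
cov[:2, 2] = cov_x.sum(axis=1)
cov[2, :2] = cov_x.sum(axis=0)
cov[2, 2] = cov_x.sum() + noise_var

i1 = gaussian_mi(cov, [0], [2])      # I(X1; Y)
i2 = gaussian_mi(cov, [1], [2])      # I(X2; Y)
i12 = gaussian_mi(cov, [0, 1], [2])  # I(X1, X2; Y)

# MMI-style PID: redundancy = minimum single-source information
red = min(i1, i2)
unq1, unq2 = i1 - red, i2 - red      # unique informations
syn = i12 - i1 - i2 + red            # synergy
net_synergy = i12 - i1 - i2          # positive => joint info exceeds the sum
```

With these (symmetric) parameters both sources carry equal information, so the unique terms vanish and `net_synergy` comes out positive, matching the abstract's observation that such Gaussian systems frequently exhibit net synergy.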

Citations (158)


Authors (1)
