State-dependent information
State-dependent measures that converge to the mutual information
From Wikipedia, the free encyclopedia
In information theory, state-dependent information is the generic name given to the family of state-dependent measures that, in expectation over states, equal the mutual information.
State-dependent information measures often appear in neuroscience applications.
Let $X$ and $Y$ be random variables, and let $y$ be a state (a particular value) of $Y$. The state-dependent information between a random variable $X$ and a state $y$ is written as $I(X; y)$. There are currently three known varieties of state-dependent information: specific-surprise, specific-information, and state-specific information.
Specific-Surprise
The specific-surprise, $I_{SS}(X; y)$, is defined by a Kullback–Leibler divergence,
- $I_{SS}(X; y) = D_{KL}\big(p(X \mid y) \,\big\|\, p(X)\big) = \sum_{x} p(x \mid y) \log_2 \frac{p(x \mid y)}{p(x)}$.
As a special case of the chain rule for Kullback–Leibler divergences, specific-surprise follows the chain rule for variables. Using $Z$ as an additional random variable, this is specifically,
- $I_{SS}(X, Z; y) = I_{SS}(X; y) + \sum_{x} p(x \mid y)\, D_{KL}\big(p(Z \mid x, y) \,\big\|\, p(Z \mid x)\big)$.
Intuitively, specific-surprise can be thought of as answering "how much did my beliefs about $X$ change upon learning that $Y = y$?", which is zero when there is no change. It is nonnegative. Specific-surprise has also been called "Bayesian surprise".
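The definition above can be checked numerically. The following sketch assumes a toy joint distribution; the names `p_xy` and `specific_surprise` are illustrative, not from any library. It verifies that specific-surprise is nonnegative and that its expectation over states $y$ recovers the mutual information $I(X; Y)$.

```python
import math

# Toy joint distribution over X in {0, 1} and Y in {0, 1}: p_xy[x][y].
p_xy = [[0.4, 0.1],
        [0.2, 0.3]]

p_x = [sum(row) for row in p_xy]                              # marginal p(x)
p_y = [sum(p_xy[x][y] for x in range(2)) for y in range(2)]   # marginal p(y)

def specific_surprise(y):
    """I_SS(X; y) = D_KL( p(X|y) || p(X) ), in bits."""
    return sum(
        (p_xy[x][y] / p_y[y]) * math.log2((p_xy[x][y] / p_y[y]) / p_x[x])
        for x in range(2)
    )

# Mutual information I(X; Y) computed directly from the joint distribution.
mi = sum(
    p_xy[x][y] * math.log2(p_xy[x][y] / (p_x[x] * p_y[y]))
    for x in range(2) for y in range(2)
)

# Specific-surprise is nonnegative for every state y ...
assert all(specific_surprise(y) >= 0 for y in range(2))
# ... and its expectation over y equals the mutual information.
avg = sum(p_y[y] * specific_surprise(y) for y in range(2))
assert abs(avg - mi) < 1e-12
```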
Specific-Information
The specific-information, $I_{SI}(X; y)$, is defined by a difference of entropies,
- $I_{SI}(X; y) = H(X) - H(X \mid y)$, where $H(X \mid y) = -\sum_{x} p(x \mid y) \log_2 p(x \mid y)$.
Specific-information follows the chain rule for states. Using a state $z$ of an additional random variable $Z$, this is specifically,
- $I_{SI}(X; y, z) = I_{SI}(X; y) + \big[H(X \mid y) - H(X \mid y, z)\big]$.
Specific-information is interpreted as "how much did the uncertainty about $X$ change upon learning $y$?" It can be positive or negative. When $X$ follows a uniform distribution, $I_{SS}(X; y)$ and $I_{SI}(X; y)$ are equivalent.
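The possibility of negative values can be illustrated numerically. This sketch assumes a toy joint distribution in which the marginal $p(X)$ is non-uniform but the conditional $p(X \mid y{=}1)$ is uniform, so learning $y = 1$ increases the uncertainty about $X$; the names `p_xy` and `specific_information` are illustrative.

```python
import math

# Toy joint distribution p_xy[x][y] with a skewed marginal p(x) = (0.9, 0.1).
p_xy = [[0.85, 0.05],
        [0.05, 0.05]]

p_x = [sum(row) for row in p_xy]
p_y = [sum(p_xy[x][y] for x in range(2)) for y in range(2)]

def entropy(dist):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def specific_information(y):
    """I_SI(X; y) = H(X) - H(X|y); unlike specific-surprise, may be negative."""
    p_x_given_y = [p_xy[x][y] / p_y[y] for x in range(2)]
    return entropy(p_x) - entropy(p_x_given_y)

# Seeing y = 1 makes X *more* uncertain (conditional is uniform), so I_SI < 0.
assert specific_information(1) < 0

# Averaged over states y, specific-information still recovers I(X; Y).
mi = sum(p_xy[x][y] * math.log2(p_xy[x][y] / (p_x[x] * p_y[y]))
         for x in range(2) for y in range(2))
avg = sum(p_y[y] * specific_information(y) for y in range(2))
assert abs(avg - mi) < 1e-12
```

This works because the expectation $\sum_y p(y)\, [H(X) - H(X \mid y)] = H(X) - H(X \mid Y) = I(X; Y)$, even though individual terms may dip below zero.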
State-Specific Information
The state-specific information, $I_{SSI}(x; y)$, is a synonym for the pointwise mutual information,
- $I_{SSI}(x; y) = \log_2 \frac{p(x, y)}{p(x)\, p(y)}$.
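Because pointwise mutual information depends on a pair of states rather than a single state, its expectation is taken over the joint distribution. A minimal sketch, assuming the same style of toy joint distribution as above (`p_xy` and `pmi` are illustrative names):

```python
import math

# Toy joint distribution p_xy[x][y].
p_xy = [[0.4, 0.1],
        [0.2, 0.3]]

p_x = [sum(row) for row in p_xy]
p_y = [sum(p_xy[x][y] for x in range(2)) for y in range(2)]

def pmi(x, y):
    """Pointwise mutual information i(x; y) = log2( p(x,y) / (p(x) p(y)) )."""
    return math.log2(p_xy[x][y] / (p_x[x] * p_y[y]))

# PMI is positive for co-occurring pairs, negative for anti-correlated ones.
assert pmi(0, 0) > 0 and pmi(0, 1) < 0

# Its expectation over the *joint* distribution is the mutual information.
mi = sum(p_xy[x][y] * pmi(x, y) for x in range(2) for y in range(2))
assert mi > 0
```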
References
- Timme, Nicholas; Lapish, Christopher (2018). "A Tutorial for Information Theory in Neuroscience". eNeuro. 5 (3). doi:10.1523/ENEURO.0052-18.2018. PMC 6131830.
- DeWeese, Michael; Meister, Markus (1999). "How to measure the information gained from one symbol". Network: Computation in Neural Systems. 10 (4): 325–40. CiteSeerX 10.1.1.553.8013. doi:10.1088/0954-898X/10/4/303. PMID 10695762.
- Butts, Daniel (2003). "How much information is associated with a particular stimulus?". Network: Computation in Neural Systems. 14 (2): 177–87. doi:10.1088/0954-898X/14/2/301. PMID 12790180.
- Itti, Laurent; Baldi, Pierre (2009). "Bayesian surprise attracts human attention". Vision Research. 49 (10): 1295–1306. doi:10.1016/j.visres.2008.09.007. PMC 2782645.