Single Stimulus.

A scientific paper by Michele Bezzi
Accenture Technology Labs, Sophia Antipolis, France

Shannon mutual information measures how much information, on average, a set of neural activities conveys about a set of stimuli, and it has been used extensively to study neural coding in different brain areas. To apply a similar approach to the encoding of a single stimulus, a quantity specific to that stimulus must be introduced. Four different measures have been proposed in the literature, but none of them satisfies all the intuitive properties (non-negativity, additivity) that characterize mutual information. The author presents a detailed analysis of the meanings and properties of these four definitions and shows that all of them satisfy at least a weaker additivity condition, i.e. additivity limited to the response set. This makes them usable for analysing correlated coding, as illustrated with a toy example based on hippocampal place cells.
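To see how single-stimulus quantities of this kind relate to the full mutual information, the following minimal Python sketch (not taken from the paper) computes two measures commonly discussed in this context: the stimulus-specific surprise i(s) = sum_r p(r|s) log2[p(r|s)/p(r)] and the specific information i2(s) = H(R) - H(R|s). The joint distribution, the variable names, and the two-stimulus/three-response setup are illustrative assumptions, not the paper's example; both measures average, weighted by p(s), to the same Shannon mutual information I(S;R).

import numpy as np

# Hypothetical joint distribution p(s, r): 2 stimuli x 3 responses.
# The numbers are illustrative only.
p_sr = np.array([[0.30, 0.10, 0.10],
                 [0.05, 0.15, 0.30]])

p_s = p_sr.sum(axis=1)               # marginal p(s)
p_r = p_sr.sum(axis=0)               # marginal p(r)
p_r_given_s = p_sr / p_s[:, None]    # conditional p(r|s)

# Stimulus-specific surprise: i(s) = sum_r p(r|s) log2[ p(r|s) / p(r) ]
i_surprise = (p_r_given_s * np.log2(p_r_given_s / p_r)).sum(axis=1)

# Specific information: i2(s) = H(R) - H(R|s)
H_R = -(p_r * np.log2(p_r)).sum()
H_R_given_s = -(p_r_given_s * np.log2(p_r_given_s)).sum(axis=1)
i_specific = H_R - H_R_given_s

# Both per-stimulus decompositions average to the same I(S;R).
I_SR = (p_s * i_surprise).sum()
assert np.isclose(I_SR, (p_s * i_specific).sum())

print("per-stimulus surprise:    ", i_surprise)
print("per-stimulus specific info:", i_specific)
print("mutual information I(S;R):", I_SR)

The sketch also illustrates the trade-off mentioned in the abstract: the surprise measure is a Kullback-Leibler divergence and therefore never negative, whereas the specific information i2(s) can be negative for some stimuli, so no single definition inherits both non-negativity and additivity from mutual information.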
Access this article at the Cornell University Library e-print service
