Information theory quantifies how much information a neural response carries about the stimulus. That quantity can be compared with the information transferred under particular models of the stimulus-response function, and with the maximum possible information transfer. Such comparisons are crucial because they test the assumptions built into any neurophysiological analysis.
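As a concrete illustration of the quantity being discussed, here is a minimal sketch of the mutual information I(S;R) between a discrete stimulus S and response R. The joint distribution below is made up for illustration; it is not data from the paper.

```python
# Hypothetical sketch: mutual information between a discrete stimulus
# and a discrete neural response, computed from a joint probability table.
import numpy as np

def mutual_information(p_joint):
    """I(S;R) = sum_{s,r} p(s,r) * log2[ p(s,r) / (p(s) p(r)) ], in bits."""
    p_s = p_joint.sum(axis=1, keepdims=True)   # marginal over responses
    p_r = p_joint.sum(axis=0, keepdims=True)   # marginal over stimuli
    nz = p_joint > 0                           # skip zero entries (0*log 0 = 0)
    return float(np.sum(p_joint[nz] * np.log2(p_joint[nz] / (p_s @ p_r)[nz])))

# A response perfectly correlated with a binary stimulus carries 1 bit.
p = np.array([[0.5, 0.0],
              [0.0, 0.5]])
print(mutual_information(p))  # 1.0
```

When stimulus and response are statistically independent, the same function returns 0 bits, which is the sense in which mutual information measures how much the response "says" about the stimulus.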
[Figure: An example of reverse reconstruction]
The authors review the basics of information theory before demonstrating its use in neural coding, where it validates simple stimulus-response models of how neurons encode dynamic stimuli.
By Alexander Borst & Frederic E. Theunissen
Nature Neuroscience 2, 947–957 (1999)
doi:10.1038/14731
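The "reverse reconstruction" shown in the figure can be sketched as decoding the stimulus from the response with a least-squares linear filter. Everything below is synthetic and illustrative (the encoding kernel, noise level, and lag range are assumptions, not values from the paper):

```python
# Hedged sketch of reverse reconstruction: estimate the stimulus from the
# response with a linear decoding filter fit by least squares.
import numpy as np

rng = np.random.default_rng(0)
stim = rng.standard_normal(500)                 # white-noise stimulus (made up)
kernel = np.array([0.6, 0.3, 0.1])              # assumed encoding filter
resp = np.convolve(stim, kernel, mode="same") + 0.05 * rng.standard_normal(500)

# Decoding filter: predict stim[t] from responses at lags -2 .. +2
X = np.column_stack([np.roll(resp, -k) for k in range(-2, 3)])
w, *_ = np.linalg.lstsq(X, stim, rcond=None)
stim_hat = X @ w                                # reconstructed stimulus

print(np.corrcoef(stim, stim_hat)[0, 1])        # reconstruction quality
```

Comparing the information in the reconstructed stimulus with the total stimulus information is one way such a linear model can be validated against the information-theoretic bound.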


Because these models require specification of spike timing precision, they can reveal which time scales contain information in neural coding. This approach shows that dynamic stimuli can be encoded efficiently by single neurons and that each spike contributes to information transmission.
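The role of spike-timing precision can be sketched as follows: the assumed precision (bin width) determines the binary response "words" over which information is estimated, and finer bins can preserve distinctions that coarse bins destroy. The spike times and bin widths below are made-up illustrations, not data from the paper.

```python
# Hedged sketch: binning a spike train at two assumed timing precisions.
# Integer millisecond times are used to keep the binning arithmetic exact.
import numpy as np

def spike_words(spike_times_ms, t_max_ms, dt_ms, word_len=4):
    """Bin a spike train at resolution dt_ms, then split it into words."""
    train = np.zeros(t_max_ms // dt_ms, dtype=int)
    for t in spike_times_ms:
        train[t // dt_ms] = 1                   # 1 if the bin contains a spike
    n_words = len(train) // word_len
    return [tuple(w) for w in train[:n_words * word_len].reshape(n_words, word_len)]

spikes = [3, 11, 16, 30]                        # spike times in ms (made up)
coarse = spike_words(spikes, 40, 10)            # 10 ms bins: 11 and 16 ms merge
fine = spike_words(spikes, 40, 5)               # 5 ms bins: they stay distinct
print(coarse)  # [(1, 1, 0, 1)]
print(fine)    # [(1, 0, 1, 1), (0, 0, 1, 0)]
```

At the coarse resolution the spikes at 11 ms and 16 ms fall into one bin and become indistinguishable; at the finer resolution they produce distinct words, so the response can in principle carry more information, which is why the assumed precision matters for these estimates.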
The authors argue, however, that the data obtained so far do not suggest a temporal code, in which the placement of spikes relative to each other yields additional information.
The paper is published in Nature Neuroscience, where it can be read online or downloaded in PDF format.
