Information in the Firing Rate
Information is not defined in the abstract, but always relative to a given, fixed environment. (This environment goes by the technical name of the ``statistical ensemble of inputs''.) Hence, we make no statements about how well or poorly the neuron transmits information about any stimulus ensemble other than the one used for learning.
If the number of (stimulus, firing rate) pairs in the data is limited, statistical biases (Treves and Panzeri, 1995; Panzeri and Treves, 1996; Strong et al., 1997) lead to an upward bias in the estimated mutual information.
To minimize these biases in estimating the information-theoretic quantities, we take 125,000 samples of the stimuli and the associated firing rates produced by the model neuron, with the conductance parameters frozen to the final values given in the section Adapted Parameters. Bias correction terms are computed as in Panzeri and Treves (1996).
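The shape of such a correction can be sketched as follows. This is a minimal illustration, not the full Panzeri–Treves procedure: the term added below has the same form as the leading term of their expansion (the Miller–Madow correction), while their refinement of the number of effectively occupied response bins is omitted.

```python
import numpy as np

def entropy_bias_corrected(hist, n_samples):
    """Plug-in entropy in bits plus the leading-order bias correction,
    (R - 1) / (2 N ln 2), where R is the number of occupied bins and
    N the number of samples. The plug-in estimate alone is biased
    downward for finite N."""
    p = hist[hist > 0] / n_samples
    h_plugin = -np.sum(p * np.log2(p))
    correction = (np.count_nonzero(hist) - 1) / (2.0 * n_samples * np.log(2))
    return h_plugin + correction

# Toy usage: counts drawn from a known uniform 4-outcome distribution,
# whose true entropy is exactly 2 bits.
rng = np.random.default_rng(1)
samples = rng.integers(0, 4, size=1000)
hist = np.bincount(samples, minlength=4).astype(float)
h = entropy_bias_corrected(hist, samples.size)   # close to 2 bits
```

With 125,000 samples, as used here, the correction is small; it matters most when the number of occupied bins is comparable to the sample count.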
We estimate three quantities:

1) The entropy rate

      H = -(1/T) \sum_n p(n) \log_2 p(n),

   where p(n) is the normalized frequency of observing n spikes within the stimulus interval T. This measure reflects how broad the spike count distribution is.

2) The mutual information rate between the count n and the stimulus s:

      I = (1/T) \sum_{n,s} p(n,s) \log_2 [ p(n,s) / ( p(n) p(s) ) ],

   where each stimulus was assigned to one of 256 uniform bins that spanned a width of four standard deviations around the mean of the Gaussian distribution of synaptic conductance stimuli.

3) The lower bound of the mutual information given by eq. 2 of the text, normalized to a rate.
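The entropy and mutual information rates can be estimated directly from sampled (stimulus, spike count) pairs. The sketch below follows the binning scheme described in the text (125,000 samples, 256 uniform stimulus bins spanning four standard deviations about the mean); the Poisson encoder and the stimulus interval T = 0.1 s are stand-in assumptions, since the actual conductance-based model neuron is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

T = 0.1        # stimulus interval in seconds (assumed)
N = 125_000    # number of (stimulus, count) samples, as in the text

# Toy encoder standing in for the model neuron: Gaussian conductance
# stimuli drive Poisson spike counts with stimulus-dependent mean.
stim = rng.normal(0.0, 1.0, N)
counts = rng.poisson(np.clip(5.0 + 2.0 * stim, 0.1, None))

# 256 uniform stimulus bins spanning four standard deviations about the mean
edges = np.linspace(stim.mean() - 2 * stim.std(),
                    stim.mean() + 2 * stim.std(), 257)
s_bin = np.clip(np.digitize(stim, edges) - 1, 0, 255)

# Joint histogram over (stimulus bin, spike count), then marginals
joint = np.zeros((256, counts.max() + 1))
np.add.at(joint, (s_bin, counts), 1.0)
p_ns = joint / N
p_s = p_ns.sum(axis=1, keepdims=True)   # shape (256, 1)
p_n = p_ns.sum(axis=0, keepdims=True)   # shape (1, n_max + 1)

# Entropy rate: H = -(1/T) sum_n p(n) log2 p(n)
pn = p_n[p_n > 0]
H_rate = -np.sum(pn * np.log2(pn)) / T

# Mutual information rate: I = (1/T) sum_{n,s} p(n,s) log2 p(n,s)/(p(n)p(s))
denom = p_s @ p_n                        # product of marginals, shape (256, n_max + 1)
mask = p_ns > 0
I_rate = np.sum(p_ns[mask] * np.log2(p_ns[mask] / denom[mask])) / T
```

The plug-in estimates above are what the bias corrections discussed earlier are applied to; by construction, the mutual information rate cannot exceed the entropy rate of the count.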
Table 7: Entropy and mutual information in the spike count before and after the model neuron ``learns'' to produce a uniform distribution of firing rates.

|                   | Mutual Information Rate (bits/sec) | MI Lower Bound Rate (bits/sec) | Entropy Rate (bits/sec) |
|-------------------|------------------------------------|--------------------------------|-------------------------|
| Before Adaptation | 4.70                               | 1.25                           | 8.95                    |
| After Adaptation  | 11.85                              | 8.30                           | 17.35                   |
| Difference        | +7.15                              | +7.05                          | +8.40                   |
Since the lower bound is obtained precisely by substituting a Gaussian distribution for the conditional spike count distribution, it can be a fairly weak bound: the actual spike count distributions are discrete and heavily skewed. Note that most of the entropy rate increase is mirrored by an increase in the mutual information rate, indicating that the amplification of the noise by the modulatory calcium and potassium conductances in the dendritic compartment is marginal.
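The looseness of the Gaussian substitution can be checked numerically. The Gaussian has maximal entropy for a given variance, so replacing the conditional count distribution with a Gaussian overestimates the conditional entropy and therefore underestimates the information. The sketch below assumes a Poisson count distribution purely for illustration (the model neuron's actual conditional distributions are not reproduced here) and compares its exact entropy with that of a Gaussian of equal variance.

```python
import math

def poisson_entropy_bits(lam, nmax=200):
    """Exact entropy (bits) of a Poisson(lam) spike count distribution."""
    h, logp = 0.0, -lam                 # log pmf at n = 0
    for n in range(nmax + 1):
        p = math.exp(logp)
        h -= p * logp / math.log(2)     # accumulate -p * log2(p)
        logp += math.log(lam) - math.log(n + 1)
    return h

def gaussian_entropy_bits(var):
    """Differential entropy (bits) of a Gaussian with variance var."""
    return 0.5 * math.log2(2 * math.pi * math.e * var)

lam = 0.5                               # low mean count: heavily skewed
h_true = poisson_entropy_bits(lam)      # about 1.34 bits
h_gauss = gaussian_entropy_bits(lam)    # about 1.55 bits: an overestimate
```

At low mean counts the Gaussian overstates the conditional entropy by a substantial fraction of a bit, consistent with the weakness of the lower bound seen in the tables.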
Table 8: Entropy and mutual information in the spike count before and after the model neuron ``learns'' to produce an exponential distribution of firing rates.

|                   | Mutual Information Rate (bits/sec) | MI Lower Bound Rate (bits/sec) | Entropy Rate (bits/sec) |
|-------------------|------------------------------------|--------------------------------|-------------------------|
| Before Adaptation | 4.70                               | 1.25                           | 8.95                    |
| After Adaptation  | 11.25                              | 2.16                           | 17.35                   |
| Difference        | +6.55                              | +0.91                          | +8.40                   |
Martin Stemmler
1998-08-16