Neural computation at the thermal limit
Although several measurements and analyses support the idea that the brain is
energy-optimized, there is one disturbing, contradictory observation: In
theory, computation limited by thermal noise can occur as cheaply as kT ln 2 joules per bit. Unfortunately, for a neuron the ostensible discrepancy from this minimum is startling: ignoring inhibition, the cost is many times this amount, and taking inhibition into account it is larger still. Here we point out that what has been defined as neural computation is
actually a combination of computation and neural communication: the
communication costs (transmission from each excitatory postsynaptic activation to the S4 gating charges of the fast Na+ channels of the initial segment, the fNa's) dominate the joule costs. Making this distinction between
communication to the initial segment and computation at the initial segment
(i.e., the adding up of activated fNa's) implies that the size of the average synaptic event reaching the fNa's matches the standard deviation of the thermal noise. Moreover, defining computation as the addition of activated fNa's yields a biophysically plausible mechanism for approaching the desired
minimum. This mechanism, requiring something like the electrical engineer's
equalizer (not much more than the action-potential-generating conductances), operates only at threshold. This active filter modifies the last few synaptic
excitations, providing barely enough energy to allow the last sub-threshold
gating charge to translocate. That is, the last, threshold-achieving S4-subunit activation requires an energy that matches the information provided by the last few synaptic events, a ratio near kT ln 2 joules per bit.
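
For orientation, a back-of-envelope evaluation of the thermal bound quoted above, assuming a physiological temperature of roughly 310 K (the abstract does not state the temperature used):

$kT\ln 2 \approx (1.38\times10^{-23}\ \mathrm{J/K}) \times (310\ \mathrm{K}) \times 0.693 \approx 3\times10^{-21}$ joules per bit.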