We study the computational capacity of a model neuron, the Tempotron, which
classifies sequences of spikes by linear-threshold operations. We use
statistical mechanics and extreme value theory to derive the capacity of the
system in random classification tasks. In contrast to its static analog, the
Perceptron, the Tempotron's solution space consists of a large number of small
clusters of weight vectors. The capacity of the system per synapse is finite in
the large size limit and weakly diverges with the stimulus duration relative to
the membrane and synaptic time constants.
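The abstract does not spell out the model equations. As a point of reference, below is a minimal sketch of the standard Tempotron decision rule from the literature (Gütig & Sompolinsky): each input spike contributes a postsynaptic-potential kernel shaped by the membrane and synaptic time constants, the weighted sum is accumulated over the stimulus duration, and the pattern is classified by whether the resulting potential crosses a fixed threshold. All parameter values and names here (psp_kernel, classify, TAU_M, TAU_S, T_STIM, THETA) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative parameters (assumptions, not values from the paper):
TAU_M, TAU_S = 15.0, 3.75   # membrane and synaptic time constants (ms)
T_STIM = 500.0              # stimulus duration (ms)
THETA = 1.0                 # firing threshold

def psp_kernel(t):
    """Double-exponential postsynaptic-potential kernel, normalized so its
    peak value is 1; causal (zero for t < 0)."""
    t = np.asarray(t, dtype=float)
    t_peak = TAU_M * TAU_S / (TAU_M - TAU_S) * np.log(TAU_M / TAU_S)
    v0 = 1.0 / (np.exp(-t_peak / TAU_M) - np.exp(-t_peak / TAU_S))
    k = v0 * (np.exp(-t / TAU_M) - np.exp(-t / TAU_S))
    return np.where(t >= 0, k, 0.0)

def classify(weights, spike_trains, dt=0.1):
    """Linear-threshold decision: label the spike pattern +1 if the summed,
    weight-scaled postsynaptic potential crosses THETA at any time during
    the stimulus, and -1 otherwise."""
    t_grid = np.arange(0.0, T_STIM, dt)
    v = np.zeros_like(t_grid)
    for w, spikes in zip(weights, spike_trains):
        for t_spike in spikes:
            v += w * psp_kernel(t_grid - t_spike)
    return 1 if v.max() >= THETA else -1

# Usage example: N synapses, each delivering a few random spikes.
rng = np.random.default_rng(0)
N = 100
weights = rng.normal(0.0, 1.0 / np.sqrt(N), size=N)
spike_trains = [np.sort(rng.uniform(0.0, T_STIM, size=rng.poisson(5)))
                for _ in range(N)]
print(classify(weights, spike_trains))
```

In this sketch the only trainable degrees of freedom are the synaptic weights, so a classification task amounts to choosing a weight vector for which the maximum of the potential lies above or below threshold as required, which is the linear-threshold operation the abstract refers to.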