Theoretical neuroscientists often try to understand how the structure of a
neural network relates to its function by focusing on structural features that
would either follow from optimization or occur consistently across possible
implementations. Both optimization theories and ensemble modeling approaches
have repeatedly proven their worth, and it would simplify theory building
considerably if predictions from both theory types could be derived and tested
simultaneously. Here we show how tensor formalism from theoretical physics can
be used to unify and solve many optimization and ensemble modeling approaches
to predicting synaptic connectivity from neuronal responses. We specifically
focus on analyzing the solution space of synaptic weights that allow a
threshold-linear neural network to respond in a prescribed way to a limited
number of input conditions. For optimization purposes, we compute the synaptic
weight vector that minimizes an arbitrary quadratic loss function. For ensemble
modeling, we identify synaptic weight features that occur consistently across
all solutions bounded by an arbitrary quadratic function. We derive a common
solution to this suite of nonlinear problems by showing how each of them
reduces to an equivalent linear problem that can be solved analytically.
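As a concrete, purely illustrative instance of this setup (not the paper's tensor-based algorithm), the sketch below poses the optimization half of the problem for a single threshold-linear unit: once a pattern of active and silent input conditions is fixed, the response constraints become linear, and minimizing a quadratic loss reduces to a standard quadratic program. The data, the identity loss metric Q, and the use of scipy's generic solver are all assumptions made for illustration.

```python
# Illustrative sketch: minimize a quadratic loss over synaptic weights w of a
# single threshold-linear unit, subject to prescribed responses
# r_k = max(0, w . x_k) under K input conditions.  Fixing which conditions are
# active (r_k > 0) versus silent (r_k = 0) linearizes the constraints.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
N, K = 10, 4                          # synapses, input conditions (hypothetical)
X = rng.normal(size=(K, N))           # inputs x_k as rows (hypothetical data)
r = np.array([1.0, 0.5, 0.0, 0.0])    # prescribed responses; two silent
Q = np.eye(N)                         # arbitrary positive-definite loss metric

active = r > 0
cons = [
    # active conditions: w . x_k must equal r_k exactly
    {"type": "eq", "fun": lambda w: X[active] @ w - r[active]},
    # silent conditions: w . x_k must be non-positive (ineq means fun >= 0)
    {"type": "ineq", "fun": lambda w: -(X[~active] @ w)},
]
res = minimize(lambda w: w @ Q @ w, x0=np.zeros(N), constraints=cons)
print("loss-minimizing weights:", res.x)
```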
Although identifying the equivalent linear problem is nontrivial, our tensor
formalism provides an elegant geometrical perspective that allows us to solve
the problem numerically. The resulting algorithm applies to a wide range of
neuroscience problems, and the associated geometric insights may
carry over to other scientific problems that require constrained optimization.
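The ensemble half of the problem can be probed in the same illustrative spirit. Continuing the sketch above (and reusing its hypothetical names X, r, cons, N, and res), the code below extremizes each synaptic weight over all solutions that satisfy the response constraints and lie within a quadratic bound; the bound value c is an arbitrary choice for illustration, and a brute-force pair of solves per synapse stands in for the paper's analytical treatment.

```python
# Continuation of the sketch above: probe the ensemble of admissible weights.
# A weight whose sign agrees at both extremes is consistent across ALL
# solutions satisfying the constraints and the quadratic bound w . w <= c.
c = 2.0 * (res.x @ res.x)                            # hypothetical bound size
ball = {"type": "ineq", "fun": lambda w: c - w @ w}  # quadratic bound
for i in range(N):
    lo = minimize(lambda w: w[i], x0=res.x, constraints=cons + [ball]).fun
    hi = -minimize(lambda w: -w[i], x0=res.x, constraints=cons + [ball]).fun
    if lo > 0 or hi < 0:
        print(f"synapse {i}: sign is fixed across the solution set")
```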