
    On the Physical Underpinnings of the Unusual Effectiveness of Probabilistic and Neural Computation

Probabilistic and neural approaches, through their incorporation of nonlinearities and compression of states, enable a broader sampling of the phase space. For a broad set of complex questions encountered in conventional computation, this approach is very effective. In these pattern-oriented tasks, a fluctuation in the size of the data is akin to a thermal fluctuation. A thermodynamic view therefore applies naturally to this style of information processing, and from this reasoning one may estimate a variety of interesting consequences for computing: (a) efficiencies in energy, (b) the complexity of tasks that can be tackled, (c) inaccuracies in inference, and (d) limitations arising from the incompleteness of inputs and models. We employ toy-model examples to reflect on these themes and establish the following:

• A dissipation minimum can be predicted, predicated on the average information being discarded, under the constraints of minimizing energy and maximizing information preservation and entropy. Analogous to the k_B T ln 2 cost of randomizing a bit, the ∼ −70 mV resting and ∼ 40 mV peak spike potentials are then a natural consequence of the biological neural environment under its constraints (a rough numerical sketch follows below). Non-biological, that is, physical, implementations can be analyzed by a similar approach for noisy and variability-prone thermodynamic settings.

• In drawing inferences, resorting to Occam's razor, the statistical equivalent of choosing the simplest and fewest axioms in developing a theory, conflicts with Mencken's rule ("for every complex problem, there is an answer that is clear, simple and wrong") as a reflection of dimensionality reduction.

• Between these two factors, one can construct a measure of the error bound, predicated on the average information being discarded and being filled in, and

• this lets one predict the upper limits of the information-processing rate under constraints.

These observations point to what may be achievable using neural and probabilistic computation through their physical implementation, as reflected in the thermodynamics of a statistical information-mechanics engine that avoids computation via deterministic linear algebra.
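As a rough orientation to the scales invoked in the first bullet, the sketch below compares the k_B T ln 2 cost of randomizing a bit at body temperature with the energy of a single elementary charge traversing the ∼110 mV swing from the −70 mV resting potential to the +40 mV spike peak. The choice of 310 K and of a single elementary charge across the full swing are assumptions made here for illustration; this is not the paper's derivation, only a back-of-envelope comparison.

```python
# Back-of-envelope comparison (illustrative, not the paper's derivation):
# Landauer cost k_B * T * ln 2 of randomizing a bit at body temperature
# versus the energy of one elementary charge crossing the ~110 mV spike swing
# (-70 mV resting to +40 mV peak) quoted in the abstract.

import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
e = 1.602176634e-19     # elementary charge, C
T = 310.0               # assumed body temperature, K

landauer = k_B * T * math.log(2)   # minimum dissipation per randomized bit, J
swing = 0.040 - (-0.070)           # spike swing from rest to peak, V
charge_energy = e * swing          # energy of one elementary charge across the swing, J

print(f"k_B T ln 2 at 310 K     : {landauer:.3e} J (~{landauer / e * 1e3:.1f} meV)")
print(f"e x 110 mV spike swing  : {charge_energy:.3e} J")
print(f"ratio (charge / bound)  : {charge_energy / landauer:.1f}")
```

Running this gives a Landauer cost of about 18.5 meV at 310 K, while an elementary charge across the 110 mV swing carries roughly six times that, which conveys how close biological spiking operates to the thermodynamic floor the abstract argues from.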