
    Balancing Error and Dissipation in Computing

    Modern digital electronics support remarkably reliable computing, especially given the challenge of controlling nanoscale logical components that interact in fluctuating environments. However, we demonstrate that the high-reliability limit is subject to a fundamental error-energy-efficiency tradeoff that arises from time-symmetric control: requiring a low probability of error causes energy consumption to diverge as the logarithm of the inverse error rate for nonreciprocal logical transitions. The reciprocity (self-invertibility) of a computation is a stricter condition for thermodynamic efficiency than logical reversibility (invertibility), the latter being the root of Landauer's work bound on erasing information. Beyond engineered computation, the results identify a generic error-dissipation tradeoff in steady-state transformations of genetic information carried out by biological organisms. The lesson is that computation under time-symmetric control cannot reach, and is often far above, the Landauer limit. In this way, time asymmetry becomes a design principle for thermodynamically efficient computing.
    Comment: 19 pages, 8 figures; Supplementary material 7 pages, 1 figure; http://csc.ucdavis.edu/~cmg/compmech/pubs/tsp.ht
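
    The abstract's scaling claim can be made concrete with a small numeric sketch (not taken from the paper; the unit prefactor on the logarithmic term is an illustrative assumption): the dissipation under time-symmetric control is taken to grow as k_B T ln(1/error) and is compared against the Landauer cost k_B T ln 2 of erasing one bit.

        # Minimal numeric sketch (not the paper's derivation): a dissipation term that
        # grows as ln(1/error) versus the Landauer bound for erasing one bit.
        # The unit prefactor on the logarithmic term is an illustrative assumption.
        import numpy as np

        k_B = 1.380649e-23          # Boltzmann constant, J/K
        T = 300.0                   # room temperature, K

        landauer = k_B * T * np.log(2)                     # ~2.9 zJ per erased bit
        errors = np.array([1e-3, 1e-9, 1e-15, 1e-21])
        time_symmetric = k_B * T * np.log(1.0 / errors)    # assumed unit prefactor

        for eps, w in zip(errors, time_symmetric):
            print(f"error rate {eps:.0e}: dissipation ~{w / landauer:.1f}x the Landauer bound")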

    Collective behaviours: from biochemical kinetics to electronic circuits

    In this work we aim to highlight a close analogy between cooperative behaviors in chemical kinetics and cybernetics; this is realized by using a common language for their description, namely mean-field statistical mechanics. First, we perform a one-to-one mapping between paradigmatic behaviors in chemical kinetics (i.e., non-cooperative, cooperative, ultra-sensitive, anti-cooperative) and in mean-field statistical mechanics (i.e., paramagnetic, high- and low-temperature ferromagnetic, anti-ferromagnetic). Interestingly, the statistical mechanics approach allows a unified, broad theory for all scenarios and, in particular, the Michaelis-Menten, Hill and Adair equations are consistently recovered. This framework is then tested against experimental biological data with overall excellent agreement. Going one step further, we consistently read the whole mapping from a cybernetic perspective, highlighting deep structural analogies between the above-mentioned kinetics and fundamental building blocks in electronics (i.e., operational amplifiers, flashes, flip-flops), so as to build a clear bridge linking biochemical kinetics and cybernetics.
    Comment: 15 pages, 6 figures; to appear in Scientific Reports (Nature Publishing Group)
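
    A hedged sketch of the dictionary described above, assuming the standard correspondence in which the fraction of bound sites is theta = (1 + m)/2, with m the Curie-Weiss magnetization and the substrate concentration entering through the external field; the field convention and parameter names are illustrative choices, not the paper's notation.

        # Hedged sketch of the kinetics <-> mean-field mapping: J = 0 reproduces a
        # Michaelis-Menten-like (non-cooperative) saturation curve, J > 0 a steeper,
        # Hill-like (cooperative) one. Mapping details are assumptions for illustration.
        import numpy as np

        def magnetization(h, beta_J, iters=200):
            m = 0.0
            for _ in range(iters):                 # fixed-point iteration of m = tanh(beta*J*m + h)
                m = np.tanh(beta_J * m + h)
            return m

        def bound_fraction(substrate, beta_J):
            h = 0.5 * np.log(substrate)            # field from log-concentration (assumed convention)
            return 0.5 * (1.0 + magnetization(h, beta_J))

        s = np.logspace(-2, 2, 9)                  # substrate concentrations (arbitrary units)
        print("non-cooperative (J=0):", np.round([bound_fraction(x, 0.0) for x in s], 2))
        print("cooperative (J>0):    ", np.round([bound_fraction(x, 0.8) for x in s], 2))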

    Large-Scale Optical Neural Networks based on Photoelectric Multiplication

    Recent success in deep neural networks has generated strong interest in hardware accelerators to improve speed and reduce energy consumption. This paper presents a new type of photonic accelerator based on coherent detection that is scalable to large ($N \gtrsim 10^6$) networks and can be operated at high (GHz) speeds and very low (sub-aJ) energies per multiply-and-accumulate (MAC), using the massive spatial multiplexing enabled by standard free-space optical components. In contrast to previous approaches, both weights and inputs are optically encoded so that the network can be reprogrammed and trained on the fly. Simulations of the network using models for digit- and image-classification reveal a "standard quantum limit" for optical neural networks, set by photodetector shot noise. This bound, which can be as low as 50 zJ/MAC, suggests that performance below the thermodynamic (Landauer) limit for digital irreversible computation is theoretically possible in this device. The proposed accelerator can implement both fully-connected and convolutional networks. We also present a scheme for back-propagation and training that can be performed in the same hardware. This architecture will enable a new class of ultra-low-energy processors for deep learning.
    Comment: Text: 10 pages, 5 figures, 1 table. Supplementary: 8 pages, 5 figures, 2 tables
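
    A back-of-the-envelope comparison (not the paper's shot-noise analysis) helps place the quoted 50 zJ/MAC figure: a digital irreversible MAC involves many bit erasures, each costing at least k_B T ln 2, so its Landauer floor sits well above k_B T ln 2 itself. The erasure count per 8-bit MAC used below is a rough assumption for illustration only.

        # Back-of-the-envelope sketch (not the paper's model): compare the quoted
        # ~50 zJ/MAC optical figure with an assumed Landauer floor for a digital
        # irreversible MAC. The bit-erasure count per MAC is a rough assumption.
        import numpy as np

        k_B, T = 1.380649e-23, 300.0
        landauer_bit = k_B * T * np.log(2)          # ~2.9 zJ per erased bit

        optical_mac_energy = 50e-21                 # J/MAC, figure quoted in the abstract
        erasures_per_digital_mac = 100              # assumed order of magnitude for an 8-bit MAC

        digital_floor = erasures_per_digital_mac * landauer_bit
        print(f"Landauer cost per bit:     {landauer_bit * 1e21:.1f} zJ")
        print(f"Assumed digital MAC floor: {digital_floor * 1e21:.0f} zJ")
        print(f"Quoted optical energy/MAC: {optical_mac_energy * 1e21:.0f} zJ")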

    Complete integrability of information processing by biochemical reactions

    Statistical mechanics provides an effective framework to investigate information processing in biochemical reactions. Within such a framework, far-reaching analogies are established among (anti-)cooperative collective behaviors in chemical kinetics, (anti-)ferromagnetic spin models in statistical mechanics and operational amplifiers/flip-flops in cybernetics. The underlying modeling -- based on spin systems -- has proved accurate for a wide class of systems matching classical (e.g., Michaelis-Menten, Hill, Adair) scenarios in the infinite-size approximation. However, current research in biochemical information processing focuses on systems involving a relatively small number of units, where this approximation is no longer valid. Here we show that the whole statistical mechanical description of reaction kinetics can be reformulated via a mechanical analogy -- based on completely integrable hydrodynamic-type systems of PDEs -- which provides explicit finite-size solutions, matching recently investigated phenomena (e.g., noise-induced cooperativity, stochastic bi-stability, quorum sensing). The resulting picture, successfully tested against a broad spectrum of data, constitutes a neat rationale for a numerically effective and theoretically consistent description of collective behaviors in biochemical reactions.
    Comment: 24 pages, 10 figures; accepted for publication in Scientific Reports
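
    To illustrate what "explicit finite-size solutions" means operationally, here is a hedged brute-force check (deliberately not the paper's integrable-PDE machinery): for a small number N of binding sites modeled as mean-field spins, the binding isotherm can be computed exactly by enumerating all 2^N configurations, with no infinite-size saddle-point approximation. The energy function and field convention are illustrative assumptions.

        # Exact finite-N binding curve by brute-force enumeration of a Curie-Weiss-type
        # spin model (illustrative stand-in, NOT the paper's integrable-PDE method).
        import itertools
        import numpy as np

        def exact_bound_fraction(substrate, N=6, J=0.8):
            h = 0.5 * np.log(substrate)                     # field from log-concentration (assumption)
            Z, avg_m = 0.0, 0.0
            for spins in itertools.product([-1, 1], repeat=N):
                m = np.mean(spins)
                energy = -0.5 * J * N * m**2 - h * N * m    # mean-field energy, beta = 1
                w = np.exp(-energy)                         # Boltzmann weight
                Z += w
                avg_m += w * m
            return 0.5 * (1.0 + avg_m / Z)                  # bound fraction theta = (1 + <m>)/2

        for alpha in (0.1, 1.0, 10.0):
            print(f"substrate {alpha:>4}: theta = {exact_bound_fraction(alpha):.3f}")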

    Beyond Moore's technologies: operation principles of a superconductor alternative

    The predictions of Moore's law are considered by experts to be valid until 2020, giving rise to "post-Moore's" technologies afterwards. Energy efficiency is one of the major challenges in high-performance computing that must be addressed. Superconductor digital technology is a promising post-Moore's alternative for the development of supercomputers. In this paper, we consider the operation principles of energy-efficient superconductor logic and memory circuits, with a short retrospective review of their evolution. We analyze their shortcomings with respect to computer circuit design. Possible directions for further research are outlined.
    Comment: OPEN ACCESS

    Ferromagnetic models for cooperative behavior: Revisiting Universality in complex phenomena

    Ferromagnetic models are the harmonic oscillators of statistical mechanics. Beyond their original scope in tackling phase transitions and symmetry breaking in theoretical physics, they are nowadays experiencing renewed applicative interest as they capture the main features of disparate complex phenomena whose quantitative investigation was previously precluded by a lack of data. After a streamlined introduction to these models, suitably embedded on random graphs, the aim of the present paper is to show their importance in a plethora of widespread research fields, so as to highlight the unifying framework reached by using statistical mechanics as a tool for their investigation. Specifically, we deal with examples stemming from sociology, chemistry, cybernetics (electronics) and biology (immunology).
    Comment: Contribution to the proceedings of the conference "Mathematical Models and Methods for Planet Earth", INdAM, Rome 201

    Bi-stability resistant to fluctuations

    We study a simple micro-mechanical device that does not lose its snap-through behavior in an environment dominated by fluctuations. The main idea is to have several degrees of freedom that can cooperatively resist the de-synchronizing effect of random perturbations. As an inspiration we use the power-stroke machinery of skeletal muscles, which at sub-micron scales and finite temperatures ensures a swift recovery from an abruptly applied slack. In addition to a hypersensitive response at finite temperatures, our prototypical Brownian snap spring also exhibits criticality at special values of the parameters, which is another potentially interesting property for micro-scale engineering applications.
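
    A hedged simulation sketch of the mechanism described above (the potential and parameters are illustrative, not the paper's model of the muscle power stroke): N overdamped bistable elements, each in a tilted double-well potential, are coupled to their common mean position and integrated with an Euler-Maruyama scheme. With coupling, the elements stay synchronized and snap collectively; without it, thermal noise spreads them across the two wells.

        # N overdamped bistable elements coupled through their mean position,
        # driven by thermal noise (Euler-Maruyama). Coupling k resists the
        # de-synchronizing effect of the noise. Parameters are illustrative.
        import numpy as np

        rng = np.random.default_rng(0)

        def simulate(N=20, k=5.0, noise=0.5, tilt=0.3, dt=1e-3, steps=20000):
            x = -np.ones(N)                                # all elements start in the left well
            for _ in range(steps):
                # force: double-well (-dV/dx = x - x^3), constant tilt, coupling to the mean
                drift = x - x**3 + tilt + k * (x.mean() - x)
                x += drift * dt + np.sqrt(2.0 * noise * dt) * rng.standard_normal(N)
            return x

        for k in (5.0, 0.0):
            x = simulate(k=k)
            print(f"k = {k}: spread across elements = {x.std():.2f}, mean position = {x.mean():.2f}")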

    A walk in the statistical mechanical formulation of neural networks

    Neural networks are nowadays both powerful operational tools (e.g., for pattern recognition, data mining, error-correcting codes) and complex theoretical models at the focus of scientific investigation. As a research subject, neural networks are handled and studied by psychologists, neurobiologists, engineers, mathematicians and theoretical physicists. In theoretical physics in particular, the key instrument for the quantitative analysis of neural networks is statistical mechanics. From this perspective, here we first review attractor networks: starting from ferromagnets and spin-glass models, we discuss the underlying philosophy and we recover the path paved by Hopfield and by Amit-Gutfreund-Sompolinsky. One step forward, we highlight the structural equivalence between Hopfield networks (modeling retrieval) and Boltzmann machines (modeling learning), hence realizing a deep bridge linking two inseparable aspects of biological and robotic spontaneous cognition. As a sideline, in this walk we derive two alternative ways (with respect to the original Hebb proposal) to recover the Hebbian paradigm, stemming from ferromagnets and from spin-glasses, respectively. Further, as these notes are intended for an engineering audience, we also highlight the mappings between ferromagnets and operational amplifiers and between antiferromagnets and flip-flops (as neural networks, built from op-amps and flip-flops, are particular spin-glasses, and the latter are indeed combinations of ferromagnets and antiferromagnets), hoping that such a bridge serves as a concrete prescription for capturing the beauty of robotics from the statistical mechanical perspective.
    Comment: Contribution to the proceedings of the conference NCTA 2014. 12 pages, 7 figures
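
    As a concrete companion to the retrieval picture sketched above, here is a minimal Hopfield example using the Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu and zero-temperature asynchronous updates s_i <- sign(sum_j J_ij s_j); the network size, number of patterns and corruption level are illustrative choices.

        # Minimal Hopfield/Hebb sketch: store P random patterns, corrupt one,
        # and let zero-temperature asynchronous dynamics retrieve it.
        import numpy as np

        rng = np.random.default_rng(1)
        N, P = 200, 5                                   # neurons, stored patterns
        xi = rng.choice([-1, 1], size=(P, N))           # random binary patterns

        J = (xi.T @ xi) / N                             # Hebb rule: J_ij = (1/N) sum_mu xi_i xi_j
        np.fill_diagonal(J, 0.0)                        # no self-couplings

        s = xi[0].copy()                                # start from pattern 0 ...
        flip = rng.choice(N, size=N // 5, replace=False)
        s[flip] *= -1                                   # ... with 20% of its spins flipped

        for _ in range(10):                             # a few asynchronous sweeps
            for i in rng.permutation(N):
                s[i] = 1 if J[i] @ s >= 0 else -1

        overlap = (s @ xi[0]) / N                       # retrieval quality in [-1, 1]
        print(f"overlap with the stored pattern: {overlap:.2f}")   # ~1.0 means retrieved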