
    Line creep in paper peeling

    The dynamics of a "peeling front", i.e. an elastic line, is studied under creep (constant load) conditions. Our experiments show an exponential dependence of the creep velocity on the inverse of the applied force (mass). In particular, the dynamical correlations of the avalanche activity are discussed here. We compare various avalanche statistics to those of a line depinning model with non-local elasticity, and study several measures of the experimental avalanche-avalanche and temporal correlations, such as the autocorrelation function of the released energy and the aftershock activity. From all these we conclude that the internal avalanche dynamics seems to follow "line depinning"-like behavior, in rough agreement with the depinning model. Meanwhile, the correlations reveal subtle complications not implied by depinning theory. We also show how these results can be understood from a geophysical point of view. Comment: 22 pages, 14 figures
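
    The autocorrelation measure mentioned above can be made concrete with a short sketch. The following Python snippet (illustrative only, not the authors' analysis code) estimates the normalized autocorrelation of a released-energy time series; the Pareto-distributed toy data and the simple estimator are assumptions of the example.

        import numpy as np

        def energy_autocorrelation(energies, max_lag):
            """Normalized autocorrelation C(tau) of a released-energy series."""
            e = np.asarray(energies, dtype=float)
            e -= e.mean()
            var = np.dot(e, e) / len(e)
            return np.array([np.dot(e[:len(e) - lag], e[lag:]) / (len(e) - lag) / var
                             for lag in range(max_lag)])

        # Toy, uncorrelated "avalanches": C(0) = 1 and C(tau) ~ 0 for tau > 0;
        # correlated avalanche activity would show a slowly decaying tail instead.
        rng = np.random.default_rng(0)
        print(energy_autocorrelation(rng.pareto(1.5, size=10_000), max_lag=5))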

    A modeling approach of the chemostat

    Population dynamics, and in particular microbial population dynamics, although complex and intrinsically discrete and random, are conventionally represented as systems of deterministic differential equations. We propose to revisit this approach by complementing these classical formalisms with stochastic formalisms, and to explain the links between these representations in terms of mathematical analysis as well as of modeling and numerical simulation. We illustrate this approach on the chemostat model. Comment: arXiv admin note: substantial text overlap with arXiv:1308.241
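
    For readers unfamiliar with the baseline being revisited, the sketch below integrates the classical deterministic chemostat equations with Monod growth; it is illustrative only, with made-up parameter values, and the stochastic counterparts discussed in the paper (e.g. birth-death formulations) are not reproduced here.

        # dS/dt = D (S_in - S) - mu(S) X / Y,   dX/dt = (mu(S) - D) X,   mu(S) = mu_max S / (K + S)

        def chemostat(S0=5.0, X0=0.1, D=0.2, S_in=5.0, mu_max=1.0, K=2.0, Y=0.5,
                      dt=0.01, steps=20_000):
            S, X = S0, X0
            for _ in range(steps):                   # explicit Euler time stepping
                mu = mu_max * S / (K + S)            # Monod growth rate
                dS = D * (S_in - S) - mu * X / Y     # substrate balance
                dX = (mu - D) * X                    # biomass balance
                S, X = S + dt * dS, X + dt * dX
            return S, X

        print(chemostat())   # converges to the washout-free steady state S* = K D / (mu_max - D)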

    Unexpected drop of dynamical heterogeneities in colloidal suspensions approaching the jamming transition

    As the glass (in molecular fluids [Donth]) or the jamming (in colloids and grains [LiuNature1998]) transitions are approached, the dynamics slow down dramatically with no marked structural changes. Dynamical heterogeneity (DH) plays a crucial role: structural relaxation occurs through correlated rearrangements of particle "blobs" of size ξ [WeeksScience2000, DauchotPRL2005, Glotzer, Ediger]. On approaching these transitions, ξ grows in glass-formers [Glotzer, Ediger], colloids [WeeksScience2000, BerthierScience2005], and driven granular materials [KeysNaturePhys2007] alike, strengthening the analogies between the glass and the jamming transitions. However, little is known yet about the behavior of DH very close to dynamical arrest. Here, we measure in colloids the maximum of a "dynamical susceptibility", χ*, whose growth is usually associated with that of ξ [LacevicPRE]. χ* initially increases with volume fraction ϕ, as in [KeysNaturePhys2007], but then drops dramatically very close to jamming. We show that this unexpected behavior results from the competition between the growth of ξ and the reduced particle displacements associated with rearrangements in very dense suspensions, unveiling a richer-than-expected scenario. Comment: First version originally submitted to Nature Physics. See the Nature Physics website for the final, published version.
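
    As a rough illustration of the quantity involved (the paper itself measures it by light scattering), the sketch below estimates a dynamical susceptibility from particle trajectories as N times the variance of a self-overlap; the cutoff, units, and toy Brownian data are assumptions of the example, not the paper's protocol.

        import numpy as np

        def chi4(positions, tau, a=0.3):
            """positions: array of shape (frames, N, dims); returns N * Var_t[Q(t, t+tau)]."""
            T, N, _ = positions.shape
            disp = np.linalg.norm(positions[tau:] - positions[:T - tau], axis=2)
            Q = (disp < a).mean(axis=1)     # fraction of "slow" particles for each start time
            return N * Q.var()              # the maximum over tau plays the role of chi*

        rng = np.random.default_rng(1)
        traj = np.cumsum(rng.normal(0, 0.05, size=(500, 200, 2)), axis=0)  # toy Brownian trajectories
        print([round(chi4(traj, tau), 3) for tau in (5, 20, 80)])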

    Optimistic barrier synchronization

    Barrier synchronization is a fundamental operation in parallel computation. In many contexts, by the time a processor enters a barrier it knows that it has already processed all the work required of it prior to synchronization. The alternative case, in which a processor cannot enter a barrier with the assurance that it has already performed all the necessary pre-synchronization computation, is treated here. The problem arises when the number of pre-synchronization messages to be received by a processor is unknown, for example in a parallel discrete simulation or any other computation that is largely driven by an unpredictable exchange of messages. We describe an optimistic O(log² P) barrier algorithm for such problems, study its performance on a large-scale parallel system, and consider extensions to general associative reductions as well as associative parallel prefix computations.
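
    For context, the sketch below implements a conventional dissemination barrier, a standard O(log P)-round scheme that assumes every processor already knows it is done before entering; it is not the paper's optimistic O(log² P) algorithm, and the thread-and-queue "network" and P = 8 are illustrative.

        import math, queue, threading

        P = 8
        ROUNDS = math.ceil(math.log2(P))
        inbox = [[queue.Queue() for _ in range(ROUNDS)] for _ in range(P)]

        def dissemination_barrier(pid):
            for k in range(ROUNDS):
                peer = (pid + (1 << k)) % P     # partner for round k
                inbox[peer][k].put(pid)         # notify the partner
                inbox[pid][k].get()             # block until notified in this round
            print(f"process {pid} passed the barrier")

        threads = [threading.Thread(target=dissemination_barrier, args=(i,)) for i in range(P)]
        for t in threads: t.start()
        for t in threads: t.join()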

    Stochastic Synapses Enable Efficient Brain-Inspired Learning Machines

    Recent studies have shown that synaptic unreliability is a robust and sufficient mechanism for inducing the stochasticity observed in cortex. Here, we introduce Synaptic Sampling Machines, a class of neural network models that uses synaptic stochasticity as a means of Monte Carlo sampling and unsupervised learning. As in the original formulation of Boltzmann machines, these models can be viewed as a stochastic counterpart of Hopfield networks, but with stochasticity induced by a random mask over the connections. Synaptic stochasticity plays the dual role of an efficient sampling mechanism and of a regularizer during learning, akin to DropConnect. A local synaptic plasticity rule implementing an event-driven form of contrastive divergence enables the learning of generative models in an on-line fashion. Synaptic sampling machines perform equally well using discrete-time artificial units (as in Hopfield networks) or continuous-time leaky integrate & fire neurons. The learned representations are remarkably sparse and robust to reductions in bit precision and to synapse pruning: removal of more than 75% of the weakest connections followed by cursory re-learning causes a negligible performance loss on benchmark classification tasks. The spiking-neuron-based synaptic sampling machines outperform existing spike-based unsupervised learners while potentially offering substantial advantages in terms of power and complexity, and are thus promising models for on-line learning in brain-inspired hardware.
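
    The random-mask mechanism described above can be illustrated in a few lines. The sketch below (not the authors' implementation) applies a Bernoulli blank-out mask to a weight matrix on every presentation, DropConnect-style; the layer sizes, transmission probability p, and sigmoid units are assumptions of the example.

        import numpy as np

        rng = np.random.default_rng(0)
        W = rng.normal(0, 0.1, size=(784, 200))        # visible-to-hidden weights

        def stochastic_forward(v, W, p=0.5):
            mask = rng.random(W.shape) < p             # each synapse transmits with probability p
            drive = v @ (mask * W) / p                 # masked, rescaled synaptic drive
            return 1.0 / (1.0 + np.exp(-drive))        # hidden activation probabilities

        v = (rng.random(784) < 0.1).astype(float)      # toy binary input
        print(stochastic_forward(v, W)[:5])            # output varies from trial to trial,
                                                       # which is the source of the sampling noise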

    Shortcomings in ground testing, environment simulations, and performance predictions for space applications

    This paper addresses the issues involved in radiation testing of devices and subsystems to obtain the data required to predict the performance and survivability of satellite systems for extended missions in space. The problems associated with space environmental simulations, or the lack thereof, in experiments intended to produce information describing the degradation and behavior of parts and systems are discussed. Several types of radiation effects in semiconductor components are presented, for example ionization dose effects, heavy-ion- and proton-induced Single Event Upsets (SEUs), and Single Event Transient Upsets (SETUs). Examples and illustrations of data relating to these ground testing issues are provided. The primary objective of this presentation is to alert the reader to the shortcomings, pitfalls, variabilities, and uncertainties in acquiring information to logically design electronic subsystems for use in satellites or space stations with long mission lifetimes, and to point out the weaknesses and deficiencies in the methods and procedures by which that information is obtained.

    Fluid-Structure Interaction with the Entropic Lattice Boltzmann Method

    We propose a novel fluid-structure interaction (FSI) scheme using the entropic multi-relaxation-time lattice Boltzmann (KBC) model for the fluid domain in combination with a nonlinear finite element solver for the structural part. We show the validity of the proposed scheme for various challenging set-ups by comparison with literature data. Beyond validation, we extend the KBC model to multiphase flows and couple it with the FEM solver. The robustness and viability of the entropic multi-relaxation-time model for complex FSI applications are shown by simulations of droplet impact on elastic superhydrophobic surfaces.
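
    The coupling pattern behind such partitioned FSI schemes can be sketched with stand-in solvers; the toy "fluid" (a drag law) and single-degree-of-freedom "structure" below are placeholders for the lattice Boltzmann and FEM solvers and are not taken from the paper.

        def fluid_step(wall_velocity, u_inf=1.0, c=0.8):
            return c * (u_inf - wall_velocity)          # placeholder for the fluid solve: force on the wall

        def structure_step(x, v, force, k=50.0, m=1.0, dt=1e-3):
            v = v + dt * (force - k * x) / m            # placeholder for the structural solve (spring-mass)
            return x + dt * v, v

        x, v = 0.0, 0.0
        for _ in range(5000):                           # staggered (explicit) coupling loop
            f = fluid_step(wall_velocity=v)             # 1) fluid solve on the current geometry
            x, v = structure_step(x, v, f)              # 2) structural response to the fluid force
                                                        # 3) a real code would now move the boundary,
                                                        #    refill nodes, and repeat
        print(round(x, 4))                              # settles near the static deflection f/k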

    Event-Driven Contrastive Divergence for Spiking Neuromorphic Systems

    Restricted Boltzmann Machines (RBMs) and Deep Belief Networks have been demonstrated to perform efficiently in a variety of applications, such as dimensionality reduction, feature learning, and classification. Their implementation on neuromorphic hardware platforms emulating large-scale networks of spiking neurons can have significant advantages from the perspectives of scalability, power dissipation, and real-time interfacing with the environment. However, the traditional RBM architecture and the commonly used training algorithm known as Contrastive Divergence (CD) are based on discrete updates and exact arithmetic, which do not directly map onto a dynamical neural substrate. Here, we present an event-driven variation of CD to train an RBM constructed with Integrate & Fire (I&F) neurons, constrained by the limitations of existing and near-future neuromorphic hardware platforms. Our strategy is based on neural sampling, which allows us to synthesize a spiking neural network that samples from a target Boltzmann distribution. The recurrent activity of the network replaces the discrete steps of the CD algorithm, while Spike-Timing-Dependent Plasticity (STDP) carries out the weight updates in an online, asynchronous fashion. We demonstrate our approach by training an RBM composed of leaky I&F neurons with STDP synapses to learn a generative model of the MNIST handwritten digit dataset, and by testing it in recognition, generation, and cue-integration tasks. Our results contribute to a machine-learning-driven approach for synthesizing networks of spiking neurons capable of carrying out practical, high-level functionality. Comment: Under review
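
    For comparison with the event-driven scheme described above, the sketch below shows the conventional discrete CD-1 update for a binary RBM (the algorithm the paper replaces with spike-based dynamics); layer sizes and the learning rate are illustrative, and biases are omitted for brevity.

        import numpy as np

        rng = np.random.default_rng(0)
        n_vis, n_hid, lr = 784, 128, 0.01
        W = rng.normal(0, 0.01, size=(n_vis, n_hid))

        sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
        sample = lambda p: (rng.random(p.shape) < p).astype(float)

        def cd1_step(v0, W):
            h0 = sample(sigmoid(v0 @ W))                 # data-driven (positive) phase
            v1 = sample(sigmoid(h0 @ W.T))               # one Gibbs step: reconstruct the visibles
            h1 = sigmoid(v1 @ W)                         # model-driven (negative) phase
            return W + lr * (np.outer(v0, h0) - np.outer(v1, h1))

        v = (rng.random(n_vis) < 0.1).astype(float)      # toy binary "digit"
        W = cd1_step(v, W)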