
    Comment on "Tricritical Behavior in Rupture Induced by Disorder"

    In their letter, Andersen, Sornette, and Leung [Phys. Rev. Lett. 78, 2140 (1997)] describe possible behaviors for rupture in disordered media, based on the mean-field-like democratic fiber bundle model. In this model, fibers are pulled with a total force that is shared uniformly among the intact fibers. A fiber breaks if the stress on it exceeds a threshold chosen from a probability distribution, and the force is then redistributed over the intact fibers. Andersen et al. claim the existence of a tricritical point, separating a "first-order" regime, characterized by sudden global failure, from a "second-order" regime, characterized by a divergence in the breaking rate. We show that a first-order transition is an artifact of a (large enough) discontinuity put by hand into the disorder distribution. Thus, in generic physical cases, a first-order regime is not present. This result is obtained from a graphical method, which, unlike Andersen et al.'s analytical solution, enables us to distinguish the various classes of qualitatively different behaviors of the model.
    Comment: 1 page, 1 figure included, RevTeX
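
    For concreteness, the democratic (equal-load-sharing) fiber bundle model admits a very short numerical sketch. The Python snippet below is an illustration under assumed conditions (thresholds drawn i.i.d. from a uniform distribution on [0, 1], quasi-static loading), not the authors' code: with equal load sharing, the force the bundle sustains once the k weakest fibers have failed is F_k = (N - k) x_(k+1), where x_(1) <= ... <= x_(N) are the sorted thresholds, and the macroscopic strength is the maximum of this curve.

        import numpy as np

        def dfbm_failure_curve(n_fibers=10_000, rng=None):
            """Quasi-static loading of the democratic fiber bundle model with
            breaking thresholds drawn i.i.d. from U(0, 1) (an assumed choice of
            disorder, for illustration only)."""
            rng = np.random.default_rng() if rng is None else rng
            thresholds = np.sort(rng.uniform(0.0, 1.0, n_fibers))
            # After the k weakest fibers have broken, the remaining (n_fibers - k)
            # fibers share the load equally, so the bundle holds until the stress
            # reaches the (k+1)-th smallest threshold.
            surviving = n_fibers - np.arange(n_fibers)
            force = surviving * thresholds          # F_k = (N - k) * x_(k+1)
            return thresholds, force

        thresholds, force = dfbm_failure_curve()
        k = force.argmax()
        # For U(0, 1) disorder the sustainable force per fiber peaks near 0.25
        # at a threshold of about 0.5.
        print(f"peak force per fiber {force[k] / force.size:.3f} "
              f"at threshold {thresholds[k]:.3f}")

    A discontinuity in the threshold distribution, of the kind discussed in the abstract, could be probed in the same way by replacing U(0, 1) with a suitably chosen mixture.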

    A simple method for estimating the entropy of neural activity

    The number of possible activity patterns in a population of neurons grows exponentially with the size of the population. For populations of more than 10 or 20 neurons, typical experiments explore only a tiny fraction of this large space of possible activity patterns. It is thus impossible, in this undersampled regime, to estimate the probabilities with which most of the activity patterns occur. As a result, the corresponding entropy - which is a measure of the computational power of the neural population - cannot be estimated directly. We propose a simple scheme for estimating the entropy in the undersampled regime, which bounds its value from both below and above. The lower bound is the usual 'naive' entropy of the experimental frequencies. The upper bound results from a hybrid approximation of the entropy which makes use of the naive estimate, a maximum entropy fit, and a coverage adjustment. We apply our simple scheme to artificial data, in order to check its accuracy; we also compare its performance with that of several previously defined entropy estimators. We then apply it to actual measurements of neural activity in populations with up to 100 cells. Finally, we discuss the similarities and differences between the proposed simple estimation scheme and various earlier methods. © 2013 IOP Publishing Ltd and SISSA Medialab srl
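
    The lower bound mentioned above, the 'naive' plug-in entropy of the observed frequencies, can be stated in a few lines of code. The Python sketch below is illustrative rather than the authors' implementation; it also computes a Good-Turing coverage estimate, one possible ingredient of a coverage adjustment, but it does not reproduce the paper's maximum entropy fit or the full upper bound. The simulated 20-neuron population and its firing probability are hypothetical.

        import numpy as np
        from collections import Counter

        def naive_entropy_bits(samples):
            """Plug-in ('naive') entropy of the empirical pattern frequencies, in
            bits. In the undersampled regime this underestimates the true entropy
            and serves as the lower bound discussed above."""
            counts = np.array(list(Counter(samples).values()), dtype=float)
            p = counts / counts.sum()
            return float(-np.sum(p * np.log2(p)))

        def good_turing_coverage(samples):
            """Good-Turing estimate of the sampling coverage, i.e. the total
            probability of the patterns that were actually observed. This is one
            possible ingredient of a coverage adjustment; the paper's estimator
            additionally uses a maximum entropy fit, not reproduced here."""
            counts = Counter(samples)
            n = sum(counts.values())
            singletons = sum(1 for c in counts.values() if c == 1)
            return 1.0 - singletons / n

        rng = np.random.default_rng(0)
        # 1000 samples from a hypothetical 20-neuron population with sparse,
        # independent firing (spike probability 0.05 per neuron per time bin).
        patterns = [tuple((rng.random(20) < 0.05).astype(int)) for _ in range(1000)]
        print(f"naive lower bound : {naive_entropy_bits(patterns):.2f} bits")
        print(f"estimated coverage: {good_turing_coverage(patterns):.2f}")

    A coverage well below 1 in such a run signals the undersampled regime in which the naive estimate alone is unreliable and an upper bound of the kind proposed in the paper becomes useful.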