
    Synchronization in Complex Systems Following the Decision Based Queuing Process: The Rhythmic Applause as a Test Case

    Living communities can be considered complex systems and are thus fertile ground for studies of their statistics and dynamics. In this study we revisit the case of rhythmic applause using the model proposed by Vázquez et al. [A. Vázquez et al., Phys. Rev. E 73, 036127 (2006)], augmented with two competing driving forces, namely Individuality and Companionship. To that end, after performing computer simulations with a large number of oscillators, we propose an explanation for the following open questions: (a) why synchronization occurs suddenly, and (b) why synchronization is observed when the clapping period $T_c$ satisfies $1.5 \cdot T_s < T_c < 2.0 \cdot T_s$ (where $T_s$ is the mean self period of the spectators) and is then lost after a time. Moreover, based on the model, a weak preferential attachment principle is proposed which can produce complex networks obeying a power law in the distribution of the number of edges per node, with exponent greater than 3. Comment: 16 pages, 5 figures
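
    The following is a minimal sketch of the kind of many-oscillator simulation described above, assuming a generic Kuramoto-style mean-field coupling rather than the decision-based queuing model of Vázquez et al.; the noise term standing in for Individuality, the coupling constant standing in for Companionship, and all parameter values are illustrative assumptions, not the paper's equations.

```python
# Illustrative toy only, not the decision-based queuing model of Vazquez et al.:
# a Kuramoto-style crowd of clappers with two competing tendencies standing in
# for the "Individuality" (own rhythm + noise) and "Companionship" (coupling to
# the crowd) driving forces named in the abstract. All parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)

N = 500                               # number of spectators (oscillators)
T_s = rng.normal(0.5, 0.05, N)        # individual self periods (seconds, assumed spread)
omega = 2.0 * np.pi / T_s             # natural angular frequencies
K = 2.0                               # companionship: pull toward the crowd phase
sigma = 0.5                           # individuality: phase noise resisting the crowd

dt, steps = 0.01, 20000
phi = rng.uniform(0.0, 2.0 * np.pi, N)
psi_prev, psi_total = 0.0, 0.0        # track the advance of the mean-field phase

for _ in range(steps):
    z = np.exp(1j * phi).mean()       # Kuramoto order parameter r * exp(i psi)
    r, psi = np.abs(z), np.angle(z)
    psi_total += np.angle(np.exp(1j * (psi - psi_prev)))  # unwrapped increment
    psi_prev = psi
    dphi = omega + K * r * np.sin(psi - phi)              # mean-field coupling
    phi = phi + dt * dphi + np.sqrt(dt) * sigma * rng.normal(size=N)

T_c = 2.0 * np.pi * steps * dt / psi_total     # period of the collective rhythm
print(f"final order parameter r = {r:.2f}")
print(f"T_c / <T_s> = {T_c / T_s.mean():.2f}  (compare with the 1.5-2.0 window)")
```

    Whether the collective period lands in the reported $1.5 \cdot T_s$ to $2.0 \cdot T_s$ window depends on the actual model; the toy only shows how an order parameter and an effective clapping period can be read off from such a simulation.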

    Varying the Explanatory Span: Scientific Explanation for Computer Simulations

    This article aims to develop a new account of scientific explanation for computer simulations. To this end, two questions are answered: what is the explanatory relation for computer simulations? And what kind of epistemic gain should be expected? For several reasons tailored to the benefits and needs of computer simulations, these questions are better answered within the unificationist model of scientific explanation. Unlike previous efforts in the literature, I submit that the explanatory relation is between the simulation model and the results of the simulation. I also argue that our epistemic gain goes beyond the unificationist account, encompassing a practical dimension as well.

    Viscoelasticity and primitive path analysis of entangled polymer liquids: From f-actin to polyethylene

    We combine computer simulations and scaling arguments to develop a unified view of polymer entanglement based on the primitive path analysis (PPA) of the microscopic topological state. Our results agree with experimentally measured plateau moduli for three different polymer classes over a wide range of reduced polymer densities: (i) semi-dilute theta solutions of synthetic polymers, (ii) the corresponding dense melts above the glass transition or crystallization temperature, and (iii) solutions of semi-flexible (bio)polymers such as f-actin or suspensions of rodlike viruses. Together these systems cover the entire range from loosely to tightly entangled polymers. In particular, we argue that the primitive path analysis renormalizes a loosely entangled system into a tightly entangled one, and we provide a new explanation of the successful Lin-Noolandi packing conjecture for polymer melts. Comment: To appear in J. Chem. Phys.
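
    For context on the conjecture named in the last sentence, the Lin-Noolandi packing argument for melts is usually quoted in the scaling form below; the prefactor is omitted, and the definitions follow standard melt conventions rather than anything taken from this paper:

    $$G_N \sim \frac{k_B T}{p^3}, \qquad p \equiv \frac{M}{\rho\, N_A\, \langle R^2 \rangle_0},$$

    where $G_N$ is the plateau modulus, $p$ the packing length, $M$ the chain molar mass, $\rho$ the melt mass density, $N_A$ Avogadro's number, and $\langle R^2 \rangle_0$ the mean-square end-to-end distance of a chain.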

    On the narrative form of simulations.

    Understanding complex physical systems through the use of simulations often takes on a narrative character. That is, scientists using simulations seek an understanding of processes occurring in time by generating them from a dynamic model, thereby producing something like a historical narrative. This paper focuses on simulations of the Diels-Alder reaction, which is widely used in organic chemistry. It calls on several well-known works on historical narrative to draw out the ways in which use of these simulations mirrors aspects of narrative understanding: Gallie for "followability" and "contingency"; Mink for "synoptic judgment"; Ricoeur for "temporal dialectic"; and Hawthorn for a related dialectic of the "actual and the possible". Through these reflections on narrative, the paper aims for a better grasp of the role that temporal development sometimes plays in understanding physical processes and of how considerations of possibility enhance that understanding.

    The Self-Organization of Speech Sounds

    The speech code is a vehicle of language: it defines a set of forms used by a community to carry information. Such a code is necessary to support the linguistic interactions that allow humans to communicate. How, then, may a speech code be formed prior to the existence of linguistic interactions? Moreover, the human speech code is discrete and compositional, shared by all the individuals of a community but different across communities, and phoneme inventories are characterized by statistical regularities. How can a speech code with these properties form? We approach these questions in this paper using the "methodology of the artificial". We build a society of artificial agents and detail a mechanism that shows the formation of a discrete speech code without pre-supposing the existence of linguistic capacities or of coordinated interactions. The mechanism is based on a low-level model of sensory-motor interactions. We show that the integration of certain very simple and non-language-specific neural devices leads to the formation of a speech code that has properties similar to the human speech code. This result relies on the self-organizing properties of a generic coupling between perception and production within agents, and on the interactions between agents. The artificial system helps us to develop better intuitions on how speech might have appeared, by showing how self-organization might have helped natural selection to find speech.
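
    Below is a minimal sketch in the spirit of this "methodology of the artificial", assuming a deliberately crude stand-in for the paper's sensory-motor architecture: agents hold preferred targets in a one-dimensional acoustic space, babble near them, and nudge their nearest target toward what they hear. The update rule, parameter values, and cluster-counting heuristic are illustrative assumptions, not the authors' model.

```python
# Minimal sketch, not Oudeyer's actual neural architecture: a population of agents,
# each holding a set of preferred targets in a 1-D "acoustic" space. Agents babble
# near a random target; listeners nudge their own closest target toward what they
# heard (a crude perception-production coupling). Names and parameters are assumed.
import numpy as np

rng = np.random.default_rng(1)

n_agents, n_targets = 10, 20
lr, noise, rounds = 0.2, 0.02, 20000

# each agent starts with its own random vocalization targets in [0, 1]
targets = rng.uniform(0.0, 1.0, size=(n_agents, n_targets))

for _ in range(rounds):
    speaker, listener = rng.choice(n_agents, size=2, replace=False)
    # speaker produces a sound near one of its preferred targets
    sound = targets[speaker, rng.integers(n_targets)] + noise * rng.normal()
    # the listener's closest target (its "perception") is pulled toward the sound,
    # and so is the speaker's own nearest target (self-monitoring)
    for agent in (speaker, listener):
        k = np.argmin(np.abs(targets[agent] - sound))
        targets[agent, k] += lr * (sound - targets[agent, k])

# inspect how many distinct regions ("proto-phonemes") the population ends up sharing
all_t = np.sort(targets.ravel())
n_clusters = 1 + np.sum(np.diff(all_t) > 0.05)   # crude cluster count via large gaps
print(f"approximate number of shared sound categories: {n_clusters}")
```

    The intent is that repeated interactions pull the agents' targets into a few shared regions of the space, giving a crude analogue of the discrete, community-shared code discussed in the abstract; the paper's actual perception-production coupling is considerably richer.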

    Closing the door on quantum nonlocality

    Bell-type inequalities are proven using oversimplified probabilistic models and/or counterfactual definiteness (CFD). If setting-dependent variables describing measuring instruments are correctly introduced, none of these inequalities may be proven. In spite of this, belief in a mysterious quantum nonlocality is not fading. Computer simulations of Bell tests allow people to study the different ways in which the experimental data might have been created. They also allow for the generation of various counterfactual experimental outcomes, such as repeated or simultaneous measurements performed in different settings on the same “photon-pair”, and so forth. They allow for the reinforcing or relaxing of CFD compliance and/or for studying the impact of various “photon identification procedures”, mimicking those used in real experiments. Data samples consistent with quantum predictions may be generated by using a specific setting-dependent identification procedure, which reflects the active role of instruments during the measurement process. Each of the setting-dependent data samples is consistent with a specific setting-dependent probabilistic model which may not be deduced using non-contextual local realistic or stochastic hidden variables. In this paper, we discuss the results of these simulations. Since the data samples are generated in a locally causal way, these simulations provide additional strong arguments for closing the door on quantum nonlocality.
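
    The sketch below gives a toy, locally causal, event-by-event flavor of the kind of simulation discussed above. The particular setting-dependent identification rule (an event is kept with a probability that depends on how well the local hidden polarization matches the local setting) and its exponent are assumptions made for illustration only, not the author's procedure.

```python
# Toy illustration only, not the author's simulation protocol: an event-by-event
# local model of a Bell-type polarization experiment in which a setting-dependent
# "identification" (detection) rule decides which events enter the coincidence
# sample. The detection rule and its exponent are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(2)

def run(angle_a, angle_b, n=200_000, select=False, k=4):
    lam = rng.uniform(0.0, np.pi, n)                 # shared local hidden polarization
    A = np.sign(np.cos(2.0 * (angle_a - lam)))       # local +/-1 outcome at station A
    B = np.sign(np.cos(2.0 * (angle_b - lam)))       # local +/-1 outcome at station B
    if select:
        # setting-dependent identification: an event is kept only with a probability
        # that depends on how well the hidden polarization matches the local setting
        keep_a = rng.uniform(size=n) < np.abs(np.cos(2.0 * (angle_a - lam))) ** k
        keep_b = rng.uniform(size=n) < np.abs(np.cos(2.0 * (angle_b - lam))) ** k
        A, B = A[keep_a & keep_b], B[keep_a & keep_b]
    return np.mean(A * B)

for delta in np.deg2rad([0, 22.5, 45, 67.5, 90]):
    e_all = run(0.0, delta)                  # correlation over all generated pairs
    e_sel = run(0.0, delta, select=True)     # correlation over the identified sample
    print(f"delta={np.rad2deg(delta):5.1f}  E_all={e_all:+.3f}  "
          f"E_selected={e_sel:+.3f}  QM cos(2*delta)={np.cos(2 * delta):+.3f}")
```

    Printing E_all, E_selected and cos(2*delta) side by side for a few angle differences shows how the identified coincidence sample can differ from the full, locally generated ensemble, which is the point at issue in the abstract.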