
    Transport on randomly evolving trees

    The time process of transport on randomly evolving trees is investigated. By introducing the notions of living and dead nodes, a model of random tree evolution is constructed which describes the spreading in time of objects corresponding to nodes. Using the method of age-dependent branching processes, we derive the joint distribution function of the numbers of living and dead nodes and determine the correlation between these node numbers as a function of time. The stochastic properties of the end-nodes are also analyzed, and the correlation between the numbers of living and dead end-nodes is shown to change its character abruptly at the very beginning of the evolution process. The survival probability of random trees is investigated and expressions are derived for this probability.
    Comment: 16 pages, 8 figures, published in Phys. Rev. E 72, 051101 (2005)
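    To make the branching-process picture concrete, here is a minimal Monte Carlo sketch of one plausible instance of such a model: every living node waits an exponentially distributed lifetime, then dies and is replaced by a random number of living offspring. The rate, offspring distribution, observation time and sample size below are illustrative assumptions, not the paper's specification.

```python
import random
import statistics

def simulate_tree(t_max, rate=1.0, offspring_probs=(0.3, 0.4, 0.3), seed=None):
    """Simulate one randomly evolving tree up to time t_max.

    Each living node waits an Exp(rate) lifetime, then becomes dead and
    produces k living offspring, with k drawn from offspring_probs = (P(0), P(1), ...).
    Returns (number of living nodes, number of dead nodes) at time t_max.
    """
    rng = random.Random(seed)
    living = [rng.expovariate(rate)]        # death times of currently living nodes
    dead = 0
    while living:
        t = min(living)
        if t > t_max:                       # next death happens after the observation time
            break
        living.remove(t)
        dead += 1
        k = rng.choices(range(len(offspring_probs)), weights=offspring_probs)[0]
        living.extend(t + rng.expovariate(rate) for _ in range(k))
    return len(living), dead

# Ensemble estimate of the correlation between living and dead node numbers at t = 3
samples = [simulate_tree(t_max=3.0, seed=i) for i in range(2000)]
n_living = [s[0] for s in samples]
n_dead = [s[1] for s in samples]
print("corr(living, dead) at t = 3:", statistics.correlation(n_living, n_dead))
```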

    Hidden Order in Crackling Noise during Peeling of an Adhesive Tape

    We address the long-standing problem of recovering dynamical information from noisy acoustic emission signals arising from the peeling of an adhesive tape subject to constant traction velocity. Using a phase space reconstruction procedure, we demonstrate deterministic chaotic dynamics by establishing the existence of a correlation dimension as well as a positive Lyapunov exponent in a mid range of traction velocities. The results are explained on the basis of the model, which also emphasizes the deterministic origin of acoustic emission by clarifying its connection to stick-slip dynamics.
    Comment: 5 pages, 10 figures
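    As a rough illustration of the reconstruction step, the sketch below performs a time-delay (Takens) embedding and a Grassberger-Procaccia correlation-sum estimate of the correlation dimension. The input signal is a synthetic placeholder, not acoustic emission data, and the embedding dimension, delay and radii are arbitrary illustrative choices rather than the authors' settings.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay (Takens) embedding of a scalar series x into R^dim with lag tau."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def correlation_sum(X, r):
    """Grassberger-Procaccia correlation sum C(r): fraction of point pairs closer than r."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    iu = np.triu_indices(len(X), k=1)
    return np.mean(d[iu] < r)

# Placeholder quasi-periodic signal standing in for an acoustic-emission series
t = np.arange(0, 200, 0.05)
x = np.sin(t) + 0.5 * np.sin(2.2 * t)

X = delay_embed(x, dim=4, tau=10)[::5]      # subsample points to keep the pair count small
radii = np.logspace(-2, 0, 10)
C = np.array([correlation_sum(X, r) for r in radii])

# The slope of log C(r) vs log r in the scaling region estimates the correlation dimension.
slope = np.polyfit(np.log(radii[2:8]), np.log(np.clip(C[2:8], 1e-12, None)), 1)[0]
print("estimated correlation dimension:", slope)
```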

    On the nature of surface roughness with application to contact mechanics, sealing, rubber friction and adhesion

    Surface roughness has a huge impact on many important phenomena. The most important property of a rough surface is its surface roughness power spectrum C(q). We present surface roughness power spectra of many surfaces of practical importance, obtained from surface height profiles measured using optical methods and the Atomic Force Microscope. We show how the power spectrum determines the contact area between two solids. We also present applications to sealing, rubber friction and adhesion for rough surfaces, where the power spectrum enters as an important input.
    Comment: Topical review; 82 pages, 61 figures; Format: LaTeX (iopart). Some figures are in Postscript Level
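    The abstract does not spell out how C(q) is obtained from a measured height profile; the sketch below is a minimal one-dimensional periodogram estimate from a single line scan, with synthetic random-walk data standing in for an AFM profile. Normalization conventions for C(q) differ between papers, so the prefactors here are one plausible choice rather than the authors' definition.

```python
import numpy as np

def roughness_psd_1d(h, dx):
    """Periodogram estimate of a 1D surface roughness power spectrum C(q).

    h  : heights h(x) sampled on a uniform grid (the mean is removed below)
    dx : grid spacing
    Returns wavevectors q > 0 and C(q); note that normalization conventions
    (factors of 2*pi and of the scan length) vary in the literature.
    """
    h = h - np.mean(h)
    n = len(h)
    L = n * dx
    H = np.fft.rfft(h) * dx                 # discrete approximation of the Fourier integral
    q = 2.0 * np.pi * np.fft.rfftfreq(n, dx)
    C = np.abs(H) ** 2 / L                  # periodogram
    return q[1:], C[1:]                     # drop the q = 0 (mean height) component

# Synthetic self-affine-like line scan: a random walk has Hurst exponent H ~ 0.5
rng = np.random.default_rng(0)
dx = 1e-9                                   # 1 nm sampling, as for a fine AFM scan
h = np.cumsum(rng.normal(0.0, 1e-10, 4096))
q, C = roughness_psd_1d(h, dx)
# For a self-affine profile, log C vs log q is a straight line of slope -(1 + 2H).
```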

    Constraining the Randall-Sundrum modulus in the light of recent PVLAS data

    Recent PVLAS data put stringent constraints on the birefringence and dichroism of electromagnetic waves travelling in a constant and homogeneous magnetic field. There have been theoretical predictions in favour of such phenomena when an appropriate axion-electromagnetic coupling is assumed. The origin of such a coupling can be traced to a low-energy string action through the requirement of quantum consistency. The resulting couplings in such models are an artifact of the compactification of the extra dimensions inevitably present in a string scenario. The moduli parameters which encode the compact manifold therefore play a crucial role in determining the axion-photon coupling. In this work we examine the possible bounds on the value of the compact modulus that emerge from the experimental limits on the coupling obtained from the PVLAS data. In particular, we focus on the Randall-Sundrum (RS) type of warped geometry model, whose modulus parameter is already restricted by the requirement that the gauge hierarchy problem be resolved in connection with the mass of the Higgs. We explore the bound on the modulus for a wide range of axion masses for both the birefringence and the dichroism data in PVLAS. We show that the proposed value of the modulus in the RS scenario can only be accommodated for axion masses ≳ 0.3 eV.
    Comment: 26 pages, 1 figure, LaTeX; added references, typos corrected. Minor changes in the text, a comment added in the Conclusion
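    The abstract does not write the coupling explicitly; the interaction conventionally assumed in PVLAS-type analyses, whose strength is what the modulus bounds constrain here, is the standard axion-photon term:

```latex
% Conventional axion-photon interaction (not quoted in the abstract):
\mathcal{L}_{a\gamma} = -\tfrac{1}{4}\, g_{a\gamma}\, a\, F_{\mu\nu} \tilde{F}^{\mu\nu}
                      = g_{a\gamma}\, a\, \vec{E} \cdot \vec{B}
```

    In a constant external magnetic field this coupling induces dichroism (real axion production depletes one polarization) and birefringence (virtual axion exchange retards one polarization), which is how the PVLAS limits on those two effects translate into bounds on g_{aγ} and, in this setup, on the modulus.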

    Granger causality and transfer entropy are equivalent for Gaussian variables

    Granger causality is a statistical notion of causal influence based on prediction via vector autoregression. Developed originally in the field of econometrics, it has since found application in a broader arena, particularly in neuroscience. More recently transfer entropy, an information-theoretic measure of time-directed information transfer between jointly dependent processes, has gained traction in a similarly wide field. While it has been recognized that the two concepts must be related, the exact relationship has until now not been formally described. Here we show that for Gaussian variables, Granger causality and transfer entropy are entirely equivalent, thus bridging autoregressive and information-theoretic approaches to data-driven causal inference.
    Comment: In review, Phys. Rev. Lett., Nov. 2009
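    For reference, the central identity can be stated compactly (natural logarithms; |·| denotes a determinant, reducing to the residual variance when X is univariate). Writing X⁻ and Y⁻ for the past values of the processes, the two measures from Y to X are:

```latex
% Granger causality: log ratio of residual generalized variances
F_{Y \to X} = \ln \frac{\left| \Sigma\!\left( X_t \mid X^{-} \right) \right|}
                       {\left| \Sigma\!\left( X_t \mid X^{-},\, Y^{-} \right) \right|}

% Transfer entropy: conditional mutual information, which for jointly
% Gaussian processes equals half the same log ratio
T_{Y \to X} = I\!\left( X_t \,;\, Y^{-} \mid X^{-} \right) = \tfrac{1}{2}\, F_{Y \to X}
```

    Hence F = 2T in the Gaussian case; the factor of 2 is a convention tied to measuring both quantities in nats.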

    Multivariate Granger Causality and Generalized Variance

    Granger causality analysis is a popular method for inference on directed interactions in complex systems of many variables. A shortcoming of the standard framework for Granger causality is that it only allows for examination of interactions between single (univariate) variables within a system, perhaps conditioned on other variables. However, interactions do not necessarily take place between single variables, but may occur among groups, or "ensembles", of variables. In this study we establish a principled framework for Granger causality in the context of causal interactions among two or more multivariate sets of variables. Building on Geweke's seminal 1982 work, we offer new justifications for one particular form of multivariate Granger causality based on the generalized variances of residual errors. Taken together, our results support a comprehensive and theoretically consistent extension of Granger causality to the multivariate case. Treated individually, they highlight several specific advantages of the generalized variance measure, which we illustrate using applications in neuroscience as an example. We further show how the measure can be used to define "partial" Granger causality in the multivariate context and we also motivate reformulations of "causal density" and "Granger autonomy". Our results are directly applicable to experimental data and promise to reveal new types of functional relations in complex systems, neural and otherwise.
    Comment: added 1 reference, minor change to discussion, typos corrected; 28 pages, 3 figures, 1 table, LaTeX
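    A minimal sketch of the generalized-variance form of multivariate Granger causality: fit a VAR by least squares with and without the source group's past, and take the log ratio of the determinants of the residual covariance matrices. The function names, model order and toy data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def lagged_design(data, p):
    """Stack p lags of every column of `data` (shape T x n) into a regressor matrix."""
    T = data.shape[0]
    return np.hstack([data[p - k - 1 : T - k - 1] for k in range(p)])  # (T - p, n * p)

def residual_cov(Y, X):
    """Least-squares regression of Y on X (with intercept); return the residual covariance."""
    X1 = np.hstack([np.ones((len(X), 1)), X])
    B, *_ = np.linalg.lstsq(X1, Y, rcond=None)
    E = Y - X1 @ B
    return np.atleast_2d(np.cov(E, rowvar=False))

def multivariate_gc(src, tgt, p=2):
    """Granger causality from the group `src` to the group `tgt` (both T x k arrays),
    F = ln[det(Sigma_reduced) / det(Sigma_full)], a ratio of generalized variances."""
    Y = tgt[p:]
    Sigma_full = residual_cov(Y, lagged_design(np.hstack([tgt, src]), p))
    Sigma_red = residual_cov(Y, lagged_design(tgt, p))
    return np.log(np.linalg.det(Sigma_red) / np.linalg.det(Sigma_full))

# Toy example: a 2-variable source group driving a 2-variable target group at lag 1
rng = np.random.default_rng(1)
T = 5000
src = rng.normal(size=(T, 2))
tgt = np.zeros((T, 2))
for t in range(1, T):
    tgt[t] = 0.4 * tgt[t - 1] + 0.5 * src[t - 1] + 0.1 * rng.normal(size=2)
print("F(src -> tgt) =", multivariate_gc(src, tgt, p=2))
```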

    A New Approach to Time Domain Classification of Broadband Noise in Gravitational Wave Data

    Broadband noise in gravitational wave (GW) detectors, also known as triggers, can often degrade the efficiency with which astrophysical search pipelines detect sources. It is important to understand their instrumental or environmental origin so that they can be eliminated or accounted for in the data. Since the number of triggers is large, data mining approaches such as clustering and classification are useful tools for this task. Classification of triggers based on a handful of discrete properties has been done in the past. Rich information is also available in the waveform, or 'shape', of the triggers, but it has so far received only limited exploration. This paper presents a new way to classify triggers that draws on both the trigger waveforms and their discrete physical properties, using a sequential combination of the Longest Common Sub-Sequence (LCSS) and LCSS coupled with Fast Time Series Evaluation (FTSE) for waveform classification, and multidimensional hierarchical classification (MHC) analysis for grouping based on physical properties. A generalized k-means algorithm is used with the LCSS (and LCSS+FTSE) for clustering the triggers, with a validity measure to determine the correct number of clusters in the absence of any prior knowledge. The results are demonstrated by simulations and by application to a segment of real LIGO data from the sixth science run.
    Comment: 16 pages, 16 figures
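    As a sketch of the waveform-similarity ingredient, the code below implements the standard LCSS dynamic program between two one-dimensional series, with a matching threshold eps and an optional warping window delta. The toy waveforms are synthetic, and neither the FTSE speed-up nor the generalized k-means clustering described in the abstract is reproduced here.

```python
import numpy as np

def lcss_similarity(a, b, eps=0.1, delta=None):
    """Longest Common Sub-Sequence similarity between two 1-D series.

    Samples a[i] and b[j] 'match' when |a[i] - b[j]| < eps and, if delta is given,
    |i - j| <= delta.  Returns the LCSS length normalized by min(len(a), len(b)),
    so values near 1 indicate that one waveform nearly contains the other.
    """
    n, m = len(a), len(b)
    L = np.zeros((n + 1, m + 1), dtype=int)
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if abs(a[i - 1] - b[j - 1]) < eps and (delta is None or abs(i - j) <= delta):
                L[i, j] = L[i - 1, j - 1] + 1
            else:
                L[i, j] = max(L[i - 1, j], L[i, j - 1])
    return L[n, m] / min(n, m)

# Toy 'triggers': two noisy copies of a damped sinusoid, plus unrelated broadband noise
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
w1 = np.exp(-20 * t) * np.sin(60 * t) + 0.02 * rng.normal(size=200)
w2 = np.exp(-20 * t) * np.sin(60 * t) + 0.02 * rng.normal(size=200)
w3 = 0.3 * rng.normal(size=200)

print("similar pair:   ", lcss_similarity(w1, w2, eps=0.1, delta=10))
print("dissimilar pair:", lcss_similarity(w1, w3, eps=0.1, delta=10))
```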

    Reading Videogames as (authorless) Literature

    This article presents the outcomes of research, funded by the Arts and Humanities Research Council in England and informed by work in the fields of new literacy research, gaming studies and the socio-cultural framing of education, for which the videogame L.A. Noire (Rockstar Games, 2011) was studied within the orthodox framing of the English Literature curriculum at A Level (pre-University) and Undergraduate (degree) level. There is a plethora of published research into the kinds of literacy practices evident in videogame play, virtual world engagement and related forms of digital reading and writing (Gee, 2003; Juul, 2005; Merchant, Gillen, Marsh and Davies, 2012; Apperley and Walsh, 2012; Bazalgette and Buckingham, 2012), as well as into the implications of such practices for home / school learning (Dowdall, 2006; Jenkins, 2006; Potter, 2012) and for teachers' own digital lives (Graham, 2012). Such studies have tended to focus on younger children; this research is also distinct from such work in the field in its exploration of the potential for certain kinds of videogame to be understood as 'digital transformations' of conventional 'schooled' literature. The outcomes of this project raise implications of such a conception for a further implementation of a 'reframed' literacy (Marsh, 2007) within the contemporary curriculum of a traditional and conservative 'subject'. A mixed methods approach was adopted. Firstly, students contributed to a gameplay blog requiring them to discuss their in-game experience through the 'language game' of English Literature, culminating in answering a question constructed with the idioms of the subject's set-text 'final examination'. Secondly, students taught their teachers to play L.A. Noire, with free choice over the context for this collaboration. Thirdly, participants returned to traditional roles in order to work through a set of study materials provided, designed to reproduce the conventions of the 'study guide' for literature education. Interviews were conducted after each phase and the outcomes informed a redrafting of the study materials, which are now available online for teachers; this is the 'practical' outcome of the research (Berger and McDougall, 2012). In the act of inserting the study of L.A. Noire into the English Literature curriculum as currently framed, this research moves, through a practical 'implementation', beyond longstanding debates around narratology and ludology (Frasca, 2003; Juul, 2005) in the field of game studies (Leaning, 2012) by making a direct connection to new literacy studies, and raises epistemological questions about 'subject identity', informed by Bernstein (1996) and Bourdieu (1986), and about the implications of digital transformations of texts both for ideas about cultural value in schooled literacy (Kendall and McDougall, 2011) and for the politics of 'expertise' in pedagogic relations (Ranciere, 2009; Bennett, Kendall and McDougall, 2012a).