
    Measuring Information Transfer

    An information theoretic measure is derived that quantifies the statistical coherence between systems evolving in time. The standard time-delayed mutual information fails to distinguish information that is actually exchanged from shared information due to common history and input signals. In our new approach, these influences are excluded by appropriate conditioning of transition probabilities. The resulting transfer entropy is able to distinguish driving and responding elements and to detect asymmetry in the coupling of subsystems.
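
    In the notation this measure is usually written in (history lengths k and l are assumed here), the transfer entropy from a process Y to a process X conditions the transition probabilities of X on the past of both processes:

        T_{Y \to X} = \sum p(x_{n+1}, x_n^{(k)}, y_n^{(l)}) \log \frac{p(x_{n+1} \mid x_n^{(k)}, y_n^{(l)})}{p(x_{n+1} \mid x_n^{(k)})}

    The measure vanishes when Y carries no information about the next state of X beyond what X's own history already provides, which is what makes it asymmetric and able to separate driving from responding elements.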

    Webteaching: sequencing of subject matter in relation to prior knowledge of pupils

    Two experiments are discussed in which the sequencing procedure of webteaching is compared with a linear sequence for the presentation of text material. In the first experiment, variations in the level of prior knowledge of pupils were studied for their influence on the sequencing mode of text presentation. Prior knowledge greatly reduced the size of the effect of the sequencing procedure. In the second experiment, pupils with a low level of prior knowledge studied a text, following either a websequence or a linear sequence. Webteaching was superior to linear teaching on a number of dependent variables. It is concluded that webteaching is an effective sequencing procedure in those cases where substantial new learning is required.

    Test your surrogate data before you test for nonlinearity

    The schemes for the generation of surrogate data used to test the null hypothesis of a linear stochastic process undergoing a nonlinear static transform are investigated as to their consistency in representing the null hypothesis. In particular, we pinpoint some important caveats of the prominent algorithm of amplitude adjusted Fourier transform surrogates (AAFT) and compare it to the iterated AAFT (IAAFT), which is more consistent in representing the null hypothesis. It turns out that in many applications with real data the inferences of nonlinearity after marginal rejection of the null hypothesis were premature and have to be re-investigated, taking into account the inaccuracies in the AAFT algorithm, mainly concerning the mismatching of the linear correlations. In order to deal with such inaccuracies, we propose the use of linear together with nonlinear polynomials as discriminating statistics. The application of this setup to some well-known real data sets cautions against the use of the AAFT algorithm.
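
    As a concrete reference point, the following is a minimal numpy sketch of the AAFT scheme the abstract critiques; the function name and structure are illustrative, not the authors' code, and the final rank-remapping step is precisely where the spectrum mismatch discussed above creeps in.

    import numpy as np

    def aaft_surrogate(x, rng=None):
        # Amplitude-adjusted Fourier transform (AAFT) surrogate: gaussianize
        # the series by rank-matching to white noise, randomize the Fourier
        # phases, then map back onto the original amplitude distribution.
        rng = np.random.default_rng() if rng is None else rng
        x = np.asarray(x, dtype=float)
        n = x.size

        # Step 1: rank-remap sorted Gaussian noise onto the ordering of x.
        gauss = np.sort(rng.standard_normal(n))
        ranks = np.argsort(np.argsort(x))
        y = gauss[ranks]

        # Step 2: randomize the Fourier phases of the gaussianized series.
        fy = np.fft.rfft(y)
        phases = rng.uniform(0.0, 2.0 * np.pi, fy.size)
        phases[0] = 0.0                      # keep the mean component real
        if n % 2 == 0:
            phases[-1] = 0.0                 # the Nyquist bin must stay real
        y_rand = np.fft.irfft(np.abs(fy) * np.exp(1j * phases), n)

        # Step 3: rank-remap the original amplitudes onto the new ordering;
        # this restores the amplitude distribution exactly but only
        # approximately preserves the linear correlations, which is the
        # inaccuracy the IAAFT iteration was designed to remove.
        return np.sort(x)[np.argsort(np.argsort(y_rand))]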

    A dynamic and multifunctional account of middle‐range theories

    This article develops a novel account of middle‐range theories for combining theoretical and empirical analysis in explanatory sociology. I first revisit Robert K. Merton’s original ideas on middle‐range theories and identify a tension between his developmental approach to middle‐range theorizing that recognizes multiple functions of theories in sociological research and his static definition of the concept of middle‐range theory that focuses only on empirical testing of theories. Drawing on Merton's ideas on theorizing and recent discussions on mechanism‐based explanations, I argue that this tension can be resolved by decomposing a middle‐range theory into three interrelated and evolving components that perform different functions in sociological research: (i) a conceptual framework about social phenomena that is a set of interrelated concepts that evolve in close connection with empirical analysis; (ii) a mechanism schema that is an abstract and incomplete description of a social mechanism; and (iii) a cluster of all mechanism‐based explanations of social phenomena that are based on the particular mechanism schema. I show how these components develop over time and how they serve different functions in sociological theorizing and research. Finally, I illustrate these ideas by discussing Merton’s theory of the Matthew effect in science and its more recent applications in sociology.

    Exploration of the beliefs and experiences of Aboriginal people with cancer in Western Australia: a methodology to acknowledge cultural difference and build understanding

    Background: Aboriginal Australians experience poorer outcomes, and are 2.5 times more likely to die from cancer than non-Aboriginal people, even after adjustment for stage of diagnosis, cancer treatment and comorbidities. They are also less likely to present early as a result of symptoms and to access treatment. Psycho-social factors affect Aboriginal people's willingness and ability to participate in cancer-related screening and treatment services, but little exploration of this has occurred within Australia to date. The current research adopted a phenomenological qualitative approach to understand and explore the lived experiences of Aboriginal Australians with cancer and their beliefs and understanding around this disease in Western Australia (WA). This paper details considerations in the design and process of conducting the research.

    Methods/Design: The National Health and Medical Research Council (NHMRC) guidelines for ethical conduct of Aboriginal research were followed. Researchers acknowledged the past negative experiences of Aboriginal people with research and were keen to build trust and relationships prior to conducting research with them. Thirty in-depth interviews with Aboriginal people affected by cancer and twenty with health service providers were carried out in urban, rural and remote areas of WA. Interviews were audio-recorded, transcribed verbatim and coded independently by two researchers. NVivo7 software was used to assist data management and analysis. Participants' narratives were divided into broad categories to allow identification of key themes and discussed by the research team.

    Discussion and conclusion: Key issues specific to Aboriginal research include the need for the research process to be relationship-based, respectful, culturally appropriate and inclusive of Aboriginal people. Researchers are accountable to both participants and the wider community for reporting their findings and for research translation so that the research outcomes benefit the Aboriginal community. There are a number of factors that influence whether the desired level of engagement can be achieved in practice. These include the level of resourcing for the project and the researchers' efforts to ensure dissemination and research translation; and the capacity of the Aboriginal community to engage with research given other demands upon their time.

    Statistical Theory of Spin Relaxation and Diffusion in Solids

    A comprehensive theoretical description is given for the spin relaxation and diffusion in solids. The formulation is made in a general statistical-mechanical way. The method of the nonequilibrium statistical operator (NSO) developed by D. N. Zubarev is employed to analyze the relaxation dynamics of a spin subsystem. Perturbation of this subsystem in solids may produce a nonequilibrium state which then relaxes to an equilibrium state due to the interaction between the particles or with a thermal bath (lattice). The generalized kinetic equations were derived previously for a system weakly coupled to a thermal bath to elucidate the nature of transport and relaxation processes. In this paper, these results are used to describe the relaxation and diffusion of nuclear spins in solids. The aim is to formulate a consistent and coherent microscopic description of nuclear magnetic relaxation and diffusion in solids. The nuclear spin-lattice relaxation is considered and the Gorter relation is derived. As an example, a theory of spin diffusion of the nuclear magnetic moment in dilute alloys (like Cu-Mn) is developed. It is shown that, due to the dipolar interaction between host nuclear spins and impurity spins, a nonuniform distribution in the host nuclear spin system will occur, and consequently the macroscopic relaxation time will be strongly determined by the spin diffusion. Explicit expressions for the relaxation time in certain physically relevant cases are given.
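
    For orientation, the phenomenological picture that underlies such diffusion-limited relaxation treatments (notation assumed here, not taken from the paper) describes the local nuclear magnetization m(r, t) near a dilute impurity by a diffusion equation with a dipolar sink term:

        \frac{\partial m(\mathbf{r},t)}{\partial t} = D \nabla^2 m(\mathbf{r},t) - \frac{C}{r^6}\,\bigl(m(\mathbf{r},t) - m_0\bigr)

    Here D is the spin-diffusion coefficient and C/r^6 the direct dipolar relaxation rate to the impurity; the macroscopic relaxation time is set by the competition between the two terms, which is why it is strongly determined by spin diffusion.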

    Proteomic profiling of cardiac tissue by isolation of nuclei tagged in specific cell types (INTACT)

    The proper dissection of the molecular mechanisms governing the specification and differentiation of specific cell types requires isolation of pure cell populations from heterogeneous tissues and whole organisms. Here, we describe a method for purification of nuclei from defined cell or tissue types in vertebrate embryos using INTACT (isolation of nuclei tagged in specific cell types). This method, previously developed in plants, flies and worms, utilizes in vivo tagging of the nuclear envelope with biotin and the subsequent affinity purification of the labeled nuclei. In this study we successfully purified nuclei of cardiac and skeletal muscle from Xenopus using this strategy. We went on to demonstrate the utility of this approach by coupling INTACT with liquid chromatography-tandem mass spectrometry (LC-MS/MS) proteomic methodologies to profile proteins expressed in the nuclei of developing hearts. From these studies we have identified the Xenopus orthologs of 12 human proteins encoded by genes that, when mutated in humans, lead to congenital heart disease. Thus, by combining these technologies we are able to identify tissue-specific proteins that are expressed and required for normal vertebrate organ development.

    Effect of promoter architecture on the cell-to-cell variability in gene expression

    According to recent experimental evidence, the architecture of a promoter, defined as the number, strength and regulatory role of the operators that control the promoter, plays a major role in determining the level of cell-to-cell variability in gene expression. These quantitative experiments call for a corresponding modeling effort that addresses the question of how changes in promoter architecture affect noise in gene expression in a systematic rather than case-by-case fashion. In this article, we make such a systematic investigation, based on a simple microscopic model of gene regulation that incorporates stochastic effects. In particular, we show how operator strength and operator multiplicity affect this variability. We examine different modes of transcription factor binding to complex promoters (cooperative, independent, simultaneous) and how each of these affects the level of variability in transcription product from cell to cell. We propose that direct comparison between in vivo single-cell experiments and theoretical predictions for the moments of the probability distribution of mRNA number per cell can discriminate between different kinetic models of gene regulation.
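
    As an illustration of the kind of stochastic modeling involved, here is a minimal Gillespie-style sketch of a two-state promoter; the model and all rate parameters are generic stand-ins, not the architectures studied in the article. Comparing the Fano factor (variance over mean) of the sampled mRNA counts with the Poisson value of 1 is one way to quantify the cell-to-cell variability discussed above.

    import numpy as np

    def gillespie_two_state(k_on, k_off, k_tx, k_deg, t_end, rng):
        # One cell: the promoter flips OFF<->ON, transcribes mRNA while ON,
        # and transcripts decay; returns the mRNA count sampled at t_end.
        t, state, m = 0.0, 0, 0
        while True:
            rates = np.array([
                k_on if state == 0 else 0.0,   # promoter turns ON
                k_off if state == 1 else 0.0,  # promoter turns OFF
                k_tx if state == 1 else 0.0,   # transcription
                k_deg * m,                     # mRNA degradation
            ])
            total = rates.sum()
            t += rng.exponential(1.0 / total)
            if t > t_end:
                return m
            event = rng.choice(4, p=rates / total)
            if event == 0:
                state = 1
            elif event == 1:
                state = 0
            elif event == 2:
                m += 1
            else:
                m -= 1

    # Simulate a population of cells and summarize the variability.
    rng = np.random.default_rng(0)
    cells = [gillespie_two_state(0.5, 0.5, 20.0, 1.0, 20.0, rng) for _ in range(500)]
    mean, var = np.mean(cells), np.var(cells)
    print(f"mean mRNA = {mean:.1f}, Fano factor = {var / mean:.2f}")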

    Distributive justice with and without culture

    Academic treatments of distributive justice normally adopt a static approach centred on resource allocation among a set of individual agents. The resulting models, expressed in mathematical language, make no allowance for culture, as they never engage with the society’s way of life or the moulding of individuals within society. This paper compares the static approach to distributive justice with a cultural one, arguing that a case for redistribution should rest upon its cultural effects in assisting well-being and social cohesion. Unless we recognise culture, we can have little understanding of why inequalities matter, where they come from, and how they might be reduced. Redistribution may be motivated by universal value judgements taken from external sources, but it also entails internal cultural changes that refashion social relations through cumulative causation. In practical terms, it has to penetrate beyond reallocating resource endowments to bring revised attitudes in a society less tolerant of unequal outcomes. Egalitarian reforms will flourish only if they generate and reflect an egalitarian culture.