
    On Exceptional Times for generalized Fleming-Viot Processes with Mutations

    If $\mathbf Y$ is a standard Fleming-Viot process with constant mutation rate (in the infinitely many sites model), then it is well known that for each $t>0$ the measure $\mathbf Y_t$ is purely atomic with infinitely many atoms. However, Schmuland proved that there is a critical value for the mutation rate under which, almost surely, there are exceptional times at which $\mathbf Y$ is a finite sum of weighted Dirac masses. In the present work we discuss the existence of such exceptional times for generalized Fleming-Viot processes. In the case of Beta-Fleming-Viot processes with index $\alpha \in\, ]1,2[$ we show that, irrespective of the mutation rate and of $\alpha$, the number of atoms is almost surely always infinite. The proof combines a Pitman-Yor type representation with a disintegration formula, Lamperti's transformation for self-similar processes, and covering results for Poisson point processes.
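For orientation, a standard identification from the coalescent literature (not spelled out in the abstract): the Beta-Fleming-Viot process with index $\alpha\in\,]1,2[$ is the generalized Fleming-Viot process whose reproduction events are driven by the $\mathrm{Beta}(2-\alpha,\alpha)$ measure

```latex
\Lambda(\mathrm{d}x) \;=\; \frac{x^{1-\alpha}(1-x)^{\alpha-1}}{\Gamma(2-\alpha)\,\Gamma(\alpha)}\,\mathrm{d}x,
\qquad x \in (0,1),\quad \alpha \in\, ]1,2[,
```

whose genealogy is the Beta$(2-\alpha,\alpha)$-coalescent; the boundary case $\alpha \to 2$ recovers the standard (Kingman-type) Fleming-Viot process discussed in the first sentence.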

    Addressing Missing Data in Patient-Reported Outcome Measures (PROMs): Implications for the Use of PROMs for Comparing Provider Performance

    Patient-reported outcome measures (PROMs) are now routinely collected in the English National Health Service and used to compare and reward hospital performance within a high-powered pay-for-performance scheme. However, PROMs are prone to missing data. For example, hospitals often fail to administer the pre-operative questionnaire at hospital admission, or patients may refuse to participate or fail to return their post-operative questionnaire. A key concern with missing PROMs is that the individuals with complete information tend to be an unrepresentative sample of patients within each provider, so inferences based on the complete cases will be misleading. This study proposes a strategy for addressing missing data in the English PROM survey using multiple imputation techniques and investigates its impact on assessing provider performance. We find that inferences about relative provider performance are sensitive to the assumptions made about the reasons for the missing data.
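The abstract does not specify the imputation model, but the general multiple-imputation workflow it refers to can be sketched as follows. All names and the simulated PROM-like data below are hypothetical; a real analysis would use chained equations over many patient covariates rather than a single stochastic regression.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: pre-operative score x predicts post-operative score y;
# ~30% of post-operative questionnaires are not returned (missing at random here).
n = 500
x = rng.normal(50, 10, n)
y = 0.8 * x + rng.normal(0, 5, n)
miss = rng.random(n) < 0.3
y_obs = np.where(miss, np.nan, y)

def impute_once(x, y, rng):
    """One stochastic regression-imputation draw for the missing y values."""
    obs = ~np.isnan(y)
    slope, intercept = np.polyfit(x[obs], y[obs], 1)
    resid_sd = np.std(y[obs] - (slope * x[obs] + intercept))
    y_imp = y.copy()
    # Add residual noise so imputations reflect uncertainty, not just the fit.
    y_imp[~obs] = (slope * x[~obs] + intercept
                   + rng.normal(0, resid_sd, (~obs).sum()))
    return y_imp

# Multiple imputation: repeat M times and pool estimates (Rubin's rules;
# here only the point estimate is pooled, by averaging).
M = 20
means = [impute_once(x, y_obs, rng).mean() for _ in range(M)]
pooled_mean = float(np.mean(means))
```

The pooled estimate uses all patients rather than the unrepresentative complete cases, which is the core of the strategy the study evaluates.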

    Severe accordion effect: Myocardial ischemia due to wire complication during percutaneous coronary intervention: A case report

    A mechanical alteration during manoeuvring of stiff guidewires in tortuous coronary arteries frequently induces vessel wall shortening and coronary pseudostenosis, referred to as the accordion phenomenon. Withdrawal of the guidewires normally leads to complete resolution of the lesions. A case of this transient angiographic finding during percutaneous coronary intervention in a tortuous right coronary artery, which resulted in a flow-limiting effect and myocardial ischemia, is described in the present report. Differential diagnosis from potential procedural complications and interventional methodology issues are discussed, and similar reports are reviewed.

    Quantum fluctuations can promote or inhibit glass formation

    The very nature of glass is somewhat mysterious: while relaxation times in glasses are of sufficient magnitude that large-scale motion on the atomic level is essentially as slow as it is in the crystalline state, the structure of glass appears barely different from that of the liquid that produced it. Quantum mechanical systems ranging from electron liquids to superfluid helium appear to form glasses, but as yet no unifying framework exists connecting the classical and quantum regimes of vitrification. Here we develop new insights from theory and simulation into the quantum glass transition that surprisingly reveal distinct regions where quantum fluctuations can either promote or inhibit glass formation.
    Comment: Accepted for publication in Nature Physics. 22 pages, 3 figures, 1 table.

    Systemic Risk and Default Clustering for Large Financial Systems

    As is known in the finance risk and macroeconomics literature, risk-sharing in large portfolios may increase the probability of the creation of default clusters and of systemic risk. We review recent developments in mathematical and computational tools for the quantification of such phenomena. Limiting analyses such as the law of large numbers and central limit theorems allow us to approximate the distribution in large systems and to study quantities such as the loss distribution in large portfolios. Large deviations analysis allows us to study the tail of the loss distribution and to identify pathways to default clustering. Sensitivity analysis allows us to understand the most likely ways in which different effects, such as contagion and systematic risk, combine to lead to large default rates. Such results could give useful insights into how to optimally safeguard against such events.
    Comment: in Large Deviations and Asymptotic Methods in Finance (Editors: P. Friz, J. Gatheral, A. Gulisashvili, A. Jacquier, J. Teichmann), Springer Proceedings in Mathematics and Statistics, Vol. 110, 2015.
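As a concrete (and deliberately simplified) illustration of the law-of-large-numbers approximation mentioned above, consider a one-factor Gaussian copula portfolio, a standard setup in this literature. Conditional on the systematic factor, the loss fraction of a large pool concentrates at its conditional default probability. All parameter values below are hypothetical.

```python
import numpy as np
from statistics import NormalDist

nd = NormalDist()
rng = np.random.default_rng(1)

p, rho = 0.02, 0.3              # unconditional default prob, factor loading
n_names, n_sims = 1000, 5000
c = nd.inv_cdf(p)               # default threshold on the latent variable

# Monte Carlo: latent variable X_i = sqrt(rho)*Z + sqrt(1-rho)*eps_i,
# name i defaults in a scenario when X_i < c.
Z = rng.standard_normal(n_sims)                     # systematic factor
eps = rng.standard_normal((n_sims, n_names))        # idiosyncratic shocks
X = np.sqrt(rho) * Z[:, None] + np.sqrt(1 - rho) * eps
loss_frac = (X < c).mean(axis=1)                    # loss fraction per scenario

# Large-pool (law of large numbers) limit: given Z, the loss fraction
# concentrates at p(Z) = Phi((c - sqrt(rho) Z) / sqrt(1 - rho)).
pZ = np.array([nd.cdf((c - np.sqrt(rho) * z) / np.sqrt(1 - rho)) for z in Z])
```

The heavy right tail of `pZ` (scenarios with very negative `Z`) is exactly the default-clustering regime that the large deviations analysis in the survey targets.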

    Ultraviolet radiation shapes seaweed communities


    A general modeling and visualization tool for comparing different members of a group: application to studying tau-mediated regulation of microtubule dynamics

    Background: Innumerable biological investigations require comparing collections of molecules, cells or organisms to one another with respect to one or more of their properties. Almost all of these comparisons are performed manually, which can be susceptible to inadvertent bias as well as miss subtle effects. The development and application of computer-assisted analytical and interpretive tools could help address these issues and thereby dramatically improve these investigations.
    Results: We have developed novel computer-assisted analytical and interpretive tools and applied them to recent studies examining the ability of 3-repeat and 4-repeat tau to regulate the dynamic behavior of microtubules in vitro. More specifically, we have developed an automated and objective method to define growth, shortening and attenuation events from real-time videos of dynamic microtubules, and demonstrated its validity by comparing it to manually assessed data. Additionally, we have used the same data to develop a general strategy of building different models of interest, computing appropriate dissimilarity functions to compare them, and embedding them on a two-dimensional plot for visualization and easy comparison. Application of these methods to assess microtubule growth rates and growth rate distributions established the validity of the embedding procedure and revealed non-linearity in the relationship between the tau:tubulin molar ratio and growth rate distribution.
    Conclusion: This work addresses the need of the biological community for rigorously quantitative and generally applicable computational tools for comparative studies. The two-dimensional embedding method retains the inherent structure of the data, and yet markedly simplifies comparison between models and parameters of different samples. Most notably, even in cases where numerous parameters exist by which to compare the different samples, our embedding procedure provides a generally applicable computational strategy to detect subtle relationships between different molecules or conditions that might otherwise escape manual analyses.
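The compare-then-embed strategy described above can be sketched with classical multidimensional scaling. The simulated "growth-rate" samples and the quantile-based dissimilarity below are hypothetical stand-ins, since the abstract does not specify the paper's actual dissimilarity functions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical samples: growth-rate measurements at four tau:tubulin ratios,
# with the mean rate shifting as the ratio increases.
samples = {f"ratio_{r}": rng.normal(loc=2.0 + r, scale=0.5, size=200)
           for r in [0.0, 0.5, 1.0, 2.0]}
names = list(samples)

def dissimilarity(a, b):
    """Mean distance between sorted quantiles of two equal-size samples
    (an empirical 1-Wasserstein-style distance)."""
    return float(np.abs(np.sort(a) - np.sort(b)).mean())

D = np.array([[dissimilarity(samples[i], samples[j]) for j in names]
              for i in names])

# Classical MDS: double-center the squared dissimilarities and take the
# top-2 eigenvectors as 2-D coordinates for visualization.
n = len(names)
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D ** 2) @ J
w, V = np.linalg.eigh(B)                      # eigenvalues in ascending order
coords = V[:, -2:] * np.sqrt(np.maximum(w[-2:], 0))
```

Plotting `coords` places similar conditions near each other, which is the "easy comparison" property the abstract emphasizes.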

    The 9p21 susceptibility locus for coronary artery disease and the severity of coronary atherosclerosis

    Background: Case-control Genome-Wide Association Studies (GWAS) have identified single nucleotide polymorphisms (SNPs) at the 9p21 locus as risk factors for coronary artery disease (CAD). The locus does not contain a clear candidate gene. Hence, the results of GWAS have raised intense interest in delineating the basis for the observed association. We analyzed the association of 4 SNPs at the 9p21 locus with the severity and progression of coronary atherosclerosis, as determined by serial quantitative coronary angiograms (QCA) in the well-characterized Lipoprotein Coronary Atherosclerosis Study (LCAS) population. The LCAS is a randomized, placebo-controlled longitudinal follow-up study in patients with CAD conducted to test the effects of fluvastatin on progression or regression of coronary atherosclerosis.
    Methods: Extensive plasma lipid levels were measured at baseline and 2 1/2 years after randomization. Likewise, serial QCA was performed at baseline and upon completion of the study. We genotyped the population for 4 SNPs, previously identified as susceptibility SNPs for CAD in GWAS, using fluorogenic 5' nuclease assays. We reconstructed the haplotypes using Phase 2 and analyzed SNP and haplotype effects using the Thesias software as well as conventional statistical methods.
    Results: Only Caucasians were included, since they comprised 90% of the study population (332/371 with an available DNA sample). The 4 SNPs at the 9p21 locus were in tight linkage disequilibrium, leading to 3 common haplotypes in the LCAS population. We found no significant association between quantitative indices of severity of coronary atherosclerosis, such as minimal lumen diameter and the number of coronary lesions or occlusions, and the 9p21 SNPs and haplotypes. Likewise, there was no association between quantitative indices of progression of coronary atherosclerosis and the SNPs or haplotypes.
    Conclusion: We conclude that the 4 SNPs at the 9p21 locus analyzed in this study do not impart major effects on the severity or progression of coronary atherosclerosis. The effect size may be very modest, or the observed association of CAD with SNPs at the 9p21 locus in case-control GWAS may reflect involvement of vascular mechanisms not directly related to the severity or progression of coronary atherosclerosis.
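The kind of quantitative-trait association test summarised above can be illustrated in outline. The sketch below uses an additive genotype coding (0/1/2 risk alleles) and a permutation test, which is only one of several tests such a study could use; all data here are simulated, not the LCAS data, and the null is true by construction.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical cohort: additive genotype coding and a quantitative severity
# index (e.g. minimal lumen diameter), generated independently of genotype.
n = 332
genotype = rng.choice([0, 1, 2], size=n, p=[0.36, 0.48, 0.16])
severity = rng.normal(2.0, 0.5, n)

def slope(g, y):
    """Least-squares slope of severity on allele count."""
    return float(np.polyfit(g, y, 1)[0])

obs = slope(genotype, severity)

# Permutation test: shuffling phenotypes breaks any genotype-phenotype link,
# giving the null distribution of the slope.
perm = np.array([slope(genotype, rng.permutation(severity))
                 for _ in range(2000)])
p_value = float((np.abs(perm) >= abs(obs)).mean())
```

A non-significant `p_value` under this kind of test corresponds to the study's finding of no SNP effect on severity; the real analysis additionally handled haplotypes and covariates via the Thesias software.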

    Deferred imitation and declarative memory in domestic dogs

    This study demonstrates for the first time deferred imitation of novel actions in dogs (Canis familiaris) with retention intervals of 1.5 min, and memory of familiar actions with intervals ranging from 0.40 to 10 min. Eight dogs were trained using the 'Do as I do' method to match their own behaviour to actions displayed by a human demonstrator. They were then trained to wait for a short interval to elapse before they were allowed to show the previously demonstrated action. The dogs were then tested for memory of the demonstrated behaviour in various conditions, including the so-called two-action procedure and a control condition without demonstration. Dogs were typically able to reproduce familiar actions after intervals as long as 10 min, even if distracted by different activities during the retention interval, and were able to match their behaviour to the demonstration of a novel action after a delay of 1 min. In the two-action procedure, dogs were typically able to imitate the novel demonstrated behaviour after retention intervals of 1.5 min. The ability to encode and recall an action after a delay implies that facilitative processes cannot exhaustively explain the observed behavioural similarity and that dogs' imitative abilities are instead based on an enduring mental representation of the demonstration. Furthermore, the ability to imitate a novel action after a delay without previous practice suggests the presence of declarative memory in dogs. © 2013 Springer-Verlag Berlin Heidelberg.