
    Analytic regularity for a singularly perturbed system of reaction-diffusion equations with multiple scales: proofs

    We consider a coupled system of two singularly perturbed reaction-diffusion equations with two small parameters 0 < ε ≤ μ ≤ 1, each multiplying the highest derivative in its equation. The presence of these parameters causes the solutions to have boundary layers which overlap and interact, depending on the relative size of ε and μ. We construct full asymptotic expansions together with error bounds that cover the complete range 0 < ε ≤ μ ≤ 1. For the present case of analytic input data, we derive derivative growth estimates for the terms of the asymptotic expansion that are explicit in the perturbation parameters and the expansion order.
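    For a single equation of this type with analytic data, derivative growth estimates of the kind described are commonly stated in the following form (the constants C and K and the exact shape of the bound are a generic illustration, not taken from this paper):

```latex
\[
  \bigl\| u^{(n)} \bigr\|_{L^\infty(\Omega)}
    \le C \, K^{n} \, \max\{ n, \varepsilon^{-1} \}^{n},
  \qquad n \in \mathbb{N}_0 .
\]
```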

    Quantum resource estimates for computing elliptic curve discrete logarithms

    We give precise quantum resource estimates for Shor's algorithm to compute discrete logarithms on elliptic curves over prime fields. The estimates are derived from a simulation of a Toffoli gate network for controlled elliptic curve point addition, implemented within the framework of the quantum computing software tool suite LIQUi|⟩. We determine circuit implementations for reversible modular arithmetic, including modular addition, multiplication and inversion, as well as reversible elliptic curve point addition. We conclude that elliptic curve discrete logarithms on an elliptic curve defined over an n-bit prime field can be computed on a quantum computer with at most 9n + 2⌈log₂(n)⌉ + 10 qubits using a quantum circuit of at most 448n³ log₂(n) + 4090n³ Toffoli gates. We are able to classically simulate the Toffoli networks corresponding to the controlled elliptic curve point addition as the core piece of Shor's algorithm for the NIST standard curves P-192, P-224, P-256, P-384 and P-521. Our approach allows gate-level comparisons to recent resource estimates for Shor's factoring algorithm. The results also support estimates given earlier by Proos and Zalka and indicate that, for current parameters at comparable classical security levels, the number of qubits required to tackle elliptic curves is less than for attacking RSA, suggesting that ECC is indeed an easier target than RSA.
    Comment: 24 pages, 2 tables, 11 figures. v2: typos fixed and reference added. ASIACRYPT 2017
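    Since the stated bounds are closed-form in n, the per-curve resource counts can be tabulated directly; a minimal sketch (function names are our own, not from the paper):

```python
import math

def qubits(n: int) -> int:
    # Paper's upper bound on qubits: 9n + 2*ceil(log2(n)) + 10.
    return 9 * n + 2 * math.ceil(math.log2(n)) + 10

def toffoli_gates(n: int) -> float:
    # Paper's upper bound on Toffoli gates: 448 n^3 log2(n) + 4090 n^3.
    return 448 * n**3 * math.log2(n) + 4090 * n**3

# Tabulate the bounds for the NIST curves mentioned in the abstract.
for n in (192, 224, 256, 384, 521):
    print(f"P-{n}: {qubits(n)} qubits, {toffoli_gates(n):.2e} Toffoli gates")
```

    For example, P-256 comes out at 2330 qubits under this bound, roughly an order of magnitude fewer than the corresponding classical-security-equivalent RSA modulus would require.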

    A filament of dark matter between two clusters of galaxies

    It is a firm prediction of the concordance Cold Dark Matter (CDM) cosmological model that galaxy clusters live at the intersection of large-scale structure filaments. The thread-like structure of this "cosmic web" has been traced by galaxy redshift surveys for decades. More recently the Warm-Hot Intergalactic Medium (WHIM) residing in low redshift filaments has been observed in emission and absorption. However, a reliable direct detection of the underlying Dark Matter skeleton, which should contain more than half of all matter, remained elusive, as earlier candidates for such detections were either falsified or suffered from low signal-to-noise ratios and unphysical misalignments of dark and luminous matter. Here we report the detection of a dark matter filament connecting the two main components of the Abell 222/223 supercluster system from its weak gravitational lensing signal, both in a non-parametric mass reconstruction and in parametric model fits. This filament is coincident with an overdensity of galaxies and diffuse, soft X-ray emission and contributes mass comparable to that of an additional galaxy cluster to the total mass of the supercluster. Combined with X-ray observations, we place an upper limit of 0.09 on the hot gas fraction, the mass of X-ray emitting gas divided by the total mass, in the filament.
    Comment: Nature, in press

    Circulating markers of arterial thrombosis and late-stage age-related macular degeneration: a case-control study.

    PURPOSE: The aim of this study was to examine the relation of late-stage age-related macular degeneration (AMD) with markers of systemic atherothrombosis. METHODS: A hospital-based case-control study of AMD was undertaken in London, UK. Cases of AMD (n=81) and controls (n=77) were group matched for age and sex. Standard protocols were used for colour fundus photography and to classify AMD; physical examination included height, weight, history of or treatment for vascular-related diseases and smoking status. Blood samples were taken for measurement of fibrinogen, factor VIIc (FVIIc), factor VIIIc, prothrombin fragment F1.2 (F1.2), tissue plasminogen activator, and von Willebrand factor. Odds ratios from logistic regression analyses of each atherothrombotic marker with AMD were adjusted for age, sex, and established cardiovascular disease risk factors, including smoking, blood pressure, body mass index, and total cholesterol. RESULTS: After adjustment, FVIIc and possibly F1.2 were inversely associated with the risk of AMD; per 1 standard deviation increase in these markers the odds ratios were, respectively, 0.62 (95% confidence interval 0.40, 0.95) and 0.71 (0.46, 1.09). None of the other atherothrombotic risk factors appeared to be related to AMD status. There was weak evidence that aspirin is associated with a lower risk of AMD. CONCLUSIONS: This study does not provide strong evidence of associations between AMD and systemic markers of arterial thrombosis, but the potential effects of FVIIc and F1.2 are worthy of further investigation.

    Considering the Case for Biodiversity Cycles: Reexamining the Evidence for Periodicity in the Fossil Record

    Medvedev and Melott (2007) have suggested that periodicity in fossil biodiversity may be induced by cosmic rays which vary as the Solar System oscillates normal to the galactic disk. We re-examine the evidence for a 62 million year (Myr) periodicity in biodiversity throughout the Phanerozoic history of animal life reported by Rohde & Mueller (2005), as well as related questions of periodicity in origination and extinction. We find that the signal is robust against variations in methods of analysis, and is based on fluctuations in the Paleozoic and a substantial part of the Mesozoic. Examination of origination and extinction is somewhat ambiguous, with results depending upon procedure. Origination and extinction intensity as defined by RM may be affected by an artifact at 27 Myr in the duration of stratigraphic intervals. Nevertheless, when a procedure free of this artifact is implemented, the 27 Myr periodicity appears in origination, suggesting that the artifact may ultimately be based on a signal in the data. A 62 Myr feature appears in extinction, when this same procedure is used. We conclude that evidence for a periodicity at 62 Myr is robust, and evidence for periodicity at approximately 27 Myr is also present, albeit more ambiguous.
    Comment: Minor modifications to reflect final published version
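    Periodicity claims of this kind are typically assessed with a spectral estimate of a detrended diversity time series; a minimal illustration on synthetic data (the sampling, 62 Myr sinusoid, and noise level are invented for the demo and are not the study's data or method):

```python
import numpy as np

# Synthetic stand-in for a detrended biodiversity curve sampled every 1 Myr:
# a 62 Myr sinusoid plus Gaussian noise (amplitudes chosen for illustration).
rng = np.random.default_rng(0)
t = np.arange(0, 542)                              # ~Phanerozoic span in Myr
series = np.sin(2 * np.pi * t / 62) + 0.5 * rng.standard_normal(t.size)

# FFT periodogram: power as a function of frequency (cycles per Myr).
freqs = np.fft.rfftfreq(t.size, d=1.0)
power = np.abs(np.fft.rfft(series - series.mean()))**2
peak = freqs[1:][np.argmax(power[1:])]             # skip the zero-frequency bin
print(f"dominant period ~ {1 / peak:.1f} Myr")
```

    The recovered period lands near 62 Myr up to the frequency resolution of the finite record, which is one reason real analyses must also quantify significance against a noise model rather than just locating the tallest peak.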

    Evaluation of physicians' professional performance: An iterative development and validation study of multisource feedback instruments

    BACKGROUND: There is a global need to assess physicians' professional performance in actual clinical practice. Valid and reliable instruments are necessary to support these efforts. This study focuses on the reliability and validity, the influences of some sociodemographic biasing factors, associations between self and other evaluations, and the number of evaluations needed for reliable assessment of a physician based on the three instruments used for the multisource assessment of physicians' professional performance in the Netherlands. METHODS: This observational validation study of three instruments underlying multisource feedback (MSF) was set in 26 non-academic hospitals in the Netherlands. In total, 146 hospital-based physicians took part in the study. Each physician's professional performance was assessed by peers (physician colleagues), co-workers (including nurses, secretary assistants and other healthcare professionals) and patients. Physicians also completed a self-evaluation. Ratings of 864 peers, 894 co-workers and 1960 patients on MSF were available. We used principal components analysis and methods of classical test theory to evaluate the factor structure, reliability and validity of instruments. We used Pearson's correlation coefficient and linear mixed models to address other objectives. RESULTS: The peer, co-worker and patient instruments respectively had six factors, three factors and one factor with high internal consistencies (Cronbach's alpha 0.95 - 0.96). It appeared that only 2 percent of variance in the mean ratings could be attributed to biasing factors. Self-ratings were not correlated with peer, co-worker or patient ratings. However, ratings of peers, co-workers and patients were correlated. Five peer evaluations, five co-worker evaluations and 11 patient evaluations are required to achieve reliable results (reliability coefficient ≥ 0.70).
CONCLUSIONS: The study demonstrated that the three MSF instruments produced reliable and valid data for evaluating physicians' professional performance in the Netherlands. Scores from peers, co-workers and patients were not correlated with self-evaluations. Future research should examine improvement of performance when using MSF.
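In classical test theory, the number of raters needed to reach a target reliability is conventionally derived from the Spearman-Brown prophecy formula; a minimal sketch (the single-rater reliabilities below are invented placeholders, chosen only so the outputs match the reported counts of 5 and 11, not values from the study):

```python
import math

def raters_needed(single_rater_rel: float, target: float = 0.70) -> int:
    # Spearman-Brown prophecy: reliability of the mean of k raters is
    #   R_k = k*r / (1 + (k - 1)*r).
    # Solving R_k >= target for k gives the smallest sufficient rater count.
    r = single_rater_rel
    k = (target * (1 - r)) / (r * (1 - target))
    return math.ceil(k)

# Illustrative single-rater reliabilities (hypothetical values):
print(raters_needed(0.32))  # -> 5  (e.g. peer-type instrument)
print(raters_needed(0.18))  # -> 11 (e.g. patient-type instrument)
```

The formula makes the trade-off explicit: the lower the reliability of a single evaluation, the more evaluations must be averaged to reach the 0.70 threshold.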

    “Excellence R Us”: university research and the fetishisation of excellence

    The rhetoric of “excellence” is pervasive across the academy. It is used to refer to research outputs as well as researchers, theory and education, individuals and organisations, from art history to zoology. But does “excellence” actually mean anything? Does this pervasive narrative of “excellence” do any good? Drawing on a range of sources we interrogate “excellence” as a concept and find that it has no intrinsic meaning in academia. Rather it functions as a linguistic interchange mechanism. To investigate whether this linguistic function is useful we examine how the rhetoric of excellence combines with narratives of scarcity and competition to show that the hypercompetition that arises from the performance of “excellence” is completely at odds with the qualities of good research. We trace the roots of issues in reproducibility, fraud, and homophily to this rhetoric. But we also show that this rhetoric is an internal, and not primarily an external, imposition. We conclude by proposing an alternative rhetoric based on soundness and capacity-building. In the final analysis, it turns out that “excellence” is not excellent. Used in its current unqualified form it is a pernicious and dangerous rhetoric that undermines the very foundations of good research and scholarship.

    New directions in cellular therapy of cancer: a summary of the summit on cellular therapy for cancer

    A summit on cellular therapy for cancer discussed and presented advances related to the use of adoptive cellular therapy for melanoma and other cancers. The summit revealed that this field is advancing rapidly. Conventional cellular therapies, such as tumor infiltrating lymphocytes (TIL), are becoming more effective and more available. Gene therapy is becoming an important tool in adoptive cell therapy. Lymphocytes are being engineered to express high affinity T cell receptors (TCRs), chimeric antibody-T cell receptors (CARs) and cytokines. T cell subsets with more naïve and stem cell-like characteristics have been shown in pre-clinical models to be more effective than unselected populations, and it is now possible to reprogram T cells and to produce T cells with stem cell characteristics. In the future, combinations of adoptive transfer of T cells and specific vaccination against the cognate antigen can be envisaged to further enhance the effectiveness of these therapies.