
    On the explanatory power of principal components

    We show that if we have an orthogonal base (u_1, …, u_p) in a p-dimensional vector space, and select p+1 vectors v_1, …, v_p and w such that the vectors traverse the origin, then the probability of w being closer to all the vectors in the base than to v_1, …, v_p is at least 1/2 and converges, as p increases to infinity, to the probability that a standard normal variable lies in the interval [-1, 1], i.e., Φ(1) − Φ(−1) ≈ 0.6826. This result has relevant consequences for Principal Components Analysis in the context of regression and other learning settings, if we take the orthogonal base as the directions of the principal components.
    Comment: 10 pages, 3 figures
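    As a quick sanity check on the limiting value quoted above, the standard normal probability of the interval [-1, 1] can be evaluated directly. The snippet below is a minimal sketch using SciPy; it is not part of the paper itself.

```python
# Evaluate Phi(1) - Phi(-1), the standard normal probability of [-1, 1].
from scipy.stats import norm

p = norm.cdf(1.0) - norm.cdf(-1.0)
print(f"Phi(1) - Phi(-1) = {p:.6f}")  # 0.682689..., i.e. about 0.6826
```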

    Impact of public release of performance data on the behaviour of healthcare consumers and providers.

    BACKGROUND: It is becoming increasingly common to publish information about the quality and performance of healthcare organisations and individual professionals. However, we do not know how this information is used, or the extent to which such reporting leads to quality improvement by changing the behaviour of healthcare consumers, providers, and purchasers.

    OBJECTIVES: To estimate the effects of public release of performance data, from any source, on changing the healthcare utilisation behaviour of healthcare consumers, providers (professionals and organisations), and purchasers of care. In addition, we sought to estimate the effects on healthcare provider performance, patient outcomes, and staff morale.

    SEARCH METHODS: We searched CENTRAL, MEDLINE, Embase, and two trials registers on 26 June 2017. We checked reference lists of all included studies to identify additional studies.

    SELECTION CRITERIA: We searched for randomised or non-randomised trials, interrupted time series, and controlled before-after studies of the effects of publicly releasing data regarding any aspect of the performance of healthcare organisations or professionals. Each study had to report at least one main outcome related to selecting or changing care.

    DATA COLLECTION AND ANALYSIS: Two review authors independently screened studies for eligibility and extracted data. For each study, we extracted data about the target groups (healthcare consumers, healthcare providers, and healthcare purchasers), performance data, main outcomes (choice of healthcare provider, and improvement by means of changes in care), and other outcomes (awareness, attitude, knowledge of performance data, and costs). Given the substantial degree of clinical and methodological heterogeneity between the studies, we presented the findings for each policy in a structured format, but did not undertake a meta-analysis.

    MAIN RESULTS: We included 12 studies that analysed data from more than 7570 providers (e.g. professionals and organisations), and a further 3,333,386 clinical encounters (e.g. patient referrals, prescriptions). We included four cluster-randomised trials, one cluster-non-randomised trial, six interrupted time series studies, and one controlled before-after study. Eight studies were undertaken in the USA, and one each in Canada, Korea, China, and The Netherlands. Four studies examined the effect of public release of performance data on consumer healthcare choices, and four on improving quality. There was low-certainty evidence that public release of performance data may make little or no difference to long-term healthcare utilisation by healthcare consumers (3 studies; 18,294 insurance plan beneficiaries), or providers (4 studies; 3,000,000 births, and 67 healthcare providers), or to provider performance (1 study; 82 providers). However, there was also low-certainty evidence to suggest that public release of performance data may slightly improve some patient outcomes (5 studies; 315,092 hospitalisations, and 7502 providers). There was low-certainty evidence from a single study to suggest that public release of performance data may have differential effects on disadvantaged populations. There was no evidence about effects on healthcare utilisation decisions by purchasers, or adverse effects.

    AUTHORS' CONCLUSIONS: The existing evidence base is inadequate to directly inform policy and practice. Further studies should consider whether public release of performance data can improve patient outcomes, as well as healthcare processes.

    Predicting a Protein's Stability under a Million Mutations

    Stabilizing proteins is a foundational step in protein engineering. However, the evolutionary pressure of all extant proteins makes identifying the scarce number of mutations that will improve thermodynamic stability challenging. Deep learning has recently emerged as a powerful tool for identifying promising mutations. Existing approaches, however, are computationally expensive, as the number of model inferences scales with the number of mutations queried. Our main contribution is a simple, parallel decoding algorithm. Our Mutate Everything is capable of predicting the effect of all single and double mutations in one forward pass. It is even versatile enough to predict higher-order mutations with minimal computational overhead. We build Mutate Everything on top of ESM2 and AlphaFold, neither of which was trained to predict thermodynamic stability. We trained on the Mega-Scale cDNA proteolysis dataset and achieved state-of-the-art performance on single and higher-order mutations on the S669, ProTherm, and ProteinGym datasets. Code is available at https://github.com/jozhang97/MutateEverything
    Comment: NeurIPS 2023. Code available at https://github.com/jozhang97/MutateEverything
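    To illustrate the "all single mutations from one forward pass" idea, the sketch below scores every single amino-acid substitution of a sequence using wild-type-marginal log-odds from an ESM2 masked language model. This is not the Mutate Everything model or its trained stability (ddG) head; it is a standard zero-shot heuristic shown only as an analogy. The checkpoint name and example sequence are assumptions.

```python
# Sketch: score every single substitution of a protein from ONE forward pass
# of ESM2, using log-odds vs the wild-type residue as a crude effect proxy.
# NOT the paper's parallel decoder or ddG predictor; illustrative only.
import torch
from transformers import AutoTokenizer, EsmForMaskedLM

model_name = "facebook/esm2_t33_650M_UR50D"  # assumed public ESM2 checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = EsmForMaskedLM.from_pretrained(model_name).eval()

sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"  # hypothetical example sequence

inputs = tokenizer(sequence, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits[0]               # (L + special tokens, vocab)

log_probs = torch.log_softmax(logits[1:-1], dim=-1)  # drop <cls>/<eos> rows

amino_acids = list("ACDEFGHIKLMNPQRSTVWY")
aa_ids = torch.tensor(tokenizer.convert_tokens_to_ids(amino_acids))
wt_ids = inputs["input_ids"][0, 1:-1]

# scores[i, j] = log P(aa_j at position i) - log P(wild-type residue at position i)
scores = log_probs[:, aa_ids] - log_probs.gather(1, wt_ids.unsqueeze(1))

# Example usage: the five substitutions this heuristic ranks highest.
top = scores.flatten().topk(5)
for rank, (val, idx) in enumerate(zip(top.values, top.indices), 1):
    pos, aa = divmod(idx.item(), len(amino_acids))
    print(f"{rank}. {sequence[pos]}{pos + 1}{amino_acids[aa]}  log-odds={val:.2f}")
```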

    On the Transverse-Traceless Projection in Lattice Simulations of Gravitational Wave Production

    It has recently been pointed out that the usual procedure employed to obtain the transverse-traceless (TT) part of metric perturbations in lattice simulations was inconsistent with the fact that those fields live on the lattice and not in the continuum. It was claimed that this could lead to a larger amplitude and a wrong shape for the gravitational wave (GW) spectra obtained in numerical simulations of (p)reheating. In order to address this issue, we have defined a consistent lattice prescription for extracting the TT part of the metric perturbations. We demonstrate explicitly that the GW spectra obtained with the old continuum-based TT projection differ only marginally in amplitude and shape from the new lattice-based ones. We conclude that one can therefore trust the predictions in the literature for the spectra of GW produced during (p)reheating and similar scenarios simulated on a lattice.
    Comment: 22 pages, 8 figures, Submitted to JCAP
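    For reference, the continuum-based prescription discussed above is the standard Fourier-space TT projector, stated below. The lattice-consistent version introduced in the paper modifies the momenta entering this projector; its precise form is given in the paper and is not reproduced here.

```latex
% Standard continuum TT projection in Fourier space (the "old" prescription):
\[
  h_{ij}^{\mathrm{TT}}(\mathbf{k}) = \Lambda_{ij,kl}(\hat{\mathbf{k}})\, h_{kl}(\mathbf{k}),
  \qquad
  \Lambda_{ij,kl}(\hat{\mathbf{k}}) = P_{ik}P_{jl} - \tfrac{1}{2}\, P_{ij}P_{kl},
  \qquad
  P_{ij}(\hat{\mathbf{k}}) = \delta_{ij} - \hat{k}_i \hat{k}_j .
\]
```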

    Katanin P60 Targets Microtubules with Defects


    Polyamines as an ecofriendly postharvest tool to maintain fruit quality

    Polyamines (PAs) are natural compounds involved in a wide range of plant growth and developmental processes, such as cell division, dormancy breaking, germination, development of flower buds, fruit set, growth and ripening, as well as in plant responses to environmental stresses including chilling injury. This chapter will focus on the role of PAs in fruit growth and ripening, with special emphasis on the effects of pre- and postharvest PA treatments on fruit quality attributes, bioactive constituents with antioxidant activity, and tolerance of fruit to chilling injury damage. The results of this chapter provide evidence for the numerous beneficial effects of exogenous PA treatments, applied both at pre- and postharvest stages, on fruit quality attributes, including their concentrations of antioxidant compounds. Taking into account that PAs are naturally occurring molecules, their application as a pre- or postharvest treatment could be considered an environmentally compatible tool, as they can be metabolized by fruit cells. In addition, it should be pointed out that although exogenous application of PAs enhances their endogenous levels, the concentrations remain far lower than toxic ones. Since modern agriculture is searching for effective biological molecules with well-known metabolic effects but without toxicological effects, PA treatments may provide a possible answer.

    Dark Vector-Gauge-Boson Model

    A model based on SU(3)_C × SU(2)_L × U(1)_Y × SU(2)_N has recently been proposed, where the SU(2)_N vector gauge bosons are neutral, so that a vector dark-matter candidate is possible and is constrained by data to be lighter than about 1 TeV. We explore further implications of this model, including a detailed study of its Higgs sector. We improve on its dark-matter phenomenology, as well as its discovery reach at the LHC (Large Hadron Collider).
    Comment: 15 pages, 4 figures

    Chesapeake Bay benthic community restoration goals

    Benthic macroinvertebrate assemblages have been an integral part of the Chesapeake Bay monitoring program since its inception due to their ecological importance and their value as biological indicators. The condition of benthic assemblages reflects an integration of temporally variable environmental conditions and the effects of multiple types of environmental stresses. As such, benthic assemblages provide a useful complement to more temporally variable chemical and water quality monitoring measures. While assessments using benthic monitoring data have been useful for characterizing changes in environmental conditions at individual sites over time, and for relating the condition of sites to pollution loadings and sources, the full potential of these assessments for addressing larger management questions, such as "What is the overall condition of the Bay?" or "How does the condition of various tributaries compare?", has not yet been realized. Regional-scale assessments of ecological status and trends using benthic assemblages are limited by the fact that benthic assemblages are strongly influenced by naturally varying habitat elements, such as salinity, sediment type, and depth. Such natural variability confounds interpretation of benthic community differences as simple responses to anthropogenic environmental perturbations. An additional limitation is that different sampling methodologies used in various programs often constrain the extent to which the benthic data can be integrated for a unified assessment. The objective of this project was to develop a practical and conceptually sound framework for assessing benthic environmental conditions in Chesapeake Bay that would address the general constraints and limitations just described. This was accomplished by standardizing benthic data from several different monitoring programs to allow their integration into a single, coherent database. From that database, a set of measures (the Chesapeake Bay Benthic Restoration Goals) was developed to describe the characteristics of benthic assemblages expected at sites having little evidence of environmental stress or disturbance. Using these goals, benthic data from any part of the Bay could be compared to determine whether conditions at that site met, exceeded, or fell below the expectations defined for reference sites in similar habitats.
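    To make the comparison step concrete, the sketch below shows, in Python, how site metrics might be scored against habitat-stratified reference expectations. The metric names, habitat classes, and threshold values are hypothetical placeholders, not the published Chesapeake Bay Benthic Restoration Goals.

```python
# Hypothetical sketch of habitat-stratified benchmarking of benthic metrics.
# Thresholds, habitat classes, and metric names are illustrative only.
from dataclasses import dataclass

# Reference-site expectations per (habitat, metric): (lower bound, upper bound)
REFERENCE_GOALS = {
    ("low_mesohaline_mud", "shannon_diversity"): (1.8, 3.0),
    ("low_mesohaline_mud", "total_abundance_per_m2"): (1500.0, 6000.0),
    ("polyhaline_sand", "shannon_diversity"): (2.5, 3.8),
    ("polyhaline_sand", "total_abundance_per_m2"): (2000.0, 8000.0),
}

@dataclass
class SiteSample:
    site_id: str
    habitat: str
    metrics: dict  # metric name -> measured value

def score_site(sample: SiteSample) -> dict:
    """Classify each metric as 'below', 'meets', or 'above' its reference goal."""
    result = {}
    for metric, value in sample.metrics.items():
        goal = REFERENCE_GOALS.get((sample.habitat, metric))
        if goal is None:
            result[metric] = "no goal defined for this habitat"
            continue
        lo, hi = goal
        result[metric] = "below" if value < lo else "above" if value > hi else "meets"
    return result

# Example usage with a fabricated sample.
sample = SiteSample(
    site_id="CB-042",
    habitat="low_mesohaline_mud",
    metrics={"shannon_diversity": 1.4, "total_abundance_per_m2": 3200.0},
)
print(score_site(sample))  # {'shannon_diversity': 'below', 'total_abundance_per_m2': 'meets'}
```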