    Profiling medical sites based on adverse events data for multi-center clinical trials

    Profiling medical sites is an important activity in both clinical research and practice. Many organizations provide public report cards comparing outcomes across hospitals. An analogous concept applies in multicenter clinical trials, where such "report cards" guide sponsors in choosing sites when designing a study, help identify areas of improvement for sites, and motivate sites to perform better. Sponsors incorporate comparative site performance into risk-based monitoring and central statistical monitoring. In clinical research, report cards are powerful tools for relating site performance to treatment benefits. This study evaluates approaches to estimating the proportion of adverse events at the site level in a multicenter clinical trial setting, as well as methods for detecting outlying sites. We address three topics. First, we assess the performance of different models for estimating adverse event rates, using Bayesian beta-binomial and binomial logit-normal models with MCMC estimation and fixed-effects maximum likelihood estimation (MLE). We vary sample sizes, the number of medical sites, overall adverse event rates, and the intraclass correlation within sites. Second, we compare the performance of these methods in identifying outlier sites, contrasting MLE and Bayesian approaches: a fixed-threshold rule detects outlier sites under the Bayesian approach, while a 95% interval-based approach is applied under the fixed-effects assumption. Third, we extend this approach to estimating multiple outcomes at the site level and detecting outlier sites, comparing a standard bivariate normal MLE method to a Bayesian bivariate binomial logit-normal MCMC model. All methods are examined using simulation studies. Results show that for single outcomes, the Bayesian beta-binomial MCMC method performs well under certain parametric conditions for both estimation and outlier detection. For multiple outcomes with a higher adverse event rate and a larger difference between outliers and non-outliers, both the Bayesian MCMC and MLE methods detect outlier sites well, irrespective of the correlation between outcomes.
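
    A minimal sketch of the single-outcome setting, with invented trial numbers. The paper fits the beta-binomial model by MCMC, but that model is conjugate, so the sketch below uses the closed-form Beta posterior per site instead, applies a fixed-threshold outlier rule, and shows the fixed-effects MLE (the raw proportion) for comparison. The prior, threshold, and simulated rates are all assumptions for illustration, not values from the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical trial: 20 sites, 30-70 patients each, true AE rate 10%,
# with two deliberately planted "outlier" sites at 25%.
n_sites = 20
n_pat = rng.integers(30, 70, size=n_sites)
true_rate = np.full(n_sites, 0.10)
true_rate[[3, 11]] = 0.25
events = rng.binomial(n_pat, true_rate)

# Weakly informative Beta(1, 9) prior centred near a 10% AE rate.
# Beta-binomial is conjugate, so each site's posterior is simply
# Beta(a + events, b + non-events); no MCMC is needed in this sketch.
a, b = 1.0, 9.0
post = stats.beta(a + events, b + n_pat - events)

# Fixed-effects MLE comparison: the raw site-level proportion.
mle = events / n_pat

# Fixed-threshold rule: flag a site if its posterior puts more than
# 80% probability on the AE rate exceeding 15% (both cutoffs assumed).
threshold, prob_cut = 0.15, 0.80
outlier = post.sf(threshold) > prob_cut
print("flagged sites:", np.where(outlier)[0])
print("their MLE rates:", np.round(mle[outlier], 3))
```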

    DNA Metabarcoding of Meiofaunal Communities Along the California Coast and Potential Abiotic Drivers of Distribution

    Meiofauna are abundant and diverse infaunal organisms, between 50 and 1000 μm in size, that are specially adapted to live in the interstitial spaces between sedimentary particles. Despite their ubiquity, they remain an understudied component of benthic systems. Modern DNA metabarcoding tools allow for the total sequencing of mixed communities from a single sample. Samples of meiofauna (n = 148) were collected at 10 sandy beaches along the coast of California during summer 2017 to characterize meiofauna community structure based on (1) latitude, (2) location north or south of Point Conception (a known biogeographic break), (3) tidal height and (4) sediment grain size and mineralogy. Meiofauna were found to form distinct communities within each domain (north/central/south California) and at each site. Southern sites had larger and more diverse meiofauna communities than northern sites, suggesting a rough fit to a latitudinal diversity gradient. Broad oceanographic conditions, particularly temperature and salinity, appear to affect meiofauna across larger scales. No significant distributional break was seen in the area surrounding Point Conception, although small changes to the proportions of major phyla were observed at Point Conception and San Francisco Bay. Within the tidal range, meiofauna richness generally decreased with increasing tidal height, although patterns were not uniform across sites. Meiofauna responded significantly to both sedimentological and mineralogical properties: richness declined with increasing mean grain size, and the most distinct communities correlated most strongly with the presence of potassium feldspars. This study is the broadest examination of meiofauna within California, and these results form a necessary foundation for future comparative studies using meiofauna as an ecological monitoring group.
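
    A toy sketch of just one reported relationship (richness declining with mean grain size), assuming invented per-sample numbers; the study's full multivariate community analyses and its n = 148 dataset are not reproduced here.

```python
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical metabarcoding summary: one row per sample, with taxon
# richness from the sequence data and measured sediment properties.
df = pd.DataFrame({
    "site": ["A", "A", "B", "B", "C", "C", "D", "D"],
    "richness": [42, 38, 55, 61, 23, 27, 49, 52],
    "mean_grain_um": [210, 230, 150, 160, 480, 460, 190, 180],
})

# Site-level summary, then a rank correlation testing whether richness
# declines as mean grain size increases (the pattern the study reports).
print(df.groupby("site").mean(numeric_only=True))
rho, p = spearmanr(df["richness"], df["mean_grain_um"])
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```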

    Road traffic pollution monitoring and modelling tools and the UK national air quality strategy.

    This paper provides an assessment of the tools required to fulfil the air quality management role now expected of local authorities within the UK. The use of a range of pollution monitoring tools in assessing air quality is discussed and illustrated with evidence from a number of previous studies of urban background and roadside pollution monitoring in Leicester. A number of approaches to pollution modelling currently available for deployment are examined. Subsequently, the modelling and monitoring tools are assessed against the requirements of local authorities establishing Air Quality Management Areas. Whilst the paper examines UK-based policy, the study is of wider international interest.
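
    The abstract refers to deployable pollution-modelling approaches without naming them. As a generic illustration of the simplest class of dispersion model, the sketch below evaluates a Gaussian plume with ground reflection for a single point source. Operational roadside models (and whichever specific tools the paper assesses) are more elaborate, and every parameter value here is invented.

```python
import numpy as np

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Steady-state Gaussian plume concentration (g/m^3) with ground
    reflection. q: emission rate (g/s); u: wind speed (m/s); y: crosswind
    offset (m); z: receptor height (m); h: effective source height (m);
    sigma_y, sigma_z: dispersion parameters (m), which in practice are
    functions of downwind distance and atmospheric stability."""
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - h)**2 / (2 * sigma_z**2))
                + np.exp(-(z + h)**2 / (2 * sigma_z**2)))
    return q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Invented roadside-like scenario: near-ground source, receptor at
# breathing height, hand-picked dispersion parameters.
c = gaussian_plume(q=0.02, u=3.0, y=0.0, z=1.5, h=0.5,
                   sigma_y=5.0, sigma_z=3.0)
print(f"Concentration: {c * 1e6:.1f} ug/m^3")
```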

    Distributed Computing Grid Experiences in CMS

    The CMS experiment is currently developing a computing system capable of serving, processing and archiving the large number of events that will be generated when the CMS detector starts taking data. During 2004 CMS undertook a large-scale data challenge to demonstrate the ability of the CMS computing system to cope with a sustained data-taking rate equivalent to 25% of the startup rate. Its goals were to run CMS event reconstruction at CERN for a sustained period at a 25 Hz input rate, to distribute the data to several regional centers, and to enable data access at those centers for analysis. Grid middleware was utilized to help complete all aspects of the challenge. To continue to provide scalable access from anywhere in the world to the data, CMS is developing a layer of software that uses Grid tools to gain access to data and resources, and that aims to provide physicists with a user-friendly interface for submitting their analysis jobs. This paper describes the data challenge experience with Grid infrastructure and the current development of the CMS analysis system.
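
    To give a sense of scale for the sustained rate quoted above, a back-of-envelope calculation: the 25 Hz figure comes from the abstract, while the raw event size and run length are assumptions for illustration only; they are not stated in the text.

```python
# Back-of-envelope for the 2004 CMS data challenge described above.
event_rate_hz = 25           # from the abstract
event_size_mb = 1.5          # assumed raw event size
seconds = 30 * 24 * 3600     # assumed one-month sustained run

throughput_mb_s = event_rate_hz * event_size_mb
total_tb = throughput_mb_s * seconds / 1e6

print(f"Sustained throughput: {throughput_mb_s:.1f} MB/s")
print(f"Data volume over 30 days: ~{total_tb:.0f} TB")
```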

    Soil biodiversity: functions, threats and tools for policy makers

    Human societies rely on the vast diversity of benefits provided by nature, such as food, fibres, construction materials, clean water, clean air and climate regulation. All the elements required for these ecosystem services depend on soil, and soil biodiversity is the driving force behind their regulation. With 2010 being the international year of biodiversity and with the growing attention in Europe on the importance of keeping soils healthy and capable of supporting human activities sustainably, now is the perfect time to raise awareness on preserving soil biodiversity. The objective of this report is to review the state of knowledge of soil biodiversity, its functions, its contribution to ecosystem services and its relevance for the sustainability of human society. In line with the definition of biodiversity given in the 1992 Rio de Janeiro Convention, soil biodiversity can be defined as the variation in soil life, from genes to communities, and the variation in soil habitats, from micro-aggregates to entire landscapes. Bio Intelligence Service, IRD, and NIOO, report for the European Commission (DG Environment).

    Supporting Success: Why and How to Improve Quality in After-School Programs

    This report examines, step by step, the program improvement strategies that allowed The James Irvine Foundation's CORAL initiative to achieve the levels of quality needed to boost the academic success of participating students, and it makes specific policy and funding suggestions for improving program performance. Communities Organizing Resources to Advance Learning (CORAL) is an eight-year, $58 million after-school initiative to improve educational achievement in low-performing schools in five California cities.

    LCOGT Network Observatory Operations

    We describe the operational capabilities of the Las Cumbres Observatory Global Telescope Network. We summarize our hardware and software for maintaining and monitoring network health. We focus on methodologies that use the automated system to monitor the availability of sites, instruments and telescopes, to monitor performance, to permit automatic recovery, and to provide automatic error reporting. The same jTCS control system is used on telescopes of apertures 0.4m, 0.8m, 1m and 2m, and for multiple instruments on each. We describe our network operational model, including workloads, and illustrate our current tools and operational performance indicators, including telemetry and metrics reporting from on-site reductions. The system was conceived and designed to establish effective, reliable autonomous operations, with automatic monitoring and recovery, minimizing human intervention while maintaining quality. We illustrate how far we have been able to achieve that.
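
    A minimal sketch of the monitor/auto-recover loop described above, assuming hypothetical telescope IDs and stand-in functions; this is not the LCOGT codebase or the jTCS API, only the control-flow idea: poll each telescope's telemetry, attempt automatic recovery on a fault, and escalate to a human only if recovery fails.

```python
import time

TELESCOPES = ["site1-1m0a", "site2-0m4b"]  # hypothetical IDs

def telemetry_ok(tid: str) -> bool:
    """Stand-in for querying a site's telemetry stream."""
    return True  # placeholder: a real check would inspect live metrics

def try_recover(tid: str) -> bool:
    """Stand-in for an automatic recovery action (e.g. a restart)."""
    print(f"[recovery] attempting automatic recovery on {tid}")
    return True

def notify_operator(tid: str) -> None:
    print(f"[alert] {tid} needs human intervention")

def monitor(poll_s: float, cycles: int) -> None:
    for _ in range(cycles):
        for tid in TELESCOPES:
            if not telemetry_ok(tid):
                # try automatic recovery first; page a human on failure
                if not try_recover(tid):
                    notify_operator(tid)
        time.sleep(poll_s)

monitor(poll_s=0.1, cycles=2)
```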

    Rainfall-runoff and other modelling for ungauged/low-benefit locations: Operational Guidelines
