
    Simulation sample sizes for Monte Carlo partial EVPI calculations

    Partial expected value of perfect information (EVPI) quantifies the value of removing uncertainty about unknown parameters in a decision model. EVPIs can be computed via Monte Carlo methods: an outer loop samples values of the parameters of interest, and an inner loop samples the remaining parameters from their conditional distribution. This nested Monte Carlo approach can yield biased estimates when small numbers of inner samples are used, and it can require a large number of model runs for accurate partial EVPI estimates. We present a simple algorithm to estimate the EVPI bias and confidence interval width for a specified number of inner and outer samples. The algorithm uses a relatively small number of model runs (we suggest approximately 600), is quick to compute, and can help determine how many outer and inner iterations are needed for a desired level of accuracy. We test our algorithm using three case studies.
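
    The two-loop estimator described above is straightforward to sketch. The following is a minimal illustration only, not the authors' algorithm: the net_benefit function, the normal priors, and the assumption that the remaining parameter is independent of the parameter of interest (so its conditional distribution equals its marginal) are all hypothetical stand-ins for a real decision model.

        import numpy as np

        rng = np.random.default_rng(1)

        def net_benefit(d, phi, psi):
            # Hypothetical two-decision model; a real application would
            # plug in its own net-benefit function here.
            return phi + psi if d == 1 else 1.0

        def partial_evpi(n_outer, n_inner):
            decisions = (0, 1)
            outer_vals = np.empty(n_outer)
            for i in range(n_outer):
                # Outer loop: sample the parameter of interest.
                phi = rng.normal(1.0, 1.0)
                # Inner loop: sample the remaining parameter conditional on
                # phi (independent here, so we draw from the marginal).
                psi = rng.normal(0.0, 1.0, size=n_inner)
                inner_means = [np.mean([net_benefit(d, phi, p) for p in psi])
                               for d in decisions]
                # Value of the best decision once phi is known.
                outer_vals[i] = max(inner_means)
            # Expected net benefit of the best decision under current
            # information, estimated from a large single-loop sample.
            phis = rng.normal(1.0, 1.0, size=100_000)
            psis = rng.normal(0.0, 1.0, size=100_000)
            baseline = max(np.mean([net_benefit(d, f, p)
                                    for f, p in zip(phis, psis)])
                           for d in decisions)
            return outer_vals.mean() - baseline

        print(partial_evpi(n_outer=1000, n_inner=100))

    With few inner samples, the maximization inside the outer loop inflates the estimate; that inflation is the bias the paper's diagnostic algorithm is designed to estimate before committing to a full run.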

    Calculating partial expected value of perfect information via Monte Carlo sampling algorithms

    Partial expected value of perfect information (EVPI) calculations can quantify the value of learning about particular subsets of uncertain parameters in decision models. Published case studies have used different computational approaches. This article examines the computation of partial EVPI estimates via Monte Carlo sampling algorithms. The mathematical definition involves 2 nested expectations, which must be evaluated separately because a maximization sits between them. A generalized Monte Carlo sampling algorithm uses nested simulation, with an outer loop to sample the parameters of interest and, conditional upon these, an inner loop to sample the remaining uncertain parameters. Alternative computation methods and shortcut algorithms are discussed, and the mathematical conditions for their use are considered. Maxima of Monte Carlo estimates of expectations are biased upward, and the authors show that the use of small samples results in biased EVPI estimates. Three case studies illustrate 1) the bias due to maximization, and the inaccuracy of shortcut algorithms 2) when correlated parameters are present and 3) when net benefit functions are nonlinear. Even relatively small correlation or nonlinearity can make the shortcut algorithm substantially inaccurate. Empirical investigation of the number of Monte Carlo samples suggests that fewer samples on the outer level and more on the inner level can be efficient, and that relatively small numbers of samples can sometimes suffice. Several remaining areas for methodological development are set out. A wider application of partial EVPI is recommended, both for greater understanding of decision uncertainty and for analyzing research priorities.
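
    The upward bias of a maximum of Monte Carlo estimates is easy to reproduce in a toy simulation. In the sketch below (illustrative numbers only, not one of the paper's case studies), two decisions have the same true expected net benefit, so the true maximum is zero; the maximum of the two sample means is nevertheless positive on average, and the bias shrinks as the inner sample size grows.

        import numpy as np

        rng = np.random.default_rng(7)

        # Two decisions with equal true expected net benefit of 0.0, so
        # the maximum of the true expectations is also 0.0.
        for n_inner in (10, 100, 1_000, 10_000):
            max_of_means = [max(rng.normal(0.0, 1.0, n_inner).mean(),
                                rng.normal(0.0, 1.0, n_inner).mean())
                            for _ in range(2_000)]
            # Any positive average here is pure maximization bias.
            print(f"n_inner={n_inner:6d}  bias ~ {np.mean(max_of_means):.4f}")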

    Mapping randomized controlled trials of treatments for eczema - The GREAT database (The Global Resource of Eczema Trials: a collection of key data on randomized controlled trials of treatments for eczema from 2000 to 2010)

    Background: Massive duplication of effort occurs when researchers all over the world undertake extensive searches for randomized controlled trials when preparing systematic reviews, developing evidence-based guidelines, and applying for research funding for eczema treatments. Such duplication wastes valuable resources. Searching for randomized controlled trials of eczema is a laborious task involving scrutiny of thousands of individual references from diverse electronic databases in order to obtain a few papers of interest. Clinicians and patients who wish to find out more about a particular treatment risk missing the relevant evidence if they are not trained in electronic bibliographic searching. Systematic reviews cannot be relied upon to comprehensively inform current optimal eczema treatments, owing to incomplete coverage and because many may be out of date. An international, publicly available and comprehensive resource that brings together all randomized controlled trials on eczema treatment using a highly sensitive search has the potential to release more filtered knowledge about patient care to those who need it most and to significantly shorten the duration and cost of many clinical eczema research and guideline projects.

    Description: The Global Resource of Eczema Trials brings together information on all randomized controlled trials of eczema treatments published from the beginning of 2000 to the end of 2010 and will be updated every month. We searched the Cochrane Central Register of Controlled Trials in The Cochrane Library and the Cochrane Skin Group Specialised Register, MEDLINE, EMBASE, LILACS, AMED and CINAHL databases. We included 268 RCTs (as of 24th March 2011) covering over 70 different treatment interventions. The structure of the Global Resource of Eczema Trials allows users as much, or as little, specificity as they wish when retrieving information on trials, in an easy-to-use format. For each trial, the database gives the citation for the published report and provides enough information to enable a user to decide whether the trial is worth further scrutiny.

    Conclusions: The Global Resource of Eczema Trials has been created to facilitate knowledge mobilization into healthcare and to reduce the waste of research time through unnecessary duplication. The collective time saved by research groups around the world can now be used to make strides in optimising the treatment of eczema, to further benefit people with eczema. The database can be accessed free of charge at http://www.greatdatabase.org.uk

    Prediction of Dengue Disease Severity among Pediatric Thai Patients Using Early Clinical Laboratory Indicators

    Patients with severe dengue illness typically develop complications in the later stages of illness, making early clinical management of all patients with suspected dengue infection difficult. An early prediction tool to identify which patients will develop severe dengue illness would improve the use of limited hospital resources in dengue-endemic regions. We performed classification and regression tree (CART) analysis to establish predictive algorithms for severe dengue illness. Using a Thai hospital pediatric cohort of patients presenting within the first 72 hours of a suspected dengue illness, we developed diagnostic decision algorithms from simple clinical laboratory data obtained on the day of presentation. These algorithms correctly classified nearly 100% of patients who developed severe dengue illness while excluding upwards of 50% of patients with mild dengue or other febrile illnesses. Our algorithms use white blood cell counts, percent white blood cell differentials, platelet counts, elevated aspartate aminotransferase, hematocrit, and age. If these algorithms can be validated in other regions and age groups, they will help in the clinical management of patients with suspected dengue illness who present within the first three days of fever onset.
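
    A CART-style rule of this kind can be reproduced with off-the-shelf tools. The sketch below assumes scikit-learn; the data are synthetic and the feature names merely mirror the indicators listed in the abstract, so it shows the shape of the approach rather than the study's actual algorithm.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier, export_text

        rng = np.random.default_rng(0)

        # Synthetic stand-in data: values and labels are random, for API
        # illustration only; they encode no clinical knowledge.
        features = ["wbc", "pct_lymphocytes", "platelets", "ast",
                    "hematocrit", "age"]
        X = rng.normal(size=(200, len(features)))
        y = rng.integers(0, 2, size=200)  # 1 = severe dengue, 0 = mild/other

        # A shallow tree keeps the resulting decision rules readable,
        # which is the point of a bedside triage algorithm.
        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
        print(export_text(tree, feature_names=features))

    In practice the tree would be tuned to favor sensitivity for severe cases over specificity, matching the near-100% detection rate the authors report.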

    Toward a 21st-century health care system: Recommendations for health care reform

    The coverage, cost, and quality problems of the U.S. health care system are evident. Sustainable health care reform must go beyond financing expanded access to care to substantially changing the organization and delivery of care. The FRESH-Thinking Project (www.fresh-thinking.org) held a series of workshops during which physicians, health policy experts, health insurance executives, business leaders, hospital administrators, economists, and others who represent diverse perspectives came together. This group agreed that the following 8 recommendations are fundamental to successful reform:

    1. Replace the current fee-for-service payment system with a payment system that encourages and rewards innovation in the efficient delivery of quality care. The new payment system should invest in the development of outcome measures to guide payment.

    2. Establish a securely funded, independent agency to sponsor and evaluate research on the comparative effectiveness of drugs, devices, and other medical interventions.

    3. Simplify and rationalize federal and state laws and regulations to facilitate organizational innovation, support care coordination, and streamline financial and administrative functions.

    4. Develop a health information technology infrastructure with national standards of interoperability to promote data exchange.

    5. Create a national health database with the participation of all payers, delivery systems, and others who own health care data. Agree on methods to make de-identified information from this database on clinical interventions, patient outcomes, and costs available to researchers.

    6. Identify revenue sources, including a cap on the tax exclusion of employer-based health insurance, to subsidize health care coverage with the goal of insuring all Americans.

    7. Create state or regional insurance exchanges to pool risk, so that Americans without access to employer-based or other group insurance could obtain a standard benefits package through these exchanges. Employers should also be allowed to participate in these exchanges for their employees' coverage.

    8. Create a health coverage board with broad stakeholder representation to determine and periodically update the affordable standard benefit package available through state or regional insurance exchanges.

    Burden of musculoskeletal disorders in the Eastern Mediterranean Region, 1990–2013: findings from the Global Burden of Disease Study 2013

    Moradi-Lakeh M, Forouzanfar MH, Vollset SE, et al. Burden of musculoskeletal disorders in the Eastern Mediterranean Region, 1990–2013: findings from the Global Burden of Disease Study 2013. Annals of the Rheumatic Diseases. 2017;76(8):annrheumdis-2016-210146