
    The cost-effectiveness of an early interventional strategy in non-ST-elevation acute coronary syndrome based on the RITA 3 trial

    Background: Evidence suggests that an early interventional strategy for patients with non-ST-elevation acute coronary syndrome (NSTE-ACS) can improve health outcomes but also increase costs when compared with a conservative strategy. Objective: The aim of this study was to assess the cost-effectiveness of an early interventional strategy in different risk groups from a UK health-service perspective. Design: Decision-analytic model based on randomised clinical trial data. Main outcome measures: Costs in UK Sterling at 2003/2004 prices and quality-adjusted life years (QALYs) combined into an incremental cost-effectiveness ratio. Methods: Data from the third Randomised Intervention Trial of unstable Angina (RITA 3) were employed to estimate rates of cardiovascular death and myocardial infarction, costs and health-related quality of life. Cost-effectiveness was estimated over patients' lifetimes within the decision-analytic model. Results: The mean incremental cost per QALY gained for an early interventional strategy was approximately £55,000, £22,000 and £12,000 for patients at low, intermediate and high risk, respectively. The early interventional strategy is approximately 1%, 35% and 95% likely to be cost-effective for patients at low, intermediate and high risk, respectively, at a threshold of £20,000 per QALY. The cost-effectiveness of early intervention in low-risk patients is sensitive to assumptions about the duration of the treatment effect. Conclusion: An early interventional strategy in patients presenting with NSTE-ACS is likely to be considered cost-effective for patients at high and intermediate risk, but this is less likely to be the case for patients at low risk.
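The incremental cost-effectiveness ratio reported above has a simple definition: the extra cost of the new strategy divided by the extra QALYs it yields. A minimal sketch, using hypothetical illustrative numbers rather than RITA 3 trial data:

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: delta-cost / delta-QALY."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical patient group: early intervention costs £3,000 more
# per patient and yields 0.25 extra QALYs.
ratio = icer(cost_new=10_000, cost_old=7_000, qaly_new=8.00, qaly_old=7.75)
print(round(ratio))  # 12000 -- would fall under a £20,000/QALY threshold
```

A strategy is deemed cost-effective when this ratio falls below the decision-maker's willingness-to-pay threshold (£20,000 per QALY in the abstract above).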

    Socially Optimal Mining Pools

    Mining for Bitcoins is a high-risk, high-reward activity. Miners, seeking to reduce their variance and earn steadier rewards, collaborate in pooling strategies where they jointly mine for Bitcoins. Whenever some pool participant is successful, the earned rewards are split appropriately among all pool participants. Currently, a dozen different pooling strategies (i.e., methods for distributing the rewards) are in use for Bitcoin mining. Here we propose a formal model of utility and social welfare for Bitcoin mining (and analogous mining systems) based on the theory of discounted expected utility, and then study pooling strategies that maximize the social welfare of miners. Our main result shows that one of the pooling strategies actually employed in practice, the so-called geometric pay pool, achieves the optimal steady-state utility for miners when its parameters are set appropriately. Our results apply not only to Bitcoin mining pools but to any other form of pooled mining or crowdsourced computation in which the participants engage in repeated random trials towards a common goal and in which "partial" solutions can be efficiently verified.
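The variance-reduction motivation in the opening sentences can be made concrete with a toy calculation. This is a sketch under simplifying assumptions, not the paper's model: n equal-power miners, a per-trial success probability p each, a block reward R, and a pool that wins with probability n·p and splits the reward evenly.

```python
# Per-trial reward variance: solo miner vs. an n-miner equal-split pool.
# p = per-trial success probability of one miner, R = block reward.

def solo_var(p, R):
    # Bernoulli reward R with probability p: Var = R^2 * p * (1 - p)
    return R**2 * p * (1 - p)

def pool_var(p, R, n):
    # Pool of n equal miners wins with probability n*p (assumes n*p <= 1)
    # and each member receives R/n: Var = (R/n)^2 * (n*p) * (1 - n*p)
    return (R / n) ** 2 * (n * p) * (1 - n * p)

p, R, n = 1e-6, 6.25, 1000
print(solo_var(p, R))   # ~3.9e-05
print(pool_var(p, R, n))  # ~3.9e-08: same expected reward, ~n-fold less variance
```

The expected per-trial reward is p·R in both cases; pooling only reshapes the distribution, which is why risk-averse (discounted-utility) miners prefer it.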

    Validation of a contemporary prostate cancer grading system using prostate cancer death as outcome

    BACKGROUND: Gleason scoring (GS) has major deficiencies, and a novel system of five grade groups (GS⩽6; 3+4; 4+3; 8; ⩾9) has recently been agreed and included in the WHO 2016 classification. Although verified in radical prostatectomies using PSA relapse as the outcome, it has not been validated using prostate cancer death as an outcome in biopsy series. There is debate over whether an ‘overall' or ‘worst' GS should be used in biopsy series. METHODS: Nine hundred and eighty-eight prostate cancer biopsy cases were identified between 1990 and 2003, and treated conservatively. A diagnosis and grade were assigned to each core, as well as an overall grade. Follow-up for prostate cancer death was until 31 December 2012. A log-rank test assessed univariable differences between the five grade groups based on the overall and worst grade seen, and univariable and multivariable Cox proportional hazards regression was used to quantify differences in outcome. RESULTS: Both ‘worst' and ‘overall' GS yielded highly significant results on univariate and multivariate analysis, with overall GS slightly but insignificantly outperforming worst GS. There was a strong correlation between the five grade groups and prostate cancer death. CONCLUSIONS: This is the largest conservatively treated prostate cancer cohort with long-term follow-up and contemporary assessment of grade. It validates the formation of five grade groups and suggests that the ‘worst' grade is a valid prognostic measure.

    HTA – algorithm or process? Comment on ‘Expanded HTA: enhancing fairness and legitimacy’

    Daniels, Porteny and Urrutia et al make a good case for the idea that public decisions ought to be made not only “in the light of” evidence but also “on the basis of” budget impact, financial protection and equity. Health technology assessment (HTA) should, they say, accordingly be expanded to consider matters additional to safety and cost-effectiveness. They also complain that most HTA reports fail to develop ethical arguments and generally do not even mention ethical issues. This comment argues that some of these defects are more apparent than real and are not inherent in HTA, as distinct from being common characteristics of poorly conducted HTAs. More generally, HTA does not need “extension” since (1) ethical issues are already embedded in HTA processes, not least in their scoping phases, and (2) HTA processes are already sufficiently flexible to accommodate evidence about a wide range of factors, and will not need fundamental change in order to accommodate the new forms of decision-relevant evidence about distributional impact and financial protection that are now starting to emerge. HTA and related techniques are there to support decision-makers who have the authority to make decisions. Analysts like us are there to support and advise them (and not to assume the responsibilities for which they, and not we, are accountable). The required quality in HTA then becomes its effectiveness as a means of addressing the issues of concern to decision-makers. What is also required is adherence by competent analysts to a standard template of good analytical practice. The competencies include not merely those of the usual disciplines (particularly biostatistics, cognitive psychology, health economics, epidemiology, and ethics) but also the imaginative and interpersonal skills for exploring the “real” question behind the decision-maker's brief (actual or postulated) and eliciting the social values that necessarily pervade the entire analysis. The product of such exploration defines the authoritative scope of an HTA.

    A Universal Model of Global Civil Unrest

    Civil unrest is a powerful form of collective human dynamics, which has led to major transitions of societies in modern history. The study of collective human dynamics, including collective aggression, has been the focus of much discussion in the context of modeling and identification of universal patterns of behavior. In contrast, the possibility that civil unrest activities, across countries and over long time periods, are governed by universal mechanisms has not been explored. Here, we analyze records of civil unrest in 170 countries during the period 1919-2008. We demonstrate that the distributions of the number of unrest events per year are robustly reproduced by a nonlinear, spatially extended dynamical model, which reflects the spread of civil disorder between geographic regions connected through social and communication networks. The results also expose the similarity between global social instability and the dynamics of natural hazards and epidemics.

    Volume-based referral for cardiovascular procedures in the United States: a cross-sectional regression analysis

    BACKGROUND: We sought to estimate the number of patients affected and deaths avoided by adopting the Leapfrog Group's recommended hospital procedure volume minimums for coronary artery bypass graft (CABG) surgery and percutaneous coronary intervention (PCI). In addition to hospital risk-adjusted mortality standards, the Leapfrog Group recommends annual hospital procedure minimums of 450 for CABG and 400 for PCI to reduce procedure-associated mortality. METHODS: We conducted a retrospective analysis of a national hospital discharge database to evaluate in-hospital mortality among patients who underwent PCI (n = 2,500,796) or CABG (n = 1,496,937) between 1998 and 2001. We calculated the number of patients treated at low volume hospitals and simulated the number of deaths potentially averted by moving all patients to high volume hospitals under best-case conditions (i.e., assuming the full volume-associated reduction in mortality and the capacity to move all patients to high volume hospitals with no related harms). RESULTS: Multivariate adjusted odds of in-hospital mortality were higher for patients treated in low volume hospitals compared with high volume hospitals for CABG (OR 1.16, 95% CI 1.10–1.24) and PCI (OR 1.12, 95% CI 1.05–1.20). A policy of hospital volume minimums would have required moving 143,687 patients for CABG and 87,661 patients for PCI from low volume to high volume hospitals annually, and would have prevented an estimated 619 CABG deaths and 109 PCI deaths. Thus, preventing a single death would have required moving 232 CABG patients or 805 PCI patients from low volume to high volume hospitals. CONCLUSION: Recommended hospital CABG and PCI volume minimums would prevent 728 deaths annually in the United States, fewer than previously estimated. It is unclear whether a policy requiring the movement of large numbers of patients to avoid relatively few deaths is feasible or effective.
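The headline numbers above follow from simple division; a back-of-envelope check, using the patient counts and deaths-averted estimates taken directly from the abstract:

```python
# Patients moved per year and estimated deaths averted, per the abstract.
moved     = {"CABG": 143_687, "PCI": 87_661}
prevented = {"CABG": 619,     "PCI": 109}

for proc in moved:
    ratio = moved[proc] / prevented[proc]
    print(f"{proc}: {ratio:.1f} patients moved per death prevented")

print("total deaths averted:", sum(prevented.values()))  # 728
```

The ratios come out near 232 (CABG) and 804 (PCI); the abstract rounds the PCI figure up to 805, since a whole additional patient must be moved to prevent the marginal death.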

    Uncovering the Oppenheimer Siddur: using scientific analysis to reveal the production process of a medieval illuminated Hebrew manuscript

    The aim of this research was to use non-invasive scientific analysis to uncover evidence of the planning process, and of the relationship between pigments used in text copying and artwork production, in the Oppenheimer Siddur (Oxford Bodleian Library MS Opp. 776), an illuminated 15th-century Hebrew prayer book. In many medieval Hebrew illuminated manuscripts, the authorship of the artwork is unknown. This manuscript's colophon states that it was copied by its scribe-owner for personal family use but does not confirm who was responsible for the artwork. Prior deductive analysis suggested that the scribe-owner may also have been the manuscript's artist, based on common motifs and an apparent shared colour palette appearing in both texts and artwork. Visual examination using high-resolution digital images also identified points of contact between pigments used in the manuscript's texts and artwork, raising questions about the pigment application sequence and about concurrent versus sequential text copying and artwork production. An in-house developed remote spectral imaging system (PRISMS) with 10 filters spanning the spectral range from 400 to 880 nm was modified for close-range application to image two of the folios, in order to examine the sequence of production, identify the pigments and compare the materials used for the illumination and the text. Optical microscopy and Fourier transform infrared spectroscopy in attenuated total reflection mode (FTIR-ATR) were used directly on the folios to complement the spectral imaging data in binding media and pigment identification. The results revealed close matches in reflectance spectra for the colorants and inks used in both text copying and illuminations, suggesting that the same mixtures of colorants and inks were used. The spectral imaging in the near-infrared bands revealed a hidden underdrawing, indicating a design change during production of the manuscript, and the outlining of letters before coloured pigment was applied.
    The pigment use, the variation in binder between different pigments and some elements of the manuscript's production were found to be consistent with those described in historical sources. The evidence from this study supports the hypothesis that the scribe applied pigments for the manuscript's artwork at the same time as he did some of the scribal work, which has implications for understandings of Jewish medieval visual cultures.

    "Meaning" as a sociological concept: A review of the modeling, mapping, and simulation of the communication of knowledge and meaning

    The development of discursive knowledge presumes the communication of meaning as analytically different from the communication of information. Knowledge can then be considered as a meaning which makes a difference. Whereas the communication of information is studied in the information sciences and scientometrics, the communication of meaning has been central to Luhmann's attempts to make the theory of autopoiesis relevant for sociology. Analytical techniques such as semantic maps and the simulation of anticipatory systems enable us to operationalize the distinctions which Luhmann proposed as relevant to the elaboration of Husserl's "horizons of meaning" in empirical research: interactions among communications, the organization of meaning in instantiations, and the self-organization of interhuman communication in terms of symbolically generalized media such as truth, love, and power. Horizons of meaning, however, remain uncertain orders of expectations, and one should caution against reification from the meta-biological perspective of systems theory.

    Eco-intelligent factories: Timescales for environmental decision support

    Manufacturing decisions are currently made based on considerations of cost, time and quality. However, there is increasing pressure to also routinely incorporate environmental considerations into decision-making processes. Despite the existence of a number of tools for environmental analysis of manufacturing activities, there does not appear to be a structured approach for generating relevant environmental information that can be fed into manufacturing decision making. This research proposes an overarching structure that leads to three approaches, pertaining to different timescales, that enable the generation of environmental information suitable for consideration during decision making. The approaches are demonstrated through three industrial case studies.

    Cross-protection against European swine influenza viruses in the context of infection immunity against the 2009 pandemic H1N1 virus : studies in the pig model of influenza

    Pigs are natural hosts for the same influenza virus subtypes as humans and are a valuable model for cross-protection studies with influenza. In this study, we have used the pig model to examine the extent of virological protection between a) the 2009 pandemic H1N1 (pH1N1) virus and three different European H1 swine influenza virus (SIV) lineages, and b) these H1 viruses and a European H3N2 SIV. Pigs were inoculated intranasally with representative strains of each virus lineage, with 6- and 17-week intervals between H1 inoculations and between H1 and H3 inoculations, respectively. Virus titers in nasal swabs and/or tissues of the respiratory tract were determined after each inoculation. There was substantial though differing cross-protection between pH1N1 and the other H1 viruses, which was directly correlated with the relatedness of the viral hemagglutinin (HA) and neuraminidase (NA) proteins. Cross-protection against H3N2 was almost complete in pigs with immunity against H1N2, but was weak in H1N1/pH1N1-immune pigs. In conclusion, infection with a live, wild-type influenza virus may offer substantial cross-lineage protection against viruses of the same HA and/or NA subtype. True heterosubtypic protection, in contrast, appears to be minimal in natural influenza virus hosts. We discuss our findings in the light of the zoonotic and pandemic risks of SIVs.