10 research outputs found

    A comparison of two methods for expert elicitation in health technology assessments.

    BACKGROUND: When data needed to inform parameters in decision models are lacking, formal elicitation of expert judgement can be used to characterise parameter uncertainty. Although numerous methods for eliciting expert opinion as probability distributions exist, there is little research to suggest whether one method is more useful than another. This study had three objectives: (i) to obtain subjective probability distributions characterising parameter uncertainty in the context of a health technology assessment; (ii) to compare two elicitation methods by eliciting the same parameters in different ways; (iii) to collect the experts' subjective preferences for the different elicitation methods used.

    METHODS: Twenty-seven clinical experts were invited to participate in an elicitation exercise to inform a published model-based cost-effectiveness analysis of alternative treatments for prostate cancer. Participants were individually asked to express their judgements as probability distributions using two different methods - the histogram and hybrid elicitation methods - presented in a random order. Individual distributions were mathematically aggregated across experts, with and without weighting. The resulting combined distributions were used in the probabilistic analysis of the decision model; mean incremental cost-effectiveness ratios (ICERs) and the expected value of perfect information (EVPI) were calculated for each method and compared with the original cost-effectiveness analysis. Scores on the ease of use of the two methods, and on the extent to which the probability distributions obtained from each method accurately reflected the expert's opinion, were also recorded.

    RESULTS: Six experts completed the task. Mean ICERs from the probabilistic analysis ranged from £162,600 to £175,500 per quality-adjusted life year (QALY), depending on the elicitation and weighting methods used. Compared to having no information, use of expert opinion decreased decision uncertainty: the EVPI at the £30,000 per QALY threshold decreased by 74-86% from the original cost-effectiveness analysis. Experts indicated that the histogram method was easier to use, but perceived the hybrid method as more accurate.

    CONCLUSIONS: Inclusion of expert elicitation can decrease decision uncertainty. Here, the choice of method did not affect the overall cost-effectiveness conclusions, but researchers intending to use expert elicitation need to be aware of the impact different methods could have. This paper presents independent research funded by the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care (CLAHRC) for the South West Peninsula.
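
    As a rough illustration of the two computational steps this abstract describes, the sketch below aggregates experts' elicited histograms with an (optionally weighted) linear opinion pool and then computes a mean ICER from sampled incremental costs and QALYs. All bin values, elicited probabilities, weights and the toy decision model are hypothetical and are not taken from the study.

```python
import numpy as np

# Step 1: combine experts' elicited histograms over shared parameter bins.
# Rows are experts, columns are bins; all values are hypothetical.
bins = np.array([0.1, 0.2, 0.3, 0.4, 0.5])            # bin midpoints for the parameter
elicited = np.array([
    [0.05, 0.20, 0.50, 0.20, 0.05],                   # expert 1
    [0.10, 0.30, 0.40, 0.15, 0.05],                   # expert 2
    [0.00, 0.25, 0.45, 0.25, 0.05],                   # expert 3
])

def linear_pool(probs, weights=None):
    """Weighted average of experts' histograms (linear opinion pool); equal weights by default."""
    w = np.full(len(probs), 1.0 / len(probs)) if weights is None else np.asarray(weights, float)
    w = w / w.sum()
    return w @ probs

pooled = linear_pool(elicited)                         # unweighted aggregation
print("pooled distribution:", pooled, "mean:", pooled @ bins)

# Step 2: probabilistic analysis with a toy decision model. Sample the pooled
# parameter, convert to incremental costs and QALYs, and report the mean ICER.
rng = np.random.default_rng(0)
param = rng.choice(bins, size=10_000, p=pooled)
delta_cost = 20_000 + 5_000 * param + rng.normal(0, 500, param.size)   # hypothetical model
delta_qaly = 0.30 * param + rng.normal(0, 0.01, param.size)            # hypothetical model
icer = delta_cost.mean() / delta_qaly.mean()
print(f"mean ICER: £{icer:,.0f} per QALY")
```

    Performance-based weights could be passed to linear_pool in place of the equal weights to mirror the weighted aggregation case.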

    Economic Crisis Policy Analytics Based on Artificial Intelligence

    Part 4: AI, Data Analytics and Automated Decision Making

    An important trend in the area of digital government is its expansion beyond the support of internal processes and operations, and of transactions and consultations with citizens and firms, which were the main objectives of its first generations, towards the support of higher-level functions of government agencies, with the main emphasis on public policy making. This gives rise to the gradual development of policy analytics. Another important trend in digital government is the increasing exploitation of artificial intelligence techniques by government agencies, mainly for the automation, support and enhancement of operational tasks and lower-level decision making, but only to a very limited extent for the support of higher-level functions, especially policy making. Our paper contributes towards the advancement and the combination of these two trends: it proposes a policy analytics methodology for exploiting existing public and private sector data, using a big-data-oriented artificial intelligence technique, feature selection, in order to support policy making concerning one of the most serious problems that governments face, economic crises. In particular, we present a methodology for exploiting existing data held by taxation authorities and statistical agencies, as well as by private sector business information and consulting firms, in order to identify characteristics of a firm (e.g. its strategic directions, resources, capabilities and practices) and of its external environment (e.g. competition and dynamism) that affect, positively or negatively, its resilience to the crisis with respect to sales revenue. For this purpose an advanced artificial intelligence feature selection algorithm, the Boruta 'all-relevant' variable identification algorithm, is used. Furthermore, an application of the proposed economic crisis policy analytics methodology is presented, which provides a first validation of its usefulness.
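
    The feature-selection step described above could look roughly like the sketch below, which uses the BorutaPy implementation of the Boruta 'all-relevant' algorithm with a scikit-learn random forest. The firm-level dataset, feature names and resilience target (change in sales revenue) are synthetic stand-ins, not the taxation, statistical-agency or business-information data the paper draws on.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from boruta import BorutaPy

rng = np.random.default_rng(42)
n_firms = 500
features = ["export_share", "rnd_intensity", "leverage", "sector_dynamism", "competition"]

# Synthetic firm data; only the first two features actually drive resilience here.
X = rng.normal(size=(n_firms, len(features)))
y = 0.8 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n_firms)

forest = RandomForestRegressor(n_jobs=-1, max_depth=5, random_state=0)
selector = BorutaPy(forest, n_estimators="auto", random_state=0)
selector.fit(X, y)                      # BorutaPy expects numpy arrays, not DataFrames

# Boruta flags each feature as relevant or not by comparing it against shadow features.
for name, keep, rank in zip(features, selector.support_, selector.ranking_):
    print(f"{name:16s} relevant={keep}  rank={rank}")
```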

    Improving reliability of judgmental forecasts

    All judgmental forecasts will be affected by the inherent unreliability, or inconsistency, of the judgment process. Psychologists have studied this problem extensively, but forecasters rarely address it. Researchers and theorists describe two types of unreliability that can reduce the accuracy of judgmental forecasts: (1) unreliability of information acquisition, and (2) unreliability of information processing. Studies indicate that judgments are less reliable when the task is more complex; when the environment is more uncertain; when the acquisition of information relies on perception, pattern recognition, or memory; and when people use intuition instead of analysis. Five principles can improve reliability in judgmental forecasting:
    1. Organize and present information in a form that clearly emphasizes relevant information.
    2. Limit the amount of information used in judgmental forecasting. Use a small number of really important cues.
    3. Use mechanical methods to process information.
    4. Combine several forecasts.
    5. Require justification of forecasts.
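
    Principles 3 and 4 lend themselves to a small illustration: process the judgmental inputs mechanically and combine several forecasts. The individual forecasts below are hypothetical; an equal-weight average is the simplest mechanical combination, and a trimmed mean is a slightly more robust variant when one judge may be an outlier.

```python
import statistics

# Hypothetical point forecasts from four judges for the same quantity.
judgmental_forecasts = {
    "analyst_a": 120.0,
    "analyst_b": 135.0,
    "analyst_c": 110.0,
    "analyst_d": 128.0,
}

# Principle 4: combine several forecasts with an equal-weight average.
combined = statistics.mean(judgmental_forecasts.values())
print(f"combined forecast: {combined:.1f}")

# A trimmed mean drops the highest and lowest forecast before averaging.
values = sorted(judgmental_forecasts.values())
trimmed = statistics.mean(values[1:-1])
print(f"trimmed-mean forecast: {trimmed:.1f}")
```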

    The complexity of prostate cancer: genomic alterations and heterogeneity
