    Integration of Statistical Methods and Judgment for Time Series

    We consider how judgment and statistical methods should be integrated for time-series forecasting. Our review of published empirical research identified 47 studies, all but four published since 1985. Five procedures were identified: revising judgment; combining forecasts; revising extrapolations; rule-based forecasting; and econometric forecasting. This literature suggests that integration generally improves accuracy when the experts have domain knowledge and when significant trends are involved. Integration is valuable to the extent that judgments are used as inputs to the statistical methods, that they contain additional relevant information, and that the integration scheme is well structured. The choice of an integration approach can have a substantial impact on the accuracy of the resulting forecasts. Integration harms accuracy when judgment is biased or its use is unstructured. Equal-weights combining should be regarded as the benchmark, and it is especially appropriate where series have high uncertainty or high instability. When the historical data involve high uncertainty or high instability, we recommend revising judgment, revising extrapolations, or combining. When good domain knowledge is available for the future as well as for the past, we recommend rule-based forecasting or econometric methods.
    Keywords: statistical methods, statistics, time series, forecasting, empirical research
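    The equal-weights combining that this abstract treats as the benchmark is simply an unweighted average of the component forecasts. A minimal sketch (the function name and the example numbers are illustrative, not from the paper):

    ```python
    def combine_equal_weights(forecasts):
        """Equal-weights combining: average the forecasts from several
        methods (e.g. judgmental, extrapolation, econometric) into one."""
        return sum(forecasts) / len(forecasts)

    # Hypothetical one-step forecasts from three different methods:
    combined = combine_equal_weights([120.0, 100.0, 110.0])
    print(combined)  # 110.0
    ```
    
    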

    Using graphical models and multi-attribute utility theory for probabilistic uncertainty handling in large systems, with application to nuclear emergency management

    Although many decision-making problems involve uncertainty, uncertainty handling within large decision support systems (DSSs) is challenging. One domain where uncertainty handling is critical is emergency response management, in particular nuclear emergency response, where decision making takes place in an uncertain, dynamically changing environment. Assimilation and analysis of data can help to reduce these uncertainties, but it is critical to do this in an efficient and defensible way. After briefly introducing the structure of a typical DSS for nuclear emergencies, the paper sets up a theoretical structure that enables a formal Bayesian decision analysis to be performed for environments like this within a DSS architecture. In such probabilistic DSSs, many input conditional probability distributions are provided by different sets of experts overseeing different aspects of the emergency. These probabilities are then used by the decision maker (DM) to find her optimal decision. We demonstrate in this paper that unless due care is taken in such a composite framework, coherence and rationality may be compromised in a sense made explicit below. The technology we describe here builds a framework around which Bayesian data updating can be performed in a modular way, ensuring both coherence and efficiency, and provides sufficient unambiguous information to enable the DM to discover her expected utility maximizing policy.

    Demand Forecasting: Evidence-based Methods

    We looked at evidence from comparative empirical studies to identify methods that can be useful for predicting demand in various situations and to warn against methods that should not be used. In general, use structured methods and avoid intuition, unstructured meetings, focus groups, and data mining. In situations where there are sufficient data, use quantitative methods including extrapolation, quantitative analogies, rule-based forecasting, and causal methods. Otherwise, use methods that structure judgement, including surveys of intentions and expectations, judgmental bootstrapping, structured analogies, and simulated interaction. Managers' domain knowledge should be incorporated into statistical forecasts. Methods for combining forecasts, including Delphi and prediction markets, improve accuracy. We provide guidelines for the effective use of forecasts, including such procedures as scenarios. Few organizations use many of the methods described in this paper. Thus, there are opportunities to improve efficiency by adopting these forecasting practices.
    Keywords: accuracy, expertise, forecasting, judgement, marketing

    Causal Forces: Structuring Knowledge for Time-series Extrapolation

    This paper examines a strategy for structuring one type of domain knowledge for use in extrapolation. It does so by representing information about causality and using this domain knowledge to select and combine forecasts. We use five categories to express causal impacts upon trends: growth, decay, supporting, opposing, and regressing. An identification of causal forces aided in the determination of weights for combining extrapolation forecasts. These weights improved average ex ante forecast accuracy when tested on 104 annual economic and demographic time series. Gains in accuracy were greatest when (1) the causal forces were clearly specified and (2) stronger causal effects were expected, as in longer-range forecasts. One rule suggested by this analysis was: “Do not extrapolate trends if they are contrary to causal forces.” We tested this rule by comparing forecasts from a method that implicitly assumes supporting trends (Holt’s exponential smoothing) with forecasts from the random walk. Use of the rule improved accuracy for 20 series where the trends were contrary; the MdAPE (Median Absolute Percentage Error) was 18% less for the random walk on 20 one-year-ahead forecasts and 40% less for 20 six-year-ahead forecasts. We then applied the rule to four other data sets. Here, the MdAPE for the random walk forecasts was 17% less than Holt’s error for 943 short-range forecasts and 43% less for 723 long-range forecasts. Our study suggests that the causal assumptions implicit in traditional extrapolation methods are inappropriate for many applications.
    Keywords: causal forces, combining, contrary trends, damped trends, exponential smoothing, judgment, rule-based forecasting, selecting methods
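    The comparison in this abstract rests on three pieces that are easy to state concretely: a random-walk forecast (repeat the last observation), Holt's linear exponential smoothing (extrapolate a smoothed level plus trend), and the MdAPE accuracy measure. A minimal sketch of all three; the function names, the smoothing parameters, and the textbook recursion used here are assumptions for illustration, not the exact specification from the paper:

    ```python
    import statistics

    def mdape(actuals, forecasts):
        """Median Absolute Percentage Error, in percent."""
        return statistics.median(
            abs((a - f) / a) * 100 for a, f in zip(actuals, forecasts)
        )

    def random_walk_forecast(series, horizon):
        """Random walk: every future period is forecast at the last value."""
        return [series[-1]] * horizon

    def holt_forecast(series, horizon, alpha=0.5, beta=0.5):
        """Holt's linear exponential smoothing (textbook form): smooth a
        level and a trend, then extrapolate level + h * trend."""
        level = series[0]
        trend = series[1] - series[0]
        for y in series[1:]:
            prev_level = level
            level = alpha * y + (1 - alpha) * (level + trend)
            trend = beta * (level - prev_level) + (1 - beta) * trend
        return [level + (h + 1) * trend for h in range(horizon)]
    ```

    On a series with a steady trend, Holt's method extrapolates that trend (`holt_forecast([1, 2, 3], 2)` gives `[4.0, 5.0]`), while the random walk stays flat at `[3, 3]`; the paper's rule says to prefer the flat forecast when the causal forces oppose the observed trend.
    
    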

    Research on Forecasting: A Quarter-Century Review, 1960-1984

    Before 1960, little empirical research was done on forecasting methods. Since then, the literature has grown rapidly, especially in the area of judgmental forecasting. This research supports and adds to the forecasting guidelines proposed before 1960, such as the value of combining forecasts. New findings have led to significant gains in our ability to forecast and to help people to use forecasts. What have we learned about forecasting over the past quarter century? Does recent research provide guidance for making more accurate forecasts, obtaining better assessments of uncertainty, or gaining acceptance of our forecasts? I will first describe forecasting principles that were believed to be the most advanced in 1960. Following that, I will examine the evidence produced since 1960.
    Keywords: forecasting, forecasting research

    Rule Based Forecasting [RBF] - Improving Efficacy of Judgmental Forecasts Using Simplified Expert Rules

    Rule-based Forecasting (RBF) has emerged as an effective forecasting model compared to well-accepted benchmarks. However, the original RBF model, introduced in 1992, incorporates 99 production rules and is, therefore, difficult to apply judgmentally. In this research study, we present a core rule-set from RBF that can be used to inform both judgmental forecasting practice and pedagogy. The simplified rule-set, called coreRBF, is validated by asking forecasters to judgmentally apply the rules to time series forecasting tasks. Results demonstrate that forecasting accuracy from judgmental use of coreRBF is not statistically different from that reported from similar applications of RBF. Further, we benchmarked these coreRBF forecasts against forecasts from (a) untrained forecasters, (b) an expert system based on RBF, and (c) the original 1992 RBF study. Forecast accuracies were in the hypothesized direction, arguing for the generalizability and validity of the coreRBF rules.