1,913 research outputs found

    Stochastic optimization and worst-case analysis in monetary policy design

    In this paper, we examine the cost of insurance against model uncertainty for the euro area, considering four alternative reference models, all of which are used for policy analysis at the ECB. We find that maximal insurance across this model range in terms of a minimax policy comes at moderate cost in terms of lower expected performance. We extract priors that would rationalize the minimax policy from a Bayesian perspective. These priors indicate that full insurance is strongly oriented towards the model with the highest baseline losses. Furthermore, this policy is not as tolerant towards small perturbations of policy parameters as the Bayesian policy rule. We propose to strike a compromise and use preferences for policy design that allow for intermediate degrees of ambiguity aversion. These preferences allow the specification of priors but also give extra weight to the worst uncertain outcomes in a given context. JEL Classification: E52, E58, E6
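
    The tradeoff described here, full insurance via a minimax rule versus expected performance under priors, can be illustrated with a toy computation. Below is a minimal Python sketch; the two candidate rules, the four-model loss table and the flat prior are all invented for illustration and are not taken from the paper.

        import numpy as np

        # Hypothetical losses of two candidate policy rules under four
        # reference models (numbers invented for illustration).
        losses = {
            "rule_A": np.array([1.0, 1.2, 1.1, 3.0]),  # good on average, bad worst case
            "rule_B": np.array([1.6, 1.7, 1.6, 1.8]),  # insured against the worst model
        }
        prior = np.array([0.25, 0.25, 0.25, 0.25])     # flat prior over the four models

        for name, L in losses.items():
            print(name, "expected loss:", prior @ L, "worst-case loss:", L.max())

        # The Bayesian rule minimizes prior @ L; the minimax rule minimizes L.max().
        # The gap in expected loss between the two picks is the 'cost of insurance'.
        bayes = min(losses, key=lambda k: prior @ losses[k])
        robust = min(losses, key=lambda k: losses[k].max())
        print("Bayesian pick:", bayes, "| minimax pick:", robust)

    Here rule_A wins on expected loss (1.575 against 1.675) while rule_B wins on worst-case loss (1.8 against 3.0); a prior loaded onto the fourth model would rationalize the minimax pick, mirroring the extraction of priors described above.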

    Prioritizing Invasive Species Threats Under Uncertainty

    Prioritizing exotic or invasive pest threats in terms of agricultural, environmental, or human health damages is an important resource allocation issue for programs charged with preventing or responding to the entry of such organisms. Under extreme uncertainty, program managers may decide to research the severity of threats, develop prevention or control actions, and estimate cost-effectiveness in order to provide better information and more options when choosing strategies for specific pests. We examine decision rules based on the minimax and relative cost criteria in order to express a cautious approach to decisions regarding severe, irreversible consequences, discuss the strengths and weaknesses of these rules, examine the roles of simple rules and sophisticated analyses in decision making, and apply a simple rule to develop a list of priority plant pests. Keywords: invasive species, decision criteria, uncertainty, Resource/Energy Economics and Policy
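
    A minimal numerical sketch of the two decision rules named above, with the 'relative cost' criterion interpreted as minimax regret (which may differ from the authors' exact definition); the pest-scenario damage table is invented for illustration.

        import numpy as np

        # Rows: candidate actions; columns: uncertain pest-severity scenarios.
        # Damages in arbitrary cost units (invented for illustration).
        damage = np.array([
            [10.0, 40.0, 200.0],   # do nothing
            [30.0, 35.0,  60.0],   # targeted prevention
            [50.0, 50.0,  55.0],   # broad quarantine
        ])
        actions = ["do nothing", "targeted prevention", "broad quarantine"]

        # Minimax: choose the action whose worst-case damage is smallest.
        minimax_choice = actions[damage.max(axis=1).argmin()]

        # Relative cost (minimax regret): compare each action with the best
        # achievable damage in each scenario, then minimize the worst shortfall.
        regret = damage - damage.min(axis=0)
        regret_choice = actions[regret.max(axis=1).argmin()]

        print("minimax choice:", minimax_choice)          # broad quarantine
        print("minimax-regret choice:", regret_choice)    # targeted prevention

    The two cautious rules disagree even on this tiny table, which is the kind of strength-and-weakness comparison the abstract discusses.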

    Stochastic Optimization and Worst-Case Analysis in Monetary Policy Design

    In this paper we compare expected loss minimization to worst-case or minimax analysis in the design of simple Taylor-style rules for monetary policy, using a small model estimated for the euro area by Orphanides and Wieland (2000). We find that rules optimized under a minimax objective in the presence of general parameter and shock uncertainty do not imply extreme policy activism. Such rules tend to obey the Brainard principle of cautionary policymaking in much the same way as rules derived by expected loss minimization. Rules derived by means of minimax analysis are effective insurance policies limiting maximum loss over ranges of parameter values to be set by the policymaker. In practice, we propose to set these ranges with an eye towards the cost of such insurance cover in terms of the implied increase in expected inflation variability. Keywords: worst-case analysis, robust control, minimax, monetary policy rules, euro area
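
    The minimax construction described here, choosing a rule coefficient to limit maximum loss over a policymaker-set parameter range, can be sketched numerically. The quadratic loss and the range for the transmission parameter below are stylized stand-ins, not the Orphanides-Wieland (2000) model.

        import numpy as np

        # Stylized loss for a Taylor-style response coefficient r when the
        # transmission parameter b is uncertain: stabilization miss plus a
        # small penalty on activism. Purely illustrative.
        def loss(r, b):
            return (1.0 - b * r) ** 2 + 0.1 * r ** 2

        r_grid = np.linspace(0.0, 3.0, 301)     # candidate rule coefficients
        b_range = np.linspace(0.4, 1.2, 81)     # policymaker-chosen uncertainty range

        worst = np.array([max(loss(r, b) for b in b_range) for r in r_grid])
        mean = np.array([np.mean(loss(r, b_range)) for r in r_grid])

        print(f"minimax coefficient: {r_grid[worst.argmin()]:.2f}")
        print(f"expected-loss coefficient: {r_grid[mean.argmin()]:.2f}")
        # Both coefficients stay moderate: worst-case reasoning over a
        # two-sided parameter range does not force extreme activism here.

    Widening b_range raises the insured worst-case loss, which is the 'cost of such insurance cover' the abstract proposes to monitor when setting the ranges.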

    Robustness and macroeconomic policy

    This paper considers the design of macroeconomic policies in the face of uncertainty. In recent years, several economists have advocated that when policymakers are uncertain about the environment they face and find it difficult to assign precise probabilities to the alternative scenarios that may characterize this environment, they should design policies to be robust in the sense that they minimize the worst-case loss these policies could ever impose. I review and evaluate the objections cited by critics of this approach. I further argue that, contrary to what some have inferred, concern about worst-case scenarios does not always lead to policies that respond more aggressively to incoming news than the optimal policy would absent any uncertainty. Keywords: Macroeconomics - Econometric models

    Approximate Models and Robust Decisions

    Decisions based partly or solely on predictions from probabilistic models may be sensitive to model misspecification. Statisticians are taught from an early stage that "all models are wrong", but little formal guidance exists on how to assess the impact of model approximation on decision making, or how to proceed when optimal actions appear sensitive to model fidelity. This article presents an overview of recent developments across different disciplines to address this. We review diagnostic techniques, including graphical approaches and summary statistics, that help highlight decisions made through minimised expected loss that are sensitive to model misspecification. We then consider formal methods for decision making under model misspecification that quantify the stability of optimal actions under perturbations to the model within a neighbourhood of model space. This neighbourhood is defined in one of two ways: in a strong sense, via an information (Kullback-Leibler) divergence around the approximating model; or via a nonparametric model extension, again centred at the approximating model, in order to 'average out' over possible misspecifications. This is presented in the context of recent work in the robust control, macroeconomics and financial mathematics literature. We adopt a Bayesian approach throughout, although the methods are agnostic to this position.
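
    The Kullback-Leibler neighbourhood mentioned here admits a standard dual representation: the worst-case expected loss over all distributions Q with KL(Q||P) <= eta equals the minimum over theta > 0 of (log E_P[exp(theta * loss)] + eta) / theta. Below is a minimal Monte Carlo sketch of that formula; the Gaussian approximating model and the loss function are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(0)

        # Samples from the approximating model P and the loss of a fixed action.
        # (An invented stand-in for a real decision problem.)
        x = rng.normal(loc=1.0, scale=0.5, size=100_000)
        loss = x ** 2

        def robust_loss(loss, eta, thetas=np.linspace(0.01, 1.5, 150)):
            """Worst-case expected loss over {Q : KL(Q||P) <= eta}, via the
            dual formula min_theta (log E_P[exp(theta*loss)] + eta) / theta."""
            return min((np.log(np.mean(np.exp(t * loss))) + eta) / t for t in thetas)

        print("expected loss under P:", loss.mean())
        for eta in (0.01, 0.05, 0.2):
            print(f"worst case within KL radius {eta}:", robust_loss(loss, eta))

    Tracking how the preferred action changes as eta grows is one way to produce the stability diagnostics the article surveys.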

    Modeling Model Uncertainty

    Recently there has been a great deal of interest in studying monetary policy under model uncertainty. We point out that different assumptions about the uncertainty may result in drastically different 'robust' policy recommendations. Therefore, we develop new methods to analyze uncertainty about the parameters of a model, the lag specification, the serial correlation of shocks, and the effects of real-time data in one coherent structure. We consider both parametric and nonparametric specifications of this structure and use them to estimate the uncertainty in a small model of the US economy. We then use our estimates to compute robust Bayesian and minimax monetary policy rules, which are designed to perform well in the face of uncertainty. Our results suggest that the aggressiveness recently found in robust policy rules is likely to be caused by overemphasizing uncertainty about economic dynamics at low frequencies.

    Global Changes: Facets of Robust Decisions

    The aim of this paper is to provide an overview of existing concepts of robustness and to identify promising directions for coping with uncertainty and the risks of global changes. Unlike statistical robustness, general decision problems may have rather different facets of robustness. In particular, a key issue is sensitivity with respect to low-probability catastrophic events: robust decisions in the presence of catastrophic events are fundamentally different from decisions that ignore them. Specifically, proper treatment of extreme catastrophic events requires new sets of feasible decisions, risk-adjusted performance indicators, and new spatial, social and temporal dimensions. The discussion is deliberately kept at a level comprehensible to a broad audience through the use of simple examples that can be extended to rather general models. In fact, these examples often illustrate fragments of models that are being developed at IIASA.
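
    The catastrophic-event point can be made with two lines of arithmetic: an expectation-based criterion can prefer a decision that carries a rare disaster, while any worst-case or tail-weighted indicator flips the ranking. All probabilities and losses below are invented for illustration.

        import numpy as np

        # Two hypothetical decisions facing a rare catastrophe.
        # Each row is an outcome: (probability, loss); numbers invented.
        ignore  = np.array([[0.99, 1.0], [0.01, 500.0]])   # cheap, catastrophic tail
        protect = np.array([[0.99, 6.0], [0.01, 20.0]])    # costly mitigation

        for name, d in (("ignore", ignore), ("protect", protect)):
            expected = (d[:, 0] * d[:, 1]).sum()
            worst = d[:, 1].max()
            print(f"{name}: expected loss {expected:.2f}, worst case {worst:.0f}")

    'ignore' wins on expected loss (5.99 against 6.14) yet exposes a 500-unit catastrophe; giving the tail any extra weight, as the robust performance indicators discussed here do, reverses the choice.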

    Robustness - a challenge also for the 21st century: A review of robustness phenomena in technical, biological and social systems as well as robust approaches in engineering, computer science, operations research and decision aiding

    Notions of robustness exist in many facets. They come from different disciplines and reflect different worldviews. Consequently, they very often contradict each other, which makes the term less applicable in a general context. Robustness approaches are often limited to the specific problems for which they have been developed. This means that notions and definitions may turn out to be wrong when put into another domain of validity, i.e. another context. A definition might be correct in a specific context but need not hold in another. Therefore, in order to be able to speak of robustness, we need to specify the domain of validity, i.e. the system, property and uncertainty of interest. As proved by Ho et al. in an optimization context with finite and discrete domains, without prior knowledge about the problem there exists no solution whatsoever that is more robust than any other. Similar to the results of the No Free Lunch Theorems of Optimization (NFLTs), we have to exploit the problem structure in order to make a solution more robust. This optimization problem is directly linked to a robustness/fragility tradeoff that has been observed in many contexts, e.g. the 'robust, yet fragile' property of HOT (Highly Optimized Tolerance) systems. Another issue is that robustness is tightly bound to other phenomena, such as complexity, for which no clear definition or theoretical framework exists either. Consequently, this review tries to find common aspects within many different approaches and phenomena rather than to build a general theorem for robustness, which might not exist anyhow, because complex phenomena often need to be described from a pluralistic view to address as many aspects of a phenomenon as possible. First, many different robustness problems from many different disciplines are reviewed. Second, common aspects are discussed, in particular the relationship between functional and structural properties. This paper argues that robustness phenomena are also a challenge for the 21st century. Robustness is a useful quality of a model or system in terms of the 'maintenance of some desired system characteristics despite fluctuations in the behaviour of its component parts or its environment' (see [Carlson and Doyle, 2002], p. 2). We define robustness phenomena as solutions with balanced tradeoffs, and robust design principles and robustness measures as means to balance tradeoffs.

    Policymaking under uncertainty: Gradualism and robustness

    Some economists have recommended the robust control approach to the formulation of monetary policy under uncertainty when policymakers cannot attach probabilities to the scenarios that concern them. One critique of this approach is that it seems to imply aggressive policies under uncertainty, contrary to the conventional wisdom of acting more gradually in an uncertain environment. This article argues that aggressiveness is not a generic feature of robust control.