3,292 research outputs found

    Three Great American Disinflations

    Get PDF
    In this paper, we examine three famous episodes of deliberate deflation (or disinflation) in U.S. history, including episodes following the Civil War, World War I, and the Volcker disinflation of the early 1980s. These episodes were associated with widely divergent effects on the real economy, which we attribute both to differences in the policy actions undertaken, and to the transparency and credibility of the monetary authorities. We attempt to account for the salient features of each episode within the context of a stylized DSGE model. Our model simulations indicate how a more predictable policy of gradual deflation could have helped avoid the sharp post-WWI depression. But our analysis also suggests that the strong argument for gradualism under a transparent monetary regime becomes less persuasive if the monetary authority lacks credibility; in this case, an aggressive policy stance (as under Volcker) can play a useful signalling role by making a policy shift more apparent to private agents. JEL Classification: E31, E32, E5

    Three Great American Disinflations

    Get PDF
    This paper analyzes the role of transparency and credibility in accounting for the widely divergent macroeconomic effects of three episodes of deliberate monetary contraction: the post-Civil War deflation, the post-WWI deflation, and the Volcker disinflation. Using a dynamic general equilibrium model in which private agents use optimal filtering to infer the central bank's nominal anchor, we demonstrate that the salient features of these three historical episodes can be explained by differences in the design and transparency of monetary policy, even without any time variation in economic structure or model parameters. For a policy regime with relatively high credibility, our analysis highlights the benefits of a gradualist approach (as in the 1870s) rather than a sudden change in policy (as in 1920-21). In contrast, for a policy institution with relatively low credibility (such as the Federal Reserve in late 1980), an aggressive policy stance can play an important signalling role by making the policy shift more evident to private agents.
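
    The "optimal filtering" mechanism can be illustrated with a scalar Kalman filter: agents observe noisy inflation and gradually infer the central bank's unobserved inflation target. Below is a minimal sketch in Python; the random-walk state equation, the noise variances, and the timing of the disinflation are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

# Illustrative signal-extraction problem (assumed, not the paper's model):
# the central bank's inflation target pi_star is unobserved, and agents
# see inflation pi_obs = pi_star + noise.
rng = np.random.default_rng(0)
T = 60
q = 0.01   # variance of target innovations (assumed)
r = 0.25   # variance of observation noise (assumed)

# A one-time disinflation: the target drops from 4% to 0% at t = 20.
pi_star = np.where(np.arange(T) < 20, 4.0, 0.0)
pi_obs = pi_star + rng.normal(0.0, np.sqrt(r), T)

# Scalar Kalman filter tracking agents' belief about the target.
belief, var = 4.0, 1.0
beliefs = []
for y in pi_obs:
    var += q                    # predict: the target may have drifted
    k = var / (var + r)         # Kalman gain
    belief += k * (y - belief)  # update toward the observed inflation
    var *= 1.0 - k
    beliefs.append(belief)

# Beliefs adjust only gradually after t = 20: the noisier the signal
# (lower credibility), the longer inferred targets lag the true shift.
print([round(b, 2) for b in beliefs[18:28]])
```

    In this setting, a large and visible policy move generates a bigger surprise in observed inflation and therefore moves beliefs faster, which is the signalling role the abstract attributes to the Volcker episode.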

    Build It, Break It, Fix It: Contesting Secure Development

    Full text link
    Typical security contests focus on breaking or mitigating the impact of buggy systems. We present the Build-it, Break-it, Fix-it (BIBIFI) contest, which aims to assess the ability to securely build software, not just break it. In BIBIFI, teams build specified software with the goal of maximizing correctness, performance, and security. The latter is tested when teams attempt to break other teams' submissions. Winners are chosen from among the best builders and the best breakers. BIBIFI was designed to be open-ended: teams can use any language, tool, or process that they like. As such, contest outcomes shed light on factors that correlate with successfully building secure software and breaking insecure software. We ran three contests involving two different programming problems. Quantitative analysis from these contests found that the most efficient build-it submissions used C/C++, but submissions coded in a statically-typed language were less likely to have a security flaw; build-it teams with diverse programming-language knowledge also produced more secure code. Shorter programs correlated with better scores. Break-it teams that were also build-it teams were significantly better at finding security bugs.
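
    The abstract does not give the contest's exact scoring formula, but the incentive structure it describes (points for correctness and performance, penalties when other teams demonstrate security bugs) can be sketched as follows; all point values and the function name are hypothetical.

```python
# Hypothetical build-it scoring in the spirit of BIBIFI; the actual
# contest formula is not given in this abstract.
def build_it_score(tests_passed: int, perf_bonus: int, bugs_found: int) -> int:
    correctness = 50 * tests_passed      # points per passing test (assumed)
    security_penalty = 100 * bugs_found  # per demonstrated exploit (assumed)
    return correctness + perf_bonus - security_penalty

# A fast but buggy submission can lose to a slower, safer one:
print(build_it_score(tests_passed=40, perf_bonus=200, bugs_found=3))  # 1900
print(build_it_score(tests_passed=38, perf_bonus=50, bugs_found=0))   # 1950
```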

    New evidence on inflation persistence and price stickiness in the Euro area: Implications for macro modelling

    Get PDF
    This paper evaluates new evidence on price-setting practices and inflation persistence in the euro area with respect to its implications for macro modelling. It argues that several of the most commonly used assumptions in micro-founded macro models are seriously challenged by the new findings. Keywords: price-setting practices, macro modelling.

    Making Voting Easier: Convenience Voting in the 2008 Presidential Election

    Get PDF
    In this study we analyze the choice of voting mode in the 2008 presidential election. We use a large-sample survey with national coverage that allows us to overcome limitations of previous studies. Our analysis provides a number of insights into some of the important debates about convenience voting. Among other things, we find little support for the hypothesis that convenience voting methods have partisan implications, although we do find voter attributes that lead to the choice of particular convenience voting modes. Results like these have important implications for future moves towards convenience voting and the design of new outreach campaigns.

    On chirality of slime mould

    Get PDF
    Left-right patterning and lateralised behaviour are ubiquitous aspects of plants and animals. The mechanisms linking cellular chirality to the large-scale asymmetry of multicellular structures are incompletely understood, and it has been suggested that the chirality of living cells is hardwired in their cytoskeleton. We examined the question of biased asymmetry in a unique organism: the slime mould Physarum polycephalum, which is unicellular yet possesses macroscopic, complex structure and behaviour. In laboratory experiments using a T-shape, we found that Physarum turns right in more than 74% of trials. The results are in agreement with previously published studies on asymmetric movement of muscle cells, neutrophils, liver cells and growing neural filaments, and for the first time reveal the presence of consistently biased laterality in the fungi kingdom. The exact mechanisms of the slime mould's direction preference remain unknown.
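
    The reported 74% right-turn bias can be checked against the 50% chance level with a simple binomial test. A sketch in Python, assuming a hypothetical 74 right turns in 100 trials (the abstract gives the percentage, not the raw trial count):

```python
from scipy.stats import binomtest

# Hypothetical counts: 74 right turns in 100 T-shape trials.
k, n = 74, 100
result = binomtest(k, n, p=0.5, alternative="greater")
print(f"observed P(right) = {k / n:.2f}")
print(f"one-sided p-value vs. chance: {result.pvalue:.2e}")
```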

    The Empirical Status of Acceptance and Commitment Therapy: A Review of Meta-Analyses

    Get PDF
    The efficacy of Acceptance and Commitment Therapy (ACT) has been evaluated in many randomized controlled trials investigating a broad range of target conditions. This paper reviews the meta-analytic evidence on ACT. The 20 included meta-analyses reported 100 controlled effect sizes across n = 12,477 participants. Controlled effect sizes were grouped by target condition and comparison group. Results showed that ACT is efficacious for all conditions examined, including anxiety, depression, substance use, pain, and transdiagnostic groups. Results also showed that ACT was generally superior to inactive controls (e.g. waitlist, placebo), treatment as usual, and most active intervention conditions (excluding CBT). Weaknesses and areas for future development are discussed.
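
    Controlled effect sizes of the kind this review aggregates are typically pooled with inverse-variance weighting. A minimal fixed-effect sketch in Python; the effect sizes and variances below are made up for illustration, not values from the reviewed meta-analyses.

```python
import numpy as np

# Made-up effect sizes (Hedges' g) and sampling variances, one per study.
g = np.array([0.42, 0.57, 0.31, 0.66, 0.48])
v = np.array([0.020, 0.035, 0.015, 0.050, 0.025])

# Fixed-effect inverse-variance pooling.
w = 1.0 / v
g_pooled = np.sum(w * g) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
print(f"pooled g = {g_pooled:.2f} "
      f"(95% CI {g_pooled - 1.96 * se:.2f} to {g_pooled + 1.96 * se:.2f})")
```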

    CMS computing operations during run 1

    Get PDF
    During the first run, CMS collected and processed more than 10B data events and simulated more than 15B events. Up to 100k processor cores were used simultaneously, and 100 PB of storage was managed. Each month, petabytes of data were moved and hundreds of users accessed data samples. In this document we discuss the operational experience from this first run. We present the workflows and data flows that were executed, the tools and services developed, and the operations and shift models used to sustain the system. Many techniques followed the original computing plan, but some arose as reactions to difficulties and opportunities. We also address the lessons learned from an operational perspective and how they are shaping our thoughts for 2015.
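
    Moving "petabytes of data" each month implies substantial sustained bandwidth. A back-of-envelope sketch, assuming a hypothetical 5 PB per month (the abstract does not give the exact monthly volume):

```python
# Sustained bandwidth needed to move a hypothetical 5 PB per month.
petabytes_per_month = 5
bits = petabytes_per_month * 1e15 * 8  # PB -> bits
seconds = 30 * 24 * 3600               # ~one month
print(f"{bits / seconds / 1e9:.1f} Gb/s sustained")  # ~15.4 Gb/s
```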