119 research outputs found

    How robust are popular models of nominal frictions?

    This paper analyzes three popular models of nominal price and wage frictions to determine which best fits post-war U.S. data. We construct a dynamic stochastic general equilibrium (DSGE) model and use maximum likelihood to estimate each model's parameters. Because previous research finds that the conduct of monetary policy and the behavior of inflation changed in the early 1980s, we examine two distinct sample periods. Using a Bayesian pseudo-odds measure as a means for comparison, a sticky price and wage model with dynamic indexation best fits the data in the early-sample period, whereas either a sticky price and wage model with static indexation or a sticky information model best fits the data in the late-sample period. Our results suggest that price- and wage-setting behavior may be sensitive to changes in the monetary policy regime. If true, the evaluation of alternative monetary policy rules may be even more complicated than previously believed.
    Econometric models - Evaluation ; Business cycles - Econometric models ; Monetary policy ; Price levels ; Wages

    Monetary policy and natural disasters in a DSGE model: how should the Fed have responded to Hurricane Katrina?

    In the immediate aftermath of Hurricane Katrina, speculation arose that the Federal Reserve might respond by easing monetary policy. This paper uses a dynamic stochastic general equilibrium (DSGE) model to investigate the appropriate monetary policy response to a natural disaster. We show that the standard Taylor (1993) rule response in models with and without nominal rigidities is to increase the nominal interest rate. That finding is unchanged when we consider the optimal policy response to a disaster. A nominal interest rate increase following a disaster mitigates both temporary inflation effects and output distortions that are attributable to nominal rigidities.
    Monetary policy - United States ; Natural disasters - Economic aspects
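    For reference, the Taylor (1993) rule invoked in this abstract takes the well-known form from Taylor's original specification:

    i_t = \pi_t + r^{*} + 0.5\,(\pi_t - \pi^{*}) + 0.5\,y_t

    where i_t is the nominal federal funds rate, \pi_t is inflation over the prior four quarters, y_t is the percent deviation of output from trend, and Taylor set both the equilibrium real rate r^{*} and the inflation target \pi^{*} to 2 percent. The rule's response to a disaster in the paper follows from how the shock moves \pi_t and y_t, not from any modification of the rule itself.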

    M2 growth in 1995: a return to normalcy?

    A discussion of M2's demise as a reliable indicator of financial conditions in the economy, and a look at recent evidence suggesting that even though the aggregate has been behaving more normally over the past year or so, it is unlikely to regain its status as a key policy guide any time soon.
    Economic indicators ; Money supply

    The monetary instrument matters

    This paper revisits the debate over the money supply versus the interest rate as the instrument of monetary policy. Using a dynamic stochastic general equilibrium framework, the authors examine the effects of alternative monetary policy rules on inflation persistence, the information content of monetary data, and real variables. They show that inflation persistence and the variability of inflation relative to money growth depend on whether the central bank follows a money growth rule or an interest rate rule. With a money growth rule, inflation is not persistent and the price level is much more volatile than the money supply. Those counterfactual implications are eliminated by the use of interest rate rules whether prices are sticky or not. A central bank's use of interest rate rules, however, obscures the information content of monetary aggregates and also leads to subtle problems for econometricians trying to estimate money demand functions or to identify shocks to the trend and cycle components of the money stock.
    Monetary policy ; Money supply ; Interest rates

    Taylor-type rules and total factor productivity

    This paper examines the impact of a persistent shock to the growth rate of total factor productivity in a New Keynesian model in which the central bank does not observe the shock. The authors then investigate the performance of alternative policy rules in such an incomplete information environment. While some rules perform better than others, the authors demonstrate that inflation is more stable after a persistent productivity shock when monetary policy targets the output growth rate (not the output gap) or the price-level path (not the inflation rate). Both the output growth and price-level path rules generate less volatility in output and inflation following a persistent productivity shock compared with the Taylor rule.
    Taylor's rule ; Productivity ; Industrial productivity
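    To illustrate the distinction the abstract draws, an inflation-targeting Taylor-type rule versus a price-level path rule can be sketched as follows (the symbols and response coefficients \phi here are generic notation, not the authors' calibration):

    i_t = r^{*} + \pi^{*} + \phi_\pi (\pi_t - \pi^{*}) + \phi_x x_t
    \qquad\text{vs.}\qquad
    i_t = r^{*} + \pi^{*} + \phi_p (p_t - p_t^{*})

    where x_t is the output gap, p_t is the log price level, and p_t^{*} is a target path for the price level growing at rate \pi^{*}. Under the second rule, bygones are not bygones: an inflation overshoot must be offset by later undershooting to return p_t to the path, which is the mechanism behind the lower inflation volatility reported here.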

    Inflation risk and optimal monetary policy

    This paper shows that the optimal monetary policies recommended by New Keynesian models still imply a large amount of inflation risk. We calculate the term structure of inflation uncertainty in New Keynesian models when the monetary authority adopts the optimal policy. When the monetary policy rules are modified to include some weight on a price path, the economy achieves equilibria with substantially lower long-run inflation risk. With either sticky prices or sticky wages, a price path target reduces the variance of inflation by an order of magnitude more than it increases the variability of the output gap.
    Monetary policy ; Inflation (Finance)

    Results of a study of the stability of cointegrating relations comprised of broad monetary aggregates

    There is strong evidence of a stable “money demand” relationship for MZM and M2 through the 1990s. Though the M2 relationship breaks down somewhere around 1990, evidence has been accumulating that the disturbance is well characterized as a permanent upward shift in M2 velocity that began around 1990 and was largely over by 1994. This paper’s results support the hypothesis that households permanently reallocated a portion of their wealth from time deposits to mutual funds. This reallocation may have been induced by depository restructuring, but it could also be explained by appropriately measured opportunity cost.
    Demand for money

    Demonstrating a superconducting dual-rail cavity qubit with erasure-detected logical measurements

    A critical challenge in developing scalable error-corrected quantum systems is the accumulation of errors while performing operations and measurements. One promising approach is to design a system where errors can be detected and converted into erasures. A recent proposal aims to do this using a dual-rail encoding with superconducting cavities. In this work, we implement such a dual-rail cavity qubit and use it to demonstrate a projective logical measurement with erasure detection. We measure logical state preparation and measurement errors at the 0.01% level and detect over 99% of cavity decay events as erasures. We use the precision of this new measurement protocol to distinguish different types of errors in this system, finding that while decay errors occur with probability ~0.2% per microsecond, phase errors occur 6 times less frequently and bit flips occur at least 170 times less frequently. These findings represent the first confirmation of the expected error hierarchy necessary to concatenate dual-rail erasure qubits into a highly efficient erasure code.
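    The error hierarchy quoted above reduces to simple arithmetic. A minimal sketch of the implied per-microsecond rates, using only the 0.2%, 6x, and 170x figures stated in the abstract (the variable names are ours):

    ```python
    # Implied error rates for the dual-rail cavity qubit, per microsecond,
    # from the figures quoted in the abstract.
    decay_rate = 0.002                # decay (erasure) errors: ~0.2% per microsecond
    phase_rate = decay_rate / 6       # phase errors occur ~6x less frequently
    bitflip_rate = decay_rate / 170   # bit flips occur at least 170x less frequently

    print(f"decay:    ~{decay_rate:.2%} per microsecond")
    print(f"phase:    ~{phase_rate:.4%} per microsecond")
    print(f"bit flip: <= {bitflip_rate:.5%} per microsecond")
    ```

    Since decay events are flagged as erasures over 99% of the time, the dominant error channel is heralded, which is what makes concatenation into an erasure code efficient.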