
    The New Keynesian Hybrid Phillips Curve: An Assessment of Competing Specifications for the United States

    Inflation forecasting is fundamental to monetary policy. In practice, however, economists face competing goals: accuracy and theoretical consistency. Recent work by Fuhrer and Moore (1995), Galí and Gertler (1999), Galí, Gertler, and Lopez-Salido (2001), Sbordone (2002), and Kozicki and Tinsley (2002a, b) suggests that the two objectives need not be mutually exclusive in the context of inflation forecasts. The New Keynesian Phillips curve is theoretically appealing because its purely forward-looking specification is based on a model of optimal pricing behaviour under rational expectations. This specification, however, does not properly capture observed inflation persistence. The author estimates three structural models of U.S. inflation that incorporate price frictions to justify the presence of lags in the forward-looking New Keynesian Phillips curve. The models, based on Galí and Gertler (1999) and Kozicki and Tinsley (2002a, b), are tested on the basis of forecast performance. The results show that the New Keynesian hybrid Phillips curve with the output gap as an explanatory variable performs marginally better than the two alternative specifications.
    Topics: Inflation and prices; Economic models
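    For context, the hybrid specification discussed in this abstract nests a backward-looking inflation term alongside the forward-looking one. In its standard reduced form (the paper's exact parameterization may differ), it reads:

    ```latex
    \pi_t = \gamma_f \, E_t[\pi_{t+1}] + \gamma_b \, \pi_{t-1} + \lambda x_t + \varepsilon_t
    ```

    where \pi_t is inflation, x_t the output gap, \gamma_f and \gamma_b the weights on expected and lagged inflation, and \varepsilon_t a cost-push disturbance. The purely forward-looking New Keynesian Phillips curve is the special case \gamma_b = 0, which is why it cannot generate the inflation persistence the lagged term supplies.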

    The U.S. Stock Market and Fundamentals: A Historical Decomposition

    The authors identify the fundamentals behind the dynamics of the U.S. stock market over the past 30 years. They specify a structural vector-error-correction model following the methodology of King, Plosser, Stock, and Watson (1991). This methodology identifies structural shocks with the imposition of long-run restrictions. It allows the authors to calculate an equilibrium measure of stock market value based on the permanent components of the time series. A better understanding of the components that drive stock market movements could provide insight into the potential effects of the recent technological revolution on the dynamics of the stock market's equilibrium value, as suggested by Hobijn and Jovanovic (2001).
    Topics: Transmission of monetary policy

    A Model of Housing Stock for Canada

    Using an error-correction model (ECM) framework, the authors attempt to quantify the degree of disequilibrium in Canadian housing stock over the period 1961–2008 for the national aggregate and over 1981–2008 for the provinces. They find that, based on quarterly data, the level of housing stock in the long run is associated with population, real per capita disposable income, and real house prices. Population growth (net migration, particularly for the western provinces) is also an important determinant of the short-run dynamics of housing stock, after controlling for serial correlation in the dependent variable. Real mortgage rates, consumer confidence, and a number of other variables identified in the literature are found to play a small role in the short run. The authors’ model suggests that the Canadian housing stock was 2 per cent above its equilibrium level at the end of 2008. There was likely overbuilding, to varying degrees, in Saskatchewan, New Brunswick, British Columbia, Ontario, and Quebec.
    Topics: Domestic demand and components
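    An error-correction model of the kind this abstract describes can be sketched in two stages: a long-run (cointegrating) regression of housing stock on its fundamentals, whose residual measures disequilibrium, and a short-run equation in first differences that includes the lagged residual. The following is a minimal two-step Engle-Granger illustration on synthetic data; the variable names and series are hypothetical stand-ins, not the authors' dataset or estimation method.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic quarterly series standing in for the fundamentals (hypothetical).
    T = 200
    population = np.cumsum(rng.normal(0.5, 0.1, T))   # trending population level
    income     = np.cumsum(rng.normal(0.3, 0.2, T))   # real per capita income
    stock      = 1.2 * population + 0.4 * income + rng.normal(0, 0.5, T)

    def ols(y, regressors):
        """OLS with an intercept; returns the design matrix and coefficients."""
        X = np.column_stack([np.ones(len(y))] + list(regressors))
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return X, beta

    # Stage 1: long-run relationship; the residual is the error-correction term.
    X_lr, beta_lr = ols(stock, [population, income])
    ect = stock - X_lr @ beta_lr          # degree of disequilibrium each quarter

    # Stage 2: short-run dynamics in differences, with the lagged ECT.
    dy = np.diff(stock)
    X_sr, beta_sr = ols(dy, [np.diff(population), np.diff(income), ect[:-1]])

    # A negative loading on the lagged ECT means the stock reverts to equilibrium.
    print("long-run betas (const, population, income):", beta_lr.round(2))
    print("speed-of-adjustment coefficient:", beta_sr[-1].round(2))
    ```

    The sign and size of the last coefficient is what gives statements like "2 per cent above equilibrium" their content: the ECT measures the gap, and its loading governs how quickly the gap closes.
    
    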

    The Effects of Recent Relative Price Movements on the Canadian Economy

    Although the standard of living of Canadians has improved as a result of terms-of-trade gains created by the sharp rise in real commodity prices over the past five years or so, the commodity-price increase, combined with an exchange rate appreciation and real income gains, has triggered structural adjustments by altering underlying economic incentives. The frictions generated in adjusting to the relative price shock have likely held back aggregate productivity growth. Dupuis and Marcil examine the structural adjustments that have been required, in particular the reallocation of resources among the different sectors of the economy, and their effects on employment, output, and productivity, as well as the responses of final domestic demand and external trade flows.

    The OpenPicoAmp: an open-source planar lipid bilayer amplifier for hands-on learning of neuroscience

    Neuroscience education can be promoted by the availability of low-cost and engaging teaching materials. To address this issue, we developed an open-source lipid bilayer amplifier, the OpenPicoAmp, which is appropriate for use in introductory courses in biophysics or neuroscience dealing with the electrical properties of the cell membrane. The amplifier is designed using the common lithographic printed circuit board fabrication process and off-the-shelf electronic components. In addition, we propose a specific design for experimental chambers allowing the insertion of a commercially available polytetrafluoroethylene film. This experimental setup can be used in simple experiments in which students monitor bilayer formation by capacitance measurement and record unitary currents produced by ion channels such as gramicidin A. Used in combination with a low-cost data acquisition board, this system provides a complete solution for hands-on lessons, thereby improving the effectiveness of teaching basic neuroscience or biophysics.
    Comment: 13 pages, 6 figures, and supplementary information (9 files including one movie). Added references, added figure, corrected typos, corrected board components list, more detailed implementation documentation.

    Alien Registration- Dupuis, David (Lewiston, Androscoggin County)

    https://digitalmaine.com/alien_docs/29541/thumbnail.jp

    Computing the Accuracy of Complex Non-Random Sampling Methods: The Case of the Bank of Canada's Business Outlook Survey

    A number of central banks publish their own business conditions survey based on non-random sampling methods. The results of these surveys influence monetary policy decisions and thus affect expectations in financial markets. To date, however, no one has computed the statistical accuracy of these surveys because their respective non-random sampling method renders this assessment non-trivial. This paper describes a methodology for modeling complex non-random sampling behaviour, and computing relevant measures of statistical confidence, based on a given survey's historical sample selection practice. We apply this framework to the Bank of Canada's Business Outlook Survey by describing the sampling method in terms of historical practices and Bayesian probabilities. This allows us to replicate the firm selection process using Monte Carlo simulations on a comprehensive micro-dataset of Canadian firms. We find, under certain assumptions, no evidence that the Bank's firm selection process results in biased estimates and/or wider confidence intervals.
    Topics: Econometric and statistical methods; Central bank research; Regional economic developments
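    The core idea of replicating a judgmental selection process by Monte Carlo can be illustrated schematically: encode the historical selection practice as selection probabilities over firm strata, draw many synthetic samples from a firm population, and compare the distribution of the resulting survey statistic to the population value. The sketch below is a toy version of that workflow; the strata weights, selection probabilities, and "sentiment" statistic are invented for illustration and are not the Bank's actual practice.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical firm population: each firm has a sector and a sentiment score.
    N = 10_000
    sectors = rng.integers(0, 3, N)                        # 3 sectors
    sentiment = rng.normal(loc=sectors * 0.1, scale=1.0)   # sector-dependent mean

    # Selection probabilities encoding a made-up historical sampling practice:
    # the survey deliberately over-samples sector 2 relative to its share.
    select_prob = np.array([0.005, 0.01, 0.02])[sectors]

    # Monte Carlo replication of the firm-selection process.
    n_sims = 5_000
    sample_means = np.empty(n_sims)
    for i in range(n_sims):
        chosen = rng.random(N) < select_prob               # one simulated survey
        sample_means[i] = sentiment[chosen].mean()

    # Compare the simulated survey statistic to the population value.
    bias = sample_means.mean() - sentiment.mean()
    ci = np.percentile(sample_means, [2.5, 97.5])          # simulated 95% interval
    print(f"estimated bias: {bias:.3f}, 95% interval: {ci.round(3)}")
    ```

    In this toy setup the deliberate over-sampling produces a detectably positive bias; applying the same machinery to a selection rule that mirrors actual practice is what lets one test, as the paper does, whether the real process biases the estimates or widens the confidence intervals.
    
    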