Methodology for the genetic evaluation of multibreed populations using Bayesian hierarchical models.
Monetary Policy Regimes and the Volatility of Long-Term Interest Rates
This paper addresses two important questions that have so far been studied separately in the literature. First, it aims to explain the high volatility of long-term interest rates observed in the data, which is hard to replicate using standard macro models. Building a small-scale macroeconomic model and estimating it on U.S. and U.K. data, I show that the policy responses of a central bank that is uncertain about the natural rate of unemployment can explain this volatility puzzle. Second, the paper sheds new light on the distinction between rules and discretion in monetary policy. My empirical results show that yield curve data can facilitate the empirical discrimination between different monetary policy regimes, and that U.S. monetary policy since 1960 is best understood as originating from a discretionary regime.
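To make the transmission channel concrete, a minimal sketch: suppose the central bank follows a Taylor-type rule that responds to the unemployment gap measured against its real-time filtered estimate of the natural rate (the functional form and coefficients below are illustrative assumptions, not the paper's estimated specification):

$$i_t \;=\; r^{*} + \pi^{*} + \phi_\pi\left(\pi_t - \pi^{*}\right) - \phi_u\left(u_t - \hat{u}^{\,n}_{t|t}\right), \qquad y^{(n)}_t \;=\; \frac{1}{n}\sum_{k=0}^{n-1}\mathbb{E}_t\, i_{t+k}.$$

Under the expectations hypothesis, revisions to the estimate $\hat{u}^{\,n}_{t|t}$ shift the entire expected path of short rates, so the central bank's learning about the natural rate adds volatility to the long yield $y^{(n)}_t$ beyond what shocks to fundamentals alone would generate.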
The Macroeconomic Effects of the Euro Area's Fiscal Consolidation 2011-2013: A Simulation-Based Approach
We simulate the Euro Area's fiscal consolidation between 2011 and 2013 using two DSGE models employed by the ECB and the European Commission, respectively. The cumulative multiplier amounts to 0.7 and 1.0 in the baseline, but increases to 1.3 with a reasonably calibrated financial accelerator and a crisis-related increase in the share of liquidity-constrained households. In the latter scenario, fiscal consolidation would be largely responsible for the decline in the output gap from 2011 to 2013. Postponing the fiscal consolidation to a period of unconstrained monetary policy (until after the economic recovery) would have avoided most of these losses.
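For reference, the cumulative multiplier quoted above is conventionally defined as the ratio of the cumulative output response to the cumulative fiscal impulse over the consolidation window (a standard definition; the abstract does not spell out the exact discounting convention, so this is the undiscounted form):

$$m_{\text{cum}}(T) \;=\; \frac{\sum_{t=0}^{T}\Delta Y_t}{\sum_{t=0}^{T}\Delta G_t},$$

so a value of 1.3 means each euro of consolidation lowered cumulative output by roughly 1.3 euros over 2011-2013.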
Monetary Policy and Fiscal Stimulus with the Zero Lower Bound and Financial Frictions
Recent developments in many industrialized countries have triggered a debate on whether monetary policy is effective when the nominal interest rate is close to zero. When the nominal interest rate hits its lower bound, the monetary authority is no longer in a position to ease further by lowering nominal interest rates. In this paper, I assess the implications of the zero lower bound in a DSGE model with financial frictions. The analysis shows that in a framework with financial frictions, when the interest rate is at the lower bound, the initial impact of a negative shock is amplified and the economy is more likely to plunge into a recession. I assess whether different macro policies, such as the management of expectations by the central bank or a counter-cyclical fiscal stimulus, may help the economy recover from the recession. I find that the monetary authority might alleviate the recession by targeting the price level. Fiscal stimulus represents an alternative solution, especially when the zero lower bound constraint becomes binding, as fiscal multipliers may become larger than one. In analyzing discretionary fiscal policy, this paper also focuses on two crucial aspects: the duration of the fiscal stimulus and the presence of implementation lags.
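The binding constraint can be written compactly as a policy rule truncated at zero (a textbook Taylor-rule form with an output gap $x_t$, used here for illustration rather than as the paper's estimated rule):

$$i_t \;=\; \max\left\{0,\; r^{*} + \pi^{*} + \phi_\pi\left(\pi_t - \pi^{*}\right) + \phi_y\, x_t\right\}.$$

When the max operator binds, negative shocks cannot be offset by further rate cuts; this is why expectation management (e.g., price-level targeting, which commits to making up past inflation shortfalls) and fiscal stimulus become attractive, and why the fiscal multiplier can exceed one: the nominal rate does not rise to crowd out the stimulus.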
Enhanced interpretation of newborn screening results without analyte cutoff values
Purpose: To improve the quality of newborn screening by tandem mass spectrometry with a novel approach made possible by the collaboration of 154 laboratories in 49 countries. Methods: A database of 767,464 results from 12,721 cases affected with 60 conditions was used to build multivariate pattern-recognition software that generates tools integrating multiple clinically significant results into a single score. This score is determined by the overlap between normal and disease ranges, penetration within the disease range, differences between conditions, and weighted correction factors. Results: Ninety tools target either a single condition or the differential diagnosis between multiple conditions. Scores are expressed as the percentile rank among all cases with the same condition and are compared to interpretation guidelines. Retrospective evaluation of past cases suggests that these tools could have avoided at least half of 279 false-positive outcomes caused by carrier status for fatty-acid oxidation disorders and could have prevented 88% of known false-negative events. Conclusion: Application of this computational approach to raw data is independent of single-analyte cutoff values. In Minnesota, the tools have been a major contributing factor in the sustained achievement of a false-positive rate below 0.1% and a positive predictive value above 60%. © 2012 American College of Medical Genetics and Genomics.
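As a rough illustration of the scoring idea described above: combine several analyte results into one score using disease-range penetration and normal-range overlap, then report the score as a percentile rank among known cases. The function names, the linear "penetration minus overlap" form, and the weighting scheme are assumptions for exposition; the published tools use a more elaborate multivariate construction.

```python
import bisect

def condition_score(case_results, normal_ranges, disease_ranges, weights):
    """Combine multiple analyte results into a single score.

    Illustrative sketch only: rewards penetration into the disease
    range and penalizes results still inside the normal range.
    """
    score = 0.0
    for analyte, value in case_results.items():
        d_lo, d_hi = disease_ranges[analyte]
        n_lo, n_hi = normal_ranges[analyte]
        # How far the result penetrates the disease range, clipped to [0, 1].
        penetration = 0.0
        if d_hi > d_lo:
            penetration = min(max((value - d_lo) / (d_hi - d_lo), 0.0), 1.0)
        # Penalize results that overlap the normal range.
        overlap = 1.0 if n_lo <= value <= n_hi else 0.0
        score += weights.get(analyte, 1.0) * (penetration - overlap)
    return score

def percentile_rank(score, reference_scores):
    """Express a score as its percentile rank among known cases of the
    same condition, the presentation the abstract describes."""
    ranked = sorted(reference_scores)
    return 100.0 * bisect.bisect_left(ranked, score) / len(ranked)

# Hypothetical usage: a single acylcarnitine marker for one condition.
if __name__ == "__main__":
    s = condition_score({"C8": 2.4},
                        normal_ranges={"C8": (0.0, 0.2)},
                        disease_ranges={"C8": (0.5, 5.0)},
                        weights={"C8": 1.0})
    print(percentile_rank(s, [0.1, 0.3, 0.4, 0.6]))
```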