News and Financial Intermediation in Aggregate Fluctuations
We develop a two-sector DSGE model with financial intermediation to investigate the role of news as a driving force of the business cycle. We find that news about future capital quality is a significant source of aggregate fluctuations, accounting for around 37% of output variation at cyclical frequencies. Financial intermediation is essential for the importance and propagation of capital quality shocks. In addition, news shocks to capital quality generate aggregate and sectoral comovement, as in the data, and are consistent with procyclical movements in the value of capital. From a historical perspective, news shocks to capital quality are to a large extent responsible for the recession following the 1990s investment boom and the latest recession following the financial crisis, but played a much smaller role during the recession at the beginning of the 1990s. This is in line with the belief that revisions of overoptimistic expectations contributed to the last two recessions, while movements in fundamentals played a much bigger role in the recession at the beginning of the 1990s.
Toward Common Data Elements for International Research in Long-term Care Homes: Advancing Person-Centered Care
To support person-centered, residential long-term care internationally, a consortium of researchers in medicine, nursing, behavioral, and social sciences from 21 geographically and economically diverse countries has launched the WE-THRIVE consortium to develop a common data infrastructure. WE-THRIVE aims to identify measurement domains that are internationally relevant, including in low-, middle-, and high-income countries, prioritize concepts to operationalize domains, and specify a set of data elements to measure concepts that can be used across studies for data sharing and comparisons. This article reports findings from consortium meetings at the 2016 meeting of the Gerontological Society of America and the 2017 meeting of the International Association of Gerontology and Geriatrics, to identify domains and prioritize concepts, following best practices to identify common data elements (CDEs) that were developed through the US National Institutes of Health/National Institute of Nursing Research's CDEs initiative. Four domains were identified: organizational context, workforce and staffing, person-centered care, and care outcomes. Using a nominal group process, WE-THRIVE prioritized 21 concepts across the 4 domains. Several concepts showed similarity to existing measurement structures, whereas others differed. Conceptual similarity (convergence; eg, concepts in the care outcomes domain of functional level and harm-free care) provides further support for the critical foundational work in LTC measurement endorsed and implemented by regulatory bodies. Different concepts (divergence; eg, concepts in the person-centered care domain of knowing the person and what matters most to the person) highlight current gaps in measurement efforts and are consistent with WE-THRIVE's focus on supporting resilience and thriving for residents, family, and staff.
In alignment with the World Health Organization's call for comparative measurement work for health systems change, WE-THRIVE's work to date highlights the benefits of engaging with diverse LTC researchers, including those in low-, middle-, and high-income countries, to develop a measurement infrastructure that integrates the aspirations of person-centered LTC.
Ratings and rankings: Voodoo or Science?
Composite indicators aggregate a set of variables using weights which are understood to reflect the variables' importance in the index. In this paper we propose to measure the importance of a given variable within existing composite indicators via Karl Pearson's "correlation ratio"; we call this measure "main effect". Because socio-economic variables are heteroskedastic and correlated, (relative) nominal weights are hardly ever found to match (relative) main effects; we propose to summarize their discrepancy with a divergence measure. We further discuss to what extent the mapping from nominal weights to main effects can be inverted. This analysis is applied to five composite indicators, including the Human Development Index and two popular league tables of university performance. It is found that in many cases the declared importance of single indicators and their main effect are very different, and that the data correlation structure often prevents developers from obtaining the stated importance, even when modifying the nominal weights in the set of nonnegative numbers with unit sum.
Comment: 28 pages, 7 figures
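The "main effect" in this abstract, Pearson's correlation ratio Var(E[Y|Xi])/Var(Y), can be estimated directly from data. The sketch below is a minimal illustration on simulated data, not the paper's procedure: the variables, weights, and the binning estimator are all assumptions chosen to show how heteroskedasticity and correlation drive main effects away from equal nominal weights.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Two correlated, heteroskedastic indicator variables (hypothetical data).
x1 = rng.normal(0.0, 1.0, n)
x2 = 0.6 * x1 + rng.normal(0.0, 2.0, n)   # correlated with x1, much larger variance

w = np.array([0.5, 0.5])                  # equal nominal weights
y = w[0] * x1 + w[1] * x2                 # composite indicator

def main_effect(x, y, bins=50):
    """Estimate Pearson's correlation ratio Var(E[y|x]) / Var(y) by binning x."""
    edges = np.quantile(x, np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)
    bin_means = np.array([y[idx == b].mean() for b in range(bins)])
    bin_sizes = np.array([(idx == b).sum() for b in range(bins)])
    var_cond_mean = np.average((bin_means - y.mean()) ** 2, weights=bin_sizes)
    return var_cond_mean / y.var()

me1, me2 = main_effect(x1, y), main_effect(x2, y)
# Despite equal nominal weights, x2's larger variance gives it a much larger main effect.
```

Because x2 carries most of the composite's variance, its main effect dominates even though the declared weights say the two variables matter equally, which is the discrepancy the paper's divergence measure summarizes.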
Implementing the New Structural Model of the Czech National Bank
The purpose of the paper is to introduce the new "g3" structural model of the Czech National Bank and illustrate how it is used for forecasting and policy analysis. From January 2007 the model was regularly used for shadowing official forecasts, and in July 2008 it became the core model of the CNB. In the paper we highlight the most important and unusual features of the model and discuss the tools and procedures that help us in forecasting and assessing the economy with the model. The paper is not meant to provide a full derivation of the model or a complete characterization of its behavior and should not be regarded as model documentation. Rather, the paper demonstrates how the model is used and how it contributes to policy analysis.
Keywords: DSGE, filtering, forecasting, general equilibrium, monetary policy.
House price momentum and strategic complementarity
House prices exhibit substantially more momentum (positive autocorrelation in price changes) than existing theories can explain. I introduce an amplification mechanism to reconcile this discrepancy. Sellers do not set a unilaterally high or low list price because they face a concave demand curve: increasing the price of an above-average-priced house rapidly reduces its sale probability, but cutting the price of a below-average-priced house only slightly improves its sale probability. The resulting strategic complementarity amplifies frictions because sellers gradually adjust their price to stay near the average. I provide empirical evidence for concave demand and use a quantitative search model to show that it amplifies momentum two- to threefold.
Data uncertainty and the role of money as an information variable for monetary policy
In this study, we perform a quantitative assessment of the role of money as an indicator variable for monetary policy in the euro area. We document the magnitude of revisions to euro area-wide data on output, prices, and money, and find that monetary aggregates have a potentially significant role in providing information about current real output. We then proceed to analyze the information content of money in a forward-looking model in which monetary policy is optimally determined subject to incomplete information about the true state of the economy. We show that monetary aggregates may have substantial information content in an environment with high variability of output measurement errors, low variability of money demand shocks, and a strong contemporaneous linkage between money demand and real output. As a practical matter, however, we conclude that money has fairly limited information content as an indicator of contemporaneous aggregate demand in the euro area.
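The mechanism in this abstract (money is informative about current output when output measurement error is large relative to money demand shocks) can be illustrated with a static signal-extraction sketch. This is a simplification of the paper's optimal filtering problem under assumed, hypothetical noise variances, not the authors' model:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

y = rng.normal(0.0, 1.0, n)             # true output (hypothetical)
sig_y, sig_m = 0.8, 0.4                 # assumed std devs: output measurement error, money demand shock
y_obs = y + rng.normal(0.0, sig_y, n)   # noisy first-release output data
m_sig = y + rng.normal(0.0, sig_m, n)   # money-based signal of output

# Precision-weighted combination of the two signals: the weight on money
# rises with output measurement-error variance and falls with money demand
# shock variance, mirroring the abstract's comparative statics.
w_m = sig_y**2 / (sig_y**2 + sig_m**2)
y_hat = (1 - w_m) * y_obs + w_m * m_sig

mse_obs = np.mean((y_obs - y) ** 2)     # using the data release alone
mse_hat = np.mean((y_hat - y) ** 2)     # combining with the money signal
```

With these illustrative variances the combined estimate is markedly more accurate than the data release alone; shrinking the money demand shock variance `sig_m` raises the optimal weight on money, while shrinking `sig_y` drives it toward zero, which is the paper's practical conclusion for the euro area.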
GNSS Signal Authentication via Power and Distortion Monitoring
We propose a simple low-cost technique that enables civil Global Positioning System (GPS) receivers and other civil global navigation satellite system (GNSS) receivers to reliably detect carry-off spoofing and jamming. The technique, which we call the Power-Distortion detector, classifies received signals as interference-free, multipath-afflicted, spoofed, or jammed according to observations of received power and correlation function distortion. It does not depend on external hardware or a network connection and can be readily implemented on many receivers via a firmware update. Crucially, the detector can with high probability distinguish low-power spoofing from ordinary multipath. In testing against over 25 high-quality empirical data sets yielding over 900,000 separate detection tests, the detector correctly alarms on all malicious spoofing or jamming attacks while maintaining a <0.5% single-channel false alarm rate.
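The four-way classification described above can be pictured as a decision rule over the two observables. The toy sketch below is not the paper's detector: the scalar distortion metric, the threshold values, and the decision boundaries are invented for illustration only.

```python
from enum import Enum

class SignalClass(Enum):
    CLEAN = "interference-free"
    MULTIPATH = "multipath-afflicted"
    SPOOFED = "spoofed"
    JAMMED = "jammed"

def classify(power_db_above_nominal: float, distortion: float,
             p_jam: float = 10.0, p_spoof: float = 2.0,
             d_thresh: float = 0.3) -> SignalClass:
    """Toy decision rule in the spirit of a power-and-distortion detector.

    `power_db_above_nominal` is received power relative to a clean-sky
    baseline; `distortion` is a scalar summary of correlation-function
    shape anomaly. All thresholds are hypothetical, not from the paper.
    """
    if power_db_above_nominal >= p_jam and distortion >= d_thresh:
        # Large power excess with a damaged correlation peak: jamming.
        return SignalClass.JAMMED
    if distortion >= d_thresh:
        # Distortion with a power advantage suggests a counterfeit signal
        # overpowering the authentic one; without it, ordinary multipath.
        if power_db_above_nominal >= p_spoof:
            return SignalClass.SPOOFED
        return SignalClass.MULTIPATH
    return SignalClass.CLEAN
```

The point of the two-observable design, as the abstract notes, is that distortion alone cannot separate low-power spoofing from multipath; it is the joint behavior of power and distortion that makes the classes distinguishable.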