Measuring output gap uncertainty
We propose a methodology for producing density forecasts for the output gap in real time using a large number of vector autoregressions in inflation and output gap measures. Density combination utilizes a linear mixture of experts framework to produce potentially non-Gaussian ensemble densities for the unobserved output gap. In our application, we show that data revisions substantially alter our probabilistic assessments of the output gap using a variety of output gap measures derived from univariate detrending filters. The resulting ensemble produces well-calibrated forecast densities for US inflation in real time, in contrast to those from simple univariate autoregressions which ignore the contribution of the output gap. Combining evidence from both linear trends and more flexible univariate detrending filters induces strong multi-modality in the predictive densities for the unobserved output gap. The peaks associated with these two detrending methodologies indicate output gaps of opposite sign for some observations, reflecting the pervasive nature of model uncertainty in our US data.
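As a minimal sketch of the linear opinion pool idea behind this kind of density combination, the snippet below mixes two placeholder component densities on a common grid; the component forecasts, weights and grid are illustrative, not those produced by the paper's VARs.

```python
import numpy as np

# Illustrative linear opinion pool ("mixture of experts") for combining
# predictive densities of the output gap on a common evaluation grid.
# The component densities and weights are placeholders, not the paper's
# actual VAR-based forecasts.

grid = np.linspace(-6.0, 6.0, 601)          # candidate output-gap values (%)

def gaussian_pdf(x, mean, std):
    """Normal density, used as a stand-in for a component forecast density."""
    return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * np.sqrt(2.0 * np.pi))

# Hypothetical component forecasts: e.g. one from a linear-trend measure and
# one from a flexible detrending filter (opposite signs induce multi-modality).
components = np.vstack([
    gaussian_pdf(grid, mean=-1.5, std=0.8),
    gaussian_pdf(grid, mean=+1.0, std=0.6),
])
weights = np.array([0.45, 0.55])             # must sum to one

ensemble = weights @ components              # ensemble (possibly bimodal) density
print("integrates to ~1:", ensemble.sum() * (grid[1] - grid[0]))
```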
Real-time inflation forecast densities from ensemble Phillips curves
A popular macroeconomic forecasting strategy takes combinations across many models to hedge against model instabilities of unknown timing; see (among others) Stock and Watson (2004) and Clark and McCracken (2009). In this paper, we examine the effectiveness of recursive-weight and equal-weight combination strategies for density forecasting using a time-varying Phillips curve relationship between inflation and the output gap. The densities reflect the uncertainty across a large number of models using many statistical measures of the output gap, allowing for a single structural break of unknown timing. We use real-time data for the US, Australia, New Zealand and Norway. Our main finding is that the recursive-weight strategy performs well across the real-time data sets, consistently giving well-calibrated forecast densities. The equal-weight strategy generates poorly-calibrated forecast densities for the US and Australian samples. There is little difference between the two strategies for our New Zealand and Norwegian data. We also find that the ensemble modeling approach performs more consistently with real-time data than with revised data in all four countries.
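A hedged sketch of the two combination strategies compared here: equal weights versus recursive weights built from past density-forecast performance. The rule shown (weights proportional to exponentiated cumulative log predictive scores) is one common implementation and is assumed for illustration; the paper's exact weighting scheme and data are not reproduced.

```python
import numpy as np

# Equal-weight vs. recursive-weight density combination.  Recursive weights
# are taken proportional to each model's cumulative log predictive score up
# to the forecast origin -- one common choice, assumed here for illustration.

rng = np.random.default_rng(0)
n_models, n_periods = 5, 40

# Placeholder log predictive scores log p_m(y_t) for each model and period.
log_scores = rng.normal(loc=-1.0, scale=0.3, size=(n_models, n_periods))

equal_weights = np.full(n_models, 1.0 / n_models)

def recursive_weights(log_scores_so_far):
    """Weights from cumulative log scores, normalised via a stable softmax."""
    cum = log_scores_so_far.sum(axis=1)
    w = np.exp(cum - cum.max())
    return w / w.sum()

w_t = recursive_weights(log_scores[:, :20])   # weights for forecasting period 21
print("equal:    ", equal_weights)
print("recursive:", np.round(w_t, 3))
```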
Some Considerations of the Ultimate Spatial Resolution Achievable in Scanning Transmission Electron Microscopy
The fundamental limitations on spatial resolution of X-ray microanalysis in the scanning transmission electron microscope are set by the interrelationships between the gun brightness, operating voltage, probe convergence angle, size and current, specimen thickness, beam broadening, the probability of characteristic and Bremsstrahlung X-ray production and the statistics of the X-ray spectrum. Manipulation of expressions describing these interrelationships leads to equations predicting the optimum probe size and specimen thickness for the best achievable spatial resolution (defined as the diameter of a cylinder containing 90% of the X-ray production) in microscopes fitted with different electron sources and operating at different voltages in foils of various elements. Application of these calculations to the special case of detecting monolayer segregation at grain boundaries results in predictions of the minimum amounts of such segregation that would be observable. It is found, for example, that in a microscope with a field-emission source operating at 500 keV, resolution of < 1 nm is obtainable in an iron foil 20 nm thick, and in this case about 0.001 monolayer of chromium is detectable segregated at grain boundaries. The calculations do not take into account instrumental or experimental problems such as specimen drift, specimen preparation, etc., and represent the basic physical limits of performance of a perfect analytical microscope.
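The abstract's expressions are not reproduced here, but as a rough orientation the snippet below combines a probe diameter with the widely quoted single-scattering beam-broadening estimate of Goldstein et al. in quadrature; the paper's own definition (diameter of the cylinder containing 90% of the X-ray production) will give somewhat different numbers.

```python
import math

# Rough illustration of how probe size and beam broadening set the X-ray
# spatial resolution in a thin foil.  Uses the single-scattering broadening
# estimate b = 625 (Z/E0) sqrt(rho/A) t^(3/2) [cm] combined in quadrature
# with the probe diameter; the paper's 90%-cylinder criterion differs.

def beam_broadening_cm(Z, A, rho, E0_keV, t_cm):
    """Single-scattering beam-broadening estimate, result in cm."""
    return 625.0 * (Z / E0_keV) * math.sqrt(rho / A) * t_cm ** 1.5

def resolution_nm(probe_diameter_nm, Z, A, rho, E0_keV, thickness_nm):
    t_cm = thickness_nm * 1e-7
    b_nm = beam_broadening_cm(Z, A, rho, E0_keV, t_cm) * 1e7
    return math.hypot(probe_diameter_nm, b_nm)   # quadrature combination

# Iron foil (Z=26, A=55.85 g/mol, rho=7.87 g/cm^3), 20 nm thick, 500 keV beam,
# with an assumed 0.5 nm field-emission probe: comes out below 1 nm.
print(resolution_nm(probe_diameter_nm=0.5, Z=26, A=55.85,
                    rho=7.87, E0_keV=500.0, thickness_nm=20.0))
```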
Impact of the UK general elections on total government expenditure cycles: theory and evidence
This paper presents a testable theoretical framework that extends the standard demand-side approach to modeling government expenditure on goods and services. The focus is on the adjustment of expenditure to disequilibria: we investigate whether the adjustment of UK exhaustive government expenditure between 1966 and 2002 to its long-run equilibrium path is symmetric. The evidence points to asymmetric adjustment to the demands of a representative voter over the election cycle but not between Labour and Conservative governments. Convergence to equilibrium is found to be faster during the later stages of each election cycle.
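A sketch of the kind of asymmetric error-correction regression such a test of symmetric adjustment involves, with simulated data and hypothetical variable names; the paper's actual specification and UK sample are not reproduced.

```python
import numpy as np

# Sketch of an asymmetric error-correction regression: the change in
# expenditure responds to positive and negative deviations from its long-run
# equilibrium path with separate adjustment speeds.  Data are simulated.

rng = np.random.default_rng(1)
T = 148                                  # e.g. quarterly observations
ect = rng.normal(size=T)                 # lagged equilibrium-correction term
d_expend = (-0.15 * np.clip(ect, 0, None)    # adjustment to positive gaps
            - 0.40 * np.clip(ect, None, 0)   # faster adjustment to negative gaps
            + rng.normal(scale=0.1, size=T))

X = np.column_stack([np.ones(T),
                     np.clip(ect, 0, None),   # ect_{t-1} when positive, else 0
                     np.clip(ect, None, 0)])  # ect_{t-1} when negative, else 0
coef, *_ = np.linalg.lstsq(X, d_expend, rcond=None)
print("adjustment to positive / negative disequilibria:", coef[1:].round(2))
```

Symmetry then amounts to testing whether the two estimated adjustment coefficients are equal.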
Homelessness in Oxford: risks and opportunities across housing and homeless transitions
This report presents initial findings from CSI’s Homelessness in Oxford project. The project was designed by Dr. Garratt and Dr. Flaherty to be the first systematic attempt to track and understand people’s transitions into and out of different experiences of homelessness in Oxford. The project also explored the roles played by statutory and non-statutory homelessness prevention and relief services in Oxford.
Probing Postmeasurement Entanglement without Postselection
We study the problem of observing quantum collective phenomena emerging from large numbers of measurements. These phenomena are difficult to observe in conventional experiments because, in order to distinguish the effects of measurement from dephasing, it is necessary to postselect on sets of measurement outcomes with Born probabilities that are exponentially small in the number of measurements performed. An unconventional approach, which avoids this exponential “postselection problem”, is to construct cross-correlations between experimental data and the results of simulations on classical computers. However, these cross-correlations generally have no definite relation to physical quantities. We first show how to incorporate classical shadows into this framework, thereby allowing for the construction of quantum information-theoretic cross-correlations. We then identify cross-correlations that both upper and lower bound the measurement-averaged von Neumann entanglement entropy, as well as cross-correlations that lower bound the measurement-averaged purity and entanglement negativity. These bounds show that experiments can be performed to constrain postmeasurement entanglement without the need for postselection. To illustrate our technique, we consider how it could be used to observe the measurement-induced entanglement transition in Haar-random quantum circuits. We use exact numerical calculations as proxies for quantum simulations and, to highlight the fundamental limitations of classical memory, we construct cross-correlations with tensor-network calculations at finite bond dimension. Our results reveal a signature of measurement-induced criticality that can be observed using a quantum simulator in polynomial time and with polynomial classical memory.
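As a pointer to the ingredient named in the abstract, the snippet below sketches single-qubit classical shadows (random Pauli-basis measurements with the standard inverse channel); the cross-correlation bounds on post-measurement entanglement constructed from such shadows are considerably more involved and are not reproduced here.

```python
import numpy as np

# Minimal sketch of single-qubit classical shadows: measure in a random Pauli
# basis and apply the standard inverse channel 3 U^dag |b><b| U - I to each
# outcome.  Averaging the snapshots reproduces the state in expectation.

I2 = np.eye(2, dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)           # X basis
HSdag = np.array([[1, -1j], [1, 1j]], dtype=complex) / np.sqrt(2)     # Y basis
BASIS_ROTATIONS = [H, HSdag, I2]                                      # X, Y, Z

def shadow_snapshot(rho, rng):
    """One classical-shadow snapshot of a single-qubit state rho."""
    U = BASIS_ROTATIONS[rng.integers(3)]
    probs = np.real(np.diag(U @ rho @ U.conj().T))
    b = rng.choice(2, p=probs / probs.sum())
    ket = np.zeros(2, dtype=complex)
    ket[b] = 1.0
    return 3.0 * (U.conj().T @ np.outer(ket, ket.conj()) @ U) - I2

rng = np.random.default_rng(7)
rho = np.array([[0.85, 0.3], [0.3, 0.15]], dtype=complex)             # example state
estimate = np.mean([shadow_snapshot(rho, rng) for _ in range(20000)], axis=0)
print(np.round(estimate, 2))   # converges to rho as the sample grows
```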
Many-body delocalisation as symmetry breaking
We present a framework in which the transition between a many-body localised (MBL) phase and an ergodic one is symmetry breaking. We consider random Floquet spin chains, expressing their averaged spectral form factor (SFF) as a function of time in terms of a transfer matrix that acts in the space direction. The SFF is determined by the leading eigenvalues of this transfer matrix. In the MBL phase the leading eigenvalue is unique, as in a symmetry-unbroken phase, while in the ergodic phase and at late times the leading eigenvalues are asymptotically degenerate, as in a system with degenerate symmetry-breaking phases. We identify the broken symmetry of the transfer matrix, introduce a local order parameter for the transition, and show that the associated correlation functions are long-ranged only in the ergodic phase.
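For orientation, the spectral form factor referred to above is the ensemble average of |Tr U^t|^2 over Floquet operators U. The snippet below estimates it for Haar-random (CUE) unitaries as a simple stand-in; the paper instead studies specific random Floquet spin chains and evaluates the average via a transfer matrix acting in the space direction.

```python
import numpy as np

# Sketch of the averaged spectral form factor K(t) = <|Tr U^t|^2> for an
# ensemble of Floquet operators, using CUE matrices as a stand-in.

def haar_unitary(n, rng):
    """Haar-random n x n unitary via QR decomposition of a Ginibre matrix."""
    z = (rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

rng = np.random.default_rng(3)
N, samples, t_max = 64, 200, 40
sff = np.zeros(t_max)
for _ in range(samples):
    phases = np.angle(np.linalg.eigvals(haar_unitary(N, rng)))
    for t in range(1, t_max + 1):
        sff[t - 1] += np.abs(np.exp(1j * phases * t).sum()) ** 2
sff /= samples
print(sff[:5].round(1))   # for the CUE, K(t) ~ t for 1 <= t <= N (the RMT ramp)
```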
Goldstone modes in the emergent gauge fields of a frustrated magnet
We consider magnon excitations in the spin-glass phase of geometrically frustrated antiferromagnets with weak exchange disorder, focussing on the nearest-neighbour pyrochlore-lattice Heisenberg model at large spin. The low-energy degrees of freedom in this system are represented by three copies of a U(1) emergent gauge field, related by global spin-rotation symmetry. We show that the Goldstone modes associated with spin-glass order are excitations of these gauge fields, and that the standard theory of Goldstone modes in Heisenberg spin glasses (due to Halperin and Saslow) must be modified in this setting.
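For reference, the Halperin-Saslow description mentioned above gives linearly dispersing Goldstone modes with a velocity set by the spin stiffness and the uniform susceptibility; the textbook form is stated below for orientation and is not taken from the paper, whose point is that it must be modified in this setting.

```latex
% Standard Halperin--Saslow hydrodynamics for a Heisenberg spin glass:
% linearly dispersing Goldstone modes with velocity set by the spin
% stiffness \rho_s and uniform susceptibility \chi.
\[
  \omega(k) \simeq c\,k, \qquad c = \sqrt{\rho_s/\chi}.
\]
```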
Local pairing of Feynman histories in many-body Floquet models
We study many-body quantum dynamics using Floquet quantum circuits in one space dimension as simple examples of systems with local interactions that support ergodic phases. Physical properties can be expressed in terms of multiple sums over Feynman histories, which for these models are paths or many-body orbits in Fock space. A natural simplification of such sums is the diagonal approximation, where the only terms that are retained are ones in which each path is paired with a partner that carries the complex conjugate weight. We identify the regime in which the diagonal approximation holds, and the nature of the leading corrections to it. We focus on the behaviour of the spectral form factor (SFF) and of matrix elements of local operators, averaged over an ensemble of random circuits, making comparisons with the predictions of random matrix theory (RMT) and the eigenstate thermalisation hypothesis (ETH). We show that properties are dominated at long times by contributions to orbit sums in which each orbit is paired locally with a conjugate, as in the diagonal approximation, but that in large systems these contributions consist of many spatial domains, with distinct local pairings in neighbouring domains. The existence of these domains is reflected in deviations of the SFF from RMT predictions, and of matrix element correlations from ETH predictions; deviations of both kinds diverge with system size. We demonstrate that our physical picture of orbit-pairing domains has a precise correspondence in the spectral properties of a transfer matrix that acts in the space direction to generate the ensemble-averaged SFF. In addition, we find that domains of a second type control non-Gaussian fluctuations of the SFF. These domains are separated by walls which are related to the entanglement membrane, known to characterise the scrambling of quantum information.
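The diagonal approximation invoked above can be stated for the SFF in the standard way: writing the trace as a sum over Feynman histories (orbits) a with amplitudes A_a and retaining only pairings of each orbit with its complex conjugate. This textbook form is given for orientation; the paper's analysis of local pairings and spatial domains goes beyond it.

```latex
% Diagonal approximation for the spectral form factor: write
% Tr[U(t)] = \sum_a A_a as a sum over Feynman histories (orbits) a,
% then retain only the terms pairing each orbit with its complex conjugate.
\[
  K(t) = \overline{\big|\mathrm{Tr}\,U(t)\big|^{2}}
       = \overline{\sum_{a,b} A_a A_b^{*}}
       \;\approx\; \sum_a \overline{|A_a|^{2}}.
\]
```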
Probing post-measurement entanglement without post-selection
We study the problem of observing quantum collective phenomena emerging from large numbers of measurements. These phenomena are difficult to observe in conventional experiments because, in order to distinguish the effects of measurement from dephasing, it is necessary to post-select on sets of measurement outcomes whose Born probabilities are exponentially small in the number of measurements performed. An unconventional approach, which avoids this exponential 'post-selection problem', is to construct cross-correlations between experimental data and the results of simulations on classical computers. However, these cross-correlations generally have no definite relation to physical quantities. We first show how to incorporate shadow tomography into this framework, thereby allowing for the construction of quantum information-theoretic cross-correlations. We then identify cross-correlations which both upper and lower bound the measurement-averaged von Neumann entanglement entropy. These bounds show that experiments can be performed to constrain post-measurement entanglement without the need for post-selection. To illustrate our technique we consider how it could be used to observe the measurement-induced entanglement transition in Haar-random quantum circuits. We use exact numerical calculations as proxies for quantum simulations and, to highlight the fundamental limitations of classical memory, we construct cross-correlations with tensor-network calculations at finite bond dimension. Our results reveal a signature of measurement-induced criticality that can be observed using a quantum simulator in polynomial time and with polynomial classical memory.
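A back-of-envelope illustration of the exponential post-selection problem described above, assuming each of M binary measurement outcomes has Born probability of order one half:

```python
# To study the state conditioned on one particular record of M binary
# measurement outcomes (each with Born probability ~1/2), an experiment must
# repeat until that same record recurs, which takes of order 2^M runs.

for M in (10, 50, 100, 500):
    expected_repetitions = 2.0 ** M          # ~1/p for a typical record
    print(f"M = {M:4d}: ~{expected_repetitions:.1e} runs to post-select")
```

Cross-correlating experimental data with classical simulations, as in the abstract, avoids this cost because no conditioning on a specific record is required.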