Managing Exchange Rate Volatility: A Comparative Counterfactual Analysis of Singapore 1994 to 2003
The objective of this paper is to see how well Singapore's exchange rate regime has coped with exchange rate volatility before and after the Asian financial crisis. We compare the performance of Singapore's actual regime in minimising the volatility of the nominal effective exchange rate (NEER) and the bilateral rate against the US$ with that of several counterfactual regimes and with the corresponding performance of eight other East Asian countries. In contrast to previous counterfactual exercises, such as Williamson (1998a) and Ohno (1999), which compute the weights for effective exchange rates on the basis of simple bloc aggregates, we apply a more disaggregated methodology using a larger number of trade partners. We also utilize ARCH/GARCH techniques to obtain estimates of heteroskedastic variances that better capture the time-varying characteristics of volatility for the actual and simulated exchange rate regimes. Our findings confirm that Singapore's managed floating exchange rate system has delivered relatively low currency volatility. Although all countries in the sample would gain in volatility reduction from adopting either a unilateral or a common basket peg, particularly post-crisis, these gains are relatively small for Singapore, largely because of its low actual volatility. Finally, there are additional gains for non-dollar peggers from stabilizing intra-East Asian exchange rates against the dollar if they were to adopt a basket peg, especially post-crisis, but the gains for Singapore are again relatively modest.

Keywords: East Asia, exchange rates, counterfactuals.
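The ARCH/GARCH step can be sketched with a minimal GARCH(1,1) variance recursion; the return series and the parameter values (omega, alpha, beta) below are illustrative assumptions, not the paper's estimates:

```python
import numpy as np

def garch11_variance(returns, omega, alpha, beta):
    """Conditional variance path of a GARCH(1,1) model:
    sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]."""
    sigma2 = np.empty(len(returns))
    sigma2[0] = np.var(returns)  # initialise at the sample variance
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

# Synthetic daily log-returns of an exchange rate (illustrative only)
rng = np.random.default_rng(0)
r = 0.01 * rng.standard_normal(500)
sigma2 = garch11_variance(r, omega=1e-6, alpha=0.05, beta=0.90)
```

In practice the parameters would be fitted by maximum likelihood rather than fixed by hand; the recursion itself is what produces the time-varying (heteroskedastic) variance estimates the abstract refers to.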
Direct photons: a nonequilibrium signal of the expanding quark-gluon plasma
Direct photon production from a longitudinally expanding quark-gluon plasma
(QGP) at Relativistic Heavy Ion Collider (RHIC) and Large Hadron Collider (LHC)
energies is studied with a real-time kinetic description that is consistently
incorporated with hydrodynamics. Within Bjorken's hydrodynamical model, energy
nonconserving (anti)quark bremsstrahlung q(\bar{q})\to q(\bar{q})\gamma and
quark-antiquark annihilation q\bar{q}\to \gamma are shown to be the dominant
nonequilibrium effects during the transient lifetime of the QGP. For central
collisions we find a significant excess of direct photons in the range of
transverse momentum 1-2 \lesssim p_T \lesssim 5 GeV/c as compared to
equilibrium results. The photon rapidity distribution exhibits a central
plateau. The transverse momentum distribution at midrapidity falls off with a
{\em power law} p^{-\nu}_T with 2.5 \lesssim \nu \lesssim 3 as a consequence of
these energy nonconserving processes, providing a distinct experimental {\em
nonequilibrium signature}. The power law exponent \nu increases with the
initial temperature of the QGP and hence with the total multiplicity rapidity
distribution dN_\pi/dy.

Comment: LaTeX (elsart.cls), 33 pages, 4 eps figures, updated with data for LHC; to appear in Nucl. Phys.
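A power-law fall-off dN/dp_T \propto p_T^{-\nu} appears as a straight line in log-log space, so the exponent can be read off with a degree-1 least-squares fit; the spectrum below is synthetic and the exponent 2.8 is an assumed value inside the quoted range, not a RHIC or LHC result:

```python
import numpy as np

# Synthetic transverse-momentum spectrum dN/dp_T ~ p_T^(-nu)
# (illustrative noiseless data; nu = 2.8 is an assumed value)
nu_true = 2.8
p_T = np.linspace(1.0, 5.0, 40)  # GeV/c, the window quoted in the abstract
spectrum = p_T ** (-nu_true)

# log N = -nu * log p_T + const, so the slope of the log-log fit is -nu
slope, _intercept = np.polyfit(np.log(p_T), np.log(spectrum), 1)
nu_fit = -slope
```

With real (noisy, binned) data the same fit would be weighted by the statistical errors in each p_T bin.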
Monte Carlo sampling from the quantum state space. II
High-quality random samples of quantum states are needed for a variety of
tasks in quantum information and quantum computation. Searching the
high-dimensional quantum state space for a global maximum of an objective
function with many local maxima or evaluating an integral over a region in the
quantum state space are but two exemplary applications of many. These tasks can
only be performed reliably and efficiently with Monte Carlo methods, which
involve good samplings of the parameter space in accordance with the relevant
target distribution. We show how the Markov-chain Monte Carlo method known as
Hamiltonian Monte Carlo, or hybrid Monte Carlo, can be adapted to this context.
It is applicable when an efficient parameterization of the state space is
available. The resulting random walk is entirely inside the physical parameter
space, and the Hamiltonian dynamics enable us to take big steps, thereby
avoiding strong correlations between successive sample points while enjoying a
high acceptance rate. We use examples of single and double qubit measurements
for illustration.

Comment: 11 pages, 4 figures, 12 references.
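The Hamiltonian (hybrid) Monte Carlo update the paper adapts can be sketched in its generic form; the target here is a toy two-dimensional Gaussian rather than a quantum-state distribution, and the step size and trajectory length are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)

def grad_neg_log_p(x):
    # Toy target: standard 2-D Gaussian, -log p(x) = x.x/2 up to a constant
    return x

def hmc_step(x, eps=0.1, n_leap=20):
    """One Hamiltonian Monte Carlo step with leapfrog integration."""
    p = rng.standard_normal(x.shape)            # resample momentum
    x_new, p_new = x.copy(), p.copy()
    p_new -= 0.5 * eps * grad_neg_log_p(x_new)  # initial half kick
    for _ in range(n_leap - 1):
        x_new += eps * p_new                    # drift
        p_new -= eps * grad_neg_log_p(x_new)    # full kick
    x_new += eps * p_new
    p_new -= 0.5 * eps * grad_neg_log_p(x_new)  # final half kick
    # Metropolis accept/reject on the change in total energy
    h_old = 0.5 * x @ x + 0.5 * p @ p
    h_new = 0.5 * x_new @ x_new + 0.5 * p_new @ p_new
    if rng.random() < np.exp(min(0.0, h_old - h_new)):
        return x_new
    return x

x = np.zeros(2)
samples = []
for _ in range(2000):
    x = hmc_step(x)
    samples.append(x.copy())
samples = np.asarray(samples)
```

The long leapfrog trajectories are what let successive samples take "big steps" with low correlation; adapting this to the quantum state space additionally requires a parameterization that keeps the walk inside the physical region, as the abstract notes.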
Monte Carlo sampling from the quantum state space. I
High-quality random samples of quantum states are needed for a variety of
tasks in quantum information and quantum computation. Searching the
high-dimensional quantum state space for a global maximum of an objective
function with many local maxima or evaluating an integral over a region in the
quantum state space are but two exemplary applications of many. These tasks can
only be performed reliably and efficiently with Monte Carlo methods, which
involve good samplings of the parameter space in accordance with the relevant
target distribution. We show how the standard strategies of rejection sampling,
importance sampling, and Markov-chain sampling can be adapted to this context,
where the samples must obey the constraints imposed by the positivity of the
statistical operator. For a comparison of these sampling methods, we generate
sample points in the probability space for two-qubit states probed with a
tomographically incomplete measurement, and then use the sample for the
calculation of the size and credibility of the recently-introduced optimal
error regions [see New J. Phys. 15 (2013) 123026]. Another illustration is the
computation of the fractional volume of separable two-qubit states.

Comment: 13 pages, 5 figures, 1 table, 26 references.
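The positivity constraint on the statistical operator can be illustrated with a one-qubit toy version of rejection sampling: rho = (I + r.sigma)/2 is positive semidefinite exactly when the Bloch vector satisfies |r| <= 1, so candidates drawn uniformly from the cube are kept only if they fall inside the ball. This single-qubit example is our simplification, not the paper's two-qubit construction:

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_bloch_vectors(n):
    """Rejection sampling of single-qubit states rho = (I + r.sigma)/2.
    Positivity of rho is equivalent to |r| <= 1, so draw r uniformly
    from the cube [-1, 1]^3 and keep only points inside the unit ball."""
    accepted = []
    while len(accepted) < n:
        r = rng.uniform(-1.0, 1.0, size=3)
        if r @ r <= 1.0:  # the positivity constraint
            accepted.append(r)
    return np.asarray(accepted)

bloch = sample_bloch_vectors(1000)

# Any accepted point corresponds to a valid density matrix
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)
rho = 0.5 * (np.eye(2) + bloch[0, 0] * sx + bloch[0, 1] * sy + bloch[0, 2] * sz)
```

The acceptance rate here is the ball-to-cube volume ratio, about 52%; in higher dimensions (e.g. two qubits) that ratio collapses, which is why the paper also considers importance and Markov-chain sampling.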
Collapse of Vacuum Bubbles in a Vacuum
Motivated by the discovery of a plenitude of metastable vacua in a string
landscape and the possibility of rapid tunneling between these vacua, we
revisit the dynamics of a false vacuum bubble in a background de Sitter
spacetime. We find that there exists a large parameter space that allows the
bubble to collapse into a black hole or to form a wormhole. This may have
interesting implications for inflationary physics.

Comment: 8 pages including 6 figures, LaTeX; references added.