Chapter 17 - Economics of adaptation
This chapter assesses the literature on the economics of climate change adaptation, building on the Fourth Assessment Report (AR4) and the increasing role that economic considerations play in adaptation decision-making and policy. AR4 provided a limited assessment of the costs and benefits of adaptation, based on a narrow and fragmented sectoral and regional literature (Adger et al., 2007). Substantial advances have been made in the economics of climate change adaptation since AR4.
Localization dynamics in a binary two-dimensional cellular automaton: the Diffusion Rule
We study a two-dimensional cellular automaton (CA), called Diffusion Rule
(DR), which exhibits diffusion-like dynamics of propagating patterns. In
computational experiments we discover a wide range of mobile and stationary
localizations (gliders, oscillators, glider guns, puffer trains, etc), analyze
spatio-temporal dynamics of collisions between localizations, and discuss
possible applications in unconventional computing.
Comment: Accepted to the Journal of Cellular Automata.
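The pattern searches described in this abstract reduce to iterating a synchronous two-state CA on a lattice and watching for persistent localizations. A minimal Python sketch of such an update, using a placeholder outer-totalistic birth/survival rule for illustration (the actual Diffusion Rule transition function is the one defined in the paper, not this one):

```python
import numpy as np

def step(grid, birth={2}, survive={7}):
    """One synchronous update of a binary 2D CA with a Moore neighbourhood.

    The birth/survive sets are illustrative placeholders, not the paper's
    actual Diffusion Rule transition function.
    """
    # Count the eight Moore neighbours with periodic boundary conditions.
    n = sum(np.roll(np.roll(grid, di, 0), dj, 1)
            for di in (-1, 0, 1) for dj in (-1, 0, 1)
            if (di, dj) != (0, 0))
    born = (grid == 0) & np.isin(n, list(birth))
    stays = (grid == 1) & np.isin(n, list(survive))
    return (born | stays).astype(np.uint8)

rng = np.random.default_rng(0)
g = rng.integers(0, 2, size=(32, 32), dtype=np.uint8)
for _ in range(10):
    g = step(g)
```

Gliders and oscillators are then found by running such a loop from random seeds and testing whether a pattern recurs, possibly translated, after some period.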
Personal identity (de)formation among lifestyle travellers: A double-edged sword?
This article explores the personal identity work of lifestyle travellers – individuals for whom extended leisure travel is a preferred lifestyle that they return to repeatedly. Qualitative findings from in-depth semi-structured interviews with lifestyle travellers in northern India and southern Thailand are interpreted in light of theories on identity formation in late modernity that position identity as problematic. It is suggested that extended leisure travel can provide exposure to varied cultural praxes that may contribute to a sense of social saturation. Whilst a minority of the respondents embraced a saturation of personal identity in the subjective formation of a cosmopolitan cultural identity, several of the respondents were paradoxically left with more identity questions than answers as the result of their travels
Electronic structure of MgB2: X-ray emission and absorption studies
Measurements of x-ray emission and absorption spectra of the constituents of MgB2 are presented. The results obtained are in good agreement with calculated x-ray spectra, with dipole matrix elements taken into account. The comparison of the x-ray emission spectra of graphite, AlB2, and MgB2 on the binding-energy scale supports the idea of charge transfer from σ to π bands, which creates holes at the top of the bonding σ bands and drives the high-Tc superconductivity.
Comment: final version as published in PR
Tests of the random phase approximation for transition strengths
We investigate the reliability of transition strengths computed in the
random-phase approximation (RPA), comparing with exact results from
diagonalization in full shell-model spaces. The RPA and
shell-model results are in reasonable agreement for most transitions; however
some very low-lying collective transitions, such as isoscalar quadrupole, are
in serious disagreement. We suggest the failure lies with incomplete
restoration of broken symmetries in the RPA. Furthermore we prove, analytically
and numerically, that standard statements regarding the energy-weighted sum
rule in the RPA do not hold if an exact symmetry is broken.
Comment: 11 pages, 7 figures; Appendix added with new proof regarding violation of the energy-weighted sum rule.
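The energy-weighted sum rule at issue is, for a transition operator $\hat F$ and exact eigenstates $|f\rangle$ of $\hat H$, the double-commutator identity

```latex
\sum_f \left(E_f - E_0\right)\,\bigl|\langle f|\hat F|0\rangle\bigr|^2
  = \tfrac{1}{2}\,\langle 0|\,[\hat F,[\hat H,\hat F]]\,|0\rangle .
```

Thouless's theorem states that the RPA reproduces this identity with the mean-field ground state on the right-hand side; the result quoted above is that this standard statement fails once the mean field spontaneously breaks an exact symmetry of $\hat H$.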
Quantized W-algebra of sl(2,1) and quantum parafermions of U_q(sl(2))
In this paper, we establish the connection between the quantized W-algebra of sl(2,1) and the quantum parafermions of U_q(sl(2)): a shifted product of the two quantum parafermions of U_q(sl(2)) generates the quantized W-algebra of sl(2,1).
The low-lying excitations of polydiacetylene
The Pariser-Parr-Pople Hamiltonian is used to calculate and identify the
nature of the low-lying vertical transition energies of polydiacetylene. The
model is solved using the density matrix renormalisation group method for a
fixed acetylenic geometry for chains of up to 102 atoms. The non-linear optical
properties of polydiacetylene are considered, which are determined by the
third-order susceptibility. The experimental 1Bu data of Giesa and Schultz are
used as the geometric model for the calculation. For short chains, the
calculated E(1Bu) agrees with the experimental value, within solvation effects
(ca. 0.3 eV). The charge gap is used to characterise bound and unbound states.
The nBu is above the charge gap and hence a continuum state; the 1Bu, 2Ag and
mAg are not and hence are bound excitons. For large chain lengths, the nBu
tends towards the charge gap as expected, strongly suggesting that the nBu is
the conduction band edge. The conduction band edge for PDA is agreed in the
literature to be ca. 3.0 eV. Accounting for the strong polarisation effects of
the medium and polaron formation gives our calculated E(nBu) ca. 3.6 eV, with
an exciton binding energy of ca. 1.0 eV. The 2Ag state is found to be above the
1Bu, which does not agree with relaxed transition experimental data. However,
this could be resolved by including explicit lattice relaxation in the Pariser-
Parr-Pople-Peierls model. Particle-hole separation data further suggest that
the 1Bu, 2Ag and mAg are bound excitons, and that the nBu is an unbound
exciton.
Comment: LaTeX, 23 pages, 4 postscript tables and 8 postscript figures.
Selective quantum evolution of a qubit state due to continuous measurement
We consider a two-level quantum system (qubit) which is continuously measured
by a detector. The information provided by the detector is taken into account
to describe the evolution during a particular realization of the measurement process. We discuss the Bayesian formalism for such "selective" evolution of an individual qubit and apply it to several solid-state setups. In particular, we show how to suppress qubit decoherence using continuous measurement and a feedback loop.
Comment: 15 pages (including 9 figures).
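In the Bayesian formalism, the "selective" evolution is a quantum Bayes update of the density matrix conditioned on the detector record. A minimal single-step sketch for an ideal detector with Gaussian readout statistics (the mean outputs r0, r1 and noise width sigma are illustrative parameters, not values from the paper):

```python
import numpy as np

def bayesian_update(rho, r, r0=-1.0, r1=1.0, sigma=2.0):
    """One Bayesian update of a qubit density matrix given a single
    integrated detector readout r (ideal detector, no extra dephasing).

    r0 and r1 are the mean detector outputs for the two qubit states,
    sigma the readout noise width; all three are illustrative.
    """
    # Likelihoods of the readout given each qubit basis state.
    p0 = np.exp(-(r - r0) ** 2 / (2 * sigma ** 2))
    p1 = np.exp(-(r - r1) ** 2 / (2 * sigma ** 2))
    norm = rho[0, 0].real * p0 + rho[1, 1].real * p1
    out = np.empty((2, 2), dtype=complex)
    out[0, 0] = rho[0, 0] * p0 / norm                 # classical Bayes rule
    out[1, 1] = rho[1, 1] * p1 / norm
    out[0, 1] = rho[0, 1] * np.sqrt(p0 * p1) / norm   # coherence update
    out[1, 0] = out[0, 1].conjugate()
    return out

rho = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)  # |+> state
rho = bayesian_update(rho, r=0.3)
```

For an ideal detector this update keeps a pure state pure, which is the property the decoherence-suppression feedback scheme exploits.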
Towards Machine Wald
The past century has seen a steady increase in the need to estimate and predict complex systems and to make (possibly critical) decisions with
limited information. Although computers have made possible the numerical
evaluation of sophisticated statistical models, these models are still designed
\emph{by humans} because there is currently no known recipe or algorithm for
dividing the design of a statistical model into a sequence of arithmetic
operations. Indeed, enabling computers to \emph{think} as \emph{humans} do when faced with uncertainty is challenging in several major ways: (1) Finding optimal statistical models has yet to be formulated as a well-posed
problem when information on the system of interest is incomplete and comes in
the form of a complex combination of sample data, partial knowledge of
constitutive relations and a limited description of the distribution of input
random variables. (2) The space of admissible scenarios, along with the space of relevant information, assumptions, and/or beliefs, tends to be infinite-dimensional, whereas calculus on a computer is necessarily discrete and finite.
To this end, this paper explores the foundations of a rigorous framework
for the scientific computation of optimal statistical estimators/models and
reviews their connections with Decision Theory, Machine Learning, Bayesian
Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty
Quantification and Information Based Complexity.
Comment: 37 pages.
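The Wald of the title refers to statistical decision theory, in which estimators are ranked by their worst-case risk over admissible scenarios. A toy numerical instance of that minimax criterion for a Bernoulli parameter (the estimator choice and squared-error loss are illustrative, not the paper's framework):

```python
import numpy as np

def risk(estimator, p, n, trials=20000):
    """Monte-Carlo squared-error risk of an estimator of a Bernoulli
    parameter p, given the count of successes in n samples."""
    rng = np.random.default_rng(0)  # fixed seed: deterministic comparison
    k = rng.binomial(n, p, size=trials)
    return float(np.mean((estimator(k, n) - p) ** 2))

def mle(k, n):
    return k / n

def shrink(k, n):
    # Classical minimax estimator for squared-error loss.
    return (k + np.sqrt(n) / 2) / (n + np.sqrt(n))

n = 25
ps = np.linspace(0.0, 1.0, 21)  # the "admissible scenarios"
worst = {name: max(risk(est, p, n) for p in ps)
         for name, est in [("mle", mle), ("minimax", shrink)]}
best = min(worst, key=worst.get)  # Wald's criterion: smallest worst-case risk
```

The maximum-likelihood estimator has lower risk near the boundary but a larger worst case, so the minimax criterion selects the shrinkage estimator; automating the construction of such optimal estimators is the programme the paper outlines.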