What Are the New Implications of Chaos for Unpredictability?
From the beginning of chaos research until today, the unpredictability of
chaos has been a central theme. It is widely believed and claimed by
philosophers, mathematicians and physicists alike that chaos has a new
implication for unpredictability, meaning that chaotic systems are
unpredictable in a way that other deterministic systems are not. Hence one
might expect that the question 'What are the new implications of chaos for
unpredictability?' has already been answered in a satisfactory way. However,
this is not the case. I will critically evaluate the existing answers and argue
that they do not fit the bill. Then I will approach this question by showing
that chaos can be defined via mixing, a definition that has not previously been argued for explicitly. Based on this insight, I will propose that the sought-after new implication of chaos for unpredictability is the following: for predicting any event, all sufficiently past events are approximately probabilistically irrelevant.
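For concreteness, the mixing property underlying this claim can be stated as follows; this is the standard measure-theoretic formulation, which may differ in detail from the paper's:

```latex
% Strong mixing for a measure-preserving system (X, \Sigma, \mu, T):
\lim_{n\to\infty} \mu\!\left(T^{-n}A \cap B\right) = \mu(A)\,\mu(B)
\qquad \text{for all } A, B \in \Sigma .
% Equivalently, for \mu(B) > 0,
\lim_{n\to\infty} \mu\!\left(T^{-n}A \mid B\right) = \mu(A) :
% conditioning on an event B lying sufficiently far in the past changes
% the probability of the predicted event A by an arbitrarily small amount.
```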
Information measures and classicality in quantum mechanics
We study information measures in quantum mechanics, with particular emphasis on providing a quantification of the notions of classicality and predictability. Our primary tool is the Shannon-Wehrl entropy I. We give a precise criterion for phase space classicality and argue that, in view of this, (a) I provides a measure of the degree of deviation from classicality for closed systems, and (b) I - S (with S the von Neumann entropy) plays the same role for open systems. We examine particular examples in non-relativistic quantum mechanics. Finally (this being one of our main motivations), we comment on field classicalisation in early universe cosmology.
Comment: 35 pages, LaTeX
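For reference, a minimal sketch of the Shannon-Wehrl entropy in one common convention (the paper's normalization may differ):

```latex
% Husimi distribution of a state \rho with respect to coherent states |p,q>:
Q(p,q) = \frac{1}{2\pi\hbar}\,\langle p,q|\,\rho\,|p,q\rangle ,
\qquad \int Q(p,q)\,dp\,dq = 1 .
% Shannon--Wehrl entropy (up to convention-dependent additive constants):
I = -\int Q(p,q)\,\ln Q(p,q)\,dp\,dq .
```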
Decoherence and classical predictability of phase space histories
We consider the decoherence of phase space histories in a class of quantum
Brownian motion models, consisting of a particle moving in a potential in interaction with a heat bath at temperature T and with dissipation gamma, in
the Markovian regime. The evolution of the density operator for this open
system is thus described by a non-unitary master equation. The phase space
histories of the system are described by a class of quasiprojectors.
Generalizing earlier results of Hagedorn and Omn\`es, we show that a projector onto a phase space cell is approximately evolved under the master equation into another projector onto the classical dissipative evolution of that cell, with a certain amount of degradation due
to the noise produced by the environment. We thus show that histories of phase
space samplings approximately decohere, and that the probabilities for these
histories are peaked about classical dissipative evolution, with a width of
peaking depending on the size of the noise.
Comment: 34 pages, LaTeX, revised version to avoid LaTeX error
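The abstract does not display the master equation itself; a standard choice for this class of quantum Brownian motion models in the Markovian (high-temperature) regime is the Caldeira-Leggett equation, with H = p^2/2M + V(x):

```latex
% Markovian (high-temperature) master equation for quantum Brownian motion
% in the standard Caldeira--Leggett form, with H = p^2/2M + V(x):
\frac{\partial \rho}{\partial t}
  = -\frac{i}{\hbar}\,[H,\rho]
    - \frac{i\gamma}{\hbar}\,[x,\{p,\rho\}]
    - \frac{2M\gamma k_{B} T}{\hbar^{2}}\,[x,[x,\rho]] .
```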
Kernel Analog Forecasting: Multiscale Test Problems
Data-driven prediction is becoming increasingly widespread as the volume of
data available grows and as algorithmic development matches this growth. The
nature of the predictions made, and the manner in which they should be
interpreted, depends crucially on the extent to which the variables chosen for
prediction are Markovian, or approximately Markovian. Multiscale systems
provide a framework in which this issue can be analyzed. In this work kernel
analog forecasting methods are studied from the perspective of data generated
by multiscale dynamical systems. The problems chosen exhibit a variety of
different Markovian closures, using both averaging and homogenization;
furthermore, settings where scale separation is not present and the predicted variables are non-Markovian are also considered. The studies provide guidance
for the interpretation of data-driven prediction methods when used in practice.
Comment: 30 pages, 14 figures; clarified several ambiguous parts, added references, and a comparison with Lorenz' original method (Sec. 4.5)
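As a minimal illustration of the method family studied here, the following sketch contrasts Lorenz's single-analog forecast with a kernel-weighted ensemble; all names and the Gaussian kernel choice are illustrative, not taken from the paper:

```python
import numpy as np

def analog_forecast(history, future, x0, bandwidth=1.0, single_analog=False):
    """Forecast the quantity paired with initial data x0.

    history : (N, d) array of past states; future : (N,) array holding, for
    each historical state, the value observed some fixed lead time later.
    Names and the Gaussian kernel are illustrative, not from the paper.
    """
    d2 = np.sum((history - x0) ** 2, axis=1)     # squared distances to analogs
    if single_analog:                            # Lorenz's original method:
        return future[np.argmin(d2)]             # follow the closest analog
    w = np.exp(-d2 / bandwidth ** 2)             # kernel similarity weights
    return np.sum(w * future) / np.sum(w)        # weighted ensemble of analogs
```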
Variational Inference of Disentangled Latent Concepts from Unlabeled Observations
Disentangled representations, where the higher level data generative factors
are reflected in disjoint latent dimensions, offer several benefits such as
ease of deriving invariant representations, transferability to other tasks,
interpretability, etc. We consider the problem of unsupervised learning of
disentangled representations from a large pool of unlabeled observations, and
propose a variational inference based approach to infer disentangled latent
factors. We introduce a regularizer on the expectation of the approximate
posterior over observed data that encourages the disentanglement. We also
propose a new disentanglement metric which is better aligned with the
qualitative disentanglement observed in the decoder's output. We empirically
observe significant improvement over existing methods in terms of both
disentanglement and data likelihood (reconstruction quality).
Comment: ICLR 2018 Version
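A minimal sketch of a covariance-based regularizer in the spirit described here, penalizing correlated latents in the aggregate approximate posterior; the paper's exact estimator and weights may differ:

```python
import numpy as np

def covariance_regularizer(mu, lam_offdiag=10.0, lam_diag=5.0):
    """Penalty pushing the aggregate posterior toward decorrelated latents.

    mu : (batch, latent_dim) array of encoder means. A sketch in the spirit
    of the abstract's regularizer on the expected approximate posterior;
    the paper's exact estimator and hyperparameters may differ.
    """
    mu_c = mu - mu.mean(axis=0, keepdims=True)
    cov = mu_c.T @ mu_c / (mu.shape[0] - 1)      # covariance over the batch
    diag = np.diag(cov)
    offdiag = cov - np.diag(diag)
    # off-diagonals -> 0 (disentangled axes), diagonals -> 1 (match the prior)
    return lam_offdiag * np.sum(offdiag ** 2) + lam_diag * np.sum((diag - 1.0) ** 2)
```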
Evaluating the Usability of Automatically Generated Captions for People who are Deaf or Hard of Hearing
The accuracy of Automated Speech Recognition (ASR) technology has improved,
but it is still imperfect in many settings. Researchers who evaluate ASR
performance often focus on improving the Word Error Rate (WER) metric, but WER
has been found to have little correlation with human-subject performance on
many applications. We propose a new captioning-focused evaluation metric that
better predicts the impact of ASR recognition errors on the usability of
automatically generated captions for people who are Deaf or Hard of Hearing
(DHH). Through a user study with 30 DHH users, we compared our new metric with
the traditional WER metric on a caption usability evaluation task. In a
side-by-side comparison of pairs of ASR text output (with identical WER), the
texts preferred by our new metric were preferred by DHH participants. Further,
our metric had significantly higher correlation with DHH participants'
subjective scores on the usability of a caption, as compared to the correlation between the WER metric and participants' subjective scores. This new metric could be
used to select ASR systems for captioning applications, and it may be a better
metric for ASR researchers to consider when optimizing ASR systems.
Comment: 10 pages, 8 figures; published in ACM SIGACCESS Conference on Computers and Accessibility (ASSETS '17)
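The abstract does not specify the new metric's formula; for reference, the WER baseline it is compared against is the word-level edit distance normalized by reference length, e.g.:

```python
import numpy as np

def word_error_rate(reference, hypothesis):
    """Standard WER: word-level edit distance / reference length (reference non-empty)."""
    r, h = reference.split(), hypothesis.split()
    d = np.zeros((len(r) + 1, len(h) + 1), dtype=int)
    d[:, 0] = np.arange(len(r) + 1)
    d[0, :] = np.arange(len(h) + 1)
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            cost = 0 if r[i - 1] == h[j - 1] else 1   # substitution cost
            d[i, j] = min(d[i - 1, j] + 1,            # deletion
                          d[i, j - 1] + 1,            # insertion
                          d[i - 1, j - 1] + cost)     # match / substitution
    return d[len(r), len(h)] / len(r)
```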
Scheduling of data-intensive workloads in a brokered virtualized environment
Providing performance predictability guarantees is increasingly important in cloud platforms, especially for data-intensive applications, for which performance depends greatly on the available rates of data transfer between the various computing/storage hosts underlying the virtualized resources assigned to the application. With the increased prevalence of brokerage services in cloud platforms, there is a need for resource management solutions that consider the brokered nature of these workloads, as well as the special demands of their interdependent components. In this paper, we present an offline mechanism for scheduling batches of brokered data-intensive workloads, which can be extended to an online setting. The objective of the mechanism is to decide on a packing of the workloads in a batch that minimizes the broker's incurred costs. Moreover, considering the brokered nature of such workloads, we define a payment model that provides incentives for these workloads to be scheduled as part of a batch, which we analyze theoretically. Finally, we evaluate the proposed scheduling algorithm, and exemplify the fairness of the payment model in practical settings via trace-based experiments.
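The scheduling mechanism itself is not detailed in the abstract; purely as a toy illustration of cost-minimizing batch packing (first-fit decreasing, with invented capacity semantics):

```python
def greedy_pack(workloads, host_capacity):
    """Toy first-fit-decreasing packing of workloads onto hosts.

    workloads : list of data-transfer demands, each <= host_capacity.
    Purely illustrative; the paper's mechanism, cost model, and payment
    scheme are more elaborate and are not reproduced here.
    """
    hosts = []                               # remaining capacity per open host
    for w in sorted(workloads, reverse=True):
        for i, free in enumerate(hosts):
            if w <= free:                    # first host that still fits
                hosts[i] -= w
                break
        else:
            hosts.append(host_capacity - w)  # open a new host (adds cost)
    return len(hosts)                        # fewer hosts ~ lower broker cost
```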
Analog Forecasting with Dynamics-Adapted Kernels
Analog forecasting is a nonparametric technique introduced by Lorenz in 1969
which predicts the evolution of states of a dynamical system (or observables
defined on the states) by following the evolution of the sample in a historical
record of observations which most closely resembles the current initial data.
Here, we introduce a suite of forecasting methods which improve traditional
analog forecasting by combining ideas from kernel methods developed in harmonic analysis and machine learning with state-space reconstruction for dynamical
systems. A key ingredient of our approach is to replace single-analog
forecasting with weighted ensembles of analogs constructed using local
similarity kernels. The kernels used here employ a number of dynamics-dependent
features designed to improve forecast skill, including Takens' delay-coordinate
maps (to recover information in the initial data lost through partial
observations) and a directional dependence on the dynamical vector field
generating the data. Mathematically, our approach is closely related to kernel
methods for out-of-sample extension of functions, and we discuss alternative
strategies based on the Nystr\"om method and the multiscale Laplacian pyramids
technique. We illustrate these techniques in applications to forecasting in a
low-order deterministic model for atmospheric dynamics with chaotic
metastability, and interannual-scale forecasting in the North Pacific sector of
a comprehensive climate model. We find that forecasts based on kernel-weighted
ensembles have significantly higher skill than the conventional approach
following a single analog.
Comment: submitted to Nonlinearity
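A minimal sketch of the core construction, combining a Takens delay-coordinate map with kernel-weighted analog ensembles; the dynamics-dependent kernel features (e.g., the vector-field direction term) are omitted, and all parameter names are illustrative:

```python
import numpy as np

def delay_embed(series, q):
    """Takens delay-coordinate map: rows are q consecutive observations."""
    return np.stack([series[i:len(series) - q + i + 1] for i in range(q)], axis=1)

def kernel_ensemble_forecast(series, lead, x_recent, bandwidth=1.0):
    """Kernel-weighted analog forecast from a 1-D historical record.

    x_recent : the last q observations (oldest first) of the trajectory to
    forecast `lead` steps ahead. Illustrative only: the paper's kernels add
    dynamics-dependent features that this sketch omits.
    """
    q = len(x_recent)
    X = delay_embed(np.asarray(series), q)       # analogs in delay space
    X, future = X[:-lead], np.asarray(series)[q - 1 + lead:]  # values `lead` steps ahead
    d2 = np.sum((X - np.asarray(x_recent)) ** 2, axis=1)
    w = np.exp(-d2 / bandwidth ** 2)             # local similarity kernel
    return np.sum(w * future) / np.sum(w)        # weighted ensemble forecast
```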
Prediction of invasion from the early stage of an epidemic
Predictability of undesired events is a question of great interest in many
scientific disciplines including seismology, economy, and epidemiology. Here,
we focus on the predictability of invasion of a broad class of epidemics caused
by diseases that lead to permanent immunity of infected hosts after recovery or
death. We approach the problem from the perspective of the science of
complexity by proposing and testing several strategies for the estimation of
important characteristics of epidemics, such as the probability of invasion.
Our results suggest that parsimonious approximate methodologies may lead to the
most reliable and robust predictions. The proposed methodologies are first applied to the analysis of experimentally observed epidemics: invasion of the
fungal plant pathogen \emph{Rhizoctonia solani} in replicated host microcosms.
We then consider numerical experiments of the SIR
(susceptible-infected-removed) model to investigate the performance of the
proposed methods in further detail. The suggested framework can be used as a valuable tool for the quick assessment of epidemic threat at the stage when an epidemic is only starting to develop. Moreover, our work highlights the significance of small-scale and finite-time microcosm realizations of epidemics, revealing their predictive power.
Comment: Main text: 18 pages, 7 figures. Supporting information: 21 pages, 8 figures
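As a concrete illustration of estimating an invasion probability in the SIR setting, a Monte Carlo sketch over the embedded jump chain of the stochastic SIR model; the invasion threshold and criterion here are invented, not the paper's:

```python
import numpy as np

def invasion_probability(beta, gamma, n_hosts, n_runs=1000, frac=0.1, seed=0):
    """Monte Carlo estimate of P(invasion) for a stochastic SIR model.

    Simulates the embedded jump chain of the Markovian SIR process and
    (illustratively) declares invasion when the final epidemic size exceeds
    frac * n_hosts; the paper's invasion criteria may differ.
    """
    rng = np.random.default_rng(seed)
    invasions = 0
    for _ in range(n_runs):
        s, i, removed = n_hosts - 1, 1, 0        # one initial infective
        while i > 0:
            rate_inf = beta * s * i / n_hosts    # infection rate S + I -> 2I
            rate_rec = gamma * i                 # removal rate I -> R
            if rng.random() < rate_inf / (rate_inf + rate_rec):
                s, i = s - 1, i + 1              # infection event
            else:
                i, removed = i - 1, removed + 1  # removal event
        invasions += removed > frac * n_hosts
    return invasions / n_runs
```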
On the selection of preferred consistent sets
The theme of this paper is the multiplicity of the consistent sets appearing
in the consistent histories approach to quantum mechanics. We propose one
criterion for choosing preferred families among them: that the physically
realizable quasiclassical domain ought to be one corresponding to classical
histories. We examine the way classical mechanics arises as a particular window
and the important role played by the canonical group and the Hamiltonian. We
finally discuss the possible implications of having a selection criterion in general, and of our criterion in particular.
Comment: 14 pages
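For reference, the consistency (decoherence) condition at issue, in standard notation:

```latex
% Decoherence functional for histories \alpha = (\alpha_1, \dots, \alpha_n)
% built from projectors P^k_{\alpha_k} at times t_1 < \dots < t_n:
D(\alpha, \alpha') = \mathrm{Tr}\!\left( C_\alpha\, \rho\, C_{\alpha'}^{\dagger} \right),
\qquad
C_\alpha = P^{n}_{\alpha_n}(t_n) \cdots P^{1}_{\alpha_1}(t_1) .
% A set of histories is consistent when the interference terms vanish,
D(\alpha, \alpha') \approx 0 \quad (\alpha \neq \alpha') ,
% so that p(\alpha) = D(\alpha, \alpha) defines additive probabilities.
```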