Coarse-Graining Auto-Encoders for Molecular Dynamics
Molecular dynamics simulations provide theoretical insight into the
microscopic behavior of materials in condensed phase and, as a predictive tool,
enable computational design of new compounds. However, because of the large
temporal and spatial scales involved in thermodynamic and kinetic phenomena in
materials, atomistic simulations are often computationally infeasible.
Coarse-graining methods allow simulating larger systems, by reducing the
dimensionality of the simulation, and propagating longer timesteps, by
averaging out fast motions. Coarse-graining involves two coupled learning
problems: defining the mapping from an all-atom to a reduced representation,
and parametrizing a Hamiltonian over the coarse-grained coordinates.
Multiple statistical mechanics approaches have addressed the latter, but the
former is generally a hand-tuned process based on chemical intuition. Here we
present Autograin, an optimization framework based on auto-encoders to learn
both tasks simultaneously. Autograin is trained to learn the optimal mapping
between the all-atom and reduced representations, using the reconstruction loss
to facilitate the learning of coarse-grained variables. In addition, a
force-matching method is applied to variationally determine the coarse-grained
potential energy function. This procedure is tested on a number of model
systems including single-molecule and bulk-phase periodic simulations.
Comment: 8 pages, 6 figures
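The coupled setup the abstract describes can be sketched numerically. This is a hedged illustration, not the paper's implementation: it assumes a linear encoder (a row-stochastic atom-to-bead assignment), a pseudo-inverse back-mapping as the decoder, and random coordinates; all names and shapes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_atoms, n_beads, dim = 6, 2, 3

# Encoder: row-stochastic assignment of atoms to beads (a CG mapping).
E = rng.random((n_beads, n_atoms))
E /= E.sum(axis=1, keepdims=True)

# Decoder: pseudo-inverse back-mapping from bead to atom coordinates.
D = np.linalg.pinv(E)

x = rng.standard_normal((n_atoms, dim))  # all-atom configuration
z = E @ x                                # coarse-grained coordinates
x_hat = D @ z                            # reconstructed configuration

# Reconstruction loss: the quantity an auto-encoder would minimize when
# learning the mapping E (here E is fixed, so this is a single evaluation).
recon_loss = np.mean((x - x_hat) ** 2)
```

In the actual framework both the mapping and the coarse-grained potential would be optimized jointly (the latter by force matching); here only the encode/decode round trip is shown.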
Pre-big-bang black-hole remnants and the past low entropy
Dark matter could be composed of black-hole remnants formed before the
big-bang era in a bouncing cosmology. This hypothetical scenario has major
implications for the arrow of time: it would upset the common attribution of
past low entropy to the state of the geometry, and provide a concrete
realisation of the perspectival interpretation of past low entropy.
Sharp and fuzzy observables on effect algebras
Observables on effect algebras and their fuzzy versions obtained by means of
confidence measures (Markov kernels) are studied. It is shown that, on effect
algebras with the (E)-property, given an observable and a confidence measure,
there exists a fuzzy version of the observable. Ordering of observables
according to their fuzzy properties is introduced, and some minimality
conditions with respect to this ordering are found. Applications of some
results of the classical theory of experiments are considered.
Comment: 23 pages
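The fuzzification the abstract mentions can be illustrated in the simplest discrete case. This is a hedged sketch under illustrative assumptions (a three-outcome observable and a hand-picked kernel, neither taken from the paper): a confidence measure acts as a Markov kernel that smears each sharp outcome into a distribution over outcomes.

```python
import numpy as np

# Sharp observable: outcome probabilities for some fixed state.
p_sharp = np.array([0.2, 0.5, 0.3])

# Confidence measure as a Markov kernel: K[i, j] is the probability of
# registering outcome j when the sharp outcome is i (rows sum to 1).
K = np.array([
    [0.8, 0.2, 0.0],
    [0.1, 0.8, 0.1],
    [0.0, 0.2, 0.8],
])

# Fuzzy version of the observable: sharp statistics pushed through the kernel.
p_fuzzy = p_sharp @ K
```

The fuzzy statistics remain a probability distribution, but are "smeared" relative to the sharp ones, which is the ordering-by-fuzziness idea in miniature.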
Global coherence of quantum evolutions based on decoherent histories: theory and application to photosynthetic quantum energy transport
Assessing the role of interference in natural and artificial quantum
dynamical processes is a crucial task in quantum information theory. To this
aim, an appropriate formalism is provided by the decoherent histories framework.
While this approach has been deeply explored from different theoretical
perspectives, it still lacks a comprehensive set of tools able to concisely
quantify the amount of coherence developed by a given dynamics. In this paper
we introduce and test different measures of the (average) coherence present in
dissipative (Markovian) quantum evolutions, at various time scales and for
different levels of environmentally induced decoherence. In order to show the
effectiveness of the introduced tools, we apply them to a paradigmatic quantum
process where the role of coherence is being hotly debated: exciton transport
in photosynthetic complexes. To identify the essential features that may
determine the performance of the transport we focus on a relevant trimeric
subunit of the FMO complex and we use a simplified (Haken-Strobl) model for the
system-bath interaction. Our analysis illustrates how the high efficiency of
environmentally assisted transport can be traced back to a quantum
recoil-avoiding effect on the exciton dynamics, which preserves and sustains the
benefits of the initial fast quantum delocalization of the exciton over the
network. Indeed, for intermediate levels of decoherence, the bath is seen to
selectively kill the negative interference between different exciton pathways,
while retaining the initial positive one. The concepts and tools here developed
show how the decoherent histories approach can be used to quantify the relation
between coherence and efficiency in quantum dynamical processes.
Comment: 13 pages, 9 figures
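The interplay of coherence and decoherence the abstract studies can be illustrated with a standard (and much simpler) quantity than the paper's history-based measures: the l1-norm of coherence, the sum of absolute off-diagonal elements of a density matrix in a fixed basis. The dephasing model and the example state below are illustrative assumptions, not the paper's Haken-Strobl treatment.

```python
import numpy as np

def l1_coherence(rho):
    """Sum of |off-diagonal| entries of a density matrix rho."""
    return np.sum(np.abs(rho)) - np.sum(np.abs(np.diag(rho)))

def dephase(rho, gamma):
    """Scale off-diagonal terms by (1 - gamma): a toy decoherence channel."""
    diag = np.diag(np.diag(rho))
    return diag + (1.0 - gamma) * (rho - diag)

# Maximally coherent qubit state |+><+| in the computational basis.
plus = np.array([[0.5, 0.5],
                 [0.5, 0.5]])

c_initial = l1_coherence(plus)              # fully coherent: 1.0
c_partial = l1_coherence(dephase(plus, 0.5))  # partial decoherence: 0.5
```

Intermediate values of `gamma` leave some off-diagonal structure intact, which is the regime where, in the paper's analysis, the environment can selectively suppress destructive interference while retaining constructive interference.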
Asymmetry, abstraction and autonomy: justifying coarse-graining in statistical mechanics
Whilst the fundamental laws of physics are time-reversal invariant, most macroscopic processes are irreversible. Given that the fundamental laws are taken to underpin all other processes, how can the fundamental time-symmetry be reconciled with the asymmetry manifest elsewhere?
In statistical mechanics, progress can be made with this question; what I dub the Zwanzig-Zeh-Wallace framework can be used to construct the irreversible equations of statistical mechanics from the underlying microdynamics. Yet this framework uses coarse-graining, a procedure that has faced much criticism.
I focus on two objections in the literature: claims that coarse-graining makes time-asymmetry (i) `illusory' and (ii) `anthropocentric'. I argue that these objections arise from an unsatisfactory justification of coarse-graining prevalent in the literature, rather than from coarse-graining itself. This justification relies on the idea of measurement imprecision.
By considering the role that abstraction and autonomy play, I provide an alternative justification and offer replies to the illusory and anthropocentric objections. Finally, I consider the broader consequences of this alternative justification: the connection to debates about inter-theoretic reduction and, further, the implication that the time-asymmetry in statistical mechanics is weakly emergent.