Continuity and boundary conditions in thermodynamics: From Carnot's efficiency to efficiencies at maximum power
[...] By the beginning of the 20th century, the principles of thermodynamics
were summarized into the so-called four laws, which were, as it turns out,
definitive negative answers to the doomed quests for perpetual motion machines.
As a matter of fact, one result of Sadi Carnot's work was precisely that the
heat-to-work conversion process is fundamentally limited; as such, it is
considered a first version of the second law of thermodynamics. Although it
was derived from Carnot's unrealistic model, the upper bound on the
thermodynamic conversion efficiency, known as the Carnot efficiency, became a
paradigm as the next target after the failure of the perpetual motion ideal. In
the 1950s, Jacques Yvon published a conference paper containing the necessary
ingredients for a new class of models, and even a formula, not so different
from that of Carnot's efficiency, which later would become the new efficiency
reference. Yvon's first analysis [...] went fairly unnoticed for twenty years,
until Frank Curzon and Boye Ahlborn published their pedagogical paper about the
effect of finite heat transfer on output power limitation and their derivation
of the efficiency at maximum power, now known as the Curzon-Ahlborn (CA)
efficiency. The notion of finite rate explicitly introduced time into
thermodynamics, and its significance cannot be overstated, as shown by the
wealth of work devoted to what is now known as finite-time thermodynamics
since the end of the 1970s. [...] The object of the article is thus to cover
some of the milestones of thermodynamics and to show, through the illustrative
case of thermoelectric generators, our model heat engine, that the shift from
Carnot's efficiency to efficiencies at maximum power emerges naturally once
continuity and boundary conditions are considered carefully [...]
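The two efficiencies contrasted above have simple closed forms and are easy to compare numerically. The reservoir temperatures below are illustrative choices, not values from the paper; this is a minimal sketch of the textbook formulas.

```python
import math

def carnot_efficiency(t_hot, t_cold):
    """Upper bound on heat-to-work conversion efficiency (Carnot)."""
    return 1.0 - t_cold / t_hot

def curzon_ahlborn_efficiency(t_hot, t_cold):
    """Efficiency at maximum power of an endoreversible engine (CA)."""
    return 1.0 - math.sqrt(t_cold / t_hot)

# Illustrative reservoir temperatures in kelvin; any T_hot > T_cold > 0 works.
T_HOT, T_COLD = 500.0, 300.0
eta_c = carnot_efficiency(T_HOT, T_COLD)            # 0.4 exactly
eta_ca = curzon_ahlborn_efficiency(T_HOT, T_COLD)   # 1 - sqrt(0.6) ~ 0.2254
print(f"Carnot: {eta_c:.4f}, Curzon-Ahlborn: {eta_ca:.4f}")
```

Since sqrt(x) > x for 0 < x < 1, the CA efficiency is always strictly below the Carnot bound, consistent with its role as a reference for engines delivering finite power.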
Reconciling cooperation, biodiversity and stability in complex ecological communities
Empirical observations show that ecological communities can host a huge
number of coexisting species, even with few or limited resources. These
ecosystems are characterized by multiple types of interactions, in
particular cooperative behaviors. However, standard models of
population dynamics based on Lotka-Volterra-type equations predict that
ecosystem stability should decrease as the number of species in the community
increases and that cooperative systems are less stable than communities with
only competitive and/or exploitative interactions. Here we propose a stochastic
model of population dynamics, which includes exploitative interactions as well
as cooperative interactions induced by cross-feeding. The model is exactly
solved and we obtain results for relevant macro-ecological patterns, such as
species abundance distributions and correlation functions. In the large system
size limit, any number of species can coexist for a very general class of
interaction networks and stability increases as the number of species grows.
For pure mutualistic/commensalistic interactions we determine the topological
properties of the network that guarantee species coexistence. We also show that
the stationary state is globally stable and that inferring species interactions
through species abundance correlation analysis may be misleading. Our
theoretical approach thus shows that appropriate models of cooperation
naturally lead to a resolution of the long-standing complexity-stability
paradox and explain how highly biodiverse communities can coexist.
Comment: 25 pages, 10 figures
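The stability claims above concern the authors' stochastic cross-feeding model. As a much simpler point of reference, the deterministic generalized Lotka-Volterra equations that such work is contrasted against can be integrated directly; the community size, rates, and interaction matrix below are illustrative choices giving a stable coexistence fixed point, not parameters from the paper.

```python
# Deterministic generalized Lotka-Volterra sketch:
#   dx_i/dt = x_i * (r_i + sum_j A[i][j] * x_j)
# Illustrative 3-species community with weak mutualism (off-diagonal +0.1)
# and strong self-limitation (diagonal -1.0); not the paper's stochastic model.

def glv_step(x, r, A, dt):
    """One forward-Euler step of the generalized Lotka-Volterra dynamics."""
    n = len(x)
    growth = [x[i] * (r[i] + sum(A[i][j] * x[j] for j in range(n)))
              for i in range(n)]
    return [x[i] + dt * growth[i] for i in range(n)]

def simulate(x0, r, A, dt=0.01, steps=5000):
    x = list(x0)
    for _ in range(steps):
        x = glv_step(x, r, A, dt)
    return x

n = 3
r = [1.0] * n
A = [[-1.0 if i == j else 0.1 for j in range(n)] for i in range(n)]
x_final = simulate([0.5] * n, r, A)
# Analytic coexistence fixed point: 1 + x*(-1 + 2*0.1) = 0  =>  x* = 1.25.
print([round(v, 3) for v in x_final])
```

All three species converge to the positive fixed point, illustrating the kind of stable coexistence whose scaling with community size the paper analyzes exactly.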
Entropy production and the arrow of time
We present an exact relationship between the entropy production and the
distinguishability of a process from its time-reverse, quantified by the
relative entropy between forward and backward states. The relationship is shown
to remain valid for a wide family of initial conditions, such as canonical,
constrained canonical, multi-canonical and grand canonical distributions, as
well as for both classical and quantum systems.
Comment: 15 pages, no figures
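The exact relationship referred to in the abstract can be written schematically as follows; this is the standard form of such results, stated here as a sketch rather than a quotation of the paper's own equation:

```latex
% Average entropy production as distinguishability of a process from its
% time-reverse: the relative (Kullback-Leibler) entropy between the forward
% state rho_F and the time-reversed state rho_B.
\langle \Delta S_{\mathrm{tot}} \rangle = k_B \, D(\rho_F \,\|\, \rho_B),
\qquad
D(\rho \,\|\, \sigma) = \mathrm{Tr}\!\left[\rho \left(\ln \rho - \ln \sigma\right)\right]
```

Because the relative entropy vanishes only when the two states coincide, entropy production is zero precisely when the process is indistinguishable from its time-reverse.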
A Fundamentally Irreversible World as an Opportunity towards a Consistent Understanding of Quantum and Cosmological Contexts
In a preceding publication, a fundamentally oriented and irreversible world was shown to be derivable from the important principle of least action. A consequence of such a paradigm change is the avoidance of paradoxes within a “dynamic” quantum physics. This becomes possible essentially because fundamental irreversibility allows consideration of the “entropy” concept in elementary processes. For this reason, and to compensate for entropy in the spread-out energy of the wave, the duality of particle and wave has to be mediated via an information self-image of matter.

In this publication, these considerations are extended to irreversible thermodynamics, to gravitation, and to cosmology with its dependence on quantum interpretations. The information self-image of matter around particles could be identified with gravitation. Because information can also impose an always constant light velocity, there is no longer any need to attribute such a property to empty space, as is done in relativity theory. In addition, the possibility is recognized of considering entropy generation by expanding photon fields in the universe. Via a continuous activation of information on matter, photons can generate entropy and release small energy packages without interacting with matter. This facilitates a new interpretation of galactic redshift, emphasizes an information link between quantum and cosmological phenomena, and points to an information-triggered origin of the universe.

Self-organized processes approach maximum entropy production within their constraints. In a far-from-equilibrium world, information too, with its energy content, can self-organize to a higher hierarchy of computation. It is here identified with consciousness. This appears to explain the evolution of spirit and intelligence on a materialistic basis. Also gravitation, here identified as information on matter, could, under special conditions, self-organize to act as a super-gravitation, offering an alternative to dark matter.
Time is not an illusion, but has to be understood as a flux of action, which is the ultimate reality of change. The concept of an irreversible physical world opens a route towards a rational understanding of complex contexts in nature.
Security Games with Information Leakage: Modeling and Computation
Most models of Stackelberg security games assume that the attacker only knows
the defender's mixed strategy, but is not able to observe (even partially) the
instantiated pure strategy. Such partial observation of the deployed pure
strategy -- an issue we refer to as information leakage -- is a significant
concern in practical applications. While previous research on patrolling games
has considered the attacker's real-time surveillance, our setting, and
therefore our models and techniques, are fundamentally different. More
specifically, after
describing the information leakage model, we start with an LP formulation to
compute the defender's optimal strategy in the presence of leakage. Perhaps
surprisingly, we show that a key subproblem to solve this LP (more precisely,
the defender oracle) is NP-hard even for the simplest of security game models.
We then approach the problem from three possible directions: efficient
algorithms for restricted cases, approximation algorithms, and heuristic
algorithms for sampling that improve upon the status quo. Our experiments
confirm the necessity of handling information leakage and the advantage of our
algorithms.
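For readers unfamiliar with the baseline (leakage-free) model, the defender's optimal mixed coverage in the simplest zero-sum variant has a closed form. The payoff numbers below are hypothetical and the code is a toy sketch of that baseline, not the paper's leakage-aware LP.

```python
# Toy baseline security game without information leakage: one defender
# resource is split as coverage probabilities c_t over n targets; the
# attacker earns reward r_t if target t is uncovered and 0 otherwise.
# The defender minimizes the attacker's best response
#     max_t (1 - c_t) * r_t    subject to  sum_t c_t = 1,
# which is optimal when the attacker's payoff is equalized across targets.

def optimal_coverage(rewards):
    """Closed-form equalizing coverage (assumes the resulting value v
    satisfies v <= min(rewards), so every coverage level is nonnegative)."""
    n = len(rewards)
    # sum_t (1 - v / r_t) = 1  =>  v = (n - 1) / sum_t (1 / r_t)
    v = (n - 1) / sum(1.0 / r for r in rewards)
    coverage = [1.0 - v / r for r in rewards]
    return coverage, v

cov, value = optimal_coverage([4.0, 2.0])
print(cov, value)   # coverage [2/3, 1/3], attacker value 4/3
```

Once the attacker can partially observe the deployed pure strategy, this clean structure breaks down: as the abstract notes, even the defender oracle for the corresponding LP becomes NP-hard.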
The entropy of keys derived from laser speckle
Laser speckle has been proposed in a number of papers as a high-entropy
source of unpredictable bits for use in security applications. Bit strings
derived from speckle can be used for a variety of security purposes such as
identification, authentication, anti-counterfeiting, secure key storage, random
number generation and tamper protection. The choice of laser speckle as a
source of random keys is quite natural, given the chaotic properties of
speckle. However, this same chaotic behaviour also causes reproducibility
problems. Cryptographic protocols require either zero noise or very low noise
in their inputs; hence the issue of error rates is critical to applications of
laser speckle in cryptography. Most of the literature uses an error reduction
method based on Gabor filtering. Though the method is successful, it has not
been thoroughly analysed.
In this paper we present a statistical analysis of Gabor-filtered speckle
patterns. We introduce a model in which perturbations are described as random
phase changes in the source plane. Using this model we compute the second and
fourth order statistics of Gabor coefficients. We determine the mutual
information between perturbed and unperturbed Gabor coefficients and the bit
error rate in the derived bit string. The mutual information provides an
absolute upper bound on the number of secure bits that can be reproducibly
extracted from noisy measurements.
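As an illustration of the two quantities in the last paragraph: if a perturbed and an unperturbed Gabor coefficient are modeled as jointly Gaussian with correlation coefficient rho (a modeling assumption of this sketch, not a claim from the paper), both the mutual information and the error rate of sign-based bit extraction have closed forms.

```python
import math

def gaussian_mutual_information(rho):
    """Mutual information (in bits) between two jointly Gaussian
    variables with correlation coefficient rho."""
    return -0.5 * math.log2(1.0 - rho * rho)

def sign_bit_error_rate(rho):
    """Probability that the signs of two zero-mean jointly Gaussian
    variables with correlation rho disagree (Gaussian orthant formula)."""
    return math.acos(rho) / math.pi

# Hypothetical correlation between unperturbed and perturbed coefficients.
rho = 0.9
print(f"MI  = {gaussian_mutual_information(rho):.3f} bits")
print(f"BER = {sign_bit_error_rate(rho):.3f}")
```

In this toy model, a correlation of 0.9 yields roughly 1.2 bits of mutual information per coefficient but a bit error rate near 14%, which is why error-correction or noise-reduction steps such as Gabor filtering are essential before cryptographic use.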