Langevin PDF simulation of particle deposition in a turbulent pipe flow
The paper deals with the description of particle deposition on walls from a
turbulent flow over a large range of particle diameter, using a Langevin PDF
model. The first aim of the work is to test how the present Langevin model is
able to describe this phenomenon and to outline the physical aspects which
play a major role in particle deposition. The general features and
characteristics of the present stochastic model are first recalled. Then,
results obtained with the standard form of the model are presented along with
an analysis carried out to check the sensitivity of the
predictions to different mean fluid quantities. These results show that the
physical representation of the near-wall physics has to be improved and that,
in particular, one possible route is to introduce specific features related to
the near-wall coherent structures. In the following, we propose a simple
phenomenological model that introduces some of the effects due to the presence
of turbulent coherent structures on particles in a thin layer close to the
wall. The results obtained with this phenomenological model are in good
agreement with experimental evidence, which suggests pursuing this direction
towards the development of more general and rigorous stochastic models that
provide a link between a geometrical description of turbulent flow and a
statistical one.
Comment: 40 pages, 8 figures
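The stochastic core of such Langevin particle models can be illustrated with a minimal one-dimensional sketch. The timescale, variance, and discretization below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not taken from the paper): Lagrangian integral
# timescale of the fluid velocity seen by the particle, and its variance.
tau = 0.1        # integral timescale [s]
sigma2 = 1.0     # stationary velocity variance [m^2/s^2]
dt = 1e-3        # time step [s]
n_steps = 20_000

# Euler-Maruyama integration of the Langevin (Ornstein-Uhlenbeck) equation
#   dU = -(U / tau) dt + sqrt(2 sigma2 / tau) dW
u = np.zeros(n_steps)
for n in range(1, n_steps):
    dw = rng.normal(0.0, np.sqrt(dt))
    u[n] = u[n - 1] - (u[n - 1] / tau) * dt + np.sqrt(2.0 * sigma2 / tau) * dw

# After a few timescales the velocity variance relaxes towards sigma2.
var_est = u[n_steps // 2 :].var()
```

A full deposition calculation would couple this fluid velocity to a particle equation of motion with drag and would track wall crossings; the sketch only shows the stochastic fluid-velocity model at its core.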
The second law of quantum thermodynamics as an equality
We investigate the connection between recent results in quantum
thermodynamics and fluctuation relations by adopting a fully quantum mechanical
description of thermodynamics. By including a work system whose energy is
allowed to fluctuate, we derive a set of equalities which all thermodynamical
transitions have to satisfy. This extends the condition for maps to be
Gibbs-preserving to the case of fluctuating work, providing a more general
characterisation of maps commonly used in the information theoretic approach to
thermodynamics. For final states that are block diagonal in the energy basis,
this set of equalities gives necessary and sufficient conditions for a
thermodynamical state transition to be possible. The conditions serve as a
parent equation
which can be used to derive a number of results. These include writing the
second law of thermodynamics as an equality featuring a fine-grained notion of
the free energy. It also yields a generalisation of the Jarzynski fluctuation
theorem which holds for arbitrary initial states, and under the most general
manipulations allowed by the laws of quantum mechanics. Furthermore, we show
that each of these relations can be seen as the quasi-classical limit of three
fully quantum identities. This allows us to consider the free energy as an
operator, and allows one to obtain more general and fully quantum fluctuation
relations from the information theoretic approach to quantum thermodynamics.
Comment: 11+3 pages. V4: Updated to match published version. Discussion of
thermo-majorization and implementing arbitrary unitaries added. V3: Added
funding information. V2: Expanded discussion on relation to fluctuation
theorem
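The classical Jarzynski equality that the paper generalises, ⟨e^{-βW}⟩ = e^{-βΔF}, can be checked numerically in a toy setting. The Gaussian work distribution and its parameters below are assumptions chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

beta = 1.0             # inverse temperature (illustrative units)
mu, sigma = 2.0, 1.0   # assumed Gaussian work distribution W ~ N(mu, sigma^2)

# For Gaussian work, <exp(-beta W)> = exp(-beta mu + beta^2 sigma^2 / 2),
# so Jarzynski's equality <exp(-beta W)> = exp(-beta dF) gives in closed form:
dF_exact = mu - beta * sigma**2 / 2.0

# Monte Carlo estimate of the exponential work average.
W = rng.normal(mu, sigma, size=1_000_000)
dF_mc = -np.log(np.exp(-beta * W).mean()) / beta

# dF_mc should agree with dF_exact up to sampling error; note also that
# dF_exact <= W.mean(), the usual second-law inequality <W> >= dF.
```

This is the quasi-classical limit only; the operator-valued free energy and the fully quantum relations of the paper have no such one-line numerical analogue.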
Risk measurement with the equivalent utility principles
Risk measures have been studied for several decades in the actuarial literature, where they appeared under the guise of premium calculation principles. Risk measures and the properties that risk measures should satisfy have recently received considerable attention in the financial mathematics literature. Mathematically, a risk measure is a mapping from a class of random variables defined on some measurable space to the (extended) real line. Economically, a risk measure should capture the preferences of the decision-maker. In incomplete financial markets, prices are no longer unique but depend on the agents' attitudes towards risk. This paper complements the study initiated in Denuit, Dhaene & Van Wouwe (1999) and considers several theories for decision under uncertainty: the classical expected utility paradigm, Yaari's dual approach, maximin expected utility theory, Choquet expected utility theory and Quiggin's rank-dependent utility theory. Building on the actuarial equivalent utility pricing principle, broad classes of risk measures are generated, of which most classical risk measures appear to be particular cases. This approach shows that most risk measures studied recently in the financial literature disregard the utility concept (i.e. correspond to linear utilities), causing some deficiencies. Some alternatives proposed in the literature, based on exponential utilities, are discussed.
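For the exponential-utility case, the equivalent utility premium has the well-known closed form π = (1/α) ln E[e^{αX}], the entropic risk measure. A minimal numerical sketch, where the loss distribution and risk-aversion level are illustrative assumptions rather than values from the paper:

```python
import numpy as np

rng = np.random.default_rng(7)

def exponential_premium(losses, alpha):
    """Equivalent-utility premium under exponential utility u(x) = -exp(-alpha*x):
    pi = (1/alpha) * log E[exp(alpha * X)], the entropic risk measure."""
    return np.log(np.exp(alpha * losses).mean()) / alpha

# Illustrative example: exponentially distributed losses with mean 1 and
# risk aversion alpha = 0.4, where E[exp(alpha X)] = rate / (rate - alpha)
# for alpha < rate, giving a closed form to check the estimate against.
alpha, rate = 0.4, 1.0
X = rng.exponential(1.0 / rate, size=1_000_000)

pi_mc = exponential_premium(X, alpha)
pi_exact = np.log(rate / (rate - alpha)) / alpha

# The premium carries a positive risk loading: pi exceeds the pure premium E[X].
```

As alpha tends to 0 the premium collapses to the linear-utility case π = E[X], illustrating the paper's point that risk measures with linear utilities discard the decision-maker's risk aversion.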