3,496 research outputs found
Maximal power output of a stochastic thermodynamic engine
Classical thermodynamics aimed to quantify the efficiency of thermodynamic engines by bounding the maximal amount of mechanical energy produced relative to the amount of heat required. While this was accomplished early on by Carnot and Clausius, the more practical problem of quantifying the limits of the power that can be delivered remained elusive, since quasistatic processes require infinitely slow cycling and hence yield vanishing power output. Recent insights drawn from stochastic models appear to bridge the gap between theory and practice, in that they lead to physically meaningful expressions for the dissipation cost of operating a thermodynamic engine over a finite time window. Indeed, the problem of optimizing power can be expressed as a stochastic control problem. Building on this framework of stochastic thermodynamics, we derive bounds on the maximal power that can be drawn by cycling an overdamped ensemble of particles via a time-varying potential while alternating contact with heat baths of different temperatures (Tc cold, and Th hot). Specifically, assuming a suitable bound M on the spatial gradient of the controlling potential, we show that the maximal achievable power is bounded by [Formula presented]. Moreover, we show that this bound can be reached to within a factor of [Formula presented] by operating the cyclic thermodynamic process with a quadratic potential.
Probabilistic Kernel Support Vector Machines
We propose a probabilistic enhancement of standard kernel Support Vector
Machines for binary classification, in order to address the case when, along
with the given data sets, a description of uncertainty (e.g., error bounds) may
be available on each datum. In the present paper, we specifically consider
Gaussian distributions to model uncertainty. Thereby, our data consist of pairs
of a mean vector and a covariance matrix, along with an indicator to declare
membership in one of two categories for each pair. These pairs may be viewed as
representing the mean and covariance, respectively, of random vectors taking
values in a suitable linear space. Thus, our setting may also be viewed as a
modification of Support Vector Machines to classify distributions, albeit, at
present, only Gaussian ones. We outline the formalism that allows computing
suitable classifiers via a natural modification of the standard "kernel trick."
The main contribution of this work is to point out a suitable kernel function
for applying Support Vector techniques to the setting of uncertain data for
which a detailed uncertainty description is also available (herein, "Gaussian
points").
Comment: 6 pages, 6 figures
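The abstract does not reproduce the kernel itself. One standard closed-form candidate for such "Gaussian points" — an assumption here, not necessarily the paper's choice — is the expectation of a Gaussian RBF kernel under the two input distributions, which has a closed form because the difference of two independent Gaussians is again Gaussian. A minimal sketch:

```python
import numpy as np

def expected_rbf_kernel(m1, S1, m2, S2, sigma=1.0):
    """E[exp(-||x - y||^2 / (2 sigma^2))] for x ~ N(m1, S1), y ~ N(m2, S2).

    Since x - y ~ N(m1 - m2, S1 + S2), the expectation evaluates to
    det(I + S/sigma^2)^{-1/2} * exp(-0.5 * mu^T (S + sigma^2 I)^{-1} mu),
    with mu = m1 - m2 and S = S1 + S2.  For zero covariances this reduces
    to the ordinary RBF kernel on the means."""
    mu = np.asarray(m1, float) - np.asarray(m2, float)
    S = np.asarray(S1, float) + np.asarray(S2, float)
    d = mu.size
    A = S + sigma**2 * np.eye(d)
    quad = mu @ np.linalg.solve(A, mu)
    # det(I + S/sigma^2)^{-1/2} == sigma^d / sqrt(det(A))
    return sigma**d / np.sqrt(np.linalg.det(A)) * np.exp(-0.5 * quad)
```

Such an expected kernel is positive definite and can be plugged into any standard SVM solver in place of the usual Gram matrix; note how growing covariances shrink the kernel value, encoding the uncertainty of each datum.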
QCD Theory
Quantum Chromodynamics is an established part of the Standard Model and an
essential part of the toolkit for searching for new physics at high-energy
colliders. I present a status report on the theory of QCD and review some of
the important developments in the past year.
Comment: 10 pages, 11 figures, plenary talk presented at ICHEP04, Beijing,
China, August 2004
Measuring the Impact of Adversarial Errors on Packet Scheduling Strategies
In this paper we explore the problem of achieving efficient packet
transmission over unreliable links with worst case occurrence of errors. In
such a setup, even an omniscient offline scheduling strategy cannot achieve
stability of the packet queue, nor is it able to use up all the available
bandwidth. Hence, an important first step is to identify an appropriate metric
for measuring the efficiency of scheduling strategies in such a setting. To
this end, we propose a relative throughput metric which corresponds to the long
term competitive ratio of the algorithm with respect to the optimal. We then
explore the impact of the error detection mechanism and feedback delay on our
measure. We compare instantaneous error feedback with deferred error feedback,
which requires a faulty packet to be fully received in order to detect the
error. We propose algorithms for worst-case adversarial and stochastic packet
arrival models, and formally analyze their performance. The relative throughput
achieved by these algorithms is shown to be close to optimal by deriving lower
bounds on the relative throughput of the algorithms and almost matching upper
bounds for any algorithm in the considered settings. Our collection of results
demonstrates the potential of using instantaneous feedback to improve the
performance of communication systems in adverse environments.
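As an illustrative toy model — slotted time and a single packet length are simplifying assumptions here, not the paper's setting — the gap between instantaneous and deferred feedback, measured against an omniscient offline schedule, can be sketched as:

```python
def throughputs(error_slots, p, horizon):
    """Toy link model: each packet needs p consecutive error-free unit slots;
    a slot in error_slots corrupts the packet in flight.  Returns the number
    of packets completed by (offline optimal, instantaneous feedback,
    deferred feedback) within the horizon."""
    errors = set(error_slots)

    # Omniscient offline: start a packet only when the next p slots are clean.
    off, t = 0, 0
    while t + p <= horizon:
        if errors.isdisjoint(range(t, t + p)):
            off += 1
            t += p
        else:
            t += 1

    # Instantaneous feedback: an error aborts the current packet immediately,
    # so transmission restarts in the very next slot.
    inst, start = 0, 0
    for t in range(horizon):
        if t in errors:
            start = t + 1
        elif t - start + 1 == p:
            inst += 1
            start = t + 1

    # Deferred feedback: an error is only noticed once all p slots are spent,
    # so every corrupted attempt wastes a full packet's worth of slots.
    deff, t = 0, 0
    while t + p <= horizon:
        if errors.isdisjoint(range(t, t + p)):
            deff += 1
        t += p

    return off, inst, deff
```

For example, with errors at slots {0, 4, 8}, p = 3, and a horizon of 12 slots, the offline optimum and the instantaneous-feedback sender both complete 3 packets while the deferred-feedback sender completes only 1 — relative throughput 1 versus 1/3 — illustrating the abstract's point about the value of instantaneous feedback.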
Seven parton amplitudes from recursion relations
We present the first calculation of two-quark and five-gluon tree amplitudes
using on-shell recursion relations. These amplitudes are needed for the
tree-level 5-jet cross section and are an essential ingredient for
next-to-leading order 4-jet and next-to-next-to-leading order 3-jet production
at hadronic colliders. Very compact expressions for all possible helicity
configurations are provided, allowing for direct implementation in Monte Carlo
codes.
Comment: 11 pages
Non-homogeneous random walks on a semi-infinite strip
We study the asymptotic behaviour of Markov chains (Xn, ηn) on Z+×S, where Z+ is the non-negative integers and S is a finite set. Neither coordinate is assumed to be Markov. We assume a moments bound on the jumps of Xn and that, roughly speaking, ηn is close to being Markov when Xn is large. This departure from much of the literature, which assumes that ηn is itself a Markov chain, enables us to probe precisely the recurrence phase transitions by assuming asymptotically zero drift for Xn given ηn. We give a recurrence classification in terms of increment moment parameters for Xn and the stationary distribution for the large-Xn limit of ηn. In the null case we also provide a weak convergence result, which demonstrates a form of asymptotic independence between Xn (rescaled) and ηn. Our results can be seen as generalizations of Lamperti's results for non-homogeneous random walks on Z+ (the case where S is a singleton). Motivation arises from modulated queues, or processes with hidden variables, where ηn tracks an internal state of the system.
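A toy simulation of the regime the abstract describes — ηn flipping roughly like a Markov chain, and Xn carrying an asymptotically zero, O(1/x) "Lamperti-scale" drift given ηn — might look as follows. The transition probabilities here are hypothetical choices for illustration, not taken from the paper:

```python
import random

def step(x, eta, rng, c=0.5):
    """One step of a toy modulated walk on Z+ x {0, 1}.  The internal state
    eta flips with probability 0.3 (approximately Markov, as assumed for
    large Xn), and x takes a +-1 step whose mean drift is of order
    c * (2*eta - 1) / x, i.e., asymptotically zero as x grows."""
    if rng.random() < 0.3:
        eta = 1 - eta
    drift = c * (2 * eta - 1) / max(x, 1)
    p_up = 0.5 + drift / 2          # stays inside [0, 1] whenever c <= 1
    x = x + 1 if rng.random() < p_up else max(x - 1, 0)
    return x, eta
```

In this picture the recurrence classification of the abstract amounts to asking whether such a walk, with its drift averaged over the stationary behaviour of eta, returns to 0 or escapes to infinity.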
Deposition, diffusion, and nucleation on an interval
Motivated by the nanoscale growth of ultra-thin films, we study a model of deposition, on an interval substrate, of particles that perform Brownian motions until any two meet, when they nucleate to form a static island, which acts as an absorbing barrier to subsequent particles. This is a continuum version of a lattice model studied in the applied literature. We show that the associated interval-splitting process converges, in the sparse deposition limit, to a Markovian process (in the vein of Brennan and Durrett) governed by a splitting density that has a compact Fourier series expansion but, apparently, no simple closed form. We show that the same splitting density governs the large-time asymptotics of the normalized gap distribution at any fixed deposition rate, so these asymptotics are independent of the deposition rate. The splitting density is derived by solving an exit problem for planar Brownian motion from a right-angled triangle, extending work of Smith and Watson.
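The two-particle mechanism behind the splitting density can be sampled by crude Monte Carlo: a pair of particles on (0,1) corresponds to planar Brownian motion in the triangle {0 < x < y < 1}, with the legs (an endpoint absorbs a particle) and the hypotenuse (the particles meet and nucleate) as exit boundaries. The uniform initial positions and Euler discretization below are illustrative assumptions, not the paper's derivation:

```python
import math
import random

def splitting_sample(dt=1e-4, rng=random):
    """Sample one nucleation point: two particles start at independent
    uniform positions on (0, 1) and diffuse with step std sqrt(dt); if
    either reaches an endpoint first, the attempt is discarded and
    restarted, otherwise their meeting point (which splits the interval)
    is returned."""
    s = math.sqrt(dt)
    while True:
        x, y = sorted(rng.random() for _ in range(2))   # ensure x < y
        while True:
            x += rng.gauss(0.0, s)
            y += rng.gauss(0.0, s)
            if x <= 0.0 or y >= 1.0:        # absorbed at an endpoint
                break                        # discard, start a new attempt
            if x >= y:                       # particles meet: nucleation
                # clamp tiny discretization overshoot back into [0, 1]
                return min(max((x + y) / 2.0, 0.0), 1.0)
```

A histogram of many such samples gives a numerical approximation to the splitting density whose Fourier expansion the paper derives analytically.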