Correlation-powered Information Engines and the Thermodynamics of Self-Correction
Information engines can use structured environments as a resource to generate
work by randomizing ordered inputs and leveraging the increased Shannon entropy
to transfer energy from a thermal reservoir to a work reservoir. We give a
broadly applicable expression for the work production of an information engine,
generally modeled as a memoryful channel that communicates inputs to outputs as
it interacts with an evolving environment. The expression establishes that an
information engine must have more than one memory state in order to leverage
input-environment correlations. To highlight this requirement, we design an
information engine powered solely by temporal correlations rather than by the
statistical biases employed by previous engines. Key to this is the
engine's ability to synchronize---the engine automatically returns to a desired
dynamical phase when thrown into an unwanted, dissipative phase by corruptions
in the input---that is, by unanticipated environmental fluctuations. This
self-correcting mechanism is robust up to a critical level of corruption,
beyond which the system fails to act as an engine. We give explicit analytical
expressions for both work and critical corruption level and summarize engine
performance via a thermodynamic-function phase diagram over engine control
parameters. The results reveal a new thermodynamic mechanism based on
nonergodicity that underlies error correction as it operates to support
resilient engineered and biological systems.
Comment: 22 pages, 13 figures; http://csc.ucdavis.edu/~cmg/compmech/pubs/tos.ht
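The baseline the abstract contrasts against, an engine powered by statistical bias rather than temporal correlations, follows a standard second-law accounting: the work extractable per symbol is bounded by k_B T ln 2 times the gain in Shannon entropy from input tape to output tape. The sketch below is not the paper's correlation-powered engine; it is a minimal illustration of that entropy bound for an IID biased input tape that the engine fully randomizes, with function names of my own choosing.

```python
import math

def binary_entropy(p):
    """Shannon entropy (in bits) of a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def max_work_per_symbol(p_in, T=300.0):
    """Second-law bound on extractable work (joules per symbol) for an
    engine that randomizes an IID biased binary input tape to a fair one:
    W <= k_B * T * ln(2) * (H_out - H_in), with H_out = 1 bit.

    Illustrative only: the bias-powered baseline, not the paper's
    correlation-powered design.
    """
    k_B = 1.380649e-23  # Boltzmann constant, J/K
    delta_h_bits = 1.0 - binary_entropy(p_in)
    return k_B * T * math.log(2) * delta_h_bits
```

For a tape with bias p_in = 0.9 at room temperature, the entropy gain is about 0.53 bits per symbol, giving a bound of roughly 1.5e-21 J per symbol. An engine driven purely by temporal correlations, as in the paper, extracts work even when each marginal symbol distribution is unbiased, which this single-symbol accounting cannot capture.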
Optimal Feedback Communication via Posterior Matching
In this paper we introduce a fundamental principle for optimal communication
over general memoryless channels in the presence of noiseless feedback, termed
posterior matching. Using this principle, we devise a (simple, sequential)
generic feedback transmission scheme suitable for a large class of memoryless
channels and input distributions, achieving any rate below the corresponding
mutual information. This provides a unified framework for optimal feedback
communication in which the Horstein scheme (BSC) and the Schalkwijk-Kailath
scheme (AWGN channel) are special cases. Thus, as a corollary, we prove that
the Horstein scheme indeed attains the BSC capacity, settling a longstanding
conjecture. We further provide closed form expressions for the error
probability of the scheme over a range of rates, and derive the achievable
rates in a mismatch setting where the scheme is designed according to the wrong
channel model. Several illustrative examples of the posterior matching scheme
for specific channels are given, and the corresponding error probability
expressions are evaluated. The proof techniques employed utilize novel
relations between information rates and contraction properties of iterated
function systems.
Comment: IEEE Transactions on Information Theory
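For the binary symmetric channel, posterior matching reduces to the Horstein scheme: the transmitter sends the indicator of the message point lying above the receiver's current posterior median (known to both ends via the noiseless feedback), and the receiver performs a Bayes update on the possibly flipped bit. The sketch below is a discretized illustration of that idea under my own parameter choices (grid size, step count), not code from the paper.

```python
import numpy as np

def horstein_bsc(theta, p=0.1, n_steps=80, grid=1 << 12, seed=0):
    """Horstein-style posterior matching over a BSC(p).

    The message is a point `theta` in [0, 1). Each channel use, the
    transmitter sends x = 1{theta >= posterior median}; the receiver
    Bayes-updates a discretized posterior with the received (noisy) bit.
    Returns the receiver's final MAP estimate of theta.
    """
    rng = np.random.default_rng(seed)
    u = (np.arange(grid) + 0.5) / grid      # grid of candidate message points
    post = np.full(grid, 1.0 / grid)        # uniform prior over the grid

    for _ in range(n_steps):
        # Posterior median, known to both ends via noiseless feedback.
        cdf = np.cumsum(post)
        m = u[np.searchsorted(cdf, 0.5)]
        x = int(theta >= m)                 # transmitted bit
        y = x ^ (rng.random() < p)          # BSC flips x with probability p
        # Bayes update: points consistent with y get likelihood 1 - p.
        lik = np.where((u >= m).astype(int) == y, 1.0 - p, p)
        post = post * lik
        post /= post.sum()

    return u[np.argmax(post)]               # MAP estimate of theta
```

Each step the transmitter's bit is (approximately) Bernoulli(1/2) under the receiver's posterior, which is the matching condition: the input distribution is held at the capacity-achieving one for the BSC, so the posterior concentrates on theta at a rate governed by the mutual information.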