Statistical Mechanics of Broadcast Channels Using Low Density Parity Check Codes
We investigate the use of Gallager's low-density parity-check (LDPC) codes in
a broadcast channel, one of the fundamental models in network information
theory. Combining linear codes is a standard technique in practical network
communication schemes and is known to provide better performance than simple
timesharing methods when algebraic codes are used. The statistical physics
based analysis shows that the practical performance of the suggested method,
achieved by employing the belief propagation algorithm, is superior to that of
LDPC-based timesharing codes, while the best performance, when received
transmissions are optimally decoded, is bounded by the timesharing limit.
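The timesharing baseline the abstract compares against is easy to visualise for a concrete case. The sketch below (a minimal illustration, not the paper's statistical-mechanics analysis) computes the single-user capacities of two binary symmetric channels and the straight timesharing line between them; the crossover probabilities are placeholder values.

```python
import numpy as np

def h2(p):
    """Binary entropy in bits."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def bsc_capacity(flip_prob):
    """Capacity of a binary symmetric channel with crossover probability flip_prob."""
    return 1.0 - h2(flip_prob)

# Hypothetical noise levels for the two receivers of the broadcast channel.
p_good, p_bad = 0.05, 0.15
c_good, c_bad = bsc_capacity(p_good), bsc_capacity(p_bad)

# Timesharing boundary: spend a fraction t of the channel uses on the good
# receiver and the remaining 1 - t on the bad receiver.
for t in np.linspace(0.0, 1.0, 6):
    r_good, r_bad = t * c_good, (1 - t) * c_bad
    print(f"t={t:.1f}  R_good={r_good:.3f}  R_bad={r_bad:.3f} bits/use")
```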
Semantic metrics
In the context of the Semantic Web, many ontology-related operations, e.g. ontology ranking, segmentation, alignment, articulation, reuse, evaluation, can be boiled down to one fundamental operation: computing the similarity and/or dissimilarity among ontological entities, and in some cases among ontologies themselves. In this paper, we review standard metrics for computing distance measures and we propose a series of semantic metrics. We give a formal account of semantic metrics drawn from a variety of research disciplines, and enrich them with semantics based on standard Description Logic constructs. We argue that concept-based metrics can be aggregated to produce numeric distances at the ontology level, and we speculate on the usability of our ideas through potential areas of application.
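As a rough illustration of what a concept-level distance can look like (not the Description Logic based metrics the paper proposes), the sketch below computes a Jaccard distance over ancestor sets in a made-up is-a hierarchy; the hierarchy and concept names are purely hypothetical.

```python
# Toy is-a hierarchy; each concept maps to its direct parents (hypothetical example).
HIERARCHY = {
    "Dog": ["Mammal"],
    "Cat": ["Mammal"],
    "Mammal": ["Animal"],
    "Trout": ["Fish"],
    "Fish": ["Animal"],
    "Animal": [],
}

def ancestors(concept):
    """The concept itself plus all of its transitive ancestors."""
    seen, stack = set(), [concept]
    while stack:
        c = stack.pop()
        if c not in seen:
            seen.add(c)
            stack.extend(HIERARCHY.get(c, []))
    return seen

def jaccard_distance(a, b):
    """1 - |A ∩ B| / |A ∪ B| over ancestor sets: 0 = identical, 1 = unrelated."""
    sa, sb = ancestors(a), ancestors(b)
    return 1.0 - len(sa & sb) / len(sa | sb)

print(jaccard_distance("Dog", "Cat"))    # close: share Mammal and Animal
print(jaccard_distance("Dog", "Trout"))  # farther: share only Animal
```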
All Else Being Equal Be Empowered
The original publication is available at www.springerlink.com. Copyright Springer. DOI: 10.1007/11553090_75.
The classical approach to using utility functions suffers from the drawback of having to design and tweak the functions on a case-by-case basis. Inspired by examples from the animal kingdom, the social sciences and games, we propose empowerment, a rather universal function, defined as the information-theoretic capacity of an agent’s actuation channel. The concept applies to any sensorimotoric apparatus. Empowerment as a measure reflects the properties of the apparatus as long as they are observable due to the coupling of sensors and actuators via the environment.
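Since empowerment is defined as the capacity of the agent's actuation channel, a one-step case can be computed with the standard Blahut-Arimoto algorithm. The sketch below does so for a made-up 3-action, 3-outcome channel; the matrix entries are purely illustrative, and the code assumes strictly positive channel entries.

```python
import numpy as np

def channel_capacity(p_y_given_x, iters=200):
    """Blahut-Arimoto: capacity (bits) of a discrete channel p(y|x), rows = actions.
    Assumes strictly positive channel entries."""
    n_x = p_y_given_x.shape[0]
    p_x = np.full(n_x, 1.0 / n_x)          # start from a uniform action distribution
    for _ in range(iters):
        q_y = p_x @ p_y_given_x            # marginal over sensor outcomes
        # Per-action divergence D(p(y|x) || q(y)), in bits.
        d = np.sum(p_y_given_x * np.log2(p_y_given_x / q_y), axis=1)
        p_x = p_x * np.exp2(d)
        p_x /= p_x.sum()
    q_y = p_x @ p_y_given_x
    d = np.sum(p_y_given_x * np.log2(p_y_given_x / q_y), axis=1)
    return float(np.sum(p_x * d))

# Toy actuation channel: 3 actions, 3 possible next sensor states (hypothetical numbers).
channel = np.array([[0.85, 0.10, 0.05],
                    [0.10, 0.80, 0.10],
                    [0.05, 0.10, 0.85]])
print(f"empowerment ≈ {channel_capacity(channel):.3f} bits")
```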
The Entropy of a Binary Hidden Markov Process
The entropy of a binary symmetric Hidden Markov Process is calculated as an
expansion in the noise parameter epsilon. We map the problem onto a
one-dimensional Ising model in a large field of random signs and calculate the
expansion coefficients up to second order in epsilon. Using a conjecture we
extend the calculation to 11th order and discuss the convergence of the
resulting series.
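As a numerical companion to the series expansion, the sketch below estimates the same entropy rate by brute force: simulate a long output sequence of a binary symmetric hidden Markov process and apply the scaled forward algorithm, so that -log2 P(y_1..y_n)/n approaches the entropy rate. The transition and noise parameters are placeholder values; this is not the paper's epsilon-expansion.

```python
import numpy as np

rng = np.random.default_rng(0)

def hmm_entropy_rate_estimate(p, eps, n=100_000):
    """Estimate the entropy rate (bits/symbol) of a binary symmetric Markov chain
    (flip probability p) observed through a binary symmetric channel with noise eps."""
    # Simulate the hidden chain and its noisy observations.
    x = np.empty(n, dtype=int)
    x[0] = rng.integers(2)
    flips = rng.random(n) < p
    for t in range(1, n):
        x[t] = x[t - 1] ^ flips[t]
    y = x ^ (rng.random(n) < eps)

    # Forward algorithm with per-step normalisation; the accumulated log of the
    # normalisers equals log2 P(y_1..y_n).
    T = np.array([[1 - p, p], [p, 1 - p]])          # hidden-state transitions
    E = np.array([[1 - eps, eps], [eps, 1 - eps]])  # emission probabilities
    alpha = np.array([0.5, 0.5]) * E[:, y[0]]
    log_prob = np.log2(alpha.sum())
    alpha /= alpha.sum()
    for t in range(1, n):
        alpha = (T.T @ alpha) * E[:, y[t]]
        log_prob += np.log2(alpha.sum())
        alpha /= alpha.sum()
    return -log_prob / n

print(hmm_entropy_rate_estimate(p=0.3, eps=0.1))
```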
Semi-automated dialogue act classification for situated social agents in games
As a step toward simulating dynamic dialogue between agents and humans in virtual environments, we describe learning a model of social behavior composed of interleaved utterances and physical actions. In our model, utterances are abstracted as {speech act, propositional content, referent} triples. After training a classifier on 100 gameplay logs from The Restaurant Game annotated with dialogue act triples, we have automatically classified utterances in an additional 5,000 logs. A quantitative evaluation of statistical models learned from the gameplay logs demonstrates that semi-automatically classified dialogue acts yield significantly more predictive power than automatically clustered utterances, and serve as a better common currency for modeling interleaved actions and utterances.
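For a flavour of the classification step only (much simplified relative to the paper's {speech act, propositional content, referent} triples and its annotated gameplay logs), here is a toy sketch with made-up utterances and flat act labels:

```python
# Toy dialogue-act classifier on hypothetical restaurant-style utterances.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

utterances = [
    "hi welcome to the restaurant", "hello there",
    "can i get the steak please", "i would like a glass of wine",
    "here is your food", "your table is ready",
    "thanks bye", "goodbye have a nice day",
]
acts = ["greet", "greet", "order", "order", "inform", "inform", "farewell", "farewell"]

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(utterances, acts)

# Classify unseen utterances by word overlap with the training examples.
print(model.predict(["could i have the fish please"]))
print(model.predict(["bye and thanks a lot"]))
```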
Parallel Recursive State Compression for Free
This paper focuses on reducing memory usage in enumerative model checking,
while maintaining the multi-core scalability obtained in earlier work. We
present a tree-based multi-core compression method, which works by leveraging
sharing among sub-vectors of state vectors.
An algorithmic analysis of both worst-case and optimal compression ratios
shows the potential to compress even large states to a small constant on
average (8 bytes). Our experiments demonstrate that this holds up in practice:
the median compression ratio of 279 measured experiments is within 17% of the
optimum for tree compression, and five times better than the median compression
ratio of SPIN's COLLAPSE compression.
Our algorithms are implemented in the LTSmin tool, and our experiments show
that for model checking, multi-core tree compression pays its own way: it comes
virtually without overhead compared to the fastest hash-table-based methods.
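The sharing idea can be illustrated with a minimal hash-consing sketch: each state vector is split recursively in half and every sub-vector is interned exactly once, so a stored state reduces to a single small node index. The fixed split-in-half scheme and plain Python dictionaries below are simplifications for illustration, not LTSmin's multi-core implementation.

```python
# Minimal sketch of tree compression by hash-consing sub-vectors.
table = {}     # (left, right) -> node id
nodes = []     # node id -> (left, right), kept for decompression

def intern(left, right):
    key = (left, right)
    if key not in table:
        table[key] = len(nodes)
        nodes.append(key)
    return table[key]

def compress(vector):
    """Fold a state vector (tuple of slot values) into a single node index."""
    if len(vector) == 1:
        return ("leaf", vector[0])
    mid = len(vector) // 2
    return intern(compress(vector[:mid]), compress(vector[mid:]))

def decompress(node):
    if isinstance(node, tuple) and node[0] == "leaf":
        return (node[1],)
    left, right = nodes[node]
    return decompress(left) + decompress(right)

# Similar states share most sub-trees, so the node table grows slowly.
s1 = compress((1, 2, 3, 4, 5, 6, 7, 8))
s2 = compress((1, 2, 3, 4, 5, 6, 7, 9))   # differs only in the last slot
assert decompress(s1) == (1, 2, 3, 4, 5, 6, 7, 8)
print(s1, s2, "table entries:", len(nodes))
```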
Quantitative information flow, with a view
We put forward a general model intended for assessment of system security against passive eavesdroppers, both quantitatively (how much information is leaked) and qualitatively (what properties are leaked). To this purpose, we extend information hiding systems (IHS), a model where the secret-observable relation is represented as a noisy channel, with views: basically, partitions of the state-space. Given a view W and n independent observations of the system, one is interested in the probability that a Bayesian adversary wrongly predicts the class of W the underlying secret belongs to. We offer results that allow one to easily characterise the behaviour of this error probability as a function of the number of observations, in terms of the channel matrices defining the IHS and the view W. In particular, we provide expressions for the limit value as n → ∞, show by tight bounds that convergence is exponential, and also characterise the rate of convergence to predefined error thresholds. We then show a few instances of statistical attacks that can be assessed by a direct application of our model: attacks against modular exponentiation that exploit timing leaks, against anonymity in mix-nets and against privacy in sparse datasets.
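A small sketch of the quantity being analysed: given a prior on secrets, a channel matrix p(o|s), a view W (a partition of the secrets) and n independent observations, enumerate the observation tuples and compute the probability that a MAP guess of the class of W is wrong. The prior, channel matrix and partition below are toy values, and brute-force enumeration is only viable for tiny alphabets.

```python
import itertools
import numpy as np

def view_bayes_error(prior, channel, view, n):
    """P(MAP guess of the view class is wrong) after n independent observations.

    prior:   p(s) over secrets, shape (S,)
    channel: p(o|s), shape (S, O)
    view:    class label of each secret, length S
    n:       number of i.i.d. observations
    """
    n_obs = channel.shape[1]
    classes = sorted(set(view))
    p_correct = 0.0
    for obs_seq in itertools.product(range(n_obs), repeat=n):
        # Joint probability of each secret with this observation sequence.
        joint = prior * np.prod(channel[:, obs_seq], axis=1)
        # Posterior mass of each class of the view; the adversary picks the largest.
        class_mass = [joint[[i for i, w in enumerate(view) if w == c]].sum()
                      for c in classes]
        p_correct += max(class_mass)
    return 1.0 - p_correct

# Toy example: 4 secrets, 2 observable outputs, view groups the secrets into 2 classes.
prior   = np.full(4, 0.25)
channel = np.array([[0.9, 0.1],
                    [0.8, 0.2],
                    [0.3, 0.7],
                    [0.2, 0.8]])
view = ["A", "A", "B", "B"]
for n in (1, 2, 5, 10):
    print(n, view_bayes_error(prior, channel, view, n))
```

Running it shows the error probability shrinking as n grows, which is the exponential convergence behaviour the abstract characterises.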
Structural Routability of n-Pairs Information Networks
Information does not generally behave like a conservative fluid flow in
communication networks with multiple sources and sinks. However, it is often
conceptually and practically useful to be able to associate separate data
streams with each source-sink pair, with only routing and no coding performed
at the network nodes. This raises the question of whether there is a nontrivial
class of network topologies for which achievability is always equivalent to
routability, for any combination of source signals and positive channel
capacities. This chapter considers possibly cyclic, directed, errorless
networks with n source-sink pairs and mutually independent source signals. The
concept of downward dominance is introduced and it is shown that, if the
network topology is downward dominated, then the achievability of a given
combination of source signals and channel capacities implies the existence of a
feasible multicommodity flow.
The final publication is available at link.springer.com:
http://link.springer.com/chapter/10.1007/978-3-319-02150-8_
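Routability here means the existence of a feasible multicommodity flow. The sketch below checks feasibility for a toy two-pair network using a path-based linear program with scipy.optimize.linprog; the network, capacities, demands and candidate paths are invented for illustration, and this is a generic feasibility check, not the downward-dominance criterion itself.

```python
import numpy as np
from scipy.optimize import linprog

# Toy directed network: edges with capacities (hypothetical values).
capacities = {("s1", "a"): 1.0, ("s2", "a"): 1.0, ("a", "b"): 2.0,
              ("b", "t1"): 1.0, ("b", "t2"): 1.0}
edges = list(capacities)

# Two source-sink pairs, each with a demand and candidate paths (lists of edges).
commodities = [
    {"demand": 1.0, "paths": [[("s1", "a"), ("a", "b"), ("b", "t1")]]},  # s1 -> t1
    {"demand": 1.0, "paths": [[("s2", "a"), ("a", "b"), ("b", "t2")]]},  # s2 -> t2
]

# One LP variable per (commodity, path): how much flow that path carries.
var_paths = [(k, p) for k, com in enumerate(commodities) for p in com["paths"]]

# Demand constraints: the path flows of commodity k must sum to its demand.
A_eq = np.array([[1.0 if k == kk else 0.0 for kk, _ in var_paths]
                 for k in range(len(commodities))])
b_eq = np.array([com["demand"] for com in commodities])

# Capacity constraints: total flow on each edge must not exceed its capacity.
A_ub = np.array([[1.0 if e in p else 0.0 for _, p in var_paths] for e in edges])
b_ub = np.array([capacities[e] for e in edges])

# Feasibility only, so the objective is zero. Lowering the shared ("a", "b")
# capacity below the total demand of 2.0 makes the instance unroutable.
res = linprog(c=np.zeros(len(var_paths)), A_ub=A_ub, b_ub=b_ub,
              A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
print("routable (feasible multicommodity flow):", res.success)
```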
