Control theoretic models of pointing
This article presents an empirical comparison of four models from manual control theory on their ability to model targeting behaviour by human users using a mouse: McRuer’s Crossover, Costello’s Surge, second-order lag (2OL), and the Bang-bang model. Such dynamic models are generative, estimating not only movement time but also pointer position, velocity, and acceleration on a moment-to-moment basis. We describe an experimental framework for acquiring pointing actions and automatically fitting the parameters of mathematical models to the empirical data. We present the use of time-series, phase-space, and Hooke plot visualisations of the experimental data to gain insight into human pointing dynamics. We find that the identified control models can generate a range of dynamic behaviours that capture aspects of human pointing behaviour to varying degrees. Conditions with a low index of difficulty (ID) showed poorer fit because their unconstrained nature naturally leads to more behavioural variability. We report on characteristics of human surge behaviour (the initial, ballistic sub-movement) in pointing, as well as differences in a number of controller performance measures, including overshoot, settling time, peak time, and rise time. We describe trade-offs among the models. We conclude that control theory offers a promising complement to Fitts’-law-based approaches in HCI, with models providing representations and predictions of human pointing dynamics that can improve our understanding of pointing and inform design.
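Of the models compared, the second-order lag (2OL) is perhaps the simplest to write down: the pointer behaves like a damped spring pulled toward the target. A minimal Euler-integration sketch of this idea follows; the gain and damping values here are illustrative choices, not parameters fitted in the study.

```python
def simulate_2ol(target, k=30.0, b=10.0, x0=0.0, dt=0.001, t_end=1.5):
    """Euler simulation of a second-order lag (2OL) pointing model:
    pointer acceleration is a spring-damper pull toward the target.
    k, b, dt, t_end are illustrative values, not fitted parameters."""
    x, v = x0, 0.0
    xs = []
    for _ in range(int(t_end / dt)):
        a = k * (target - x) - b * v  # spring toward target, damped by velocity
        v += a * dt
        x += v * dt
        xs.append(x)
    return xs

traj = simulate_2ol(target=1.0)  # pointer trajectory sampled every dt seconds
```

Because the model is generative, the same trajectory yields velocity and acceleration profiles, and measures such as overshoot, settling time, and rise time can be read directly off `traj`.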
Information-theoretic analysis of multivariate single-cell signaling responses using SLEMI
Mathematical methods of information theory constitute essential tools to describe how stimuli are encoded in the activities of signaling effectors. Exploring the information-theoretic perspective, however, remains conceptually, experimentally, and computationally challenging. Specifically, existing computational tools enable efficient analysis of relatively simple systems only, usually with a single input and output, and robust, readily applicable implementations are missing. Here, we propose a novel algorithm to analyze signaling data within the framework of information theory. Our approach enables robust as well as statistically and computationally efficient analysis of signaling systems with high-dimensional outputs and a large number of input values. Analysis of NF-κB single-cell signaling responses to TNF-α uniquely reveals that NF-κB signaling dynamics improve discrimination of high concentrations of TNF-α with only a modest impact on discrimination of low concentrations. Our readily applicable R package, SLEMI (statistical learning based estimation of mutual information), allows the approach to be used by computational biologists with only elementary knowledge of information theory.
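SLEMI itself is an R package, but the decoding idea behind statistical-learning MI estimation can be sketched in a few lines: fit a probabilistic decoder of the stimulus from the response, then average the log posterior-to-prior ratio over samples. The sketch below uses a class-conditional Gaussian decoder as a stand-in for SLEMI's regression-based one; the function name, parameters, and simulated data are illustrative assumptions, not the package's API.

```python
import math
import random

def estimate_mi_decoding(stimuli, responses):
    """Decoding-based plug-in estimate of I(S; Y) in bits for a discrete
    stimulus S and a scalar response Y. A class-conditional Gaussian model
    stands in for a learned decoder: I(S;Y) ~= E[log2 p(s|y) / p(s)]."""
    labels = sorted(set(stimuli))
    n = len(stimuli)
    prior = {s: stimuli.count(s) / n for s in labels}
    params = {}
    for s in labels:  # fit mean and variance of Y for each stimulus level
        ys = [y for si, y in zip(stimuli, responses) if si == s]
        mu = sum(ys) / len(ys)
        var = sum((y - mu) ** 2 for y in ys) / len(ys)
        params[s] = (mu, var)

    def lik(s, y):  # Gaussian likelihood p(y | s)
        mu, var = params[s]
        return math.exp(-(y - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

    total = 0.0
    for s, y in zip(stimuli, responses):
        post = prior[s] * lik(s, y) / sum(prior[t] * lik(t, y) for t in labels)
        total += math.log2(post / prior[s])
    return total / n

# toy channel: binary stimulus, noisy scalar response with separated means
random.seed(0)
stim = [random.choice([0, 1]) for _ in range(4000)]
resp = [random.gauss(3.0 * s, 1.0) for s in stim]
mi = estimate_mi_decoding(stim, resp)  # well below the 1-bit ceiling
```

In SLEMI the response can be high-dimensional and the decoder is fitted by statistical learning; this 1-D Gaussian sketch only illustrates the estimator's structure.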
Communication Theoretic Data Analytics
Widespread use of the Internet and social networks drives the generation of big data, which is proving useful in a number of applications. To deal with explosively growing amounts of data, data analytics has emerged as a critical technology related to computing, signal processing, and information networking. In this paper, a formalism is considered in which data is modeled as a generalized social network, and communication theory and information theory are thereby extended to data analytics. First, the creation of an equalizer to optimize information transfer between two data variables is considered, and financial data is used to demonstrate the advantages. Then, an information coupling approach based on information geometry is applied for dimensionality reduction, with a pattern recognition example to illustrate its effectiveness. These initial trials suggest the potential of communication-theoretic data analytics for a wide range of applications.
Comment: Published in IEEE Journal on Selected Areas in Communications, Jan. 201
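The "information transfer between two data variables" that such an equalizer is tuned to maximize is, at base, their mutual information. A minimal plug-in estimator for two discrete sequences is sketched below; this is a generic illustration of the quantity involved, not the paper's equalizer design.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in mutual information I(X;Y) in bits between two equal-length
    discrete sequences, from their empirical joint and marginal frequencies."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))  # joint counts
    px = Counter(xs)            # marginal counts of X
    py = Counter(ys)            # marginal counts of Y
    return sum(c / n * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())
```

Perfectly coupled binary variables give 1 bit; independent ones give 0, so an equalizer between data variables can be scored by how much this quantity increases.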
Partial information decomposition as a unified approach to the specification of neural goal functions
In many neural systems anatomical motifs are present repeatedly, but despite their structural similarity they can serve very different tasks. A prime example of such a motif is the canonical microcircuit of six-layered neocortex, which is repeated across cortical areas and is involved in a number of different tasks (e.g. sensory, cognitive, or motor tasks). This observation has spawned interest in finding a common underlying principle, a ‘goal function’, of information processing implemented in this structure. By definition, such a goal function, if universal, cannot be cast in processing-domain-specific language (e.g. ‘edge filtering’, ‘working memory’). Thus, to formulate such a principle, we have to use a domain-independent framework, and information theory offers such a framework. However, while the classical framework of information theory focuses on the relation between one input and one output (Shannon’s mutual information), we argue that neural information processing crucially depends on the combination of multiple inputs to create the output of a processor. To account for this, we use a recent extension of Shannon information theory called partial information decomposition (PID). PID makes it possible to quantify the information that several inputs provide individually (unique information), redundantly (shared information), or only jointly (synergistic information) about the output. First, we review the framework of PID. Then we apply it to reevaluate and analyze several earlier proposals of information-theoretic neural goal functions (predictive coding, infomax and coherent infomax, efficient coding). We find that PID allows these goal functions to be compared in a common framework, and also provides a versatile approach to designing new goal functions from first principles. Building on this, we design and analyze a novel goal function, called ‘coding with synergy’, which combines external input and prior knowledge in a synergistic manner. We suggest that this novel goal function may be highly useful in neural information processing.
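The canonical toy case for synergistic information is XOR: each input alone tells you nothing about the output, yet together they determine it completely. For this special case classical mutual information already isolates the synergy (a full PID additionally requires a redundancy measure, e.g. Williams and Beer's, which is not implemented here); the sketch below is an illustration, not the authors' code.

```python
import math
from collections import Counter

def mi(pairs):
    """Plug-in mutual information (bits) between the two components of `pairs`."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(a for a, _ in pairs)
    py = Counter(b for _, b in pairs)
    return sum(c / n * math.log2((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

# XOR truth table, all four input combinations equally likely
table = [(x1, x2, x1 ^ x2) for x1 in (0, 1) for x2 in (0, 1)]

i1 = mi([(x1, y) for x1, x2, y in table])            # input 1 alone: 0 bits
i2 = mi([(x2, y) for x1, x2, y in table])            # input 2 alone: 0 bits
ijoint = mi([((x1, x2), y) for x1, x2, y in table])  # both inputs: 1 bit
synergy = ijoint - i1 - i2  # for XOR, all 1 bit is synergistic
```

For less extreme processors the individual and joint terms mix unique, shared, and synergistic parts, which is exactly what a PID redundancy measure is needed to disentangle.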
Bits from Biology for Computational Intelligence
Computational intelligence is broadly defined as biologically inspired computing, and inspiration is usually drawn from neural systems. This article shows how to analyze neural systems using information theory to obtain constraints that help identify the algorithms run by such systems and the information they represent. Algorithms and representations identified information-theoretically may then guide the design of biologically inspired computing systems (BICS). The material covered includes the necessary introduction to information theory and the estimation of information-theoretic quantities from neural data. We then show how to analyze the information encoded in a system about its environment, and also discuss recent methodological developments on the question of how much information each agent carries about the environment uniquely, redundantly, or synergistically together with others. Finally, we introduce the framework of local information dynamics, where information processing is decomposed into the component processes of information storage, transfer, and modification, locally in space and time. We close by discussing example applications of these measures to neural data and other complex systems.
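One building block of local information dynamics is local active information storage: for each time step, how informative the recent past of a process is about its present, a(n) = log2 p(x_n | past) / p(x_n). The sketch below is a plug-in estimator for a discrete series, written as an illustration of the framework rather than a reimplementation of the authors' tooling.

```python
import math
from collections import Counter

def local_storage(series, k=1):
    """Local active information storage a(n) = log2 p(x_n | past_k) / p(x_n)
    for a discrete series, with history length k. Positive values mean the
    past is predictive of (stores information about) the present; plug-in
    estimate from empirical frequencies."""
    n = len(series)
    past = [tuple(series[i - k:i]) for i in range(k, n)]
    cur = series[k:]
    m = len(cur)
    pj = Counter(zip(past, cur))  # joint counts of (history, present)
    pp = Counter(past)            # history counts
    pc = Counter(cur)             # present counts
    return [math.log2((pj[(h, x)] / m) / ((pp[h] / m) * (pc[x] / m)))
            for h, x in zip(past, cur)]

# a strictly periodic process stores maximal information in its past:
# every local value is near 1 bit for a binary alternating series
vals = local_storage([0, 1] * 50)
```

Local transfer entropy and modification measures follow the same pattern, conditioning on the past of other processes; being local in time, such measures yield a value per sample rather than a single average.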