Stochastic Analysis
The meeting took place on May 30-June 3, 2011, with over 55 people in attendance. Each day had six to seven talks of varying length (some talks were 30 minutes long), except for Thursday: the traditional hike was moved to Thursday because of the weather (and the weather on Thursday was indeed fine).
The talks reviewed directions in which progress in the general field of stochastic analysis has occurred since the last meeting on this theme in Oberwolfach three years ago. Several themes were covered in some depth, in addition to a broad overview of recent developments. Among these themes, a prominent role was played by random matrices, random surfaces/planar maps and their scaling limits, the KPZ universality class, and the interplay between SLE (Schramm-Loewner evolution) and the GFF (Gaussian free field).
Fractals in the Nervous System: Conceptual Implications for Theoretical Neuroscience
This essay is presented with two principal objectives in mind: first, to document the prevalence of fractals at all levels of the nervous system, giving credence to the notion of their functional relevance; and second, to draw attention to the still unresolved issues of the detailed relationships among power-law scaling, self-similarity, and self-organized criticality. As regards criticality, I will document that it has become a pivotal reference point in neurodynamics. Furthermore, I will emphasize the not yet fully appreciated significance of allometric control processes. For dynamic fractals, I will assemble reasons for attributing to them the capacity to adapt task execution to contextual changes across a range of scales. The final section consists of general reflections on the implications of the reviewed data and identifies what appear to be issues of fundamental importance for future research in the rapidly evolving topic of this review.
Particles and fields in fluid turbulence
The understanding of fluid turbulence has progressed considerably in recent years. The application of the methods of statistical mechanics to the description of the motion of fluid particles, i.e. to the Lagrangian dynamics, has led to a new quantitative theory of intermittency in turbulent transport. The first analytical description of anomalous scaling laws in turbulence has been obtained. The underlying physical mechanism reveals the role of statistical integrals of motion in non-equilibrium systems. For turbulent transport, the statistical conservation laws are hidden in the evolution of groups of fluid particles and arise from the competition between the expansion of a group and the change of its geometry. By breaking the scale-invariance symmetry, the statistically conserved quantities lead to the observed anomalous scaling of transported fields. Lagrangian methods also shed new light on practical issues such as mixing and the turbulent magnetic dynamo.
Information processing in biology
To survive, organisms must respond appropriately to a variety of challenges posed by a dynamic and uncertain environment. The mechanisms underlying such responses can in general be framed as input-output devices which map environmental states (inputs) to associated responses (outputs). In this light, it is appealing to model these systems using information theory, a well-developed mathematical framework for describing input-output systems.
Under the information-theoretic perspective, an organism's behavior is fully characterized by the repertoire of its outputs under different environmental conditions. Due to natural selection, it is reasonable to assume this input-output mapping has been fine-tuned in such a way as to maximize the organism's fitness. If that is the case, it should be possible to abstract away the mechanistic implementation details and obtain the general principles that lead to fitness in a given environment. These can then be used inferentially both to generate hypotheses about the underlying implementation and to predict novel responses under external perturbations.
In this work I use information theory to address the question of how biological systems generate complex outputs using relatively simple mechanisms in a robust manner. In particular, I will examine how communication and distributed processing can lead to emergent phenomena which allow collective systems to respond in a much richer way than a single organism could.
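The input-output framing above can be made concrete with the central quantity of such analyses, the mutual information I(X;Y), which measures how many bits an organism's outputs carry about environmental states. A minimal sketch, where the two joint distributions are illustrative toy channels rather than examples from the abstract:

```python
import math

def mutual_information(joint):
    """Mutual information I(X;Y) in bits from a joint distribution p(x, y),
    given as a nested list: joint[i][j] = P(X = i, Y = j)."""
    px = [sum(row) for row in joint]            # marginal over inputs
    py = [sum(col) for col in zip(*joint)]      # marginal over outputs
    mi = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:
                mi += p * math.log2(p / (px[i] * py[j]))
    return mi

# A noiseless binary channel: the input fully determines the output (1 bit).
noiseless = [[0.5, 0.0], [0.0, 0.5]]
# A useless channel: the output is independent of the input (0 bits).
useless = [[0.25, 0.25], [0.25, 0.25]]
print(mutual_information(noiseless))  # 1.0
print(mutual_information(useless))    # 0.0
```

The same function applies to any discrete environment-response table; a response repertoire that discriminates environmental states well yields higher I(X;Y).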
Power Laws in Economics and Finance
A power law is the form taken by a large number of surprising empirical regularities in economics and finance. This article surveys well-documented empirical power laws concerning income and wealth, the size of cities and firms, stock market returns, trading volume, international trade, and executive pay. It reviews detail-independent theoretical motivations that make sharp predictions concerning the existence and coefficients of power laws, without requiring delicate tuning of model parameters. These theoretical mechanisms include random growth, optimization, and the economics of superstars coupled with extreme value theory. Some of the empirical regularities currently lack an appropriate explanation. This article highlights these open areas for future research.
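As an illustrative sketch of the kind of regularity the article surveys: a power-law (Pareto) tail P(X > x) = (x_min/x)^alpha can be simulated by inverse-transform sampling, and its exponent recovered with the maximum-likelihood (Hill) estimator. The parameter values below are assumptions chosen for the demo, not figures from the article:

```python
import math
import random

def pareto_sample(alpha, xmin, n, seed=0):
    """Draw n samples with a Pareto tail P(X > x) = (xmin / x)**alpha
    via inverse-transform sampling of a uniform variate."""
    rng = random.Random(seed)
    return [xmin * (1.0 - rng.random()) ** (-1.0 / alpha) for _ in range(n)]

def hill_estimator(xs, xmin):
    """Maximum-likelihood (Hill) estimate of the tail exponent alpha
    from observations at or above xmin."""
    tail = [x for x in xs if x >= xmin]
    return len(tail) / sum(math.log(x / xmin) for x in tail)

xs = pareto_sample(alpha=2.0, xmin=1.0, n=100_000)
print(round(hill_estimator(xs, 1.0), 2))  # close to the true alpha of 2.0
```

The same estimator is what one would apply to city sizes, firm sizes, or top incomes to test whether a claimed power law holds and with what coefficient; using 1 - U rather than U in the sampler avoids a zero draw from `random()`, whose range is [0, 1).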