The evolution of social norms
Evolutionary game theory provides the tools to analyze which strategies, or patterns of behaviour, emerge over time through a process of adaptation. Social norms can be defined as patterns of behaviour with certain characteristics, so evolutionary game theory offers one perspective on how social norms are formed and maintained. Prisoner's dilemma games can be used to study the conditions under which cooperative norms emerge, and bargaining games can be used to address the formation of fairness norms. However, because the framework is most congenial to analyzing norms centred on material payoffs, it is not a given that evolutionary game theory can adequately address norms concerning rights or virtues.
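The adaptive process this abstract refers to is usually modelled with replicator dynamics. A minimal sketch for the one-shot prisoner's dilemma follows; the payoff numbers are the standard textbook values, chosen here purely for illustration:

```python
import numpy as np

# Prisoner's dilemma payoff matrix (illustrative textbook values):
# rows = focal strategy (C, D), columns = opponent strategy (C, D).
payoff = np.array([[3.0, 0.0],
                   [5.0, 1.0]])

x = np.array([0.9, 0.1])  # population shares of cooperators and defectors

for _ in range(500):
    fitness = payoff @ x              # expected payoff of each strategy
    mean_fitness = x @ fitness        # population-average payoff
    x = x * fitness / mean_fitness    # discrete-time replicator update

# Defection strictly dominates in the one-shot game, so the cooperator
# share collapses towards zero under the replicator dynamics.
```

Cooperative norms survive in such models only when the game is modified, e.g. by repetition, punishment, or assortative matching.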
Health Figures: An Open Source JavaScript Library for Health Data Visualization
The way we look at data has a great impact on how well we can understand it,
particularly when the data is related to health and wellness. Due to the
increased use of self-tracking devices and the ongoing shift towards preventive
medicine, a better understanding of our health data is an important part of
improving the general welfare of citizens. Electronic Health Records,
self-tracking devices and mobile applications provide a rich variety of data,
but that data is often difficult to understand. We implemented the hFigures
library, inspired by the hGraph visualization, with additional improvements. The
purpose of the library is to provide a visual representation of the evolution
of health measurements in a complete and useful manner. We researched the
usefulness and usability of the library by building an application for health
data visualization in a health coaching program. We performed a user evaluation
with Heuristic Evaluation, Controlled User Testing and Usability
Questionnaires. In the Heuristic Evaluation the average response was 6.3 out
of 7 points, and the Cognitive Walkthrough performed by usability experts indicated
no design or mismatch errors. In the CSUQ usability test the system obtained an
average score of 6.13 out of 7, and in the ASQ usability test the overall
satisfaction score was 6.64 out of 7. We developed hFigures, an open source
library for visualizing a complete, accurate and normalized graphical
representation of health data. The idea is based on the concept of the hGraph
but it provides additional key features, including a comparison of multiple
health measurements over time. We conducted a usability evaluation of the
library as a key component of an application for health and wellness
monitoring. The results indicate that the data visualization library was
helpful in assisting users in understanding health data and its evolution over
time. Comment: BMC Medical Informatics and Decision Making 16.1 (2016).
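The "normalized graphical representation" the abstract mentions rests on mapping heterogeneous measurements onto a common scale. The sketch below illustrates the idea; the function name and the convention of mapping the healthy range onto [0.25, 0.75] are assumptions for illustration, not the library's actual API:

```python
def normalize(value, lo, hi):
    """Map a raw health measurement onto a common scale in which the
    recommended range [lo, hi] occupies [0.25, 0.75], so readings in
    different units can be drawn on the same chart."""
    return 0.25 + 0.5 * (value - lo) / (hi - lo)

# A systolic blood pressure of 120 mmHg against a 90-140 mmHg range
# lands inside the "healthy" band of the normalized scale:
score = normalize(120, 90, 140)  # 0.55
```

With such a mapping, a cholesterol reading and a blood-pressure reading become directly comparable positions on one radial chart, which is the core of the hGraph concept.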
A chemical model for the interstellar medium in galaxies
We present and test chemical models for three-dimensional hydrodynamical
simulations of galaxies. We explore the effect of changing key parameters such
as metallicity, radiation and non-equilibrium versus equilibrium metal cooling
approximations on the transition between the gas phases in the interstellar
medium. The microphysics is modelled by employing the public chemistry package
KROME and the chemical networks have been tested to work in a wide range of
densities and temperatures. We describe a simple H/He network following the
formation of H2, and a more sophisticated network which includes metals.
Photochemistry, thermal processes, and different prescriptions for the H2
catalysis on dust are presented and tested within a one-zone framework. The
resulting network is made publicly available on the KROME webpage. We find that
employing an accurate treatment of the dust-related processes induces a faster
HI--H2 transition. In addition, we show when the equilibrium assumption for
metal cooling holds, and how a non-equilibrium approach affects the thermal
evolution of the gas and the HII--HI transition. These models can be employed
in any hydrodynamical code via an interface to KROME and can be applied to
different problems including isolated galaxies, cosmological simulations of
galaxy formation and evolution, supernova explosions in molecular clouds, and
the modelling of star-forming regions. The metal network can be used for a
comparison with observational data of CII 158 μm emission both for
high-redshift as well as for local galaxies. Comment: A&A accepted.
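The kind of rate-equation integration such a chemical network performs can be sketched with a dimensionless two-species toy model; the rate values below are arbitrary illustrations, not KROME's actual coefficients:

```python
# Toy H/H2 balance: H2 forms on dust grains at rate R * n_H and is
# photodissociated at rate k * n_H2; hydrogen nuclei are conserved.
R, k = 1.0, 4.0      # formation / photodissociation rates (assumed)
n_tot = 1.0          # conserved nuclei density: n_H + 2 * n_H2
n_H2, dt = 0.0, 1e-3

# Explicit Euler integration towards chemical equilibrium.
for _ in range(20000):
    n_H = n_tot - 2.0 * n_H2
    n_H2 += dt * (R * n_H - k * n_H2)

# Analytic equilibrium: n_H2 = R * n_tot / (k + 2 * R) = 1/6.
```

A real network couples many such equations (with photochemistry and thermal terms) into a stiff ODE system, which is why packages like KROME use implicit solvers rather than the explicit step shown here.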
Infinite average lifetime of an unstable bright state in the green fluorescent protein
The time evolution of the fluorescence intensity emitted by well-defined
ensembles of Green Fluorescent Proteins has been studied by using a standard
confocal microscope. In contrast with previous results obtained in single
molecule experiments, the photo-bleaching of the ensemble is well described by
a model based on Lévy statistics. Moreover, this simple theoretical model
allows us to obtain information about the energy-scales involved in the aging
process. Comment: 4 pages, 4 figures.
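A minimal sketch of why heavy-tailed bleaching times yield an infinite average lifetime and a non-exponential ensemble decay; the tail index and sample size are arbitrary choices for illustration:

```python
import random

random.seed(0)

# Each protein bleaches after a Pareto-distributed time with tail index
# alpha < 1, so the mean bleaching time diverges. The ensemble intensity
# then decays as the power law S(t) = t**(-alpha), not an exponential.
alpha, N = 0.5, 100_000
bleach_times = [random.paretovariate(alpha) for _ in range(N)]

def intensity(t):
    """Fraction of the ensemble still fluorescing at time t (t >= 1)."""
    return sum(1 for x in bleach_times if x > t) / N

# intensity(4) is close to 4**-0.5 = 0.5, intensity(16) to 0.25:
# halving the signal takes longer and longer, the signature of aging.
```

This is the qualitative picture behind Lévy-statistics models of photobleaching: no single rate constant exists, so an ensemble "lifetime" is undefined.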
First passages in bounded domains: When is the mean first passage time meaningful?
We study the first passage statistics to absorbing boundaries of a Brownian
motion in bounded two-dimensional domains of different shapes and
configurations of the absorbing and reflecting boundaries. From extensive
numerical analysis we obtain the probability distribution P(ω) of the
random variable ω = τ₁/(τ₁+τ₂), which measures the similarity of the
first passage times τ₁ and τ₂ of two independent realisations of a
Brownian walk starting at the same location. We construct a
chart for each domain, determining whether P(ω) has a unimodal,
bell-shaped form or a bimodal, M-shaped one. While in the former case
the mean first passage time (MFPT) is a valid characteristic of the first
passage behaviour, in the latter case it is an insufficient measure for the
process. Strikingly, we find a distinct turnover between the two modes of
P(ω), characteristic of the domain shape and the respective location of
absorbing and reflective boundaries. Our results demonstrate that large
fluctuations of the first passage times may occur frequently in two-dimensional
domains, calling into question the general use of the MFPT as a robust
measure of the actual behaviour even in bounded domains, in which all
moments of the first passage distribution exist. Comment: 9 pages, 6 figures.
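The statistic ω can be estimated by direct simulation. The sketch below uses the simplest case, a unit disk with a fully absorbing boundary, and an assumed time step; both are illustrative choices, not the paper's setup:

```python
import math
import random

random.seed(1)

def first_passage_time(R=1.0, dt=1e-3):
    """Time for a Brownian walk started at the disk centre to reach |r| = R."""
    x = y = t = 0.0
    s = math.sqrt(dt)
    while x * x + y * y < R * R:
        x += s * random.gauss(0.0, 1.0)
        y += s * random.gauss(0.0, 1.0)
        t += dt
    return t

# omega compares two independent first passage times from the same start.
omegas = []
for _ in range(200):
    t1, t2 = first_passage_time(), first_passage_time()
    omegas.append(t1 / (t1 + t2))

mean_omega = sum(omegas) / len(omegas)
# By symmetry the mean of omega is 1/2 in every domain; it is the *shape*
# of P(omega), unimodal versus M-shaped, that decides whether the MFPT
# is representative of individual trajectories.
```

An M-shaped histogram of `omegas` would mean two runs of the same process typically disagree wildly, which is precisely when quoting only the MFPT is misleading.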
Designing for mathematical abstraction
Our focus is on the design of systems (pedagogical, technical, social) that encourage mathematical abstraction, a process we refer to as designing for abstraction. In this paper, we draw on detailed design experiments from our research on children's understanding about chance and distribution to re-present this work as a case study in designing for abstraction. Through the case study, we elaborate a number of design heuristics that we claim are also identifiable in the broader literature on designing for mathematical abstraction. Our previous work on the micro-evolution of mathematical knowledge indicated that new mathematical abstractions are routinely forged in activity with available tools and representations, coordinated with relatively naïve unstructured knowledge. In this paper, we identify the role of design in steering the micro-evolution of knowledge towards the focus of the designer's aspirations. A significant finding from the current analysis is the identification of a heuristic in designing for abstraction that requires the intentional blurring of the key mathematical concepts with the tools whose use might foster the construction of that abstraction. It is commonly recognized that meaningful design constructs emerge from careful analysis of children's activity in relation to the designer's own framework for mathematical abstraction. The case study in this paper emphasizes the insufficiency of such a model for the relationship between epistemology and design. In fact, the case study characterises the dialectic relationship between epistemological analysis and design, in which the theoretical foundations of designing for abstraction and for the micro-evolution of mathematical knowledge can co-emerge. © 2010 Springer Science+Business Media B.V.
Boltzmann meets Nash: Energy-efficient routing in optical networks under uncertainty
Motivated by the massive deployment of power-hungry data centers for service
provisioning, we examine the problem of routing in optical networks with the
aim of minimizing traffic-driven power consumption. To tackle this issue,
routing must take into account energy efficiency as well as capacity
considerations; moreover, in rapidly-varying network environments, this must be
accomplished in a real-time, distributed manner that remains robust in the
presence of random disturbances and noise. In view of this, we derive a pricing
scheme whose Nash equilibria coincide with the network's socially optimum
states, and we propose a distributed learning method based on the Boltzmann
distribution of statistical mechanics. Using tools from stochastic calculus, we
show that the resulting Boltzmann routing scheme exhibits remarkable
convergence properties under uncertainty: specifically, the long-term average
of the network's power consumption converges within of its
minimum value in time which is at most ,
irrespective of the fluctuations' magnitude; additionally, if the network
admits a strict, non-mixing optimum state, the algorithm converges to it,
again no matter the noise level. Our analysis is supplemented by extensive
numerical simulations which show that Boltzmann routing can lead to a
significant decrease in power consumption over basic, shortest-path routing
schemes in realistic network conditions. Comment: 24 pages, 4 figures.
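The Boltzmann-distribution idea at the core of such a scheme can be sketched as a softmax over per-route power costs; the cost values and temperature below are placeholders for illustration, not the paper's actual pricing scheme:

```python
import math

def boltzmann_weights(costs, temperature):
    """Gibbs/Boltzmann route probabilities: p_i proportional to
    exp(-cost_i / T). Subtracting the minimum cost before
    exponentiating keeps the values numerically well-scaled."""
    m = min(costs)
    w = [math.exp(-(c - m) / temperature) for c in costs]
    z = sum(w)
    return [v / z for v in w]

costs = [3.0, 5.0, 4.0]                 # hypothetical per-route power costs
cold = boltzmann_weights(costs, 0.1)    # low T: mass on the cheapest route
hot = boltzmann_weights(costs, 100.0)   # high T: near-uniform exploration
```

The temperature controls the exploration/exploitation trade-off that makes such schemes robust to noisy cost measurements: high temperature keeps probing all routes, while cooling concentrates traffic on the energy-optimal one.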