The dynamics of financial stability in complex networks
We address the problem of banking-system resilience by applying
off-equilibrium statistical physics to a system of particles, representing the
economic agents, modelled according to the theoretical foundation of
current banking regulation, the so-called Merton-Vasicek model. Economic agents
are attracted to each other to exchange 'economic energy', forming a network of
trades. When the capital level of an economic agent drops below a minimum, the
agent becomes insolvent. The insolvency of a single economic agent
affects the economic energy of all its neighbours, which thus become susceptible
to insolvency themselves, potentially triggering a chain of insolvencies (an avalanche). We
show that the distribution of avalanche sizes follows a power law whose
exponent depends on the minimum capital level. Furthermore, we present evidence
that, under an increase in the minimum capital level, large crashes are
avoided only if one assumes that agents accept a drop in business levels
while keeping their trading attitudes and policies unchanged. The alternative
assumption, that agents try to restore their business levels, may lead to
the unexpected consequence that large crises occur with higher probability.
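As an illustration of the threshold-cascade mechanism this abstract describes, below is a minimal Python sketch of an insolvency avalanche on a random trade network. It is not the paper's particle model: the network generator, the capital initialisation, and the `shock` drain rule are simplifying assumptions made here for illustration.

```python
import random
import networkx as nx  # assumed dependency for the trade network

def simulate_avalanche(n=500, p=0.02, min_capital=0.2, shock=0.5, seed=0):
    """Toy insolvency cascade on a random trade network: when an agent's
    capital falls below min_capital it fails and drains 'economic
    energy' from its neighbours, which may fail in turn."""
    rng = random.Random(seed)
    g = nx.gnp_random_graph(n, p, seed=seed)  # stand-in for the trade network
    capital = {v: rng.uniform(min_capital, 1.0) for v in g}
    start = rng.randrange(n)
    capital[start] = 0.0                      # initial shock: one failure
    insolvent, frontier = set(), [start]
    while frontier:
        agent = frontier.pop()
        if agent in insolvent:
            continue
        insolvent.add(agent)
        for nb in g.neighbors(agent):
            if nb not in insolvent:
                capital[nb] -= shock * rng.random()
                if capital[nb] < min_capital:
                    frontier.append(nb)
    return len(insolvent)  # avalanche size

# Histogramming sizes over many runs probes the avalanche-size distribution.
sizes = [simulate_avalanche(seed=s) for s in range(200)]
```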
When Models Interact with their Subjects: The Dynamics of Model Aware Systems
A scientific model need not be a passive, static descriptor of its
subject. If the subject is affected by the model, the model must be updated to
explain its affected subject. In this study, two models of the dynamics
of model-aware systems are presented. The first explores the behavior of
"prediction-seeking" (PSP) and "prediction-avoiding" (PAP) populations under
the influence of a model that describes them. The second explores the
publishing behavior of a group of experimentalists coupled to a model by means
of confirmation bias. It is found that model-aware systems can exhibit
convergent, random, or oscillatory behavior and display universal 1/f noise. A
numerical simulation of the physical experimentalists is compared with actual
publications of neutron lifetime and Λ mass measurements and is in
good quantitative agreement.
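To make the prediction-seeking versus prediction-avoiding idea concrete, here is a hedged Python sketch of one possible model-aware feedback loop. The update rule, the learning rate `lr`, and the noise level are assumptions of this sketch, not the paper's equations.

```python
import random

def model_aware_dynamics(steps=200, n=1000, frac_psp=0.5, lr=0.2, seed=1):
    """Toy feedback loop: the model publishes a prediction of the mean
    opinion; seekers (PSP) drift toward it, avoiders (PAP) away from it,
    and the model is then refit to its affected subject."""
    rng = random.Random(seed)
    opinions = [rng.uniform(-1.0, 1.0) for _ in range(n)]
    seekers = set(rng.sample(range(n), int(frac_psp * n)))
    prediction = sum(opinions) / n
    history = []
    for _ in range(steps):
        for i in range(n):
            target = prediction if i in seekers else -prediction
            opinions[i] += lr * (target - opinions[i]) + 0.05 * rng.gauss(0, 1)
            opinions[i] = max(-1.0, min(1.0, opinions[i]))
        prediction = sum(opinions) / n  # model updated to fit its subject
        history.append(prediction)
    return history  # convergent, noisy, or oscillatory depending on frac_psp
```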
Crises and collective socio-economic phenomena: simple models and challenges
Financial and economic history is strewn with bubbles and crashes, booms and
busts, crises and upheavals of all sorts. Understanding the origin of these
events is arguably one of the most important problems in economic theory. In
this paper, we review recent efforts to include heterogeneities and
interactions in models of decision-making. We argue that the Random Field Ising
Model (RFIM) indeed provides a unifying framework to account for many collective
socio-economic phenomena that lead to sudden ruptures and crises. We discuss
different models that can capture potentially destabilising self-referential
feedback loops, induced either by herding, i.e. reference to peers, or
trending, i.e. reference to the past, and that account for some of the phenomenology
missing in the standard models. We discuss some empirically testable
predictions of these models, for example robust signatures of RFIM-like herding
effects, or the logarithmic decay of spatial correlations of voting patterns.
One of the most striking results, inspired by statistical-physics methods, is
that Adam Smith's invisible hand can badly fail at solving simple coordination
problems. We also insist on the issue of time scales, which can be extremely
long in some cases and prevent socially optimal equilibria from being reached. As a
theoretical challenge, the study of decision rules that violate so-called
"detailed balance" is needed to decide whether conclusions based on current models
(which all assume detailed balance) are indeed robust and generic.
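Since the review centres on the RFIM, a minimal zero-temperature mean-field sketch may help fix ideas. The parameters `J` (imitation strength), `F` (common incentive), and `sigma` (heterogeneity of private preferences) follow standard RFIM notation; the specific dynamics below are a textbook variant, not any particular model from the review.

```python
import random

def rfim_sweep(n=10_000, J=1.2, F=0.0, sigma=1.0, sweeps=50, seed=2):
    """Zero-temperature mean-field RFIM: agent i adopts
    s_i = sign(f_i + F + J*m), where f_i is a quenched private field,
    F a common incentive, and m the average choice (social pressure)."""
    rng = random.Random(seed)
    f = [rng.gauss(0.0, sigma) for _ in range(n)]  # heterogeneous agents
    s = [-1] * n                                   # everyone starts at 'no'
    for _ in range(sweeps):
        m = sum(s) / n
        s = [1 if fi + F + J * m > 0 else -1 for fi in f]
    return sum(s) / n  # average decision after relaxation
```

Sweeping `F` slowly up and down at large enough `J/sigma` produces the hysteresis and sudden ruptures that the review associates with crises.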
Identification of clusters of investors from their real trading activity in a financial market
We use statistically validated networks, a recently introduced method to
validate links in a bipartite system, to identify clusters of investors trading
in a financial market. Specifically, we investigate a special database that
allows us to track the trading activity of individual investors in the Nokia stock. We
find that many statistically detected clusters of investors show a very high
degree of synchronization, both in the times at which they decide to trade and in the
trading action taken. We investigate the composition of these clusters and
find that several of them show an over-expression of specific categories of
investors.
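A minimal sketch of the link-validation step behind statistically validated networks: under a null hypothesis of random co-activity, the number of days on which two investors act together is hypergeometrically distributed, and only links surviving a multiple-comparison correction are kept. The function names and the Bonferroni choice are assumptions of this sketch (the literature also uses FDR corrections).

```python
from scipy.stats import hypergeom

def link_pvalue(n_days, d_i, d_j, n_ij):
    """P(co-activity >= n_ij) under the hypergeometric null: investor i
    is active on d_i of n_days trading days, investor j on d_j, and
    their overlaps occur at random."""
    return hypergeom.sf(n_ij - 1, n_days, d_i, d_j)

def link_is_validated(n_days, d_i, d_j, n_ij, n_tests, alpha=0.01):
    """Keep the link only if it survives a Bonferroni correction over
    all tested investor pairs."""
    return link_pvalue(n_days, d_i, d_j, n_ij) < alpha / n_tests
```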
A simple and sensitive method for determination of Norfloxacin in pharmaceutical preparations
In this work, a new voltammetric method for the determination of norfloxacin is proposed, offering high sensitivity and a wide linear detection range. The voltammetric sensor was fabricated simply by coating a glassy carbon electrode with a composite film of graphene oxide (GO) and Nafion. The key advantage of the proposed method is its sensitive electrochemical response to norfloxacin, attributed to the excellent electrical conductivity of GO and the accumulating function of Nafion. Under optimum experimental conditions, the method showed a good linear response for the determination of norfloxacin in the range of 1×10⁻⁸ mol/L to 7×10⁻⁶ mol/L, with a detection limit of 5×10⁻⁹ mol/L. The proposed method was successfully applied to the determination of norfloxacin in capsules, with satisfactory results.
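For readers outside electrochemistry, the linear range and detection limit quoted above correspond to a standard calibration-curve workflow, sketched below with made-up numbers: the currents, blank noise, and the 3σ/slope convention are illustrative assumptions, not the paper's data.

```python
import numpy as np

# Hypothetical calibration points spanning the reported linear range
# (concentrations in mol/L; peak currents in microamps are invented).
conc = np.array([1e-8, 1e-7, 5e-7, 1e-6, 3e-6, 7e-6])
current = np.array([0.04, 0.38, 1.9, 3.7, 11.2, 26.1])

slope, intercept = np.polyfit(conc, current, 1)

def concentration(i_peak):
    """Invert the linear calibration to read a sample's concentration."""
    return (i_peak - intercept) / slope

# A common detection-limit convention: 3 x blank noise / slope.
blank_sd = 0.01  # microamps, hypothetical blank standard deviation
lod = 3 * blank_sd / slope
print(f"LOD ≈ {lod:.2e} mol/L")
```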
An Information Theoretic Criterion for Empirical Validation of Time Series Models
Simulation models suffer intrinsically from validation and comparison problems. The choice of a suitable indicator quantifying the distance between the model and the data is pivotal to model selection. However, how to validate and discriminate between alternative models is still an open problem calling for further investigation, especially in light of the increasing use of simulations in the social sciences. In this paper, we present an information-theoretic criterion to measure how closely a model's synthetic output replicates the properties of observable time series, without the need to resort to any likelihood function or to impose stationarity requirements. The indicator is sufficiently general to be applied to any kind of model able to simulate or predict time series data, from simple univariate models such as autoregressive moving average (ARMA) and Markov processes to more complex objects, including agent-based or dynamic stochastic general equilibrium models. More specifically, we use a simple function of the L-divergence computed at different block lengths in order to select the model that is best able to reproduce the distributions of time changes in the data. To evaluate the L-divergence, probabilities are estimated from observed frequencies, with a correction for the systematic bias. Finally, using a known data-generating process, we show how this indicator can be used to validate and discriminate between different models, providing a precise measure of the distance between each of them and the data.
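A hedged sketch of the kind of block-distribution comparison this abstract describes: symbolise the time changes, build empirical distributions of length-L blocks, and compute Lin's L-divergence (twice the Jensen-Shannon divergence) between model output and data. The discretisation scheme, bin count, and aggregation over block lengths are assumptions of this sketch; the paper's exact estimator and bias correction are not reproduced here.

```python
import math
from collections import Counter

def block_dist(series, L, bins=10):
    """Empirical distribution of length-L blocks of discretised changes."""
    changes = [b - a for a, b in zip(series, series[1:])]
    lo, hi = min(changes), max(changes)
    width = (hi - lo) / bins or 1.0
    sym = [min(int((c - lo) / width), bins - 1) for c in changes]
    blocks = [tuple(sym[i:i + L]) for i in range(len(sym) - L + 1)]
    n = len(blocks)
    return {b: k / n for b, k in Counter(blocks).items()}

def l_divergence(p, q):
    """Lin's L-divergence: D(p||m) + D(q||m) with m = (p + q) / 2."""
    d = 0.0
    for k in set(p) | set(q):
        pk, qk = p.get(k, 0.0), q.get(k, 0.0)
        m = 0.5 * (pk + qk)
        if pk:
            d += pk * math.log(pk / m)
        if qk:
            d += qk * math.log(qk / m)
    return d

# Score a simulated series against data across several block lengths;
# the model with the lowest score reproduces the data's change
# distributions best (summing over L is a choice made here).
def score(data, simulated, lengths=(1, 2, 3)):
    return sum(l_divergence(block_dist(data, L), block_dist(simulated, L))
               for L in lengths)
```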