Quantitative modelling of the human–Earth System: a new kind of science?
The five grand challenges set out for Earth System Science by the International Council for Science in 2010 require a true fusion of social science, economics and natural science—a fusion that has not yet been achieved. In this paper we propose that constructing quantitative models of the dynamics of the human–Earth system can serve as a catalyst for this fusion. We confront well-known objections to modelling societal dynamics by drawing lessons from the development of natural science over the last four centuries and applying them to social and economic science. First, we pose three questions that require real integration of the three fields of science. They concern the coupling of physical planetary boundaries via social processes; the extension of the concept of planetary boundaries to the human–Earth System; and the possibly self-defeating nature of the United Nations' Millennium Development Goals. Second, we ask whether there are regularities or 'attractors' in the human–Earth System analogous to those that prompted the search for laws of nature. We nominate some candidates and discuss why we should observe them given that human actors with foresight and intentionality play a fundamental role in the human–Earth System. We conclude that, at sufficiently large time and space scales, social processes are predictable in some sense. Third, we canvass some essential mathematical techniques that this research fusion must incorporate, and we ask what kind of data would be needed to validate or falsify our models. Finally, we briefly review the state of the art in quantitative modelling of the human–Earth System today and highlight a gap between so-called integrated assessment models applied at regional and global scale, which could be filled by a new scale of model.
Complex Systems: A Survey
A complex system is a system composed of many interacting parts, often called
agents, which displays collective behavior that does not follow trivially from
the behaviors of the individual parts. Examples include condensed matter
systems, ecosystems, stock markets and economies, biological evolution, and
indeed the whole of human society. Substantial progress has been made in the
quantitative understanding of complex systems, particularly since the 1980s,
using a combination of basic theory, much of it derived from physics, and
computer simulation. The subject is a broad one, drawing on techniques and
ideas from a wide range of areas. Here I give a survey of the main themes and
methods of complex systems science and an annotated bibliography of resources,
ranging from classic papers to recent books and reviews.
A minimal noise trader model with realistic time series properties
Simulations of agent-based models have shown that the stylized facts (unit-root, fat tails and volatility clustering) of financial markets have a possible explanation in the interactions among agents. However, the complexity, originating from the presence of non-linearity and interactions, often limits the analytical approach to the dynamics of these models. In this paper we show that even a very simple model of a financial market with heterogeneous interacting agents is capable of reproducing realistic statistical properties of returns, in close quantitative accordance with the empirical analysis. The simplicity of the system also permits some analytical insights using concepts from statistical mechanics and physics. In our model, the traders are divided into two groups: fundamentalists and chartists, and their interactions are based on a variant of the herding mechanism introduced by Kirman [22]. The statistical analysis of our simulated data shows long-term dependence in the auto-correlations of squared and absolute returns and hyperbolic decay in the tail of the distribution of the raw returns, both with estimated decay parameters in the same range as for empirical data. Theoretical analysis, however, excludes the possibility of 'true' scaling behavior because of the Markovian nature of the underlying process and the finite set of possible realized returns. The model, therefore, only mimics power law behavior. As with the phenomenological volatility models analyzed in LeBaron [25], the usual statistical tests are not able to distinguish between true and pseudo-scaling laws in the dynamics of our artificial market. Keywords: Herd Behavior, Speculative Dynamics, Fat Tails, Volatility Clustering
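The Kirman-style herding mechanism described in the abstract can be sketched in a few lines. The code below is a toy illustration, not the paper's calibration: all parameter values and the rule tying return volatility to the chartist fraction are assumptions made for the sketch. Agents switch between the fundamentalist and chartist camps through spontaneous conversion (eps) and pairwise recruitment (delta), and volatility rises with the chartist share, which is enough to produce clustered, fat-tailed returns.

```python
import random

def simulate_kirman_market(n_agents=100, n_steps=10000, eps=0.002,
                           delta=0.01, seed=0):
    """Toy Kirman-style herding market (hypothetical parameters).

    k is the number of chartists; each step one recruitment/conversion
    event may occur, and the return is Gaussian noise whose volatility
    grows with the chartist fraction."""
    rng = random.Random(seed)
    k = n_agents // 2
    returns = []
    for _ in range(n_steps):
        # Transition k -> k+1: a fundamentalist converts spontaneously (eps)
        # or is recruited by a chartist (delta), as in Kirman's ant model.
        if rng.random() < (n_agents - k) / n_agents * (eps + delta * k / (n_agents - 1)):
            k += 1
        # Transition k -> k-1: symmetric move for a chartist.
        elif rng.random() < k / n_agents * (eps + delta * (n_agents - k) / (n_agents - 1)):
            k -= 1
        chartist_frac = k / n_agents
        # Assumed link: more trend-following chartists -> higher volatility.
        sigma = 0.01 * (1 + 4 * chartist_frac)
        returns.append(rng.gauss(0.0, sigma))
    return returns

returns = simulate_kirman_market()
```

Plotting the simulated series, or comparing the kurtosis of `returns` against a pure Gaussian, shows the volatility-clustering effect the abstract refers to, though this sketch makes no claim to reproduce the paper's quantitative results.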
Functional Correlation Approach to Operational Risk in Banking Organizations
A Value-at-Risk based model is proposed to compute the adequate equity
capital necessary to cover potential losses due to operational risks, such as
human and system process failures, in banking organizations. Exploring the
analogy to a lattice gas model from physics, correlations between sequential
failures are modeled as functionally defined, heterogeneous couplings
between mutually supportive processes. In contrast to traditional risk models
for market and credit risk, where correlations are described by the covariance
of Gaussian processes, the dynamics of the model shows collective phenomena
such as bursts and avalanches of process failures.
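The collective bursts described above can be illustrated with a deliberately simplified cascade simulation. This is a hypothetical toy, not the paper's lattice-gas model: each process has a small baseline failure probability, and a failure raises the hazard of its coupled neighbours in the next step, so losses arrive in avalanches rather than as independent Gaussian events.

```python
import random

def simulate_failure_cascades(n_proc=50, n_steps=5000, base_p=0.001,
                              coupling=0.3, seed=1):
    """Toy cascade of process failures (all parameters are assumptions).

    Processes sit on a ring; a failure at step t raises the failure
    probability of the two neighbouring processes at step t+1,
    producing avalanche-like bursts of correlated failures."""
    rng = random.Random(seed)
    failed_prev = [False] * n_proc
    avalanche_sizes = []
    for _ in range(n_steps):
        failed = []
        for i in range(n_proc):
            # Extra hazard if a mutually supportive neighbour failed last step.
            neighbour_hit = failed_prev[(i - 1) % n_proc] or failed_prev[(i + 1) % n_proc]
            p = base_p + coupling * neighbour_hit
            failed.append(rng.random() < p)
        size = sum(failed)
        if size:
            avalanche_sizes.append(size)
        failed_prev = failed
    return avalanche_sizes

sizes = simulate_failure_cascades()
```

A histogram of `sizes` is strongly skewed: most steps see zero or one failure, while occasional multi-process avalanches dominate the tail, which is the qualitative contrast with Gaussian-covariance risk models that the abstract draws.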
The virtues and vices of equilibrium and the future of financial economics
The use of equilibrium models in economics springs from the desire for
parsimonious models of economic phenomena that take human reasoning into
account. This approach has been the cornerstone of modern economic theory. We
explain why this is so, extolling the virtues of equilibrium theory; then we
present a critique and describe why this approach is inherently limited, and
why economics needs to move in new directions if it is to continue to make
progress. We stress that this shouldn't be a question of dogma, but should be
resolved empirically. There are situations where equilibrium models provide
useful predictions and there are situations where they can never provide useful
predictions. There are also many situations where the jury is still out, i.e.,
where so far they fail to provide a good description of the world, but where
proper extensions might change this. Our goal is to convince the skeptics that
equilibrium models can be useful, but also to make traditional economists more
aware of the limitations of equilibrium models. We sketch some alternative
approaches and discuss why they should play an important role in future
research in economics.
Company-university collaboration in applying gamification to learning about insurance
Incorporating gamification into teaching and learning at universities is hampered by a shortage of quality, adapted educational video games. Large companies are leading in the creation of educational video games for their internal training or to enhance their public image, and universities can benefit from collaborating with them. The aim of this research is to evaluate, both objectively and subjectively, the potential of the simulation game BugaMAP (developed by the MAPFRE Foundation) for university teaching about insurance. To this end, we have assessed both the game itself and the experience of using the game as perceived by 142 economics students from various degree programmes and courses at the University of Seville during the 2017–2018 academic year. As a methodology, a checklist of gamification components is used for the objective evaluation, and an opinion questionnaire on the game experience is used for the subjective evaluation. Among the results, several findings stand out. One is the students' high satisfaction with the knowledge acquired through fun and social interaction. Another is that the role of the university professors and the company monitors turns out to be very active and necessary during the game-learning sessions. Finally, in addition to giving the university occasional access to quality games that accelerate student skills training, the company–university collaboration serves to trial and refine innovative tools for game-based learning.
Agent-based modeling: a systematic assessment of use cases and requirements for enhancing pharmaceutical research and development productivity.
A crisis continues to brew within the pharmaceutical research and development (R&D) enterprise: productivity continues declining as costs rise, despite ongoing, often dramatic scientific and technical advances. To reverse this trend, we offer various suggestions for both the expansion and broader adoption of modeling and simulation (M&S) methods. We suggest strategies and scenarios intended to enable new M&S use cases that directly engage R&D knowledge generation and build actionable mechanistic insight, thereby opening the door to enhanced productivity. What M&S requirements must be satisfied to open that door and begin reversing the productivity decline? Can current methods and tools fulfill the requirements, or are new methods necessary? We draw on the relevant, recent literature to provide and explore answers. In so doing, we identify essential, key roles for agent-based and other methods. We assemble a list of requirements necessary for M&S to meet the diverse needs distilled from a collection of research, review, and opinion articles. We argue that to realize its full potential, M&S should be actualized within a larger information technology framework, a dynamic knowledge repository, wherein models of various types execute, evolve, and increase in accuracy over time. We offer some details of the issues that must be addressed for such a repository to accrue the capabilities needed to reverse the productivity decline.