
    Application of Supercomputer Technologies for Simulation of Socio-Economic Systems

    To date, extensive experience has been accumulated in the investigation of problems related to quality, the assessment of management systems, and the modeling of economic system sustainability. These studies have laid the foundation for a new research area, the Economics of Quality. Its tools make it possible to use simulation to construct mathematical models that adequately reflect the role of quality in the natural, technical, and social regularities governing complex socio-economic systems. The extensive application and development of such models, together with system modeling on supercomputers, will, in our firm belief, raise research on social and economic systems to an essentially new level. Moreover, the current research makes a significant contribution to the simulation of multi-agent social systems and, no less importantly, belongs to the priority areas of science and technology development in our country. This article is devoted to the application of supercomputer technologies in the social sciences, first of all to the technical realization of large-scale agent-focused models (AFM). The essence of this tool is that the growing power of computers has made it possible to describe the behavior of the many separate components of a complex system, such as a social and economic system. The article also reviews the experience of foreign scientists and practitioners in running AFMs on supercomputers, presents an AFM developed at CEMI RAS, and analyzes the stages and methods of efficiently mapping the computational kernel of a multi-agent system onto the architecture of a modern supercomputer. The article concludes with simulation experiments forecasting the population of St. Petersburg under three scenarios, population being one of the major factors influencing the development of a socio-economic system and the quality of life of its inhabitants.
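    To make the scenario-forecasting experiment concrete, here is a minimal sketch of a three-scenario population projection. All rates, the twenty-year horizon, and the starting population are invented placeholders, not parameters of the CEMI RAS model, which operates at the level of individual agents on a supercomputer.

        import random

        # Illustrative three-scenario projection; all rates, the horizon and
        # the starting population are placeholder assumptions.
        SCENARIOS = {
            "low":    {"birth": 0.009, "death": 0.013, "net_migration": -0.001},
            "medium": {"birth": 0.011, "death": 0.012, "net_migration":  0.003},
            "high":   {"birth": 0.013, "death": 0.011, "net_migration":  0.006},
        }

        def project(population, years, rates, seed=0):
            """Compound yearly growth with a small stochastic shock that
            mimics agent-level variability."""
            rng = random.Random(seed)
            for _ in range(years):
                growth = rates["birth"] - rates["death"] + rates["net_migration"]
                population *= 1.0 + growth + rng.gauss(0.0, 0.001)
            return population

        for name, rates in SCENARIOS.items():
            print(f"{name:6s}: {project(5_400_000, 20, rates):12,.0f}")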

    Virtual Worlds as Petri Dishes for the Social and Behavioral Sciences

    The next tool for social science experimentation should allow for macro-level, generalizable, scientific research. In the past, devices such as rat mazes, Petri dishes, and supercolliders were developed when scientists needed new tools to do research. We believe that Virtual Worlds are the modern equivalent of supercolliders for social scientists, and we feel they should be the next area to receive significant attention and funding. The advantages provided by virtual worlds research outweigh the costs. Virtual worlds allow for societal-level research with no harm to humans, permit large numbers of experiments and participants, and make long-term and panel studies possible. Virtual worlds do have some drawbacks, in that they are expensive and time-consuming to build. These obstacles can be overcome, however, by adopting the revenue and maintenance models practiced by the current game industry. The returns from virtual worlds used as scientific tools could reach levels that would self-fund future research for decades to come. At the outset, however, an investment by funding agencies seems to be necessary.
    Keywords: Virtual Worlds, Macro Level Experiments, Research Infrastructure

    Strategic Spatiotemporal Vaccine Distribution Increases the Survival Rate in an Infectious Disease like Covid-19

    Covid-19 has caused hundreds of thousands of deaths and economic damage amounting to trillions of dollars, creating a desire for the rapid development of a vaccine. Once available, a vaccine can only be produced gradually, raising the question of how best to distribute it. While official vaccination guidelines largely focus on the question of who should receive vaccines first (e.g. risk groups), here we propose a strategy for distributing vaccines in time and space which sequentially prioritizes regions with a high local infection growth rate. To demonstrate this strategy, we develop a simple statistical model describing the time evolution of infection patterns and their response to vaccination for infectious diseases like Covid-19. For inhomogeneous infection patterns, locally well-mixed populations, and basic reproduction numbers $R_0 \sim 1.5$-$4$, the proposed strategy at least halves the number of deaths in our simulations compared to the standard practice of distributing vaccines proportionally to population density. For $R_0 \sim 1$ we still find a significant increase in the survival rate. The proposed vaccine distribution strategy can be further tested in detailed modelling works and could stimulate discussion of the importance of the spatiotemporal distribution of vaccines for official guidelines.
    Comment: Supplementary movie temporarily available: https://www.dropbox.com/s/496xd46b6fzlmd4/movie_3.mov?dl=
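    The proposed allocation rule is easy to sketch. The toy model below contrasts doses spread proportionally across regions with doses targeted at the region whose infection count grew fastest in the previous step; the region count, population sizes, dose budget, and epidemic parameters are illustrative assumptions, not the authors' calibration.

        import random

        # Toy multi-region SIR model contrasting two dose-allocation rules:
        # (a) doses spread proportionally across regions, and (b) doses
        # targeted at the region with the fastest recent infection growth.
        R0, GAMMA = 2.5, 0.1          # reproduction number, recovery rate
        BETA = R0 * GAMMA             # transmission rate within a region
        REGIONS, POP, DOSES = 10, 10_000.0, 100.0

        def simulate(targeted, steps=300, seed=1):
            rng = random.Random(seed)
            I = [10.0 * (1.0 + 3.0 * rng.random()) for _ in range(REGIONS)]
            S = [POP - i for i in I]
            prev, total_infected = I[:], sum(I)
            for _ in range(steps):
                if targeted:
                    hot = max(range(REGIONS), key=lambda k: I[k] - prev[k])
                    doses = [DOSES if k == hot else 0.0 for k in range(REGIONS)]
                else:
                    doses = [DOSES / REGIONS] * REGIONS
                prev = I[:]
                for k in range(REGIONS):
                    v = min(doses[k], S[k])          # vaccinate susceptibles
                    new = min(BETA * (S[k] - v) * I[k] / POP, S[k] - v)
                    S[k] -= v + new
                    I[k] += new - GAMMA * I[k]
                    total_infected += new
            return total_infected

        print(f"proportional allocation: {simulate(False):10,.0f} infections")
        print(f"targeted allocation:     {simulate(True):10,.0f} infections")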

    Key challenges in agent-based modelling for geo-spatial simulation

    Agent-based modelling (ABM) is fast becoming the dominant paradigm in social simulation, due primarily to a worldview suggesting that complex systems emerge from the bottom up, are highly decentralised, and are composed of a multitude of heterogeneous objects called agents. These agents act with some purpose, and their interaction, usually through time and space, generates emergent order, often at higher levels than those at which the agents operate. ABM, however, raises as many challenges as it seeks to resolve. It is the purpose of this paper to catalogue these challenges and to illustrate them using three somewhat different agent-based models applied to city systems. The seven challenges we pose involve: the purpose for which the model is built, the extent to which the model is rooted in independent theory, the extent to which the model can be replicated, the ways the model might be verified, calibrated and validated, the way model dynamics are represented in terms of agent interactions, the extent to which the model is operational, and the way the model can be communicated and shared with others. Once catalogued, we illustrate these challenges with a pedestrian model for emergency evacuation in central London, a hypothetical model of residential segregation tuned to London data which elaborates the standard Schelling (1971) model, and an agent-based residential location model built according to spatial interaction principles, calibrated to trip data for Greater London. The ambiguities posed by this new style of modelling are drawn out as conclusions.
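    For readers unfamiliar with the segregation example, here is a minimal sketch of the standard Schelling (1971) model that the paper's London variant elaborates. Grid size, tolerance threshold, vacancy share, and number of rounds are illustrative choices, not the paper's setup.

        import random

        # Minimal standard Schelling (1971) segregation model on a torus.
        SIZE, TOLERANCE, VACANCY, ROUNDS = 30, 0.4, 0.1, 50
        rng = random.Random(0)
        grid = [[None if rng.random() < VACANCY else rng.choice("AB")
                 for _ in range(SIZE)] for _ in range(SIZE)]
        cells = [(x, y) for x in range(SIZE) for y in range(SIZE)]

        def like_fraction(x, y):
            """Share of occupied Moore neighbours with this agent's type."""
            me = grid[x][y]
            occ = [grid[(x + dx) % SIZE][(y + dy) % SIZE]
                   for dx in (-1, 0, 1) for dy in (-1, 0, 1) if dx or dy]
            occ = [n for n in occ if n is not None]
            return sum(n == me for n in occ) / len(occ) if occ else 1.0

        for _ in range(ROUNDS):
            movers = [c for c in cells
                      if grid[c[0]][c[1]] and like_fraction(*c) < TOLERANCE]
            empties = [c for c in cells if grid[c[0]][c[1]] is None]
            rng.shuffle(movers)
            for x, y in movers:            # each unhappy agent relocates
                if not empties:
                    break
                ex, ey = empties.pop(rng.randrange(len(empties)))
                grid[ex][ey], grid[x][y] = grid[x][y], None
                empties.append((x, y))

        occupied = [c for c in cells if grid[c[0]][c[1]]]
        print("mean share of like neighbours:",
              round(sum(like_fraction(*c) for c in occupied) / len(occupied), 3))

    Even with a modest tolerance of 0.4, the final like-neighbour share rises well above the initial random mixing, which is the model's famous point: mild individual preferences generate strong emergent segregation.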

    Half a billion simulations: evolutionary algorithms and distributed computing for calibrating the SimpopLocal geographical model

    Multi-agent geographical models integrate very large numbers of spatial interactions. Validating such models requires a large amount of computation for their simulation and calibration. Here, a new data-processing chain including an automated calibration procedure is tested on a computational grid using evolutionary algorithms. This is applied for the first time to a geographical model designed to simulate the evolution of an early urban settlement system. The method reduces the computing time and provides robust results. Using it, we identify several parameter settings that minimise three objective functions quantifying how closely the model results match a reference pattern. As the values of each parameter in the different settings are very close, this estimation considerably reduces the initially possible domain of variation of the parameters. The model is thus a useful tool for further multiple applications to empirical historical situations.
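    Schematically, such automated calibration reduces to an evolutionary loop over parameter vectors. The sketch below scalarises three placeholder objectives into one fitness value and evolves a small population in a single process; the actual study kept the objectives separate and distributed the runs over a computational grid, and run_model here merely stands in for a SimpopLocal simulation.

        import random

        # Schematic evolutionary calibration loop. `run_model` is a stand-in
        # for a SimpopLocal simulation and its three "objectives" are
        # placeholders, not the study's actual objective functions.
        def run_model(params):
            a, b, c = params
            # pretend these residuals compare simulated output with the
            # reference settlement pattern
            return (abs(a - 1.3), 2.0 * abs(b - 0.7), abs(a * b - c))

        def fitness(params):
            return sum(run_model(params))    # scalarised for simplicity

        rng = random.Random(42)
        population = [[rng.uniform(0.0, 2.0) for _ in range(3)]
                      for _ in range(20)]

        for generation in range(200):
            population.sort(key=fitness)     # best parameter vectors first
            parents = population[:10]
            children = [[g + rng.gauss(0.0, 0.05) for g in rng.choice(parents)]
                        for _ in range(10)]  # mutate randomly chosen parents
            population = parents + children

        best = min(population, key=fitness)
        print("best parameters:", [round(g, 3) for g in best],
              " fitness:", round(fitness(best), 4))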

    Quantitative Economic Modeling vs Methodological Individualism?

    For a long time, the Austrian School of Economics opposed any mathematical formalization in the social and economic sciences, on the grounds that it conflicts with an individualist view of social phenomena. Starting from a criticism of holist modelling, we look for a quantitative, individualist way of modelling. This paper addresses the dilemma of quantitative modelling: we appear forced to choose between holistic representations, which deny individuals, and individualist models, which cannot represent everything. Each of these kinds of model has its own logic of elaboration (statistical, experimental, etc.), but we argue that they are complementary from a scientific point of view.
    Keywords: Methodological Individualism; Austrian Economics; Quantitative Economics; Computational Economics

    Agents in Bioinformatics

    The scope of the Technical Forum Group (TFG) on Agents in Bioinformatics (BIOAGENTS) was to inspire collaboration between the agent and bioinformatics communities, with the aim of creating an opportunity to propose a different (agent-based) approach to the development of computational frameworks, both for data analysis in bioinformatics and for system modelling in computational biology. During the day, the participants examined the future of research on agents in bioinformatics, primarily through 12 invited talks selected to cover the most relevant topics. From the discussions, it became clear that there are many perspectives on the field, ranging from bio-conceptual languages for agent-based simulation, to the definition of bio-ontology-based declarative languages for use by information agents, to the use of Grid agents, each of which requires further exploration. The interactions between participants encouraged the development of applications that create agent-based simulation models of biological systems, starting from a hypothesis and inferring new knowledge (or relations) by mining and analysing the huge amount of public biological data. In this report we summarise and reflect on the presentations and discussions.

    Mergers and Innovation in the Pharmaceutical Market

    The U.S. pharmaceutical industry has experienced two dramatic changes in recent years: stagnation in the growth of new molecular entities approved for marketing, and a wave of mergers linking, inter alia, some of the largest companies. This paper explores possible links between these two phenomena and proposes an alternative approach to merger policy. It points to the high degree of uncertainty encountered in the discovery and development of new pharmaceutical entities and shows how optimal strategies entail the pursuit of parallel research and development paths. Uncertainties afflict both success rates and the financial gains contingent upon success. A new model simulating optimal strategies given prevalent market uncertainties is presented. Parallelism can be sustained both within individual companies' R&D programs and across competing companies. The paper points to data showing little parallelism of programs within companies and argues that inter-company mergers jeopardize desirable parallelism across companies.
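    The value of parallel paths follows from elementary probability: if each independent path succeeds with probability p, at least one of n paths succeeds with probability 1 - (1 - p)^n. The sketch below checks this against a Monte Carlo run; the value of p is invented for illustration and is not taken from the paper.

        import random

        # Why parallel R&D paths matter: with per-path success probability p,
        # P(at least one of n succeeds) = 1 - (1 - p)**n. The p below is an
        # assumed figure for illustration only.
        p = 0.1
        trials = 100_000
        for n in (1, 2, 5, 10):
            analytic = 1 - (1 - p) ** n
            rng = random.Random(n)
            mc = sum(any(rng.random() < p for _ in range(n))
                     for _ in range(trials)) / trials
            print(f"n={n:2d}  analytic={analytic:.3f}  simulated={mc:.3f}")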

    Price Variations in a Stock Market With Many Agents

    Large variations in stock prices happen with sufficient frequency to raise doubts about existing models, which all fail to account for non-Gaussian statistics. We construct simple models of a stock market and argue that the large variations may be due to a crowd effect, in which agents imitate each other's behavior. The variations over different time scales can be related to each other in a systematic way, similar to the Lévy stable distribution proposed by Mandelbrot to describe real market indices. In the simplest, least realistic case, exact results for the statistics of the variations are derived by mapping onto a model of diffusing and annihilating particles, which has been solved by quantum field theory methods. When the agents imitate each other and respond to recent market volatility, different scaling behavior is obtained. In this case the statistics of price variations is consistent with empirical observations. The interplay between "rational" traders, whose behavior is derived from fundamental analysis of the stock, including dividends, and "noise traders", whose behavior is governed solely by studying the market dynamics, is investigated. When the relative number of rational traders is small, "bubbles" often occur, where the market price moves outside the range justified by fundamental market analysis. When the number of rational traders is larger, the market price is generally locked within the price range they define.
    Comment: 39 pages (Latex) + 20 Figures and missing Figure 1 (sorry), submitted to J. Math. Eco
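    The imitation mechanism itself is easy to illustrate. In the generic sketch below (not the paper's particle-mapping model), each trader either copies a randomly chosen other trader or picks a side independently; stronger imitation produces larger collective order imbalances, which is the crowd effect in its simplest form. All parameter values are arbitrary.

        import random

        # Generic imitation ("crowd effect") sketch: each round every trader
        # either copies a randomly chosen other trader or picks a side
        # independently. Parameter values are arbitrary assumptions.
        def imbalance_volatility(herding, n=500, rounds=2000, seed=7):
            """Standard deviation of the buy/sell imbalance over time."""
            rng = random.Random(seed)
            state = [rng.choice((-1, 1)) for _ in range(n)]  # +1 buy, -1 sell
            series = []
            for _ in range(rounds):
                snapshot = state[:]
                state = [snapshot[rng.randrange(n)] if rng.random() < herding
                         else rng.choice((-1, 1)) for _ in range(n)]
                series.append(sum(state) / n)
            mean = sum(series) / rounds
            return (sum((x - mean) ** 2 for x in series) / rounds) ** 0.5

        for herding in (0.0, 0.5, 0.9, 0.99):
            print(f"imitation strength {herding:4.2f}: "
                  f"imbalance std {imbalance_volatility(herding):.4f}")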

    From Social Simulation to Integrative System Design

    As the recent financial crisis showed, today there is a strong need to gain an "ecological perspective" on all relevant interactions in socio-economic-techno-environmental systems. For this, we suggest setting up a network of Centers for Integrative Systems Design, which would be able to run all potentially relevant scenarios, identify causality chains, explore feedback and cascading effects for a number of model variants, and determine the reliability of their implications (given the validity of the underlying models). They would be able to detect possible negative side effects of policy decisions before they occur. The Centers belonging to this network would each focus on a particular field, but they would be part of an attempt to eventually cover all relevant areas of society and the economy and to integrate them within a "Living Earth Simulator". The results of all research activities of such Centers would be turned into informative input for political Decision Arenas. For example, Crisis Observatories (for financial instabilities, shortages of resources, environmental change, conflict, the spreading of diseases, etc.) would be connected with such Decision Arenas for the purpose of visualization, in order to make complex interdependencies understandable to scientists, decision-makers, and the general public.
    Comment: 34 pages, Visioneer White Paper, see http://www.visioneer.ethz.c