4,737 research outputs found

    A Markov chain approach to ABM calibration

    Agent-based models are nowadays widely used; however, the lack of general methods and rules for their calibration still prevents their potential from being fully exploited. Such models can rarely be studied analytically; more often they are studied by simulation. Reference [1] shows that many computer simulation models, like ABMs, can be represented as Markov chains. Exploiting this idea, we illustrate an example of how to calibrate an ABM when it can be recast as a Markov chain.
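    The abstract does not spell out the calibration procedure, but the idea can be sketched. Below is a minimal, illustrative example (the toy model, all names, and the grid-search step are my own, not the paper's): a two-state agent population whose aggregate "active count" is exactly a Markov chain, calibrated by maximum likelihood over the chain's transition matrix.

```python
import math
import numpy as np

def binom_pmf(n, k, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def aggregate_transition_matrix(n_agents, p_up, p_down):
    """Exact transition matrix of the aggregate chain whose state is the
    number of 'active' agents: each inactive agent activates with
    probability p_up and each active one deactivates with probability
    p_down, independently (a toy ABM whose macro state is Markov)."""
    P = np.zeros((n_agents + 1, n_agents + 1))
    for k in range(n_agents + 1):              # current active count
        for a in range(n_agents - k + 1):      # activations this step
            for d in range(k + 1):             # deactivations this step
                P[k, k + a - d] += (binom_pmf(n_agents - k, a, p_up)
                                    * binom_pmf(k, d, p_down))
    return P

def log_likelihood(path, P):
    return sum(math.log(P[i, j]) for i, j in zip(path[:-1], path[1:]))

# Simulate a trajectory from a "true" model, then recover p_up by
# maximum likelihood over a parameter grid (p_down assumed known).
rng = np.random.default_rng(0)
N, true_up, known_down = 5, 0.6, 0.3
P_true = aggregate_transition_matrix(N, true_up, known_down)
path = [0]
for _ in range(400):
    path.append(int(rng.choice(N + 1, p=P_true[path[-1]])))

grid = np.linspace(0.05, 0.95, 19)
est = max(grid, key=lambda p: log_likelihood(
    path, aggregate_transition_matrix(N, p, known_down)))
```

    Once the ABM is recast this way, standard Markov-chain likelihood tools apply directly, which is the point of the approach described in the abstract.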

    Assessing historical reliability of the agent-based model of the global energy system

    This study examines the historical reliability of an agent-based model of the global energy system. We present a mathematical framework for agent-based model calibration and sensitivity analysis based on historical observations. Simulation consistency with the historical record is measured as a distance between two vectors of data points, and inference on parameter values is drawn from the probability distribution of this stochastic estimate. The proposed methodology is applied to a model of the global energy system, and some model properties and limitations that follow from the calibration results are discussed.
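    As a hedged illustration of the distance-based calibration step described above (the toy model, the tolerance, and all names are invented for the sketch; the paper's actual framework is richer), an ABC-style rejection reading might look like:

```python
import numpy as np

def simulate_output(theta, n_steps, rng):
    """Hypothetical stand-in for the energy-system model: a noisy
    growth process whose growth rate theta is being calibrated."""
    x, out = 1.0, []
    for _ in range(n_steps):
        x *= 1.0 + theta + rng.normal(0.0, 0.005)
        out.append(x)
    return np.array(out)

def distance(sim, hist):
    """Consistency with the historical record, measured as the
    Euclidean distance between two vectors of data points."""
    return float(np.linalg.norm(sim - hist))

rng = np.random.default_rng(1)
history = simulate_output(0.03, 30, rng)   # stands in for the observed record

# One stochastic distance estimate per candidate parameter value; the
# tolerance below (0.5) is an arbitrary choice for this sketch.
grid = np.linspace(0.0, 0.1, 101)
dists = np.array([distance(simulate_output(th, 30, rng), history)
                  for th in grid])
best = float(grid[dists.argmin()])         # point estimate
accepted = grid[dists < 0.5]               # ABC-style rejection set
```

    Because the distance is itself a random quantity, repeating the simulation per parameter value yields the probability distribution of the estimate on which the paper bases its inference.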

    Estimating Nonlinear Dynamic Equilibrium Economies: A Likelihood Approach

    This paper presents a framework to undertake likelihood-based inference in nonlinear dynamic equilibrium economies. We develop a Sequential Monte Carlo algorithm that delivers an estimate of the likelihood function of the model using simulation methods. This likelihood can be used for parameter estimation and for model comparison. The algorithm can deal both with nonlinearities of the economy and with the presence of non-normal shocks. We show consistency of the estimate and its good performance in finite simulations. This new algorithm is important because the existing empirical literature following a likelihood approach has been limited to the estimation of linear models with Gaussian innovations. We apply our procedure to estimate the structural parameters of the neoclassical growth model.
    Keywords: Likelihood-Based Inference, Dynamic Equilibrium Economies, Nonlinear Filtering, Sequential Monte Carlo
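    A minimal sketch of the simulated likelihood estimate the abstract describes, using a bootstrap particle filter on a toy linear-Gaussian state space model (the model, parameter values, and names are illustrative, not the paper's economy; the SMC mechanics are the point):

```python
import numpy as np

def particle_filter_loglik(y, phi, sig_w, sig_v, n_particles, rng):
    """Bootstrap particle filter estimate of the log-likelihood for the
    state space model x_t = phi*x_{t-1} + w_t,  y_t = x_t + v_t."""
    x = rng.normal(0.0, 1.0, n_particles)          # initial particle cloud
    loglik = 0.0
    for obs in y:
        x = phi * x + rng.normal(0.0, sig_w, n_particles)   # propagate
        logw = (-0.5 * ((obs - x) / sig_v) ** 2
                - np.log(sig_v * np.sqrt(2.0 * np.pi)))     # obs. density
        m = logw.max()
        w = np.exp(logw - m)
        loglik += m + np.log(w.mean())             # incremental likelihood
        x = rng.choice(x, n_particles, p=w / w.sum())        # resample
    return loglik

# Simulate data from a "true" model, then score two parameter values:
# the estimated likelihood should favour the true one.
rng = np.random.default_rng(2)
phi_true, sw, sv = 0.8, 0.5, 0.5
state, y = 0.0, []
for _ in range(200):
    state = phi_true * state + rng.normal(0.0, sw)
    y.append(state + rng.normal(0.0, sv))

ll_true = particle_filter_loglik(y, 0.8, sw, sv, 2000, rng)
ll_bad = particle_filter_loglik(y, 0.2, sw, sv, 2000, rng)
```

    Nothing in the filter loop relies on linearity or Gaussian shocks: the propagation step and observation density can be swapped for nonlinear, non-normal ones, which is what makes the approach applicable where the linear-Gaussian literature stops.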

    Data and Design: Advancing Theory for Complex Adaptive Systems

    Complex adaptive systems exhibit certain types of behaviour that are difficult to predict or understand using reductionist approaches, such as linearization or assuming conditions of optimality. This research focuses on the complex adaptive systems associated with public health. These are noted for being driven by many latent forces, shaped centrally by human behaviour. Dynamic simulation techniques, including agent-based models (ABMs) and system dynamics (SD) models, have been used to study the behaviour of complex adaptive systems, including in public health. While much has been learned, such work is still hampered by important limitations. Models of complex systems themselves can be quite complex, increasing the difficulty of explaining unexpected model behaviour, whether that behaviour comes from model code errors or is due to new learning. Model complexity also leads to model designs that are hard to adapt to growing knowledge about the subject area, further reducing model-generated insights. In the current literature of dynamic simulations of human public health behaviour, few models focus on capturing explicit psychological theories of human behaviour. Given that human behaviour, especially health and risk behaviour, is so central to understanding processes in public health, this work explores several methods to improve the utility and flexibility of dynamic models in public health. This work is undertaken in three projects. The first uses a machine learning algorithm, the particle filter, to augment a simple ABM in the presence of continuous disease prevalence data from the modelled system. It is shown that, while using the particle filter improves the accuracy of the ABM, when compared with previous work using SD with a particle filter, the ABM has some limitations, which are discussed.
The second presents a model design pattern that focuses on scalability and modularity to improve the development time, testability, and flexibility of a dynamic simulation for tobacco smoking. This method also supports a general pattern of constructing hybrid models --- those that contain elements of multiple methods, such as agent-based or system dynamics. This method is demonstrated with a stylized example of tobacco smoking in a human population. The final line of work implements this modular design pattern, with differing mechanisms of addiction dynamics, within a rich behavioural model of tobacco purchasing and consumption. It integrates the results from a discrete choice experiment, which is a widely used economic method for studying human preferences. It compares and contrasts four independent addiction modules under different population assumptions. A number of important insights are discussed: no single module was universally more accurate across all human subpopulations, demonstrating the benefit of exploring a diversity of approaches; increasing the number of parameters does not necessarily improve a module's predictions, since the overall least accurate module had the second highest number of parameters; and slight changes in module structure can lead to drastic improvements, implying the need to be able to iteratively learn from model behaviour.
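    The modular design pattern itself is not given in the abstract; as a rough illustration of the idea of swappable addiction modules behind a fixed interface (all class, method, and field names here are invented), a sketch might look like:

```python
from abc import ABC, abstractmethod
import random

class AddictionModule(ABC):
    """Swappable-component contract: the population model sees only this
    interface, so addiction mechanisms can be exchanged without touching
    the rest of the simulation (all names are illustrative)."""
    @abstractmethod
    def craving(self, agent_state: dict) -> float: ...
    @abstractmethod
    def update(self, agent_state: dict, consumed: bool) -> None: ...

class HabitModule(AddictionModule):
    """One candidate mechanism: craving tracks an accumulated habit."""
    def craving(self, s):
        return min(1.0, s.get("habit", 0.0))
    def update(self, s, consumed):
        s["habit"] = max(0.0, s.get("habit", 0.0)
                         + (0.1 if consumed else -0.05))

def run(module, n_agents=100, n_steps=50, seed=3):
    """Population loop is module-agnostic: plugging in a different
    AddictionModule changes the dynamics but not this code."""
    rng = random.Random(seed)
    agents = [{} for _ in range(n_agents)]
    for _ in range(n_steps):
        for s in agents:
            consumed = rng.random() < 0.1 + 0.5 * module.craving(s)
            module.update(s, consumed)
    return sum(s.get("habit", 0.0) for s in agents) / n_agents

mean_habit = run(HabitModule())
```

    Because the population loop depends only on the interface, comparing four competing addiction mechanisms (as the thesis does) reduces to running the same experiment with four module implementations.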

    A Practical, Accurate, Information Criterion for Nth Order Markov Processes

    The recent increase in the breadth of computational methodologies has been matched with a corresponding increase in the difficulty of comparing the relative explanatory power of models from different methodological lineages. In order to help address this problem, a Markovian information criterion (MIC) is developed that is analogous to the Akaike information criterion (AIC) in its theoretical derivation and yet can be applied to any model able to generate simulated or predicted data, regardless of its methodology. Both the AIC and the proposed MIC rely on the Kullback–Leibler (KL) distance between model predictions and real data as a measure of prediction accuracy. Instead of using the maximum likelihood approach like the AIC, the proposed MIC relies on the literal interpretation of the KL distance as the inefficiency of compressing real data using modelled probabilities, and therefore uses the output of a universal compression algorithm to obtain an estimate of the KL distance. Several Monte Carlo tests are carried out in order to (a) confirm the performance of the algorithm and (b) evaluate the ability of the MIC to identify the true data-generating process from a set of alternative models.
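    The code-length reading of the KL distance can be illustrated directly (the binary example and all names are mine; the paper estimates the code length with a universal compression algorithm rather than with explicit model probabilities):

```python
import math
import random

def markov1_sample(n, p_stay, seed):
    """Binary first-order Markov chain: repeat the previous symbol with
    probability p_stay (invented data for the illustration)."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1)]
    for _ in range(n - 1):
        x.append(x[-1] if rng.random() < p_stay else 1 - x[-1])
    return x

def code_length_bits(data, predict):
    """Total code length (in bits) of the data under a model's one-step
    predictive probabilities: the compression reading of the KL
    distance, where a shorter code means a better model."""
    bits, hist = 0.0, []
    for sym in data:
        p1 = predict(hist)                 # model's P(next symbol = 1)
        p = p1 if sym == 1 else 1.0 - p1
        bits += -math.log2(max(p, 1e-12))
        hist.append(sym)
    return bits

def iid_predict(hist):                     # 0th-order model: fair coin
    return 0.5

def markov_predict(hist):                  # 1st-order model: knows p_stay
    if not hist:
        return 0.5
    return 0.8 if hist[-1] == 1 else 0.2

data = markov1_sample(2000, 0.8, seed=4)
iid_bits = code_length_bits(data, iid_predict)
mkv_bits = code_length_bits(data, markov_predict)   # compresses better
```

    The per-symbol gap between the two code lengths estimates the KL distance between the candidate model and the data-generating process, which is exactly the quantity the MIC penalizes.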