
    Racing to hardware-validated simulation

    Processor simulators rely on detailed timing models of the processor pipeline to evaluate performance. The diversity of real-world processor designs mandates building flexible simulators that expose parts of the underlying model to the user as configurable parameters. Consequently, the accuracy of modeling a real processor depends both on the accuracy of the pipeline model itself and on how well the configuration parameters are adjusted to match the modeled processor. Unfortunately, processor vendors publicly disclose only a subset of their design decisions, raising the probability of introducing specification inaccuracies when modeling these processors. Inaccurately tuned model parameters cause the simulated processor to deviate from the actual one; in the worst case, improper parameters may lead to imbalanced pipeline models that compromise the simulation output. Therefore, simulation models should be validated against hardware before being used for performance evaluation. As processors grow in complexity and diversity, validating a simulator model against real hardware becomes increasingly challenging and time-consuming. In this work, we propose a methodology for validating simulation models against real hardware. We create a framework that relies on micro-benchmarks to collect performance statistics on real hardware, and on machine learning-based algorithms to fine-tune the unknown parameters based on the accumulated statistics. We overhaul the Sniper simulator to support the ARM AArch64 instruction-set architecture (ISA), and introduce two new timing models for ARM-based in-order and out-of-order cores. Using our proposed simulator validation framework, we tune the in-order and out-of-order models to match the performance of a real-world implementation of the Cortex-A53 and Cortex-A72 cores with an average error of 7% and 15%, respectively, across a set of SPEC CPU2017 benchmarks.
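    The tuning loop this abstract describes can be sketched abstractly: treat the simulator as a black-box function of its hidden parameters and search for the configuration that minimises the error against measured hardware statistics. The toy `simulate_cpi` model, the parameter names, and the value ranges below are hypothetical stand-ins rather than Sniper's actual interface, and plain random search replaces the paper's machine-learning-based tuning.

```python
import random

# Toy stand-in for a cycle-accurate simulator: maps a parameter
# configuration to a predicted CPI (cycles per instruction). In the
# paper's framework this would be a full Sniper run on a micro-benchmark;
# here it is a hypothetical analytic placeholder.
def simulate_cpi(rob_size, issue_width):
    return 1.0 + 40.0 / rob_size + 1.5 / issue_width

# Pretend measurement collected on the real core (ground truth).
HARDWARE_CPI = simulate_cpi(128, 3)

def tune(trials=2000, seed=0):
    """Random search over the hidden parameters, minimising the
    relative error between simulated and measured CPI."""
    rng = random.Random(seed)
    best, best_err = None, float("inf")
    for _ in range(trials):
        cfg = (rng.choice([32, 64, 128, 192, 256]), rng.randint(1, 4))
        err = abs(simulate_cpi(*cfg) - HARDWARE_CPI) / HARDWARE_CPI
        if err < best_err:
            best, best_err = cfg, err
    return best, best_err

cfg, err = tune()
print(cfg, round(err * 100, 2))  # tuned configuration and its % error
```

    In the real framework, micro-benchmarks are chosen to stress individual pipeline structures so that each unknown parameter is identifiable from the counters it dominates.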

    Validation of Simulation: Patterns in the Social and Natural Sciences

    In most cases, the meaning of computer simulation is strongly connected to the idea of numerical calculation: a computer simulation is a numerical solution of a complex mathematical problem. The problem of validating its results should therefore reduce to judging the underlying computational methods. However, it will be argued that this is not the case. There is consensus in the literature that validation constitutes one of the central epistemological problems of computer simulation methods. Especially in the case of simulations in the social sciences, the answers given by many authors are not satisfactory. The following paper attempts to show how the characteristics of simulation, i.e. the imitation of a dynamic, constitute the problem of validation even in the case of the natural sciences, and what consequences arise. Differences as well as common ground between the social and natural sciences will be discussed.
    Keywords: Generative Mechanism, Imitation, Patterns, Simulation, Validation

    Bibliometrics, Stylized Facts and the Way Ahead: How to Build Good Social Simulation Models of Science?

    This paper discusses how stylized facts derived from bibliometric studies can be used to build social simulation models of science. Based on a list of six stylized facts of science, it illustrates how they can be brought into play to consolidate and direct research. Moreover, it discusses the challenges such a stylized-facts-based approach to modeling science has to solve.
    Keywords: Bibliometrics, Stylized Facts, Methodology, Model Comparison, Validation

    Issues and Challenges in Validating Military Simulation Models

    The purpose of this thesis is to address the challenges of validating simulation models, especially those facing military simulation analysts. Three distinct issues are of concern to the military simulation analyst: (1) What type of validation effort do the academic experts recommend? (2) What does military policy say is necessary for a proper validation effort? (3) What can a simulation practitioner realistically accomplish given time and resource constraints? Four methodologies were chosen to represent the academic perspective on validation, and a model of validation methods is integrated from these four simulation validation references. The validation policies of the Army, Navy, and Air Force are examined and analyzed for the methodologies they apply to simulations. The integrated model is compared to the policies being formed within the DoD to determine the relationship between the academic experts and military policy. Case studies of validation efforts are examined and analyzed for the methodologies used by simulation practitioners, and the integrated model is compared to these case studies to examine the relationship between the academic experts and the actual practitioners. Finally, conclusions and observations are drawn from all of these comparisons.

    “An ethnographic seduction”: how qualitative research and Agent-based models can benefit each other

    We provide a general analytical framework for empirically informed agent-based simulations. This methodology gives present-day agent-based models a sound insight into the behavior of social agents, an insight that statistical data often fall short of providing, at least at the micro level and for hidden and sensitive populations. In the other direction, simulations can provide qualitative researchers in sociology, anthropology and other fields with valuable tools for: (a) testing the consistency, and pushing the boundaries, of specific theoretical frameworks; (b) replicating and generalizing results; and (c) providing a platform for cross-disciplinary validation of results.

    Inverse meta-modelling to estimate soil available water capacity at high spatial resolution across a farm

    Geo-referenced information on crop production that is both spatially- and temporally-dense would be useful for management in precision agriculture (PA). Crop yield monitors provide spatially but not temporally dense information. Crop growth simulation modelling can provide temporal density, but traditionally fails on the spatial issue. The research described was motivated by the challenge of satisfying both the spatial and temporal data needs of PA. The methods presented depart from current crop modelling within PA by introducing meta-modelling in combination with inverse modelling to estimate site-specific soil properties. The soil properties are used to predict spatially- and temporally-dense crop yields. An inverse meta-model was derived from the agricultural production simulator (APSIM) using neural networks to estimate soil available water capacity (AWC) from available yield data. Maps of AWC with a resolution of 10 m were produced across a dryland grain farm in Australia. For certain years and fields, the estimates were useful for yield prediction with APSIM and multiple regression, whereas for others the results were disappointing. The estimates contain 'implicit information' about climate interactions with soil, crop and landscape that needs to be identified. Improvement of the meta-model with more AWC scenarios, more years of yield data, inclusion of additional variables and accounting for uncertainty are discussed. We concluded that it is worthwhile to pursue this approach as an efficient way of extracting soil physical information that exists within crop yield maps to create spatially- and temporally-dense datasets.
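    The forward/inverse pipeline described above can be sketched under simplified assumptions: a saturating yield response stands in for APSIM, and a closed-form linear least-squares fit replaces the paper's neural-network inverse meta-model. All numeric constants below are hypothetical.

```python
# Step 1: a toy crop model (hypothetical stand-in for APSIM) giving
# yield (t/ha) as a saturating function of plant-available water.
def crop_model(awc_mm, rain_mm=300.0):
    water = min(awc_mm + rain_mm, 550.0)
    return 8.0 * water / (water + 250.0)

# Step 2: forward runs over many AWC scenarios form the training set.
scenarios = [(awc, crop_model(awc)) for awc in range(40, 201, 5)]

# Step 3: fit the inverse mapping yield -> AWC (linear least squares
# in closed form, in place of the paper's neural network).
def fit_inverse(data):
    """Least-squares line awc ~ a*yield + b over (awc, yield) pairs."""
    n = len(data)
    sx = sum(y for _, y in data)
    sy = sum(a for a, _ in data)
    sxx = sum(y * y for _, y in data)
    sxy = sum(y * a for a, y in data)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return lambda yld: a * yld + b

inverse = fit_inverse(scenarios)
# Step 4: estimate AWC at a yield-monitor cell reporting 5.2 t/ha.
print(round(inverse(5.2)))
```

    Applied cell by cell to a yield map, such an inverse model yields an AWC map at the monitor's spatial resolution; the paper's point is that multiple seasons and additional covariates are needed before the estimates are reliable.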

    Virtual Communication Stack: Towards Building Integrated Simulator of Mobile Ad Hoc Network-based Infrastructure for Disaster Response Scenarios

    Responses to disastrous events are a challenging problem because of possible damage to communication infrastructure. For instance, after a natural disaster, infrastructure might be entirely destroyed. Different network paradigms have been proposed in the literature to deploy ad hoc networks and cope with the lack of communications. However, all these solutions focus only on the performance of the network itself, without taking into account the specificities and heterogeneity of the components that use it. This stems from the difficulty of integrating models with different levels of abstraction. Consequently, verification and validation of ad hoc protocols cannot guarantee that the different systems will work as expected in operational conditions. The DEVS theory, however, provides mechanisms that allow the integration of models of different natures. This paper proposes an integrated simulation architecture based on DEVS which improves the accuracy of ad hoc infrastructure simulators in the case of disaster response scenarios.
    Comment: Preprint. Unpublished.
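    The kind of integration DEVS enables can be sketched with two toy atomic models and a hand-rolled root coordinator. Each atomic model exposes the standard DEVS interface (time advance, output, internal and external transitions); the generator/counter pair below is a hypothetical stand-in for heterogeneous network components, not the paper's architecture.

```python
INF = float("inf")

class Generator:
    """Atomic DEVS model: emits a packet every `period` time units."""
    def __init__(self, period):
        self.period = period
        self.sigma = period          # time advance: time to next event
    def output(self):                # output function (lambda)
        return "packet"
    def delta_int(self):             # internal transition: reschedule
        self.sigma = self.period

class Counter:
    """Atomic DEVS model: passive, reacts only to external events."""
    def __init__(self):
        self.count = 0
        self.sigma = INF             # never schedules itself
    def delta_ext(self, elapsed, x): # external transition
        self.count += 1

def simulate(until=10.0):
    gen, cnt = Generator(period=2.0), Counter()
    t = 0.0
    # Root coordinator: advance to the next internal event and route
    # the generator's output to the counter (the only coupling).
    while t + gen.sigma <= until:
        t += gen.sigma
        y = gen.output()
        gen.delta_int()
        cnt.delta_ext(0.0, y)
    return cnt.count

print(simulate())  # prints 5
```

    The value of the formalism is that a detailed radio model and a coarse behavioural model of a responder both reduce to this same interface, so a coordinator can couple them without knowing their internal abstraction levels.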

    Back to the Future: Economic Self-Organisation and Maximum Entropy Prediction

    This paper shows that signal restoration methodology is appropriate for predicting the equilibrium state of certain economic systems. A formal justification is provided by proving the existence of finite improvement paths in object allocation problems under weak assumptions on preferences, linking any initial condition to a Nash equilibrium. Because a finite improvement path is made up of a sequence of systematic best responses, backwards movement from the equilibrium to the initial condition can be treated like the realisation of a noise process. This underpins the use of signal restoration to predict the equilibrium from the initial condition, and an illustration is provided through an application of maximum-entropy signal restoration to the Schelling model of segregation.
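    The finite-improvement-path argument can be illustrated with a toy object allocation game: agents repeatedly move to a strictly preferred vacant object, and because each move lowers the sum of preference ranks (a potential function), the process must terminate in a Nash equilibrium. The preferences and initial allocation below are hypothetical; this illustrates the concept rather than the paper's construction.

```python
def improvement_path(prefs, assignment, objects):
    """prefs[i] is agent i's strict ranking (best first); assignment[i]
    is agent i's current object. Agents best-respond in turn, moving to
    their most preferred vacant object, until no one can improve.
    Returns the equilibrium allocation and the path length."""
    steps = 0
    improved = True
    while improved:
        improved = False
        taken = set(assignment)
        for i, ranking in enumerate(prefs):
            current_rank = ranking.index(assignment[i])
            # Scan strictly better objects in preference order.
            for better in ranking[:current_rank]:
                if better not in taken:      # vacant and preferred
                    taken.discard(assignment[i])
                    assignment[i] = better
                    taken.add(better)
                    steps += 1
                    improved = True
                    break
    return assignment, steps

# Three agents, four objects, so one object is always vacant.
prefs = [["a", "b", "c", "d"], ["a", "c", "b", "d"], ["b", "a", "d", "c"]]
alloc, n = improvement_path(prefs, ["d", "b", "c"], ["a", "b", "c", "d"])
print(alloc, n)
```

    Reversing the recorded sequence of best responses is exactly the "equilibrium back to initial condition" movement the abstract treats as a noise process to be removed by signal restoration.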
