
    Unsupervised Continual Learning From Synthetic Data Generated with Agent-Based Modeling and Simulation: A preliminary experimentation

    Continual Learning enables a model to learn a variable number of tasks sequentially without forgetting knowledge obtained from the past. Catastrophic forgetting usually occurs in neural networks because of their inability to learn different tasks in sequence: performance on previous tasks drops significantly. One way to mitigate this problem is to provide a subset of the previous examples to the model while it learns a new task. In this paper we evaluate the continual learning performance of an unsupervised model for anomaly detection by generating synthetic data using an agent-based modeling and simulation (ABMS) technique. We simulated the movement of different types of individuals in a building and evaluated their trajectories depending on their role. We collected training and test sets based on these trajectories, including in the test set negative examples that contain wrong trajectories. We applied replay-based continual learning to teach the model how to distinguish anomalous trajectories depending on the users' roles. The results show that, using ABMS synthetic data, a small percentage of synthetic data replay is enough to mitigate catastrophic forgetting and to achieve satisfactory accuracy on the final binary classification (anomalous / non-anomalous).
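
    As a rough illustration of the replay strategy the abstract describes, the sketch below mixes a small fraction of stored past examples into each new task's training set. All names (train_with_replay, the fit() interface, the 5% replay fraction) are hypothetical placeholders, not the authors' actual implementation.

        # Minimal sketch of replay-based continual learning, assuming a model
        # exposing a fit() method; names and replay fraction are illustrative.
        import random

        def train_with_replay(model, tasks, replay_fraction=0.05):
            """Train on tasks sequentially, replaying a small share of past data."""
            replay_buffer = []                       # examples seen in earlier tasks
            for task_data in tasks:                  # task_data: list of examples
                n_replay = int(replay_fraction * len(task_data))
                replayed = random.sample(replay_buffer,
                                         min(n_replay, len(replay_buffer)))
                model.fit(task_data + replayed)      # old examples counter forgetting
                replay_buffer.extend(task_data)      # store for future replay
            return model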

    High-Performance Computing and ABMS for High-Resolution COVID-19 Spreading Simulation

    This paper presents an approach to modeling and simulating the spreading of COVID-19 based on agent-based modeling and simulation (ABMS). Our goal is not only to support large-scale simulations but also to increase the simulation resolution. Moreover, we do not assume an underlying network of contacts: the person-to-person contacts responsible for the spreading are modeled as a function of the geographical distance between individuals. In particular, we defined a commuting mechanism combining radiation-based and gravity-based models, and we exploited the commuting properties at different resolution levels (municipalities and provinces). Finally, we exploited high-performance computing (HPC) facilities to simulate millions of concurrent agents, each mapping an individual's behavior. To run such simulations, we developed a spreading simulator and validated it by simulating the spreading in two of the most populated Italian regions: Lombardy and Emilia-Romagna. Our main achievement is the effective modeling of 10 million concurrent agents, each mapping an individual's behavior with high resolution in terms of social contacts, mobility, and contribution to the virus spreading. Moreover, we analyzed the forecasting ability of our framework to predict the number of infections when initialized with only a few days of real data. We validated our model against the statistical data from the serological analysis conducted in Lombardy; our model makes a smaller error than other state-of-the-art models, with a final root mean squared error (RMSE) of 56,009 when simulating the entire first pandemic wave in spring 2020. For the Emilia-Romagna region, we simulated the second pandemic wave during autumn 2020 and reached a final RMSE of 10,730.11.
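
    For context, the sketch below shows the textbook radiation and gravity commuting formulas the abstract refers to; how the paper actually combines them across resolution levels is not reproduced here, and all parameter names are illustrative.

        # Standard radiation and gravity commuting models (textbook forms).
        def radiation_flow(T_i, m_i, n_j, s_ij):
            """Expected commuters from i to j under the radiation model.
            T_i: total commuters leaving i; m_i, n_j: populations of i and j;
            s_ij: population within a circle of radius d(i, j) centred on i,
            excluding the populations of i and j."""
            return T_i * (m_i * n_j) / ((m_i + s_ij) * (m_i + n_j + s_ij))

        def gravity_flow(k, m_i, n_j, d_ij, beta=2.0):
            """Commuting flow from i to j under a power-law gravity model,
            proportional to both populations and decaying with distance."""
            return k * m_i * n_j / d_ij ** beta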

    Generating Urban Forms from Ontologies

    The paper presents ongoing research work on a software system for supporting the exploration of the numerous and often interrelated factors that can affect urban design. The present implementation of the system supports the simulation of different urban scenarios in relation to the uniqueness and constraints peculiar to a design and a site. The paper describes our ongoing work to formally represent, by means of ontologies, the implicit and explicit knowledge used. The ontology-based semantic system administers a set of rules and relations among urban entities. To this aim, we are dealing with several issues: the factors involved in the urban design process cross various knowledge domains, drawing on different competencies and sources, and the knowledge is both semantic and procedural.
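
    As a hypothetical illustration of encoding such rules and relations among urban entities, the snippet below builds a tiny ontology with rdflib and checks one relation with a SPARQL query; the namespace, entity names, and rule are invented for illustration, not the authors' actual ontology.

        # Toy urban ontology in rdflib; all names are illustrative.
        from rdflib import Graph, Literal, Namespace, RDF

        URB = Namespace("http://example.org/urban#")
        g = Graph()
        g.add((URB.Block12, RDF.type, URB.ResidentialBlock))
        g.add((URB.Block12, URB.adjacentTo, URB.Park3))
        g.add((URB.Block12, URB.maxHeight, Literal(24)))  # site constraint in metres

        # A design rule expressed as a query: residential blocks adjacent to something.
        rule = """
        SELECT ?b WHERE {
            ?b a urb:ResidentialBlock ;
               urb:adjacentTo ?p .
        }
        """
        for row in g.query(rule, initNs={"urb": URB}):
            print(row.b)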

    Fine-Grained Agent-Based Modeling to Predict Covid-19 Spreading and Effect of Policies in Large-Scale Scenarios

    Modeling and forecasting the spread of COVID-19 remains an open problem for several reasons. One of these concerns the difficulty of modeling a complex system at a high-resolution (fine-grained) level, at which the spread can be simulated by taking into account individual features such as the social structure, the effects of government policies, age sensitivity to COVID-19, mask-wearing habits, and the geographical distribution of susceptible people. Agent-based modeling usually needs to find an optimal trade-off between the resolution of the simulation and the population size. Indeed, modeling single individuals usually leads to simulations of smaller populations or the use of meta-populations. In this article, we propose a solution to efficiently model the COVID-19 spread in Lombardy, the most populated Italian region, with about ten million people. In particular, the model described in this paper is, to the best of our knowledge, the first attempt in the literature to model such a large population at the single-individual level. To achieve this goal, we propose a framework that implements: i. a scale-free model of the social contacts combining a sociability rate, demographic information, and geographical assumptions; ii. a multi-agent system relying on the actor model and high-performance computing technology to efficiently implement ten million concurrent agents. We simulated the epidemic scenario from January to April 2020 and from August to December 2020, modeling the government's lockdown policies and people's mask-wearing habits. The social modeling approach we propose could be rapidly adapted to model future epidemics at their early stage, in scenarios where little prior knowledge is available.
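
    The sketch below gives one plausible way to draw heavy-tailed (scale-free-like) daily contact counts weighted by a per-agent sociability rate; it is an assumption-laden illustration, not the paper's actual contact model.

        # Heavy-tailed contact counts scaled by sociability (illustrative only).
        import numpy as np

        rng = np.random.default_rng(42)

        def contact_counts(sociability, alpha=2.5, cap=100):
            """Per-agent daily contacts with a power-law base distribution
            (density ~ x^-alpha), scaled by a sociability rate in (0, 1]."""
            base = rng.pareto(alpha - 1, size=sociability.shape) + 1
            return np.minimum((base * sociability).astype(int), cap)

        sociability = rng.uniform(0.2, 1.0, size=10_000)
        counts = contact_counts(sociability)   # one daily draw for 10,000 agents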

    Space-time susceptibility modeling of hydro-morphological processes at the Chinese national scale

    Hydro-morphological processes (HMPs; any process in the spectrum between debris flows and flash floods) threaten human lives and infrastructure, and their effects are only expected to worsen under the influence of climate change. Limiting the potential damage of HMPs by taking preventive or remedial actions requires a probabilistic expectation of where and how frequently these processes may occur. The information on where and how frequently a given earth surface process may manifest can be expressed via susceptibility modeling. For the whole Chinese territory, a susceptibility model for HMPs is currently not available. To address this issue, we propose a yearly space-time model built on a binomial Generalized Linear Model. The target variable of this model is the annual presence/absence of HMPs per catchment across China, from 1985 to 2015. This information has been accessed via the Chinese catalogue of HMPs, a data repository the Chinese Government activated in 1950 and which is still in use. This binary spatio-temporal information is regressed against a set of time-invariant (catchment shape indices and geomorphic attributes) and time-variant (urban coverage, rainfall, vegetation density, and land use) covariates. Furthermore, we include a regression constant for each of the 31 years under consideration, as well as three-year aggregated information on previously occurred (and non-occurred) HMPs. We consider two versions of our modeling approach: an explanatory benchmark in which we fit the whole space-time HMP dataset, including a separate intercept per year, and a predictive extension of this model evaluated through four temporal cross-validation schemes. As a result, we portrayed the annual susceptibility estimates in 30 maps, which show that the south-east of China exhibits the largest variation in the spatio-temporal probability of HMP occurrence. We also compressed the whole spatio-temporal prediction into three summary maps, reporting the mean, maximum, and 95% confidence interval of the spatio-temporal susceptibility distribution per catchment, per year. The information we present has a dual value. On the one hand, we provide a platform for interpreting the environmental effects controlling the occurrence of HMPs over a very large spatial (the whole of China) and temporal (31 years of records) domain. On the other hand, we provide information on which catchments are more prone to experience an HMP-driven hazard. Hence, a step further would be to select the most susceptible catchments for detailed analyses in which physically-based models could be tested to estimate the potentially impacted areas. For transparency, the results generated in this work are shared in the supplementary material as GIS (geopackage) files.
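
    A minimal sketch of the binomial GLM setup described above, using statsmodels; the input file and column names are hypothetical, and the real model includes more covariates plus the three-year HMP history term.

        # Binomial GLM with one intercept per year (hypothetical data layout:
        # one row per catchment-year with a 0/1 HMP occurrence column).
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        df = pd.read_csv("hmp_catchments.csv")        # hypothetical file
        model = smf.glm(
            "hmp_occurred ~ rainfall + vegetation_density + urban_cover"
            " + land_use + C(year)",                  # C(year): per-year intercepts
            data=df,
            family=sm.families.Binomial(),
        ).fit()
        print(model.summary())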

    INDCOR White Paper 2: Interactive Narrative Design for Representing Complexity

    This white paper was written by the members of the Work Group focusing on design practices of the COST Action 18230 - Interactive Narrative Design for Complexity Representation (INDCOR, WG1). It presents an overview of Interactive Digital Narrative (IDN) design for complexity representation through IDN workflows and methodologies, IDN authoring tools, and applications. It provides definitions of the central elements of IDN alongside its best practices, designs, and methods. Finally, it describes complexity as a feature of IDN, with related examples. In summary, this white paper serves as an orienting map for the field of IDN design, understanding where we are in the contemporary panorama while charting the grounds of its promising future.

    Jet energy measurement with the ATLAS detector in proton-proton collisions at √s = 7 TeV

    The jet energy scale and its systematic uncertainty are determined for jets measured with the ATLAS detector at the LHC in proton-proton collision data at a centre-of-mass energy of √s = 7 TeV, corresponding to an integrated luminosity of 38 pb⁻¹. Jets are reconstructed with the anti-kt algorithm with distance parameters R = 0.4 or R = 0.6. Jet energy and angle corrections are determined from Monte Carlo simulations to calibrate jets with transverse momenta pT ≥ 20 GeV and pseudorapidities |η| < 4.5. The jet energy systematic uncertainty is estimated using the single isolated hadron response measured in situ and in test-beams, exploiting the transverse momentum balance between central and forward jets in events with dijet topologies, and studying systematic variations in Monte Carlo simulations. The jet energy uncertainty is less than 2.5% in the central calorimeter region (|η| < 0.8) for jets with 60 ≤ pT < 800 GeV, and is at most 14% for pT < 30 GeV in the most forward region 3.2 ≤ |η| < 4.5. The jet energy is validated for jet transverse momenta up to 1 TeV to the level of a few percent using several in situ techniques, comparing against a well-known reference such as the recoiling photon pT, the sum of the transverse momenta of tracks associated to the jet, or a system of low-pT jets recoiling against a high-pT jet. More sophisticated jet calibration schemes are presented based on calorimeter cell energy density weighting or hadronic properties of jets, aiming for an improved jet energy resolution and a reduced flavour dependence of the jet response. The systematic uncertainty of the jet energy determined from a combination of in situ techniques is consistent with the one derived from single hadron response measurements over a wide kinematic range. The nominal corrections and uncertainties are derived for isolated jets in an inclusive sample of high-pT jets. Special cases such as event topologies with close-by jets, or selections of samples with an enhanced content of jets originating from light quarks, heavy quarks or gluons, are also discussed and the corresponding uncertainties determined. © 2013 CERN for the benefit of the ATLAS collaboration
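
    For reference, the anti-kt algorithm mentioned above iteratively merges the pair of objects with the smallest of the distances

        d_{ij} = \min\!\left(k_{t,i}^{-2},\, k_{t,j}^{-2}\right) \frac{\Delta R_{ij}^2}{R^2},
        \qquad d_{iB} = k_{t,i}^{-2},
        \qquad \Delta R_{ij}^2 = (y_i - y_j)^2 + (\phi_i - \phi_j)^2,

    where k_t is the transverse momentum, y the rapidity, φ the azimuth, and R the distance parameter (0.4 or 0.6 here); an object is declared a jet when its beam distance d_iB is the smallest.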