
    AGENT-BASED DISCRETE EVENT SIMULATION MODELING AND EVOLUTIONARY REAL-TIME DECISION MAKING FOR LARGE-SCALE SYSTEMS

    Computer simulations are routines programmed to imitate detailed system operations. They are used to evaluate system performance and/or predict future behaviour under given settings. In complex cases where system operations cannot be formulated explicitly by analytical models, simulation becomes the dominant mode of analysis, as it can model systems without relying on unrealistic or limiting assumptions and so represent actual systems more faithfully. Two main streams exist in current simulation research and practice: discrete event simulation and agent-based simulation. This dissertation facilitates the marriage of the two: by integrating agent-based modeling concepts into the discrete event simulation framework, we can combine the advantages of both methods while avoiding their disadvantages. Although simulation can represent complex systems realistically, it is a descriptive tool without the capability of making decisions. It can, however, be complemented by incorporating optimization routines. The most challenging problem is that large-scale simulation models normally take a considerable amount of computer time to execute, so the number of solution evaluations needed by most optimization algorithms is not feasible within a reasonable time frame. This research develops a highly efficient evolutionary simulation-based decision making procedure which can be applied in real-time management situations. It divides the entire process time horizon into a series of small time intervals and runs simulation optimization algorithms for those small intervals separately and iteratively. This method improves computational tractability by decomposing long simulation runs; it also captures system dynamics by incorporating changing information/data as the event unfolds.
    With respect to simulation optimization, the procedure solves efficient analytical models which approximate the simulation and guide the search quickly toward near optimality. The methods of agent-based discrete event simulation modeling and evolutionary simulation-based decision making developed in this dissertation are applied to a set of disaster response planning problems. This research also investigates a unique approach to validating low-probability, high-impact simulation systems based on a concrete example problem. The experimental results demonstrate the feasibility and effectiveness of our model compared to other existing systems.
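    The interval-by-interval decomposition described above can be sketched as a rolling-horizon loop. The toy `simulate` function and the candidate policies below are illustrative stand-ins for the dissertation's large-scale simulation and its analytical approximation models; only the decomposition pattern itself is taken from the text.

```python
import random

def simulate(policy, state, horizon):
    """Toy stochastic simulation: cost of serving random demand over a horizon.
    A stand-in for an expensive large-scale simulation run."""
    rng = random.Random(42)  # common random numbers across candidate evaluations
    cost = 0.0
    for _ in range(horizon):
        demand = state + rng.uniform(0, 10)
        cost += abs(policy - demand)   # over- and under-capacity both penalised
        state = 0.5 * state + 0.5 * demand
    return cost, state

def rolling_horizon_optimize(total_horizon, interval, candidates):
    """Decompose the horizon into short intervals; optimise each interval
    separately and iteratively, carrying the simulated end state forward
    as the next interval's starting state."""
    state, plan, total_cost = 0.0, [], 0.0
    for _ in range(total_horizon // interval):
        # evaluate each candidate policy on this short interval only
        best = min(candidates, key=lambda p: simulate(p, state, interval)[0])
        cost, state = simulate(best, state, interval)
        plan.append(best)
        total_cost += cost
    return plan, total_cost
```

    Because each optimization step only ever simulates one short interval, the per-decision cost stays bounded even as the overall horizon grows, and new data can be folded into `state` between intervals.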

    A generic method to develop simulation models for ambulance systems

    In this paper, we address the question of generic simulation models and their role in improving emergency care around the world. After reviewing the development of ambulance models and the contexts in which they have been applied, we report the construction of a reusable model for ambulance systems. Further, we describe the associated parameters, data sources, and performance measures, report on the collection of information, and describe the use of optimisation to configure the service to best effect. Having developed the model, we validate it using real data from the emergency medical system of Belo Horizonte, Brazil. To illustrate the benefits of standardisation and reusability, we apply the model to a UK context by exploring how different rules of engagement would change the performance of the system. Finally, we consider the impact that one might observe if such rules were adopted by the Brazilian system.
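    The core dynamic of any ambulance system model (calls queueing for the first free vehicle, measured against a response-time target) can be sketched as a minimal discrete event simulation. The single pooled fleet, fixed service time, and parameter values below are simplifying assumptions for illustration, not the paper's full generic model.

```python
import heapq

def simulate_ambulances(n_ambulances, calls, service_time, target):
    """Minimal discrete event model of an ambulance system.
    `calls` is a list of call arrival times; returns the fraction of
    calls reached within the `target` response time."""
    free_at = [0.0] * n_ambulances  # time each ambulance next becomes free
    heapq.heapify(free_at)
    within_target = 0
    for t in sorted(calls):
        available = heapq.heappop(free_at)
        start = max(t, available)        # wait if all ambulances are busy
        if start - t <= target:
            within_target += 1
        heapq.heappush(free_at, start + service_time)
    return within_target / len(calls)
```

    Reusable models of this shape let "rules of engagement" (fleet size, dispatch order, targets) be varied as parameters while the event mechanics stay fixed.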

    Modeling social norms in real-world agent-based simulations

    Studying and simulating social systems, including human groups and societies, is a complex problem. To build a model that simulates humans' actions, it is necessary to consider the major factors that affect human behavior. Norms are one of these factors: social norms are the customary rules that govern behavior in groups and societies. Norms are everywhere around us, from the way people handshake or bow to the clothes they wear, and they play a large role in determining our behaviors. Studies of norms are much older than computer science, since normative studies have been a classic topic in sociology, psychology, philosophy and law, and various theories have been put forth about how social norms function. Although an extensive amount of research on norms has been performed in recent years, there remains a significant gap between current models and models that can explain real-world normative behaviors. Most existing work on norms focuses on abstract applications, and very few realistic normative simulations of human societies can be found. The contributions of this dissertation include the following: 1) a new hybrid technique based on agent-based modeling and Markov Chain Monte Carlo is introduced; this method is used to prepare a smoking case study for applying normative models. 2) This hybrid technique is described using category theory, a mathematical theory focusing on relations rather than objects. 3) The relationship between norm emergence in social networks and the theory of tipping points is studied. 4) A new lightweight normative architecture for studying smoking cessation trends is introduced. This architecture is then extended to a more general normative framework that can be used to model real-world normative behaviors. The final normative architecture considers both the cognitive and the social aspects of norm formation in human societies. Normative architectures based on only one of these two aspects exist in the literature, but an architecture that effectively includes both has been missing.
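    Norm emergence and tipping points of the kind studied in contribution 3) are commonly illustrated with threshold-adoption models. The sketch below, with its arbitrary peer-sample size, thresholds, and seed fractions, is such a generic toy model, not the dissertation's normative architecture: an agent adopts the norm once enough sampled peers have adopted it.

```python
import random

def norm_emergence(n_agents, threshold, seed_fraction, steps, rng=None):
    """Threshold model of norm adoption: an agent adopts the norm once the
    fraction of adopters among a random sample of 10 peers exceeds its
    threshold. Returns the final fraction of adopters."""
    rng = rng or random.Random(0)
    adopted = [i < int(n_agents * seed_fraction) for i in range(n_agents)]
    for _ in range(steps):
        for i in range(n_agents):
            if not adopted[i]:
                peers = rng.sample(range(n_agents), 10)
                if sum(adopted[j] for j in peers) / 10 > threshold:
                    adopted[i] = True
    return sum(adopted) / n_agents
```

    Varying `threshold` and `seed_fraction` exposes the tipping-point behaviour: below a critical mass of initial adopters the norm stalls, above it the norm cascades through the population.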

    Architectures and GPU-Based Parallelization for Online Bayesian Computational Statistics and Dynamic Modeling

    Recent work demonstrates that coupling Bayesian computational statistics methods with dynamic models can facilitate the analysis of complex systems associated with diverse time series, including those involving social and behavioural dynamics. Particle Markov Chain Monte Carlo (PMCMC) methods constitute a particularly powerful class of Bayesian methods combining aspects of batch Markov Chain Monte Carlo (MCMC) and the sequential Monte Carlo method of Particle Filtering (PF). PMCMC can flexibly combine theory-capturing dynamic models with diverse empirical data. Online machine learning is a subcategory of machine learning algorithms characterized by sequential, incremental execution as new data arrive, giving updated results and predictions as the sequence of available incoming data grows. While many machine learning and statistical methods have been adapted to online algorithms, PMCMC is one of the many methods whose compatibility with and adaptation to online learning remains unclear. In this thesis, I propose a data-streaming solution supporting PF and PMCMC methods with dynamic epidemiological models and demonstrate several successful applications. By constructing an automated, easy-to-use streaming system, analytic applications and simulation models gain access to arriving real-time data, shortening the time gap between data and the resulting model-supported insight. The architecture design emerging from the thesis would substantially expand traditional simulation models' potential by allowing such models to be offered as continually updated services. Contingent on sufficiently fast execution time, simulation models within this framework can consume incoming empirical data in real time and generate informative predictions on an ongoing basis as new data points arrive.
    In a second line of work, I investigated the platform's flexibility and capability by extending the system to support a powerful class of PMCMC algorithms with dynamic models while ameliorating such algorithms' traditionally severe performance limitations. Specifically, this work designed and implemented a GPU-enabled parallel version of a PMCMC method with dynamic simulation models. The resulting codebase has enabled researchers to adapt their models to state-of-the-art statistical inference methods and to ensure that the computation-heavy PMCMC method can perform significant sampling between the successive arrivals of new data points. Investigating this method's impact with several realistic PMCMC application examples showed that GPU-based acceleration allows for up to 160x speedup compared to a corresponding CPU-based version not exploiting parallelism. The GPU-accelerated PMCMC and the streaming processing system complement each other, jointly providing researchers with a powerful toolset to greatly accelerate learning and secure additional insight from the high-velocity data increasingly prevalent within social and behavioural spheres. The design philosophy applied supports a platform with broad generalizability and potential for ready future extension. The thesis discusses common barriers and difficulties in designing and implementing such systems and offers solutions to mitigate them.
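    The particle filtering building block inside PMCMC can be illustrated with a minimal bootstrap filter: propagate particles through the process model, weight them by the observation likelihood, then resample. The Gaussian random-walk state-space model and all parameters below are illustrative assumptions, not the epidemiological models used in the thesis, and this sequential sketch omits the GPU parallelism the thesis contributes.

```python
import math
import random

def particle_filter(observations, n_particles, process_sd, obs_sd, rng=None):
    """Bootstrap particle filter for a random-walk state observed with
    Gaussian noise. Returns the filtered posterior mean at each step."""
    rng = rng or random.Random(1)
    particles = [0.0] * n_particles
    means = []
    for y in observations:
        # propagate each particle through the process model
        particles = [x + rng.gauss(0, process_sd) for x in particles]
        # weight particles by the Gaussian observation likelihood
        weights = [math.exp(-0.5 * ((y - x) / obs_sd) ** 2) for x in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        means.append(sum(w * x for w, x in zip(weights, particles)))
        # multinomial resampling: duplicate high-weight particles
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return means
```

    The per-particle propagation and weighting steps are independent, which is exactly what makes this inner loop amenable to the kind of GPU parallelisation the thesis reports.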

    Adaptable Spatial Agent-Based Facility Location for Healthcare Coverage

    Lack of access to healthcare is a major contributor to poverty, mortality and morbidity worldwide. Public healthcare facilities (HCFs) are expected to be located such that they can be reached within reasonable distances of patients' locations while providing complete service coverage. However, complete service coverage is generally hampered by resource availability. The Maximal Covering Location Problem (MCLP) therefore seeks to locate HCFs such that as much of the population as possible is covered within a desired service distance. Consideration of the population not covered introduces a distance constraint, greater than the desired service distance, beyond which no population should be. Existing approaches to the MCLP exogenously set the number of HCFs and the distance parameters, with further assumptions of equal access to HCFs, infinite or equal capacity of HCFs, and data availability. These models treat the real-world system as static and do not address its intrinsic complexity, which is characterised by unstable and diverse geographic, demographic and socio-economic factors that influence the spatial distribution of population and HCFs, resource management, the number of HCFs and proximity to HCFs. Static analysis incurs additional expenditure in the analytical and decision-making process for every additional complexity and heterogeneity. This thesis focuses on addressing these limitations and simplifying the computationally intensive problems. A novel adaptable and flexible simulation-based meta-heuristic approach is employed to determine suitable locations for public HCFs by integrating Geographic Information Systems (GIS) with Agent-Based Models (ABM). Intelligent, adaptable and autonomous spatial and non-spatial agents interact with each other and the geographic environment while taking independent decisions governed by spatial rules such as containment, adjacency, proximity and connectivity.
    Three concepts are introduced: assessing the coverage of existing HCFs using travel time along the road network and determining the different average values of the service distance; endogenously determining the number and suitable locations of HCFs by integrating capacity and locational suitability constraints to maximize coverage within the prevailing service distance; and endogenously determining the distance constraint as the maximum distance between the population not covered within the desired service distance and its closest facility. Validation against existing algorithms produces comparable and better results. With confirmed transferability, the approach is applied to Lagos State, Nigeria in a disaggregated analysis that reflects spatial heterogeneity, to provide improved service coverage for healthcare. Assessment of the existing health service coverage and spatial distribution reveals disparate accessibility and insufficiency of the HCFs, whose locations do not factor in the spatial distribution of the population. Through the application of the simulation-based approach, cost-effective complete health service coverage is achieved with new HCFs. The spatial pattern and autocorrelation analysis reveal the influence of population distribution and geographic phenomena on HCF location. The relationship of selected HCFs with other spatial features indicates the agents' compliance with spatial association. This approach proves to be a better alternative in resource-constrained systems. Its adaptability and flexibility meet the global health coverage agenda and the desires of the decision maker and the population in supporting public health service coverage. In addition, a general theory of the system is obtained for better-informed decisions and analytical knowledge.
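    The classical MCLP that this thesis generalises is often solved approximately by a greedy heuristic: repeatedly open the site that covers the most not-yet-covered demand. The sketch below uses hypothetical demand and coverage inputs and fixes the number of facilities exogenously, which is exactly the static baseline the thesis's endogenous GIS/ABM approach moves beyond.

```python
def greedy_mclp(demand, sites, coverage, n_facilities):
    """Greedy heuristic for the Maximal Covering Location Problem.
    `demand` maps demand points to population; `coverage[s]` is the set of
    demand points within the service distance of candidate site s.
    Returns the opened sites and the total demand covered."""
    covered, opened = set(), []
    for _ in range(n_facilities):
        # pick the site adding the most uncovered demand
        best = max(sites,
                   key=lambda s: sum(demand[p] for p in coverage[s] - covered))
        opened.append(best)
        covered |= coverage[best]
    return opened, sum(demand[p] for p in covered)
```

    For example, with two candidate sites where one reaches 50 units of demand and the other adds 10 more, the heuristic opens them in that order. Capacity limits, travel-time coverage sets, and an endogenous facility count would all modify this basic loop.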

    Assessing vulnerability and modelling assistance: using demographic indicators of vulnerability and agent-based modelling to explore emergency flooding relief response

    Flooding is a significant concern for much of the UK and is recognised as a primary threat by most local councils. Those in society most often deemed vulnerable, for example the elderly, poor or sick, often see their level of vulnerability increase during hazard events. Greater knowledge of the spatial distribution of vulnerability within communities is key to understanding how a population may be impacted by a hazard event. Vulnerability indices are regularly used, in conjunction with needs assessments and on-the-ground research, to target service provision and justify resource allocation. Past work on measuring and mapping vulnerability has been limited by a focus on income-related indicators, a lack of consideration of accessibility, and a reliance on proprietary data. The Open Source Vulnerability Index (OSVI) encompasses an extensive range of vulnerability indicators, supported by the wider literature and expert validation, and provides data at a sufficiently fine resolution to identify vulnerable populations. Findings from the OSVI demonstrate the potential cascading impact of a flood hazard as it strikes an already vulnerable population: exacerbating pre-existing vulnerabilities, limiting capabilities, and restricting accessibility and access to key services. The OSVI feeds into an agent-based model (ABM) that explores the capacity of the British Red Cross (BRC) to distribute relief during flood emergencies using strategies based upon the OSVI. A participatory modelling approach was employed whereby the BRC were included in all aspects of model development. The major contribution of this work is the novel synthesis of demographic analysis, vulnerability mapping and geospatial simulation. The project contributes to the growing understanding of vulnerability and response management within the NGO sector.
    It is hoped that the index and model produced will allow responder organisations to run simulations of similar emergency events and adjust strategic response plans accordingly.
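    A composite index of the general kind described can be sketched as a weighted mean of min-max-normalised indicators per area. The indicator names, weights, and the simple normalisation scheme below are hypothetical illustrations, not the OSVI's actual indicators or methodology.

```python
def vulnerability_index(indicators, weights):
    """Composite vulnerability score: min-max normalise each indicator
    across areas, then take the weighted mean per area (0 = least
    vulnerable area, 1 = most vulnerable on every indicator)."""
    names = list(weights)
    lo = {n: min(a[n] for a in indicators.values()) for n in names}
    hi = {n: max(a[n] for a in indicators.values()) for n in names}
    scores = {}
    for area, vals in indicators.items():
        norm = [(vals[n] - lo[n]) / (hi[n] - lo[n]) for n in names]
        scores[area] = (sum(w * x for w, x in zip(weights.values(), norm))
                        / sum(weights.values()))
    return scores
```

    Scores of this form can then seed an ABM's relief-distribution strategy, for instance by visiting areas in descending order of vulnerability.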