17 research outputs found

    flepiMoP: The evolution of a flexible infectious disease modeling pipeline during the COVID-19 pandemic

    The COVID-19 pandemic led to an unprecedented demand for projections of disease burden and healthcare utilization under scenarios ranging from unmitigated spread to strict social distancing policies. In response, members of the Johns Hopkins Infectious Disease Dynamics Group developed flepiMoP (formerly called the COVID Scenario Modeling Pipeline), a comprehensive open-source software pipeline designed for creating and simulating compartmental models of infectious disease transmission and for inferring parameters through these models. The framework has been used extensively to produce short-term forecasts and longer-term scenario projections of COVID-19 at the state and county level in the US, for COVID-19 in other countries at various geographic scales, and more recently for seasonal influenza. In this paper, we highlight how flepiMoP has evolved throughout the COVID-19 pandemic to address changing epidemiological dynamics, new interventions, and shifts in policy-relevant model outputs. As the framework has reached a mature state, we provide a detailed overview of flepiMoP's key features and remaining limitations, and we distribute flepiMoP and its documentation as a flexible and powerful tool for researchers and public health professionals to rapidly build and deploy large-scale, complex infectious disease models for any pathogen and demographic setting.
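    The compartmental models the abstract describes can be illustrated with a minimal sketch. This is not flepiMoP's actual model structure or API (which is far more general and configuration-driven); it is just a basic SIR model integrated with forward Euler steps, with all parameter values chosen for illustration.

```python
# Minimal SIR compartmental model sketch (illustrative only; not the
# flepiMoP implementation). Forward Euler integration of S -> I -> R.

def simulate_sir(beta, gamma, s0, i0, r0, days, dt=0.1):
    """Integrate a basic SIR model; beta = transmission rate,
    gamma = recovery rate, both per day."""
    s, i, r = float(s0), float(i0), float(r0)
    n = s + i + r
    for _ in range(int(days / dt)):
        new_inf = beta * s * i / n * dt   # S -> I transitions this step
        new_rec = gamma * i * dt          # I -> R transitions this step
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
    return s, i, r

# Example run: R0 = beta/gamma = 3, so most of the population is
# eventually infected and recovered.
s, i, r = simulate_sir(beta=0.3, gamma=0.1, s0=9990, i0=10, r0=0, days=160)
```

A production pipeline would replace this toy integrator with a configurable compartment graph, stochastic transitions, and parameter inference, as the abstract outlines.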

    Exploring Blockchain Adoption Supply Chains: Opportunities and Challenges

    Acquisition Management / Grant Technical Report. Acquisition Research Program Sponsored Report Series. In modern supply chains, acquisition often occurs with the involvement of a network of organizations. The resilience, efficiency, and effectiveness of supply networks are crucial for the viability of acquisition. Disruptions in the supply chain require adequate communication infrastructure to ensure resilience. However, supply networks do not have a shared information technology infrastructure that ensures effective communication. Therefore, decision-makers seek new methodologies for resilient supply chain management. Blockchain technology offers new decentralization and service delegation methods that can transform supply chains into more flexible, efficient, and effective ones. This report presents a framework for the application of Blockchain technology in supply chain management to improve resilience. In the first part of this study, we discuss the limitations and challenges of the supply chain system that can be addressed by integrating Blockchain technology. In the second part, the report provides a comprehensive Blockchain-based supply chain network management framework. The application of the proposed framework is demonstrated using modeling and simulation. The differences in the simulation scenarios can provide guidance for decision-makers who consider using the developed framework during the acquisition process. Approved for public release; distribution is unlimited.

    Enhancing Partially Labelled Data: Self Learning and Word Vectors in Natural Language Processing

    There has been an explosion in unstructured text data in recent years, with services like Twitter, Facebook, and WhatsApp helping to drive this growth. Many of these companies face pressure to monitor the content on their platforms, making Natural Language Processing (NLP) techniques more important than ever. NLP has many applications, including spam filtering, sentiment analysis of social media, automatic text summarisation, and document classification.
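    The self-learning (self-training) approach named in the title can be sketched as a loop: train on the labelled data, predict on the unlabelled pool, adopt only confident predictions as pseudo-labels, and repeat. The sketch below is a toy illustration on 1-D points with a nearest-centroid classifier and a margin-based confidence rule; a real NLP pipeline would operate on word vectors with a proper model, but the loop structure is the same. The threshold value and the classifier are assumptions for illustration, not the thesis's method.

```python
# Toy self-training (pseudo-labelling) sketch. Confidence is taken to be
# the margin between distances to the best and runner-up class centroids.

def centroids(labelled):
    """Mean position of points per class label."""
    sums, counts = {}, {}
    for x, y in labelled:
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

def self_train(labelled, unlabelled, threshold=1.0, rounds=5):
    labelled, pool = list(labelled), list(unlabelled)
    for _ in range(rounds):
        cents = centroids(labelled)
        keep = []
        for x in pool:
            dists = sorted((abs(x - c), y) for y, c in cents.items())
            margin = dists[1][0] - dists[0][0]  # gap to runner-up class
            if margin >= threshold:             # confident: pseudo-label it
                labelled.append((x, dists[0][1]))
            else:
                keep.append(x)                  # stays unlabelled for now
        if len(keep) == len(pool):
            break                               # no new points adopted
        pool = keep
    return labelled, pool

seed = [(0.0, "neg"), (10.0, "pos")]
labelled, remaining = self_train(seed, [1.0, 2.0, 8.5, 9.5, 5.0])
# The ambiguous midpoint 5.0 never clears the margin and stays unlabelled.
```

The key design point is the confidence threshold: too low and noisy pseudo-labels corrupt the training set; too high and the unlabelled data is never exploited.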

    A Multicomponent Distributed Framework for Smart Production System Modeling and Simulation

    In order to control manufacturing systems, managers need risk and performance evaluation methods and simulation tools. However, these simulation techniques must evolve into multiperformance, multiactor, and multisimulation tools, which requires interoperability between distributed components. This paper presents an integrated platform that brings interoperability to several simulation components. This work extends the process modeling tool Papyrus to allow it to communicate with external components through both distributed simulation and cosimulation standards. The distributed modeling and simulation framework (DMSF) platform takes its environment into consideration in order to evaluate the sustainability of the system while integrating external heterogeneous components. For instance, a DMSF connection with external IoT devices has been implemented. Moreover, the orchestration of different smart manufacturing components and services is achieved through configurable business models. As a result, an automotive industry case study has been successfully tested, demonstrating the sustainability of smart supply chains and manufacturing factories and allowing better connectivity with their real environments.

    Proceedings, MSVSCC 2013

    Proceedings of the 7th Annual Modeling, Simulation & Visualization Student Capstone Conference, held on April 11, 2013 at VMASC in Suffolk, Virginia.

    Tools and methods in participatory modeling: Selecting the right tool for the job

    © 2018 Elsevier Ltd. Various tools and methods are used in participatory modelling, at different stages of the process and for different purposes. The diversity of tools and methods can create challenges for stakeholders and modelers when selecting the ones most appropriate for their projects. We offer a systematic overview, assessment, and categorization of methods to assist modelers and stakeholders with their choices and decisions. Most available literature provides little justification or information on the reasons for the use of particular methods or tools in a given study. In most cases, it seems that the prior experience and skills of the modelers had a dominant effect on the selection of the methods used. While we found no real evidence that this approach is wrong, we do think that putting more thought into the method selection process and choosing the most appropriate method for the project can produce better results. Based on expert opinion and a survey of modelers engaged in participatory processes, we offer practical guidelines to improve decisions about method selection at different stages of the participatory modeling process.

    Modelling and Simulation of Human-Environment Interactions

    Computational models provide intelligent environmental decision support systems for understanding how human decisions are shaped by, and contribute to changes in, the environment. These models provide essential tools to tackle the important issues raised by climate change, including migrations and conflicts due to resource scarcity (e.g., water resources), while accounting for the necessity of co-managing ecosystems across a population of stakeholders with diverse goals. Such socio-environmental systems are characterized by their complexity, which is reflected by an abundance of open questions. This book explores several of these open questions, based on contributions from over 50 authors. While several books account for methodological developments in modeling socio-environmental systems, our book is unique in combining case studies, methodological innovations, and a holistic approach to training the next generation of modelers. One chapter covers the ontological, epistemological, and ethical issues raised at the intersection of sustainability research and social simulation. In another chapter, we show that the benefits of simulations are not limited to managing complex ecosystems, as they can also serve an educational mission by teaching essential rules and thus improving systems-thinking competencies in the broader population.

    Big data analytics tools for improving the decision-making process in agrifood supply chain

    Introduction: In the interest of ensuring long-term food security and safety in the face of changing circumstances, it is necessary to understand and take into consideration the environmental, social, and economic aspects of food and beverage production in relation to consumer demand. Moreover, due to globalization, long supply chains have raised problems of information asymmetry, counterfeiting, difficulty in tracing and tracking the origin of products, and numerous related issues such as consumer well-being and healthcare costs. Emerging technologies drive new socio-economic approaches, as they enable governments and individual agricultural producers to collect and analyze an ever-increasing amount of environmental, agronomic, and logistic data, and they give consumers and quality-control authorities quick and easy access to all necessary information. Aim: The research concerns the study of ways to improve the production process by reducing information asymmetry, making information available to interested parties in a reasonable time, analyzing data about production processes while considering the environmental impact of production in terms of ecology, economy, food safety, and food quality, building opportunities for stakeholders to make informed decisions, and simplifying the control of quality, counterfeiting, and fraud. Therefore, the aim of this work is to study current supply chains, to identify their weaknesses and necessities, to investigate emerging technologies, their characteristics, and their impacts on supply chains, and to provide useful recommendations to industry, governments, and policymakers.

    Monitoring of Honey Bee Colony Losses

    In recent decades, independent national and international research programs have revealed possible reasons behind the death of managed honey bee colonies worldwide. Such losses are not due to a single factor, but instead to highly complex interactions between various internal and external influences, including pests, pathogens, honey bee stock diversity, and environmental changes. Reduced honey bee vitality and nutrition, exposure to agrochemicals, and the quality of colony management contribute to reduced colony survival in beekeeping operations. Our Special Issue (SI) on "Monitoring of Honey Bee Colony Losses" aims to address the specific challenges that honey bee researchers and beekeepers face. This SI includes four reviews, one being a meta-analysis that identifies gaps in current and future directions for research into honey bee colony mortality. Other review articles include studies regarding the impact of numerous factors on honey bee mortality, including external abiotic factors (e.g., winter conditions and colony management) as well as biotic factors such as attacks by Vespa velutina and Varroa destructor.

    Keeping checkpoint/restart viable for exascale systems

    Next-generation exascale systems, those capable of performing a quintillion operations per second, are expected to be delivered in the next 8-10 years. These systems, which will be 1,000 times faster than current systems, will be of unprecedented scale. As these systems continue to grow in size, faults will become increasingly common, even over the course of small calculations. Therefore, issues such as fault tolerance and reliability will limit application scalability. Current techniques to ensure progress across faults, like checkpoint/restart, the dominant fault tolerance mechanism for the last 25 years, are increasingly problematic at the scales of future systems due to their excessive overheads. In this work, we evaluate a number of techniques to decrease the overhead of checkpoint/restart and keep this method viable for future exascale systems. More specifically, this work evaluates state-machine replication to dramatically increase the checkpoint interval (the time between successive checkpoints) and hash-based, probabilistic incremental checkpointing using graphics processing units to decrease the checkpoint commit time (the time to save one checkpoint). Using a combination of empirical analysis, modeling, and simulation, we study the costs and benefits of these approaches over a wide range of parameters. These results, which cover a number of high-performance computing capability workloads, different failure distributions, hardware mean times to failure, and I/O bandwidths, show the potential benefits of these techniques for meeting the reliability demands of future exascale platforms.
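    The hash-based incremental checkpointing idea described above can be sketched in a few lines: split the application state into fixed-size blocks, hash each block, and write only the blocks whose hash changed since the previous checkpoint. The block size, the in-memory "store", and the use of SHA-256 here are illustrative assumptions; the work summarized above offloads hashing to GPUs and writes to real storage, and the approach is probabilistic because a hash collision could silently skip a changed block.

```python
import hashlib

# Sketch of hash-based incremental checkpointing (illustrative only).
BLOCK = 4  # bytes per block; tiny so the example is easy to follow

def block_hashes(state: bytes):
    """SHA-256 digest of each fixed-size block of the state."""
    return [hashlib.sha256(state[i:i + BLOCK]).digest()
            for i in range(0, len(state), BLOCK)]

def incremental_checkpoint(state, prev_hashes, store):
    """Write only blocks whose hash changed; return (new hashes, blocks written)."""
    cur = block_hashes(state)
    written = 0
    for idx, h in enumerate(cur):
        if idx >= len(prev_hashes) or prev_hashes[idx] != h:
            store[idx] = state[idx * BLOCK:(idx + 1) * BLOCK]
            written += 1
    return cur, written

store = {}
h1, w1 = incremental_checkpoint(b"aaaabbbbcccc", [], store)   # first: full write
h2, w2 = incremental_checkpoint(b"aaaaXbbbcccc", h1, store)   # only 1 dirty block
```

The commit-time saving comes from `written` shrinking when the state changes slowly between checkpoints, at the cost of hashing every block on each checkpoint.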