58 research outputs found
Towards efficient large scale epidemiological simulations in EpiGraph
The work we present in this paper focuses on understanding the propagation of flu-like infectious outbreaks between geographically distant regions due to the movement of people outside their base location. Our approach incorporates geographic location and a transportation model into our existing region-based, closed-world EpiGraph simulator to model more realistic movement of the virus between different geographic areas. This paper describes the MPI-based implementation of this simulator, including several optimization techniques, such as a novel approach for mapping processes onto available processing elements based on the temporal distribution of process loads. We present an extensive evaluation of EpiGraph in terms of its ability to simulate large-scale scenarios, as well as from a performance perspective. We would like to acknowledge the assistance provided by David del Río Astorga and Alberto Martín Cajal. This work has been partially supported by the Spanish Ministry of Science under grant TIN2010-16497, 2010. Peer reviewed. Postprint (author's final draft).
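The load-aware process mapping this abstract mentions can be sketched as a greedy longest-processing-time assignment: place each process, heaviest first, on the currently least-loaded processing element. This is an illustrative reconstruction only, not EpiGraph's actual algorithm; `map_processes` and its parameters are hypothetical names.

```python
# Hypothetical sketch (not EpiGraph's actual code): greedy longest-processing-time
# assignment of simulation processes to processing elements (PEs), using each
# process's estimated temporal load as the balancing key.
import heapq

def map_processes(loads, num_pes):
    """Assign process indices to PEs so that per-PE total load stays balanced.

    loads: list of estimated per-process loads; num_pes: available PEs.
    Returns a list of process-index lists, one per PE.
    """
    # Min-heap of (accumulated_load, pe_index): pop the least-loaded PE each time.
    heap = [(0.0, pe) for pe in range(num_pes)]
    heapq.heapify(heap)
    assignment = [[] for _ in range(num_pes)]
    # Place heaviest processes first, so small ones fill remaining imbalance.
    for proc in sorted(range(len(loads)), key=lambda i: loads[i], reverse=True):
        total, pe = heapq.heappop(heap)
        assignment[pe].append(proc)
        heapq.heappush(heap, (total + loads[proc], pe))
    return assignment
```

This greedy heuristic is a standard makespan-balancing approach; EpiGraph's mapping additionally accounts for how each process's load varies over time, which a static sketch like this does not capture.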
ACM Trans Model Comput Simul
The Global-Scale Agent Model (GSAM) is presented. The GSAM is a high-performance distributed platform for agent-based epidemic modeling capable of simulating a disease outbreak in a population of several billion agents. It is unprecedented in its scale, its speed, and its use of Java. Solutions to multiple challenges inherent in distributing massive agent-based models are presented. Communication, synchronization, and memory usage are among the topics covered in detail. The memory usage discussion is Java-specific; however, the communication and synchronization discussions apply broadly. We provide benchmarks illustrating the GSAM's speed and scalability. Funding: R24 HD042854 (NICHD), U01 GM070708 (NIGMS), DP1 OD003874 (NIH OD), P01 TP000304 (CDC OPHPR), DP1 GM105382 (NIGMS).
Research And Application Of Parallel Computing Algorithms For Statistical Phylogenetic Inference
Estimating the evolutionary history of organisms, phylogenetic inference, is a
critical step in many analyses involving biological sequence data such as DNA.
The likelihood calculations at the heart of the most effective methods for
statistical phylogenetic analyses are extremely computationally intensive, and
hence these analyses become a bottleneck in many studies. Recent progress in
computer hardware, specifically the increase in pervasiveness of highly
parallel, many-core processors has created opportunities for new approaches to
computationally intensive methods, such as those in phylogenetic inference.
We have developed an open source library, BEAGLE, which uses parallel
computing methods to greatly accelerate statistical phylogenetic inference,
for both maximum likelihood and Bayesian approaches. BEAGLE defines a uniform
application programming interface and includes a collection of efficient
implementations that use NVIDIA CUDA, OpenCL, and C++ threading frameworks
for evaluating likelihoods under a wide variety of evolutionary models, on
GPUs as well as on multi-core CPUs. BEAGLE employs a number of different
parallelization techniques for phylogenetic inference, at different
granularity levels and for distinct processor architectures. On CUDA and
OpenCL devices, the library enables concurrent computation of site likelihoods,
data subsets, and independent subtrees. The general design features of the
library also provide a model for software development using parallel computing
frameworks that is applicable to other domains.
BEAGLE has been integrated with some of the leading programs in the field,
such as MrBayes and BEAST, and is used in a diverse range of evolutionary
studies, including those of disease causing viruses. The library can provide
significant performance gains, with the exact increase in performance
depending on the specific properties of the data set, evolutionary model, and
hardware. In general, nucleotide analyses are accelerated on the order of
10-fold and codon analyses on the order of 100-fold.
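The site-level parallelism described above works because per-site likelihoods are independent and can be evaluated concurrently. The sketch below is illustrative only, not the BEAGLE API: it scores two aligned sequences under the Jukes-Cantor (JC69) model with a thread pool over sites. In a real implementation this granularity is offloaded to a GPU or vectorized; `site_log_likelihood` and `alignment_log_likelihood` are hypothetical names.

```python
# Illustrative sketch of site-level parallelism (not the BEAGLE API):
# each aligned site's likelihood is independent, so sites can be scored
# concurrently -- the same granularity BEAGLE exploits on CUDA/OpenCL devices.
import math
from concurrent.futures import ThreadPoolExecutor

def site_log_likelihood(site):
    """Log-likelihood of one aligned base pair (x, y) at branch length t (JC69)."""
    x, y, t = site
    e = math.exp(-4.0 * t / 3.0)
    # JC69 transition probability: same base vs. different base.
    p = 0.25 + 0.75 * e if x == y else 0.25 - 0.25 * e
    return math.log(0.25 * p)  # 0.25: uniform stationary frequency of the first base

def alignment_log_likelihood(seq_a, seq_b, t, workers=4):
    """Sum per-site log-likelihoods, evaluating sites concurrently."""
    sites = [(a, b, t) for a, b in zip(seq_a, seq_b)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(site_log_likelihood, sites))
```

Note that Python threads add no speedup for pure-Python arithmetic; the point is only that the per-site computations share no state, which is what makes GPU parallelization of likelihood evaluation effective.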
Edge Intelligence for Empowering IoT-based Healthcare Systems
The demand for real-time, affordable, and efficient smart healthcare services
is increasing exponentially due to the technological revolution and rapid
population growth. To meet the increasing demands on this critical infrastructure,
there is a need for intelligent methods to cope with the existing obstacles in
this area. In this regard, edge computing technology can reduce latency and
energy consumption by moving processes closer to the data sources in comparison
to the traditional centralized cloud and IoT-based healthcare systems. In
addition, by bringing automated insights into the smart healthcare systems,
artificial intelligence (AI) provides the possibility of detecting and
predicting high-risk diseases in advance, decreasing medical costs for
patients, and offering efficient treatments. The objective of this article is
to highlight the benefits of the adoption of edge intelligent technology, along
with AI in smart healthcare systems. Moreover, a novel smart healthcare model
is proposed to boost the utilization of AI and edge technology in smart
healthcare systems. Additionally, the paper discusses issues and research
directions that arise when integrating these technologies. Comment: This paper has been accepted in IEEE Wireless Communication Magazine.
Modern computing: Vision and challenges
Over the past six decades, the computing systems field has experienced significant transformations, profoundly impacting society through developments such as the Internet and the commodification of computing. Underpinned by technological advancements, computer systems, far from being static, have been continuously evolving and adapting to cover multifaceted societal niches. This has led to new paradigms such as cloud, fog, edge computing, and the Internet of Things (IoT), which offer fresh economic and creative opportunities. Nevertheless, this rapid change poses complex research challenges, especially in maximizing potential and enhancing functionality. As such, to maintain an economical level of performance that meets ever-tighter requirements, one must understand the drivers of new model emergence and expansion, and how contemporary challenges differ from past ones. To that end, this article investigates and assesses the factors influencing the evolution of computing systems, covering established systems and architectures as well as newer developments, such as serverless computing, quantum computing, and on-device AI on edge devices. Trends emerge when one traces technological trajectory, including the rapid obsolescence of frameworks due to business and technical constraints, a move towards specialized systems and models, and varying approaches to centralized and decentralized control. This comprehensive review of modern computing systems looks ahead to the future of research in the field, highlighting key challenges and emerging trends, and underscoring their importance in cost-effectively driving technological progress.