Hierarchical fractional-step approximations and parallel kinetic Monte Carlo algorithms
We present a mathematical framework for constructing and analyzing parallel
algorithms for lattice Kinetic Monte Carlo (KMC) simulations. The resulting
algorithms have the capacity to simulate a wide range of spatio-temporal scales
in spatially distributed, non-equilibrium physicochemical processes with complex
chemistry and transport micro-mechanisms. The algorithms can be tailored to
specific hierarchical parallel architectures such as multi-core processors or
clusters of Graphical Processing Units (GPUs). The proposed parallel algorithms
are controlled-error approximations of kinetic Monte Carlo algorithms,
departing from the predominant paradigm of creating parallel KMC algorithms
with exactly the same master equation as the serial one.
Our methodology relies on a spatial decomposition of the Markov operator
underlying the KMC algorithm into a hierarchy of operators corresponding to the
processors' structure in the parallel architecture. Based on this operator
decomposition, we formulate Fractional Step Approximation schemes by employing
the Trotter Theorem and its random variants; these schemes (a) determine the
communication schedule between processors, and (b) are run independently on
each processor through a serial KMC simulation, called a kernel, on each
fractional step time-window.
Furthermore, the proposed mathematical framework allows us to rigorously
justify the numerical and statistical consistency of the proposed algorithms,
showing the convergence of our approximating schemes to the original serial
KMC. The approach also provides a systematic evaluation of different processor
communication schedules.
Comment: 34 pages, 9 figures
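The fractional-step idea can be sketched on a toy model: decompose the lattice into blocks, and within each Trotter time window run a serial KMC kernel independently on each block. The spin-flip dynamics, constant rates, and two-block decomposition below are hypothetical simplifications for illustration, not the paper's actual scheme or communication schedule.

```python
import random

def kmc_kernel(lattice, sites, rate, t_window, rng):
    """Serial KMC kernel (constant-rate spin flips) restricted to `sites`,
    run for one fractional-step time window of length t_window."""
    t = 0.0
    while True:
        total_rate = rate * len(sites)
        dt = rng.expovariate(total_rate)  # time to next event
        if t + dt > t_window:
            return
        t += dt
        i = rng.choice(sites)
        lattice[i] ^= 1  # flip the spin at the chosen site

def trotter_step(lattice, blocks, rate, dt, rng):
    """One Lie-Trotter fractional step: run the kernel on each block in
    turn. In a parallel implementation each block's kernel would run on
    its own processor, with communication at the block boundaries."""
    for sites in blocks:
        kmc_kernel(lattice, sites, rate, dt, rng)

rng = random.Random(0)
N = 16
lattice = [0] * N
blocks = [list(range(0, N // 2)), list(range(N // 2, N))]
for _ in range(100):
    trotter_step(lattice, blocks, rate=1.0, dt=0.1, rng=rng)
print(sum(lattice))
```

In this sketch the error controlled by the framework is the splitting error of the Trotter decomposition, which shrinks with the time window dt.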
Multifractal Characterization of Protein Contact Networks
The multifractal detrended fluctuation analysis of time series is able to
reveal the presence of long-range correlations and, at the same time, to
characterize the self-similarity of the series. The rich information derivable
from the characteristic exponents and the multifractal spectrum can be further
analyzed to discover important insights about the underlying dynamical process.
In this paper, we employ multifractal analysis techniques in the study of
protein contact networks. To this end, initially a network is mapped to three
different time series, each of which is generated by a stationary unbiased
random walk. To capture the peculiarities of the networks at different levels,
we accordingly consider three observables at each vertex: the degree, the
clustering coefficient, and the closeness centrality. To compare the results
with suitable references, we consider also instances of three well-known
network models and two typical time series with pure monofractal and
multifractal properties. The first result of notable interest is that time
series associated with protein contact networks exhibit long-range correlations
(strong persistence), which are consistent with signals in-between the typical
monofractal and multifractal behavior. Subsequently, a suitable embedding of
the multifractal spectra allows us to focus on ensemble properties, which in turn
gives us the possibility to make further observations regarding the considered
networks. In particular, we highlight the different role that small and large
fluctuations of the considered observables play in the characterization of the
network topology.
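The network-to-time-series mapping described above can be sketched for one of the three observables, the vertex degree: an unbiased random walk visits vertices and the degree of each visited vertex forms the series (clustering coefficient and closeness centrality would be recorded analogously). The toy graph below is a hypothetical example, not a protein contact network, and the subsequent multifractal detrended fluctuation analysis is omitted.

```python
import random

def degree_walk_series(adj, steps, rng):
    """Map a network to a time series: perform an unbiased random walk
    on adjacency dict `adj` and record the degree of each visited
    vertex. (A stationary walk would additionally draw the start vertex
    from the walk's stationary distribution.)"""
    v = rng.choice(list(adj))
    series = []
    for _ in range(steps):
        series.append(len(adj[v]))       # observable at current vertex
        v = rng.choice(adj[v])           # unbiased step to a neighbour
    return series

# toy graph: a star centred at 0 plus one extra edge (3, 4)
adj = {
    0: [1, 2, 3, 4],
    1: [0],
    2: [0],
    3: [0, 4],
    4: [0, 3],
}
rng = random.Random(1)
series = degree_walk_series(adj, 50, rng)
print(series[:10])
```

The resulting series inherits correlations from the network topology, which is what makes the fluctuation analysis informative.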
Semi-automated creation of converged iTV services: From macromedia director simulations to services ready for broadcast
While sound and video may capture viewers’ attention, interaction can captivate them. This has not been available prior to the advent of Digital Television. In fact, what lies at the heart of the Digital Television revolution
is this new type of interactive content, offered
in the form of interactive Television (iTV) services. On top of that, the new world of converged networks has created a demand for a new type of converged services on a range of mobile terminals (Tablet PCs, PDAs and mobile phones). This paper aims at presenting a new approach to service creation that allows for the semi-automatic translation of simulations and rapid prototypes created in the accessible desktop
multimedia authoring package Macromedia Director
into services ready for broadcast. This is achieved by a series of tools that de-skill and speed up the process of creating digital TV user interfaces (UIs) and applications for mobile terminals.
The benefits of rapid prototyping are essential for the production of these new types of services, and are therefore discussed in the first section of this paper.
In the following sections, an overview of the operation of the content and service creation and management sub-systems is presented, which illustrates why these tools form an important and integral part of a system responsible for creating, delivering and managing converged broadcast and telecommunications services.
The next section examines a number of candidate metadata languages for describing the iTV service user interface, and the schema language adopted in this project. A detailed description of the operation of the two tools is provided to offer insight into how they can be used to de-skill and speed up the process of creating digital TV user interfaces and applications for mobile terminals. Finally, representative broadcast-oriented and telecommunication-oriented converged service components are also introduced, demonstrating how these tools have been used to generate different types of services.
Computer model calibration with large non-stationary spatial outputs: application to the calibration of a climate model
Bayesian calibration of computer models tunes unknown input parameters by
comparing outputs with observations. For model outputs that are distributed
over space, this becomes computationally expensive because of the output size.
To overcome this challenge, we employ a basis representation of the model
outputs and observations: we match these decompositions to carry out the
calibration efficiently. We then incorporate non-stationary behaviour, in
terms of spatial variations of both variance and correlation, into the
calibration by introducing two integrated nested Laplace
approximation-stochastic partial differential equation (INLA-SPDE) parameters.
calibration. A synthetic example and a climate model illustration highlight the
benefits of our approach.
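The computational gain from the basis representation can be illustrated with a toy 1-D example: instead of comparing the full spatial field, the calibration compares a handful of basis coefficients. Everything below is a hypothetical sketch: a one-parameter simulator, orthonormal sine modes standing in for a data-driven basis (e.g. principal components), and a grid search standing in for the Bayesian posterior exploration; the INLA-SPDE non-stationarity step is omitted.

```python
import math

def basis_coeffs(field, bases):
    """Project a spatial field onto a small set of basis vectors, so
    that calibration compares a few coefficients rather than the full
    (large) spatial output."""
    return [sum(f * b for f, b in zip(field, b_k)) for b_k in bases]

def normalize(v):
    s = math.sqrt(sum(x * x for x in v))
    return [x / s for x in v]

# hypothetical 1-D 'spatial' grid and a toy simulator with one input theta
n = 64
grid = [i / (n - 1) for i in range(n)]

def simulator(theta):
    return [theta * math.sin(math.pi * x) for x in grid]

# two orthonormal sine modes as a stand-in for the basis decomposition
bases = [normalize([math.sin(math.pi * x) for x in grid]),
         normalize([math.sin(2 * math.pi * x) for x in grid])]

obs = simulator(1.7)  # pretend observations generated at theta = 1.7
obs_c = basis_coeffs(obs, bases)

# grid-search calibration in coefficient space
best = min(
    (sum((a - b) ** 2
         for a, b in zip(basis_coeffs(simulator(t), bases), obs_c)), t)
    for t in [i / 100 for i in range(0, 301)]
)
print(best[1])  # recovers theta = 1.7
```

Matching in coefficient space keeps the per-evaluation cost proportional to the number of basis functions, not the number of spatial locations, which is the source of the efficiency claimed in the abstract.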
The Computational Diet: A Review of Computational Methods Across Diet, Microbiome, and Health.
Food and human health are inextricably linked. As such, revolutionary impacts on health have been derived from advances in the production and distribution of food relating to food safety and fortification with micronutrients. During the past two decades, it has become apparent that the human microbiome has the potential to modulate health, including in ways that may be related to diet and the composition of specific foods. Despite the excitement and potential surrounding this area, fully understanding the complexity of the gut microbiome, the chemical composition of food, and their interplay in situ remains a daunting task. However, recent advances in high-throughput sequencing, metabolomics profiling, compositional analysis of food, and the emergence of electronic health records provide new sources of data that can contribute to addressing this challenge. Computational science will play an essential role in this effort as it will provide the foundation to integrate these data layers and derive insights capable of revealing and understanding the complex interactions between diet, gut microbiome, and health. Here, we review the current knowledge on diet-health-gut microbiota, relevant data sources, bioinformatics tools, machine learning capabilities, as well as the intellectual property and legislative regulatory landscape. We provide guidance on employing machine learning and data analytics, identify gaps in current methods, and describe new scenarios to be unlocked in the next few years in the context of current knowledge.