Gunrock: A High-Performance Graph Processing Library on the GPU
For large-scale graph analytics on the GPU, the irregularity of data access
and control flow, and the complexity of programming GPUs have been two
significant challenges for developing a programmable high-performance graph
library. "Gunrock", our graph-processing system designed specifically for the
GPU, uses a high-level, bulk-synchronous, data-centric abstraction focused on
operations on a vertex or edge frontier. Gunrock achieves a balance between
performance and expressiveness by coupling high-performance GPU computing
primitives and optimization strategies with a high-level programming model that
allows programmers to quickly develop new graph primitives with small code size
and minimal GPU programming knowledge. We evaluate Gunrock on five key graph
primitives and show that Gunrock has on average at least an order of magnitude
speedup over Boost and PowerGraph, comparable performance to the fastest GPU
hardwired primitives, and better performance than any other GPU high-level
graph library.
Comment: 14 pages, accepted by PPoPP'16 (removed the text repetition in the previous version v5).
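The frontier-centric, bulk-synchronous model the abstract describes can be illustrated with a minimal breadth-first-search sketch. This is Python pseudocode for the concept only: Gunrock itself is a CUDA/C++ library, and its actual operators (such as advance and filter) are far more elaborate than the loop below.

```python
def bfs_frontier(adj, source):
    """Bulk-synchronous BFS: each iteration advances the whole frontier at once.
    `adj` maps each vertex to its neighbor list; returns vertex -> depth."""
    depth = {source: 0}
    frontier = [source]  # current vertex frontier
    level = 0
    while frontier:
        level += 1
        next_frontier = []
        for v in frontier:          # "advance": expand every frontier vertex
            for w in adj[v]:
                if w not in depth:  # "filter": keep unvisited vertices only
                    depth[w] = level
                    next_frontier.append(w)
        frontier = next_frontier    # bulk-synchronous step boundary
    return depth

adj = {0: [1, 2], 1: [3], 2: [3], 3: []}
print(bfs_frontier(adj, 0))  # {0: 0, 1: 1, 2: 1, 3: 2}
```

On a GPU, the inner loops over the frontier and over neighbors are what gets parallelized; the synchronization point between `frontier` and `next_frontier` is the "bulk-synchronous" boundary the abstract refers to.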
Simplifying the Development, Use and Sustainability of HPC Software
Developing software to undertake complex, compute-intensive scientific
processes requires a challenging combination of both specialist domain
knowledge and software development skills to convert this knowledge into
efficient code. As computational platforms become increasingly heterogeneous
and newer types of platform such as Infrastructure-as-a-Service (IaaS) cloud
computing become more widely accepted for HPC computations, scientists require
more support from computer scientists and resource providers to develop
efficient code and make optimal use of the resources available to them. As part
of the libhpc stage 1 and 2 projects we are developing a framework to provide a
richer means of job specification and efficient execution of complex scientific
software on heterogeneous infrastructure. The use of such frameworks has
implications for the sustainability of scientific software. In this paper we
set out our developing understanding of these challenges based on work carried
out in the libhpc project.
Comment: 4-page position paper, submission to the WSSSPE13 workshop.
A component-based middleware framework for configurable and reconfigurable Grid computing
Significant progress has been made in the design and development of Grid middleware which, in its present form, is founded on Web services technologies. However, we argue that present-day Grid middleware is severely limited in supporting projected next-generation applications which will involve pervasive and heterogeneous networked infrastructures, and advanced services such as collaborative distributed visualization. In this paper we discuss a new Grid middleware framework that features (i) support for advanced network services based on the novel concept of pluggable overlay networks, and (ii) an architectural framework for constructing bespoke Grid middleware platforms in terms of 'middleware domains' such as extensible interaction types and resource discovery. We believe that such features will become increasingly essential with the emergence of next-generation e-Science applications.
Copyright © 2005 John Wiley & Sons, Ltd.
GRIDKIT: Pluggable overlay networks for Grid computing
A 'second generation' approach to the provision of Grid middleware is now emerging which is built on service-oriented architecture and web services standards and technologies. However, advanced Grid applications have significant demands that are not addressed by present-day web services platforms. As one prime example, current platforms do not support the rich diversity of communication 'interaction types' that are demanded by advanced applications (e.g. publish-subscribe, media streaming, peer-to-peer interaction). In the paper we describe the Gridkit middleware which augments the basic service-oriented architecture to address this particular deficiency. We particularly focus on the communications infrastructure support required to support multiple interaction types in a unified, principled and extensible manner, which we present in terms of the novel concept of pluggable overlay networks.
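The "pluggable overlay" idea can be sketched as a registry into which one plugin per interaction type is slotted. All of the names below (`Overlay`, `Middleware`, `plug`) are invented for illustration and are not Gridkit's actual API:

```python
from abc import ABC, abstractmethod

class Overlay(ABC):
    """Hypothetical plugin interface: one overlay per interaction type."""
    @abstractmethod
    def send(self, topic, payload):
        ...

class PubSubOverlay(Overlay):
    """A toy publish-subscribe overlay."""
    def __init__(self):
        self.subscribers = {}  # topic -> list of callbacks
    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)
    def send(self, topic, payload):
        for callback in self.subscribers.get(topic, []):
            callback(payload)

class Middleware:
    """Registry into which overlay plugins are slotted by interaction type."""
    def __init__(self):
        self._overlays = {}
    def plug(self, name, overlay):
        self._overlays[name] = overlay
    def get(self, name):
        return self._overlays[name]

mw = Middleware()
mw.plug("pubsub", PubSubOverlay())
received = []
mw.get("pubsub").subscribe("jobs", received.append)
mw.get("pubsub").send("jobs", "job-42")
print(received)  # ['job-42']
```

A streaming or peer-to-peer overlay would implement the same `Overlay` interface and be plugged in alongside, which is the extensibility the abstract argues present-day web services platforms lack.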
Integrating the common variability language with multilanguage annotations for web engineering
Web application development involves managing a high diversity of files and resources, such as code, pages, or style sheets, implemented in different languages. To deal with the automatic generation of
custom-made configurations of web applications, industry usually adopts annotation-based approaches, even though most studies encourage the use of composition-based approaches to implement
Software Product Lines. Recent work tries to combine both approaches to obtain their complementary benefits. However, technological companies are reluctant to adopt new development paradigms
such as feature-oriented programming or aspect-oriented programming.
Moreover, it is extremely difficult, or even impossible, to apply
these programming models to web applications, mainly because of
their multilingual nature, since their development involves multiple
types of source code (Java, Groovy, JavaScript), templates (HTML,
Markdown, XML), style sheet files (CSS and its variants, such as
SCSS), and other files (JSON, YML, shell scripts). We propose to
use the Common Variability Language as a composition-based approach
and integrate annotations to manage fine-grained variability
of a Software Product Line for web applications. In this paper, we (i)
show that existing composition and annotation-based approaches,
including some well-known combinations, are not appropriate to
model and implement the variability of web applications; and (ii)
present a combined approach that effectively integrates annotations
into a composition-based approach for web applications. We implement
our approach and show its applicability with an industrial
real-world system.
Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech.
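The annotation side of such a combined approach can be sketched as a small preprocessor that keeps or drops annotated regions depending on the selected features. The `VAR_BEGIN`/`VAR_END` markers below are invented for this sketch and are not CVL syntax; real annotation tools define their own markers:

```python
import re

# VAR_BEGIN/VAR_END are invented markers for this sketch; real annotation
# tools define their own syntax.
BEGIN = re.compile(r"VAR_BEGIN\[(\w+)\]")
END = re.compile(r"VAR_END\[(\w+)\]")

def derive_variant(source, enabled):
    """Drop annotated regions whose feature is not in `enabled`.
    Marker lines themselves never reach the output; a stack supports nesting."""
    out, skipping = [], []
    for line in source.splitlines():
        begin, end = BEGIN.search(line), END.search(line)
        if begin:
            skipping.append(begin.group(1) not in enabled)
        elif end:
            skipping.pop()
        elif not any(skipping):
            out.append(line)
    return "\n".join(out)

# The same markers work inside any comment syntax, which is what makes
# annotations attractive for multilanguage artifacts (HTML here; // or /* */
# would carry them in Java, JavaScript, or CSS).
page = """<h1>Shop</h1>
<!-- VAR_BEGIN[reviews] -->
<div id="reviews"></div>
<!-- VAR_END[reviews] -->
<p>Footer</p>"""
print(derive_variant(page, enabled=set()))
```

Because the markers live in each language's own comments, one tool can process Java, HTML, CSS, and shell files alike, which is the multilingual requirement the abstract emphasizes.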
Consistent Resolution of Some Relativistic Quantum Paradoxes
A relativistic version of the (consistent or decoherent) histories approach
to quantum theory is developed on the basis of earlier work by Hartle, and used
to discuss relativistic forms of the paradoxes of spherical wave packet
collapse, Bohm's formulation of Einstein-Podolsky-Rosen, and Hardy's paradox.
It is argued that wave function collapse is not needed for introducing
probabilities into relativistic quantum mechanics, and in any case should never
be thought of as a physical process. Alternative approaches to stochastic time
dependence can be used to construct a physical picture of the measurement
process that is less misleading than collapse models. In particular, one can
employ a coarse-grained but fully quantum mechanical description in which
particles move along trajectories, with behavior under Lorentz transformations
the same as in classical relativistic physics, and detectors are triggered by
particles reaching them along such trajectories. States entangled between
spacelike separate regions are also legitimate quantum descriptions, and can be
consistently handled by the formalism presented here. The paradoxes in question
arise because of using modes of reasoning which, while correct for classical
physics, are inconsistent with the mathematical structure of quantum theory,
and are resolved (or tamed) by using a proper quantum analysis. In particular,
there is no need to invoke, nor any evidence for, mysterious long-range
superluminal influences, and thus no incompatibility, at least from this
source, between relativity theory and quantum mechanics.
Comment: LaTeX, 42 pages, 7 figures in text using PSTricks.
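The consistent-histories machinery this abstract relies on can be summarized compactly. The formulas below are the standard general formalism (Griffiths; Gell-Mann and Hartle), not this paper's specific relativistic construction, and conventions differ slightly between authors (some require the full off-diagonal decoherence functional to vanish rather than only its real part):

```latex
% Class operator for a history \alpha: a time-ordered chain of
% Heisenberg-picture projectors
C_\alpha = P^{(n)}_{\alpha_n}(t_n) \cdots P^{(1)}_{\alpha_1}(t_1)

% Decoherence functional for initial state \rho
D(\alpha, \alpha') = \operatorname{Tr}\!\left[ C_\alpha \, \rho \, C_{\alpha'}^{\dagger} \right]

% Consistency condition: if for all \alpha \neq \alpha'
\operatorname{Re} D(\alpha, \alpha') = 0
% then probabilities may be assigned as
p(\alpha) = D(\alpha, \alpha)
```

When the consistency condition holds, the diagonal entries are non-negative and sum to one over a family of histories, which is how probabilities enter without invoking wave function collapse.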
Consistent Probabilities in Wheeler-DeWitt Quantum Cosmology
We give an explicit, rigorous framework for calculating quantum probabilities
in a model theory of quantum gravity. Specifically, we construct the
decoherence functional for the Wheeler-DeWitt quantization of a flat
Friedmann-Robertson-Walker cosmology with a free, massless, minimally coupled
scalar field, thus providing a complete decoherent histories formulation for
this quantum cosmological model. The decoherence functional is applied to study
predictions concerning the model's Dirac (relational) observables; the behavior
of semiclassical states and superpositions of such states; and to study the
singular behavior of quantum Wheeler-DeWitt universes. Within this framework,
rigorous formulae are given for calculating the corresponding probabilities
from the wave function when those probabilities may be consistently defined,
thus replacing earlier heuristics for interpreting the wave function of the
universe with explicit constructions. It is shown according to a rigorously
formulated standard, and in a quantum-mechanically consistent way, that in this
quantization these models are generically singular. Independent of the choice
of state we show that the probability for these Wheeler-DeWitt quantum
universes to ever encounter a singularity is unity. In addition, the relation
between histories formulations of quantum theory and relational Dirac
observables is clarified.
Comment: 27 pages, 4 figures. Minor revisions and updated references. Matches published version.