Asynchronous Execution of Python Code on Task Based Runtime Systems
Despite advancements in parallel and distributed computing, the complexity of
programming on High Performance Computing (HPC) resources has deterred many
domain experts, especially in machine learning and artificial intelligence
(AI), from exploiting the performance benefits of such systems. Researchers
and scientists favor high-productivity languages to avoid the inconvenience of
programming in low-level languages and the cost of acquiring the necessary
skills. In recent years, Python, with the support of linear algebra libraries
like NumPy, has gained popularity despite limitations that prevent such code
from running in distributed settings. Here we present a solution that
maintains high-level programming abstractions as well as parallel and
distributed efficiency. Phylanx is an asynchronous array processing toolkit
that transforms Python and NumPy operations into code which can be executed in
parallel on HPC resources, by mapping Python and NumPy functions and variables
into a dependency tree executed by HPX, a general-purpose, parallel,
task-based runtime system written in C++. Phylanx additionally provides
introspection and visualization capabilities for debugging and performance
analysis. We have tested the foundations of our approach by comparing our
implementation of widely used machine learning algorithms to the accepted
NumPy implementations.
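The dependency-tree idea described in this abstract can be sketched in a few lines. The following is a minimal illustration only, not the Phylanx API: array operations build a lazy expression tree, and each node is then submitted asynchronously to a thread pool, waiting only on the futures of its own dependencies. All class and function names here are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

class Node:
    """One node of a lazy expression tree (illustrative, not Phylanx)."""
    def __init__(self, op, *deps):
        self.op, self.deps = op, deps

    def execute(self, pool):
        # Submit dependencies first; this node's task then waits only
        # on the futures it actually depends on.
        futures = [dep.execute(pool) for dep in self.deps]
        return pool.submit(lambda: self.op(*[f.result() for f in futures]))

def const(value):
    return Node(lambda: value)

def add(a, b):
    return Node(np.add, a, b)

def dot(a, b):
    return Node(np.dot, a, b)

# (x + y) . y expressed as a dependency tree rather than eager NumPy calls.
x, y = const(np.ones(3)), const(np.full(3, 2.0))
tree = dot(add(x, y), y)

with ThreadPoolExecutor(max_workers=8) as pool:
    result = tree.execute(pool).result()
print(result)  # 18.0
```

A real runtime such as HPX would additionally cache shared subtrees and schedule the tasks across distributed localities; the sketch only shows the shape of the transformation from array expressions to an asynchronously executed tree.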
A dataflow platform for applications based on Linked Data
Modern software applications increasingly benefit from accessing the multifarious and heterogeneous Web of Data, thanks to web APIs and Linked Data principles. In previous work, the authors proposed a platform for developing applications that consume Linked Data in a declarative and modular way. This paper describes in detail the functional language the platform gives access to, which is based on SPARQL (the standard query language for Linked Data) and on the dataflow paradigm. The language features interactive and meta-programming capabilities so that complex modules and applications can be developed. By adopting a declarative style, it favours the development of modules that can be reused in various specific execution contexts.
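The dataflow paradigm the abstract refers to can be illustrated with a small, self-contained sketch. This is not the platform's actual SPARQL-based language; it only shows the general idea that modules are declared with their inputs and fired once those inputs are available, with the rows standing in for query results.

```python
def dataflow_eval(graph, sources):
    """Evaluate a dataflow graph: name -> (function, [input names])."""
    values = dict(sources)
    remaining = dict(graph)
    while remaining:
        # A module is ready once all of its declared inputs have values.
        ready = [n for n, (_, ins) in remaining.items()
                 if all(i in values for i in ins)]
        if not ready:
            raise ValueError("cycle or missing input")
        for n in ready:
            fn, ins = remaining.pop(n)
            values[n] = fn(*(values[i] for i in ins))
    return values

# Hypothetical modules over tabular results (stand-ins for SPARQL output).
modules = {
    "titles":   (lambda rows: [r["title"] for r in rows], ["results"]),
    "distinct": (lambda ts: sorted(set(ts)),              ["titles"]),
    "count":    (len,                                     ["distinct"]),
}
rows = [{"title": "A"}, {"title": "B"}, {"title": "A"}]
out = dataflow_eval(modules, {"results": rows})
print(out["distinct"], out["count"])  # ['A', 'B'] 2
```

Because each module declares only its inputs and a pure function, modules can be reused in other graphs unchanged, which is the reusability property the abstract emphasizes.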
Mathematical Modelling of Chemical Diffusion through Skin using Grid-based PSEs
A Problem Solving Environment (PSE) with connections to remote distributed Grid processes is developed. The Grid simulation is itself a parallel process and allows steering of individual or multiple runs of the core computation of chemical diffusion through the stratum corneum, the outer layer of the skin. The effectiveness of this Grid-based approach in improving the quality of the simulation is assessed.
Portolan: a Model-Driven Cartography Framework
Processing large amounts of data to extract useful information is an
essential task within companies. To help with this task, visualization
techniques are commonly used because of their capacity to present data in
synthesized views that are easier to understand and manage. However,
achieving the right visualization display for a data set is a complex
cartography process that involves several transformation steps to adapt the
(domain) data to the (visualization) data format expected by visualization
tools. To maximize the benefits of visualization we propose Portolan, a
generic model-driven cartography framework that facilitates the discovery of
the data to visualize, the specification of view definitions for that data,
and the transformations that bridge the gap with the visualization tools. Our
approach has been implemented on top of the Eclipse EMF modeling framework
and validated on three different use cases.
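The transformation chain the abstract describes, from domain data through a view definition to the format a visualization tool expects, can be sketched as follows. This is only an illustration of the cartography idea, not Portolan's EMF implementation; all names and the sample model are hypothetical.

```python
# Hypothetical domain model: software modules and their dependencies.
domain_model = [
    {"module": "core", "depends_on": ["util"]},
    {"module": "gui",  "depends_on": ["core", "util"]},
    {"module": "util", "depends_on": []},
]

def to_graph_view(model):
    # View definition: which model elements become nodes and edges.
    nodes = [m["module"] for m in model]
    edges = [(m["module"], d) for m in model for d in m["depends_on"]]
    return {"nodes": nodes, "edges": edges}

def to_dot(view):
    # Final transformation into a tool-ready format (Graphviz DOT).
    lines = ["digraph G {"]
    lines += [f'  "{a}" -> "{b}";' for a, b in view["edges"]]
    lines.append("}")
    return "\n".join(lines)

print(to_dot(to_graph_view(domain_model)))
```

Keeping the view definition separate from the final serialization is what lets the same domain data be re-targeted at different visualization tools, which is the gap-bridging role the framework plays.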
The MaggLite Post-WIMP Toolkit: Draw It, Connect It and Run It
This article presents MaggLite, a toolkit and sketch-based interface builder allowing fast and interactive design of post-WIMP user interfaces. MaggLite improves the design of advanced UIs thanks to its novel mixed-graph architecture that dynamically combines scene-graphs with interaction-graphs. Scene-graphs provide mechanisms to describe and produce rich graphical effects, whereas interaction-graphs allow expressive and fine-grained description of advanced interaction techniques and behaviors such as multiple-pointer management, toolglasses, bimanual interaction, gesture, and speech recognition. Both graphs can be built interactively by sketching the UI and specifying the interaction using a dataflow visual language. Communication between the two graphs is managed at runtime by components we call Interaction Access Points. While developers can extend the toolkit by refining built-in generic mechanisms, UI designers can quickly and interactively design, prototype and test advanced user interfaces by applying the MaggLite principle: "draw it, connect it and run it".
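The mixed-graph idea can be made concrete with a small sketch, assuming nothing about MaggLite's real API: a node in the interaction graph turns raw pointer samples into drag events, and an Interaction Access Point forwards those events into a scene-graph node at runtime. All class names below are illustrative.

```python
class SceneNode:
    """A scene-graph node holding a horizontal position."""
    def __init__(self, name):
        self.name, self.x = name, 0

    def translate(self, dx):
        self.x += dx

class InteractionAccessPoint:
    """Bridges the two graphs: interaction-side events become
    scene-side updates on a target node."""
    def __init__(self, target):
        self.target = target

    def receive(self, event):
        if event["type"] == "drag":
            self.target.translate(event["dx"])

class DragFilter:
    """One interaction-graph node: turns raw pointer samples into
    drag events and pushes them downstream."""
    def __init__(self, downstream):
        self.downstream, self.last = downstream, None

    def pointer(self, x):
        if self.last is not None:
            self.downstream.receive({"type": "drag", "dx": x - self.last})
        self.last = x

rect = SceneNode("rect")
flow = DragFilter(InteractionAccessPoint(rect))
for x in (10, 14, 21):  # simulated pointer samples
    flow.pointer(x)
print(rect.x)  # 11
```

Keeping the event-processing chain separate from the graphical scene, with the access point as the only coupling, is what lets each graph be edited or sketched independently.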
On encouraging multiple views for visualization
Visualization enables 'seeing the unseen' and provides new insight into the underlying data. However, users far too easily believe or rely on a single representation of the data; this view may be a favourite method, the simplest to perform, or a method that has always been used. But a single representation may lead to a misinterpretation of the information, or leave the user missing the 'richness' of the data content. By displaying the data in multiple ways, a user may understand the information through different perspectives, overcome possible misinterpretations, and perform interactive investigative visualization by correlating the information between views. Thus, the use of multiple views of the same information should be encouraged. We believe the visualization system itself should actively encourage the generation of multiple views by providing appropriate tools to aid in this operation. We present and categorise issues for encouraging multiple views and provide a framework for the generation, management and manipulation of such views.