4,206 research outputs found

    Making work flow: on the design, analysis and enactment of business processes


    Harnessing the Power of Many: Extensible Toolkit for Scalable Ensemble Applications

    Many scientific problems require multiple distinct computational tasks to be executed in order to achieve a desired solution. We introduce the Ensemble Toolkit (EnTK) to address the challenges of scale, diversity, and reliability that such applications pose. We describe the design and implementation of EnTK, characterize its performance, and integrate it with two distinct exemplar use cases: seismic inversion and adaptive analog ensembles. We perform nine experiments, characterizing EnTK overheads, strong and weak scalability, and the performance of the two use case implementations, at scale and on production infrastructures. We show how EnTK meets the following general requirements: (i) implementing dedicated abstractions to support the description and execution of ensemble applications; (ii) supporting execution on heterogeneous computing infrastructures; (iii) scaling efficiently up to O(10^4) tasks; and (iv) providing fault tolerance. We discuss the novel computational capabilities that EnTK enables and the scientific advantages arising from them. We propose EnTK as an important addition to the suite of tools in support of production scientific computing.
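    As a rough illustration of the kind of dedicated abstraction the abstract refers to, the sketch below describes a small ensemble using pipeline/stage/task classes in the style of the radical.entk Python package. The class names follow that package, but the exact signatures, the resource description, and the toy workload are assumptions rather than the authors' code.

```python
# Hypothetical sketch of an ensemble description with EnTK-style abstractions.
# Class names follow the radical.entk package; details may differ by version
# (older releases also require a RabbitMQ hostname/port for the AppManager).
from radical.entk import Pipeline, Stage, Task, AppManager

pipeline = Pipeline()
stage = Stage()
for i in range(16):                      # an ensemble of 16 independent tasks
    task = Task()
    task.executable = '/bin/echo'        # placeholder for a simulation binary
    task.arguments = ['member', str(i)]
    stage.add_tasks(task)
pipeline.add_stages(stage)

appman = AppManager()
appman.resource_desc = {'resource': 'local.localhost',  # assumed local test resource
                        'walltime': 10,                 # minutes
                        'cpus': 4}
appman.workflow = {pipeline}
appman.run()
```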

    Data model issues in the Cherenkov Telescope Array project

    The planned Cherenkov Telescope Array (CTA), a future ground-based Very-High-Energy (VHE) gamma-ray observatory, will be the largest project of its kind. It aims to provide an order of magnitude increase in sensitivity compared to currently operating VHE experiments and to open access to guest observers. These features, together with the thirty-year lifetime planned for the installation, impose severe constraints on the data model currently being developed for the project. In this contribution we analyze the challenges faced by the CTA data model development and present the requirements imposed to address them. While the full data model is not yet complete, we show the organization of the work, the status of the design, and an overview of the prototyping efforts carried out so far. We also show examples of specific aspects of the data model currently under development.
    Comment: In Proceedings of the 34th International Cosmic Ray Conference (ICRC2015), The Hague, The Netherlands. All CTA contributions at arXiv:1508.0589

    Analyze Large Multidimensional Datasets Using Algebraic Topology

    This paper presents an efficient algorithm to extract knowledge from high-dimensionality, high-complexity datasets using algebraic topology, namely simplicial complexes. Based on the concept of isomorphism of relations, our method turns a relational table into a geometric object (a simplicial complex is a polyhedron), so that, conceptually, association rule searching becomes a geometric traversal problem. By leveraging the core concepts behind simplicial complexes, we use a technique new to computer science that improves performance over existing methods and uses far less memory. It was designed and developed with a strong emphasis on scalability, reliability, and extensibility. This paper also investigates the possibility of Hadoop integration and the challenges that come with the framework.
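    One illustrative, hypothetical reading of the mapping described above: treat each row of a transaction-style table as a simplex over its attribute values, so that frequently co-occurring item sets appear as faces shared by many row-simplices. The sketch below is not the authors' algorithm; the function names, thresholds, and tiny example table are assumptions.

```python
# Toy sketch: rows as simplices, frequent itemsets as frequently shared faces.
from itertools import combinations
from collections import Counter

def faces(items, max_size=3):
    """Enumerate non-empty faces (subsets) of a row-simplex, up to max_size items."""
    items = sorted(items)
    for k in range(1, min(len(items), max_size) + 1):
        yield from combinations(items, k)

def frequent_faces(table, min_support=2, max_size=3):
    """Count faces across all row-simplices; keep those meeting min_support."""
    counts = Counter()
    for row in table:                          # each row: an iterable of items
        counts.update(faces(set(row), max_size))
    return {face: c for face, c in counts.items() if c >= min_support}

# Example: the edge ('bread', 'milk') is a face shared by two row-simplices,
# i.e. an itemset with support 2.
table = [('bread', 'milk', 'eggs'), ('bread', 'milk'), ('eggs', 'tea')]
print(frequent_faces(table))
```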

    Potential of I/O aware workflows in climate and weather

    The efficient, convenient, and robust execution of data-driven workflows and enhanced data management are essential for productivity in scientific computing. In HPC, the concerns of storage and computing are traditionally separated and optimised independently of each other and of the needs of the end-to-end user. However, in complex workflows this is becoming problematic. These problems are particularly acute in climate and weather workflows, which, as well as becoming increasingly complex and exploiting deep storage hierarchies, can involve multiple data centres. The key contributions of this paper are: 1) a sketch of a vision for an integrated data-driven approach, with a discussion of the associated challenges and implications, and 2) an architecture and roadmap consistent with this vision that would allow seamless integration into current climate and weather workflows, as it utilises versions of existing tools (ESDM, Cylc, XIOS, and DDN’s IME). The vision proposed here is built on the belief that workflows composed of data-, computing-, and communication-intensive tasks should drive interfaces and hardware configurations to better support the programming models. When delivered, this work will increase the opportunity for smarter scheduling of computing by taking storage into account in heterogeneous storage systems. We illustrate the performance impact on an example workload using a model built on performance data measured with ESDM at DKRZ.
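    To make the idea of storage-aware scheduling concrete, here is a deliberately simplified, hypothetical sketch: independent tasks are ordered by the estimated cost of staging their input data from the tier it currently occupies, so work whose data already sits on fast storage starts first. The tier names, costs, and task list are illustrative assumptions, not part of the paper's architecture.

```python
# Toy illustration of I/O-aware ordering of independent tasks across storage tiers.
from dataclasses import dataclass

# Assumed relative read costs per GiB for three storage tiers.
TIER_COST = {'burst_buffer': 1, 'parallel_fs': 5, 'tape_archive': 50}

@dataclass
class WorkflowTask:
    name: str
    input_gib: float
    input_tier: str                      # tier where the input data currently resides

    def staging_cost(self) -> float:
        return self.input_gib * TIER_COST[self.input_tier]

def io_aware_order(tasks):
    """Run cheap-to-stage tasks first; expensive staging can be prefetched meanwhile."""
    return sorted(tasks, key=lambda t: t.staging_cost())

tasks = [WorkflowTask('postproc',   200, 'parallel_fs'),
         WorkflowTask('analysis',    50, 'burst_buffer'),
         WorkflowTask('reanalysis', 500, 'tape_archive')]
for t in io_aware_order(tasks):
    print(f'{t.name}: staging cost {t.staging_cost():.0f}')
```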

    New design companions opening up the process through self-made computation

    Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Architecture, 2013. Cataloged from PDF version of thesis. Includes bibliographical references (p. 73-75).
    This thesis is about the roles of man and machine in the early conception of designs; it investigates computational methods that support creativity and surprise. It discusses the relationship between human and digital medium in the enterprise of 'Computer-Aided Design', and Self-Made Computation to empower the designer as driver of digital processes, taking the computer as an active collaborator, or a sharp apprentice, rather than a master. In a design process, tool personalization enables precise feedback between human and medium. In the field of architecture, every project is unique, and there are as many design workflows as designers. However, current off-the-shelf design software has an inflexible built-in structure targeting general problem-solving that can interfere with non-standard design needs. Today, those with programming agility look for customized processes that assist early problem-finding instead of converging on solutions. Helping to alleviate software frustrations, smaller tailor-made applications prove to be precisely fitted, viable, and enriching companions at certain moments of project development. Previous work on the impact of standardized software for design has focused on the figure of the designer as a tool-user; this thesis addresses the question from the vision of the designer as a tool-maker. It investigates how self-made software can become a design companion for computational thinking - observed here as a new mindset that shifts design workflows, rather than a technique. The research compares and diagrams designer-toolmaker work in which self-made applets were produced, as well as the structures in the work of rule-maker artisans. The main contributions are a comparative study of three models of computer-aided design, their history and technical review, their influence on design workflows, and a graphical framework to better compare them. Critical analysis reveals a common structure for tailoring a creative and explorative design workflow. Its advantages and limitations are exposed to guide designers toward alternative computational methods for design processes. Keywords: design workflow; computation; applets; self-made tools; diagrams; design process; feedback; computers; computer-assisted-design. By Laia Mogas-Soldevila. S.M.