18 research outputs found

    Scalable analytics over unstructured multidimensional time-series data

    Presented at the National Data Integrity Conference: Enabling Research: New Challenges & Opportunities, held May 7-8, 2015 at Colorado State University, Fort Collins, Colorado. Researchers, administrators, and integrity officers are encountering new challenges regarding research data and integrity. This conference aims to give attendees a high-level understanding of these challenges and to impart practical tools and skills for dealing with them. Topics include data reproducibility, validity, privacy, security, visualization, reuse, access, preservation, rights, and management. PowerPoint presentation given on May 7, 2015.

    SWARM: Scheduling Large-Scale Jobs over the Loosely-Coupled HPC Clusters

    Abstract — Compute-intensive scientific applications are heavily reliant on the available quantity of computing resources. The Grid paradigm provides a large-scale computing environment for scientific users. However, conventional Grid job submission tools do not provide a high-level job scheduling environment for these users across multiple institutions. For extremely large numbers of jobs, a more scalable job scheduling framework that can leverage highly distributed clusters and supercomputers is required. In this paper, we propose a high-level job scheduling Web service framework, Swarm. Swarm is developed for scientific applications that must submit a massive number of high-throughput jobs or workflows to highly distributed computing clusters. The Swarm service itself is designed to b…
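    The abstract describes Swarm only at a high level, as a Web service that accepts large batches of independent jobs. As a rough illustration of that usage pattern, the sketch below shows how a client might fan out a parameter sweep to such a service over HTTP. The endpoint URL, the /swarm/jobs route, and the payload fields (executable, arguments) are assumptions for illustration, not Swarm's actual API.

    # Hypothetical sketch of batch job submission to a Swarm-like scheduling
    # Web service; the URL, route, and payload fields are illustrative only.
    import json
    import urllib.request

    SWARM_URL = "http://localhost:8080/swarm/jobs"  # assumed endpoint

    def submit_jobs(executable, argument_sets):
        """Submit one job per argument set and return the service's job IDs."""
        job_ids = []
        for args in argument_sets:
            payload = json.dumps({"executable": executable, "arguments": args}).encode()
            req = urllib.request.Request(
                SWARM_URL, data=payload,
                headers={"Content-Type": "application/json"}, method="POST")
            with urllib.request.urlopen(req) as resp:
                job_ids.append(json.load(resp)["job_id"])
        return job_ids

    # Example: fan out 10,000 parameter-sweep runs as independent jobs.
    ids = submit_jobs("/bin/simulate", [["--seed", str(i)] for i in range(10000)])

    A client like this leaves scheduling decisions (which cluster or supercomputer runs each job) to the service, which is the division of labor the abstract attributes to Swarm.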

    Exchanges indicating … scheme for reliable delivery … message numbering initialization (snippet fragments only; no full text available)