    HPC Cloud for Scientific and Business Applications: Taxonomy, Vision, and Research Challenges

    High Performance Computing (HPC) clouds are becoming an alternative to on-premise clusters for executing scientific applications and business analytics services. Most research efforts in HPC cloud aim to understand the cost-benefit of moving resource-intensive applications from on-premise environments to public cloud platforms. Industry trends show hybrid environments are the natural path to get the best of on-premise and cloud resources – steady (and sensitive) workloads can run on on-premise resources, and peak demand can leverage remote resources in a pay-as-you-go manner. Nevertheless, there are plenty of questions to be answered in HPC cloud, ranging from how to extract the best performance from an unknown underlying platform to what services are essential to make its usage easier. Moreover, the discussion on the right pricing and contractual models to fit small and large users is relevant for the sustainability of HPC clouds. This paper brings a survey and taxonomy of efforts in HPC cloud and a vision of what we believe is ahead of us, including a set of research challenges that, once tackled, can help advance businesses and scientific discoveries. This becomes particularly relevant due to the fast-increasing wave of new HPC applications coming from big data and artificial intelligence.
    Comment: 29 pages, 5 figures, Published in ACM Computing Surveys (CSUR)
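    The hybrid pattern described above – steady or sensitive work on-premise, peak demand burst to the cloud – can be made concrete with a toy placement policy. The sketch below is illustrative only and not from the paper; the job fields, threshold logic, and cost figures are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class Job:
    core_hours: float      # estimated compute demand
    deadline_hours: float  # time by which results are needed
    sensitive: bool        # sensitive data must stay on-premise

def place_job(job: Job, onprem_wait_hours: float,
              cloud_rate_per_core_hour: float, budget: float) -> str:
    """Toy hybrid-HPC placement heuristic (an assumption, not the paper's).

    Steady and sensitive workloads stay on-premise; a job that would miss
    its deadline in the on-premise queue bursts to the cloud when the
    pay-as-you-go cost fits within budget.
    """
    if job.sensitive:
        return "on-premise"
    if onprem_wait_hours <= job.deadline_hours:
        return "on-premise"  # queueing locally still meets the deadline
    cloud_cost = job.core_hours * cloud_rate_per_core_hour
    if cloud_cost <= budget:
        return "cloud"       # burst peak demand to pay-as-you-go resources
    return "on-premise"      # accept the delay rather than exceed budget

# Example: a 500 core-hour job with a 6-hour deadline facing a 12-hour queue.
print(place_job(Job(500, 6, False), onprem_wait_hours=12,
                cloud_rate_per_core_hour=0.05, budget=50))
```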

    Research and Education in Computational Science and Engineering

    Over the past two decades the field of computational science and engineering (CSE) has penetrated both basic and applied research in academia, industry, and laboratories to advance discovery, optimize systems, support decision-makers, and educate the scientific and engineering workforce. Informed by centuries of theory and experiment, CSE performs computational experiments to answer questions that neither theory nor experiment alone is equipped to answer. CSE provides scientists and engineers of all persuasions with algorithmic inventions and software systems that transcend disciplines and scales. Carried on a wave of digital technology, CSE brings the power of parallelism to bear on troves of data. Mathematics-based advanced computing has become a prevalent means of discovery and innovation in essentially all areas of science, engineering, technology, and society; and the CSE community is at the core of this transformation. However, a combination of disruptive developments – including the architectural complexity of extreme-scale computing, the data revolution that engulfs the planet, and the specialization required to follow the applications to new frontiers – is redefining the scope and reach of the CSE endeavor. This report describes the rapid expansion of CSE and the challenges to sustaining its bold advances. The report also presents strategies and directions for CSE research and education for the next decade.
    Comment: Major revision, to appear in SIAM Review

    GTTC Future of Ground Testing Meta-Analysis of 20 Documents

    National research, development, test, and evaluation ground testing capabilities in the United States are at risk. There is a lack of vision and consensus on what is and will be needed, contributing to a significant threat that ground test capabilities may not be able to meet the national security and industrial needs of the future. To support future decisions, the AIAA Ground Testing Technical Committee's (GTTC) Future of Ground Test (FoGT) Working Group selected and reviewed 20 seminal documents related to the application and direction of ground testing. Each document was reviewed, with the main points of its content collected and organized into gap-analysis sections: current state, future state, major challenges/gaps, and recommendations. This paper includes key findings and selected commentary by an editing team.

    Planning Support Systems: Progress, Predictions, and Speculations on the Shape of Things to Come

    In this paper, we review the brief history of planning support systems, sketching the way both the field of planning and the software that supports and informs various planning tasks have fragmented and diversified. This is due to many forces, which range from changing conceptions of what planning is for and who should be involved, to the rapid dissemination of computers and their software, set against the general quest to build ever more generalized software products applicable to as many activities as possible. We identify two main drivers – the move to visualization, which dominates our very interaction with the computer, and the move to disseminate and share software, data, and ideas across the web. We attempt a brief and somewhat unsatisfactory classification of tools for PSS in terms of the planning process and the software that has evolved, but this does serve to point up the state-of-the-art and to focus our attention on the near- and medium-term future. We illustrate many of these issues with three exemplars: first, a land use-transportation model (LUTM) as part of a concern for climate change; second, a visualization of cities in their third dimension, which is driving an interest in what places look like and, in London, a concern for high buildings; and finally, various web-based services we are developing to share spatial data, which in turn suggest ways in which stakeholders can begin to define urban issues collaboratively. All these are elements in the larger scheme of things – in the development of online collaboratories for planning support. Our review is far from comprehensive and our examples are simply indicative, not definitive. We conclude with some brief suggestions for the future.

    Training of Crisis Mappers and Map Production from Multi-sensor Data: Vernazza Case Study (Cinque Terre National Park, Italy)

    The aim of this paper is to present the development of a multidisciplinary project carried out in cooperation between Politecnico di Torino and ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action). The goal of the project was training in geospatial data acquisition and processing for students attending Architecture and Engineering courses, in order to start up a team of "volunteer mappers". Indeed, the project aims to document the environmental and built heritage subject to disaster; the purpose is to improve the capabilities of the actors involved in the activities connected with geospatial data collection, integration and sharing. The proposed area for testing the training activities is the Cinque Terre National Park, registered in the World Heritage List since 1997. The area was affected by a flood on the 25th of October 2011. In line with other international experiences, the group is expected to be active after emergencies in order to update maps, using data acquired by typical geomatic methods and techniques such as terrestrial and aerial Lidar, close-range and aerial photogrammetry, topographic and GNSS instruments, etc., or by non-conventional systems and instruments such as UAV, mobile mapping, etc. The ultimate goal is to implement a WebGIS platform to share all the data collected with local authorities and the Civil Protection.

    EcoGIS – GIS tools for ecosystem approaches to fisheries management

    Executive Summary: The EcoGIS project was launched in September 2004 to investigate how Geographic Information Systems (GIS), marine data, and custom analysis tools can better enable fisheries scientists and managers to adopt Ecosystem Approaches to Fisheries Management (EAFM). EcoGIS is a collaborative effort between NOAA’s National Ocean Service (NOS) and National Marine Fisheries Service (NMFS), and four regional Fishery Management Councils. The project has focused on four priority areas: Fishing Catch and Effort Analysis, Area Characterization, Bycatch Analysis, and Habitat Interactions. Of these four functional areas, the project team first focused on developing a working prototype for catch and effort analysis: the Fishery Mapper Tool. This ArcGIS extension creates time- and area-summarized maps of fishing catch and effort from logbook, observer, or fishery-independent survey data sets. Source data may come from Oracle, Microsoft Access, or other file formats. Feedback from beta-testers of the Fishery Mapper was used to debug the prototype, enhance performance, and add features. This report describes the four priority functional areas, the development of the Fishery Mapper tool, and several themes that emerged through the parallel evolution of the EcoGIS project, the concept and implementation of the broader field of Ecosystem Approaches to Management (EAM), data management practices, and other EAM toolsets. In addition, a set of six succinct recommendations is proposed on page 29. One major conclusion from this work is that there is no single “super-tool” to enable Ecosystem Approaches to Management; as such, tools should be developed for specific purposes with attention given to interoperability and automation. Future work should be coordinated with other GIS development projects in order to provide “value added” and minimize duplication of efforts. In addition to custom tools, the development of cross-cutting Regional Ecosystem Spatial Databases will enable access to quality data to support the analyses required by EAM. GIS tools will be useful in developing Integrated Ecosystem Assessments (IEAs) and providing pre- and post-processing capabilities for spatially explicit ecosystem models. Continued funding will enable the EcoGIS project to develop GIS tools that are immediately applicable to today’s needs. These tools will enable simplified and efficient data query, the ability to visualize data over time, and ways to synthesize multidimensional data from diverse sources. These capabilities will provide new information for analyzing issues from an ecosystem perspective, which will ultimately result in better understanding of fisheries and better support for decision-making. (PDF file contains 45 pages.)
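    The time- and area-summarization the Fishery Mapper performs inside ArcGIS can be sketched outside it with pandas: snap each record to a grid cell and a time bin, then aggregate catch and effort. The column names, 0.5-degree grid, quarterly bins, and CSV source below are assumptions for illustration, not the tool's actual schema or behavior.

```python
import pandas as pd

# Hypothetical logbook schema: lat, lon, date, catch_kg, effort_hours.
df = pd.read_csv("logbook.csv", parse_dates=["date"])

CELL = 0.5  # grid cell size in degrees (assumed resolution)

# Snap each record to a grid cell and a quarterly time bin.
df["cell_lat"] = (df["lat"] // CELL) * CELL
df["cell_lon"] = (df["lon"] // CELL) * CELL
df["quarter"] = df["date"].dt.to_period("Q")

# Summarize catch and effort per cell per quarter; CPUE = catch / effort.
summary = (df.groupby(["quarter", "cell_lat", "cell_lon"])
             .agg(catch_kg=("catch_kg", "sum"),
                  effort_hours=("effort_hours", "sum"))
             .reset_index())
summary["cpue"] = summary["catch_kg"] / summary["effort_hours"]
print(summary.head())
```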

    Flood hazard hydrology: interdisciplinary geospatial preparedness and policy

    Thesis (Ph.D.), University of Alaska Fairbanks, 2017.
    Floods rank as the deadliest and most frequently occurring natural hazard worldwide, and in 2013 floods in the United States ranked second only to wind storms in accounting for loss of life and damage to property. While flood disasters remain difficult to accurately predict, more precise forecasts and a better understanding of the frequency, magnitude and timing of floods can help reduce the loss of life and costs associated with the impact of flood events. There is a common perception that 1) local-to-national-level decision makers do not have the accurate, reliable and actionable data and knowledge they need in order to make informed flood-related decisions, and 2) because of science–policy disconnects, critical flood and scientific analyses and insights are failing to influence policymakers in national water resource and flood-related decisions that have significant local impact. This dissertation explores these perceived information gaps and disconnects, and seeks to answer the question of whether flood data can be accurately generated, transformed into useful actionable knowledge for local flood event decision makers, and then effectively communicated to influence policy. Utilizing an interdisciplinary mixed-methods research design, this thesis develops a methodological framework and interpretative lens for each of three distinct stages of flood-related information interaction: 1) data generation – using machine learning to estimate streamflow flood data for forecasting and response; 2) knowledge development and sharing – creating a geoanalytic visualization decision support system for flood events; and 3) knowledge actualization – using heuristic toolsets for translating scientific knowledge into policy action. Each stage is elaborated on in three distinct research papers, incorporated as chapters in this dissertation, that focus on developing practical data and methodologies that are useful to scientists, local flood event decision makers, and policymakers. Data and analytical results of this research indicate that, if certain conditions are met, it is possible to provide local decision makers and policymakers with the useful actionable knowledge they need to make timely and informed decisions.
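    The dissertation's first stage, using machine learning to estimate streamflow for flood forecasting, can be sketched with a standard supervised-regression workflow. The predictor set, random-forest model, and file name below are assumptions chosen for illustration, not the author's actual method or data.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Hypothetical gauge/basin dataset: precipitation, snowmelt, antecedent
# flow, and basin area as predictors of daily streamflow (m^3/s).
data = pd.read_csv("basin_observations.csv")
X = data[["precip_mm", "snowmelt_mm", "prev_flow_m3s", "basin_area_km2"]]
y = data["flow_m3s"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Evaluate on held-out observations before using estimates for forecasting.
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```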