
    21st Century Simulation: Exploiting High Performance Computing and Data Analysis

    This paper identifies, defines, and analyzes the limitations imposed on Modeling and Simulation by outmoded paradigms in computer utilization and data analysis. The authors then discuss two emerging capabilities to overcome these limitations: High Performance Parallel Computing and Advanced Data Analysis. First, parallel computing, in supercomputers and Linux clusters, has proven effective by providing users an advantage in computing power. This has been characterized as a ten-year lead over the use of single-processor computers. Second, advanced data analysis techniques are both necessitated and enabled by this leap in computing power. JFCOM's JESPP project is one of the few simulation initiatives to effectively embrace these concepts. The challenges facing the defense analyst today have grown to include the need to consider operations among non-combatant populations, to focus on impacts to civilian infrastructure, to differentiate combatants from non-combatants, and to understand non-linear, asymmetric warfare. These requirements stretch both current computational techniques and data analysis methodologies. In this paper, documented examples and potential solutions are advanced, and the authors discuss paths to successful implementation based on their experience. Reviewed technologies include parallel computing, cluster computing, grid computing, data logging, operations research, database advances, data mining, evolutionary computing, genetic algorithms, and Monte Carlo sensitivity analyses. The modeling and simulation community has significant potential to provide more opportunities for training and analysis. Simulations must include increasingly sophisticated environments, better emulations of foes, and more realistic civilian populations. Overcoming the implementation challenges will produce dramatically better insights for trainees and analysts. High Performance Parallel Computing and Advanced Data Analysis promise increased understanding of future vulnerabilities to help avoid unneeded mission failures and unacceptable personnel losses. The authors set forth road maps for rapid prototyping and adoption of advanced capabilities, and discuss the beneficial impact of embracing these technologies, as well as the risk mitigation required to ensure success.
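
    As a purely illustrative sketch of one technique named above, Monte Carlo sensitivity analysis, the hypothetical Python example below samples input parameters from assumed distributions, evaluates a stand-in model, and uses the correlation between each input and the output as a crude sensitivity indicator. The model, parameter names, and distributions are invented for illustration and are not drawn from the JESPP project; statistics.correlation requires Python 3.10 or later.

# Minimal Monte Carlo sensitivity sketch (illustrative only).
# The "model" below is a stand-in; a real study would call a simulation.
import random
import statistics


def model(troop_density, sensor_range, civilian_ratio):
    # Hypothetical scalar outcome, e.g. a notional detection score.
    return 0.6 * sensor_range + 0.3 * troop_density - 0.5 * civilian_ratio


def monte_carlo_sensitivity(n_samples=10_000, seed=42):
    rng = random.Random(seed)
    names = ["troop_density", "sensor_range", "civilian_ratio"]
    samples = {name: [] for name in names}
    outputs = []
    for _ in range(n_samples):
        # Assumed input distributions (placeholders).
        x = {
            "troop_density": rng.uniform(0.0, 1.0),
            "sensor_range": rng.gauss(5.0, 1.0),
            "civilian_ratio": rng.uniform(0.0, 1.0),
        }
        for name in names:
            samples[name].append(x[name])
        outputs.append(model(**x))
    # Pearson correlation of each input with the output as a crude
    # sensitivity measure.
    return {name: statistics.correlation(samples[name], outputs) for name in names}


if __name__ == "__main__":
    for name, corr in monte_carlo_sensitivity().items():
        print(f"{name}: correlation with output = {corr:+.3f}")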

    Research and Education in Computational Science and Engineering

    Over the past two decades the field of computational science and engineering (CSE) has penetrated both basic and applied research in academia, industry, and laboratories to advance discovery, optimize systems, support decision-makers, and educate the scientific and engineering workforce. Informed by centuries of theory and experiment, CSE performs computational experiments to answer questions that neither theory nor experiment alone is equipped to answer. CSE provides scientists and engineers of all persuasions with algorithmic inventions and software systems that transcend disciplines and scales. Carried on a wave of digital technology, CSE brings the power of parallelism to bear on troves of data. Mathematics-based advanced computing has become a prevalent means of discovery and innovation in essentially all areas of science, engineering, technology, and society; and the CSE community is at the core of this transformation. However, a combination of disruptive developments (including the architectural complexity of extreme-scale computing, the data revolution that engulfs the planet, and the specialization required to follow the applications to new frontiers) is redefining the scope and reach of the CSE endeavor. This report describes the rapid expansion of CSE and the challenges to sustaining its bold advances. The report also presents strategies and directions for CSE research and education for the next decade. Comment: Major revision, to appear in SIAM Review.

    Heterogeneous hierarchical workflow composition

    Workflow systems promise scientists an automated end-to-end path from hypothesis to discovery. However, expecting any single workflow system to deliver such a wide range of capabilities is impractical. A more practical solution is to compose the end-to-end workflow from more than one system. With this goal in mind, the integration of task-based and in situ workflows is explored, where the result is a hierarchical heterogeneous workflow composed of subworkflows, with different levels of the hierarchy using different programming, execution, and data models. Materials science use cases demonstrate the advantages of such heterogeneous hierarchical workflow composition. This work is a collaboration between Argonne National Laboratory and the Barcelona Supercomputing Center within the Joint Laboratory for Extreme-Scale Computing. This research is supported by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research, under contract number DE-AC02-06CH11357, program manager Laura Biven, and by the Spanish Government (SEV2015-0493), the Spanish Ministry of Science and Innovation (contract TIN2015-65316-P), and the Generalitat de Catalunya (contract 2014-SGR-1051).
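
    As a rough, hypothetical sketch of the composition idea described above, and not the actual systems used in this work, the Python example below models a two-level hierarchy: an outer task-based subworkflow schedules independent tasks, and each task internally runs an in situ subworkflow whose analysis consumes simulation steps as they are produced. All function names and the use of ProcessPoolExecutor are assumptions made for illustration.

# Hypothetical sketch of heterogeneous hierarchical workflow composition:
# an outer task-based layer coordinates coarse-grained tasks, and each task
# runs an inner "in situ" subworkflow that analyzes data as it is generated.
from concurrent.futures import ProcessPoolExecutor


def simulation_steps(task_id, n_steps=5):
    # Stand-in for a simulation producing data step by step.
    for step in range(n_steps):
        yield {"task": task_id, "step": step, "value": task_id * 10 + step}


def in_situ_analysis(frame):
    # Stand-in for an analysis stage coupled to the running simulation.
    return frame["value"] ** 2


def in_situ_subworkflow(task_id):
    # Inner subworkflow: analysis consumes each step as it is produced,
    # instead of waiting for all simulation output to be written to disk.
    return sum(in_situ_analysis(frame) for frame in simulation_steps(task_id))


def task_based_outer_workflow(task_ids):
    # Outer subworkflow: independent tasks scheduled by a task executor;
    # each task encapsulates a complete in situ subworkflow.
    with ProcessPoolExecutor() as pool:
        return dict(zip(task_ids, pool.map(in_situ_subworkflow, task_ids)))


if __name__ == "__main__":
    print(task_based_outer_workflow([1, 2, 3]))

    The point of the sketch is only the structure: the two levels use different execution models (process-level task scheduling versus step-by-step in situ consumption) while composing into one end-to-end workflow.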

    Developing a Coherent Cyberinfrastructure from Local Campus to National Facilities: Challenges and Strategies

    A fundamental goal of cyberinfrastructure (CI) is the integration of computing hardware, software, and network technology, along with data, information management, and human resources to advance scholarship and research. Such integration creates opportunities for researchers, educators, and learners to share ideas, expertise, tools, and facilities in new and powerful ways that cannot be realized if each of these components is applied independently. Bridging the gap between the reality of CI today and its potential in the immediate future is critical to building a balanced CI ecosystem that can support future scholarship and research. This report summarizes the observations and recommendations from a workshop in July 2008 sponsored by the EDUCAUSE Net@EDU Campus Cyberinfrastructure Working Group (CCI) and the Coalition for Academic Scientific Computation (CASC). The invitational workshop was hosted at the University Place Conference Center on the IUPUI campus in Indianapolis. Over 50 individuals representing a cross-section of faculty, senior campus information technology leaders, national lab directors, and other CI experts attended. The workshop focused on the challenges that must be addressed to build a coherent CI from the local to the national level, and the potential opportunities that would result. Both the organizing committee and the workshop participants hope that some of the ideas, suggestions, and recommendations in this report will take hold and be implemented in the community. The goal is to create a better, more supportive, more usable CI environment in the future to advance both scholarship and research.

    Report on the Third Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE3)

    This report records and discusses the Third Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE3). The report includes a description of the keynote presentation of the workshop, which served as an overview of sustainable scientific software. It also summarizes a set of lightning talks in which speakers highlighted to-the-point lessons and challenges pertaining to sustaining scientific software. The final and main contribution of the report is a summary of the discussions, future steps, and future organization for a set of self-organized working groups on topics including developing pathways to funding scientific software; constructing useful common metrics for crediting software stakeholders; identifying principles for sustainable software engineering design; reaching out to research software organizations around the world; and building communities for software sustainability. For each group, we include a point of contact and a landing page that can be used by those who want to join that group's future activities. The main challenge left by the workshop is to see whether the groups will carry out the activities they have scheduled, and how the WSSSPE community can encourage this to happen.

    Innovative Technologies for Human Exploration: Opportunities for Partnerships and Leveraging Novel Technologies External to NASA

    Human spaceflight organizations have ambitious goals for expanding human presence throughout the solar system. To meet these goals, they must overcome complex technical challenges for human missions to Mars, Near Earth Asteroids, and other distant celestial bodies. Resolving these challenges requires considerable resources and technological innovations, such as advancements in human health and countermeasures for space environments; self-sustaining habitats; advanced power and propulsion systems; and information technologies. Today, government space agencies seek cooperative endeavors to reduce cost burdens, improve human exploration capabilities, and foster knowledge sharing among human spaceflight organizations. This paper looks at potential opportunities for partnerships and spin-ins from economic sectors outside the space industry. It highlights innovative technologies and breakthrough concepts that could have significant impacts on space exploration and identifies organizations throughout the broader economy that specialize in these technologies.

    Return on Investment from Academic Supercomputing: SC14 Panel

    Return on Investment, or ROI, is a fundamental measure of effectiveness in business. It has been applied broadly across industries, including information technology and supercomputing. In this panel, we will share approaches to assessing ROI for academic supercomputing. The panel will address the challenge that “returns” from supercomputing and other computationally based research activities are often not financial. This is a major distinction from other industrial sectors, where product sales, inventions, and patents might form the basis of ROI calculations. How should ROI be assessed for high performance computing in academic environments? What inroads to ROI calculations are underway by the panelists? What are the challenges of ROI calculations?
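
    As a simple illustration of the calculation under discussion, the hypothetical sketch below computes classical ROI, (gain - cost) / cost, after converting non-financial academic outputs into monetary proxy values; all proxy values and counts are arbitrary placeholders rather than figures from the panel.

# Classical ROI: (gain - cost) / cost. For academic supercomputing the
# "gain" is often non-financial, so one (contestable) approach is to map
# outputs to monetary proxy values first. All numbers are placeholders.

def roi(gain, cost):
    return (gain - cost) / cost


def proxy_gain(outputs, proxy_values):
    # Convert counts of non-financial outputs into a monetary proxy.
    return sum(count * proxy_values[name] for name, count in outputs.items())


if __name__ == "__main__":
    cost = 2_000_000  # annual cost of an academic HPC system (placeholder)
    outputs = {"papers": 150, "funded_projects": 20, "trained_students": 300}
    proxy_values = {"papers": 5_000, "funded_projects": 75_000, "trained_students": 2_000}

    gain = proxy_gain(outputs, proxy_values)
    print(f"proxy gain = ${gain:,}")
    print(f"ROI = {roi(gain, cost):.2%}")

    How defensible such proxy values are is exactly the kind of question the panel raises; the arithmetic itself is the easy part.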

    The Silent Arms Race: The Role of the Supercomputer During the Cold War, 1947-1963

    One of the central features of the Cold War is the arms race. The United States and the Union of Soviet Socialist Republics vied for supremacy over the globe for a fifty-year period in which there were several arms races: atomic weapons, thermonuclear weapons, and various kinds of conventional weapons. However, another arms race goes unsung during this period of history: the competition in supercomputing. The other types of arms races are taken for granted by historians and others, but the technological competition between the superpowers would have been impossible without this historically silent arms race in supercomputers. The construction of missiles and jets, as well as the testing of nuclear weapons, had serious implications for international relations. Often perception is more important than fact. Perceived power maintained a deterrent effect on the two superpowers. If one superpower suspected that it had an advantage over the other, the balance of power would be upset and more aggressive measures might have been taken on various fronts of the conflict, perhaps leading to war. For this reason, it was necessary to maintain a balance of power not only in weapons but in supercomputing as well. Considering the role that these computers played, it is time for closer historical scrutiny.