326 research outputs found
Enhancing Energy Production with Exascale HPC Methods
High Performance Computing (HPC) resources have become the key enabler for tackling more ambitious challenges in many disciplines. In this step towards exascale, an explosion in the available parallelism and the use of special-purpose processors are crucial. With this goal, the HPC4E project applies new exascale HPC techniques to energy industry simulations, customizing them where necessary and going beyond the state of the art in the HPC exascale simulations required for different energy sources. This paper presents a general overview of these methods as well as some specific preliminary results.

The research leading to these results has received funding from the European Union's Horizon 2020 Programme (2014-2020) under the HPC4E Project (www.hpc4e.eu), grant agreement n° 689772, the Spanish Ministry of Economy and Competitiveness under the CODEC2 project (TIN2015-63562-R), and from the Brazilian Ministry of Science, Technology and Innovation through Rede Nacional de Pesquisa (RNP). Computer time on the Endeavour cluster was provided by Intel Corporation, which enabled us to obtain the presented experimental results in uncertainty quantification in seismic imaging.
Applying future Exascale HPC methodologies in the energy sector
The application of new exascale HPC techniques to energy industry simulations is urgently needed today. The common procedure is to customize these techniques for the specific energy sector of interest in order to go beyond the state of the art in the required HPC exascale simulations. With this aim, the HPC4E project is developing new exascale methodologies for three energy sources that are the present and the future of energy: wind energy production and design, efficient combustion systems for biomass-derived fuels (biogas), and exploration geophysics for hydrocarbon reservoirs. This work presents the general exascale advances proposed as part of HPC4E and their outcomes in specific results across the different domains.

The research leading to these results has received funding from the European Union's Horizon 2020 Programme (2014-2020) under the HPC4E Project (www.hpc4e.eu), grant agreement n° 689772, the Spanish Ministry of Economy and Competitiveness under the CODEC2 project (TIN2015-63562-R), and from the Brazilian Ministry of Science, Technology and Innovation through Rede Nacional de Pesquisa (RNP). Computer time on the Endeavour cluster was provided by Intel Corporation, which enabled us to obtain the presented experimental results in uncertainty quantification in seismic imaging.
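The uncertainty quantification mentioned above can be illustrated with a minimal, generic Monte Carlo sketch (this is not the HPC4E method; the toy travel-time model, the velocity distribution, and all numbers are invented for illustration):

```python
import random
import statistics

def forward_model(velocity):
    """Toy stand-in for an expensive seismic simulation:
    travel time through a 2 km layer at the given velocity (km/s)."""
    return 2.0 / velocity

# Treat the subsurface velocity as uncertain and propagate that
# uncertainty through the model with plain Monte Carlo sampling.
random.seed(42)
samples = [random.gauss(3.0, 0.2) for _ in range(10_000)]
travel_times = [forward_model(v) for v in samples]

mean_t = statistics.fmean(travel_times)
std_t = statistics.stdev(travel_times)
print(f"travel time ~ {mean_t:.3f} s +/- {std_t:.3f} s")
```

At exascale, the point is that each `forward_model` call is a full simulation, so the independent samples are farmed out across many nodes.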
FabSim3: An automation toolkit for verified simulations using high performance computing
A common feature of computational modelling and simulation research is the need to perform many
tasks in complex sequences to achieve a usable result. This will typically involve tasks such as preparing
input data, pre-processing, running simulations on a local or remote machine, post-processing, and
performing coupling communications, validations and/or optimisations. Tasks like these can involve
manual steps that are time- and effort-intensive, especially when they involve the management of large
ensemble runs. Additionally, human errors become more likely and numerous as the research work
becomes more complex, increasing the risk of damaging the credibility of simulation results. Automation
tools can help ensure the credibility of simulation results by reducing the manual time and effort
required to perform these research tasks, by making more rigorous procedures tractable, and by reducing
the probability of human error due to a reduced number of manual actions. In addition, efficiency
gained through automation can help researchers to perform more research within the budget and effort
constraints imposed by their projects.
This paper presents the main software release of FabSim3, and explains how our automation toolkit
can improve and simplify a range of tasks for researchers and application developers. FabSim3 helps
to prepare, submit, execute, retrieve, and analyze simulation workflows. By providing a suitable level
of abstraction, FabSim3 reduces the complexity of setting up and managing a large-scale simulation
scenario, while still providing transparent access to the underlying layers for effective debugging.
The tool also facilitates job submission and management (including staging and curation of files
and environments) for a range of different supercomputing environments. Although FabSim3 itself is
application-agnostic, it supports a provably extensible plugin system where users automate simulation
and analysis workflows for their own application domains. To highlight this, we briefly describe a
selection of these plugins and we demonstrate the efficiency of the toolkit in handling large ensemble
workflows.

This work was supported by EPSRC under grant agreement EP/W007711/1, as well as by the VECMA and HiDALGO projects, which received funding from the European Union Horizon 2020 research and innovation programme under grant agreements nos 800925 and 824115. In addition, FabFlee was supported by the ITFLOWS project and FabCovid19 by the STAMINA project, both of which received funding from the European Union's Horizon 2020 research and innovation programme under grant agreements No 882986 and No 883441, respectively.
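The ensemble bookkeeping that a toolkit like FabSim3 automates can be sketched in miniature (this is a hypothetical illustration, not FabSim3's actual API; all function and file names are invented):

```python
import json
import tempfile
from pathlib import Path

def prepare_ensemble(base_config, sweep, workdir):
    """Write one input file per parameter value and return the run dirs."""
    run_dirs = []
    for i, value in enumerate(sweep["values"]):
        run_dir = Path(workdir) / f"run_{i:03d}"
        run_dir.mkdir(parents=True, exist_ok=True)
        config = dict(base_config, **{sweep["name"]: value})
        (run_dir / "input.json").write_text(json.dumps(config))
        run_dirs.append(run_dir)
    return run_dirs

def execute(run_dir):
    """Stand-in for submitting a simulation job; here a toy model."""
    config = json.loads((run_dir / "input.json").read_text())
    result = {"output": config["viscosity"] * config["steps"]}
    (run_dir / "output.json").write_text(json.dumps(result))

def collect(run_dirs):
    """Gather outputs from all completed runs."""
    return [json.loads((d / "output.json").read_text())["output"]
            for d in run_dirs]

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as tmp:
        runs = prepare_ensemble({"steps": 10},
                                {"name": "viscosity",
                                 "values": [0.1, 0.2, 0.3]},
                                tmp)
        for r in runs:
            execute(r)
        print(collect(runs))  # one result per ensemble member
```

Even this toy version shows why automation pays off: the prepare/execute/collect steps are scripted once, so adding ensemble members or rerunning a failed member involves no error-prone manual file handling.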
Roadmap on multiscale materials modeling
Modeling and simulation is transforming modern materials science, becoming an important tool for the discovery of new materials and material phenomena, for gaining insight into the processes that govern materials behavior, and, increasingly, for quantitative predictions that can be used as part of a design tool in full partnership with experimental synthesis and characterization. Modeling and simulation is the essential bridge from good science to good engineering, spanning from fundamental understanding of materials behavior to deliberate design of new materials technologies that leverage new properties and processes. This Roadmap presents a broad overview of the extensive impact computational modeling has had on materials science in the past few decades, and offers focused perspectives on where the path forward lies as this rapidly expanding field evolves to meet the challenges of the next few decades. The Roadmap offers perspectives on advances within individual disciplines, ranging from phase-field methods for modeling mesoscale behavior and molecular dynamics methods for deducing the fundamental atomic-scale dynamical processes governing materials response, to the challenges of interdisciplinary research tackling complex materials problems in which the governing phenomena span different scales of materials behavior and require multiscale approaches. The shift from understanding fundamental materials behavior to developing quantitative approaches that explain and predict experimental observations requires advances in simulation methods and practice to ensure reproducibility and reliability, as well as interaction with a computational ecosystem that integrates new theory development, innovative applications, and an increasingly integrated software and computational infrastructure that takes advantage of ever more powerful computational methods and hardware.
Vision 2040: A Roadmap for Integrated, Multiscale Modeling and Simulation of Materials and Systems
Over the last few decades, advances in high-performance computing, new materials characterization methods, and, more recently, an emphasis on integrated computational materials engineering (ICME) and additive manufacturing have been a catalyst for multiscale modeling and simulation-based design of materials and structures in the aerospace industry. While these advances have driven significant progress in the development of aerospace components and systems, that progress has been limited by persistent technology and infrastructure challenges that must be overcome to realize the full potential of integrated materials and systems design and simulation modeling throughout the supply chain. As a result, NASA's Transformational Tools and Technology (TTT) Project sponsored a study (performed by a diverse team led by Pratt & Whitney) to define the potential 25-year future state required for integrated multiscale modeling of materials and systems (e.g., load-bearing structures) to accelerate the pace and reduce the expense of innovation in future aerospace and aeronautical systems. This report describes the findings of the 2040 Vision study: the 2040 vision state; the required interdependent core technical work areas, or Key Elements (KEs); identified gaps and actions to close those gaps; and major recommendations. It constitutes a community consensus document, being the result of input from over 450 professionals obtained via: 1) four society workshops (AIAA, NAFEMS, and two TMS); 2) a community-wide survey; and 3) nine expert panels (one per KE), consisting on average of 10 non-team members from academia, government, and industry, who reviewed and updated content and prioritized gaps and actions.
The study envisions the development of a cyber-physical-social ecosystem composed of experimentally verified and validated computational models, tools, and techniques, along with the associated digital tapestry, that impacts the entire supply chain to enable cost-effective, rapid, and revolutionary design of fit-for-purpose materials, components, and systems. Although the vision focuses on aeronautics and space applications, other engineering communities (e.g., automotive and biomedical) can benefit from the proposed framework with only minor modifications. Finally, it is TTT's hope that this vision provides the strategic guidance that both public and private research and development decision makers need to make the proposed 2040 vision state a reality, thereby significantly advancing the United States' global competitiveness.
The Academic Library as a University Research Business Intelligence Partner
Ranked in the world's top 50, The University of Queensland (UQ) is a comprehensive research and teaching institution. The University has an established Planning and Business Intelligence Office (PBI) that provides central reporting infrastructure (via SAP Business Objects) as well as analysis support for strategic planning and decision making. On top of the support provided by PBI, some data providers, especially in areas with complex datasets that require expert knowledge (e.g. awards; publications), partner across the University to deliver bespoke analyses of their data. Underpinned by an institutional repository rich in publication metadata and bibliometric indicators, the Library partners across the University to enable and deliver research business intelligence in the areas of: collaboration analysis; capability mapping; strategic recruitment; KPI reporting and planning; ORCID registration and use; and open access compliance.
Research is core business at UQ, and the Library’s institutional repository UQ eSpace has evolved in response to global, national and institutional drivers to become more than the institutional open access repository: UQ eSpace is the University’s official source of publication data and an integral part of the business intelligence environment. Services have been developed by leveraging the data within UQ eSpace, and, as an in-house built system, the Library has developed it to deliver functionality to ensure it is a strategic asset of the University.
In this case study, we discuss the business intelligence activities that the Library enables and delivers to ensure the institution can make evidence-based and strategic decisions. Taking into account the Library's unique position in supporting the institution's business intelligence environment through its custodianship of the repository UQ eSpace, the paper includes discussions around resourcing, systems, tools and methodologies, and opportunities for growth.
Enhancing optimization capabilities using the AGILE collaborative MDO framework with application to wing and nacelle design
This paper presents methodological investigations performed in research activities in the field of Multi-disciplinary Design and Optimization (MDO) for overall aircraft design in the EU-funded research project AGILE (2015–2018). In the AGILE project, a team of 19 industrial, research, and academic partners from Europe, Canada, and Russia is working together to develop the next generation of MDO environments, targeting significant reductions in aircraft development costs and time to market, leading to cheaper and greener aircraft. The paper introduces the AGILE project structure and describes the achievements of the first year, which led to a reference distributed MDO system. It then focuses on different novel optimization techniques studied during the second year, all aimed at easing the optimization of complex workflows characterized by a high number of discipline interdependencies and a large number of design variables in the context of multi-level processes and multi-partner collaborative engineering projects. Three optimization strategies are introduced and validated on a conventional aircraft. First, a multi-objective technique based on Nash games and a genetic algorithm is applied to a wing design problem. Next, attention turns to nacelle design, where a surrogate-based optimizer is used to solve a mono-objective problem. Finally, a robust approach is adopted to study the effects of parameter uncertainty on the nacelle design process. These new capabilities have been integrated into the AGILE collaborative framework, which in the future will be used to study and optimize novel unconventional aircraft configurations.
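The surrogate-based strategy mentioned above can be illustrated with a minimal, generic sketch (this is not the AGILE toolchain; the objective function, sample counts, and bounds are invented for illustration): fit a cheap model to a few expensive evaluations, then optimize the surrogate instead of the expensive model.

```python
import numpy as np

def expensive_objective(x):
    """Stand-in for a costly simulation, e.g. a nacelle drag evaluation."""
    return (x - 1.7) ** 2 + 0.5 * np.sin(3 * x)

# 1) Sample the expensive model at a handful of design points.
samples = np.linspace(0.0, 3.0, 7)
values = expensive_objective(samples)

# 2) Fit a cheap surrogate (here a cubic polynomial) to those samples.
coeffs = np.polyfit(samples, values, deg=3)
surrogate = np.poly1d(coeffs)

# 3) Optimize the surrogate on a dense grid instead of the expensive model.
grid = np.linspace(0.0, 3.0, 1001)
x_best = grid[np.argmin(surrogate(grid))]

print(f"surrogate optimum near x = {x_best:.2f}")
```

The pay-off is the evaluation budget: the expensive model is called only seven times, while the thousand-point search runs entirely on the cheap surrogate; in practice the surrogate is refined iteratively around promising candidates.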