
    The GNAT method for nonlinear model reduction: effective implementation and application to computational fluid dynamics and turbulent flows

    The Gauss--Newton with approximated tensors (GNAT) method is a nonlinear model reduction method that operates on fully discretized computational models. It achieves dimension reduction by a Petrov--Galerkin projection associated with residual minimization; it delivers computational efficiency by a hyper-reduction procedure based on the 'gappy POD' technique. Originally presented in Ref. [1], where it was applied to implicit nonlinear structural-dynamics models, this method is further developed here and applied to the solution of a benchmark turbulent viscous flow problem. To begin, this paper develops global state-space error bounds that justify the method's design and highlight its advantages in terms of minimizing components of these error bounds. Next, the paper introduces a 'sample mesh' concept that enables a distributed, computationally efficient implementation of the GNAT method in finite-volume-based computational-fluid-dynamics (CFD) codes. The suitability of GNAT for parameterized problems is highlighted with the solution of an academic problem featuring moving discontinuities. Finally, the capability of this method to reduce by orders of magnitude the core-hours required for large-scale CFD computations, while preserving accuracy, is demonstrated with the simulation of turbulent flow over the Ahmed body. For an instance of this benchmark problem with over 17 million degrees of freedom, GNAT outperforms several other nonlinear model-reduction methods, reduces the required computational resources by more than two orders of magnitude, and delivers a solution that differs by less than 1% from its high-dimensional counterpart.
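
    As a rough illustration of the 'gappy POD' technique underlying GNAT's hyper-reduction, the sketch below reconstructs a full vector from a few sampled entries via a small least-squares fit in a POD basis. This is a minimal sketch, not GNAT's implementation; the names (Phi, sample_idx, r_sampled) and the toy dimensions are assumptions.

```python
import numpy as np

def gappy_pod_reconstruct(Phi, sample_idx, r_sampled):
    """Approximate a full vector from its sampled entries.

    Phi        : (n, k) POD basis for the quantity being reconstructed
    sample_idx : indices of the m sampled rows (m >= k)
    r_sampled  : (m,) values at the sampled rows
    """
    # Least-squares fit of the basis coefficients using only sampled rows
    coeffs, *_ = np.linalg.lstsq(Phi[sample_idx, :], r_sampled, rcond=None)
    # Expand back to the full space with the complete basis
    return Phi @ coeffs

# Toy usage: a 1000-dim vector that lies in a 5-dim basis,
# reconstructed from only 20 sampled entries.
rng = np.random.default_rng(0)
Phi, _ = np.linalg.qr(rng.standard_normal((1000, 5)))
r_true = Phi @ rng.standard_normal(5)
idx = rng.choice(1000, size=20, replace=False)
r_approx = gappy_pod_reconstruct(Phi, idx, r_true[idx])
print(np.linalg.norm(r_true - r_approx))  # ~0 for data in the basis span
```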

    Intelligent Data Storage and Retrieval for Design Optimisation – an Overview

    This paper documents the findings of a literature review conducted by the Sir Lawrence Wackett Centre for Aerospace Design Technology at RMIT University. The review investigates aspects of a proposed system for intelligent design optimisation. Such a system would be capable of efficiently storing (and, if required, compressing) a range of types of design data in an intelligent database. The system would access this database during subsequent design processes, searching for relevant design data to re-use; as the database grows, this re-use progressively reduces the time required for later designs. Extensive research has been performed on both the theoretical aspects of the project and practical examples of current similar systems, covering database systems, database queries, representation and compression of design data, geometric representation, and heuristic methods for design applications.

    Data quality evaluation through data quality rules and data provenance.

    The application and exploitation of large amounts of data play an ever-increasing role in today's research, government, and economy. Data understanding and decision making rely heavily on high-quality data; therefore, in many different contexts, it is important to assess the quality of a dataset in order to determine whether it is suitable for a specific purpose. Moreover, as access to and exchange of datasets have become easier and more frequent, and as scientists increasingly use the World Wide Web to share scientific data, there is a growing need to know the provenance of a dataset (i.e., information about the processes and data sources that led to its creation) in order to evaluate its trustworthiness. In this work, data quality rules and data provenance are used to evaluate the quality of datasets. For the first topic, the applied solution consists of identifying types of data constraints that can be useful as data quality rules and developing a software tool that evaluates a dataset against a set of rules expressed in the XML markup language. We selected some of the data constraints and dependencies already considered in the data quality field, but we also used order dependencies and existence constraints as quality rules, and we developed algorithms to discover the types of dependencies used in the tool. To deal with the provenance of data, the Open Provenance Model (OPM) was adopted, an experimental query language for querying OPM graphs stored in a relational database was implemented, and an approach to designing OPM graphs was proposed.
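
    To make the flavor of such rules concrete, here is a minimal sketch of an existence constraint and an order dependency checked against a few records. The actual tool reads rules expressed in XML; the hard-coded rules, field names, and data below are hypothetical.

```python
# Hypothetical records; None marks a missing value.
records = [
    {"id": 1, "ship_date": "2024-01-05", "delivery_date": "2024-01-09"},
    {"id": 2, "ship_date": "2024-02-10", "delivery_date": "2024-02-03"},
    {"id": 3, "ship_date": None,         "delivery_date": "2024-03-01"},
]

def existence_violations(rows, field):
    """Existence constraint: `field` must be present and non-null."""
    return [r["id"] for r in rows if r.get(field) is None]

def order_violations(rows, earlier, later):
    """Order dependency: `earlier` must not exceed `later`
    (ISO dates compare correctly as strings)."""
    return [r["id"] for r in rows
            if r[earlier] is not None and r[later] is not None
            and r[earlier] > r[later]]

print(existence_violations(records, "ship_date"))               # [3]
print(order_violations(records, "ship_date", "delivery_date"))  # [2]
```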

    From Relations to XML: Cleaning, Integrating and Securing Data

    While relational databases are still the preferred approach for storing data, XML is emerging as the primary standard for representing and exchanging data. Consequently, it has become increasingly important to provide a uniform XML interface to various data sources (integration) and critical to protect sensitive and confidential information in XML data (access control). Moreover, it is preferable to first detect and repair inconsistencies in the data to avoid propagating errors to other data processing steps. In response to these challenges, this thesis presents an integrated framework for cleaning, integrating, and securing data. The framework contains three parts. First, the data cleaning sub-framework makes use of a new class of constraints specially designed for improving data quality, referred to as conditional functional dependencies (CFDs), to detect and remove inconsistencies in relational data. Both batch and incremental techniques are developed for efficiently detecting CFD violations with SQL and repairing them based on a cost model. The cleaned relational data, together with other non-XML data, is then converted to XML format using widely deployed XML publishing facilities. Second, the data integration sub-framework uses a novel formalism, XML integration grammars (XIGs), to integrate multi-source XML data that is either native or published from traditional databases. XIGs automatically support conformance to a target DTD and allow one to build a large, complex integration via composition of component XIGs. To efficiently materialize the integrated data, algorithms are developed for merging XML queries in XIGs and for scheduling them. Third, to protect sensitive information in the integrated XML data, the data security sub-framework allows users to access the data only through authorized views. User queries posed on these views need to be rewritten into equivalent queries on the underlying document to avoid the prohibitive cost of materializing and maintaining a large number of views. Two algorithms are proposed to support virtual XML views: a rewriting algorithm that characterizes the rewritten queries as a new form of automata, and an evaluation algorithm that executes these automata-represented queries. Together they allow the security sub-framework to answer queries on views in linear time. Using both relational and XML technologies, this framework provides a uniform approach to cleaning, integrating, and securing data. The algorithms and techniques in the framework have been implemented, and an experimental study verifies their effectiveness and efficiency.
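
    The thesis detects CFD violations with generated SQL; the following minimal Python sketch only illustrates the CFD semantics, using the textbook example ([CC = '44'] : zip -> city), i.e., among rows with country code 44, equal zip codes must imply equal cities. The data and function names are made up.

```python
rows = [
    {"CC": "44", "zip": "EH4 1DT", "city": "Edinburgh"},
    {"CC": "44", "zip": "EH4 1DT", "city": "London"},   # violates the CFD
    {"CC": "01", "zip": "EH4 1DT", "city": "Chicago"},  # condition not met
]

def cfd_violations(rows, condition, lhs, rhs):
    """Return groups of rows matching `condition` that agree on `lhs`
    but disagree on `rhs` -- i.e., witnesses of a CFD violation."""
    groups = {}
    for r in rows:
        if all(r[k] == v for k, v in condition.items()):
            groups.setdefault(r[lhs], set()).add(r[rhs])
    return {k: v for k, v in groups.items() if len(v) > 1}

print(cfd_violations(rows, {"CC": "44"}, "zip", "city"))
# {'EH4 1DT': {'Edinburgh', 'London'}}
```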

    Updating the Lambda modes of a nuclear power reactor

    Starting from a steady-state configuration of a nuclear power reactor, situations arise in which the reactor configuration is perturbed. The Lambda modes are eigenfunctions associated with a given configuration of the reactor, and they have successfully been used to describe unstable events in BWRs. Computing several eigenvalues and their corresponding eigenfunctions for a nuclear reactor is computationally quite expensive. Krylov subspace methods are efficient methods to compute the dominant Lambda modes associated with a given configuration of the reactor, but if the Lambda modes have to be computed for different perturbed configurations of the reactor, more efficient methods can be used. In this paper, different methods for the updating Lambda modes problem are proposed and compared by computing the dominant Lambda modes of different configurations associated with a Boron injection transient in a typical BWR reactor. This work has been partially supported by the Spanish Ministerio de Educación y Ciencia under projects ENE2008-02669 and MTM2007-64477-AR07, the Generalitat Valenciana under project ACOMP/2009/058, and the Universidad Politécnica de Valencia under project PAID-05-09-4285. González Pintor, S.; Ginestar Peiro, D.; Verdú Martín, G. J. (2011). Updating the Lambda modes of a nuclear power reactor. Mathematical and Computer Modelling, 54(7), 1796-1801. https://doi.org/10.1016/j.mcm.2010.12.013
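
    One simple intuition behind eigenvalue updating is to warm-start the Krylov iteration for the perturbed configuration from a previously converged mode. The sketch below illustrates only that idea, using SciPy's ARPACK wrapper on random stand-in matrices; the paper's actual updating methods, and the generalized eigenproblem defining the Lambda modes, are more involved.

```python
import numpy as np
from scipy.sparse.linalg import eigs

rng = np.random.default_rng(1)
n = 500
A = rng.standard_normal((n, n)) / np.sqrt(n)
A = A + A.T  # symmetric stand-in for a discretized operator

# Dominant mode of the unperturbed configuration
lam, phi = eigs(A, k=1, which="LM")

# Small perturbation of the configuration
A_pert = A + 1e-3 * rng.standard_normal((n, n))

# Reuse the converged mode as the Krylov starting vector (v0)
lam_new, phi_new = eigs(A_pert, k=1, which="LM", v0=phi[:, 0].real)
print(lam.real, lam_new.real)  # nearby dominant eigenvalues
```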

    kmos: A lattice kinetic Monte Carlo framework

    Kinetic Monte Carlo (kMC) simulations have emerged as a key tool for microkinetic modeling in heterogeneous catalysis and other materials applications. Systems where the site-specificity of all elementary reactions allows a mapping onto a lattice of discrete active sites can be addressed within the particularly efficient lattice kMC approach. To this end we describe the versatile kmos software package, which offers a user-friendly implementation, execution, and evaluation of lattice kMC models of arbitrary complexity in one- to three-dimensional lattice systems, involving multiple active sites in periodic or aperiodic arrangements, as well as site-resolved pairwise and higher-order lateral interactions. Conceptually, kmos achieves a maximum runtime performance that is essentially independent of lattice size by generating code for the efficiency-determining local update of available events, optimized for a defined kMC model. For this model definition and the control of all runtime and evaluation aspects, kmos offers a high-level application programming interface. Usage proceeds interactively, via scripts, or through a graphical user interface, which visualizes the model geometry, the lattice occupations, and the rates of selected elementary reactions, while allowing on-the-fly changes of simulation parameters. We demonstrate the performance and scaling of kmos with the application to kMC models for surface catalytic processes, where for given operating conditions (temperature and partial pressures of all reactants) the central simulation outcomes are catalytic activity and selectivities, surface composition, and mechanistic insight into the occurrence of individual elementary processes in the reaction network. (Comment: 21 pages, 12 figures)
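
    The core loop of rejection-free lattice kMC, which packages such as kmos implement far more efficiently, can be sketched in a few lines: enumerate the available events, select one with probability proportional to its rate, and advance time by an exponential increment. The toy adsorption/desorption model and rate constants below are made up; kmos generates model-specific code for the local event update rather than rescanning the lattice as done here.

```python
import numpy as np

rng = np.random.default_rng(2)
L = 50                      # lattice sites
lattice = np.zeros(L, int)  # 0 = empty, 1 = occupied
k_ads, k_des = 1.0, 0.5     # rate constants (arbitrary units)
t = 0.0

for _ in range(10_000):
    # Enumerate available events: (site, new_state, rate)
    events = [(i, 1, k_ads) if lattice[i] == 0 else (i, 0, k_des)
              for i in range(L)]
    rates = np.array([e[2] for e in events])
    k_tot = rates.sum()

    # Select an event with probability rate / k_tot and execute it
    site, new_state, _ = events[rng.choice(len(events), p=rates / k_tot)]
    lattice[site] = new_state

    # Advance the clock by an exponentially distributed waiting time
    t += rng.exponential(1.0 / k_tot)

print(f"t = {t:.2f}, coverage = {lattice.mean():.2f}")
# coverage ~ k_ads / (k_ads + k_des) = 2/3 at steady state
```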

    Semantic representation of engineering knowledge: pre-study


    Unstructured Grid Generation Techniques and Software

    The Workshop on Unstructured Grid Generation Techniques and Software was conducted for NASA to assess its unstructured grid activities, improve the coordination among NASA centers, and promote technology transfer to industry. The proceedings represent contributions from Ames, Langley, and Lewis Research Centers, and the Johnson and Marshall Space Flight Centers. This report is a compilation of the presentations made at the workshop.