
    Data Driven Surrogate Based Optimization in the Problem Solving Environment WBCSim

    Large scale, multidisciplinary engineering designs are difficult due to the complexity and dimensionality of these problems. Direct coupling between the analysis codes and the optimization routines can be prohibitively time consuming because of the complexity of the underlying simulation codes. One way of tackling this problem is to construct computationally cheaper approximations of the expensive simulations that mimic the behavior of the simulation model as closely as possible. This paper presents a data driven, surrogate based optimization algorithm that uses a trust region based sequential approximate optimization (SAO) framework and a statistical sampling approach based on design of experiment (DOE) arrays. The algorithm is implemented using techniques from two packages, SURFPACK and SHEPPACK, which provide a collection of approximation algorithms to build the surrogates; three DOE techniques, full factorial (FF), Latin hypercube sampling (LHS), and central composite design (CCD), are used to train the surrogates. The results are compared with the optimization results obtained by directly coupling an optimizer with the simulation code. The biggest concern in using the SAO framework based on statistical sampling is the generation of the required database: as the number of design variables grows, the computational cost of generating the database grows rapidly. A data driven approach is proposed to tackle this situation, where the key idea is to run the expensive simulation if and only if a nearby data point does not exist in the cumulatively growing database. Over time the database matures and is enriched as more optimizations are performed. Results show that the proposed methodology dramatically reduces the total number of calls to the expensive simulation during the optimization process.
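    The reuse rule described in the abstract (run the expensive simulation only when no nearby point exists in the growing database) can be sketched as follows. This is a minimal illustration, not the paper's implementation; the function names, the Euclidean distance metric, and the tolerance value are all assumptions.

```python
import math

def cached_evaluate(x, database, expensive_sim, tol=1e-2):
    """Return a result for design point x, reusing a stored nearby
    point when one exists instead of re-running the simulation."""
    for x_stored, y_stored in database:
        # Euclidean distance in design space; tol is an assumed threshold.
        if math.dist(x, x_stored) <= tol:
            return y_stored  # nearby point found: skip the expensive run
    y = expensive_sim(x)       # no neighbour: run the true simulation
    database.append((x, y))    # enrich the cumulatively growing database
    return y
```

    Each optimization run adds points to the shared database, so later runs make progressively fewer calls to the expensive code.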

    SIMDAT


    Research and Education in Computational Science and Engineering

    Over the past two decades the field of computational science and engineering (CSE) has penetrated both basic and applied research in academia, industry, and laboratories to advance discovery, optimize systems, support decision-makers, and educate the scientific and engineering workforce. Informed by centuries of theory and experiment, CSE performs computational experiments to answer questions that neither theory nor experiment alone is equipped to answer. CSE provides scientists and engineers of all persuasions with algorithmic inventions and software systems that transcend disciplines and scales. Carried on a wave of digital technology, CSE brings the power of parallelism to bear on troves of data. Mathematics-based advanced computing has become a prevalent means of discovery and innovation in essentially all areas of science, engineering, technology, and society; and the CSE community is at the core of this transformation. However, a combination of disruptive developments (including the architectural complexity of extreme-scale computing, the data revolution that engulfs the planet, and the specialization required to follow the applications to new frontiers) is redefining the scope and reach of the CSE endeavor. This report describes the rapid expansion of CSE and the challenges to sustaining its bold advances. The report also presents strategies and directions for CSE research and education for the next decade. Comment: Major revision, to appear in SIAM Review.

    GTTC Future of Ground Testing Meta-Analysis of 20 Documents

    National research, development, test, and evaluation ground testing capabilities in the United States are at risk. There is a lack of vision and consensus on what is and will be needed, contributing to a significant threat that ground test capabilities may not be able to meet the national security and industrial needs of the future. To support future decisions, the AIAA Ground Testing Technical Committee's (GTTC) Future of Ground Test (FoGT) Working Group selected and reviewed 20 seminal documents related to the application and direction of ground testing. Each document was reviewed, with its main points collected and organized into sections in the form of a gap analysis: current state, future state, major challenges/gaps, and recommendations. This paper includes key findings and selected commentary by an editing team.

    Evaluating the Rationale for Folding Wing Tips Comparing the Exergy and Breguet Approaches

    The design and development processes for future aircraft aim to address environmental and efficiency challenges, which will require engineering concepts that are far more integrated and demand a multidisciplinary approach. This study investigates the benefit of incorporating span-extension wing tips onto future aircraft configurations as a method of improving aerodynamic efficiency, while allowing the extension to fold on the ground to meet airport gate size constraints. The actuated wing tips themselves are not studied in detail; instead, the focus of this study is to compare two methods of analysis that can be used to identify the benefits and limitations of adding such devices. The two methods considered are a quasi-steady implicit energy analysis based on the Breguet Range Equation and an explicit energy analysis based on the first and second laws of thermodynamics, known as Exergy Analysis. It has been found that both methods provide consistent results and have individual merits. The Breguet Range Equation can provide quick results in early design, whilst the Exergy Analysis has been found to be far more extensive and allows the complete dynamic behaviour of the aircraft to be assessed through a single metric, thereby allowing comparison of losses from multiple subsystems.
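    The quasi-steady method mentioned above rests on the standard Breguet range equation for jet cruise, R = (V/c)(L/D) ln(W_i/W_f), which shows directly why a span extension that raises L/D increases range. A minimal sketch (the numerical values in the usage note are illustrative, not from the paper):

```python
import math

def breguet_range(V, tsfc, L_over_D, W_initial, W_final):
    """Breguet range for a jet in cruise: R = (V / c) * (L/D) * ln(Wi/Wf).

    V         : cruise speed [m/s]
    tsfc      : thrust-specific fuel consumption c [1/s]
    L_over_D  : lift-to-drag ratio [-]
    W_initial : aircraft weight at start of cruise
    W_final   : aircraft weight at end of cruise (same units as W_initial)
    """
    return (V / tsfc) * L_over_D * math.log(W_initial / W_final)
```

    Because range is linear in L/D, a wing-tip extension that improves L/D by a few percent yields the same fractional gain in range for a fixed fuel fraction, which is the kind of quick early-design estimate the abstract attributes to this method.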

    Expressing the tacit knowledge of a digital library system as linked data

    Library organizations have enthusiastically undertaken semantic web initiatives, in particular the publishing of data as linked data. Nevertheless, different surveys report the experimental nature of these initiatives and the difficulty consumers face in re-using the data. These barriers hinder the use of linked datasets as an infrastructure that enhances library and related information services. This paper presents an approach for encoding, as a linked vocabulary, the "tacit" knowledge of the information system that manages the data source. The objective is to improve the process of interpreting the meaning of published linked datasets. We analyzed a digital library system as a case study for prototyping the "semantic data management" method, in which data and its knowledge are natively managed, taking into account the linked data pillars. The ultimate objective of semantic data management is to curate consumers' correct interpretation of data and to facilitate its proper re-use. The prototype defines the ontological entities representing the knowledge of the digital library system that is stored neither in the data source nor in the existing ontologies related to the system's semantics. We present the local ontology and its matching with existing ontologies, Preservation Metadata Implementation Strategies (PREMIS) and Metadata Object Description Schema (MODS), and we discuss linked data triples prototyped from the legacy relational database using the local ontology. We show how semantic data management can deal with the inconsistency of system data, and we conclude that a change in the system developer's mindset is necessary for extracting and "codifying" the tacit knowledge needed to improve the data interpretation process.
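    The step of prototyping triples from a legacy relational database can be pictured as mapping each column of a row to a predicate IRI and emitting N-Triples. The sketch below is illustrative only: the base URI and predicate IRIs are placeholders standing in for the paper's local ontology and its PREMIS/MODS mappings, and real deployments would use an RDF library rather than string formatting.

```python
def rows_to_ntriples(rows, predicate_map, base="http://example.org/record/"):
    """Serialize relational rows as N-Triples lines.

    rows          : list of dicts, each with an 'id' key plus data columns
    predicate_map : column name -> predicate IRI (hypothetical ontology)
    base          : hypothetical base URI minted for record subjects
    """
    lines = []
    for row in rows:
        subject = f"<{base}{row['id']}>"
        for column, predicate in predicate_map.items():
            value = row.get(column)
            if value is None:
                continue  # tolerate inconsistent legacy data: skip NULLs
            lines.append(f'{subject} <{predicate}> "{value}" .')
    return "\n".join(lines)
```

    Skipping NULL columns rather than failing is one simple way such a pipeline can absorb the data inconsistencies the abstract mentions.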

    A novel mechanical analogy based battery model for SoC estimation using a multi-cell EKF

    The future evolution of technological systems dedicated to improving energy efficiency will strongly depend on effective and reliable Energy Storage Systems, as key components for Smart Grids, microgrids, and electric mobility. Besides possible improvements in chemical materials and cell design, the Battery Management System is the most important electronic device that improves the reliability of a battery pack. In fact, a precise State of Charge (SoC) estimation allows the energy flow controller to better exploit the full capacity of each cell. In this paper, we propose an alternative definition of the SoC, explaining the rationale through a mechanical analogy. We introduce a novel cell model, conceived as a series of three electric dipoles, together with a procedure for parameter estimation relying only on voltage measurements and a given current profile. The three dipoles represent the quasi-stationary, dynamic, and instantaneous components of the voltage measurements. An Extended Kalman Filter (EKF) is adopted as a nonlinear state estimator. Moreover, we propose a multi-cell EKF system based on a round-robin approach to allow the same processing block to keep track of many cells at the same time. Performance tests with a prototype battery pack composed of 18 A123 cells connected in series show encouraging results. Comment: 8 pages, 12 figures, 1 table.
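    The overall EKF structure used for SoC estimation can be sketched with a standard one-state filter: Coulomb counting as the prediction step and a linearized open-circuit-voltage (OCV) correction as the update. This is not the paper's three-dipole model; the OCV curve, noise variances, and internal resistance below are assumed values for illustration only.

```python
import math

class SocEKF:
    """One-state EKF sketch for SoC (Coulomb counting + OCV correction)."""

    def __init__(self, capacity_As, soc0=0.5, p0=0.1, q=1e-7, r=1e-3):
        self.capacity = capacity_As  # cell capacity in ampere-seconds
        self.soc, self.p = soc0, p0  # state estimate and its variance
        self.q, self.r = q, r        # process / measurement noise variances

    @staticmethod
    def ocv(soc):
        # Assumed smooth, monotonic OCV curve (illustrative only).
        return 3.0 + 0.7 * soc + 0.1 * math.sin(math.pi * soc)

    @staticmethod
    def docv(soc):
        # Derivative of the assumed OCV curve, used for linearization.
        return 0.7 + 0.1 * math.pi * math.cos(math.pi * soc)

    def step(self, current_A, voltage_V, dt_s, r_int=0.01):
        # Predict: Coulomb counting (discharge current taken as positive).
        self.soc -= current_A * dt_s / self.capacity
        self.p += self.q
        # Update: linearize the OCV around the predicted SoC.
        h = self.docv(self.soc)
        k = self.p * h / (h * self.p * h + self.r)
        innovation = voltage_V - (self.ocv(self.soc) - r_int * current_A)
        self.soc += k * innovation
        self.p *= (1.0 - k * h)
        return min(max(self.soc, 0.0), 1.0)
```

    The round-robin idea from the abstract would then amount to one such filter state per cell, with a single processing block calling `step` for each cell in turn.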

    Millimeter-wave communication for a last-mile autonomous transport vehicle

    Low-speed autonomous transport of passengers and goods is expected to have a strong, positive impact on the reliability and ease of travelling. Various advanced functions of the involved vehicles rely on the wireless exchange of information with other vehicles and the roadside infrastructure, thereby benefitting from the low latency and high throughput that 5G technology has to offer. This work presents an investigation of 5G millimeter-wave communication links for a low-speed autonomous vehicle, focusing on the effects of the antenna positions on both the received signal quality and the link performance. It is observed that the excess loss for communication with roadside infrastructure in front of the vehicle is nearly independent of the half-power beamwidth, and that the increase of the root mean square delay spread plays a minor role in the resulting signal quality, as the absolute delay values are considerably shorter than the typical duration of 5G New Radio symbols. Near certain threshold levels, a reduction of the received power affects the link performance through an increased error vector magnitude of the received signal, and a subsequent decrease of the achieved data throughput.
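    The root mean square delay spread compared above against the 5G New Radio symbol duration is the power-weighted second central moment of the channel's power delay profile. A minimal sketch of that standard computation (the example profile in the test is invented, not measurement data from the paper):

```python
import math

def rms_delay_spread(delays_s, powers_lin):
    """RMS delay spread of a power delay profile.

    delays_s   : path delays in seconds
    powers_lin : corresponding path powers on a linear scale
    """
    total = sum(powers_lin)
    mean = sum(p * t for p, t in zip(powers_lin, delays_s)) / total
    second = sum(p * t * t for p, t in zip(powers_lin, delays_s)) / total
    return math.sqrt(second - mean * mean)
```

    When this value stays well below the cyclic prefix of the NR numerology in use (on the order of half a microsecond for a normal cyclic prefix at 120 kHz subcarrier spacing), inter-symbol interference remains minor, which matches the observation in the abstract.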