
    From a Competition for Self-Driving Miniature Cars to a Standardized Experimental Platform: Concept, Models, Architecture, and Evaluation

    Context: Competitions for self-driving cars have facilitated development and research in the domain of autonomous vehicles towards potential solutions for future mobility. Objective: Miniature vehicles can bridge the gap between simulation-based evaluations of algorithms relying on simplified models and time-consuming vehicle tests on real-scale proving grounds. Method: This article combines findings from a systematic literature review, an in-depth analysis of results and technical concepts from contestants in a competition for self-driving miniature cars, and experiences of participating in the 2013 competition for self-driving cars. Results: A simulation-based development platform for real-scale vehicles has been adapted to support the development of a self-driving miniature car. Furthermore, a standardized platform was designed and realized to enable research and experiments in the context of future mobility solutions. Conclusion: A clear separation between algorithm conceptualization and validation in a model-based simulation environment enabled efficient and risk-free experiments and validation. The design of a reusable, low-cost, and energy-efficient hardware architecture utilizing a standardized software/hardware interface enables experiments that would otherwise require resources like a large real-scale test track. Comment: 17 pages, 19 figures, 2 tables.

    Steady-State Co-Kriging Models

    In deterministic computer experiments, a computer code can often be run at different levels of complexity/fidelity, and a hierarchy of levels of code can be obtained. The higher the fidelity, and hence the computational cost, the more accurate the output data. Methods based on the co-kriging methodology of Cressie (2015) for predicting the output of a high-fidelity computer code by combining data generated at varying levels of fidelity have become popular over the last two decades. For instance, Kennedy and O'Hagan (2000) first propose to build a metamodel for multi-level computer codes by using an auto-regressive model structure. Forrester et al. (2007) provide details on estimation of the model parameters and further investigate the use of co-kriging for multi-fidelity optimization based on the efficient global optimization algorithm of Jones et al. (1998). Qian and Wu (2008) propose a Bayesian hierarchical modeling approach for combining low-accuracy and high-accuracy experiments. More recently, Gratiet and Cannamela (2015) propose sequential design strategies using fast cross-validation techniques for multi-fidelity computer codes.

    This research extends the co-kriging metamodeling methodology to steady-state simulation experiments. First, the mathematical structure of co-kriging is extended to take into account heterogeneous simulation output variances. Next, efficient steady-state simulation experimental designs are investigated for co-kriging to achieve high prediction accuracy in the estimation of steady-state parameters. Specifically, designs consisting of replicated longer simulation runs at a few design points and replicated shorter simulation runs at a larger set of design points are considered. A design with no replicated long simulation runs is also studied, along with different methods for calculating the output variance in the absence of replicated outputs.

    The stochastic co-kriging (SCK) method is applied to an M/M/1 as well as an M/M/5 queueing system. In both examples, the prediction performance of the SCK model is promising. It is also shown that the SCK method provides better response surfaces than the stochastic kriging (SK) method.
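
    As a hedged illustration of the single-fidelity ingredient of this approach, the sketch below simulates an M/M/1 queue at a few design points with replicated runs, then fits a Gaussian-process metamodel whose per-point noise reflects the heterogeneous output variances (stochastic kriging in miniature; the queue parameters, replication counts, and kernel are illustrative assumptions, not the thesis's SCK model).

        # Minimal sketch: steady-state mean waiting time of an M/M/1 queue,
        # estimated by replicated runs at a few design points, then modeled
        # with a GP whose per-point noise is the variance of each sample mean.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        rng = np.random.default_rng(0)

        def mm1_avg_wait(lam, mu, n_customers=10_000, warmup=1_000):
            """Lindley recursion for queueing delays in an M/M/1 queue."""
            w, total = 0.0, 0.0
            for i in range(n_customers):
                a = rng.exponential(1.0 / lam)   # interarrival time
                s = rng.exponential(1.0 / mu)    # service time
                w = max(0.0, w + s - a)          # Lindley recursion
                if i >= warmup:
                    total += w
            return total / (n_customers - warmup)

        mu = 1.0
        rhos = np.array([0.3, 0.5, 0.7, 0.8, 0.9])   # design points (traffic intensity)
        n_reps = 10
        runs = np.array([[mm1_avg_wait(r * mu, mu) for _ in range(n_reps)] for r in rhos])
        y_bar = runs.mean(axis=1)
        noise = runs.var(axis=1, ddof=1) / n_reps    # variance of each sample mean

        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2),
                                      alpha=noise, normalize_y=True)
        gp.fit(rhos.reshape(-1, 1), y_bar)

        pred, sd = gp.predict(np.array([[0.6]]), return_std=True)
        exact = 0.6 / (mu * (1 - 0.6))               # analytic E[W] = rho / (mu - lam)
        print(f"predicted {pred[0]:.3f} +/- {sd[0]:.3f}, exact {exact:.3f}")

    Passing the variance of each sample mean through alpha is what separates this from deterministic kriging: design points with noisier averages are trusted less by the fitted surface.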

    Deploying building simulation to enhance the experimental design of a full-scale empirical validation project

    Empirical validation of building simulation results is a complex and time-consuming process. A well-structured and thorough experimental design is therefore a crucial step of the experimental procedure. A full-scale empirical validation study is planned to take place within IEA EBC Annex 71: “Building energy performance assessment based on in situ measurements”. The experimental data are currently being gathered in two experiments conducted at the Fraunhofer IBP test site at Holzkirchen in Germany. This paper describes the methodology followed during the experimental design of the project. Particular focus is on how Building Performance Simulation (BPS) was used to assist the preparation of the actual experiment and to determine suitable test sequences, magnitudes of heat inputs, and temperature variations. A combination of deterministic and probabilistic simulation (using the method of Morris) is employed to replicate the actual experiment and to assess the sensitivity of the model to uncertain input parameters. A number of experimental errors are identified in the experiment, primarily concerning the magnitude of heat inputs. Moreover, the paper includes a discussion of lessons learned from the simulations and of the reliability, reproducibility, and limitations of the suggested experimental design procedure.
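
    For readers unfamiliar with the method of Morris mentioned above, the sketch below shows the screening workflow using the SALib library. The stand-in response function, parameter names, and bounds are assumptions for illustration only, not the Annex 71 model.

        # Morris elementary-effects screening with SALib on a toy stand-in
        # for a building performance simulation.
        import numpy as np
        from SALib.sample import morris as morris_sample
        from SALib.analyze import morris as morris_analyze

        problem = {
            "num_vars": 3,
            "names": ["infiltration_ach", "wall_conductivity", "heat_input_W"],
            "bounds": [[0.1, 1.0], [0.03, 0.10], [200.0, 800.0]],
        }

        def fake_bps(x):
            """Stand-in for a BPS run: returns an indoor-temperature proxy."""
            ach, k, q = x
            return 18.0 + q / (150.0 + 400.0 * k) - 2.0 * ach

        X = morris_sample.sample(problem, N=50, num_levels=4)
        Y = np.apply_along_axis(fake_bps, 1, X)
        Si = morris_analyze.analyze(problem, X, Y, num_levels=4)

        for name, mu_star, sigma in zip(problem["names"], Si["mu_star"], Si["sigma"]):
            print(f"{name:20s} mu* = {mu_star:7.3f}  sigma = {sigma:7.3f}")

    High mu* flags an influential parameter; high sigma relative to mu* flags interactions or nonlinearity, which is what makes Morris a cheap pre-filter before a full probabilistic study.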

    Validation of Immersed Boundary Simulations of Heart Valve Hemodynamics against In Vitro 4D Flow MRI Data

    The immersed boundary (IB) method is a mathematical framework for fluid-structure interaction (FSI) problems that was originally developed to simulate flows around heart valves. Validation of FSI simulations around heart valves against experimental data is challenging, however, due to the difficulty of performing robust and effective simulations, the complications of modeling a specific physical experiment, and the need to acquire experimental data that are directly comparable to simulation data. In this work, we performed physical experiments of flow through a pulmonary valve in an in vitro pulse duplicator and measured the corresponding velocity field using 4D flow MRI (four-dimensional flow magnetic resonance imaging). We constructed a computer model of this pulmonary artery setup, including modeling valve geometry and material properties via a technique called design-based elasticity, and simulated flow through it with the IB method. The simulated flow fields showed excellent qualitative agreement with experiments, excellent agreement on integral metrics, and reasonable relative error over the entire flow domain and on slices of interest. These results validate our design-based valve model construction, the IB solvers used, and the immersed boundary method for flows around heart valves.
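
    To make the IB coupling concrete, the sketch below shows the two operations at the heart of the method: spreading Lagrangian forces to the fluid grid and interpolating grid velocities back to the structure, both through Peskin's regularized delta function. This is a generic 2D illustration with a single force component and no fluid solver, not the paper's solver.

        import numpy as np

        def delta_peskin(r):
            """Peskin's 4-point cosine delta kernel; argument in grid units."""
            r = np.abs(r)
            return np.where(r < 2.0, 0.25 * (1.0 + np.cos(np.pi * r / 2.0)), 0.0)

        def spread(F, X, h, n):
            """Spread scalar point forces F at positions X onto an n x n grid."""
            f = np.zeros((n, n))
            ix = np.arange(n)
            for (x, y), fx in zip(X, F):
                wx = delta_peskin((ix * h - x) / h) / h
                wy = delta_peskin((ix * h - y) / h) / h
                f += np.outer(wy, wx) * fx   # tensor-product delta stencil
            return f

        def interpolate(u, X, h):
            """Interpolate a grid field u back to the Lagrangian points X."""
            n = u.shape[0]
            ix = np.arange(n)
            out = []
            for x, y in X:
                wx = delta_peskin((ix * h - x) / h)
                wy = delta_peskin((ix * h - y) / h)
                out.append(np.einsum("i,j,ij->", wy, wx, u))
            return np.array(out)

        h, n = 1.0 / 64, 64
        f_grid = spread(np.array([1.0]), np.array([[0.5, 0.5]]), h, n)
        print(f_grid.sum() * h * h)   # ~1.0: spreading conserves total force

    The same kernel is used for both directions, which makes the spread and interpolation operators adjoint to each other and keeps the coupled scheme energy-consistent.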

    Clock Domain Crossing Fault Model and Coverage Metric for Validation of SoC Design

    Multiple asynchronous clock domains have been increasingly employed in System-on-Chip (SoC) designs for different I/O interfaces. Functional validation is one of the most expensive tasks in the SoC design process, and simulation at register transfer level (RTL) is still the most widely used method. It is important to quantitatively measure validation confidence and progress for clock domain crossing (CDC) designs. In this paper, we propose an efficient method for defining CDC coverage, which can be used in RTL simulation of a multi-clock-domain SoC design. First, we develop a CDC fault model to represent the actual effect of metastability. Second, we use a temporal dataflow graph (TDFG) to propagate the CDC faults to observable variables. Finally, CDC coverage is defined based on the CDC faults and their observability. Our experiments on a commercial IP demonstrate that this method is useful for finding CDC errors early in the design cycle.
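
    A toy rendering of the idea, not the paper's tool: each crossing signal gets a "CDC fault" that models metastability as the destination legally sampling either the old or the new value around a clock edge, and a fault counts as covered once some stimulus makes the two outcomes observably different at an output. The two-signal design below is a made-up example.

        import random

        # Toy "design": signals x and y cross from a fast clock domain into
        # the domain that computes the observable outputs g and h.
        def dut(a, b, x, y):
            g = a ^ x
            h = (b & y) | x
            return g, h

        random.seed(1)
        crossings = ["x", "y"]
        covered = set()
        for _ in range(200):                 # simulated cycles, random stimuli
            a, b = random.randint(0, 1), random.randint(0, 1)
            new = {s: random.randint(0, 1) for s in crossings}
            old = {s: random.randint(0, 1) for s in crossings}
            for sig in crossings:
                # Fault on `sig`: compare the two legal samplings; the fault
                # is covered when they yield different observable outputs.
                alt = dict(new)
                alt[sig] = old[sig]
                if dut(a, b, new["x"], new["y"]) != dut(a, b, alt["x"], alt["y"]):
                    covered.add(sig)

        print(f"CDC coverage: {len(covered)}/{len(crossings)} faults observed")

    Low coverage in this sense means the testbench never exercised a cycle in which a metastable sampling could have mattered, which is exactly the blind spot the metric is meant to expose.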

    Fracture Characterisation and Modelling of AHSS Using Acoustic Emission Analysis for Deep Drawing †

    Driven by high energy prices, advanced high-strength steels (AHSS) are still gaining importance in the automotive industry with regard to electric vehicles and their battery range. Simulation-based design of forming processes can contribute to exploiting their potential for lightweight design. Fracture models are frequently used to predict the material’s failure and are often parametrised using different tensile tests with optical measurements, whereby fracture is determined by a surface crack. However, for many steels, fracture initiation occurs inside the specimen before a crack appears on the surface. This leads to inaccuracies and less precise fracture models. Using a method that detects fracture initiation within the specimen, such as acoustic emission analysis, has high potential to improve modelling accuracy. In this paper, fracture characterisation tests with two AHSS were performed for a wide range of stress states and measured with a conventional optical as well as a new acoustic measurement system. The tests were analysed for fracture initiation using both measurement systems. Numerical models of the tests were created, and the EMC fracture model was parametrised based on two evaluation approaches: a surface crack, as is conventional, and fracture from the inside, as a novelty. The two fracture models were used in a deep drawing simulation for analysis, comparison, and validation against deep drawing experiments. It was shown that the evaluation approach for fracture initiation had a significant impact on the fracture model. Hence, the failure prediction of the EMC fracture model from the acoustic evaluation method showed higher agreement between the numerical simulations and the experiments than the model from the optical evaluation.
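
    For orientation, the sketch below shows the linear damage accumulation commonly used with fracture criteria of this family (assuming EMC denotes an extended Mohr-Coulomb-type criterion; the fracture-strain surface here is a made-up stand-in, not the calibrated model from the paper).

        import numpy as np

        def fracture_strain(triaxiality, lode_angle):
            """Toy fracture-strain surface eps_f(eta, theta_bar)."""
            return 0.8 * np.exp(-1.5 * triaxiality) * (0.7 + 0.3 * np.cos(lode_angle))

        def damage(path):
            """Accumulate D = sum(d_eps / eps_f) along a loading path.

            Fracture initiation is predicted once D >= 1; shifting the point
            at which D = 1 is declared (surface crack vs. internal initiation)
            shifts the calibrated eps_f surface, as the paper discusses.
            """
            D = 0.0
            for d_eps, eta, theta in path:
                D += d_eps / fracture_strain(eta, theta)
            return D

        # Loading path: (plastic strain increment, triaxiality, Lode angle).
        path = [(0.01, 0.33, 0.0)] * 60   # roughly uniaxial tension, 60 steps
        print(f"damage indicator D = {damage(path):.2f}")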

    Development of a surrogate model of an amine scrubbing digital twin using machine learning methods

    Advancements in the process industry require building more complex simulations and performing computationally intensive operations like optimization. To overcome the numerical limits of conventional process simulations, a surrogate model is a viable strategy. In this work, a surrogate model of an industrial amine scrubbing digital twin has been developed. The surrogate model has been built on the process simulation created in Aspen HYSYS and validated as a digital twin against real process data collected during steady-state operation. The surrogate relies on a careful Design of Experiments procedure: the Latin hypercube method has been chosen, and several nested domains have been defined in ranges around the nominal steady-state operating condition. Several machine learning models have been trained using cross-validation, and the most accurate has been selected to predict each target. The resulting surrogate model showed satisfactory performance, given the data available.
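
    The surrogate-building loop described above follows a common pattern, sketched below with stand-ins: the response function replacing the Aspen HYSYS twin, the variable names, and the operating ranges are all assumptions for illustration.

        # Latin hypercube DoE + cross-validated model selection for a surrogate.
        import numpy as np
        from scipy.stats import qmc
        from sklearn.model_selection import cross_val_score
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.linear_model import Ridge

        # Latin hypercube design over 3 inputs around a nominal operating point.
        sampler = qmc.LatinHypercube(d=3, seed=0)
        lower, upper = [0.8, 40.0, 1.0], [1.2, 60.0, 3.0]   # assumed ranges
        X = qmc.scale(sampler.random(n=200), lower, upper)

        def run_twin(x):
            """Stand-in for evaluating the digital twin at one design point."""
            amine_flow, temp, pressure = x
            return 0.9 - 0.2 / amine_flow + 0.001 * temp - 0.05 / pressure

        y = np.apply_along_axis(run_twin, 1, X)

        # Cross-validate several model families; keep the best one per target
        # (in practice, inputs should be standardized, notably for the GP).
        candidates = {
            "ridge": Ridge(),
            "rf": RandomForestRegressor(n_estimators=200, random_state=0),
            "gp": GaussianProcessRegressor(),
        }
        scores = {name: cross_val_score(m, X, y, cv=5, scoring="r2").mean()
                  for name, m in candidates.items()}
        best = max(scores, key=scores.get)
        print(scores, "-> selected:", best)

    Nesting several such designs with tighter ranges around the nominal point, as the abstract describes, concentrates training data where the surrogate will actually be queried during optimization.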

    ERIGrid Holistic Test Description for Validating Cyber-Physical Energy Systems

    Smart energy solutions aim to modify and optimise the operation of existing energy infrastructure. Such cyber-physical technology must be mature before deployment to the actual infrastructure, and competitive solutions will have to be compliant with standards still under development. Achieving this technology readiness and harmonisation requires reproducible experiments and appropriately realistic testing environments. Such testbeds for multi-domain cyber-physical experiments are complex in and of themselves. This work addresses a method for the scoping and design of experiments where both testbed and solution each require detailed expertise. This empirical work first revisited present test description approaches, developed a new description method for cyber-physical energy systems testing, and matured it by means of user involvement. The new Holistic Test Description (HTD) method facilitates the conception, deconstruction, and reproduction of complex experimental designs in the domains of cyber-physical energy systems. This work develops the background and motivation, offers a guideline and examples for the proposed approach, and summarises experience from three years of its application. This work received funding under the European Community’s Horizon 2020 Programme (H2020/2014–2020) in the project “ERIGrid” (Grant Agreement No. 654113).
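
    As a purely illustrative sketch of what a machine-readable test description could look like, the data structure below loosely follows the test case / test specification / experiment specification layering the HTD method describes; the field names and the example content are assumptions, not the normative ERIGrid template.

        from dataclasses import dataclass, field

        @dataclass
        class TestCase:
            object_under_investigation: str
            purpose_of_investigation: str
            test_criteria: list[str]

        @dataclass
        class TestSpecification:
            test_case: TestCase
            test_system: str
            inputs: dict[str, str] = field(default_factory=dict)
            outputs: list[str] = field(default_factory=list)

        @dataclass
        class ExperimentSpecification:
            test_spec: TestSpecification
            testbed: str
            # Mapping of abstract test-system components to testbed assets.
            mapping: dict[str, str] = field(default_factory=dict)

        tc = TestCase(
            object_under_investigation="voltage controller of a distribution feeder",
            purpose_of_investigation="verify voltage-band compliance under PV variability",
            test_criteria=["|V - 1.0 p.u.| < 0.05 for 95% of samples"],
        )
        print(tc)

    Separating the test case from its experiment-specific realisation is what lets the same test be reproduced on different testbeds, which is the reproducibility goal the abstract emphasises.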

    Efficient Simulation of Structural Faults for the Reliability Evaluation at System-Level

    In recent technology nodes, reliability is considered a part of the standard design flow at all levels of embedded system design. While techniques that use only low-level models at gate and register-transfer level offer high accuracy, they are too inefficient to consider the overall application of the embedded system. Multi-level models with high abstraction are essential to efficiently evaluate the impact of physical defects on the system. This paper provides a methodology that leverages state-of-the-art techniques for efficient fault simulation of structural faults together with transaction-level modeling. This way it is possible to accurately evaluate the impact of the faults on the entire hardware/software system. A case study of a system consisting of hardware and software for image compression and data encryption is presented, and the method is compared to a standard gate/RT mixed-level approach.
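
    A hedged sketch of the multi-level idea: a gate-level-style stuck-at fault is injected into one component while the rest of the system is modelled at transaction level (plain function calls standing in for transactions). The adder, the checksum workload, and the numbers are generic illustrations, not the paper's framework.

        import random

        def adder_golden(a, b):
            return (a + b) & 0xFF

        def adder_stuck_at(a, b, bit, value):
            """Structural fault model: one output bit forced to `value`."""
            r = (a + b) & 0xFF
            return (r & ~(1 << bit)) | (value << bit)

        def app_checksum(adder, data):
            """Transaction-level 'application': checksum over a data block."""
            acc = 0
            for byte in data:
                acc = adder(acc, byte)
            return acc

        random.seed(0)
        data = [random.randrange(256) for _ in range(64)]
        golden = app_checksum(adder_golden, data)

        detected = 0
        faults = [(bit, v) for bit in range(8) for v in (0, 1)]
        for bit, v in faults:
            faulty = app_checksum(lambda a, b: adder_stuck_at(a, b, bit, v), data)
            detected += (faulty != golden)

        print(f"{detected}/{len(faults)} stuck-at faults observable at system level")

    Running the fault list against the application-level workload, rather than against gate-level stimuli alone, is what reveals which structural defects actually matter for the system's behaviour.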