
    On systematic approaches for interpreted information transfer of inspection data from bridge models to structural analysis

    In conjunction with improved methods for monitoring damage and degradation processes, interest in the reliability assessment of reinforced concrete bridges has increased in recent years. Automated image-based inspections of the structural surface provide valuable data from which quantitative information about deterioration, such as crack patterns, can be extracted. However, the knowledge gain results from processing this information in a structural context, i.e. relating the damage artifacts to building components; this enables the transfer to structural analysis. This approach sets two further requirements: the availability of structural bridge information and standardized storage for interoperability with subsequent analysis tools. Since the large datasets involved can only be processed efficiently in an automated manner, this work targets the implementation of the complete workflow from damage and building data to structural analysis. First, domain concepts are derived from the back-end tasks: structural analysis, damage modeling, and life-cycle assessment. The common interoperability format, the Industry Foundation Classes (IFC), and the processes in these domains are then assessed. The need for user-controlled interpretation steps is identified, and the developed prototype therefore allows interaction at subsequent model stages. The latter has the advantage that interpretation steps can be separated into a structural analysis model, a damage information model, or a combination of both. This approach to damage information processing from the perspective of structural analysis is then validated in different case studies.

    Shingle 2.0: generalising self-consistent and automated domain discretisation for multi-scale geophysical models

    The approaches taken to describe and develop spatial discretisations of the domains required for geophysical simulation models are commonly ad hoc, model- or application-specific, and under-documented. This is particularly acute for simulation models that are flexible in their use of multi-scale, anisotropic, fully unstructured meshes, where a relatively large number of heterogeneous parameters are required to constrain their full description. As a consequence, it can be difficult to reproduce simulations and ensure provenance in model data handling and initialisation, and a challenge to conduct model intercomparisons rigorously. This paper takes a novel approach to spatial discretisation, considering it much like a numerical simulation model problem of its own. It introduces a generalised, extensible, self-documenting approach to describe carefully, and necessarily fully, the constraints over the heterogeneous parameter space that determine how a domain is spatially discretised. This additionally provides a method to accurately record these constraints, using high-level natural-language-based abstractions, that enables full accounts of provenance, sharing, and distribution. Together with this description, a generalised, consistent approach to unstructured mesh generation for geophysical models is developed that is automated, robust, repeatable, quick to draft, rigorously verified, and consistent with the source data throughout. This interprets the description above to execute a self-consistent spatial discretisation process, which is automatically validated against expected discrete characteristics and metrics.

    Comment: 18 pages, 10 figures, 1 table. Submitted for publication and under review.

    Fast Neural Network Predictions from Constrained Aerodynamics Datasets

    Incorporating computational fluid dynamics in the design process of jets, spacecraft, or gas turbine engines is often challenged by the required computational resources and simulation time, which depend on the chosen physics-based computational models and grid resolutions. An ongoing problem in the field is how to simulate these systems faster but with sufficient accuracy. While many approaches involve simplified models of the underlying physics, others are model-free and make predictions based only on existing simulation data. We present a novel model-free approach in which we reformulate the simulation problem to effectively increase the size of constrained pre-computed datasets and introduce a novel neural network architecture (called a cluster network) with an inductive bias well suited to highly nonlinear computational fluid dynamics solutions. Compared to the state of the art in model-based approximations, we show that our approach is nearly as accurate, an order of magnitude faster, and easier to apply. Furthermore, we show that our method outperforms other model-free approaches.
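    The abstract does not define the cluster network's internals, so the following is only a hedged sketch of the general idea such an architecture suggests: route each query to a local model fitted on a cluster of pre-computed simulation samples. The class name, the routing rule, and the toy local models are all illustrative assumptions, not the paper's method.

    ```python
    class PiecewiseSurrogate:
        """Hypothetical piecewise surrogate: one simple local model per
        cluster of pre-computed samples (illustrative only)."""

        def __init__(self, centers, local_models):
            self.centers = centers            # one representative input per cluster
            self.local_models = local_models  # one callable model per cluster

        def predict(self, x):
            # Route the input to the nearest cluster centre (squared
            # Euclidean distance), then evaluate that cluster's model.
            i = min(range(len(self.centers)),
                    key=lambda k: sum((a - b) ** 2
                                      for a, b in zip(x, self.centers[k])))
            return self.local_models[i](x)


    # Two toy "local models" standing in for networks trained per cluster.
    surrogate = PiecewiseSurrogate(
        centers=[(0.0, 0.0), (10.0, 10.0)],
        local_models=[lambda x: 2.0 * x[0] + x[1],
                      lambda x: x[0] - 3.0 * x[1]],
    )
    print(surrogate.predict((1.0, 1.0)))  # routed to the first local model
    ```

    The design point this illustrates is the inductive bias: a single global model must fit all flow regimes at once, whereas routing to local models lets each one specialize on a smoother piece of a highly nonlinear solution space.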

    Human Motion Trajectory Prediction: A Survey

    With growing numbers of intelligent autonomous systems in human environments, the ability of such systems to perceive, understand, and anticipate human behavior becomes increasingly important. Specifically, predicting the future positions of dynamic agents, and planning that takes such predictions into account, are key tasks for self-driving vehicles, service robots, and advanced surveillance systems. This paper provides a survey of human motion trajectory prediction. We review, analyze, and structure a large selection of work from different communities and propose a taxonomy that categorizes existing methods based on the motion modeling approach and the level of contextual information used. We provide an overview of the existing datasets and performance metrics. We discuss limitations of the state of the art and outline directions for further research.

    Comment: Submitted to the International Journal of Robotics Research (IJRR), 37 pages.
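    The simplest physics-based motion model in this problem family is constant-velocity extrapolation, commonly used as a baseline against which learned predictors are compared. The sketch below assumes 2-D positions sampled at a fixed rate; function and parameter names are illustrative, not from the survey.

    ```python
    def predict_constant_velocity(track, horizon, dt=0.1):
        """Extrapolate the last observed velocity over `horizon` future steps.

        track:   list of (x, y) positions sampled every `dt` seconds.
        horizon: number of future steps to predict.
        Returns a list of `horizon` predicted (x, y) positions.
        """
        (x0, y0), (x1, y1) = track[-2], track[-1]
        vx, vy = (x1 - x0) / dt, (y1 - y0) / dt   # finite-difference velocity
        return [(x1 + vx * dt * k, y1 + vy * dt * k)
                for k in range(1, horizon + 1)]


    # A pedestrian walking at roughly 1 m/s along x, sampled at 10 Hz.
    path = [(0.0, 0.0), (0.1, 0.0), (0.2, 0.0)]
    print(predict_constant_velocity(path, horizon=3))
    ```

    Baselines like this are what contextual information (other agents, semantic maps, goals) is meant to improve upon: constant velocity ignores interactions entirely, which is exactly where learned models gain accuracy.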

    Solving constraints within a graph based dependency model by digitising a new process of incrementally casting concrete structures

    The mechanisation of incrementally casting concrete structures can reduce the economic and environmental cost of the formwork that produces them. Low-tech versions of these forms have been designed to produce structures with cross-sectional continuity, but the design and implementation of complex adaptable formworks remains untenable for smaller projects. Addressing these feasibility issues by digitally modelling these systems is problematic: constraint solvers are the obvious method of modelling the adaptable formwork, but they cannot acknowledge the hierarchical relationships created by assembling multiple instances of the system. This thesis hypothesises that these opposing relationships may not be completely disparate, and that simple dependency relationships can be used to solve constraints if the real procedure of constructing the system is replicated digitally. The behaviour of the digital model was correlated with the behaviour of physical prototypes of the system, which were refined based on digital explorations of its possibilities. The generated output is assessed physically on the basis of its efficiency and ease of assembly, and digitally on the basis that permutations can be simply described and potentially built in reality. One of the columns generated by the thesis will be cast by the redesigned system in Lyon at the first F2F (file to factory) continuum workshop.

    Multi-resolution dimer models in heat baths with short-range and long-range interactions

    This work investigates multi-resolution methodologies for simulating dimer models. The solvent particles which make up the heat bath interact with the monomers of the dimer either through direct collisions (short-range) or through harmonic springs (long-range). Two types of multi-resolution methodologies are considered in detail: (a) describing parts of the solvent far away from the dimer by a coarser approach; (b) describing each monomer of the dimer using a model with a different level of resolution. These methodologies are then utilised to investigate the effect of a shared heat bath versus two uncoupled heat baths, one for each monomer. Furthermore, the validity of the multi-resolution methods is discussed by comparison to the dynamics of macroscopic Langevin equations.

    Comment: Submitted to Interface Focus.
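    At the coarsest resolution mentioned above, the heat bath is replaced by a Langevin description. As a hedged illustration of that limit only (parameter values and the 1-D overdamped setting are assumptions, not the paper's setup), here is a dimer of two monomers joined by a harmonic spring, each driven by friction and thermal noise from a shared bath:

    ```python
    import math
    import random

    def simulate_dimer(steps=10000, dt=1e-3, k=5.0, rest=1.0,
                       gamma=1.0, kT=1.0, seed=0):
        """Overdamped Langevin dynamics of a 1-D harmonic dimer.

        Returns the time series of the spring extension x2 - x1 - rest.
        All parameters are illustrative, not taken from the paper.
        """
        rng = random.Random(seed)
        x1, x2 = 0.0, rest                        # start at the rest length
        noise = math.sqrt(2.0 * kT * dt / gamma)  # fluctuation-dissipation scale
        ext = []
        for _ in range(steps):
            f = k * (x2 - x1 - rest)              # spring force on monomer 1
            x1 += (f / gamma) * dt + noise * rng.gauss(0.0, 1.0)
            x2 += (-f / gamma) * dt + noise * rng.gauss(0.0, 1.0)
            ext.append(x2 - x1 - rest)
        return ext

    ext = simulate_dimer()
    mean_sq = sum(e * e for e in ext) / len(ext)
    print(mean_sq)  # equipartition predicts <r^2> ~ kT/k = 0.2, up to noise
    ```

    A check of this kind, comparing sampled statistics against the equilibrium values of the macroscopic Langevin equations, is the sort of validation the abstract refers to.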

    Dynamic problems for metamaterials: Review of existing models and ideas for further research

    Metamaterials are materials especially engineered to have a peculiar physical behaviour, to be exploited for some well-specified technological application. In this context, we focus on the conception of general micro-structured continua, with particular attention to piezoelectromechanical structures, having a strong coupling between macroscopic motion and some internal degrees of freedom, which may be electric or, more generally, related to some micro-motion. An interesting class of problems in this context concerns the design of wave guides aimed at controlling wave propagation. The description of the state of the art is followed by some hints at possible research developments, in particular the design of optimal techniques for bone reconstruction, or of systems which may block wave propagation in certain frequency ranges, in both linear and non-linear regimes.

    On the use of simulated experiments in designing tests for material characterization from full-field measurements

    The present paper deals with the use of simulated experiments to improve the design of an actual mechanical test. The analysis focuses on the identification of the orthotropic properties of composites using the unnotched Iosipescu test and a full-field optical technique, the grid method. The experimental test was reproduced numerically by finite element analysis, and the recording of deformed grey-level images by a CCD camera was simulated, attempting to take into account the most significant parameters that play a role during an actual test, e.g. noise, specimen failure, the size of the grid printed on the surface, etc. The grid method was then applied to the generated synthetic images in order to extract the displacement and strain fields; the Virtual Fields Method was used to identify the material properties, and a cost function was devised to evaluate the identification error. The developed procedure was used to study different features of the test, such as the aspect ratio and fibre orientation of the specimen, the use of smoothing functions in strain reconstruction from noisy data, and the influence of missing data on the identification. Four different composite materials were considered and, for each of them, a set of optimized design variables was found by minimization of the cost function.
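    The Virtual Fields Method itself is not reproduced here; the following is only a minimal sketch of the identification-by-cost-function idea the abstract describes. A candidate stiffness is scored against synthetic "measured" strains generated from a known true value plus noise, and the cost is minimised by a simple scan. All names, values, and the scalar 1-D setting are hypothetical.

    ```python
    import random

    def make_synthetic_strains(E_true=70.0, n=50, noise=0.02, seed=1):
        """Synthetic noisy strain 'measurements' under known stress levels."""
        rng = random.Random(seed)
        stresses = [1.0 + 0.1 * i for i in range(n)]       # applied stress levels
        strains = [s / E_true + rng.gauss(0.0, noise * s / E_true)
                   for s in stresses]                       # noisy measurements
        return stresses, strains

    def cost(E, stresses, strains):
        # Squared mismatch between predicted and "measured" strains.
        return sum((s / E - e) ** 2 for s, e in zip(stresses, strains))

    stresses, strains = make_synthetic_strains()
    candidates = [60.0 + 0.5 * k for k in range(41)]        # scan 60..80
    E_hat = min(candidates, key=lambda E: cost(E, stresses, strains))
    print(E_hat)  # recovered stiffness, close to the true value of 70.0
    ```

    The same loop structure carries over to the paper's setting: the "measurements" become simulated full-field images, the forward model becomes the finite element simulation, and minimising the cost over design variables (aspect ratio, fibre orientation) optimises the test itself.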