29 research outputs found

    Support vector regression for anomaly detection from measurement histories

    Copyright © 2013 Elsevier. Author's accepted manuscript; the definitive version was published in Advanced Engineering Informatics, Vol. 27 (2013), DOI: 10.1016/j.aei.2013.03.002.
    This research focuses on the analysis of measurements from distributed sensing of structures. The premise is that ambient temperature variations, and hence the temperature distribution across the structure, have a strong correlation with structural response, and that this relationship could be exploited for anomaly detection. Specifically, this research first investigates whether support vector regression (SVR) models could be trained to capture the relationship between distributed temperature and response measurements and, subsequently, whether these models could be employed in an approach for anomaly detection. The study develops a methodology to generate SVR models that predict the thermal response of bridges from distributed temperature measurements, and evaluates its performance on measurement histories simulated using numerical models of a bridge girder. The potential use of these SVR models for damage detection is then studied by comparing their strain predictions with measurements collected from simulations of the bridge girder in damaged condition. Results show that SVR models that predict structural response from distributed temperature measurements could form the basis for a reliable anomaly detection methodology.
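    A minimal sketch of this idea, assuming scikit-learn's SVR as the regression engine and a simple residual threshold as the anomaly criterion (the kernel, hyperparameters and 3-sigma rule below are illustrative assumptions, not the paper's settings):

```python
# Sketch: train an SVR to predict strain from distributed temperature
# measurements collected in a reference period, then flag anomalies
# whenever new prediction residuals exceed a reference threshold.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def train_thermal_model(T_ref, strain_ref):
    """T_ref: (n_samples, n_temperature_sensors); strain_ref: (n_samples,)."""
    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
    model.fit(T_ref, strain_ref)
    residuals = strain_ref - model.predict(T_ref)
    threshold = 3.0 * residuals.std()   # assumed 3-sigma rule on reference residuals
    return model, threshold

def detect_anomalies(model, threshold, T_new, strain_new):
    """Flag time steps whose prediction residual exceeds the reference threshold."""
    residuals = strain_new - model.predict(T_new)
    return np.abs(residuals) > threshold
```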

    Predicting thermal response of bridges using regression models derived from measurement histories

    Copyright © 2014 Elsevier. Author's accepted manuscript; the definitive version was published in Computers and Structures, Vol. 136 (2014), DOI: 10.1016/j.compstruc.2014.01.026.
    This study investigates the application of novel computational techniques for structural performance monitoring of bridges that enable quantification of temperature-induced response during the measurement interpretation process. The goal is to support evaluation of bridge response to diurnal and seasonal changes in environmental conditions, which have widely been cited to produce large deformations that can exceed even the effects of live loads and damage. This paper proposes a regression-based methodology to generate numerical models, which capture the relationships between temperature distributions and structural response, from distributed measurements collected during a reference period. It compares the performance of regression algorithms such as multiple linear regression (MLR), robust regression (RR) and support vector regression (SVR) for application within the proposed methodology. The methodology is successfully validated on measurements collected from two structures: a laboratory truss and a concrete footbridge. Results show that the methodology is capable of accurately predicting thermal response and can therefore help with interpreting measurements from continuous bridge monitoring.
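    The comparison of regression families can be sketched with scikit-learn estimators standing in for MLR, RR and SVR (HuberRegressor is one possible robust-regression choice; the hyperparameters and error metric below are assumptions, not those of the paper):

```python
# Sketch: compare linear, robust and support vector regression for predicting
# thermal response from distributed temperature measurements.
import numpy as np
from sklearn.linear_model import LinearRegression, HuberRegressor
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

def compare_regressors(T, response):
    """T: (n_samples, n_temperature_sensors); response: (n_samples,)."""
    candidates = {
        "MLR": LinearRegression(),
        "RR":  HuberRegressor(),
        "SVR": make_pipeline(StandardScaler(), SVR(kernel="rbf")),
    }
    scores = {}
    for name, estimator in candidates.items():
        # Mean absolute prediction error estimated by 5-fold cross-validation.
        err = -cross_val_score(estimator, T, response,
                               scoring="neg_mean_absolute_error", cv=5)
        scores[name] = err.mean()
    return scores
```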

    Configuring and enhancing measurement systems for damage identification

    Engineers often decide to measure structures upon signs of damage in order to determine its extent and location. Measurement locations, sensor types and numbers of sensors are selected based on judgment and experience. Rational and systematic methods for evaluating structural performance can help make better decisions. This paper proposes strategies for supporting two measurement tasks related to structural health monitoring: (1) installing an initial measurement system and (2) enhancing measurement systems for subsequent measurements once data interpretation has occurred. The strategies are based on previous research into system identification using multiple models. A global optimization approach is used to design the initial measurement system. A greedy strategy is then used to select measurement locations with maximum entropy among candidate model predictions. Two bridges are used to illustrate the proposed methodology. First, a railway truss bridge in Zangenberg, Germany, is examined. For illustration purposes, the model space is reduced by assuming only a few types of possible damage in the truss bridge. The approach is then applied to the Schwandbach Bridge in Switzerland, where a broad set of damage scenarios is evaluated. For the truss bridge, the approach correctly identifies the damage that represents the behaviour of the structure. For the Schwandbach Bridge, the approach is able to significantly reduce the number of candidate models. Values of candidate model parameters are also useful for planning inspection and eventual repair. Funding: Swiss National Science Foundation.
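    A minimal sketch of the greedy selection step, assuming candidate-model predictions are available as a matrix; the entropy estimate uses a simple histogram, and joint information between locations is not modeled here:

```python
# Sketch: greedily pick measurement locations where predictions of the
# candidate models disagree most (maximum entropy), one location at a time.
import numpy as np

def prediction_entropy(values, n_bins=20):
    """Shannon entropy of candidate-model predictions at one location."""
    hist, _ = np.histogram(values, bins=n_bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum()

def greedy_sensor_selection(predictions, n_sensors):
    """predictions: (n_models, n_locations) array of model predictions."""
    selected = []
    remaining = list(range(predictions.shape[1]))
    for _ in range(n_sensors):
        best = max(remaining, key=lambda loc: prediction_entropy(predictions[:, loc]))
        selected.append(best)
        remaining.remove(best)
    return selected
```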

    Computational framework for remotely operable laboratories

    Decision-makers envision a significant role for remotely operable laboratories in advancing research in structural engineering, as seen from the tremendous support for the Network for Earthquake Engineering Simulation (NEES) framework. This paper proposes a computational framework that uses LabVIEW and web technologies to enable observation and control of laboratory experiments via the internet. The framework, which is illustrated for a shaketable experiment, consists of two key hardware components: (1) a local network that has an NI-PXI with hardware for measurement acquisition and shaketable control, along with a Windows-based PC that acquires images from a high-speed camera for video, and (2) a proxy server that controls access to the local network. The software for shaketable control and data/video acquisition is developed in the form of virtual instruments (VIs) using the LabVIEW development system. The proxy server employs a user-based authentication protocol to provide security for the experiment. The user can run Perl-based CGI scripts on the proxy server to schedule a future timeslot in which to control or observe the experiment, and to gain access to control or observe the experiment during that timeslot. The proxy server implements a single-controller multiple-observer architecture so that many users can simultaneously observe and download measurements while a single controller decides the waveform input into the shaketable. A provision is also created for users to simultaneously view real-time video of the experiment. Two different methods to communicate the video are studied. It is concluded that JPEG compression of the images acquired from the camera offers the best performance over a wide range of networks. The framework is accessible by a remote user with a computer that has a high-speed internet connection and the LabVIEW runtime engine, which is available at no cost to the user. Care is taken to ensure that the implementation of the LabVIEW applications and the Perl scripts has few dependencies, for ease of portability to other experiments.
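    The single-controller multiple-observer policy can be illustrated with a small access-control sketch (the actual framework is implemented with LabVIEW virtual instruments and Perl CGI scripts on the proxy server; the class and method names below are hypothetical):

```python
# Sketch: one authenticated user holds control of the shaketable at a time,
# while any number of authenticated users may observe and download data.
class ExperimentSession:
    def __init__(self):
        self.controller = None   # user id currently allowed to send commands
        self.observers = set()   # user ids receiving measurements and video

    def request_control(self, user):
        """Grant control only if no other user currently holds it."""
        if self.controller is None:
            self.controller = user
            return True
        return self.controller == user

    def release_control(self, user):
        if self.controller == user:
            self.controller = None

    def join_as_observer(self, user):
        self.observers.add(user)

    def can_send_command(self, user):
        return user == self.controller
```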

    A genetic algorithm for design of moment-resisting steel frames

    The final publication is available at Springer via http://dx.doi.org/10.1007/s00158-011-0654-7. Copyright © Springer-Verlag 2011.
    This paper presents computational approaches that can be implemented in a decision support system for the design of moment-resisting steel frames. Trade-off studies are performed using genetic algorithms to evaluate the savings due to the inclusion of the cost of connections in the optimization model. Since labor costs and material costs vary with geographical location and time of construction, the trade-off curves are computed for several values of the ratio between the cost of a rigid connection and the cost of steel. A real-life 5-bay, 5-story frame is used for illustration. Results indicate that the total cost of the frame is minimal when rigid connections are present only at certain locations. Finally, "Modeling to Generate Alternatives" (MGA) is proposed to generate good design alternatives, since the solution from optimization may not be optimal with respect to unmodeled objectives and constraints. MGA provides a set of alternatives that are near-optimal with respect to the modeled objectives and that are also far from each other in the decision space. Results show that a final design could be chosen from the set of alternatives or obtained by tinkering with one of the alternatives.
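    A sketch of the kind of cost model such a genetic algorithm could minimize, with the rigid-connection cost expressed through a connection-to-steel cost ratio so that trade-off curves can be computed for several ratio values (the design representation, section database and code checks are assumptions, not the paper's):

```python
# Sketch: cost model for a moment-resisting steel frame in which the cost of a
# rigid connection is expressed as a ratio of the unit cost of steel.
def frame_cost(steel_weight, n_rigid_connections,
               steel_cost_per_kg, connection_to_steel_cost_ratio):
    connection_cost = connection_to_steel_cost_ratio * steel_cost_per_kg
    return steel_weight * steel_cost_per_kg + n_rigid_connections * connection_cost

def objective(design, cost_ratio, steel_cost_per_kg=1.0, penalty=1e6):
    """Value to be minimized by the genetic algorithm.
    `design` is a hypothetical object exposing weight, connection count and
    code checks; infeasible designs are penalized rather than rejected."""
    cost = frame_cost(design.steel_weight, design.n_rigid_connections,
                      steel_cost_per_kg, cost_ratio)
    if not design.satisfies_code_checks():
        cost += penalty
    return cost
```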

    Multimodel structural performance monitoring

    Measurements from load tests may lead to numerical models that better reflect structural behavior. This kind of system identification is not straightforward due to important uncertainties in measurements and models. Moreover, since system identification is an inverse engineering task, many models may fit the measured behavior. Traditional model-updating methods may not provide the correct behavioral model due to uncertainty and parameter compensation. In this paper, a multimodel approach that explicitly incorporates uncertainties and modeling assumptions is described. The approach samples thousands of models starting from a general parametrized finite-element model. The population of selected candidate models may be used to understand and predict behavior, thereby improving structural management decision making. This approach is applied to measurements from structural performance monitoring of the Langensand Bridge in Lucerne, Switzerland. Predictions from the set of candidate models are homogeneous and show an average discrepancy of 4-7% from the displacement measurements. The tests demonstrate the applicability of the multimodel approach for the structural identification and performance monitoring of real structures. The multimodel approach reveals that the Langensand Bridge has a reserve capacity of 30% with respect to serviceability requirements. Funding: Swiss National Science Foundation.
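    A minimal sketch of the candidate-model selection step, assuming model predictions and measurements are available as arrays (the sampling of the parametrized finite-element model and the definition of the error thresholds are not reproduced here):

```python
# Sketch: keep as candidate models those sampled model instances whose
# predicted displacements match the measurements within given thresholds.
import numpy as np

def select_candidate_models(predictions, measurements, thresholds):
    """predictions: (n_models, n_sensors); measurements, thresholds: (n_sensors,).
    Returns indices of models that are not falsified at any sensor."""
    errors = np.abs(predictions - measurements)   # broadcast over models
    keep = np.all(errors <= thresholds, axis=1)
    return np.flatnonzero(keep)
```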

    Design of tensegrity structures using parametric analysis and stochastic search.

    The final publication is available at Springer via http://dx.doi.org/10.1007/s00366-009-0154-1.
    Tensegrity structures are lightweight structures composed of cables in tension and struts in compression. Since tensegrity systems exhibit geometrically nonlinear behavior, finding optimal structural designs is difficult. This paper focuses on the use of stochastic search for the design of tensegrity systems. A pedestrian bridge made of square hollow-rope tensegrity ring modules is studied. Two design methods are compared in this paper; both aim to find the minimal-cost solution. The first method approximates current practice in design offices. More specifically, parametric analysis that is similar to a gradient-based optimization is used to identify good designs. Parametric studies are executed for each system parameter in order to identify its influence on response. The second method uses a stochastic search strategy called Probabilistic Global Search Lausanne. Both methods provide feasible configurations that meet civil engineering criteria of safety and serviceability. Parametric studies also help in defining search parameters, such as appropriate penalty costs to enforce constraints while optimizing using stochastic search. Traditional design methods are useful to gain an understanding of structural behavior. However, due to the many local minima in the solution space, stochastic search strategies find better solutions than parametric studies. Funding: Swiss National Science Foundation.
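    A minimal sketch of the penalized-cost formulation used with stochastic search; SciPy's differential_evolution stands in here for Probabilistic Global Search Lausanne, and the analysis interface and penalty weight are assumptions:

```python
# Sketch: minimize member cost subject to safety/serviceability constraints
# enforced through penalty terms, using a generic stochastic global search.
import numpy as np
from scipy.optimize import differential_evolution

def penalized_cost(x, analyze, penalty_weight=1e4):
    """x: design variables (e.g. cable/strut sections, prestress levels).
    `analyze` is assumed to return (cost, constraint_violations >= 0)."""
    cost, violations = analyze(x)
    return cost + penalty_weight * np.sum(violations)

def design_tensegrity(analyze, bounds):
    result = differential_evolution(lambda x: penalized_cost(x, analyze),
                                    bounds=bounds, seed=0)
    return result.x, result.fun
```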

    Structural identification through continuous monitoring: data cleansing using temperature variations

    The aim of structural performance monitoring is to infer the state of a structure from measurements and thereby support decisions related to structural management. Complex structures may be equipped with hundreds of sensors that measure quantities such as temperature, acceleration and strain. However, meaningful interpretation of data collected from continuous monitoring remains a challenge. Moving principal component analysis (MPCA) is a model-free data interpretation method that compares characteristics of a moving window of measurements against those derived from a reference period. This paper explores a data cleansing approach to improve the performance of MPCA. The approach uses a smoothing procedure, a low-pass (moving average) filter, to exclude the effects of seasonal temperature variations. Consequently, MPCA can use a smaller moving window and therefore detect anomalies more rapidly. Measurements from a numerical model and a prestressed beam are used to illustrate the approach. Results show that removal of seasonal temperature effects can improve the performance of MPCA. However, the improvement may not be significant and there remains a trade-off when choosing the window size: a small window increases the risk of false positives, while a large window increases the time to detect damage.
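    A minimal sketch of the data-cleansing step combined with a moving-window principal-component statistic (the paper's exact MPCA formulation, window sizes and anomaly criterion are not given here; tracking the rotation of the first principal component is one illustrative choice):

```python
# Sketch: remove seasonal temperature effects with a moving-average low-pass
# filter, then track the first principal component over a moving window and
# compare it with the component learned during a reference period.
import numpy as np

def remove_seasonal(X, window):
    """Subtract a centered moving average from each sensor time series."""
    kernel = np.ones(window) / window
    trend = np.apply_along_axis(lambda s: np.convolve(s, kernel, mode="same"), 0, X)
    return X - trend

def first_component(X):
    X = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return vt[0]

def mpca_scores(X, window, reference_component):
    """Angle between the moving-window first component and the reference one."""
    scores = []
    for t in range(window, X.shape[0]):
        pc = first_component(X[t - window:t])
        cosine = abs(np.dot(pc, reference_component))
        scores.append(np.arccos(np.clip(cosine, -1.0, 1.0)))
    return np.array(scores)
```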

    Optimal Sensor Placement for Damage Detection: Role of Global Search
