
    Structural health monitoring: a data-driven damage detection approach

    Despite their importance, many Structural Health Monitoring (SHM) systems still rely on human inspection to verify the condition of the structure under analysis. The present work therefore focuses on creating an intelligent system able to detect damage automatically. The system is based on a real-life structure and on data collected in both its undamaged and damaged states. Two SHM approaches are proposed. First, explanatory models supported by machine learning algorithms (linear regression, random forest, support vector machines and neural networks) are used to predict the values of the monitored physical properties under normal conditions. By comparing predicted and observed values, a potentially abnormal condition of the structure is detected by means of a Hotelling T2 control chart. In the second approach, a time series analysis is adopted, using the cointegration properties of the series to compute the relationships between the monitored variables. These relationships are tracked with an X-bar control chart, where a potential change in a relationship indicates the presence of damage. Both approaches proved capable of detecting damage only when damage was actually present and, once damage had been induced in the structure, both signalled an anomaly within 24 hours. These results support the view that SHM systems are a relevant tool for the decision-makers in charge of monitoring the condition of structures.
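The residual-monitoring step of the first approach can be sketched in Python as follows. This is a minimal illustration, not the authors' implementation: the three-channel residuals, the sample sizes and the F-approximation control limit are all assumptions made for the sake of the example.

```python
import numpy as np
from scipy import stats

def fit_t2_chart(baseline, alpha=0.01):
    """Estimate mean/covariance from healthy-state residuals and return an
    F-approximation upper control limit for future observations."""
    n, p = baseline.shape
    mean = baseline.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(baseline, rowvar=False))
    ucl = p * (n - 1) * (n + 1) / (n * (n - p)) * stats.f.ppf(1 - alpha, p, n - p)
    return mean, inv_cov, ucl

def t2_score(x, mean, inv_cov):
    """Hotelling T2 statistic of each row of x against the baseline model."""
    d = np.atleast_2d(x) - mean
    return np.einsum('ij,jk,ik->i', d, inv_cov, d)

rng = np.random.default_rng(42)
healthy = rng.normal(size=(500, 3))            # stand-in for model residuals
mean, inv_cov, ucl = fit_t2_chart(healthy)
in_control = t2_score(healthy, mean, inv_cov)  # mostly below the limit
shifted = t2_score(healthy[:50] + [5.0, 0.0, 0.0], mean, inv_cov)  # mock "damage"
```

A point whose T2 statistic exceeds the control limit flags a potentially abnormal condition, mirroring the predicted-versus-observed comparison described in the abstract.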

    On Nonstationarity from Operational and Environmental Effects in Structural Health Monitoring Bridge Data

    Structural Health Monitoring (SHM) describes a set of activities that can be followed in order to collect data from an existing structure, generate data-based information about its current condition, identify the presence of any signs of abnormality and forecast its future response. These activities include, among others, instrumentation, data acquisition, processing, generation of diagnostic tools, and transmission of information to engineers, owners and authorities. SHM and, more specifically, continuous monitoring can provide numerous measures, which can be broadly classified into three categories: vibration-based, which includes natural frequencies, mode shapes and damping ratios; component-based, such as strains, tensions and deflections; and environmental and operational variations (EOVs), associated with temperature, wind, traffic, humidity and others. One of the main technical problems that SHM has to tackle is that of data normalisation. In abstract terms, this describes the impact that EOVs can have on SHM measures. In many cases, with interest placed on bridges, it has been observed that EOVs introduce nonstationarity into SHM signals that can mask the variability associated with the presence of damage, making damage detection difficult. Hence, it is desirable to quantify the impacts of EOVs on damage-sensitive features and project them out, using methods such as cointegration, Principal Component Analysis (PCA) or others, in order to achieve a stationary signal. This type of signal can be assessed over time using tools such as statistical process control (SPC) charts to identify the existence of novelty, which can be linked with damage. It is therefore important to detect the presence of nonstationarity in SHM signals and identify its sources. However, this is not a straightforward procedure, and one important question that needs to be answered is: how can one judge whether a signal is stationary or not?
In this work, this question is discussed, focusing on the definition of weak stationarity and the assumptions under which this judgement holds. In particular, the data coming from SHM are finite samples. Therefore, the mean and variance of a signal can be tracked using a sequence of moving windows, which requires a prior determination of the window width. The major concern, however, is that SHM signals can be characterised as periodically-correlated, or cyclostationary. In such cases, it is better to use more advanced statistical tools to assess a signal's nonstationarity. More specifically, nonstationarity tests from the context of econometrics and time-series analysis can be employed. In order to use such proxies more extensively, one should build trust in their indications by understanding the mechanism under which they perform. This work concentrates on the Augmented Dickey-Fuller (ADF) nonstationarity test, and emphasis is placed on the hypothesis (unit root) under which it performs its assessment. In brief, a series of simulations is generated and, based on dimensional analysis, it is shown that the ADF test essentially counts the number of cycles/periods of the dominant periodic component. Its indications depend on the number of observations/cycles, the normalised frequency of the signal, the sampling rate and the signal-to-noise ratio (SNR). The most important conclusion is that, knowing the sampling frequency of any given signal, a critical frequency in Hz can be derived from the critical normalised one, as a function of the number of cycles, and used directly to judge whether the signal is stationary or not. In other words, this investigation answers the question: after how many cycles of continuous monitoring (i.e. days) can an SHM signal be judged as stationary? As well as considering nonstationarity in a general way, this thesis returns to the main issue of data normalisation.
To begin with, a laboratory test is performed in the laboratory (Jonas lab) of Sheffield University, on an aluminium truss bridge model manufactured there. In particular, this involved vibration analysis of the truss bridge inside an environmental chamber, which simulated varying temperature conditions from -10 to 20 degrees Celsius, while damage was introduced by the removal of bolts and connecting brackets at two locations of the model. This experiment provided interesting results with which to discuss further the impact of EOVs on data from the monitoring of a small-scale structure. The thesis then discusses the use of Johansen's approach to cointegration in the context of SHM, demonstrates its use on the laboratory truss bridge data and provides a review of the available methods that can be used to monitor the cointegration residual. The latter is the stationary signal provided by cointegration, which is free from EOVs and suitable for novelty detection. The methodologies reviewed are various SPC charts, and the use of the ADF test is also explored, with extensive discussion. Furthermore, an important conclusion from the SHM literature is that the impact of EOVs on SHM signals can occur on widely disparate time scales. Therefore, the quantification and elimination of these impacts is not an easy procedure, and prior knowledge is needed. For such purposes, refined means originating from the field of signal processing can be used within SHM. Of particular interest here is the concept of multiresolution analysis (MRA), which has been used in SHM to decompose a given signal into its frequency components (different time scales) and evaluate the damage sensitivity of each one, employing Johansen's approach to cointegration, which is able to project out the impact of EOVs from multiple SHM series. A more principled way to perform MRA is proposed here, in order to decompose SHM signals, by introducing two additional steps.
The first step is the ADF test, which can be used to assess each of the MRA levels in terms of nonstationarity. In this way, a critical decomposition level (L*) can be found and used to decompose the original SHM signal into a non-stationary and a stationary part. The second step introduces the use of autocorrelation functions (ACFs) to test the stationary MRA levels and identify those that can be considered delta-correlated. These levels can be used to form a noisy component within the stationary one. Assuming that all the aforementioned steps are confirmed, the original signal can then be decomposed into a stationary, a mean, a non-stationary and a noisy component. The proposed decomposition can be of great interest not only for SHM purposes but also in the general context of time-series analysis, as it provides a principled way to perform MRA. The proposed analysis is demonstrated on natural frequency and temperature data from the Z24 Bridge. All in all, the thesis tries to answer the following questions: 1) How can an SHM signal be judged as non-stationary or stationary, and under which assumptions? 2) After how many cycles of continuous monitoring does an SHM signal that is initially non-stationary become stationary? 3) What are the main drivers of this nonstationarity (i.e. EOVs, abnormality/damage or others)? 4) How can one distinguish the effect of EOVs from that of abnormality/damage? 5) How can one project out the confounding influence of EOVs from an SHM signal and provide a signal that is suitable for novelty detection? 6) Is it possible to decompose an SHM signal and study each of its components separately? 7) Which of these components are mostly affected by EOVs, which by damage, and which do not include important information in terms of novelty detection?
Understanding and answering all the aforementioned questions can help in identifying signals that can be monitored over time or in data windows, ensuring that stationarity is achieved, and employing methodologies such as statistical process control (SPC) for novelty detection.

    Is there a Natural Rate of Crime?

    Studies in the economics-of-crime literature have reached mixed conclusions on the deterrence hypothesis. One explanation offered for the failure to find evidence of a deterrent effect in the long run is the natural rate of crime. This paper applies the univariate Lagrange Multiplier (LM) unit root test with one and two structural breaks to crime series for the United Kingdom and the United States, and the panel LM unit root test with and without a structural break to crime rates for a panel of G7 countries, to examine whether there is a natural rate of crime. Our main finding is that when we allow for two structural breaks in the LM unit root test and a structural break in the panel data unit root test, there is strong evidence of a natural rate of crime. The policy implication of our findings is that governments should focus on altering the economic and social structural profile which determines crime in the long run, rather than increasing expenditure on law enforcement, which will at best reduce crime rates in the short run.
    Keywords: natural rate of crime; deterrence hypothesis; unit root.

    Confounding factors analysis and compensation for high-speed bearing diagnostics

    In recent years, machine diagnostics through vibration monitoring has gained increasing interest. Indeed, many advanced techniques are available in the literature to disclose fault establishment as well as damage type, location and severity. Unfortunately, these high-level algorithms are generally not robust to operational and environmental variables, restricting the field of applicability of machine diagnostics. In fact, most industrial machines work with variable loads, at variable speeds and in uncontrolled environments, so that the measured signals are often non-stationary. The very common time-series features based on statistical moments (such as root mean square, skewness, kurtosis, peak value and crest factor) undergo variations related to changes in the machine's operational parameters (e.g. speed, load, ...) or in the environmental parameters (e.g. temperature, humidity, ...), which can be seen as non-measured, and hence latent, confounding factors with respect to the health information of interest. To address this issue, statistical techniques such as (in a first exploratory stage) Principal Component Analysis or Factor Analysis are available. The pursuit of features insensitive to these factors can also be tackled by exploiting the cointegration property of non-stationary signals. In this paper, the most common methods for reducing the influence of latent factors are considered and applied to investigate the data collected on the rig available at the DIRG laboratory, specifically conceived to test high-speed aeronautical bearings, monitoring vibrations by means of two tri-axial accelerometers while controlling the rotational speed (0-30000 RPM) and the radial load (0 to 1800 N) and recording the lubricant oil temperature.
The compensation scheme is based on two procedures which are well established in univariate analyses but not so well documented in multivariate cases: the removal of deterministic trends by subtraction of a regression, and the removal of stochastic trends in difference-stationary series by subtraction of the one-step-ahead prediction from an autoregressive model. The extension of these methods to the multivariate case is analysed here to find an effective way of enhancing damage patterns when the masking effect due to the non-stationarities induced by latent factors is strong.
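The two univariate procedures named above can be sketched in a few lines of NumPy. This is an illustrative mock-up with invented speed and temperature confounders, not the DIRG rig data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
speed = np.linspace(1000.0, 30000.0, n)              # measured confounder (RPM)
temp = 20.0 + 0.001 * speed + rng.normal(0, 0.5, n)  # lubricant temperature (mock)
rms = 1.0 + 2e-5 * speed + rng.normal(0, 0.05, n)    # feature with a deterministic trend

# (1) deterministic-trend removal: regress the feature on the measured
# confounders and keep the residual as the compensated feature.
X = np.column_stack([np.ones(n), speed, temp])
beta, *_ = np.linalg.lstsq(X, rms, rcond=None)
resid_det = rms - X @ beta

# (2) stochastic-trend removal: fit an AR(1) by least squares and
# subtract its one-step-ahead prediction.
walk = np.cumsum(rng.normal(0, 0.1, n))              # difference-stationary feature
phi = np.linalg.lstsq(walk[:-1, None], walk[1:], rcond=None)[0][0]
resid_sto = walk[1:] - phi * walk[:-1]
```

In both cases the residual is the compensated, roughly stationary feature; the paper's contribution is the extension of this idea to the multivariate setting.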

    The Relationship Between Geopolitical Risk and Credit Default Swap Premium: Evidence from Turkey

    This study investigates the relationship between the CDS premium and the geopolitical risk in Turkey arising from the war and terror incidents that occurred in the region during the period 2003:01-2020:06. A two-step approach is undertaken for this assessment, in which an ARDL bounds test and then a time-varying symmetric and asymmetric causality test are applied to study the possible causality across subperiods. The ARDL bounds test does not reject the hypothesis that there is a cointegrating relationship between the CDS premium and the geopolitical risk index. In addition, the time-varying symmetric and asymmetric test also identifies causality between the CDS premium and geopolitical risk, and establishes periods where the latter influences the former both positively and negatively. In summary, both the ARDL bounds test and the time-varying symmetric and asymmetric test indicate a causal relationship between the studied variables.

    Damage assessment of pre-stressed structures: A SVD-based approach to deal with time-varying loading

    Vibration-based methods are well-established and effective tools to assess the health state of civil, mechanical and aerospace engineering structures. However, their reliability is still affected by the variability of the features commonly used for damage detection. Environmental effects and changes in operational conditions are the main sources of variability in the structural response. As a consequence, the modal identification used to extract damage-sensitive features has to face strict requirements in terms of signal stationarity and performance accuracy. Moreover, with reference to damage assessment, large variations of the monitored features mask the subtle effects due to damage, which remain undetected. This study is conceived to address both these issues by focusing, in particular, on the non-stationarity of the loading conditions of tensioned structures, such as cables and pre-stressed beams. The capability of spectral methods to deal with the modal identification of non-stationary systems is enhanced by a curve-fitting procedure based on nonlinear least-squares optimization. Wavelet analysis is applied for comparison and validation of the FFT-based technique. The identified natural frequencies are then used for damage detection, exploiting the capacity of singular value decomposition to discriminate between damage-related events and the intrinsically non-stationary nature of the structural response. A reduced-order realization of the feature set is performed to amplify changes not belonging to measurement variability but deriving from exogenous events, such as damage. The proposed methodology is validated by experimental analyses carried out on beams subjected to time-varying loading conditions in order to simulate the health monitoring of quasi- and non-stationary systems.
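The SVD idea — separating load-driven variability from damage-related changes in a feature matrix — can be mocked up as follows. This is a hedged sketch with an invented four-mode natural-frequency matrix, not the experimental beam data or the authors' reduced-order realization.

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 400, 4
load = np.sin(np.linspace(0.0, 8.0 * np.pi, n))  # slowly varying tension (mock EOV)
sens = np.array([1.0, 0.8, 0.6, 0.4])            # per-mode sensitivity to the load
freqs = 5.0 + 3.0 * np.arange(p) + np.outer(load, sens) + rng.normal(0, 0.01, (n, p))
freqs[300:] -= 0.1                               # small frequency drop after damage

Xc = freqs - freqs[:300].mean(axis=0)            # centre on the healthy records
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
resid = Xc - (U[:, :1] * s[:1]) @ Vt[:1]         # strip the load-driven component
index = np.abs(resid).mean(axis=1)               # simple per-record damage index
```

Because the dominant singular component captures the time-varying load, the residual amplifies the small damage-induced shift that the raw features would mask.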

    On Nonlinear Cointegration Methods for Structural Health Monitoring

    Structural health monitoring (SHM) is emerging as a crucial technology for the assessment and management of important assets in various industries. Thanks to rapid developments in sensing technology and computing machines, large amounts of sensor data are now much easier and cheaper to obtain from monitored structures, which has enabled data-driven methods to become the main workhorses of real-world SHM systems. However, SHM practitioners soon discover a major problem for in-service SHM systems: the effect of environmental and operational variations (EOVs). Most assets (bridges, aircraft engines, wind turbines) are so important that they are too costly to be isolated for testing and examination purposes. Often, their structural properties are heavily influenced by ambient environmental and operational conditions, or EOVs. So, the most important question raised for an effective SHM system is: how can one tell whether an alarm signal comes from structural damage or from EOVs? Cointegration, a method originating from econometric time series analysis, has proven to be one of the most promising approaches to address this question. Cointegration is a property of nonstationary time series: it models the long-run relationship among multiple nonstationary series. The idea of employing the cointegration method in the SHM context relies on the fact that this long-run relationship is immune to the changes caused by EOVs but no longer stands when damage occurs. The work in this thesis aims to strengthen and extend conventional linear cointegration methods to a nonlinear context, by hybridising cointegration with machine learning and time series models. Three contributions are presented in this thesis. The first part concerns a nonlinear cointegration method based on Gaussian process (GP) regression.
Instead of using a linear regression, this part attempts to establish a nonlinear cointegrating regression with a GP. GP regression is a powerful Bayesian machine learning approach that can produce probabilistic predictions and avoid overfitting. The proposed method is tested on one simulated case study and on the Z24 Bridge SHM data. The second part concerns the development of a regime-switching cointegration approach. Instead of modelling nonlinear cointegration as a smooth function, this part treats cointegration as a piecewise-linear function triggered by some external variable. The model is trained with the aid of the augmented Dickey-Fuller (ADF) test statistics. Two case studies are presented in this part: one simulated multi-degree-of-freedom system and, again, the Z24 Bridge data. The third part of this work introduces a cointegration method for heteroscedastic data. Heteroscedasticity, or time-dependent noise, is often observed in SHM data, normally caused by seasonal variations. To address this issue, the TBATS model (an acronym for the key features of the model: Trigonometric seasonality, Box-Cox transformation, ARMA errors, Trend, Seasonal components) is employed to decompose the seasonally-corrupted time series, followed by conventional cointegration analysis. A simulated cantilever beam and real measurement data from the NPL Bridge are used to validate the proposed method.

    Automated Discovery in Econometrics

    Our subject is the notion of automated discovery in econometrics. Advances in computer power, electronic communication and data collection processes have all changed the way econometrics is conducted. These advances have helped to elevate the status of empirical research within the economics profession in recent years, and they now open up new possibilities for empirical econometric practice. Of particular significance is the ability to build econometric models in an automated way according to an algorithm of decision rules that allows for (what we call here) heteroskedasticity and autocorrelation robust (HAR) inference. Computerized search algorithms may be implemented to seek out suitable models, thousands of regressions and model evaluations may be performed in seconds, statistical inference may be automated according to the properties of the data, and policy decisions can be made and adjusted in real time with the arrival of new data. We discuss some aspects and implications of these exciting, emergent trends in econometrics.
    Keywords: automation; discovery; HAC estimation; HAR inference; model building; online econometrics; policy analysis; prediction; trends.
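A toy version of such an automated specification search — exhaustively scoring candidate regressor sets by an information criterion — might look like this. This is a hedged illustration: the data-generating process and the use of BIC are the example's assumptions, not the paper's procedure.

```python
import itertools
import numpy as np

rng = np.random.default_rng(11)
n, k = 200, 6
X = rng.normal(size=(n, k))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(size=n)  # true model uses x0, x3

def bic(subset):
    """OLS on the chosen columns; BIC = n*log(RSS/n) + n_params*log(n)."""
    Xs = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    rss = np.sum((y - Xs @ beta) ** 2)
    return n * np.log(rss / n) + Xs.shape[1] * np.log(n)

# Exhaustive search over all 2**6 - 1 candidate specifications.
candidates = [c for r in range(1, k + 1) for c in itertools.combinations(range(k), r)]
best = min(candidates, key=bic)
```

Even this brute-force loop evaluates dozens of regressions in a fraction of a second, which is the scale effect the abstract highlights; real automated-discovery algorithms add principled decision rules and HAR inference on top.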

    Energy Consumption and Economic Growth: Parametric and Non-Parametric Causality Testing for the Case of Greece

    The objective of this paper is to contribute to the understanding of the linear and non-linear causal linkages between total energy consumption and economic activity, making use of annual time series for Greece over the period 1960-2008. Our study has two salient features: first, total energy consumption has been adjusted for qualitative differences among its constituent components through the thermodynamics of energy conversion. In doing so, we rule out the possibility of misleading inference due to aggregation bias. Second, the investigation of the causal linkage between economic growth and quality-adjusted total energy consumption is conducted within a non-linear context. Our empirical results reveal significant unidirectional linear and non-linear causal linkages running from total useful energy to economic growth. These findings may provide valuable information for the design of more effective energy policies with respect to both the consumption of energy and environmental protection.