
    Teaching old sensors new tricks: archetypes of intelligence

    In this paper, a generic intelligent-sensor software architecture is described which builds upon the basic requirements of related industry standards (IEEE 1451 and SEVA BS-7986). It incorporates specific functionalities such as real-time fault detection, drift compensation, adaptation to environmental changes, and autonomous reconfiguration. The modular structure of the intelligent sensor architecture provides enhanced flexibility in the choice of specific algorithmic realizations. In this context, the particular aspects of fault detection and drift estimation are discussed. A mixed indicative/corrective fault detection approach is proposed, and it is demonstrated that reversible/irreversible state-dependent drift can be estimated using generic algorithms such as the extended Kalman filter (EKF) or on-line density estimators. Finally, a parsimonious density estimator is presented and validated on simulated and real data for use in an operating-regime-dependent fault detection framework.
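The paper's estimators are not reproduced here, but the idea of tracking slow sensor drift with a Kalman-type filter can be sketched in a few lines. The toy example below (function name, random-walk drift model, and noise parameters are illustrative, not taken from the paper) estimates drift against a known calibration reference:

```python
import numpy as np

def estimate_drift(measurements, reference, q=1e-4, r=1e-2):
    """Scalar Kalman filter tracking slow sensor drift as a random walk.

    measurements: raw sensor readings
    reference:    known true value (e.g., from a calibration source)
    q: drift process-noise variance, r: measurement-noise variance
    """
    b, p = 0.0, 1.0              # drift estimate and its variance
    drift = []
    for y in measurements:
        p += q                   # predict: drift evolves as a random walk
        k = p / (p + r)          # Kalman gain
        b += k * ((y - reference) - b)   # update with the innovation
        p *= 1.0 - k
        drift.append(b)
    return np.array(drift)
```

With a reference value available, the filtered output converges to the slowly varying bias while averaging out the measurement noise.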

    Structural health monitoring of offshore wind turbines: A review through the Statistical Pattern Recognition Paradigm

    Offshore wind has become the most profitable renewable energy source due to the remarkable development it has experienced in Europe over the last decade. In this paper, a review of Structural Health Monitoring Systems (SHMS) for offshore wind turbines (OWT) is carried out, treating the topic as a Statistical Pattern Recognition problem. Each stage of this paradigm is therefore reviewed with a focus on OWT applications. These stages are: Operational Evaluation; Data Acquisition, Normalization and Cleansing; Feature Extraction and Information Condensation; and Statistical Model Development. It is expected that, by optimizing each stage, SHMS can contribute to the development of efficient Condition-Based Maintenance strategies. Optimizing this strategy will help reduce the labor costs of OWT inspection, avoid unnecessary maintenance, identify design weaknesses before failure, and improve the availability of power production while preventing wind turbine overloading, thereby maximizing the return on investment. In the forthcoming years, a growing interest in SHM technologies for OWT is expected, enhancing the potential of wind farm deployments further offshore. Increasing efficiency in operational management will contribute towards achieving the UK's 2020 and 2050 targets by ultimately reducing the Levelised Cost of Energy (LCOE).
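As a rough illustration of the final stage of this paradigm, Statistical Model Development, a baseline model of the healthy structure can flag abnormal feature vectors by their Mahalanobis distance. This is a generic sketch (function names and data are invented), not any specific system reviewed in the paper:

```python
import numpy as np

def fit_baseline(features):
    """Statistical model of the healthy state: mean and inverse covariance
    of feature vectors collected under normal operation."""
    mu = features.mean(axis=0)
    cov = np.cov(features, rowvar=False)
    return mu, np.linalg.inv(cov)

def damage_index(x, mu, cov_inv):
    """Mahalanobis distance of a new feature vector to the baseline;
    large values suggest a departure from the healthy condition."""
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d))
```

In practice the threshold separating "healthy" from "damaged" indices is set from the distribution of the index over baseline data.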

    E-comparisons: A new approach to round robin tests for environmental noise assessment in transport infrastructures

    The increasing importance of noise pollution has led to the creation of many new noise testing laboratories in recent years. For this reason, and due to the legal implications that noise reporting may have, it is necessary to create procedures intended to guarantee the quality of the testing and its results. For instance, the ISO/IEC 17025:2005 standard specifies general requirements for the competence of testing laboratories. In this standard, interlaboratory comparisons are one of the main measures that must be applied to guarantee the quality of laboratories when applying specific testing methodologies. In the specific case of environmental noise, round robin tests are usually difficult to design, as it is hard to find scenarios that remain available and controlled while the participants carry out the measurements. Monitoring and controlling the factors that can influence the measurements (source emissions, propagation, background noise, etc.) is not usually affordable, so the most common solution is to create very simple scenarios, where most of the factors that can influence the results are excluded (sampling, processing of results, background noise, source detection, etc.). The new approach described in this paper only requires the organizer to make actual measurements (or prepare virtual ones). Applying and interpreting a common reference document (standard, regulation, etc.), the participants must analyze these input data independently to produce results, which are then compared across participants. Measurement costs are severely reduced for the participants, there is no need to monitor the scenario conditions, and almost any relevant factor can be included in this methodology.
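Interlaboratory comparison results of this kind are conventionally scored with z-scores against an assigned value, in the style of ISO 13528. A minimal sketch, with the usual satisfactory/questionable/unsatisfactory convention; the numbers in the usage example are invented, not from the paper:

```python
import numpy as np

def zscores(results, assigned, sigma_p):
    """Score participants' reported values against the assigned value.

    z = (x - X) / sigma_p; conventionally |z| <= 2 is satisfactory,
    2 < |z| < 3 questionable, and |z| >= 3 unsatisfactory.
    """
    z = (np.asarray(results, dtype=float) - assigned) / sigma_p
    labels = np.where(np.abs(z) <= 2, "satisfactory",
             np.where(np.abs(z) < 3, "questionable", "unsatisfactory"))
    return z, labels

# Hypothetical reported day-noise levels in dB from three participants:
z, labels = zscores([60.1, 60.4, 62.8], assigned=60.2, sigma_p=0.5)
```

Here the third participant's result lies more than three standard deviations from the assigned value and would be flagged.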

    Damage identification in structural health monitoring: a brief review from its implementation to the use of data-driven applications

    The damage identification process provides relevant information about the current state of a structure under inspection, and it can be approached from two different points of view. The first approach uses data-driven algorithms, which are usually associated with the collection of data using sensors; the data are subsequently processed and analyzed. The second approach uses models to analyze information about the structure. In the latter case, the overall performance of the approach depends on the accuracy of the model and the information used to define it. Although both approaches are widely used, data-driven algorithms are preferred in most cases because they afford the ability to analyze data acquired from sensors and provide a real-time solution for decision making; however, they require high-performance processors due to their high computational cost. As a contribution to researchers working with data-driven algorithms and applications, this work presents a brief review of data-driven algorithms for damage identification in structural health monitoring applications. The review covers damage detection, localization, classification, extension, and prognosis, as well as the development of smart structures. The literature is systematically reviewed according to the natural steps of a structural health monitoring system. The review also includes information on the types of sensors used and on the development of data-driven algorithms for damage identification.
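One widely used family of data-driven detectors of the kind this review covers builds a PCA model of baseline (healthy) sensor data and flags new measurements with a large reconstruction error. A generic sketch, assuming baseline feature vectors are available; it is not any specific method from the review:

```python
import numpy as np

def fit_pca(X, n_components):
    """PCA model of baseline data: mean plus the top principal directions."""
    mu = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, vt[:n_components]

def spe(x, mu, components):
    """Squared prediction error (Q-statistic): the energy of x lying
    outside the baseline subspace. Large values suggest damage."""
    d = x - mu
    recon = components.T @ (components @ d)   # projection onto the model
    return float(np.sum((d - recon) ** 2))
```

A detection threshold on the SPE is typically set from its empirical distribution over held-out baseline data.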

    Challenges, limitations, and measurement strategies to ensure data quality in deep-sea sensors

    In this paper we give an overview of factors and limitations impairing deep-sea sensor data, and we show how automatic tests can give sensors self-validation and self-diagnostic capabilities. This work is intended to lay a basis for the sophisticated use of smart sensors in long-term autonomous operation at remote deep-sea locations. Deep-sea observation relies on data from sensors operating in remote, harsh environments which may affect sensor output if uncorrected. In addition to the environmental impact, sensors are subject to limitations regarding power, communication, and recalibration. To obtain long-term measurements of larger deep-sea areas, fixed-platform sensors on the ocean floor may be deployed for several years. As for any observation system, data collected by deep-sea observation equipment are of limited use if the quality or accuracy (closeness of agreement between the measurement and the true value) is not known. If data from a faulty sensor are used directly, this may result in an erroneous understanding of deep-water conditions, or important changes or conditions may go undetected. Faulty sensor data may significantly weaken the overall quality of the combined data from several sensors or any derived model. This is particularly an issue for wireless sensor networks covering large areas, where the overall measurement performance of the network is highly dependent on the data quality of individual sensors. Existing quality control manuals and best-practice initiatives typically recommend a selection of (near) real-time automated checks. These are mostly limited to basic and straightforward verification of metadata and data format, and data value or transition checks against pre-defined thresholds. Delayed-mode inspection is often recommended before a final data quality stamp is assigned.
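Two of the basic automated checks mentioned above, a gross range test and a spike test, can be sketched as follows. The flag values follow the common 1 = good / 4 = bad convention used in oceanographic QC manuals; the thresholds are illustrative, not from the paper:

```python
def qc_flags(series, lo, hi, spike_limit):
    """Near-real-time checks on a measurement series.

    Gross range test: value must lie within [lo, hi].
    Spike test: value must not deviate from the mean of its two
    neighbors by more than spike_limit.
    Returns one flag per sample: 1 = good, 4 = bad.
    """
    flags = []
    for i, v in enumerate(series):
        flag = 1
        if not (lo <= v <= hi):
            flag = 4                                   # gross range failure
        elif 0 < i < len(series) - 1:
            spike = abs(v - (series[i - 1] + series[i + 1]) / 2)
            if spike > spike_limit:
                flag = 4                               # spike failure
        flags.append(flag)
    return flags
```

Real QC chains apply many more tests (stuck value, rate of change, climatology), but they share this per-sample flagging structure.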

    Feature-based underwater localization using an imaging sonar

    The ability of an AUV to locate itself in an environment, as well as to detect relevant environmental features, is of key importance for navigation success. Sonars are one of the most common sensing devices for underwater localization and mapping, being used to detect and identify underwater structural features. This study explores the processing and analysis of acoustic images acquired by a mechanical scanning imaging sonar in order to extract relevant environmental features that enable location estimation. For this purpose, the performance of different state-of-the-art feature extraction algorithms was evaluated. Furthermore, an improvement to the feature matching step is proposed in order to adapt this procedure to the characteristics of acoustic images. The extracted features are then used to feed a location estimator composed of a Simultaneous Localization and Mapping algorithm implementing an Extended Kalman Filter. Several tests were performed in a structured environment, and the results of the feature extraction process and localization are presented.
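A standard building block of the feature matching step is brute-force descriptor matching with Lowe's ratio test, which an adaptation to acoustic imagery would refine. A generic sketch with invented toy descriptors, not the paper's proposed procedure:

```python
import numpy as np

def ratio_match(desc_a, desc_b, ratio=0.8):
    """Brute-force matching with Lowe's ratio test: accept a match only
    when the best distance is clearly smaller than the second best,
    which rejects ambiguous correspondences."""
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches
```

The surviving correspondences would then feed the EKF-SLAM location estimator as landmark observations.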

    Self-consistent bathymetric mapping from robotic vehicles in the deep ocean

    Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy at the Massachusetts Institute of Technology and Woods Hole Oceanographic Institution, June 2005. Obtaining accurate and repeatable navigation for robotic vehicles in the deep ocean is difficult and consequently a limiting factor when constructing vehicle-based bathymetric maps. This thesis presents a methodology to produce self-consistent maps and simultaneously improve vehicle position estimation by exploiting accurate local navigation and utilizing terrain-relative measurements. It is common for errors in the vehicle position estimate to far exceed the errors associated with the acoustic range sensor. This disparity creates inconsistency when an area is imaged multiple times and causes artifacts that distort map integrity. Our technique utilizes small terrain "submaps" that can be pairwise registered and used to additionally constrain the vehicle position estimates in accordance with actual bottom topography. A delayed-state Kalman filter is used to incorporate these submap registrations as relative position measurements between previously visited vehicle locations. The archiving of previous positions in the filter state vector allows for continual adjustment of the submap locations. The terrain registration is accomplished using a two-dimensional correlation and a six-degree-of-freedom point cloud alignment method tailored to bathymetric data. The complete bathymetric map is then created from the union of all submaps that have been aligned in a consistent manner. Experimental results from the fully automated processing of a multibeam survey over the TAG hydrothermal structure at the Mid-Atlantic Ridge are presented to validate the proposed method. This work was funded by the CenSSIS ERC of the National Science Foundation under grant EEC-9986821 and in part by the Woods Hole Oceanographic Institution through a grant from the Penzance Foundation.
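The two-dimensional correlation used for submap registration can be illustrated with FFT-based phase correlation, which recovers the integer grid offset between two overlapping height maps. This is a generic sketch under the simplifying assumption of a pure circular shift, not the thesis's full six-degree-of-freedom alignment:

```python
import numpy as np

def register_offset(map_a, map_b):
    """Estimate the (row, col) shift such that map_b is approximately
    np.roll(map_a, shift), via phase correlation: the normalized
    cross-power spectrum inverse-transforms to a peak at the offset."""
    cross = np.fft.fft2(map_b) * np.conj(np.fft.fft2(map_a))
    cross /= np.abs(cross) + 1e-12        # keep only the phase difference
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap shifts larger than half the grid into negative offsets
    return tuple(int(p) if p <= s // 2 else int(p - s)
                 for p, s in zip(peak, corr.shape))
```

In a full pipeline, such pairwise offsets become the relative position measurements fed to the delayed-state Kalman filter.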

    First Season QUIET Observations: Measurements of Cosmic Microwave Background Polarization Power Spectra at 43 GHz in the Multipole Range 25 ≤ ℓ ≤ 475

    The Q/U Imaging ExperimenT (QUIET) employs coherent receivers at 43 GHz and 94 GHz, operating on the Chajnantor plateau in the Atacama Desert in Chile, to measure the anisotropy in the polarization of the cosmic microwave background (CMB). QUIET primarily targets the B modes from primordial gravitational waves. The combination of these frequencies gives sensitivity to foreground contributions from diffuse Galactic synchrotron radiation. Between 2008 October and 2010 December, over 10,000 hr of data were collected, first with the 19-element 43 GHz array (3458 hr) and then with the 90-element 94 GHz array. Each array observes the same four fields, selected for low foregrounds, together covering ≈1000 deg^2. This paper reports initial results from the 43 GHz receiver, which has an array sensitivity to CMB fluctuations of 69 μK√s. The data were extensively studied with a large suite of null tests before the power spectra, determined with two independent pipelines, were examined. Analysis choices, including data selection, were modified until the null tests passed. Cross-correlating maps with different telescope pointings is used to eliminate a bias. This paper reports the EE, BB, and EB power spectra in the multipole range ℓ = 25-475. With the exception of the lowest multipole bin for one of the fields, where a polarized foreground, consistent with Galactic synchrotron radiation, is detected with 3σ significance, the E-mode spectrum is consistent with the ΛCDM model, confirming the only previous detection of the first acoustic peak. The B-mode spectrum is consistent with zero, leading to a measurement of the tensor-to-scalar ratio of r = 0.35^(+1.06)_(–0.87). The combination of a new time-stream "double-demodulation" technique, side-fed Dragonian optics, natural sky rotation, and frequent boresight rotation leads to the lowest level of systematic contamination in the B-mode power so far reported, below the level of r = 0.1.