
    Assembly-Based Vulnerability of Buildings and Its Use in Performance Evaluation

    Assembly-based vulnerability (ABV) is a framework for evaluating the seismic vulnerability and performance of buildings on a building-specific basis. It utilizes damage to individual building components and accounts for the building's seismic setting, structural and nonstructural design, and use. A simulation approach to implementing ABV first applies a ground motion time history to a structural model to determine structural response. The response is applied to assembly fragility functions to simulate damage to each structural and nonstructural element in the building, and to its contents. Probabilistic construction cost estimation and scheduling are used to estimate repair cost and loss-of-use duration as random variables. ABV also provides a framework for accumulating post-earthquake damage observations in a statistically systematic and consistent manner. The framework and simulation approach are novel in that they are fully probabilistic, address damage at a highly detailed and building-specific level, and do not rely extensively on expert opinion. ABV is illustrated using an example pre-Northridge welded-steel-moment-frame office building.
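
    The simulation chain described above (ground motion, structural response, fragility, repair cost) lends itself to a compact Monte Carlo illustration. The sketch below is a minimal example, not the paper's implementation: the assembly names, lognormal fragility parameters, repair-cost figures, and the drift demand standing in for the structural-analysis output are all hypothetical.

    ```python
    # Minimal ABV-style Monte Carlo sketch. All assemblies, fragility parameters,
    # and repair-cost figures below are hypothetical placeholders.
    import numpy as np
    from scipy.stats import lognorm

    rng = np.random.default_rng(0)

    # assembly name -> (median drift at damage, dispersion, mean repair cost, cost c.o.v.)
    assemblies = {
        "partition_wall":    (0.005, 0.5,  2_000.0, 0.4),
        "suspended_ceiling": (0.008, 0.6,  1_500.0, 0.4),
        "moment_connection": (0.020, 0.4, 25_000.0, 0.3),
    }

    def simulate_repair_cost(drift_demand, n_sims=10_000):
        """Simulate total repair cost for one level of interstory drift demand."""
        total = np.zeros(n_sims)
        for median, beta, mean_cost, cov in assemblies.values():
            # Lognormal fragility: probability the assembly is damaged at this demand.
            p_damage = lognorm.cdf(drift_demand, s=beta, scale=median)
            damaged = rng.random(n_sims) < p_damage
            # Probabilistic repair cost for damaged assemblies (lognormal cost model).
            sigma = np.sqrt(np.log(1.0 + cov**2))
            mu = np.log(mean_cost) - 0.5 * sigma**2
            total += damaged * rng.lognormal(mu, sigma, n_sims)
        return total

    costs = simulate_repair_cost(drift_demand=0.015)
    print(f"mean repair cost: {costs.mean():,.0f}  90th percentile: {np.percentile(costs, 90):,.0f}")
    ```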

    Embedding damage detection algorithms in a wireless sensing unit for operational power efficiency

    A low-cost wireless sensing unit is designed and fabricated for deployment as the building block of wireless structural health monitoring systems. Finite operational lives of portable power supplies, such as batteries, necessitate optimization of the wireless sensing unit design to attain overall energy efficiency. This conflicts with the need for wireless radios with far-reaching communication ranges, which require significant amounts of power. As a result, a penalty is incurred by transmitting raw time-history records using scarce system resources such as battery power and bandwidth. Alternatively, a computational core that can accommodate local processing of data is designed and implemented in the wireless sensing unit. The role of the computational core is to perform interrogation tasks on collected raw time-history data and to transmit via the wireless channel the analysis results rather than time-history records. To illustrate the ability of the computational core to execute such embedded engineering analyses, a two-tiered time-series damage detection algorithm is implemented as an example. Using a lumped-mass laboratory structure, local execution of the embedded damage detection method is shown to save energy by avoiding utilization of the wireless channel to transmit raw time-history data.
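
    The abstract does not detail the two-tiered algorithm itself, so the following is only a generic sketch of the kind of feature such a node could compute locally: fit an autoregressive model to a baseline acceleration record, then report how much the one-step prediction error grows on newly measured data, so that a single scalar (rather than the raw time history) is transmitted. The signals and model order below are placeholders.

    ```python
    # Generic embedded damage-feature sketch (not the paper's exact algorithm):
    # fit an AR model to a baseline record, compare prediction-residual size on new data.
    import numpy as np

    def fit_ar(x, order=10):
        """Least-squares fit of AR coefficients to a 1-D acceleration time history."""
        X = np.column_stack([x[order - k - 1 : len(x) - k - 1] for k in range(order)])
        coeffs, *_ = np.linalg.lstsq(X, x[order:], rcond=None)
        return coeffs

    def residual_std(x, coeffs):
        """Standard deviation of one-step-ahead residuals under a given AR model."""
        order = len(coeffs)
        X = np.column_stack([x[order - k - 1 : len(x) - k - 1] for k in range(order)])
        return (x[order:] - X @ coeffs).std()

    # Hypothetical records standing in for measured accelerations.
    rng = np.random.default_rng(1)
    baseline = rng.standard_normal(2000)
    new_record = 1.5 * rng.standard_normal(2000)  # stand-in for a changed structural response

    coeffs = fit_ar(baseline)
    feature = residual_std(new_record, coeffs) / residual_std(baseline, coeffs)
    print(f"damage feature (residual-error ratio): {feature:.2f}")  # only this scalar is transmitted
    ```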

    Benefit-Cost Analysis of FEMA Hazard Mitigation Grants

    Mitigation ameliorates the impact of natural hazards on communities by reducing loss of life and injury, property and environmental damage, and social and economic disruption. The potential to reduce these losses brings many benefits, but every mitigation activity has a cost that must be considered in our world of limited resources. In principle, benefit-cost analysis (BCA) can be used to assess a mitigation activity’s expected net benefits (discounted future benefits less discounted costs), but in practice this often proves difficult. This paper reports on a study that refined BCA methodologies and applied them to a national statistical sample of FEMA mitigation activities over a ten-year period for earthquake, flood, and wind hazards. The results indicate that the overall benefit-cost ratio for FEMA mitigation grants is about 4 to 1, though the ratio varies according to hazard and mitigation type.
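
    As a rough illustration of the discounting arithmetic behind such a ratio, the sketch below compares a hypothetical stream of avoided annual losses, discounted to present value, against an up-front mitigation cost. All figures and the discount rate are invented, not values from the study.

    ```python
    # Toy benefit-cost calculation with invented numbers: one up-front mitigation cost
    # versus a constant stream of expected annual loss reductions, discounted to present value.
    def present_value(annual_amount, rate, years):
        """Present value of a constant annual amount received at the end of each year."""
        return sum(annual_amount / (1.0 + rate) ** t for t in range(1, years + 1))

    mitigation_cost = 100_000.0          # up-front cost (hypothetical)
    expected_annual_benefit = 12_000.0   # expected avoided losses per year (hypothetical)
    discount_rate = 0.04
    horizon_years = 50

    pv_benefits = present_value(expected_annual_benefit, discount_rate, horizon_years)
    print(f"benefit-cost ratio: {pv_benefits / mitigation_cost:.2f}")
    print(f"net present benefit: {pv_benefits - mitigation_cost:,.0f}")
    ```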

    The occupation of a box as a toy model for the seismic cycle of a fault

    We illustrate how a simple statistical model can describe the quasiperiodic occurrence of large earthquakes. The model idealizes the loading of elastic energy in a seismic fault by the stochastic filling of a box. The emptying of the box after it is full is analogous to the generation of a large earthquake, in which the fault relaxes after having been loaded to its failure threshold. The duration of the filling process is analogous to the seismic cycle, the time interval between two successive large earthquakes on a particular fault. The simplicity of the model enables us to derive the statistical distribution of its seismic cycle. We use this distribution to fit the series of earthquakes with magnitude around 6 that occurred at the Parkfield segment of the San Andreas fault in California. Using this fit, we estimate the probability of the next large earthquake at Parkfield and devise a simple forecasting strategy.
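
    A minimal simulation of the box-filling analogy appears below, assuming exponentially distributed loading increments and a unit failure threshold; these are illustrative choices, not the increment distribution or calibration used in the paper.

    ```python
    # Toy version of the box model: random increments of stored elastic energy accumulate
    # until a threshold is reached (the box is full), which ends one seismic cycle.
    # Increment distribution and threshold are illustrative, not the paper's calibration.
    import numpy as np

    rng = np.random.default_rng(2)

    def simulate_cycles(n_cycles=10_000, threshold=1.0, mean_increment=0.05):
        """Return the number of loading steps in each simulated seismic cycle."""
        durations = []
        for _ in range(n_cycles):
            level, steps = 0.0, 0
            while level < threshold:
                level += rng.exponential(mean_increment)  # stochastic filling of the box
                steps += 1
            durations.append(steps)
        return np.array(durations)

    cycles = simulate_cycles()
    print(f"mean cycle length: {cycles.mean():.1f} steps, c.o.v.: {cycles.std() / cycles.mean():.2f}")
    ```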

    The continuous wavelet transform as a stochastic process for damage detection

    This paper presents the formulation of a novel statistical model for the wavelet transform of the acceleration response of a structure based on Gaussian process theory. The model requires no prior knowledge of the structural properties, and all the model parameters are learned directly from the measured data using maximum likelihood estimation. The proposed model is applied to the data obtained from a series of shake-table tests and the results are presented. The results, even at a proof-of-concept level, appear to correlate well with the occurrence of damage, which is an indication of the validity of the underlying model. The results from the use of a simple metric for the detection of damage are presented as well.
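
    The sketch below is not the paper's Gaussian-process formulation; it only illustrates the general idea of treating wavelet coefficients as a statistical object: estimate a per-scale variance for the continuous wavelet transform of a baseline record by maximum likelihood, then score new records by their log-likelihood under that baseline model. The signals, sampling rate, wavelet, and scale range are all assumed for illustration.

    ```python
    # Illustrative stand-in for a statistical model of CWT coefficients (not the paper's model):
    # per-scale zero-mean Gaussian with variance estimated by maximum likelihood on a baseline.
    import numpy as np
    import pywt  # PyWavelets

    rng = np.random.default_rng(3)
    fs = 100.0                                    # sampling rate in Hz (hypothetical)
    t = np.arange(0, 10, 1 / fs)
    baseline = np.sin(2 * np.pi * 2.0 * t) + 0.1 * rng.standard_normal(t.size)
    shifted = np.sin(2 * np.pi * 1.6 * t) + 0.1 * rng.standard_normal(t.size)  # stand-in for damage

    scales = np.arange(1, 128)

    def cwt_coeffs(x):
        coeffs, _ = pywt.cwt(x, scales, "morl", sampling_period=1 / fs)
        return coeffs                             # shape: (n_scales, n_samples)

    def fit_variances(x):
        """Maximum-likelihood per-scale variance for zero-mean Gaussian coefficients."""
        return cwt_coeffs(x).var(axis=1)

    def mean_log_likelihood(x, variances):
        c = cwt_coeffs(x)
        v = variances[:, None]
        return (-0.5 * (np.log(2 * np.pi * v) + c**2 / v)).mean()

    var_hat = fit_variances(baseline)
    print(f"baseline score:          {mean_log_likelihood(baseline, var_hat):.2f}")
    print(f"shifted-frequency score: {mean_log_likelihood(shifted, var_hat):.2f}")  # lower => anomalous
    ```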

    Application of System Reliability Theory in the Seismic Analysis of Structures


    Use of Wavelet-Based Damage-Sensitive Features for Structural Damage Diagnosis Using Strong Motion Data

    This paper introduces three wavelet-based damage-sensitive features (DSFs) extracted from structural responses recorded during earthquakes to diagnose structural damage. Because earthquake excitations are nonstationary, the wavelet transform, which represents data as a weighted sum of time-localized waves, is used to model the structural responses. These DSFs are defined as functions of wavelet energies at particular frequencies and specific times. The first DSF (DSF 1) indicates how the wavelet energy at the original natural frequency of the structure changes as the damage progresses. The second DSF (DSF 2) indicates how much the wavelet energy is spread out in time. The third DSF (DSF 3) reflects how slowly the wavelet energy decays with time. The performance of these DSFs is validated using two sets of shake-table test data. The results show that as the damage extent increases, the DSF 1 value decreases and the DSF 2 and DSF 3 values increase. Thus, these DSFs can be used to diagnose structural damage. The robustness of these DSFs to different input ground motions is also investigated using a set of simulated data. © 2011 American Society of Civil Engineers.
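
    The exact DSF definitions are given in the paper; the sketch below only computes rough surrogates of the three quantities the abstract describes (wavelet energy near the original natural frequency, the temporal spread of that energy, and its decay rate). The sampling rate, natural frequency, wavelet, and frequency band are placeholders.

    ```python
    # Rough surrogates for the three wavelet-energy features described in the abstract;
    # the paper's exact DSF definitions differ, and all parameters here are hypothetical.
    import numpy as np
    import pywt  # PyWavelets

    fs = 100.0            # sampling rate in Hz (hypothetical)
    f_natural = 1.5       # pre-damage natural frequency in Hz (hypothetical)

    def wavelet_energy_features(accel, f0=f_natural):
        scales = np.arange(1, 128)
        coeffs, freqs = pywt.cwt(accel, scales, "morl", sampling_period=1 / fs)
        energy = coeffs**2                                   # wavelet energy map (scale x time)
        t = np.arange(accel.size) / fs
        band = np.abs(freqs - f0) < 0.2                      # scales near the original natural frequency

        dsf1 = energy[band].sum() / energy.sum()             # energy share near f0 (DSF 1 surrogate)

        e_t = energy[band].sum(axis=0)                       # band energy as a function of time
        t_mean = (t * e_t).sum() / e_t.sum()
        dsf2 = np.sqrt(((t - t_mean) ** 2 * e_t).sum() / e_t.sum())  # temporal spread (DSF 2 surrogate)

        slope = np.polyfit(t, np.log(e_t + 1e-12), 1)[0]     # decay rate of log band energy
        dsf3 = -1.0 / slope if slope < 0 else np.inf         # slower decay -> larger value (DSF 3 surrogate)
        return dsf1, dsf2, dsf3

    # Hypothetical decaying response standing in for a recorded acceleration.
    rng = np.random.default_rng(4)
    t = np.arange(0, 20, 1 / fs)
    accel = np.exp(-0.2 * t) * np.sin(2 * np.pi * f_natural * t) + 0.05 * rng.standard_normal(t.size)
    print(wavelet_energy_features(accel))
    ```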

    Design and Performance Validation of a Wireless Sensing Unit for Structural Monitoring Applications

    There exists a clear need to monitor the performance of civil structures over their operational lives. Current commercial monitoring systems suffer from various technological and economic limitations that prevent their widespread adoption. The wires used to route measurements from system sensors to the centralized data server represent one of the greatest limitations, since they are physically vulnerable and expensive from an installation and maintenance standpoint. In lieu of cables, the introduction of low-cost wireless communications is proposed. The result is the design of a prototype wireless sensing unit that can serve as the fundamental building block of wireless modular monitoring systems (WiMMS). An additional feature of the wireless sensing unit is the incorporation of computational power in the form of state-of-the-art microcontrollers. The prototype unit is validated with a series of laboratory and field tests. The Alamosa Canyon Bridge is employed as a full-scale benchmark structure to validate the performance of the wireless sensing unit in the field. A traditional cable-based monitoring system is installed in parallel with the wireless sensing units for performance comparison.