
    Detection of Building Damages in High Resolution SAR Images based on SAR Simulation


    On the classifier performance for simulation-based debris detection in SAR imagery

    Urban areas struck by disasters such as earthquakes are in need of fast damage assessment. A post-event SAR image is often the first image available, most likely with no matching pre-event image with which to perform change detection. In previous work we introduced a debris detection algorithm for this scenario that is trained exclusively on synthetically generated training data. A classification step is employed to separate debris from similar textures such as vegetation. To verify the use of a random forest classifier in this context, we conduct a performance comparison with two popular alternative classifiers, a support vector machine and a convolutional neural network. With the direct comparison revealing the random forest classifier to be best suited, its effective debris detection performance is investigated on the post-earthquake Christchurch scene. Results show a good separation of debris from vegetation and gravel, considerably reducing the false alarm rate of the damage detection operation.
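    The classifier comparison described in the abstract can be sketched in a few lines. This is a generic illustration on synthetic feature vectors via scikit-learn, not the authors' SAR texture features or trained models; the convolutional neural network is omitted to keep the sketch short.

```python
# Hedged sketch: comparing a random forest and an SVM on a synthetic
# two-class problem (a stand-in for debris-vs-vegetation texture features).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic feature vectors; the real work uses simulated SAR textures.
X, y = make_classification(n_samples=600, n_features=20, n_informative=8,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

scores = {}
for name, clf in [("random_forest", RandomForestClassifier(n_estimators=200,
                                                           random_state=0)),
                  ("svm", SVC(kernel="rbf", C=1.0))]:
    clf.fit(X_tr, y_tr)
    scores[name] = clf.score(X_te, y_te)  # held-out accuracy per classifier

print(scores)
```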

    Machine Learning-Based Data and Model Driven Bayesian Uncertainty Quantification of Inverse Problems for Suspended Non-structural System

    Inverse problems involve extracting the internal structure of a physical system from noisy measurement data. In many fields, Bayesian inference is used to address the ill-conditioned nature of the inverse problem by incorporating prior information through a prior distribution. In the nonparametric Bayesian framework, surrogate models such as Gaussian processes or deep neural networks are used as flexible and effective probabilistic modeling tools to overcome the curse of dimensionality and reduce computational costs. In practical systems and computer models, uncertainties can be addressed through parameter calibration, sensitivity analysis, and uncertainty quantification, leading to improved reliability and robustness of decision and control strategies based on simulation or prediction results. However, preventing overfitting in the surrogate model while incorporating reasonable prior knowledge of the embedded physics remains a challenge. Suspended Nonstructural Systems (SNS) pose a significant challenge in the inverse problem, and research on their seismic performance and mechanical models, particularly on the inverse problem and uncertainty quantification, is still lacking. To address this, the author conducts full-scale shaking-table dynamic experiments, monotonic and cyclic tests, and simulations of different types of SNS to investigate their mechanical behavior. To quantify the uncertainty of the inverse problem, the author proposes a new framework that adopts machine learning-based, data- and model-driven stochastic Gaussian process model calibration, quantifying the uncertainty via a new black-box variational inference that accounts for a geometric complexity measure, the Minimum Description Length (MDL), through Bayesian inference. The framework is validated on SNS and yields optimal generalizability and computational scalability.
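    The Gaussian-process surrogate at the core of such a framework can be illustrated with a toy regression: the "simulator" below is a simple sine function and the kernel hyperparameters are fixed by hand rather than calibrated, so this is only a sketch of the probabilistic modeling tool the abstract refers to, not the proposed calibration method.

```python
# Minimal GP surrogate: posterior mean at new inputs given a few noisy
# evaluations of an (assumed) expensive simulator.
import numpy as np

def rbf_kernel(a, b, length_scale=0.2):
    # Squared-exponential kernel k(x, x') = exp(-(x - x')^2 / (2 l^2)).
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length_scale**2)

rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 1.0, 8)                      # simulator inputs
y_train = np.sin(2 * np.pi * x_train) + 0.05 * rng.standard_normal(8)

x_test = np.array([0.25, 0.75])
K = rbf_kernel(x_train, x_train) + 1e-4 * np.eye(8)     # jitter for stability
mean = rbf_kernel(x_test, x_train) @ np.linalg.solve(K, y_train)
print(np.round(mean, 3))  # GP posterior mean, close to sin(2*pi*x_test)
```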

    Application of association rules to determine building typological classes for seismic damage predictions at regional scale. The case study of Basel

    Assessing seismic vulnerability at large scales requires accurate attribution of individual buildings to more general typological classes that are representative of the seismic behavior of buildings sharing the same attributes. One-by-one evaluation of all buildings is a time- and money-consuming process. Detailed individual evaluations are only suitable for strategic buildings, such as hospitals and other buildings with a central role in the post-earthquake emergency phase; for other buildings, simplified approaches are needed. The definition of a taxonomy that contains the most widespread typological classes, as well as the attribution of the appropriate class to each building, are central issues for reliable seismic assessment at large scales. A fast, yet accurate, survey process is needed to attribute the correct class to each building in the urban system. Even though surveying buildings to determine classes is not as time-demanding as a detailed evaluation of each building, the process still requires large amounts of time and qualified personnel. Nowadays, however, several public databases are available that provide useful information. In this paper, attributes available in such public databases are used to perform class attribution at large scales, based on prior data mining of a small subset of an entire city. Association-rule learning (ARL) is used to find links between building attributes and typological classes. The accuracy of extrapolating these links, learned on fewer than 250 buildings of a specific district, is evaluated in terms of class attribution and seismic vulnerability prediction. By considering only three attributes available in public databases (i.e., period of construction, number of floors, and shape of the roof), the time needed to produce seismic vulnerability scenarios at the city scale is reduced significantly, while accuracy decreases by less than 5%.
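    The association-rule step can be sketched with a hand-rolled support/confidence search over toy records. The attribute names, class labels, and thresholds below are invented for illustration; the paper mines real survey data with the three public-database attributes it names.

```python
# Toy association-rule mining: keep a rule {attributes} -> class when its
# support and confidence exceed chosen thresholds.
from itertools import combinations

buildings = [
    {"period:pre-1950", "floors:3+", "roof:gable", "class:URM"},
    {"period:pre-1950", "floors:3+", "roof:gable", "class:URM"},
    {"period:pre-1950", "floors:1-2", "roof:gable", "class:URM"},
    {"period:post-1980", "floors:3+", "roof:flat", "class:RC"},
    {"period:post-1980", "floors:1-2", "roof:flat", "class:RC"},
]

def rules(records, min_support=0.3, min_confidence=0.8):
    n = len(records)
    classes = {i for r in records for i in r if i.startswith("class:")}
    attrs = {i for r in records for i in r if not i.startswith("class:")}
    found = []
    for k in (1, 2):                       # rule antecedents of 1 or 2 attrs
        for lhs in combinations(sorted(attrs), k):
            lhs_count = sum(set(lhs) <= r for r in records)
            if lhs_count == 0:
                continue
            for c in classes:
                both = sum(set(lhs) | {c} <= r for r in records)
                support, confidence = both / n, both / lhs_count
                if support >= min_support and confidence >= min_confidence:
                    found.append((lhs, c, round(confidence, 2)))
    return found

print(rules(buildings))
```

    On these toy records, "period of construction" alone predicts the class perfectly, while "number of floors" alone does not pass the confidence threshold, mirroring how ARL selects informative attribute combinations.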

    Model Validation and Simulation

    The Bauhaus Summer School series provides an international forum for the exchange of methods and skills related to the interaction between different disciplines of modern engineering science. The 2012 civil engineering course was held over two weeks in August at Bauhaus-Universität Weimar. The overall aim was the exchange of research and modern scientific approaches in the field of model validation and simulation between well-known experts acting as lecturers and active students. Besides these educational intentions, the social and cultural component of the meeting was also a focus. 48 graduate and doctoral students from 20 different countries and 22 lecturers from 12 countries attended this summer school. Among other aspects, the activity can be considered successful in that it raised awareness of both the significance of research in civil engineering and the role of intercultural exchange. This volume summarizes and publishes some of the results: abstracts of keynote papers presented by the experts and selected student research works. The overview reflects the quality of the summer school. Furthermore, the individual contributions confirm that for the active students this event was a research forum and a special opportunity to learn from the experience of the researchers in terms of methodology and strategies for implementing research in their current work.

    A stochastic rupture earthquake code based on the fiber bundle model (TREMOL v0.1): application to Mexican subduction earthquakes

    In general terms, earthquakes are the result of brittle failure within the heterogeneous crust of the Earth. However, the rupture process of a heterogeneous material is a complex physical problem that is difficult to model deterministically, owing to numerous parameters and physical conditions that are largely unknown. Considering the variability within the parameterization, it is necessary to analyze earthquakes by means of different approaches. Computational physics offers alternative ways to study brittle rock failure by generating synthetic seismic data based on physical and statistical models that use only a few free parameters. The fiber bundle model (FBM) is a stochastic discrete model of material failure that is able to describe complex rupture processes in heterogeneous materials. In this article, we present a computer code called the stochasTic Rupture Earthquake MOdeL, TREMOL. The code is based on the principle of the FBM and investigates the rupture process of asperities on the earthquake rupture surface. In order to validate TREMOL, we carried out a parametric study to identify the best parameter configuration while minimizing computational effort. As test cases, we applied the final configuration to 10 Mexican subduction zone earthquakes in order to compare the synthetic results from TREMOL with seismological observations. According to our results, TREMOL is able to model the rupture of an asperity that is essentially defined by two basic dimensions: (1) the size of the fault plane and (2) the size of the maximum asperity within the fault plane. Based on these data and a few additional parameters, TREMOL generates numerous earthquakes, as well as a maximum magnitude for different scenarios, within a reasonable error range. The simulated earthquake magnitudes are of the same order as those of the real earthquakes.
    Thus, TREMOL can be used to analyze the behavior of a single asperity or a group of asperities, since TREMOL considers the maximum magnitude occurring on a fault plane as a function of the size of the asperity. TREMOL is a simple and flexible model that allows its users to investigate the role of the initial stress configuration and of the dimensions and material properties of seismic asperities. Although various assumptions and simplifications are included in the model, we show that TREMOL can be a powerful tool to deliver promising new insights into earthquake rupture processes.

    The authors are grateful to two anonymous reviewers and the editor for their relevant and constructive comments that have greatly contributed to improving the paper. M. Monterrubio-Velasco and J. de la Puente thank the European Union's Horizon 2020 Programme under the ChEESE Project (https://cheese-coe.eu/, last access: 1 May 2019), grant agreement no. 823844, for partially funding this work. M. Monterrubio-Velasco and A. Aguilar-Meléndez thank CONACYT for support of this research project. Quetzalcoatl Rodríguez-Pérez was supported by the Mexican National Council for Science and Technology (CONACYT) (Cátedras program, project 1126). This project has received funding from the European Union's Horizon 2020 research and innovation program under Marie Skłodowska-Curie grant agreement no. 777778, MATHROCKS, and from the Spanish Ministry project TIN2016-80957-P. Initial funding for the project through grant UNAM-PAPIIT IN108115 is also gratefully acknowledged.
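    The fiber bundle model family that TREMOL builds on can be sketched in its simplest, equal-load-sharing form: fibers with random strength thresholds fail under a quasi-statically increasing load, broken fibers shed their load onto survivors, and each load increment can trigger an avalanche of secondary failures. This generic sketch is not TREMOL itself, whose load-transfer rules and asperity geometry differ.

```python
# Quasi-static equal-load-sharing fiber bundle model with avalanche counting.
import numpy as np

rng = np.random.default_rng(1)
N = 10_000
thresholds = np.sort(rng.uniform(size=N))   # fiber strength thresholds

avalanches = []   # fibers broken per external load increment
k = 0             # fibers broken so far (weakest first, since sorted)
while k < N:
    # Raise the external force just enough to break the weakest survivor;
    # the N - k survivors share the force F equally.
    F = thresholds[k] * (N - k)
    size = 0
    # Load redistribution may push further fibers past their thresholds.
    while k < N and thresholds[k] * (N - k) <= F:
        k += 1
        size += 1
    avalanches.append(size)

print(len(avalanches), max(avalanches))
```

    With uniformly distributed thresholds the bundle ends in one catastrophic avalanche, the analogue of the main rupture of an asperity.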

    Understanding cytoskeletal avalanches using mechanical stability analysis

    Eukaryotic cells are mechanically supported by a polymer network called the cytoskeleton, which consumes chemical energy to dynamically remodel its structure. Recent experiments in vivo have revealed that this remodeling occasionally happens through anomalously large displacements, reminiscent of earthquakes or avalanches. These cytoskeletal avalanches might indicate that the cytoskeleton's structural response to a changing cellular environment is highly sensitive, and they are therefore of significant biological interest. However, the physics underlying "cytoquakes" is poorly understood. Here, we use agent-based simulations of cytoskeletal self-organization to study fluctuations in the network's mechanical energy. We robustly observe non-Gaussian statistics and asymmetrically large rates of energy release compared to accumulation in a minimal cytoskeletal model. The large events of energy release are found to correlate with large, collective displacements of the cytoskeletal filaments. We also find that the changes in the localization of tension and the projections of the network motion onto the vibrational normal modes are asymmetrically distributed for energy release and accumulation. These results imply an avalanche-like process of slow energy storage punctuated by fast, large events of energy release involving a collective network rearrangement. We further show that mechanical instability precedes cytoquake occurrence through a machine learning model that dynamically forecasts cytoquakes using the vibrational spectrum as input. Our results provide the first connection between the cytoquake phenomenon and the network's mechanical energy and can help guide future investigations of the cytoskeleton's structural susceptibility.

    Comment: 35 pages, 18 figures
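    The asymmetry between slow accumulation and fast release described here can be illustrated on a synthetic energy trace: mostly small positive increments punctuated by rare large drops produce strongly negative skewness in the increment distribution. The trace below is invented, not the paper's simulation data.

```python
# Synthetic sawtooth-like energy trace: slow accumulation, rare avalanches.
import numpy as np

rng = np.random.default_rng(0)
steps = rng.normal(0.05, 0.02, 5000)               # slow energy accumulation
drops = rng.random(5000) < 0.01                    # rare avalanche events
steps[drops] -= rng.exponential(5.0, drops.sum())  # large energy release
energy = np.cumsum(steps)

inc = np.diff(energy)                              # energy increments
skew = np.mean((inc - inc.mean())**3) / inc.std()**3
print(round(skew, 2))  # strongly negative: release outpaces accumulation
```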

    Energy refurbishment planning of Italian school buildings using data-driven predictive models

    In current practice, the design of energy refurbishment interventions for existing buildings is typically addressed by performing time-consuming, software-based numerical simulations. However, this approach may not be suitable for preliminary assessment studies, especially when large building portfolios are involved. Therefore, this research work aims at developing simplified data-driven predictive models to estimate the energy consumption of existing school buildings in Italy and to support the decision-making process in planning energy refurbishment interventions at a large scale. To accomplish this, an extensive database is assembled through comprehensive on-site surveys of school buildings in Southern Italy. For each school, a Building Information Modelling (BIM) model is developed and validated against real energy consumption data. These BIM models then support the design of suitable energy refurbishment interventions. Moreover, a comprehensive parametric investigation based on refined energy analyses is carried out to substantially enlarge and enrich the dataset. To derive the predictive models, the most relevant parameters for energy consumption are first identified by performing sensitivity analyses. Based on these findings, predictive models are generated through multiple linear regression. The suggested models provide an estimate of the energy consumption of the "as-built" configuration, as well as of the costs and benefits of alternative energy refurbishment scenarios. The reliability of the proposed simplified relationships is substantiated through a statistical analysis of the main error indices. Results highlight that the building's shape factor (i.e., the ratio between the building's envelope area and its volume) and the area-weighted average of the thermal properties of the building envelope significantly affect both the energy consumption of school buildings and the savings achievable through retrofitting interventions.
    Finally, a framework for the preliminary design of energy refurbishment of buildings, based on the predictive model developed herein, is proposed and illustrated through a worked example. It is worth noting that, while the proposed approach is currently limited to school buildings, the methodology can conceptually be extended to any building typology, provided that suitable data on energy consumption are available.
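    The multiple-linear-regression step can be sketched with the two predictors the abstract highlights: a shape factor S/V and an area-weighted mean U-value. The records and "true" coefficients below are synthetic placeholders, not the paper's fitted model.

```python
# Fit energy consumption against shape factor and mean envelope U-value
# on synthetic records via ordinary least squares.
import numpy as np

rng = np.random.default_rng(0)
n = 200
shape_factor = rng.uniform(0.3, 0.8, n)   # envelope area / volume [1/m]
u_mean = rng.uniform(0.5, 2.5, n)         # area-weighted U-value [W/(m^2 K)]
# Invented "true" relationship: consumption grows with both predictors.
energy = 20 + 120 * shape_factor + 35 * u_mean + rng.normal(0, 5, n)

X = np.column_stack([np.ones(n), shape_factor, u_mean])  # design matrix
coef, *_ = np.linalg.lstsq(X, energy, rcond=None)
print(np.round(coef, 1))  # recovered intercept and slopes
```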

    A state of the art review of modal-based damage detection in bridges: development, challenges, and solutions

    Traditionally, damage identification techniques in bridges have focused on monitoring changes to modal-based Damage Sensitive Features (DSFs) due to their direct relationship with structural stiffness and their spatial information content. However, their progression to real-world applications has not been without challenges and shortcomings, mainly stemming from: (1) environmental and operational variations; (2) inefficient utilization of machine learning algorithms for damage detection; and (3) a general over-reliance on modal-based DSFs alone. The present paper provides an in-depth review of the development of modal-based DSFs and a synopsis of the challenges they face. The paper then sets out to address the highlighted challenges in terms of published advancements and alternatives from the recent literature.
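    The link between stiffness and modal DSFs can be illustrated on a tiny shear model: reducing one storey stiffness lowers the natural frequencies, which is exactly the change a modal-based feature tracks. The masses and stiffnesses are arbitrary illustration values, not taken from any bridge in the review.

```python
# Natural frequencies of a 3-storey shear model before and after a local
# stiffness reduction (unit floor masses, so M = I).
import numpy as np

def natural_freqs(k1, k2, k3):
    # Stiffness matrix: storey 1 tied to ground by k1, storeys chained
    # by k2 and k3.
    K = np.array([[k1 + k2, -k2, 0.0],
                  [-k2, k2 + k3, -k3],
                  [0.0, -k3, k3]])
    return np.sqrt(np.linalg.eigvalsh(K))   # ascending, rad/s

healthy = natural_freqs(100.0, 100.0, 100.0)
damaged = natural_freqs(100.0, 70.0, 100.0)  # 30 % stiffness loss, storey 2
print(np.round(healthy, 2), np.round(damaged, 2))
```

    Since the damaged stiffness matrix differs from the healthy one by a positive-semidefinite term, every natural frequency can only decrease, with the largest drops in the modes that strain the damaged storey.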