7,532 research outputs found

    Extending displacement-based earthquake loss assessment (DBELA) for the computation of fragility curves

    This paper presents a new procedure to derive fragility functions for populations of buildings that relies on the displacement-based earthquake loss assessment (DBELA) methodology. In the method proposed herein, thousands of synthetic buildings have been produced considering the probabilistic distributions that describe the variability in geometrical and material properties. Their nonlinear capacity has then been estimated using the DBELA method, and their response to a large set of ground motion records has been computed. Global limit states are used to estimate the distribution of buildings in each damage state for different levels of ground motion, and a regression algorithm is applied to derive fragility functions for each limit state. The proposed methodology is demonstrated for the case of ductile and non-ductile Turkish reinforced concrete frames with masonry infills.
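
    The regression step described above can be illustrated with a short, generic sketch (an assumed illustration in Python, not the DBELA code): for one limit state, the fraction of synthetic buildings exceeding it at each ground-motion intensity level is fitted with a lognormal cumulative distribution, whose median and dispersion define the fragility function.

        # Hypothetical sketch (not the DBELA implementation): fit a lognormal
        # fragility curve to the fraction of buildings exceeding one limit state.
        import numpy as np
        from scipy.stats import lognorm
        from scipy.optimize import curve_fit

        def lognormal_cdf(im, median, beta):
            # Probability of exceeding the limit state at intensity measure `im`.
            return lognorm.cdf(im, s=beta, scale=median)

        # Assumed example data: intensity levels (e.g. PGA in g) and the observed
        # fraction of the building population exceeding the limit state.
        im_levels = np.array([0.05, 0.1, 0.2, 0.3, 0.4, 0.6, 0.8])
        frac_exceeding = np.array([0.01, 0.05, 0.22, 0.45, 0.63, 0.85, 0.94])

        # Regression step: estimate the median and dispersion of the fragility curve.
        (median, beta), _ = curve_fit(lognormal_cdf, im_levels, frac_exceeding,
                                      p0=[0.3, 0.5], bounds=([1e-3, 1e-3], [10.0, 2.0]))
        print(f"median = {median:.3f} g, dispersion beta = {beta:.3f}")

    Repeating the fit for each global limit state yields the full set of fragility curves.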

    Form and Data - from linear Calculus to cybernetic Computation and Interaction

    Digital architecture developed in the 1960s and, supported by CAAD in the 1990s, has created the path towards an architecture produced by computer and architect in a mutual relationship. The evolution of architecture since the 1970s led to the beginning of the first digital turn in the 1990s, and subsequently to the emergence of new typologies of buildings, architects and design tools: atom-based, bit-based (virtual) [1], and cyber-physical as a combination of both. The paper provides an insight into the historical foundations of CAAD insofar as it engages with complexity in mechanics, geometry, and space between the 1600s and 1950s. I will address a selection of principles discovered and mechanisms invented before computer-aided architectural design; these include the typewriter, the Cartesian grid and a pre-cyber-physical system by Hermann von Helmholtz. The paper concludes with a summary and an outlook on the future of CAAD, challenged by the variety of correlations of disparate data sets.

    Setting intelligent city tiling strategies for urban shading simulations

    Accurately assessing the solar potential of all building surfaces in cities, including shading and multiple reflections between buildings, is essential for urban energy modelling. However, since the number of surface interactions and radiation exchanges increases exponentially with the scale of the district, innovative computational strategies are needed, some of which are introduced in the present work. These should offer the best compromise between result accuracy and computational efficiency, i.e. computational time and memory requirements. In this study, different approaches that may be used for the computation of urban solar irradiance over large areas are presented. Two concrete urban case studies of different densities have been used to compare and evaluate three different methods: the Perez Sky model, the Simplified Radiosity Algorithm and a new scene tiling method implemented in our urban simulation platform SimStadt, used for feasible estimations on a large scale. To quantify the influence of shading, the new concept of the Urban Shading Ratio has been introduced and used for this evaluation process. In high-density urban areas, this index may reach 60% for facades and 25% for roofs. Tiles of 500 m width with 200 m overlap are a minimum requirement in this case to compute solar irradiance with acceptable accuracy. In medium-density areas, tiles of 300 m width with 100 m overlap fully meet the accuracy requirements. In addition, the solar potential for various solar energy thresholds as well as the monthly variation of the Urban Shading Ratio have been quantified for both case studies, distinguishing between roofs and facades of different orientations.
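
    As a rough illustration of the tiling strategy compared above (an assumed sketch, not the SimStadt implementation), the following snippet splits a district bounding box into square tiles of a given width and overlap; the values quoted in the abstract (500 m / 200 m for high density, 300 m / 100 m for medium density) would be passed as parameters.

        # Assumed sketch of the tiling idea: cover the district extent with square
        # tiles of width `width`, each overlapping its neighbours by `overlap`.
        def tile_extent(xmin, xmax, ymin, ymax, width=500.0, overlap=200.0):
            step = width - overlap          # distance between tile origins
            tiles = []
            x = xmin
            while x < xmax:
                y = ymin
                while y < ymax:
                    tiles.append((x, min(x + width, xmax), y, min(y + width, ymax)))
                    y += step
                x += step
            return tiles

        # Example: a 1.2 km x 0.8 km district tiled with the high-density settings.
        for tile in tile_extent(0.0, 1200.0, 0.0, 800.0, width=500.0, overlap=200.0):
            print(tile)

    Each tile can then be simulated independently, the overlap ensuring that shading from buildings just outside a tile is still taken into account.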

    An efficient methodology to estimate probabilistic seismic damage curves

    The incremental dynamic analysis (IDA) is a powerful methodology that can be easily extended to the calculation of probabilistic seismic damage curves. These curves are key data for assessing the seismic risk of structures. Although this methodology requires a significant computational effort, it should be the reference for correctly estimating the seismic risk of structures. Nevertheless, it would be of high practical interest to have a simpler methodology, based for instance on pushover analysis (PA), that yields results similar to those based on IDA. In this article, PA is used to obtain probabilistic seismic damage curves from the stiffness degradation and the energy of the nonlinear part of the capacity curve. A fully probabilistic methodology is developed by means of Monte Carlo simulations with the purpose of establishing that the results based on the simplified proposed approach are compatible with those obtained with IDA. Comparisons between the results of both approaches are included for a low- to mid-rise reinforced concrete building. The proposed methodology significantly reduces the computational effort when calculating probabilistic seismic damage curves.
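
    The Monte Carlo step can be sketched generically as follows (an assumed illustration; the damage-index model below is a placeholder, not the paper's formulation based on stiffness degradation and capacity-curve energy): uncertain structural parameters are sampled, a damage index is evaluated at each intensity level, and the fraction of samples exceeding a damage-state threshold gives one point of the probabilistic seismic damage curve.

        # Hedged Monte Carlo sketch with a placeholder damage model.
        import numpy as np

        rng = np.random.default_rng(0)

        def damage_index(intensity, yield_strength, ductility):
            # Placeholder: damage grows smoothly from 0 to 1 with intensity and
            # decreases with the sampled capacity (not the authors' model).
            capacity = yield_strength * ductility
            return np.clip(1.0 - np.exp(-intensity / capacity), 0.0, 1.0)

        intensities = np.linspace(0.05, 1.0, 20)   # e.g. PGA or spectral displacement
        n_samples = 5000
        damage_threshold = 0.5                     # threshold of one damage state

        prob_exceed = []
        for im in intensities:
            # Sample uncertain structural properties (assumed lognormal variability).
            fy = rng.lognormal(mean=np.log(0.3), sigma=0.2, size=n_samples)
            mu = rng.lognormal(mean=np.log(3.0), sigma=0.3, size=n_samples)
            di = damage_index(im, fy, mu)
            prob_exceed.append(np.mean(di >= damage_threshold))

        # `prob_exceed` versus `intensities` is one probabilistic seismic damage curve.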

    Effective Physical Processes and Active Information in Quantum Computing

    The recent debate on hypercomputation has raised new questions both about the computational abilities of quantum systems and about the role of the Church-Turing Thesis in Physics. We propose here the idea of an "effective physical process" as the essentially physical notion of computation. Using the Bohm and Hiley active information concept, we analyze the differences between the standard form (quantum gates) and the non-standard forms (adiabatic and morphogenetic) of Quantum Computing, and we point out how its Super-Turing potentialities derive from an incomputable information source, in accordance with Bell's constraints. Provided that we give up the formal concept of "universality", it becomes possible to realize quantum oracles. In this way, computation is led back to the logic of the physical world.

    Probabilistic approach to provide scenarios of earthquake-induced slope failures (PARSIFAL) applied to the Alcoy Basin (South Spain)

    The PARSIFAL (Probabilistic Approach to pRovide Scenarios of earthquake-Induced slope FAiLures) approach was applied in the basin of Alcoy (Alicante, South Spain) to provide a comprehensive scenario of earthquake-induced landslides. The basin of Alcoy is well known for several historical landslides, mainly earth-slides, that involve urban settlements as well as infrastructure (e.g., roads and bridges). PARSIFAL overcomes several limitations of existing approaches, allowing the concomitant analysis of: (i) first-time landslides (due to both rock-slope failures and shallow earth-slides) and reactivations of existing landslides; (ii) slope stability for different failure mechanisms; (iii) comprehensive mapping of earthquake-induced landslide scenarios in terms of the exceedance probability of critical threshold values of co-seismic displacements. Geotechnical data were used to constrain the slope stability analysis, while specific field surveys were carried out to measure the jointing and strength conditions of rock masses and to inventory already existing landslides. GIS-based susceptibility analyses were performed to assess the proneness to shallow earth-slides as well as to verify kinematic compatibility with planar or wedge rock-slides and topples. The application of PARSIFAL to the Alcoy basin: (i) confirms the suitability of the approach at the municipality scale; (ii) highlights the main role of saturation in conditioning slope instabilities in this case study; (iii) demonstrates the reliability of the obtained results with respect to the historical data.
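
    The exceedance-probability output in item (iii) can be sketched generically as follows (an illustration only, with placeholder regression coefficients, not the PARSIFAL implementation): the co-seismic Newmark displacement of a slope cell is estimated from the ratio of its critical acceleration to the peak ground acceleration, uncertainty in the critical acceleration is propagated by Monte Carlo sampling, and the probability of exceeding a critical displacement threshold is the fraction of samples above it.

        # Illustrative sketch only: probability that the Newmark co-seismic
        # displacement of a slope cell exceeds a critical threshold. The
        # regression coefficients below are placeholders, not values from the paper.
        import numpy as np

        rng = np.random.default_rng(1)

        def newmark_displacement(ac, amax, a=2.341, b=-1.438, c=0.215):
            # Simplified sliding-block regression of the form
            # log10(D [cm]) = c + a*log10(1 - ac/amax) + b*log10(ac/amax)
            ratio = np.clip(ac / amax, 1e-6, 1.0 - 1e-6)
            return 10.0 ** (c + a * np.log10(1.0 - ratio) + b * np.log10(ratio))

        amax = 0.25            # scenario peak ground acceleration (g), assumed
        d_critical = 5.0       # critical displacement threshold (cm), assumed

        # Uncertain critical acceleration of the slope (g) from the stability analysis.
        ac_samples = rng.lognormal(mean=np.log(0.08), sigma=0.3, size=10_000)
        displacements = newmark_displacement(ac_samples, amax)

        p_exceed = np.mean(displacements >= d_critical)
        print(f"P(D >= {d_critical} cm) = {p_exceed:.2f}")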