
    Corporate Social Responsibility and Performance Measurement: three studies from a stakeholder management perspective

    Background. Three studies from a stakeholder management perspective: (1) Does Corporate Social Performance yield any tangible financial benefit during a crisis?; (2) The Impact of CSP on Forced CEO Turnover: Buffer or Intensifier?; (3) The Determinants of Organizational Effectiveness: Stakeholder Dialogue and Monitoring in Museums.

    Synthetic accelerograms for hazard evaluation and response-history analysis of buildings

    Non-Linear Time History Analysis (NLTHA) is the most sophisticated tool used to understand the real dynamic behaviour of structures (FIB, 2012). The quality of the results relies on an accurate definition of the material properties, their hysteretic behaviour and the geometry of the structure to be examined, as well as on the definition of the dynamic excitations, represented by acceleration time histories. These accelerograms must represent, on average, the hazard of the site under examination, commonly expressed as an acceleration response spectrum. Usually the target response spectrum is defined, in a Probabilistic (PSHA) or Deterministic (DSHA) Seismic Hazard Assessment, through Ground Motion Prediction Equations (GMPEs). Therefore, ground motions should have magnitude, source distance and focal mechanism consistent with the sources that control the hazard at the site of interest. Moreover, site soil conditions and the possibility of experiencing near-fault effects such as directivity and fling-step need to be considered (NIST, 2011). Usually, acceleration time histories are selected from databases of records (e.g. the European Strong Motion (ESM) database; Luzi et al., 2016) in order to satisfy all the above-mentioned characteristics and to match, over a defined range of periods, the target response spectrum. As the tolerance on the variability of the selection parameters becomes tighter, the lack of data becomes evident, and some modification (e.g. linear scaling) of the original recorded ground motions is needed if an adequate number of ground motions is to be used. An alternative source of time histories is the generation of artificial accelerograms (Gasparini and Vanmarcke, 1976) or the use of the “response spectrum matching” technique (Al Atik and Abrahamson, 2010; Grant and Diaferia, 2013).
However, these techniques have no physical meaning, and there are concerns that their use could lead to biased results (Bazzurro and Luco, 2006; Iervolino et al., 2010). A viable alternative is to use synthetic accelerograms generated from a simulation of the source rupture and wave propagation. In this work, a direct link between hazard and response-history analysis is established. Synthetic seismograms are used to define the hazard as described by the Neo-Deterministic Seismic Hazard Assessment (NDSHA) (Panza et al., 2001, 2012; Fasan et al., 2016) and, as a logical consequence, to perform NLTHA on a selected building. A comparison of the results of NLTHAs obtained with natural and synthetic records confirms that physics-based simulations are a valuable tool in structural analysis. Moreover, the NDSHA method is applied to the site of Norcia, and the predicted spectral accelerations are compared with those recorded during the event of 30 October 2016. Using NLTHAs, the structural demands predicted with the real records and with the synthetic ones used in the NDSHA are compared, showing that simulated accelerograms can be used to predict the real non-linear demands of future earthquakes.
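The target-spectrum matching described above presumes computing response spectra from accelerograms. As a self-contained illustration (a generic sketch, not the tools used in this work), the elastic pseudo-acceleration spectrum of a record can be obtained by integrating damped single-degree-of-freedom oscillators with the Newmark average-acceleration method:

```python
import numpy as np

def response_spectrum(ag, dt, periods, zeta=0.05):
    """Elastic pseudo-acceleration spectrum of an accelerogram `ag`
    (sampled at time step `dt`), using unit-mass SDOF oscillators
    integrated with the Newmark average-acceleration method."""
    gamma, beta = 0.5, 0.25               # unconditionally stable scheme
    p = -np.asarray(ag, dtype=float)      # effective force for unit mass
    sa = np.empty(len(periods))
    for j, T in enumerate(periods):
        wn = 2.0 * np.pi / T
        k, c = wn ** 2, 2.0 * zeta * wn
        khat = k + gamma * c / (beta * dt) + 1.0 / (beta * dt ** 2)
        a1 = 1.0 / (beta * dt) + gamma * c / beta
        a2 = 1.0 / (2.0 * beta) + dt * c * (gamma / (2.0 * beta) - 1.0)
        u = v = 0.0
        a = p[0]                          # equation of motion at t = 0
        umax = 0.0
        for i in range(len(p) - 1):
            dph = p[i + 1] - p[i] + a1 * v + a2 * a
            du = dph / khat
            dv = (gamma / (beta * dt)) * du - (gamma / beta) * v \
                 + dt * (1.0 - gamma / (2.0 * beta)) * a
            da = du / (beta * dt ** 2) - v / (beta * dt) - a / (2.0 * beta)
            u, v, a = u + du, v + dv, a + da
            umax = max(umax, abs(u))
        sa[j] = wn ** 2 * umax            # pseudo-spectral acceleration
    return sa
```

For each period, the peak relative displacement is converted to pseudo-acceleration via Sa = ωn² · max|u|; record selection then compares such spectra against the target over the chosen period range.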

    Body Motion Sensor Analysis of Human-Induced Dynamic Load Factor (DLF) for Normal Walks on Slender Transparent Floors

    Modern constructions are often characterized by the presence of slender and aesthetically fascinating components and assemblies. For pedestrian systems in particular, such constructions are notoriously associated with possible vibration issues, and thus require special calculations. When these slender systems are made of structural glass, additional effects due to transparency may also affect human behaviour and motion. In this paper, a single body-mounted microelectromechanical system (MEMS) sensor placed at the body’s centre of mass (CoM) is introduced, an extended, original experimental investigation is presented, and human-induced effects on slender transparent floors are discussed. Major attention is given to the well-known dynamic load factor (DLF) induced by a single pedestrian’s normal walk; a fixed walking rate is assigned, and different substructures (with major variations in their structural dynamic parameters) are taken into account. Experimental results are first discussed for a rigid reinforced concrete (RC) laboratory contrast system (SLAB#1), which is used as a reference for the analysis of DLF trends on relatively light and flexible transparent glass flooring systems (SLAB#2 and SLAB#3). It is shown that structural frequency and mass, but possibly also transparency, can affect human motion and result in a quantitative modification of measured DLF values, especially for the first and second harmonics of the vertical force component.
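For context, the DLF of the n-th harmonic is the Fourier amplitude of the vertical walking force at n times the pacing frequency, normalised by body weight; with F(t) ≈ m(g + a(t)) for a CoM-mounted sensor, this reduces to the acceleration amplitude divided by g. The sketch below is purely illustrative (function and parameter names are hypothetical, not the paper's processing chain):

```python
import numpy as np

G = 9.81  # gravity, m/s^2

def dynamic_load_factors(acc, fs, f_step, n_harmonics=3, half_band=0.2):
    """Estimate walking DLFs from a vertical CoM acceleration record
    (m/s^2, sampling rate `fs`). The n-th DLF is the single-sided
    Fourier amplitude near n * f_step divided by G, since the force
    model F = m * (G + a) normalised by weight m * G leaves a / G."""
    acc = np.asarray(acc, dtype=float)
    acc = acc - acc.mean()                            # drop the static part
    amp = 2.0 / len(acc) * np.abs(np.fft.rfft(acc))   # amplitude spectrum
    freqs = np.fft.rfftfreq(len(acc), 1.0 / fs)
    dlfs = []
    for h in range(1, n_harmonics + 1):
        band = np.abs(freqs - h * f_step) <= half_band
        dlfs.append(amp[band].max() / G)              # peak near h-th harmonic
    return dlfs
```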

    Numerical assessment of slab-interaction effects on the behaviour of steel-concrete composite joints

    In current design practice for seismic-resistant steel braced frames, general rules and standard provisions aim to ensure that the beam-to-column joints of non-braced spans behave as closely as possible to perfect hinges. This is done to prevent any kind of interaction with the bracing systems, in particular under horizontal loads. However, the global performance of composite joints is markedly affected by the structural interaction between the concrete slab and the steel components and, especially during seismic events, struts can develop in the slab at the beam-to-column intersection. In this paper, the possibility of realizing a composite joint that behaves as moment-resisting under gravity loads and essentially as hinged under horizontal loads is investigated. To assess the actual slab-interaction effects on the overall response, a full 3D Finite Element (FE) model representative of a beam-to-column composite joint forming part of a braced frame is developed in ABAQUS and validated against past full-scale experiments. A parametric study is then proposed, accounting for three geometrical configurations, characterized by (i) an isolated slab with no rebar continuity (i.e. fully disconnected slab and steel joint only), (ii) a slab with partial column interaction (i.e. isolated slab and continuity of rebar), and (iii) a fully interacting slab. It is shown that, if properly detailed, a joint with an isolated slab and continuous rebars can be used in non-braced spans of composite braced frames without affecting the behaviour of the bracing system (i.e. as in the presence of a hinge). Nonetheless, the composite beam can be designed as continuous on multiple supports under vertical loads, hence leading to a reduction of the steel cross-sectional size.

    ADVANCED SEISMOLOGICAL AND ENGINEERING ANALYSIS FOR STRUCTURAL SEISMIC DESIGN

    Nowadays, standard “Performance-Based Seismic Design” (PBSD) procedures rely on a “Probabilistic Seismic Hazard Analysis” (PSHA) to define the seismic input. Many assumptions underlying the probabilistic method have been proven wrong, and many earthquakes, not least the Italian earthquake sequence of 2016 (still in progress), have shown the limits of a PBSD procedure based on PSHA. Therefore, a different method to define the seismic hazard should be developed and used in a PBSD framework. This thesis tackles this aspect. In the first chapter, a review of standard PBSD procedures is presented, focusing on the link between the seismic input and the acceptable structural performance level for a building. It is highlighted how, at least when evaluating the Collapse Prevention (CP) level, the use of a probabilistic seismic input should be avoided. Instead, the concept of “Maximum Credible Seismic Input” (MCSI) is introduced. This input should supply Maximum Credible Earthquake (MCE) level scenario ground motions, in other words an “upper bound” to possible future earthquake scenarios. In the second chapter, an upgrade of the “Neo-Deterministic Seismic Hazard Assessment” (NDSHA) is proposed to compute NDSHA-MCSI, henceforth shortly called MCSI. In other words, MCSI is fully rooted in NDSHA and aims to define a reliable and effective design seismic input. NDSHA is a physics-based approach where the ground motion parameters of interest (e.g. PGA, SA, SD, etc.) are derived from the computation of thousands of physics-based synthetic seismograms, calculated as the tensor product between the tensor that formally represents the earthquake source and the Green’s function of the medium. NDSHA accommodates the complexity of the source process, as well as site and topographical effects.
The comparison between the MCSI response spectra, the Italian Building Code response spectra and the response spectra of the three strongest events of the 2016 central Italy seismic sequence is discussed. Exploiting the detailed site-specific mechanical conditions around the recording station available in the literature, the methodology to define MCSI is applied to the town of Norcia (about five km from the strongest event). The results confirm the inadequacy of the probabilistic approach, which strongly underestimated the spectral accelerations for all three events. On the contrary, MCSI supplies spectral accelerations well comparable with those generated by the strongest event and confirms the reliability of the NDSHA methodology, as happened in previous earthquakes (e.g. L’Aquila 2009 and Emilia 2012). In the third chapter, a review of PBSD is presented, emphasizing the arbitrariness of several choices that are at present taken for granted all around the world. A new PBSD framework based on the use of MCSI is then proposed. This procedure is independent of the arbitrary choice of the reference life and the probability of exceedance. From an engineering point of view, seismograms provided by NDSHA simulations also allow running time-history analyses using site-specific inputs even where no records are available. This aspect is evidenced in chapter four, where some Engineering Demand Parameters (EDPs) obtained on a steel moment-resisting frame with natural and synthetic accelerograms are compared. This thesis shows that, at least when assessing the CP level, the use of PSHA in a PBSD approach should be avoided. The new PBSD framework proposed in this thesis, based on MCSI computation, could help to prevent the collapse of buildings and human losses, hence to build seismically resilient systems and to overcome the limits of probabilistic approaches.
Not least, the availability of site-specific accelerograms could lead to a wider use of Non-Linear Time History Analysis (NLTHA), and therefore to a better understanding of the seismic behaviour of structures.
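For reference, the "tensor product between the source tensor and the Green's function of the medium" mentioned above corresponds to the classical representation theorem of seismology, in which the n-th displacement component at the receiver is the time convolution of the moment-tensor history with the spatial derivatives of the Green's functions:

```latex
u_n(\mathbf{x},t) \;=\; M_{pq}(t) \ast
\frac{\partial G_{np}(\mathbf{x},t;\boldsymbol{\xi})}{\partial \xi_q}
```

where $M_{pq}$ is the seismic moment tensor, $G_{np}$ is the Green's function between source point $\boldsymbol{\xi}$ and receiver $\mathbf{x}$, $\ast$ denotes convolution in time, and summation over the repeated indices $p,q$ is implied.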

    Living Up to Your Codes? Corporate Codes of Ethics and the Cost of Equity Capital

    Purpose - Previous literature provides mixed evidence about the effectiveness of codes of ethics in limiting managerial opportunism. While some studies find that codes of ethics are merely window-dressing, others find that they do influence managers' behavior. The present study investigates whether the quality of a code of ethics decreases the cost of equity by limiting managerial opportunism. Design/methodology/approach - To test the hypothesis, the authors perform an empirical analysis on a sample of US companies over the 2004–2012 period. The results are robust to a battery of robustness analyses performed to address endogeneity. Findings - Empirical results indicate that a higher-quality code of ethics is associated with a lower cost of equity. In other words, firms with a more comprehensive code of ethics and better-designed implementation procedures limit managerial opportunism and pay a lower cost of equity because they are perceived by investors to be less risky. Originality/value - The authors contribute to the literature in two ways: first, by looking at the market reaction to the code of ethics, thus capturing all its possible indirect benefits, and second, by measuring not only the existence but also the quality of a code of ethics. Based on the results, policymakers may choose to further promote codes of ethics as an effective corporate governance mechanism.
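The abstract does not reproduce the study's econometric specification; purely to illustrate the kind of cross-sectional regression such a test involves, here is a minimal ordinary-least-squares sketch on simulated data with hypothetical variable names (a code-of-ethics quality score plus one control), not the authors' actual model:

```python
import numpy as np

def ols_coefficients(y, X):
    """Ordinary least squares for y = b0 + X @ b + e; returns [b0, b...]."""
    Xc = np.column_stack([np.ones(len(X)), X])   # prepend a constant
    beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    return beta

# Simulated panel cross-section: a true negative effect of code
# quality on the implied cost of equity, plus one firm-size control.
rng = np.random.default_rng(0)
quality = rng.uniform(0.0, 1.0, 500)     # hypothetical quality score
size = rng.normal(10.0, 2.0, 500)        # hypothetical log market cap
coe = 0.10 - 0.02 * quality + 0.001 * size + rng.normal(0.0, 0.001, 500)
b = ols_coefficients(coe, np.column_stack([quality, size]))
# b[1] recovers a negative quality coefficient close to -0.02
```

A negative, significant coefficient on the quality score is the pattern the paper's finding describes; the real analysis additionally handles endogeneity, which this sketch does not.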


    Improving the seismic capacity of steel–concrete composite frames with spiral-confined slabs

    The seismic performance analysis of steel–concrete composite frames involves, as is known, the interaction of several load-bearing components that should be properly designed, with multiple geometrical and mechanical parameters to account for. Practical recommendations are given in Eurocode 8 (EC8) – Annex C for the optimal detailing of transverse rebars, so as to ensure the activation of conventional resisting mechanisms. In this paper, attention is focused on the analysis of the effects and benefits of a novel confinement solution for the reinforced concrete (RC) slab. The intervention is based on the use of diagonal steel spirals, which are expected to enhance the overall compressive response of the RC slab, thanks to the activation of an optimized strut-and-tie resisting mechanism. The first expectation is to increase the resistance capacity of the slab, which can thus transfer higher compressive actions under seismic loads. Further, as shown, the same resisting mechanism can be beneficial for the yielding of steel rebars, depending on the final detailing of components, and thus possibly improve the ductility of the system. To this aim, a refined finite element (FE) numerical analysis is carried out for several configurations of technical interest. The in-plane compressive behaviour and the activation of resisting mechanisms are explored for several spiral-confined slabs, based on various arrangements. Major advantage is taken from literature experimental data on RC slabs, which are further investigated by introducing the examined confinement technique. Attention is then given to the local and global structural effects of different arrangements of the proposed steel spirals. As shown, once the spirals are optimally placed in the slab, the strength and ductility parameters of the concrete struts can be efficiently improved, with marked benefits for the overall resisting mechanisms of the slab, and thus for the steel–concrete composite frames as a whole.
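Although the paper relies on refined FE analysis, the basic effect of spiral confinement on concrete strength can be illustrated with the classical relation of Mander et al. (1988) for circular/spiral confinement. The sketch below is generic: its parameter values and the confinement model itself are not taken from the paper.

```python
def confined_strength(fco, fyh, asp, ds, s, ke=0.95):
    """Confined compressive strength per Mander et al. (1988) for
    circular/spiral confinement. fco: unconfined strength (MPa),
    fyh: spiral yield stress (MPa), asp: spiral bar area (m^2),
    ds: confined core diameter (m), s: spiral pitch (m),
    ke: confinement effectiveness coefficient."""
    rho_s = 4.0 * asp / (ds * s)          # volumetric ratio of spiral steel
    fl = 0.5 * ke * rho_s * fyh           # effective lateral confining pressure
    return fco * (-1.254 + 2.254 * (1.0 + 7.94 * fl / fco) ** 0.5
                  - 2.0 * fl / fco)
```

With no spiral (asp = 0) the expression collapses to the unconfined strength; tightening the pitch s or enlarging the bar area asp raises both the lateral pressure and the confined strength, which is the mechanism the proposed diagonal spirals exploit for the concrete struts.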

    Role of In-Field Experimental Diagnostic Analysis for the Derivation of Residual Capacity Indexes in Existing Pedestrian Glass Systems

    The use of simplified tools in support of the mechanical performance assessment of pedestrian structures is strongly attractive for designers, due to their practical efficiency, as well as for researchers, in terms of innovation and the assessment of new proposals. On the design side, vibration serviceability requires that specific comfort levels for pedestrians be satisfied, taking into account conventional performance indicators and the class of use, or the structural typology, of the pedestrian systems under analysis. A major issue in this context is the long-term performance of systems (especially pedestrian ones) that are based on innovative or sensitive materials and possibly affected by degradation or even damage, and are thus potentially unsafe. Consequently, it is clear that, especially for in-service structures, the availability of standardized non-destructive protocols for a reliable (and possibly rapid) structural assessment can represent an efficient support for diagnostics. This perspective paper focuses on the quantification of the residual capacity of laminated glass (LG) pedestrian structures, and on the assessment of experimental and/or numerical tools for their analysis. To this aim, three modular units belonging to two different indoor, in-service pedestrian systems are taken into account as pilot studies. On the practical side, a primary role is assigned to Operational Modal Analysis (OMA) procedures, which are used on-site to quantify structural performance based on the vibration response, including damage detection and the inverse characterization of material degradation. Based on earlier detailed validation, it is shown that a rapid structural assessment can rely on a single triaxial Micro Electro-Mechanical System (MEMS) accelerometer, which can be used to derive relevant capacity measures and indicators. To develop possible general recommendations of technical interest for in-service LG pedestrian systems, the so-calculated experimental performance indicators are assessed against various traditional design procedures and literature approaches of classical use for structural diagnostics, here extended to the structural typology of LG systems.
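The OMA step mentioned above ultimately rests on identifying dominant spectral peaks in the measured vibration response. Below is a minimal, illustrative peak-picking sketch for a single-channel record (an assumption-laden toy, not the validated OMA procedure used in the paper):

```python
import numpy as np

def dominant_frequencies(acc, fs, n_peaks=2, min_sep=0.5):
    """Return the `n_peaks` strongest spectral peaks (Hz) of a
    vibration record: the elementary peak-picking step of
    output-only (operational) modal identification."""
    acc = np.asarray(acc, dtype=float)
    acc = acc - acc.mean()
    psd = np.abs(np.fft.rfft(acc)) ** 2            # one-sided power spectrum
    freqs = np.fft.rfftfreq(len(acc), 1.0 / fs)
    picked = []
    for i in np.argsort(psd)[::-1]:                # strongest bins first
        f = freqs[i]
        if f > 0.0 and all(abs(f - q) >= min_sep for q in picked):
            picked.append(f)
            if len(picked) == n_peaks:
                break
    return sorted(picked)
```

In practice an OMA workflow would also average spectra over segments and extract damping and mode shapes; the identified frequencies are then compared with reference values to flag stiffness loss or damage.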