3,818 research outputs found

    Integration of graphical, physics-based, and machine learning methods for assessment of impact and recovery of the built environment from wind hazards

    2019 Summer. Includes bibliographical references. The interaction between a natural hazard and a community has the potential to result in a natural disaster with substantial socio-economic losses. In order to minimize disaster impacts, researchers have been improving building codes and exploring further concepts of community resilience. Community resilience refers to a community's ability to absorb a hazard (minimize impacts) and "bounce back" afterwards (quick recovery time). The two main components in modeling resilience are therefore the initial impact and the subsequent recovery time. With respect to a community's building stock, this entails the damage state a building sustains and how long it takes to repair and reoccupy that building. Probabilistic and physics-based methods have been the traditional approach to modeling these concepts. With advancements in artificial intelligence and machine learning, as well as in data availability, it may be possible to model impact and recovery differently. Most current methods are highly constrained by their topic area: damage-state models, for example, focus on structural loading and resistance, while social vulnerability models independently focus on certain social demographics. These models currently operate independently and are then aggregated together, but with the complex connectivity available through machine learning, structural and social characteristics may be combined simultaneously in one network model. Machine learning predictive modeling has grown popular across many applications because it can represent complex networks and can identify critical variables, or the mechanisms by which those variables interact, that were previously unknown in the problem being modeled. The research presented herein outlines a method of using artificial neural networks to model building damage and recovery times. The incorporation of graph theory to analyze the resulting models also provides insight into the "black box" of artificial intelligence and the interaction of socio-technical parameters within the concept of community resilience. The neural network models are then verified by hindcasting the 2011 Joplin tornado for individual building damage and the time it took to repair and reoccupy each building. The results show that these methods are viable for modeling damage, but more work may be needed to model recovery at the same level of accuracy as damage. It is therefore recommended that artificial neural networks be used primarily for problems where the variables are well known but their interactions are not as easily understood or modeled. The graphical analysis also reveals the importance of social parameters across all points in the resilience process, while the structural components remain most important in determining the initial impact. Final importance factors are determined for each of the variables evaluated herein. It is suggested that, moving forward, modeling approaches consider integrating how a community interacts with its infrastructure, since the human components are what make a natural hazard a disaster; tracing artificial neural network connections may provide a starting point for such integration into current traditional modeling approaches.
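
    The pairing of a neural network with a weight-based importance analysis described in this abstract can be illustrated with a minimal sketch. The feature names, network size, and synthetic data below are placeholders, and the Garson-style weight attribution is only one common way to rank inputs; it is not the thesis's actual model, dataset, or graph-theoretic procedure.

        # Minimal sketch, not the thesis's model: a small feed-forward network for
        # building damage states plus a Garson-style importance estimate taken from
        # its connection weights.
        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(0)
        features = ["wind_speed", "roof_type", "year_built", "median_income"]  # hypothetical inputs
        X = rng.random((500, len(features)))
        y = (X[:, 0] * 3).astype(int)  # synthetic damage states 0-2, driven mostly by wind speed

        net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, y)

        # Garson-style attribution: share of absolute connection weight each input
        # carries through the hidden layer to the outputs.
        w_ih, w_ho = net.coefs_                              # input->hidden, hidden->output weights
        contrib = np.abs(w_ih) * np.abs(w_ho).sum(axis=1)    # (n_inputs, n_hidden)
        contrib /= np.abs(w_ih).sum(axis=0, keepdims=True)   # normalise within each hidden node
        importance = contrib.sum(axis=1)
        importance /= importance.sum()

        for name, imp in sorted(zip(features, importance), key=lambda t: -t[1]):
            print(f"{name}: {imp:.2f}")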

    A new hurricane impact level ranking system using artificial neural networks

    2015 Spring. Includes bibliographical references. Tropical cyclones are intense storm systems that form over warm water but have the potential to bring multiple related hazards ashore. While significant advancements have been made in forecasting such extreme weather, estimating the resulting damage and impact to society is significantly complex and requires substantial improvement. This is primarily due to the intricate interaction of multiple variables contributing to socio-economic damage on multiple scales. Subsequently, this makes communicating the risk, location vulnerability, and resulting impact of such an event inherently difficult. To date, the Saffir-Simpson Scale, based on wind speed, is the main ranking system used in the United States to describe an oncoming tropical cyclone event. Models currently exist that predict loss using more parameters than wind speed alone. However, they are not actively used as a means to concisely categorize these events, likely because of the scrutiny a model would face for possibly outputting an incorrect damage total. These models use parameters such as wind speed, wind-driven rain, and building stock to determine losses. The relationships between meteorological and locational parameters (population, infrastructure, and geography) are well recognized, which is why many models attempt to account for so many variables. With the help of machine learning, in the form of artificial neural networks, these intuitive connections could be recreated. Neural networks form patterns for nonlinear problems much as the human brain would, based on historical data. Using 66 historical hurricane events, this research attempts to establish these connections through machine learning. In order to link these variables to a concise output, the proposed Impact Level Ranking System is introduced. This categorization system uses levels, or thresholds, of economic damage to group historical events in order to provide a comparative level for a new tropical cyclone event within the United States. Discussed herein are the effects of multiple parameters contributing to the impact of hurricane events, the use and application of artificial neural networks, the development of six possible neural network models for hurricane impact prediction, the importance of each parameter to the neural network process, the determination of the type of neural network problem, and finally the proposed Impact Level Ranking System Model and its potential applications.
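
    The core idea of grouping events by thresholds of economic damage can be sketched very simply. The dollar thresholds and five-level layout below are illustrative assumptions only; they are not the levels defined in this work.

        # Minimal sketch of an impact-level ranking by loss thresholds; the
        # threshold values and number of levels here are assumptions.
        import bisect

        THRESHOLDS_BILLION_USD = [0.1, 1.0, 10.0, 50.0]  # hypothetical level boundaries

        def impact_level(estimated_loss_billion_usd: float) -> int:
            """Map a predicted economic loss to a discrete impact level (1-5)."""
            return bisect.bisect_right(THRESHOLDS_BILLION_USD, estimated_loss_billion_usd) + 1

        print(impact_level(0.05))  # -> 1
        print(impact_level(26.5))  # -> 4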

    Constraints on large scalar multiplets from perturbative unitarity

    We determine the constraints on the isospin and hypercharge of a scalar electroweak multiplet from partial-wave unitarity of tree-level scattering diagrams. The constraint from SU(2)_L interactions yields T <= 7/2 (i.e., n <= 8) for a complex scalar multiplet and T <= 4 (i.e., n <= 9) for a real scalar multiplet, where n = 2T+1 is the number of isospin states in the multiplet. Comment: 10 pages, 1 figure. v2: refs added, minor additions to text, submitted to PR
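
    The quoted multiplet sizes follow directly from the relation between isospin and multiplet size stated in the abstract; written out:

        % Relation between isospin T and multiplet size n quoted in the abstract,
        % with the resulting bounds made explicit.
        \begin{align}
          n &= 2T + 1, \\
          \text{complex multiplet: } T \le \tfrac{7}{2} &\;\Rightarrow\; n \le 2 \cdot \tfrac{7}{2} + 1 = 8, \\
          \text{real multiplet: } T \le 4 &\;\Rightarrow\; n \le 2 \cdot 4 + 1 = 9.
        \end{align}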

    Developing the role concept for computer-supported collaborative learning

    The role concept has attracted a lot of attention as a construct for facilitating and analysing interactions in the context of Computer-Supported Collaborative Learning (CSCL). So far, much of this research has been carried out in isolation and the focus on roles lacks cohesion. In this article we present a conceptual framework to synthesise the contemporary conceptualisation of roles, by discerning three levels of the role concept: micro (role as task), meso (role as pattern) and macro (role as stance). As a first step to further conceptualise 'role as a stance', we present a framework of eight participative stances defined along three dimensions: group size, orientation and effort. The participative stances (Captain, Over-rider, Free-rider, Ghost, Pillar, Generator, Hanger-on and Lurker) were scrutinised on two data sets using qualitative analysis. The stances aim to facilitate meaningful description of student behaviour, stimulate both teacher and student awareness of roles at the macro-level in terms of participative stances, and evaluate or possibly change participation in collaborative learning on all levels.

    Reinstating the 'no-lose' theorem for NMSSM Higgs discovery at the LHC

    The simplest supersymmetric model that solves the mu problem, and in which the GUT-scale parameters need not be finely tuned in order to predict the correct value of the Z boson mass at low scales, is the Next-to-Minimal Supersymmetric Standard Model (NMSSM). However, in order that fine tuning be absent, the lightest CP-even Higgs boson h should have mass ~100 GeV and SM couplings to gauge bosons and fermions. The only way that this can be consistent with LEP limits is if h decays primarily via h -> aa -> 4 tau or 4j but not 4b, where a is the lighter of the two pseudo-scalar Higgses present in the NMSSM. Interestingly, m_a < 2 m_b, with m_a > 2 m_tau somewhat preferred. Thus, h -> 4 tau becomes a key mode of interest. Meanwhile, all other Higgs bosons of the NMSSM are typically quite heavy. Detection of any of the NMSSM Higgs bosons at the LHC in this preferred scenario will be very challenging using conventional channels. In this paper, we demonstrate that the h -> aa -> 4 tau decay chain should be visible if the Higgs is produced in the process pp -> p+h+p, with the final state protons being measured using suitably installed forward detectors. Moreover, we show that the mass of both the h and the a can be determined on an event-by-event basis. Comment: 23 pages

    The stellar metallicity distribution of disc galaxies and bulges in cosmological simulations

    By means of high-resolution cosmological hydrodynamical simulations of Milky Way-like disc galaxies, we conduct an analysis of the associated stellar metallicity distribution functions (MDFs). After undertaking a kinematic decomposition of each simulation into spheroid and disc sub-components, we compare the predicted MDFs to those observed in the solar neighbourhood and the Galactic bulge. The effects of the star formation density threshold are visible in the star formation histories, which show a modulation in their behaviour driven by the threshold. The derived MDFs have median metallicities lower by 0.2-0.3 dex than the MDF observed locally in the disc and in the Galactic bulge. Possible reasons for this apparent discrepancy include the use of low stellar yields and/or centrally-concentrated star formation. The dispersions are larger than that of the observed MDF; this could be due to the simulated discs being kinematically hotter than the Milky Way. The fraction of low-metallicity stars is largely overestimated, as is visible from the more negatively skewed MDF with respect to the observational sample. For our fiducial Milky Way analog, we study the metallicity distribution of the stars born "in situ" relative to those formed via accretion (from disrupted satellites), and demonstrate that this low-metallicity tail of the MDF is populated primarily by accreted stars. Enhanced supernova and stellar radiation energy feedback to the surrounding interstellar media of these pre-disrupted satellites is suggested as an important regulator of the MDF skewness. Comment: 20 pages, 14 figures, MNRAS, accepted
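
    For readers less familiar with the statistics compared above (median offset, dispersion, skewness), a toy calculation on synthetic [Fe/H] values illustrates what is being measured; the numbers below are placeholders, not simulation output or survey data.

        # Toy illustration of the MDF statistics compared above; the synthetic
        # [Fe/H] draws are placeholders, not simulation or observational data.
        import numpy as np
        from scipy.stats import skew

        rng = np.random.default_rng(1)
        feh_sim = rng.normal(loc=-0.3, scale=0.35, size=5000)  # hypothetical simulated disc stars
        feh_obs = rng.normal(loc=0.0, scale=0.20, size=5000)   # hypothetical solar-neighbourhood sample

        print("median offset [dex]:", np.median(feh_sim) - np.median(feh_obs))
        print("dispersion [dex]:", np.std(feh_sim), "vs", np.std(feh_obs))
        print("skewness:", skew(feh_sim), "vs", skew(feh_obs))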

    The politics of in/visibility: carving out queer space in Ul'yanovsk

    In spite of a growing interest within sexualities studies in the concept of queer space (Oswin 2008), existing literature focuses almost exclusively on its most visible and territorialised forms, such as the gay scene, thus privileging Western metropolitan areas as hubs of queer consumer culture (Binnie 2004). While the literature has emphasised the political significance of queer space as a site of resistance to hegemonic gender and sexual norms, it has again predominantly focused on overt claims to public space embodied in Pride events, neglecting other less open forms of resistance. This article contributes new insights to current debates about the construction and meaning of queer space by considering how city space is appropriated by an informal queer network in Ul'ianovsk. The group routinely occupied very public locations, meeting and socialising on the street or in mainstream cafés in central Ul'ianovsk, although claims to these spaces as queer were mostly contingent, precarious or invisible to outsiders. The article considers how provincial location affects tactics used to carve out communal space, foregrounding the importance of local context and collective agency in shaping specific forms of resistance, and questioning ethnocentric assumptions about the empowering potential of visibility.

    Transport behavior of holes in boron delta-doped diamond structures

    Boron delta-doped diamond structures have been synthesized using microwave plasma chemical vapor deposition and fabricated into FET and gated Hall bar devices for assessment of the electrical characteristics. A detailed study of variable temperature Hall, conductivity, and field-effect mobility measurements was completed. This was supported by Schrödinger-Poisson and relaxation time calculations based upon application of Fermi's golden rule. A two carrier-type model was developed with an activation energy of 0.2 eV between the delta layer lowest subband, with mobility of order 1 cm2/Vs, and the bulk valence band with high mobility. This new understanding of the transport of holes in such boron delta-doped structures has shown that although Hall mobility as high as 900 cm2/Vs was measured at room temperature, this dramatically overstates the actual useful performance of the device.
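
    Why a large measured Hall mobility can overstate useful device performance in a two-carrier picture is captured by the standard two-band Hall expressions; the formulae below are a textbook assumption about the form of such a model, not equations taken from the paper.

        % Standard two-band (same carrier sign) expressions, assumed here for
        % illustration: densities p_1, p_2 and mobilities \mu_1, \mu_2 for the
        % delta-layer subband and the bulk valence band respectively.
        \begin{align}
          \sigma &= e\,(p_1 \mu_1 + p_2 \mu_2), \\
          \mu_{\mathrm{Hall}} &= \frac{p_1 \mu_1^{2} + p_2 \mu_2^{2}}{p_1 \mu_1 + p_2 \mu_2}.
        \end{align}
        % Because the numerator weights each band by \mu^2, a small thermally
        % activated population of high-mobility bulk holes can dominate the Hall
        % mobility even when most carriers sit in the low-mobility delta layer.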

    Nanomaterials: amyloids reflect their brighter side

    Amyloid fibrils belong to the group of ordered nanostructures that are self-assembled from a wide range of polypeptides/proteins. Amyloids are highly rigid structures possessing high mechanical strength. Although amyloids have been implicated in the pathogenesis of several human diseases, growing evidence indicates that amyloids may also perform native functions in host organisms. The discovery of such amyloids, referred to as functional amyloids, highlights their possible use in designing novel nanostructured materials. This review summarizes recent advances in the application of amyloids for the development of nanomaterials and prospective applications of such materials in nanotechnology and biomedicine.