270 research outputs found

    Multi-Scale Methodologies for Probabilistic Resilience Assessment and Enhancement of Bridges and Transportation Systems

    When an extreme event occurs, such as an earthquake or a tsunami, the socioeconomic losses due to reduced functionality of infrastructure systems over time can be comparable to, or even higher than, the immediate losses caused by the event itself. Therefore, one of the highest priorities of owners, disaster-management officials, and decision makers in general is to predict the disaster performance of lifelines and infrastructure a priori under different scenarios, and to be able to restore functionality efficiently in the aftermath of a catastrophe, either to the normal condition or at least to an acceptable level during the emergency. In line with this need, academic research has focused on the concept of infrastructure resilience, which reflects the ability of structures, infrastructure systems, and communities both to withstand an extreme event and to quickly recover functionality afterwards. Among infrastructure systems, transportation networks are of utmost importance, as they allow people to move from damaged to safe areas and rescue/recovery teams to accomplish their missions effectively. Moreover, the functionality and restoration of several other infrastructure systems and socio-economic units of the community are highly interdependent with transportation network performance. Among the components of transportation networks, bridges are among the most vulnerable and need particular attention. In this respect, this research focuses on the quantification and optimization of the functionality and resilience of bridges and transportation networks in the aftermath of extreme events, in particular earthquakes, considering the underlying uncertainties.
The scope of the study includes: (i) accurate/efficient assessment of the seismic fragility of individual bridges; (ii) development of a technique for assessing bridge functionality and its probabilistic characteristics following an earthquake and during the restoration process; (iii) development of efficient optimization techniques for post-event restoration and pre-event retrofit prioritization of bridges; and (iv) development of metrics and formulations for realistic quantification of the functionality and resilience of bridges and transportation networks.

The evaluation of the damage and its probabilistic characteristics is the first step towards the assessment of the functionality of a bridge. In this regard, a simulation-based methodology was introduced for probabilistic seismic demand and fragility analyses, aimed at improving the accuracy of the resilience and life-cycle loss assessment of highway bridges. The impact of different assumptions made on the demand was assessed to determine whether they are acceptable. The results show that, among the different assumptions, the power model and the constant-dispersion assumption introduce a considerable amount of error into the estimated probabilistic characteristics of demand and fragility. This error can be avoided using the introduced simulation-based technique, which takes advantage of the computational resources widely available nowadays.

A new framework was presented to estimate probabilistic restoration functions of damaged bridges. This was accomplished by simulating different restoration project scenarios, considering the construction methods common in practice and the amount of resource availability. Moreover, two scheduling schemes were proposed to handle the uncertainties in project scheduling and planning. The application of the proposed methodology was presented for the case of a bridge under a seismic scenario.
The results show the critical impact of temporary repair solutions (e.g., temporary shoring) on the probabilistic characteristics of the functionality of the bridge during the restoration. Thus, the consideration of such solutions in probabilistic functionality and resilience analyses of bridges is necessary. Also, a considerable amount of nonlinearity was recognized among the restoration resource availability, the duration of the restoration, and the bridge functionality level during the restoration process.

A new tool called the "Functionality-Fragility Surface" (FFS) was introduced for pre-event probabilistic recovery and resilience prediction of damaged structures, infrastructure systems, and communities. The FFS combines fragility and restoration functions and presents the probability of suffering a certain functionality loss after a certain time has elapsed since the occurrence of the extreme event, given the intensity of the event. FFSs were developed for an archetype bridge to showcase the application of the proposed tool and formulation.

Regarding network-level analysis, a novel evolutionary optimization methodology for scheduling independent tasks under resource and time constraints was proposed. The application of the proposed methodology to multi-phase, resilience-optimal restoration of highway bridges was presented and discussed. The results show the superior performance of the presented technique compared to other formulations, both in terms of convergence rate and optimality of the solution. Also, the computed resilience-optimal restoration schedules are more practical and easier to interpret. Moreover, new connectivity-based metrics were introduced to measure the functionality and resilience of transportation networks, taking into account the priorities typically considered during the medium term of disaster management.

A two-level simulation-based optimization framework for bridge retrofit prioritization was also presented.
The objectives of the upper-level optimization are the minimization of the cost of the bridge retrofit strategy and of the probabilistic resilience failure, defined as the probability that the post-event optimal resilience falls below a critical value. The combined effects of the uncertainties in the seismic event characteristics and in the resulting damage state of bridges are taken into account by using an advanced, efficient sampling technique together with fragility analysis. The proposed methodology was applied to a transportation network, and different optimal bridge retrofit strategies were computed. The technique proved effective and efficient in computing the optimal bridge retrofit solutions for the example transportation network.
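To make the simulation-based fragility idea concrete, the sketch below estimates exceedance probabilities by direct Monte Carlo sampling instead of fitting a power-law median with constant dispersion. All numbers (the toy demand model, the intensity-dependent dispersion, and the lognormal capacity) are hypothetical placeholders, not values from this research:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_demand(im, n=10_000):
    # Toy demand model: power-law median with *intensity-dependent*
    # dispersion, illustrating the effect a constant-dispersion
    # assumption would miss. Parameters are illustrative only.
    median = 0.02 * im**0.9
    beta = 0.3 + 0.1 * im            # dispersion grows with IM (assumption)
    return median * rng.lognormal(0.0, beta, size=n)

def fragility(im, capacity_median=0.05, capacity_beta=0.35, n=10_000):
    # P(demand > capacity | IM) estimated by simulation: no parametric
    # form is imposed on the resulting fragility curve.
    d = simulate_demand(im, n)
    c = capacity_median * rng.lognormal(0.0, capacity_beta, size=n)
    return float(np.mean(d > c))

ims = [0.2, 0.5, 1.0, 1.5]
probs = [fragility(im) for im in ims]   # non-decreasing with intensity
```

The same sampling loop extends naturally to restoration-time simulation: each sampled damage state seeds a simulated restoration schedule, giving the probabilistic restoration functions described above.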

    Pre-Flight Calibration of the Mars 2020 Rover Mastcam Zoom (Mastcam-Z) Multispectral, Stereoscopic Imager

    The NASA Perseverance rover Mast Camera Zoom (Mastcam-Z) system is a pair of zoomable, focusable, multispectral, color charge-coupled device (CCD) cameras mounted on top of a 1.7 m Remote Sensing Mast, along with associated electronics and two calibration targets. The cameras contain identical optical assemblies that can range in focal length from 26 mm (25.5∘×19.1∘ FOV) to 110 mm (6.2∘×4.2∘ FOV) and will acquire data at pixel scales of 148-540 μm at a range of 2 m and 7.4-27 cm at 1 km. The cameras are mounted on the rover's mast with a stereo baseline of 24.3±0.1 cm and a toe-in angle of 1.17±0.03∘ (per camera). Each camera uses a Kodak KAI-2020 CCD with 1600×1200 active pixels and an 8-position filter wheel that contains an IR-cutoff filter for color imaging through the detectors' Bayer-pattern filters, a neutral density (ND) solar filter for imaging the sun, and 6 narrow-band geology filters (16 total filters). An associated Digital Electronics Assembly provides command and data interfaces to the rover, 11-to-8 bit companding, and JPEG compression capabilities. Herein, we describe pre-flight calibration of the Mastcam-Z instrument and characterize its radiometric and geometric behavior. Between April 26th and May 9th, 2019, ∼45,000 images were acquired during stand-alone calibration at Malin Space Science Systems (MSSS) in San Diego, CA. Additional data were acquired during Assembly, Test, and Launch Operations (ATLO) at the Jet Propulsion Laboratory and Kennedy Space Center. Results of the radiometric calibration validate a 5% absolute radiometric accuracy when using camera state parameters investigated during testing. When observing using camera state parameters not interrogated during calibration (e.g., non-canonical zoom positions), we conservatively estimate the absolute uncertainty to be 0.2 design requirement.
We discuss lessons learned from calibration and suggest tactical strategies that will optimize the quality of science data acquired during operations at Mars. While most results matched expectations, some surprises were discovered, such as a strong wavelength and temperature dependence of the radiometric coefficients and a scene-dependent dynamic component to the zero-exposure bias frames. Calibration results and derived accuracies were validated using a Geoboard target consisting of well-characterized geologic samples.
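The quoted pixel scales can be roughly reproduced from the focal lengths with a pinhole-camera approximation, scale ≈ pixel pitch × range / focal length. The 7.4 μm pitch is the nominal KAI-2020 pixel size; the small discrepancies from the quoted values are expected, since the real zoom optics are not an ideal pinhole:

```python
# Back-of-envelope check of the quoted Mastcam-Z pixel scales under the
# pinhole approximation. Treat the outputs as approximate.
PIXEL_PITCH_M = 7.4e-6  # nominal KAI-2020 pixel pitch

def pixel_scale(range_m, focal_length_m):
    """Size subtended by one pixel at the given range, in meters."""
    return PIXEL_PITCH_M * range_m / focal_length_m

wide_2m    = pixel_scale(2.0, 0.026)     # ~570 μm (quoted: 540 μm)
narrow_2m  = pixel_scale(2.0, 0.110)     # ~135 μm (quoted: 148 μm)
wide_1km   = pixel_scale(1000.0, 0.026)  # ~28 cm  (quoted: 27 cm)
narrow_1km = pixel_scale(1000.0, 0.110)  # ~6.7 cm (quoted: 7.4 cm)
```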

    Earth survey applications division: Research leading to the effective use of space technology in applications relating to the Earth's surface and interior

    Accomplishments and future plans are described for the following areas: (1) geology - geobotanical indicators and geopotential data; (2) modeling magnetic fields; (3) modeling the structure, composition, and evolution of the Earth's crust; (4) global and regional motions of the Earth's crust and earthquake occurrence; (5) modeling geopotential from satellite tracking data; (6) modeling the Earth's gravity field; (7) global Earth dynamics; (8) sea surface topography, ocean dynamics, and geophysical interpretation; (9) land cover and land use; (10) physical and remote sensing attributes important in detecting, measuring, and monitoring agricultural crops; (11) prelaunch studies using LANDSAT D; (12) the multispectral linear array; (13) the aircraft linear array pushbroom radiometer; and (14) the spaceborne laser ranging system.

    Satellite Propulsion Spectral Signature Detection and Analysis for Space Situational Awareness using Small Telescopes

    Safe satellite operations are of utmost importance, and maintaining precise orbits places stringent performance requirements on current propulsion systems, which are often electric propulsion systems. Electron temperature is a commonly used diagnostic for determining the performance of a Hall thruster, and recent work has correlated near-infrared (NIR) spectral measurements of ionization lines of xenon and krypton with electron temperature measurements. In the research herein, appropriate line-spectra ratios are identified for each propellant type when used with remote space-to-ground observations. NIR plume emissions were used to characterize a 600 Watt Hall thruster over a variety of observation angles and operating power levels. An end-to-end model was developed to predict the signal-to-noise ratio (SNR) of an on-orbit operating Hall thruster when viewed from a terrestrial telescope. Good agreement between the model's SNR prediction and several observed star spectra was achieved. It was concluded that the operating power of a Hall thruster could be determined, but that small telescopes were incapable of achieving the desired SNR.
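An end-to-end SNR budget of this kind typically reduces, at the detector, to the classic CCD signal-to-noise equation. The sketch below is a minimal illustration of that equation only; all parameter values (photon rates, quantum efficiency, dark current, read noise) are hypothetical placeholders, not the instrument or plume parameters of this work:

```python
import math

def snr(signal_photons_s, background_photons_s, exposure_s,
        dark_e_s=0.1, read_noise_e=10.0, qe=0.8):
    """Classic CCD SNR: S / sqrt(S + B + D + R^2), all in electrons."""
    s = qe * signal_photons_s * exposure_s       # detected source electrons
    b = qe * background_photons_s * exposure_s   # detected sky electrons
    d = dark_e_s * exposure_s                    # dark-current electrons
    return s / math.sqrt(s + b + d + read_noise_e**2)

# Once source shot noise dominates, SNR grows roughly as sqrt(exposure):
short = snr(500.0, 50.0, 1.0)    # ~17
long_ = snr(500.0, 50.0, 16.0)   # ~76, i.e. about 4x the 1 s value
```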

    A survey of statistical network models

    Networks are ubiquitous in science and have become a focal point for discussion in everyday life. Formal statistical models for the analysis of network data have emerged as a major topic of interest in diverse areas of study, and most of these involve a form of graphical representation. Probability models on graphs date back to 1959. Along with empirical studies in social psychology and sociology from the 1960s, these early works generated an active network community and a substantial literature in the 1970s. This effort moved into the statistical literature in the late 1970s and 1980s, and the past decade has seen a burgeoning network literature in statistical physics and computer science. The growth of the World Wide Web and the emergence of online networking communities such as Facebook, MySpace, and LinkedIn, and a host of more specialized professional network communities, has intensified interest in the study of networks and network data. Our goal in this review is to provide the reader with an entry point to this burgeoning literature. We begin with an overview of the historical development of statistical network modeling and then introduce a number of examples that have been studied in the network literature. Our subsequent discussion focuses on a number of prominent static and dynamic network models and their interconnections. We emphasize formal model descriptions, and pay special attention to the interpretation of parameters and their estimation. We end with a description of some open problems and challenges for machine learning and statistics.
    Comment: 96 pages, 14 figures, 333 references
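The 1959-era probability models on graphs referenced above are the Erdős-Rényi/Gilbert random graphs, the usual starting point of this literature. A minimal sketch of the G(n, p) variant, in which each of the n(n-1)/2 possible edges appears independently with probability p:

```python
import random

def sample_gnp(n, p, seed=None):
    """Sample an undirected G(n, p) random graph as a list of edges (i, j), i < j."""
    rng = random.Random(seed)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if rng.random() < p]

g = sample_gnp(100, 0.05, seed=42)
# Expected edge count: p * n*(n-1)/2 = 0.05 * 4950 = 247.5,
# so the expected mean degree is p * (n-1) = 4.95.
mean_degree = 2 * len(g) / 100
```

Most models discussed in such surveys (exponential random graph models, stochastic block models, latent space models) can be read as structured departures from this independent-edge baseline.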

    16. Early Clopidogrel Therapy in Acute Ischemic Stroke


    Beyond the Conventional Quark Model: Using QCD Sum Rules to Explore the Spectrum of Exotic Hadrons

    Exotic hadrons are theoretical structures allowed by our current understanding of Quantum Chromodynamics (QCD), lying outside the traditional $q\bar{q}$, $qqq$, or $\bar{q}\bar{q}\bar{q}$ understanding of mesons and baryons. These exotic hadrons potentially give us a unique window into the properties of the gluon, the nature of color confinement, and the strong interaction. As we progress through the precision era of particle physics and experiments such as BESIII, Belle, BaBar, LHCb, GlueX, and PANDA amass experimental data across the expected mass ranges of exotic hadrons (such as hybrid mesons with both $\bar{q}q$ quark content and a gluonic component), theoretical predictions of the individual mass states and the overall multiplet structure are crucial for identifying exotic states as well as departures from predicted behaviour. Using the methodology of QCD sum rules (QCDSRs), we explore the properties of exotic hadrons, and discuss the QCDSR methodology and its extensions.
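For readers unfamiliar with the QCDSR machinery, a schematic sketch of the standard starting point (for a generic interpolating current $j(x)$, not a specific channel from this work) is:

```latex
% Two-point correlation function of an interpolating current j(x)
% carrying the quantum numbers of the hadron of interest:
\Pi(Q^2) = i \int d^4x \, e^{iq\cdot x}
  \langle 0 | T\{ j(x)\, j^\dagger(0) \} | 0 \rangle ,
  \qquad Q^2 = -q^2 .

% A dispersion relation links the operator product expansion (OPE)
% of \Pi at large Q^2 to the hadronic spectral function:
\Pi(Q^2) = \frac{1}{\pi} \int_{t_0}^{\infty}
  \frac{\operatorname{Im}\Pi(s)}{s + Q^2}\, ds
  + \text{subtraction terms} .
```

A Borel transform of both sides then suppresses excited-state and continuum contributions, yielding sum rules from which ground-state masses of conventional and exotic hadrons can be extracted.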